Can we talk to computers using our bodies?

This is a long one, so please bear with me.

As a human-centered designer, I believe the next few years will see a dynamic shift in interface design. Amplified by the pandemic, touchless and embodied interactions are set to disrupt customer as well as personal experiences. With gesture-based interaction as one of the frontiers of this new model, I wanted to explore the possibility of developing one holistic interaction model that integrates Natural User Interaction as the primary mode of communication: an interaction system that is truly seamless, intuitive, and adaptive to the user. Incorporating such a model into HCI would shift the focus from having humans adapt to the language of technology to having technology adapt to and simulate human behavior.

Breaking Down the Context

While defining the scope, I aimed to select a context where gestures serve as a primary mode of interaction, rather than merely an alternative to an established interaction ecosystem.

One of the core goals of a Natural User Interface (NUI) is to replicate the experience of natural interaction. Since most of our real-world interactions occur in three dimensions and involve a wide range of movements across the body, it was crucial that both the technology and experiment accounted for this full range of motion.

I have therefore narrowed my scope down to interaction in AR and MR environments. Within these environments, I analyse my line of inquiry in the context of the interaction models of navigational and instructional modules, as they present a level of uniformity and form the initial, basic primitive map of the entire interaction system.

Narrowed Scope

Developing a vocabulary for instructional and navigational interactions within XR environments
To analyse more uniform and universal interaction systems

Looking at Sign Language for Inspiration

Sign vs. Gestures: Traditionally, sign was often equated with gestures due to their shared manual modalities.

Early Perception: Sign was not initially considered a language, as it was viewed as pictorial rather than symbolic. It was believed to lack precision, subtlety, flexibility, and the ability for abstract thinking.

Linguistic Structure: Sign language was eventually recognized as part of the linguistic ecosystem. It possesses all the fundamental features of language, including its own rules, word order, and word formation.

Key Question: Given the shared physical modalities between sign and gestures, along with sign’s linguistic structure, can sign be integrated into fluid and natural gesture-based interactions?

Conclusion: Given the structure of sign language and its variation across different cultures and geographical locations, in its current scope sign language would have limited adaptability as a Natural User Interaction.

Types of Gestures

ILLUSTRATORS
Illustrate the spoken language that they accompany
EMBLEMS
Have a universally accepted meaning and do not require the assistance of spoken language
MANIPULATORS
Behaviors that are indicative of the individual's internal state

Designing an Experiment

AIM
To create and validate a gesture-based vocabulary set for Mixed Reality, while noting the relevant inspirations and factors that influence it
Participant Persona
  • Age Group: 20-24
  • Profession: Student Designers
  • Background: Upper-Middle-Class Urban Upbringing
  • Average Digital Literacy: 7.5 (on a scale of 1-10)
Objective
The objective of this experiment is to analyse user interactions within a Mixed Reality environment by assigning participants a set of tasks. These tasks are performed under two contrasting environmental conditions. Participants must complete predefined goals, and their common gestures for each task are recorded to establish a standardised gesture vocabulary.
METHOD
  • The user is shown a 10-second Microsoft Mesh video to introduce the Mixed Reality environment and provide context.
  • A storyline is narrated to stitch together all the interactions and ensure a cohesive experience.
  • The experimenter provides technological feedback throughout the experiment to guide and assist the participants.
TASKS

1. Activate and deactivate the technology
2. View the object from different perspectives
3. Scale the object to be bigger and smaller
4. Increase and Decrease the size of the object
5. Initiate and Terminate a process
Environmental Settings
In solitude
In a Crowded Space

Setting up the Context

"You are part of the design team. Before your presentation, you need to fix a part of the chair design before presenting its hologram to your clients. You put on your glasses, activate (switch on) the device, and view the chair design. You view it from different perspectives, scale it up and down, and increase and decrease its size (for example, from 150 cm to 300 cm). Once happy with the final design, you initiate (start) the process of rendering the chair. Midway through the render, you realize that you need to make another change, so you terminate (stop) the process. After making the change, you initiate the process again and let it complete. Happy with your design, you deactivate (switch off) your device, take it off, and head to work."

Conducting the Experiment

TOP GESTURES

Documenting the most common gestures for the defined interactions
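As an illustration, the step of deriving the vocabulary from the recorded gestures can be sketched as a simple per-task frequency count. The gesture labels and tasks below are hypothetical stand-ins; the actual study's tallying was done by hand.

```python
from collections import Counter

# Hypothetical recordings: one elicited gesture label per participant, per task.
recordings = {
    "activate": ["air tap", "palm open", "air tap", "snap", "air tap"],
    "scale up": ["pinch spread", "pinch spread", "pull apart",
                 "pinch spread", "pull apart"],
}

def top_gestures(recordings, n=1):
    """For each task, rank gestures by frequency and keep the n most
    common as candidate entries for the vocabulary set."""
    return {
        task: [gesture for gesture, _ in Counter(gestures).most_common(n)]
        for task, gestures in recordings.items()
    }

vocabulary = top_gestures(recordings)
print(vocabulary)  # {'activate': ['air tap'], 'scale up': ['pinch spread']}
```

Ranking by raw frequency keeps the method transparent; a larger study might also weight gestures by how confidently or quickly participants produced them.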

Developing and testing the Vocabulary set

This experiment reverses Experiment 1: here, the participant is informed about the technology, context, and space.
The experimenter then performs the gestures from the vocabulary set, and the participant's task is to associate each gesture with the function they feel is relevant.
The participants chosen for this experiment are different from the set chosen to create the vocabulary.

KEY INSIGHTS

INSIGHT 1

The Movie MANIA

55% of the participants drew their influences from interaction models that they had witnessed in movies. Of those, 45% drew their inspiration from one specific character and franchise: Iron Man (Tony Stark).

INSIGHT 2

The Power of VOICE

For many of the tasks, most participants instinctively used voice commands to initiate the process; when deprived of that option, they used gestures along with an imaginary UI to perform the respective tasks.

INSIGHT 3

EYE - See You

Interestingly enough, many participants relied on gaze-based navigation and wink/blink-based interactions to control the mixed reality space, especially in crowded spaces.

INSIGHT 4

Digital is the new natural

Frequent interaction with digital devices has conditioned the current and upcoming generations to perform specific gestures that, while not originally organic, have become deeply ingrained in muscle memory. This evolution is embedding digital gestures into what is now perceived as ‘natural’ interaction, blurring the boundary between physical and screen-based experiences.

Accounting for Biases

  • Every participant was a peer from the same educational institute and is therefore aware of and accustomed to some form of bodystorming, with a similar level of digital awareness.
  • Every participant is a designer; therefore, the context of designing a chair correlates easily with their pre-existing mental models from past design-software experiences.
  • Belonging to the same age group and a similar economic background, as well as sharing an urban upbringing, results in similar references and experiences in terms of technological interaction and content consumption.
  • The size of the object influenced the imagination and the level of manipulation that each participant carried out
  • The lack of a simulated environment would often make the participant forget the context within which they had to perform the task
  • Participants with a background in UI/UX would put more thought behind the usability of their gestures which would influence their natural initial intuitive interaction choices
  • The video and the references within the video could have possibly subconsciously made the participants replicate some of the gestures
  • The position of the object, with respect to its distance from both the ground and the participant, was speculated based on the experimenter's assumptions and imagination.

If you wish to explore this project in further detail, view my complete process book.

Click Here
