Imagine this: you're playing a video game and want to interact with another character. But instead of clicking with the mouse, you simply rest your eyes on that character and, almost instantly, it responds.
Wouldn’t that be outstanding?
Eye-tracking taking over a task usually performed by the mouse!
That's interactivity at a completely different level. And this, indeed, is the future of the multi-billion-dollar video gaming industry.
Our eyes can reveal a lot about our intentions, thoughts and actions, because they are good indicators of what we're interested in. Eye-tracking can pick up the cues a person gives away subconsciously and so enhance the overall interaction. The most amazing part about these micro-gestures and expressions is that they are totally intuitive. There's a lot of potential for eye-tracking in other arenas as well.
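To make the idea concrete, here is a minimal sketch (in Python) of how a game might treat sustained gaze as a "click". It assumes a hypothetical eye tracker exposed through a get_gaze_point() callback that returns screen coordinates; the dwell threshold and the target layout are invented for illustration and are not part of any real eye-tracking SDK.

```python
import time

DWELL_SECONDS = 0.8  # how long the player must look at a target to "select" it

def hit_test(gaze_x, gaze_y, targets):
    """Return the on-screen character (if any) under the gaze point."""
    for target in targets:
        x, y, w, h = target["bounds"]
        if x <= gaze_x <= x + w and y <= gaze_y <= y + h:
            return target
    return None

def gaze_select_loop(get_gaze_point, targets, on_select):
    """Poll gaze samples and fire on_select once the gaze dwells on one target."""
    dwell_target, dwell_start = None, 0.0
    while True:
        gaze_x, gaze_y = get_gaze_point()       # hypothetical tracker call
        target = hit_test(gaze_x, gaze_y, targets)
        if target is not dwell_target:          # gaze moved to a different target
            dwell_target, dwell_start = target, time.time()
        elif target and time.time() - dwell_start >= DWELL_SECONDS:
            on_select(target)                   # sustained gaze acts as a click
            dwell_target = None
        time.sleep(1 / 60)                      # roughly one sample per frame
```

In a real game this loop would live inside the engine's update cycle, and a brief visual cue on the target (a shrinking ring, say) tells the player that their gaze is about to count as a click.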
Eye-tracking is just one facet of today's technological advances. There's much more to this phase of evolution.
Wearables, the Internet of Things and smart materials – you have probably come across these terms a number of times.
These realms are opening the way to even more unexpected, once-unimaginable things, and to a future where usability reaches a whole new level.
The Dramatic Shift and the World of Sensors
Humans have been interfacing with machines for thousands of years. Studies of human-computer interaction (HCI) date back to 1975.
From the 19th-century Jacquard loom, which changed the way a machine could be instructed, to the motion-sensing Kinect in 2010 and then Siri in 2011, the world has clearly witnessed a dramatic shift. Today, sending commands to machines is no longer confined to the keyboard-and-mouse paradigm.
The way people interact with devices is changing, thanks to a range of affordable sensors.
In the move from cellular phones to smartphones, touch and multi-touch screens have driven the change, and gestures are now the main interaction modality for activating functions on personal devices. Meanwhile, speech recognition technologies and the increased computational power of CPUs let users provide input efficiently when they can't perform gestures.
Personal devices stand as an example of how the various new forms of HCI can reduce the gap between humans and technology.
Entertainment is one market where HCI is witnessing deep innovation.
Why, you ask?
Users are now demanding new ways to control characters. This has led game console developers to release players from the constraints of the keyboard and mouse by proposing dedicated controllers for the task. The new interface provides tactile feedback as well as acting as a sort of tangible interface.
The introduction of Microsoft Kinect was a huge step towards the complete implementation of natural interfaces in which the human body becomes the controller. Before it, the Nintendo Wii let users command the machine via gestures and body movements. Not only is this kind of control instantly accessible, it is also very satisfying. Recognizing the position and orientation of the body's bones lets the hardware identify poses and gestures, which can then be mapped to commands for the machine.
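As a rough illustration of that last point, the sketch below shows how named joint positions from a skeleton frame could be mapped to game commands. The joint names, coordinates and rules are assumptions made for the sake of the example; they are not the actual Kinect SDK.

```python
# Hypothetical skeleton frame: joint name -> (x, y, z) in meters, y pointing up.
# A real depth sensor's SDK would deliver something similar every frame.
skeleton = {
    "head":       (0.00, 1.60, 2.00),
    "right_hand": (0.30, 1.85, 1.90),
    "left_hand":  (-0.30, 0.90, 2.00),
    "hip_center": (0.00, 1.00, 2.00),
}

def detect_pose(joints):
    """Map a recognized pose to a game command (the rules are illustrative only)."""
    if joints["right_hand"][1] > joints["head"][1]:
        return "jump"        # right hand raised above the head
    if joints["left_hand"][1] > joints["head"][1]:
        return "open_menu"   # left hand raised above the head
    return None

command = detect_pose(skeleton)
if command:
    print(f"Send '{command}' to the game")  # prints: Send 'jump' to the game
```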
Along with this, researchers have also proposed sensors that can track a user's hands and all of the fingers. For example, the Leap Motion can interactively track both of a user's hands by identifying the positions of the fingertips and the center of the palm, and computing the finger joints using an inverse kinematics solver.
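To give a feel for what that kind of hand data looks like in code, here is a simplified sketch of detecting a pinch from fingertip and palm positions. The data layout and the 3 cm threshold are assumptions for illustration; the real Leap Motion API exposes much richer frame objects than this.

```python
import math

PINCH_THRESHOLD = 0.03  # thumb and index tips within 3 cm count as a pinch

def is_pinching(hand):
    """True when the thumb and index fingertips come close together."""
    return math.dist(hand["thumb_tip"], hand["index_tip"]) < PINCH_THRESHOLD

# Hypothetical hand frame: fingertip and palm positions in meters,
# mirroring the kind of data a hand-tracking sensor reports.
hand = {
    "palm_center": (0.00, 0.20, 0.10),
    "thumb_tip":   (0.04, 0.22, 0.08),
    "index_tip":   (0.05, 0.23, 0.08),
}

if is_pinching(hand):
    print("Pinch detected: grab the object under the cursor")
```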
Researchers are also putting the Leap Motion to medical use: one study states that it has the potential to play a key role in how doctors diagnose and treat several brain disorders, even during live surgery.
Adding to this, many car makers are already proposing hand tracking as an alternative interaction modality to the traditional touch screens used to manage infotainment functions. Similarly, some smart TVs let users make their choices with a set of gestures, replacing the traditional remote control.
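The final step in systems like these, turning a recognized gesture into a command, can be as simple as a lookup table. Here is a minimal sketch, assuming a recognizer that emits string labels such as "swipe_left"; both the labels and the TV commands are invented for illustration.

```python
# Hypothetical gesture labels emitted by a recognizer, mapped to TV commands.
GESTURE_COMMANDS = {
    "swipe_left":  "next_channel",
    "swipe_right": "previous_channel",
    "raise_palm":  "pause",
    "rotate_cw":   "volume_up",
    "rotate_ccw":  "volume_down",
}

def handle_gesture(label, send_command):
    """Translate a recognized gesture into a command, ignoring unknown labels."""
    command = GESTURE_COMMANDS.get(label)
    if command:
        send_command(command)

# Example usage with a stand-in for the TV's command interface.
handle_gesture("swipe_left", send_command=print)  # prints: next_channel
```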
Wearable Tech
How could any discussion of the current scenario leave out wearables?
Many applications for tourism, entertainment, maintenance, shopping, and social networks are already available for personal devices, but new wearable sensors might soon change the way we live. The now-defunct Google Glass and the recent newcomer, the Apple Watch, are brilliant examples of this. Moreover, new application fields are being proposed daily.
Not long ago, all of these incredibly surreal scenarios could only be found in science-fiction movies. Now they have become reality.
There's already an eye-tracking upgrade for the Oculus Rift virtual reality headset. If users are willing to wear something on their heads, why not add an eye tracker too and enhance interaction with all the information that the eyes give away? Now that, indeed, would be a remarkable addition.
A Better Future – A Better World
One thing remains certain – new forms of HCI will change our lives significantly. They will, undoubtedly, offer the chance to improve the quality of life of people who can't take advantage of current interfaces because of physical disabilities. But there's a hitch. New issues will arise, such as those related to privacy, security, and ethics, potentially slowing the diffusion of new hardware and software.
What is amazing about human-computer interaction is that, just as the mouse and keyboard were innovations in input technology and the iPhone's touch screen improved on them, the Kinect is yet another such innovation. But all of these are just stepping stones towards less physical interfaces.
The opportunities for HCI are tremendous. Progress toward more user-friendly and natural interfaces for human-machine interaction can yield a plethora of advantages and have a major impact on everyday life.
Want to learn more?
If you'd like to become an expert in UX Design, Design Thinking, UI Design, or another related design topic, then consider taking an online UX course from the Interaction Design Foundation. For example, Design Thinking, Become a UX Designer from Scratch, Conducting Usability Testing, or User Research – Methods and Best Practices. Good luck on your learning journey!
(Lead image: Depositphotos)