What is a sense of agency?
A sense of agency is simply the feeling that our actions produce an obvious effect. More broadly speaking, it’s a feeling of being in control of our environment – or, at least, clearly having an influence on it.
The sense of agency is usually present in the natural world. Whenever you are holding a physical object, your hand feels that object; you can intuitively sense its weight and texture and easily move it around. Compare that familiar sensation to fumbling to pick up an object in a video game. Even with the best virtual reality (VR) headsets, the sense of agency in a virtual world is far weaker: objects seem unreal, with no presence or texture.
Virtual environments and user interfaces (UIs) superficially mimic the appearance of the natural world without offering the full range of sensory feedback. The result is unintuitive UIs and dissatisfied users – for example, an automatic door that unexpectedly doesn’t open as we approach, or a power button on a phone or computer that doesn’t produce an immediate response. Of course, no user will ever complain to a vendor, “your product lacks a sense of agency!” We just press the ‘button’ harder or press it repeatedly.
Why is it important?
Years of scientific research and testing have emphasized how the sense of agency is intrinsically linked with a good user experience. Users “strongly desire the sense that they are in charge of the system and that the system responds to their actions” [1].
Because of this perceived satisfaction, a stronger sense of agency can in some cases be enough to make users overlook other problems with a UI. For example, research has shown that when we feel we are in control, we are less aware of delays in a system’s response. This phenomenon is known as the intentional binding effect (Figure 1).
For designers and engineers, a useful and practical lesson of the intentional binding effect is that people tend to judge a UI that makes them feel in control as better than one where their sense of agency is less clear (Sidebar 1). In fact, users perceive such products as simply better and more responsive.
Faster, more powerful hardware, carefully redesigned software, and other conventional means can be used to design a more responsive interface. Or, the same perceived benefits can be achieved by strengthening a user’s sense of agency.
Touch: The missing element in a strong sense of agency
Building on work in intentional binding and time perception, recent research has shown that the human sense of agency becomes stronger or weaker depending on which of our senses are used in an action (Figure 2). Importantly, the findings show that haptic feedback provides a stronger sense of agency than visual feedback, with a shorter perceived action–outcome time interval. This gives designers and developers an opportunity to harness the sense of touch to achieve a stronger sense of agency.
It should not be a surprise that the sense of touch provides the greatest sense of agency. The skin is by far the largest sensory organ in the human body, with an average surface area of almost two square meters. It contains about 5 million sensory receptors, densely packed in the areas where they are needed most – each fingertip, for example, has approximately 3,000 touch receptors.
Although commercial products have focused on sound and vision for decades, touch-based haptic feedback can, in fact, create a much stronger sense of agency. Practical experience with real-world products has shown that haptic feedback offers numerous benefits. These include:
- Reinforcing the sense of agency and sense of reality
- Allowing faster and more accurate control in the absence of real physical contact, or when physical contact provides a limited sensory input (such as a smooth touch screen)
- Strengthening feedback to the user in cases when other sensory feedback is limited, weak, or confusing (such as in a noisy environment, situations in which vision is obscured, or the user’s attention is focused elsewhere)
What’s wrong with haptics?
Haptic technologies are already being used in a variety of markets, from the ubiquitous tiny vibration motor built into mobile phones and tablets to more sophisticated entertainment setups that use a variety of haptic devices to provide a range of sensations. Indeed, the role of haptics as an additional, intuitive sensory channel alongside sound and vision makes it extremely valuable in professional applications.
VR and augmented reality (AR) have numerous applications beyond entertainment, such as simulation and training. Adding a sense of touch can make simulations more realistic and more effective, and thereby shorten costly training sessions. The aerospace industry has decades of experience and knowledge in haptics and similar technologies – probably more than any other industry – for both simulation training and operational flight.
However, while the widespread applications of haptics might suggest that the technology is mature and pervasive, current haptic technology has serious drawbacks that make it expensive and difficult (in many cases, impossible) to implement. This is evident in the fact that haptic feedback is far less common than visual and audio output, and, where it is available, is often based on primitive components such as simple vibration motors.
The most obvious challenge with haptics is that it requires physical contact. To put it simply, in order to perceive the feeling of touch you need to be touching something. Generally speaking, in virtual environments the real-world interface will not closely match the virtual object in shape or texture. A video game controller is not a gun or a ball, and a joystick is not a scalpel. Haptic devices are also restricted by sometimes bulky and unwieldy form factors that may block a user’s view of the display or environment they are trying to control.
Ultrahaptics provides invisible, contactless haptic feedback at a range of up to a meter by using an array of dozens of ultrasound transducers, arranged in a flat square or rectangular grid, to emit an inaudible ultrasound signal. In addition to working at a distance, the technology produces a wider range of effects than vibration motors: the sensation of moving objects (like a ball held in the fingers), flowing water, a strong breeze, a virtual pushbutton or dial, or bubbles bursting against the skin. Even materials with a variety of textures can be simulated (Figure 3).
By varying the output of each transducer, a sensation of physical force can be created at points in the air where the ultrasonic waves intersect. At precisely those points – each a few millimeters wide – the waves interfere constructively, producing a stronger, much lower-frequency signal that stimulates the skin’s tactile mechanoreceptors just as a physical object would.
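The focusing principle described above can be sketched numerically. The following minimal Python illustration is not Ultrahaptics’ actual implementation; the array size, transducer pitch, and 40 kHz carrier frequency are assumed values typical of airborne ultrasound arrays. It computes a per-transducer phase delay so that every wave arrives in phase at a chosen focal point, then confirms that the summed field peaks there:

```python
import math
import cmath

SPEED_OF_SOUND = 343.0        # m/s in air
FREQ = 40_000.0               # assumed 40 kHz ultrasonic carrier
WAVELENGTH = SPEED_OF_SOUND / FREQ
K = 2 * math.pi / WAVELENGTH  # wavenumber

def make_array(n=8, pitch=0.0105):
    """n x n flat grid of transducer positions (metres), centred on the origin."""
    half = (n - 1) / 2
    return [((i - half) * pitch, (j - half) * pitch, 0.0)
            for i in range(n) for j in range(n)]

def focus_phases(transducers, focal_point):
    """Phase delay per transducer so all waves arrive in phase at focal_point."""
    return [-K * math.dist(t, focal_point) for t in transducers]

def field_at(point, transducers, phases):
    """Complex sum of unit-amplitude waves at a point (spreading loss ignored)."""
    return sum(cmath.exp(1j * (K * math.dist(t, point) + phi))
               for t, phi in zip(transducers, phases))

array = make_array()
focus = (0.0, 0.0, 0.20)                 # focal point 20 cm above the array
phases = focus_phases(array, focus)

at_focus = abs(field_at(focus, array, phases))
off_focus = abs(field_at((0.05, 0.0, 0.20), array, phases))  # 5 cm to the side

# All 64 unit waves add coherently at the focus, and only there:
print(round(at_focus))          # 64
print(off_focus < at_focus)     # True
```

In a real device the focused carrier would additionally be modulated at a much lower frequency, since – as noted above – it is that low-frequency signal that the skin’s mechanoreceptors actually respond to.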
Stronger agency, less touch
Contactless haptic technology greatly strengthens the sense of agency, enhancing existing applications of haptics, increasing user satisfaction and safety, and enabling new applications and products. In competitive consumer electronics markets filled with commodity products, exciting technologies like invisible contactless haptic feedback can help companies make their products stand out.
Steve Cliffe is President and CEO of Ultrahaptics.
1. Shneiderman’s Eight Golden Rules of Interface Design. Accessed November 7, 2017. https://faculty.washington.edu/jtenenbg/courses/360/f04/sessions/schneidermanGoldenRules.html.