Design and the (Ir)Rational Mind: The Rise of Affective Sensing

Sensing peers into our subconscious and promises to change the way designers work.

“The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honors the servant and has forgotten the gift.” – attributed to Albert Einstein

The human brain remains one of the least understood structures in the natural world. Yet over the past two decades, researchers have developed a growing kit of remarkable tools that are beginning to shed new light on the inner workings of our most complex organ.

Even our perception of our day-to-day lives plays out in two versions, the interplay of which we don’t fully comprehend. The first version unspools in our conscious mind, of which we are constantly aware. The second unfolds in the intuitive, unconscious mind, a realm that remains largely hidden from us. For millennia, most cultures maintained a dominant belief that the conscious mind ought to control and suppress the intuitive one.

However, scientists are beginning to learn we may have had this backward. They are becoming increasingly convinced that how we experience the world—our perception, behavior, memory, and social judgment—may be driven more by the mind's subliminal, pre-cognitive processes than by the conscious ones.

As the neuroanatomist Jill Bolte Taylor puts it in her book My Stroke of Insight, “We live in a world where we are taught from the start that we are thinking creatures that feel. The truth is, we are feeling creatures that think.” In other words, even when we believe we’re thinking our way towards a logical solution, we are often simply feeling our way towards a decision our intuition has already made.

This revelation fundamentally challenges much of what we thought we knew about human decision-making. Further, it raises important questions about design and how it may be perceived by such highly emotional, irrational beings.

Might there be an opportunity to leverage this revised understanding about the role of the emotional, pre-cognitive mind and apply it directly to the business of design? Might we be able to use neuroscience to help us build reflexive experiences that could better respond to the emotional state of a user?

Until recently, such a suggestion would have sounded preposterous. However, the convergence of low-cost wearable sensors and powerful analytics capable of sifting through petabyte-scale datasets is quickly changing the conversation.

Affective Sensing
So how might we gain a view into the subconscious, emotional mind? We must dig into the bedrock of human emotion, the autonomic nervous system (ANS). The ANS is the part of our peripheral nervous system—that is, existing beyond the brain and spinal cord—that helps our brain deal with both external environmental demands and internal emotional states, such as psychological stress. The ANS is, thus, the single most direct and reliable physiological indicator of our emotional selves.

For nearly a decade, Rosalind Picard and her team of researchers at MIT’s Media Laboratory have been working to understand how machines might measure the ANS to interpret human emotional states. For Picard, doing so begins with a suite of passive sensors that captures data about a user's physical state or behavior.

Developments in wireless and computer vision-based technologies have recently enabled this so-called affect detection to be conducted remotely. This is a critical step, as it frees the user from the laboratory setting, allowing them to move normally and behave naturally in their home, office, or elsewhere.

Some of the affect data Picard’s team gathers is actually very similar to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture, and physical gestures, while a microphone might capture speech patterns, tone, and inflections. Other sensors go beyond human perception, detecting changes in skin temperature and electrodermal activity.

The next challenge is how to extract emotional information from the massive flows of heterogeneous data that result from these sensor-based studies. This is done using machine learning techniques that process different modalities, such as speech recognition, natural language processing, and facial expression detection. Each modality can either generate labels—such as ‘confused’ or ‘angry’—about a given person’s state, or plot coordinates in a valence-arousal space.

Not surprisingly, these systems perform best when they combine a number of different signals, such as facial expressions, posture, gestures, speech, word choice, and physiological responses. The finding echoes human behavior: the more indicators machine learning systems are able to assess, the more confident they become in identifying emotion.
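
As a rough illustration of how those two outputs relate, the sketch below fuses a handful of hypothetical per-modality estimates into a single point on the valence-arousal plane and maps it to a coarse label. The modality names, confidence weights, and thresholds are illustrative assumptions, not a description of any particular research system.

```python
import numpy as np

# Hypothetical per-modality estimates of (valence, arousal), each in [-1, 1].
# In a real system these would come from separate models for face, voice, and
# physiology; the values and confidence weights here are invented for illustration.
modality_estimates = {
    "facial_expression": {"valence": 0.4, "arousal": 0.2, "confidence": 0.8},
    "speech_prosody":    {"valence": 0.1, "arousal": 0.6, "confidence": 0.5},
    "electrodermal":     {"valence": 0.0, "arousal": 0.7, "confidence": 0.9},
}

def fuse_estimates(estimates):
    """Confidence-weighted average of per-modality (valence, arousal) estimates."""
    weights = np.array([e["confidence"] for e in estimates.values()])
    points = np.array([[e["valence"], e["arousal"]] for e in estimates.values()])
    valence, arousal = np.average(points, axis=0, weights=weights)
    return valence, arousal

def label(valence, arousal):
    """Map a point on the valence-arousal plane to a coarse emotion label."""
    if arousal > 0.3:
        return "excited" if valence >= 0 else "angry"
    if arousal < -0.3:
        return "calm" if valence >= 0 else "sad"
    return "neutral"

v, a = fuse_estimates(modality_estimates)
print(f"valence={v:.2f}, arousal={a:.2f}, label={label(v, a)}")
```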

New Design Frontiers
Traditionally, design research has relied on the rational mind’s self-reporting, often via user-feedback surveys, opinion polls, and interviews. Unfortunately, these testing methods draw from a level of awareness that is a cloudy soup of bias, ego, and constructed memory. This is why focus groups often miss the mark. People are either completely unaware of, or unable to articulate, their true, subconscious emotional impulses.

Sensing technology promises a better way. Using wearable and remote affective sensors to collect emotional data from users in real-time—at the very instant they interact with a product, service, or interface—could deliver better insights more quickly and at a lower cost than conventional methods. Affective sensing also opens up the opportunity for longitudinal studies that might monitor the evolution of behavior over time.

Notably, data collected through these technologies is digitally native, unencumbered by interpretive filters of language, social mores, taboos, and faux pas. This suggests truer responses will emerge from potentially sensitive contexts where individuals might otherwise be unwilling to report their uncensored feelings.

Affective sensing also allows for dynamic analysis of moment-to-moment reactions that does not require introspective responses from participants. It promises to parse the measurement of perception into finer slices too, detecting responses that precede conscious awareness.

As our ability to measure subliminal emotions advances, we may learn to influence these emotions. After all, if the unconscious mind communicates in feelings, rather than words, shouldn’t we be speaking back to it? How then might we apply what we learn about human emotions to design more compelling and meaningful experiences?

The idea is less far-fetched than some may think: the evolution of affective sensing is underway and gaining speed. Tracking our location via our smartphones and tracking our web browsing habits have already become accepted—even welcome—ways to personalize how we use information and communicate. It’s not a great leap to imagine that well-executed affective sensing may very well become normalized as the technology matures.

Imagine if investment banks could limit the amounts traders wager when sensors detect anger. On our roads, cars might automatically limit the speed of frustrated or overly distracted drivers. Affective sensing could have dramatic implications across many sectors, including finance, automotive, retail, healthcare, and education.

What if retail displays or advertisements could reconfigure themselves based on a user’s sensed state of happiness, irritation, or curiosity? Or what if an online learning system could adapt to maintain the focus and attention of a pupil with attention deficit disorder?

The promise of affective sensing casts an entirely new light on the design process and presents an exciting opportunity at the frontier of interface and experience design, for both designers and users alike.

Modes Of Affective Sensing

Emotional Speech
Changes in the autonomic nervous system indirectly alter speech, which provides a way to recognize affect from shifts in vocal patterns. For example, speech produced in a state of fear, anger, or joy becomes faster, louder, and more precisely enunciated, with a higher and wider pitch range. Other emotions such as tiredness, boredom, or sadness lead to slower, lower-pitched, and more slurred speech. Emotional speech processing recognizes the subject’s emotional state by analyzing speech patterns. Vocal parameters and prosodic features, such as pitch and speech rate, are analyzed through pattern recognition. Interestingly, many speech characteristics are independent of semantics or culture, which makes this technique very robust.

Voice patterns and speech can be measured either on or off the body with conventional microphones. This creates the opportunity to use the microphones in smartphones to measure human affect.
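
As a sketch of what such analysis can look like in practice, the following fragment estimates two simple prosodic cues—loudness and pitch—from a single audio frame using plain NumPy. The frame length, sample rate, and pitch range are illustrative assumptions rather than parameters from any particular study.

```python
import numpy as np

def frame_features(frame, sample_rate=16000):
    """Estimate two simple prosodic cues from one audio frame:
    loudness (RMS energy) and pitch (autocorrelation peak)."""
    rms = float(np.sqrt(np.mean(frame ** 2)))

    # Autocorrelation-based pitch estimate, restricted to a plausible
    # human pitch range of roughly 60-400 Hz.
    frame = frame - np.mean(frame)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = int(sample_rate / 400)
    max_lag = int(sample_rate / 60)
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    pitch_hz = sample_rate / lag
    return rms, pitch_hz

# Illustrative use: a synthetic 200 Hz tone standing in for one voiced 40 ms frame.
sr = 16000
t = np.arange(0, 0.04, 1 / sr)
frame = 0.3 * np.sin(2 * np.pi * 200 * t)
print(frame_features(frame, sr))  # approximately (0.21, 200.0)
```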

Cardiovascular Response
In simplest terms, the cardiovascular system consists of the heart, arteries, veins, and other vessels through which oxygenated blood is delivered to the periphery and deoxygenated blood returns to the heart. Psychologically, this system is responsive to affective states, motivation, attention, and reflexes. Additionally, these responses have been commonly linked to vulnerability to physical and mental illness. A host of related tests serve as reliable cardiovascular indicators of ANS activity. These track heart rate variability, cardiac output, and respiration rates and include impedance cardiography (ICG) and electrocardiograms (ECG).
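
To make the heart-rate-variability piece concrete, here is a minimal sketch that computes two standard HRV statistics (SDNN and RMSSD) from a short series of R-R intervals. The interval values are invented for illustration, and extracting R-peaks from a raw ECG trace is assumed to have happened upstream.

```python
import numpy as np

# Hypothetical R-R intervals (seconds between successive heartbeats), as might
# be derived from an ECG recording. Values here are illustrative only.
rr_intervals = np.array([0.82, 0.85, 0.81, 0.90, 0.88, 0.79, 0.84, 0.86])

mean_hr = 60.0 / rr_intervals.mean()                          # average heart rate (bpm)
sdnn = rr_intervals.std(ddof=1) * 1000                        # overall variability (ms)
rmssd = np.sqrt(np.mean(np.diff(rr_intervals) ** 2)) * 1000   # beat-to-beat variability (ms)

print(f"HR {mean_hr:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")
```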

Electrodermal Activity (EDA)
Electrodermal activity is a measure of skin conductivity, which is a function of how moist the skin is. EDA measures responses in the eccrine sweat glands, which are widely distributed across the body, but are most densely concentrated in the palms of the hands and the soles of the feet. As the sweat glands are controlled by the autonomic nervous system, there is a correlation between EDA and the arousal state of the body. The more aroused a subject is, the greater the skin conductivity and EDA reading. EDA can be measured using two small silver chloride electrodes placed on the skin. A small voltage is applied between them, and the conductance is measured by a wearable sensor. This sensor can be battery powered and portable, relaying conductivity information in real time via wireless links. Currently, there is no remote method to measure EDA.
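
A minimal sketch of the measurement itself, assuming a wearable that applies a fixed voltage across the electrodes and samples the resulting current: conductance follows from Ohm’s law, and a simple rise threshold stands in for proper skin-conductance-response detection. The voltage, current samples, and threshold below are illustrative, not specifications of any real device.

```python
import numpy as np

def conductance_microsiemens(applied_volts, measured_amps):
    """Skin conductance from Ohm's law: G = I / V, reported in microsiemens."""
    return (measured_amps / applied_volts) * 1e6

# Hypothetical samples from a wearable EDA sensor applying 0.5 V across two
# electrodes and sampling the resulting current. Values are illustrative only.
applied_volts = 0.5
current_amps = np.array([2.0, 2.1, 2.1, 2.6, 3.4, 3.1, 2.8, 2.5]) * 1e-6

eda = conductance_microsiemens(applied_volts, current_amps)

# A crude marker of a skin conductance response (SCR): any sample-to-sample
# rise larger than 0.25 microsiemens.
scr_onsets = np.where(np.diff(eda) > 0.25)[0] + 1
print(np.round(eda, 2), "SCR onsets at samples:", scr_onsets)
```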

Facial Gesture Detection
From his cross-cultural research in Papua New Guinea, psychologist Paul Ekman proposed the idea that facial expressions of emotion are not culturally determined, but universal. Defining expressions in terms of muscle actions, Ekman conceived a system to formally categorize the physical expression of emotions. This work was later codified as the Facial Action Coding System (FACS). Today, webcams are the sensor of choice for detecting affect from facial gestures, largely because the test subject can be studied in a very repeatable location with their face in plain view. In the near future, it is possible to imagine that human emotion could be detected from other cameras in the environment, including CCTV security cameras, cameras at ATMs, and those on public trains and buses.
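
As a hint of how little hardware this requires, the sketch below uses OpenCV and its bundled Haar cascade to locate faces in a live webcam feed. Locating the face is only the first step; mapping facial movements to FACS action units or emotion labels would require a separately trained model, which is not shown here.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect face regions; an emotion or action-unit classifier would run
    # on each detected region in a fuller pipeline.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face regions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
capture.release()
cv2.destroyAllWindows()
```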