Anyone who’s ever wrestled with a malfunctioning computer can attest to the fact that the machines couldn’t give a damn about your frustration.
But this sad state of affairs may someday improve, as more companies develop technology that could make machines more helpful by responding to people’s feelings. A couple of weeks ago, I reported on the development of sensors that can prevent drunk drivers from operating a vehicle. And last month, the Italian automaker Ferrari reportedly filed a patent application for a car technology that can get a read on a driver’s mental state and adjust the dynamics of the car accordingly.
Across the way in Florida, Design Interactive, an engineering and consulting firm, is collaborating with the Defense Advanced Research Projects Agency (DARPA) on a series of technological enhancements that would allow a computer to sense the user’s present emotional and physiological state and react appropriately.
At this week’s Blur conference, the company has been talking up a couple of intriguing projects, both for military and security purposes. One prototype, Next Generation Interactive Systems (NexIS), involves equipping soldiers with biological sensors that monitor heart rate and other indicators for signs of emotional distress, in which case the system would alert medical response personnel or administer adrenaline. Another, Auto-Diagnostic Adaptive Precision Training for Baggage Screeners (Screen-ADAPT), uses a combination of electroencephalography (EEG), eye tracking, and heart-rate monitoring tools to figure out the optimal practices for airport baggage screeners.
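At its simplest, the kind of monitoring NexIS describes boils down to comparing live sensor readings against a person’s baseline and raising an alert when they diverge too far. The sketch below illustrates that idea with a threshold rule; every name, threshold, and value here is an illustrative assumption, not a detail of Design Interactive’s actual system.

```python
# Hypothetical sketch of a NexIS-style distress check: flag a soldier
# whose heart rate climbs well above their known resting baseline.
# The threshold ratio and all names are invented for illustration.

from dataclasses import dataclass

@dataclass
class VitalSigns:
    heart_rate_bpm: float    # current reading from a wearable sensor
    resting_rate_bpm: float  # the wearer's known baseline

def distress_alert(vitals: VitalSigns, threshold_ratio: float = 1.5) -> bool:
    """Return True when heart rate exceeds the baseline by the given
    ratio -- the point at which a system like this might notify
    medical response personnel."""
    return vitals.heart_rate_bpm >= vitals.resting_rate_bpm * threshold_ratio

# A reading of 120 bpm against a 60 bpm baseline (threshold: 90 bpm)
# trips the alert; 75 bpm does not.
print(distress_alert(VitalSigns(heart_rate_bpm=120, resting_rate_bpm=60)))  # True
print(distress_alert(VitalSigns(heart_rate_bpm=75, resting_rate_bpm=60)))   # False
```

A real system would of course fuse many signals (EEG, eye tracking, heart-rate variability) rather than apply a single fixed cutoff, which is exactly the "art of the algorithm" problem Stanney describes below.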
However, extending the technology to broader applications has proven difficult. Kay Stanney, the company’s owner, discussed some of these challenges with Technology Review magazine:
Stanney admits this is challenging, because not every successful baggage screener does the job in exactly the same way. “This will really come down to the art of the algorithm—what it is that we’re trying to optimize,” Stanney says. Sensors can already detect when a person is drowsy, distracted, overloaded, or engaged. But it would be ideal to be able to determine other states such as frustration, or even to distinguish between different types of frustration.
While there can be advantages to having a computer that is emotionally attentive, I’m not so sure that users would readily embrace the notion of machines making decisions based on what they have determined about a person’s emotional state. Emotions are so complex that even other humans often misread and misunderstand them. And emotions such as frustration can be triggered by a wide range of problems a user may be experiencing in a given situation. Mathematical algorithms still have a ways to go before they can accurately pinpoint which one is at play.
But I think an even more curious question to ponder is whether humans would want such an intimate connection with their computers.