The Face Behind the Wheel, by Chudakov

You think of your face—when you think of it at all—as residing just above your shoulders and more or less staying put. You think of it as yours, something owned that you make up or shave each day in the mirror. But when the Internet arrived, you quietly underwent a digital facelift. With this painless procedure, your face lifts off its customary perch and resides (somewhere) in the cloud. This is particularly useful since Intelligent Facial Recognition Video Analysis surveillance cameras can track objects—briefcases or boxes, handbags or bombs—and we can then digitize those images and shuffle them, transport them, match and mix them, and of course, identify them.

The result: Your face is now an unwitting recruit in the Internet of Everything.

More than 100 supermarkets and convenience stores in Tokyo record images of shoppers’ faces as part of anti-shoplifting measures. (Hitachi Kokusai face recognition software sorts through 36 million faces per second.) Match.com uses facial recognition software to determine whether strangers’ faces match people who “look like your exes.”

People’s faces—via face and emotion recognition software—generate vast amounts of information that can be combined to yield insights: who we are, how we really feel, and what we like and don’t like.

It is not surprising, then, that your face has become a shiny new automotive gauge.

A joint project between Ford and Intel, Project Mobii (for Mobile Interior Imaging) uses facial recognition to disable the start/stop system if the driver is not an authorized user. Volvo is testing dashboard-mounted sensors that watch drivers for signs of fatigue or distraction; Volkswagen and Ford have already launched driver attention and drowsiness systems that use sensors and cameras to monitor a driver’s steering habits, ability to remain within a lane, and time at the wheel. (The rationale for face monitoring was underscored recently in an accident that turned deadly: a Walmart truck driver hit comedian Tracy Morgan’s limo; police said the driver had been awake for more than 24 hours.) These new technologies have the potential to make driving safer by watching us and implementing attention or mood-altering solutions.

They are also testament to how much our technologies are watching us, using our faces and emotions as content to determine, and then change, our behaviors.

University of Waterloo engineering students in Ontario, Canada, have developed technologies that detect road rage and then mollify drivers’ anger by playing soothing music. “Music is confirmed to be extremely beneficial in calming individuals down,” says Dipshikha Goyal, project manager for the group. As reported in BusinessWeek, Volvo spokesperson Malin Persson elaborated that there are several ways to nudge a sleepy driver back to attention—using sounds, vibrations, light, or even fragrance: “If the driver does not wake up, [a vehicle’s safety system could even ensure that it] comes to a safe stop.” The Swedish carmaker is also looking into using facial recognition technology to monitor drivers’ anger levels, although rage is more difficult to detect than drowsiness. “Different people have very different expressions when it comes to anger,” Persson explained. “Sleepiness can be detected by eye movements.”

As our vehicles learn whether we are irritated, angry, or fatigued, here are some consequences of our cars watching us as we drive:

    1. The face is the new interface. Watching is knowing: as facial expressions (monitored by software) become sequences of numbers, patterns emerge in the data that monitor, define, and predict your behavior. Numbers correlating drooping eyelids and relaxed facial muscles—you are getting drowsy; data showing eyes wide, nostrils flared, mouth in a straight line or a frown—you are becoming enraged. Car manufacturers are not only working to make safer cars; they want to watch the number one factor in auto safety: you. In the process, they are turning your face into “the point where two systems interact” (an interface) in order to get to know you intimately.
    2. Your face predicts your actions. The information written all over our faces enables predictive analytics: Your face tells what you’re likely to do—such as fall asleep, become angry, or drive dangerously.
    3. You are turning into data. Your face and actions behind the wheel are converting, through the process of data mining and analytics, to a pattern, a trend. Your numeric pattern defines you. Automatic, an auto accessory that talks to your car’s onboard computer and uses your smartphone’s GPS and data plan, monitors your driving style and gives you “subtle audio cues” when you speed, brake roughly, or accelerate rapidly and waste gas. Zubie, a “smart service that connects cars and drivers,” uses a key that plugs under your car’s dashboard and connects your car and driving behaviors to the cloud, constantly analyzing your car’s data—and, of course, the data that is you.
    4. I am he as you are he as you are me and we are all together. If data can be said to have intent, it is to marry with more data; paraphrasing John Lennon, we all get smooshed. Your face and car are players in this watch-data integration, now piling up at a faster pace than many realize. Tweetping shows the astonishing rapidity of our global Twitter activity; MIT’s Immersion Project visualizes the people you email most after you connect your Gmail, Yahoo, or Microsoft Exchange account, as well as how your contacts connect to each other. We Are Data, built as a companion site to the video game Watch Dogs, gathers publicly available data from Paris, London, and Berlin and displays it live, in real time: traffic lights, ATMs, CCTVs, Instagrams, Tweets, Flickr and Foursquare messaging, Wi-Fi hotspots. As we are watched—and connected—in more places (airports, shopping malls, schools), our face is no longer solely ours; nor is the movement of our information, even though the content may be shielded. What we cannot say—because it is so new—is what will be done with our emotional responses. Who will own this face-behind-the-wheel data, and who will protect it when it is shared—as all information is eventually?
    5. Knowing how you feel is a rapidly advancing business model. As face and emotion recognition technologies advance, their predictive capabilities are becoming more sophisticated and precise. Like the pre-crime systems envisioned by science fiction writer Philip K. Dick and popularized in Minority Report, we are moving through a series of behavior-altering paradigms: (1) algorithms yield data, (2) data yields patterns, (3) patterns yield insights, (4) insights alter behaviors. As Norberto Andrade wrote in The Atlantic:

“As facial recognition software improves, computers are getting the edge. [A recent] Ohio State study, when attempted by a facial recognition software program, achieved an accuracy rate on the order of 96.9 percent in the identification of the six basic emotions, and 76.9 percent in the case of the compound emotions. Computers are now adept at figuring out how we feel.”

    6. You are for sale. Just as the public is becoming aware that shared personal information can be owned by the service that provides information sharing—say, Google—our facial reactions while driving will soon become valuable information that can be bought and sold.
    7. Whose face is it, anyway? In the next five years, our faces will start to replace passwords via products such as Facelock, potentially exposing face data to merchants, governments, and spammers. As information about our facial reactions while driving becomes a digital commodity for sale, a battle will likely ensue over the use and ownership of our facial response information.
    8. You will swap privacy for safety. As in other contexts, drivers will face a familiar trade-off: keep personal information private, or allow that information to be shared (bought and sold) in exchange for enhanced safety and security.
    9. Dashboard sensors fuel the rationale of everywhere-watching. The car—like the mall, airport, school, and street—is yet another place where we are being watched. Cameras create watch culture, and the narratives of watch culture create the justification for watching and being watched.
    10. You have the right of explicit consent. Just as we sacrifice some privacy in a public place like a store or an airport for the sake of security, order, and crime prevention, using these emotion recognition technologies means giving up the “aloneness” of driving in favor of being watched—albeit for our own good. We make this sacrifice for our own safety and for the potential safety of the drivers and cars around us. But this greater good should have limits, as Paul Ekman—the man who invented emotion recognition technologies—wrote: “A specific, explicit consent to have our emotions monitored by a particular company for a specific purpose, as well as assurance that is time-limited, probably to one- or two-hour period, should be obtained. Reading children’s emotions should require parental consent for each occasion it is done.”
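The pipeline running through this list (measurements become data, data yields patterns, patterns yield insights, insights alter behavior) can be made concrete with a toy rule set. This is a minimal sketch under invented assumptions: the feature names, thresholds, and interventions below are hypothetical illustrations, not any carmaker’s actual model, and real systems use models trained on camera data rather than hand-picked rules.

```python
# Toy illustration of the watch-data pipeline described above.
# All feature names and thresholds are hypothetical.

def classify_driver_state(eye_openness, brow_furrow, mouth_tension):
    """Map normalized facial measurements (0.0-1.0) to a coarse state label."""
    if eye_openness < 0.3:
        # Drooping eyelids, relaxed face -> the "drowsy" pattern
        return "drowsy"
    if brow_furrow > 0.7 and mouth_tension > 0.6:
        # Furrowed brow, tight mouth -> the "enraged" pattern
        return "enraged"
    return "attentive"

def choose_intervention(state):
    """Insight alters behavior: pick a (hypothetical) in-car response."""
    return {
        "drowsy": "sound alert and vibrate seat",
        "enraged": "play soothing music",
        "attentive": "no action",
    }[state]

# Data -> pattern -> insight -> intervention
state = classify_driver_state(eye_openness=0.2, brow_furrow=0.1, mouth_tension=0.2)
print(state, "->", choose_intervention(state))  # drowsy -> sound alert and vibrate seat
```

The point of the sketch is how little it takes: once the face is numbers, a handful of comparisons turns those numbers into a label, and the label into an action taken on the driver.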

As we are watched behind the wheel and elsewhere, we encounter an emerging moral imperative: to respond. This ongoing drill of watching the watchers ensures that we are aware of the unexpected realities—and quandaries—that new technologies pose.
