Tech firms want to detect emotions and expressions, but people don't like it

As revealed in a patent filing, Facebook is interested in using webcams and smartphone cameras to read our emotions and track our expressions and reactions. The idea is that by understanding emotional behaviour, Facebook can show us more of what we react positively to in our news feeds and less of what we don't, whether that's friends' holiday photos or advertisements.
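
To make the mechanism concrete, here is a minimal sketch, in Python, of how camera-based reaction scoring could work in principle. This is not Facebook's implementation: the open-source "fer" facial-expression package, the emotion groupings and the score_reaction helper are all assumptions made purely for illustration.

import cv2           # OpenCV, for webcam capture
from fer import FER  # open-source facial-expression classifier

# Illustrative grouping of the classifier's labels into reactions.
POSITIVE = {"happy", "surprise"}
NEGATIVE = {"angry", "disgust", "fear", "sad"}

detector = FER()  # loads a pre-trained face and emotion model

def score_reaction(frame, topic_scores, topic):
    """Nudge the running score for `topic` using one webcam frame."""
    emotion, confidence = detector.top_emotion(frame)  # e.g. ("happy", 0.91)
    if emotion in POSITIVE:
        topic_scores[topic] = topic_scores.get(topic, 0.0) + confidence
    elif emotion in NEGATIVE:
        topic_scores[topic] = topic_scores.get(topic, 0.0) - confidence
    # If no face is found, emotion is None and the scores are left as-is.
    return topic_scores

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(score_reaction(frame, {}, "holiday_photos"))

A real ranking system would aggregate such signals across many items, users and sessions; the point of the sketch is only how directly a single camera frame can become a "liked it / didn't like it" data point.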

This might appear innocuous, but consider some of the detail. In addition to smiles, joy, amazement, surprise, humour and excitement, the patent also lists negative emotions. Being scanned for signs of disappointment, confusion, indifference, boredom, anger, pain and depression is neither innocent nor fun.

In fact, Facebook is no stranger to using data about emotions. Some readers might remember the furore when Facebook secretly tweaked users' news feeds to study "emotional contagion". When users logged into their Facebook pages, some were shown news-feed content containing a greater number of positive words, while others were shown content deemed sadder than average. This changed the emotional behaviour of those users who were "infected".

Given that Facebook has around two billion users, this patent to read emotions via cameras is important. But there is a bigger story, which is that the largest technology companies have been buying, researching and developing these applications for some time.

Watching you feel

For example, in 2016 Apple bought Emotient, a firm that pioneered facial-coding software to read emotions. Microsoft offers its own "cognitive services", and IBM's Watson is also a key player in industrial efforts to read emotions. It's possible that Amazon's Alexa voice-activated assistant could soon be listening for signs of emotion, too.

Nor does the interest stop there: it extends beyond screens and wearable devices to our physical environments. Consider retail, where increasingly the goal is to understand who we are and what we think, feel and do. Somewhat reminiscent of Steven Spielberg's 2002 film Minority Report, eyeQ Go, for example, measures facial emotional responses as people look at goods at shelf level.

What these and other examples show is that we are witnessing a rising interest in our emotional lives, encompassing any situation where it might be useful for a machine to know how a person feels. Some less obvious examples include emotion-reactive sex toys, the use of video cameras by lawyers to identify emotions in witness testimony, and in-car cameras paired with emotion analysis to prevent accidents (and presumably to lower insurance premiums).

Users are not happy

In a report assessing the rise of "emotion AI" and what I term "empathic media", I point out that this is not innately bad. There are already games that use emotion-based biofeedback, drawing on eye-trackers, facial coding and wearable heart-rate sensors. These are a lot of fun, so the issue is not the technology itself but how it is used. Does it enhance, serve or exploit? After all, the scope to make emotions and intimate human life machine-readable has to be treated cautiously.

The report covers views from industry, policymakers, lawyers, regulators and NGOs, but it's useful to consider what ordinary people say. I conducted a survey of 2,000 people and asked questions about emotion detection in social media, digital advertising outside the home, gaming, interactive movies through tablets and phones, and voice-based emotion analysis through smartphones.

I found that just over half (50.6%) of UK citizens are "not OK" with any form of emotion-capture technology, while just under a third (30.6%) are "OK" with it, as long as the emotion-sensitive application does not identify the individual. A mere 8.2% are "OK" with having data about their emotions linked to personally identifiable information, while 10.4% "don't know". That so few people are happy for emotion-recognition data to be tied to personally identifying information is striking, considering what Facebook is proposing.

But do the young care? I found that younger people are twice as likely to be "OK" with emotion detection as the oldest respondents. But we should not take this to mean they are "OK" with having data about their emotions linked to personally identifiable information: only 13.8% of 18- to 24-year-olds accept this. Younger people are open to new forms of media experience, but they want meaningful control over the process. Facebook and others, take note.

New frontiers, new regulation?

So what should be done about these types of technology? UK and European law is being strengthened, especially with the introduction of the General Data Protection Regulation. While this has little to say about emotions directly, there are strict rules on the use of personal data and information about the body (biometrics), especially when used to infer mental states (as Facebook has proposed to do).

This leaves us with a final problem: what if the data used to read emotions is not strictly personal? What if shop cameras pick out expressions in such a way as to detect emotion, but not identify a person? This is what retailers are proposing and, as it stands, there is nothing in the law to prevent them.

I suggest we need to tackle the following question: are citizens and the reputation of the industries involved best served by covert surveillance of emotions?

If the answer is no, then codes of practice need to be amended immediately. The ethics of emotion capture, and of rendering bodies passively machine-readable, do not hinge on personal identification, but on something more important. Ultimately, this is a matter of human dignity, and of what kind of environment we want to live in.

There's nothing inherently wrong with technologies that interact with emotions. The question is whether they can be shaped to serve, enhance and entertain, rather than exploit. And given that survey respondents of all ages are rightly wary, it's a question the public should be involved in answering.
