Using Microsoft’s Emotion API to Settle an Old Argument

So Tuesday happened. I have a lot to say about our President-elect, and you’ll probably see a couple posts about him in the next few weeks. But that’s for later. Today, I’m writing about a more pressing issue, one that has been subject to debate for years without resolution, and I plan to settle this debate once and for all.

Do I look angry in this picture?

Here’s the backstory: in my first semester of college in 2010, I took a drawing class. We were required to draw a self-portrait at the beginning of the class and another at the end, after a semester of training. I had taken many drawing courses before that class and thought myself quite skilled, and I remember thinking, when I drew my first self-portrait, “How can I do better than this?” But a semester later, I drew this.

I remember after drawing this picture thinking “That’s how.”

Everyone who saw the picture was impressed, from my parents to my art professor (who begged me not to stop drawing; sadly, I have not drawn like this since that class, though I hope someday to recover those skills and draw again). I hung the picture on my bedroom door, and one morning my little sister’s dog Fancy woke me up by barking frantically at it; the drawing was so realistic it apparently frightened her.

Yet while the picture was impressive, there was a common reaction to it: I look angry. Everyone would ask me, “Why do you look so angry in this picture?” It even reached a point where my grandpa, when he saw it, pointed to it and said, “This is not you. This is not the person who you are.” (My professor, when I mentioned this to him, said that I had become so focused that I captured that focus and intensity in the drawing.)

I can assure you that I was not angry while drawing this picture. Just consider some of the selfies I took that night:

No, people, I was not angry, and I don’t have anger issues. (Well, I do have a deep anger at the universe for all the slights it has dealt me and all the injustices in the world, but that’s a separate issue.)

Months ago, I saw this tutorial for using Microsoft’s Emotion API to detect emotions in photos, using R. I thought the idea would be fun, and today I decided to see what the API would say about my drawing.

Here’s the result (using a lot of the code from the aforementioned tutorial; note that you will have to create an account and get a key to use the API):

library("httr")
library("XML")
library("stringr")
library("ggplot2")
library("reshape2")

# Define image source
img.url = 'https://ntguardian.files.wordpress.com/2016/11/selfportrait.jpg'

# Define Microsoft API URL to request data
URL.emoface = 'https://api.projectoxford.ai/emotion/v1.0/recognize'

# Define your subscription key (replace with the key from your Microsoft account)
emotionKEY = 'YOUR_SUBSCRIPTION_KEY'

# Define image
mybody = list(url = img.url)

# Request data from Microsoft
(faceEMO = POST(
    url = URL.emoface,
    content_type('application/json'), add_headers(.headers = c('Ocp-Apim-Subscription-Key' = emotionKEY)),
    body = mybody,
    encode = 'json'
))
## Response [https://api.projectoxford.ai/emotion/v1.0/recognize]
##   Date: 2016-11-13 22:41
##   Status: 200
##   Content-Type: application/json; charset=utf-8
##   Size: 264 B
# Request results from face analysis
(SelfPortrait = httr::content(faceEMO)[[1]])
## $faceRectangle
## $faceRectangle$height
## [1] 530
## 
## $faceRectangle$left
## [1] 584
## 
## $faceRectangle$top
## [1] 549
## 
## $faceRectangle$width
## [1] 530
## 
## 
## $scores
## $scores$anger
## [1] 0.004670194
## 
## $scores$contempt
## [1] 8.175469e-05
## 
## $scores$disgust
## [1] 5.616587e-05
## 
## $scores$fear
## [1] 0.0008795662
## 
## $scores$happiness
## [1] 3.147202e-05
## 
## $scores$neutral
## [1] 0.9871294
## 
## $scores$sadness
## [1] 0.003053426
## 
## $scores$surprise
## [1] 0.00409801
# Define results in data frame
o <- melt(as.data.frame(SelfPortrait$scores))
## Using  as id variables
names(o) <- c("Emotion", "Level")
o
##     Emotion        Level
## 1     anger 4.670194e-03
## 2  contempt 8.175469e-05
## 3   disgust 5.616587e-05
## 4      fear 8.795662e-04
## 5 happiness 3.147202e-05
## 6   neutral 9.871294e-01
## 7   sadness 3.053426e-03
## 8  surprise 4.098010e-03
# Make plot
ggplot(data=o, aes(x=Emotion, y=Level, fill = Emotion)) +
    geom_bar(stat="identity") +
    scale_fill_brewer(palette = "Set3") +
    ggtitle("Detected Emotions in Self Portrait") +
    theme_bw()

The API successfully detected the region of the picture where my face is located, and I’m confident it provides a reasonable analysis of the emotions seen. It appears there is only one emotion in the drawing that Microsoft detects: neutral. Lots and lots of neutral. I will also admit that there is an intensity in my expression that the API probably does not capture or quantify, and that intensity, coupled with the sheer “neutrality” of my expression, is perhaps perceived as anger.
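For what it’s worth, the dominant emotion can be pulled out programmatically rather than read off the chart. Here is a minimal sketch; the scores are hard-coded from the output printed above so the snippet stands alone, rather than being taken from the live `SelfPortrait` object:

```r
# Scores as returned by the Emotion API above, hard-coded for a
# self-contained example
scores <- c(anger     = 0.004670194,
            contempt  = 8.175469e-05,
            disgust   = 5.616587e-05,
            fear      = 0.0008795662,
            happiness = 3.147202e-05,
            neutral   = 0.9871294,
            sadness   = 0.003053426,
            surprise  = 0.00409801)

# Name of the highest-scoring emotion
names(which.max(scores))
```

On the live response, the same idea would be `names(which.max(unlist(SelfPortrait$scores)))`, since `content()` returns the scores as a named list.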

Will that end the discussion? Probably not. Computers see what they see, but in the end what matters is what people perceive, not what computers detect; it is to people that emotions matter. Nevertheless, it’s nice to get some vindication from Microsoft; I was not angry!


3 thoughts on “Using Microsoft’s Emotion API to Settle an Old Argument”

  1. Hi,

    Nice post. I think what a human sees and what the Microsoft API detects are completely different. You might not have been angry at the time you were making the self-portrait, but the way you drew your eyes makes you look angry. I covered your eyes with a pen and the portrait looked absolutely neutral. I guess Microsoft’s emotion algorithm has not been trained enough for this type of image.

    Cheers


  2. Thank you for an interesting article.
    An article I recently read performed similar emotion analyses on a suspect, and for ALL the pictures the dominant score was the neutral one: over 88%, as I remember. The reporter simply discarded the neutral score and focused on the other ones.

    I can see some portions of anger, sadness, and surprise in your graph. It would be better to compare your result with the ones from other pictures.

