Project description

Estadão, the newspaper I work for, was co-hosting a live TV debate between presidential candidates in the 2018 Brazilian elections. My team at the arts desk was thinking about how to cover the event in an innovative way, and we decided to go for something slightly crazy.

We partnered with the video desk to record the faces of all the candidates, all night long. With the footage, the data team then used a computer vision API to detect which emotion the contenders were displaying, second by second.
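
To get second-by-second readings, the footage first has to be broken into still frames that can be sent to the API one at a time. Below is a minimal sketch of that sampling step, assuming the footage is a local video file and using OpenCV; the file names and the one-frame-per-second rate are illustrative assumptions, and the pipeline actually used is in the GitHub repository mentioned under the methodology section.

```python
import os

import cv2  # OpenCV, for reading video files frame by frame

def sample_frames(video_path, out_dir, every_n_seconds=1):
    """Save one still image per `every_n_seconds` of video as JPEG files
    and return how many frames were written."""
    os.makedirs(out_dir, exist_ok=True)
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS)
    step = max(1, int(round(fps * every_n_seconds)))
    saved = index = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of the video
            break
        if index % step == 0:
            cv2.imwrite(f"{out_dir}/frame_{saved:05d}.jpg", frame)
            saved += 1
        index += 1
    capture.release()
    return saved

# Hypothetical file name for one candidate's close-up footage.
sample_frames("debate_candidate.mp4", "frames")
```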

This way, we retold the story of the night by highlighting how each politician behaved in the key moments of the event: who showed confidence, who looked angry, and which topics raised the tension in the room.

What makes this project innovative?

We found a novel way to cover an event that almost every major publication in the country was reporting on extensively. The interactions in a debate are somewhat insincere, since everyone involved is heavily media-trained and follows a well-planned script. Instead of looking only at what was being said, the team was able to peer into a more elusive dimension of the meeting. Analyzing facial expressions is commonplace after a debate night; doing it in a computational, data-driven way, however, is not.

What was the impact of your project? How did you measure it?

The story was widely shared and reached an above-average level of engagement on social media.

Source and methodology

The emotion analysis was done with an application developed by Microsoft: the company's Face API. The program receives an image as input, finds all the human faces present in it and assigns each one a percentage for the following feelings: joy, sadness, surprise, anger, disgust, fear, contempt and neutrality. A smiling face, for example, expresses a greater percentage of joy, while a furrowed brow may indicate anger or disgust. The software looks for these kinds of patterns in facial expressions to perform the evaluation.

Although the documentation provided by Microsoft does not refer directly to any research in the area, there is scientific work that supports this type of analysis. According to the American psychologist Paul Ekman, of the University of California, there is evidence that the seven listed emotions (neutrality being the absence of one) are expressed universally: individuals from different cultures demonstrate these feelings in the same way. To support this hypothesis, he conducted research in various countries and regions, showing pictures of faces to participants and asking them to identify the feeling displayed in each one. Almost everyone identified them correctly.

Ekman also classified which movements of the facial muscles correspond to each emotion. He is one of the creators of the Facial Action Coding System (FACS), a kind of catalog that correlates, for example, the actions of "pulling the corner of the lips" and "lifting the cheeks" - that is, smiling - with the feeling of joy. In essence, the Microsoft application uses techniques from the field of computer science known as computer vision to perform this same analysis automatically.

More details, along with the code used to obtain, aggregate and analyze the data, are available on Estadão's GitHub page (see additional links).
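
For reference, the sketch below shows roughly what one of these per-frame calls looks like in Python. It assumes the REST interface of the Azure Face API as it existed in 2018; the endpoint region, subscription key and file name are placeholders, and the code actually used for the story is in the repository linked under additional links.

```python
import requests

# Placeholder endpoint and key: substitute your own Azure region and
# subscription key before running.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
KEY = "YOUR_SUBSCRIPTION_KEY"

def detect_emotions(image_path):
    """Send one frame to the Face API and return the emotion scores
    for every face it finds in the image."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    # Each detected face carries scores between 0 and 1 for anger,
    # contempt, disgust, fear, happiness, neutral, sadness and surprise.
    return [face["faceAttributes"]["emotion"] for face in response.json()]

# Example: score a single frame sampled from the debate footage.
scores = detect_emotions("frames/frame_00000.jpg")
```

Repeating this call for every sampled frame and joining the scores back to the debate timeline is what produces the second-by-second emotion series described above.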

Technologies Used

Python and Microsoft’s Face API.

Project members

Rodrigo Menegat - Data and story writing
Ariel Tonglet - Infographics and web development
Everton Oliveira - Multimedia production coordinator
Bruno Nogueirão - Video editing
Fabio Lúcio and Cláudio Fonseca Jr. - Camera crew
Bruno Ponceano, Augusto Conconi, Cecília do Lago - Collaboration

Link

Additional links
