Facebook Mood Manipulator Filters Content Based On How You Want To Feel

The browser extension lets users select the mood they want to be in, then filters their Facebook feed to match.

In response to the recent backlash Facebook received over its emotional contagion experiment, Brooklyn-based artist and programmer Lauren McCarthy developed a Google Chrome extension that lets users set how they would like to feel and changes the content the social networking site displays to suit that mood.

The emotional contagion experiment, conducted by Facebook’s Data Science Team, reportedly tried to manipulate how users felt by selectively omitting content from their newsfeeds. The researchers wanted to find out whether the content in a user’s newsfeed could affect their emotional state. The algorithm omitted posts containing words associated with positive or negative emotions from the newsfeeds of almost 700,000 users.

The Facebook Mood Manipulator lets users choose how they would like to feel and filters their Facebook feed accordingly. The linguistic analysis is done with Linguistic Inquiry and Word Count (LIWC), text analysis software developed by researchers at the University of Texas at Austin and the same software used in Facebook’s study.
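At its core, LIWC works by counting how many of a text’s words match curated category dictionaries. As a rough, hypothetical illustration of that approach (the tiny word lists below are invented stand-ins for LIWC’s licensed dictionaries, and this is not McCarthy’s actual code), a few lines of TypeScript can score a post:

```typescript
// Minimal sketch of LIWC-style scoring (illustrative only): count how many
// words in a post match each category's dictionary. These word lists are
// invented stand-ins for LIWC's licensed dictionaries.
const categories: Record<string, Set<string>> = {
  positive: new Set(["happy", "love", "great", "wonderful"]),
  negative: new Set(["sad", "hate", "awful", "terrible"]),
};

// Returns, for each category, the fraction of the post's words that
// appear in that category's dictionary (LIWC likewise reports category
// hits as a percentage of total words).
function scorePost(text: string): Record<string, number> {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  const scores: Record<string, number> = {};
  for (const [name, dict] of Object.entries(categories)) {
    const hits = words.filter((w) => dict.has(w)).length;
    scores[name] = words.length ? hits / words.length : 0;
  }
  return scores;
}

console.log(scorePost("What a wonderful, happy day"));
// -> { positive: 0.4, negative: 0 }
```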

[Image: Facebook Mood Manipulator]

After installing the browser extension, a white box called the Mood Manipulator appears in the upper right corner of the user’s Facebook feed. The box asks “How do you like to feel?” and features four sliding scales, labeled Positive, Emotional, Negative, and Open, that filter the content of the newsfeed. Users can slide each scale back and forth from “less” to “more,” and their newsfeed content changes accordingly. It’s not immediately obvious what changes, but users will be able to sense an underlying mood running through their feed. Reports say, for example, that sliding the Emotional scale to the right results in more posts about weddings and babies.
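Mechanically, a content script could implement those sliders along these lines: score each visible post on the four dimensions (reusing the LIWC-style word counting sketched above) and hide posts that clash with the slider positions. This is a speculative sketch of the general approach, not the extension’s source; the post selector, thresholds, and word lists are all assumptions.

```typescript
// Speculative sketch of slider-driven feed filtering; not the extension's
// actual code. Slider values are in [-1, 1], mapped from "less" to "more".
type Dimension = "positive" | "emotional" | "negative" | "open";

const sliders: Record<Dimension, number> = {
  positive: 0.8,   // "Positive" slid toward "more"
  emotional: 0,
  negative: -0.8,  // "Negative" slid toward "less"
  open: 0,
};

// Invented word lists standing in for the LIWC dictionaries.
const dictionaries: Record<Dimension, Set<string>> = {
  positive: new Set(["happy", "love", "great"]),
  emotional: new Set(["wedding", "baby", "feel"]),
  negative: new Set(["sad", "hate", "awful"]),
  open: new Set(["maybe", "wonder", "curious"]),
};

// Fraction of a post's words found in one dimension's dictionary.
function score(text: string, dim: Dimension): number {
  const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
  const hits = words.filter((w) => dictionaries[dim].has(w)).length;
  return words.length ? hits / words.length : 0;
}

// Hide posts that score high on a "less" dimension or low on a "more" one.
// The "[role='article']" selector is a guess at how feed posts are found.
function filterFeed(): void {
  document.querySelectorAll<HTMLElement>("[role='article']").forEach((post) => {
    const mismatch = (Object.keys(sliders) as Dimension[]).some((dim) => {
      const want = sliders[dim];
      const have = score(post.innerText, dim);
      return (want < 0 && have > 0.1) || (want > 0 && have < 0.02);
    });
    post.style.display = mismatch ? "none" : "";
  });
}

filterFeed();
```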

On her project page, McCarthy poses the question, “Why should Zuckerberg get to decide how you feel?” and encourages people to “take back control” and “manipulate your emotions on your own terms.” She also raises questions that push users to think beyond the Facebook experiment and the ethics of it all. She writes,

Yes, we are all freaked about the ethics of the Facebook study. And then what? What implications does this finding have for what we might do with our technologies? What would you do with an interface to your emotions?

Her Facebook Mood Manipulator project shows how users can take control of their social media feeds. It also shows how much the content people see on social media affects the way they use it and feel about it.

[Image: Facebook Mood Manipulator. Credit: Lauren McCarthy]

[h/t]: WSJ
