Search Engine Breaks Down Algorithmic Prejudice With Unexpected Content
A new kind of search engine that does not polarize information based on user preference and helps to burst the "Filter Bubble."
Whether you know it or not, search engines generate recommendations based on algorithmic formulas. These algorithms create a "filter bubble" tailored to show you topics you are likely to prefer or already be familiar with, shielding you from opposing points of view. Such polarization of topics poses a threat to society because it can create harmful divisions and limit knowledge. Luckily, Eduardo Graells-Garrido, Mounia Lalmas and Daniel Quercia discovered a way to burst that "filter bubble."
Unlike other search engines, this system points people toward each other based on overlapping interests and experiences. The trio believes that although people may hold opposing views on sensitive topics, their shared interests can be a strong foundation for meaningful dialogue.
To test their hypothesis, the researchers selected 40,000 Twitter users and analyzed their stances on abortion. With such a large sample, it was difficult to focus on a particular group, so the team trimmed the pool to a more manageable size: approximately 3,000 Twitter accounts were used for the final results. Each user was then assigned a unique word cloud, a kind of data portrait built from keywords in their tweets. Because these portraits let the researchers characterize individual users, it was easier to make recommendations that the users would not reject outright.
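The idea of a keyword-based data portrait can be illustrated with a short sketch. This is not the researchers' actual pipeline; the tweets, stopword list, and function name below are invented for demonstration, and the real study would use far more sophisticated text processing.

```python
from collections import Counter
import re

# Hypothetical sample tweets for one user; the study used real Twitter data.
tweets = [
    "Great hike today, love the outdoors and photography",
    "Photography tips: shoot at golden hour",
    "New recipe tonight, love cooking and photography",
]

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "and", "at", "a", "today", "new", "tonight"}

def data_portrait(user_tweets, top_n=3):
    """Return the user's top-n keywords, a crude 'word cloud'."""
    words = []
    for tweet in user_tweets:
        # Lowercase, split into word tokens, and drop stopwords.
        words.extend(
            w for w in re.findall(r"[a-z']+", tweet.lower())
            if w not in STOPWORDS
        )
    return Counter(words).most_common(top_n)

print(data_portrait(tweets))
```

Comparing two users' portraits for shared high-frequency keywords (here, something like "photography") is one simple way such a system could surface common ground between people who disagree elsewhere.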
Graells-Garrido's indirect approach to connecting people with opposing views has great potential. While changes in herd behavior cannot happen overnight, this new search engine does expose people to a greater range of content for discussion.
Source, Image: Technology Review