You are constantly confronted with news. The number of apps, sites and platforms you can follow on the internet is sometimes so overwhelming that you don't know where to start looking. This is when you check your Facebook news feed. Here you find a beautifully (although the beauty of it is arguable) arranged overview of the news you might be interested in. This overview is created by a smart algorithm, which according to Caplan and Boyd (2016) can be seen as an editor that shapes the news to your interests. But can we trust the news we see?
In last Sunday's episode of Tegenlicht, a television program of the VPRO, this trust was called into question. On the one hand, it is clear that if we allow ourselves to become a collection of data, we become manipulable. And if targeted advertising on social media can appeal to the subconscious mind, it means that political actors can influence us during elections in the same way. On the other hand, Facebook makes it possible to share knowledge and hold discussions about politics, which makes participation in democracy more accessible. Yet the news on which you base your opinions is filtered by Facebook. So the question remains: how can you be sure that what you see is what you get?
Boyd, danah, and Robyn Caplan. "Who Controls the Public Sphere in an Era of Algorithms?" Data & Society, 13 May 2016, pp. 1-19.