Discriminated against by computers: how algorithms and big data adopt and replicate inequality

‘Weapons of math destruction’ is what mathematician and writer Cathy O’Neil calls them (Janssen, 2016): big data and algorithms, both of which are increasingly used in media, politics, education and many other fields that shape our everyday lives.

Although algorithms themselves are blind, they always carry certain assumptions with them. Because machines are fed data that reflect the historical inequality dominating societies worldwide, these programs can not only repeat but even amplify such unjust social structures (Slob, 2017).
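To make that mechanism concrete, here is a minimal sketch in Python. The ‘hiring’ dataset, group names, and numbers are all invented for illustration, and the ‘model’ is deliberately naive: it only learns outcome frequencies from historical decisions, so it reproduces exactly the inequality it was trained on.

```python
from collections import Counter, defaultdict

# Invented historical decisions: groups A and B are equally qualified,
# but group A was hired far more often in the past.
history = (
    [("A", "qualified", "hired")] * 80
    + [("A", "qualified", "rejected")] * 20
    + [("B", "qualified", "hired")] * 40
    + [("B", "qualified", "rejected")] * 60
)

# "Training": count historical outcomes per group.
counts = defaultdict(Counter)
for group, _skill, outcome in history:
    counts[group][outcome] += 1

def predict(group):
    # Predict the historically most common outcome for the group,
    # replicating the past inequality one-for-one.
    return counts[group].most_common(1)[0][0]

for g in ("A", "B"):
    hired = counts[g]["hired"] / sum(counts[g].values())
    print(f"group {g}: past hire rate {hired:.0%} -> model predicts '{predict(g)}'")
```

Real systems are of course far more sophisticated, but the failure mode is the same: whatever regularities sit in the training data, including discriminatory ones, become the model’s predictions.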

Recent experiments with new technology confirm such fears: from a Google image-recognition program that labelled several Black people as gorillas to Microsoft’s chatbot, which needed only one day of learning from Twitter to start spewing antisemitic comments (Buranyi, 2017). A study of Google searches revealed “significant discrimination” (Sweeney, 2013, in Caplan and Boyd, 2016: p. 7) in the ads Google served alongside black-identifying names as opposed to white-identifying names: searches for black-identifying names were far more likely to return ads for arrest records. By using feedback given by users, the algorithm adopted existing racist structures and replicated them (Caplan and Boyd, 2016).
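The feedback loop behind such findings can be illustrated with a toy simulation. The sketch below is a hedged illustration under invented assumptions, not a description of Google’s actual ad system: ads are shown in proportion to the clicks they have accumulated, so even a small bias in users’ clicks compounds into a much larger skew in exposure.

```python
# Invented click rates: users click the stereotyped ad only slightly
# more often, reflecting an existing human bias the system learns from.
click_rate = {"arrest_record_ad": 0.12, "neutral_ad": 0.10}
weight = {"arrest_record_ad": 1.0, "neutral_ad": 1.0}  # ads start out equal

for _ in range(50_000):
    total = sum(weight.values())
    for ad, rate in click_rate.items():
        exposure = weight[ad] / total  # impressions proportional to weight
        weight[ad] += exposure * rate  # expected clicks reinforce the weight

total = sum(weight.values())
for ad, w in weight.items():
    print(f"{ad}: shown {w / total:.0%} of the time")
# The modest gap in clicks (0.12 vs 0.10) compounds into a heavily skewed
# exposure split: the system has adopted and amplified the users' bias.
```

The rich-get-richer dynamic is the point: each extra impression earns extra clicks, which earn extra impressions, so the algorithm does not merely mirror the users’ bias but magnifies it.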

Although algorithms can be seen as ‘accurate’, since they work with feedback from real people, that does not mean they are fair, O’Neil argues. The blind trust that people put in algorithms can be very dangerous, as algorithms can be ‘destructive and secret’ (Burack, 2017). As such, much can be said for increasing ‘algorithmic literacy’, as also put forward by Caplan and Boyd (2016). By making data and their workings accessible to a larger share of people, unwanted features could be weeded out more quickly and transparency could increase. By investing in such education, we can hopefully shift towards unbiased data that help society move forward rather than replicate our historical failings.

Literature

Burack, C. (2017). Algorithms are ‘existential threat’ to shared reality, says Cathy O’Neil. Retrieved from: http://www.dw.com/en/algorithms-are-existential-threat-to-shared-reality-says-cathy-oneil/a-40167802 on September 13, 2017.

Buranyi, S. (2017). Rise of the racist robots: how AI is learning all our worst impulses. Retrieved from: https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses on September 13, 2017.

Caplan, R., & Boyd, D. (2016). Who controls the public sphere in an era of algorithms? Data & Society, 1-19.

Janssen, G. (2016). Wiskundige Cathy O’Neil en de ‘weapons of math destruction’ [Mathematician Cathy O’Neil and the ‘weapons of math destruction’]. Retrieved from: https://www.vn.nl/cathy-oneil-en-weapons-math-destruction/ on September 13, 2017.

Slob, M. (2017). Ingebouwde aannamen in algoritmes houden sociale ongelijkheden in stand [Built-in assumptions in algorithms keep social inequalities in place]. Retrieved from: https://www.volkskrant.nl/opinie/ingebouwde-aannamen-in-algoritmes-houden-sociale-ongelijkheden-in-stand~a4515920/ on September 13, 2017.
