The Racist Music Algorithm

by Janusz Kaliszczak

We want big data algorithms to work for small venues, undiscovered bands, and artists from all walks of life. We often find that big data works against exactly these users, and our Listen Local Initiative was designed to change this trend.

Big data algorithms will increase injustice and breach social norms if they are trained on biased data, or if the algorithms themselves are biased. This is why we place so much emphasis on the transparency of both algorithms and input data. Facebook, Apple, and Spotify, for example, are often criticized for helping hate spread. But how can an algorithm turn racist?

Daniel Kraemer and Steve Holden recently reported for the BBC that Spotify, Apple Music, Deezer, and YouTube had been found hosting racist music, which the platforms removed after the BBC’s investigation.

On Spotify, public playlists and “suggested artist” algorithms make it easier to find extreme content. In some cases, users created playlists that collated songs and bands associated with the National Socialist Black Metal (NSBM) movement, mainly from Eastern Europe and Russia. Anti-Semitism and glorification of the Holocaust are common in their lyrics.

How can an algorithm become racist?

  • If an algorithm is trained on the tastes and habits of white users, it will be less likely to recommend Black music (i.e. genres with histories in Black communities, such as hip-hop or jazz, or Black artists making any genre of music) to anyone.

  • If music comes from a Black community (based on genre, location, or artists with a Black identity), a racist algorithm will be less likely to recommend it to white users, based on what it predicts white users typically listen to and like.

  • If the algorithm learns that certain users have listened to and liked music with racist content, it will provide them with more racist material.

Of course, you can replace Black and white with any minority and majority group: Slovak and American, or independent release and major release. European independent music makers may feel like a minority compared to the international majors. The problem with algorithms is that they can ruthlessly reinforce injustice, because they focus on learning how to do things better, faster, and at greater scale than before, but not in qualitatively different ways.
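To make the first two failure modes concrete, here is a minimal sketch of majority-skewed collaborative filtering in Python. Everything in it is invented for illustration: the user counts, like-rates, and artist names are assumptions, not any platform’s actual data or model.

```python
import numpy as np

# A minimal sketch of a popularity-driven recommender trained on a
# majority-skewed user base. All numbers are invented for illustration.

rng = np.random.default_rng(42)

n_majority, n_minority = 900, 100  # 90/10 split of users in the training set
artists = ["majority_pop", "minority_scene"]

# Both communities like their own scene equally strongly (80% like-rate),
# but the training set contains nine times more majority users.
listens = np.vstack([
    rng.random((n_majority, 2)) < np.array([0.8, 0.1]),  # majority users
    rng.random((n_minority, 2)) < np.array([0.1, 0.8]),  # minority users
]).astype(float)

# The crudest possible model: recommend whatever is most listened to overall.
global_scores = listens.mean(axis=0)
for name, score in zip(artists, global_scores):
    print(f"{name}: global score {score:.2f}")
```

Run it and the minority scene scores about 0.17 against roughly 0.73 for the majority scene, so a popularity-ranked feed would almost never surface it, even though it is liked just as strongly (0.8) inside its own community.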

What can we do?

  • Fight for the removal of inappropriate content from the training set. If you see hateful songs in languages other than English on these platforms, let us know, because English-language moderation is just the tip of the iceberg.
  • Make sure that the training sets of the algorithms behind sales or listening recommendations are inclusive. Our Listen Local Initiative is an early beta of a recommendation system that ensures locally relevant music gets recommended to locally relevant audiences, without having to fight a David and Goliath battle against international giants.
  • Allow the external verification of the algorithm and its results. We allow multiple levels of audit for all our algorithms, and we make all critical elements of our data products open source (a minimal sketch of one such audit check follows this list).
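The external-audit point can be made concrete with a tiny Python sketch. It illustrates only one possible check, exposure parity; the catalog, group labels, and recommendation log below are all hypothetical.

```python
from collections import Counter

# A hypothetical audit check: compare each artist group's share of
# recommendations against its share of the catalog. All data is invented.

catalog = {"artist_a": "local", "artist_b": "local",
           "artist_c": "major", "artist_d": "major", "artist_e": "major"}
recommendation_log = ["artist_c", "artist_d", "artist_c", "artist_e",
                      "artist_c", "artist_a", "artist_d", "artist_e"]

catalog_share = Counter(catalog.values())
rec_share = Counter(catalog[artist] for artist in recommendation_log)

for group in catalog_share:
    expected = catalog_share[group] / len(catalog)
    observed = rec_share.get(group, 0) / len(recommendation_log)
    print(f"{group}: {observed:.0%} of recommendations, "
          f"{expected:.0%} of the catalog")
```

Here local artists make up 40% of the catalog but receive only about 12% of the recommendations, exactly the kind of gap an external auditor would flag for investigation.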

A few signs that the tide is turning:

Justin Joffe: Capitalism Has Long Suppressed the Contributions of Black Musicians (2 February 2017)

Ben Sisario: The Music Industry Is Wrestling With Race. Here’s What It Has Promised (1 July 2020)

Photo credit: Janusz Kaliszczak.

This post was slightly edited for additional clarity by Emily on 25 February 2021.