In an opinion piece in NRC, Christine Moser argues for a more cautious approach to the use of algorithms. The motivation for the opinion piece is the news that the Dutch Healthcare Authority (NZa) is collecting privacy-sensitive information from mental health patients to feed an algorithm that is meant to predict healthcare demand.
Despite many bad examples from the past, such as the ‘toeslagenaffaire’ (the childcare benefits scandal) in the Netherlands, organisations like the NZa continue to rely blindly on algorithms, according to Moser, sometimes without any form of human intervention.
This blind trust has three main causes, according to Moser, who has researched this topic. "First, it is easy for organisations to express things in numbers, following the motto 'measuring is knowing'. It makes sense to use algorithms for this, since they can, after all, only deal with numbers. But not everything can be expressed in numbers. Everyone understands that a question like 'On a scale of 1 to 10, how does love/disappointment/anxiety feel?' doesn't cover it. Some things you simply cannot measure," she writes in the opinion piece in NRC.
"Secondly, algorithms are agnostic in terms of culture and environment: the algorithm does not care whether it is used in the Netherlands, the US or Madagascar. This is easy for organisations because they can implement the same algorithm everywhere. For people, the environment does matter," Moser continues.
The third reason is that the outcomes of algorithms, such as scores or percentages, are powerful and persuasive. Moser: "This sounds strange, because we normally think that we, the people, are in charge of technology. However, when faced with these numbers, it is very difficult not to go along with the logic of the number."
Moser does not advocate that algorithms should no longer be used at all in organisations. "We are past that stage and, besides, there are many issues and applications that would benefit from using algorithms. But what recent history shows is that algorithms are not suitable for addressing moral, ethical, and social issues. Even if the NZa's task of optimising waiting lists looks at first sight like a planning problem, they are using data in an unethical way," Moser writes.
Read more
The full opinion piece, in Dutch, can be read in NRC.
Moser based the opinion piece on several studies, including:
On the mechanization of values, Academy of Management Review
What humans lose when we let AI decide, MIT Sloan Management Review
Is technology a useful servant or a dangerous master? Business & Society
Morality in the age of artificially intelligent algorithms, Academy of Management Learning & Education
When algorithms rule, values can wither, MIT Sloan Management Review
For questions, please contact editor Yrla van de Ven, y.f.van.de.ven@vu.nl, +31 6 26512492