27 August 2019
A large Dutch municipality recently made the news because it deploys algorithms to detect social welfare scheme fraud. This caused protests among dozens of citizens, because the checks took place in neighbourhoods with a relatively high migrant population, which was considered discriminatory. ‘More and more local authorities are experimenting with a data-driven approach,’ says Rik Helwegen. ‘They are increasingly adopting machine learning algorithms to this end. That in itself does not necessarily produce adverse effects; but if and when ethnicity or other sensitive background characteristics of people influence the decision-making of such authorities, then this may give rise to ethical or legal objections.’
This prompted the young researcher to develop a practical method which ensures that the use of algorithms elicits impartial answers to complex questions. Helwegen based his Fair Trade method on causality. The method combines data with domain knowledge to estimate causal links, and uses these estimates to adjust the model’s outcomes. Helwegen says: ‘The underlying principle is that the outcome of the model must be the same if the parameter of a sensitive personal characteristic is switched, for example, a non-Western background for a Western background.’
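The principle Helwegen describes – that a model’s outcome should not change when the sensitive characteristic is counterfactually switched – can be illustrated with a small sketch. The toy structural model, variable names and scoring functions below are invented for illustration only; they are assumptions, not Helwegen’s actual Fair Trade implementation or the municipal data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Toy structural causal model (an assumption for illustration):
#   A   sensitive attribute (e.g. migration background), a root cause
#   U_z independent noise for the neighbourhood variable
#   Z   neighbourhood indicator, causally influenced by A
#   X   an income-related feature, independent of A
A = rng.integers(0, 2, n)
U_z = rng.random(n)
Z = (0.6 * A + 0.8 * U_z > 0.7).astype(float)
X = rng.normal(0.0, 1.0, n)

def naive_score(z, x):
    """Risk score that leans on Z, a proxy (descendant) of A."""
    return 0.8 * z + 0.2 * x

def fair_score(z, x):
    """Score restricted to X, which is not a descendant of A."""
    return 0.2 * x

def counterfactual_gap(score):
    """Largest change in the score when A is flipped and its causal
    effect on Z is propagated, holding the noise U_z fixed."""
    A_cf = 1 - A
    Z_cf = (0.6 * A_cf + 0.8 * U_z > 0.7).astype(float)
    return float(np.max(np.abs(score(Z, X) - score(Z_cf, X))))

print(counterfactual_gap(naive_score))  # large gap: Z carries A's influence
print(counterfactual_gap(fair_score))   # 0.0: flipping A changes nothing
```

The naive score discriminates indirectly – the neighbourhood variable acts as a proxy for the sensitive attribute, echoing the situation described above – while a score built only on non-descendants of the sensitive attribute passes the counterfactual test.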
Helwegen spent a few months conducting theoretical research in an experimental setting, after which he wanted to test the model in practice. ‘I approached several organisations and companies to do this. The City of Amsterdam responded immediately as they recognised the problem.’
Helwegen spoke to social domain experts at the City of Amsterdam and CBS. He then successfully tested his model on a sample of approximately 11,000 profiles of Dutch citizens who receive income support or have in the past been convicted of wrongly receiving benefit payments. He was given access to CBS’ System of Social Statistical Databases (SSB) for the experiment. The tests were conducted in the highly secure working environment of CBS under strict privacy conditions.
In recent times, a number of incidents have sparked the debate on the undesired consequences of using algorithms. Barteld Braaksma, innovation manager at CBS and Helwegen’s supervisor, says: ‘There are many examples of such incidents. A large, well-known American company, for instance, noticed that its algorithms for recruiting staff preferred men to women. This was because the programme was based on the past, a time in which more men were hired. In addition, more undesired effects emerged, which is why the company in question stopped using this approach. Such examples increase the awareness that we must monitor the disadvantages and risks associated with the use of algorithms. Ethical norms and standards play an important role in this regard. CBS’ wealth of information creates excellent opportunities to help other government organisations develop and test methods to preclude or minimise undesired side effects.’
Braaksma speaks highly of Helwegen’s research. ‘Not only because of its scientific value, but also because of its great social importance. Helwegen has managed to design a method to combat unfair algorithms in actual practice. In doing so, the use of CBS data was of crucial importance.’ The use of algorithms was also recently discussed in the Dutch House of Representatives. Several parties want stricter rules and a supervisory authority that monitors the use of algorithms by the government.
Tamas Erkelens is data innovation programme manager at the City of Amsterdam. He also acknowledges that the incorrect use of algorithms can be detrimental to certain groups in society. ‘The inequality of opportunity in Amsterdam and other cities is increasing. Algorithms – especially those used by non-government organisations – increasingly take into account people’s backgrounds, their education, their income, the neighbourhood in which they live, etc. This means that the digital society is potentially reinforcing discrimination against population groups. Amsterdam's ambition is that not a single algorithm used in the city discriminates or makes undesired decisions.’
Amsterdam wants to screen its own algorithms, but also wants other companies and organisations in the capital to develop fair algorithms. How does the local authority intend to do this? Erkelens says: ‘We will take various measures, including the introduction of fair algorithms as a tendering condition for the local authority. The establishment of quality marks can also contribute to fair algorithms. In addition, we are raising awareness of the use of algorithms by providing information on how they work and conducting discussions about them within the local authority with people from various disciplines. These include policy makers and specialists in the field of privacy and communication, but also Amsterdam residents. In the end, it is the municipal council – elected by the people of Amsterdam – that must impose conditions on algorithms.’ Erkelens is of the opinion that human rights such as the prohibition of discrimination must be enforced digitally, both nationally and at European level. ‘We can use technology to protect our digital rights.’
The City of Amsterdam and CBS have been working together for years, and for this project the City of Amsterdam once again explicitly approached CBS. According to Erkelens, there are three reasons for doing so. ‘First, CBS has the methodological skills and experience. CBS also has a highly secure working environment and anonymised data. And finally, we want what we develop to be made available as open source to other local authorities as well. CBS can help achieve this.’ Erkelens is also full of praise for the work carried out by Rik Helwegen: ‘It is not only a great study in terms of content, but it has also ensured – through the many discussions Helwegen had with various experts within the local authority – that employees have gained a better understanding of technological developments and changes.’
Rik Helwegen gained a Bachelor’s degree in Econometrics from the University of Amsterdam, and then dived into the world of artificial intelligence. His graduate research for his Master's degree in Artificial Intelligence was on 'Fairness in Machine Learning models using Causality’. He defended his thesis at the University of Amsterdam on 2 August 2019. Helwegen will also present the results of his study at the ‘Beyond Smart Cities today’ seminar to be held in Rotterdam on 18-19 September 2019. Helwegen has been awarded an innovation budget by the Ministry of the Interior and Kingdom Relations to conduct further research.