Unfortunately, our society is not yet free of inequality and discrimination. When you develop artificial intelligence in such an environment, you have to be careful that algorithms do not copy this inequality. UvA neuroinformatician Sennay Ghebreab is committed to AI technology that safeguards values such as equality and privacy. And he goes one step further: he develops artificial intelligence to promote equal opportunities in Amsterdam.

‘We are developing AI technology to expose inequality of opportunity in the city on the one hand, and to promote equality of opportunity on the other. To this end, we have set up the Civic AI Lab, a collaboration between the UvA, the VU, the City of Amsterdam, and the Ministry of the Interior. The lab is part of the national Innovation Center for Artificial Intelligence (ICAI) and has the ambition to be a leader in societal applications of artificial intelligence. In collaboration with the City of Amsterdam, we will tackle problems in various domains: education, healthcare, welfare, mobility, and environmental factors. Together with the Ministry, we are looking at scaling up government applications and research findings more broadly.

In the field of health, for example, we are looking at the first thousand days of a child’s life. These first thousand days influence life expectancy and the risk of illness, but also a person’s later position in education and on the labour market. Not all children have the same chance of a healthy start in life. There are many factors that affect the future: socio-economic circumstances, where you grow up, health, communication with parents, and parents’ love.

We work together with various parties involved in pregnancy, birth, and youth healthcare, and they have collected a lot of data. We want to use machine learning algorithms to see how we can integrate these data flows in Amsterdam and the surrounding areas, and we hope to be able to extract predictive factors from them. We do this with respect for fundamental human rights such as non-discrimination, equality, and privacy. This means that the algorithms we develop take differences in gender, race, and so on into account, but include them in the analysis of the data in a fair way. This is how we are working towards recommendations to improve equality of opportunity.
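
The kind of fairness check described here can be illustrated with a small, purely hypothetical sketch. The synthetic data, feature names, and demographic-parity comparison below are assumptions made for exposition, not the Civic AI Lab’s actual data or methods; the sketch simply trains a classifier and then compares, per group, how often it recommends extra support.

```python
# Hypothetical sketch of a fairness check on a predictive model.
# The data, features, and threshold are illustrative assumptions,
# not the Civic AI Lab's actual data or pipeline.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic records: one sensitive attribute ("group") plus two
# ordinary predictive features; the outcome is whether extra
# support would be recommended.
df = pd.DataFrame({
    "group": rng.integers(0, 2, n),          # stand-in for a protected characteristic
    "income_decile": rng.integers(1, 11, n),
    "prior_contact": rng.integers(0, 2, n),
})
df["needs_support"] = (
    (df["income_decile"] < 4) | (df["prior_contact"] == 1)
).astype(int)

X = df[["income_decile", "prior_contact"]]
y = df["needs_support"]
X_train, X_test, y_train, y_test, g_train, g_test = train_test_split(
    X, y, df["group"], test_size=0.3, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# Demographic-parity style check: compare how often the model flags
# people in each group. A large gap between groups would prompt a
# review of the model and the data before acting on its output.
rates = pd.Series(pred, index=g_test.index).groupby(g_test).mean()
print(rates)
print("selection-rate gap:", abs(rates.diff().iloc[-1]))
```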

Another project we are working on focuses on education in Amsterdam. Money is available from both the central government and the City of Amsterdam to eliminate educational disadvantages and inequality of opportunity in primary education. But the question is, of course, which policy really contributes to providing the best opportunities for all pupils. Money has been made available, but is it arriving in the right place, and is it having any effect? Using AI technology and smart data analysis, we look at whether it really contributes to equality of opportunity, and if not, how we can improve that.’