The Dutch Institute for Human Rights has commissioned research exploring the possible risks of discrimination and exclusion relating to the use of algorithms in education in the Netherlands.
KBA Nijmegen and ResearchNED list the ways that algorithms are used in educational settings: in personalised learning, when assessing student texts, for learning analytics, to detect or prevent fraud, and for student placement. In each of these domains, there is a risk that these systems will not work in the same way for all students.
The Institute for Human Rights gives three recommendations to the Dutch Ministry of Education (translated from Dutch):
- Prevent discrimination by digital systems in education.
- Test digital systems for inequality of opportunity.
- Provide information about the risks of discrimination and exclusion with digital systems.
These are a bit bland but could be helpful if taken seriously (particularly the second recommendation).
You can read the factsheet here, or the full report here.
See: "Overheid, help scholen te voorkomen dat digitale systemen hun leerlingen ongelijk behandelen" ("Government, help schools prevent digital systems from treating their pupils unequally") at College voor de Rechten van de Mens.
Image from the factsheet.