On Saturday the 13th of June, I submitted a report titled The Human Error in AI and Children’s Rights in response to the EU consultation on the White Paper on Artificial Intelligence – A European Approach. My response combines the findings of the Child | Data | Citizen Project with preliminary observations from my forthcoming project, The Human Error Project: AI, Algorithmic Bias and the Failure of Digital Profiling (launching September 2020).
In the report I argue that, thanks to AI innovation, citizens today are governed through data in ways that were not possible before. Predictive analytics is used by police forces and courts; biometric monitoring is common practice in border control; data-driven decision making is used by governments to decide fundamental matters such as welfare provision or child protection. The governing of citizens through data implies not only that citizens are being datafied from birth, but also that they are exposed to all sorts of algorithmic errors and inaccuracies. For this reason, we need political solutions that recognize the ‘human error’ in algorithms, and the fact that when it comes to human profiling, algorithms are always going to be inevitably inaccurate and biased. We also need political solutions that discourage the use of children’s data and the profiling of children. In this regard, I would personally encourage the EU Commission to add the following points to their White Paper On Artificial Intelligence – A European Approach to Excellence and Trust:
- Make sure that ALL AI systems that are trained to profile human beings and are used for data-driven decision making are considered high-risk.
- Make it a legal requirement for those private and public sector actors who wish to use AI technologies for automated decision making affecting individual rights to: a) inform citizens that a decision has been made by an AI system; b) offer the possibility to appeal the decision and request human oversight.
- Make sure that AI systems aimed at human profiling and decision making are not trained on data gathered from individuals before the age of 18.
- Make sure that all children’s data collected through adult profiles – which are not designed for or aimed at children, and hence are not required to abide by COPPA or the GDPR’s special protections for children – is deleted, and is neither used, sold, nor shared.
You can read the full report here: