Personalization of public services, biases and artificial intelligence: towards the consolidation of digital rights in public administrations Julio Ponce Sole (IP), Agusti Cerrillo Martinez (IP), Clara Isabel Velasco Rico, Joost Johannes Joosten, Ramon Lluis Galindo Caldes, Anahi Casadesus de Mingo, Oscar Capdeferro Villagrasa, Javier Miranzo Diaz
The purpose of this project is to define citizens' guarantees regarding the use of artificial intelligence by public administrations in the personalization of public services.
The project addresses a topic that is new with respect to previous research, on which there is little legal literature and which requires an interdisciplinary approach, guaranteed by the composition of the research team and the working group.
The project concerns the improvement of public services within the framework of the digital transformation of public administrations through artificial intelligence. This improvement will be studied from two perspectives: the personalization of public services and the fight against maladministration, fraud and corruption.
Although this improvement is linked to the right to good administration, it undoubtedly presents several risks that will be examined. First, the possibilities and limits of exercising discretionary administrative powers in a fully automated way. Second, possible software errors that affect the proper functioning of public services based on artificial intelligence. Third, biases, both the cognitive biases of programmers and those generated by machine learning. Fourth, the possible manipulation of citizens arising from the combination of behavioral sciences and artificial intelligence (hypernudging). Finally, excessive public technological dependence on the private sector, which could entail a surreptitious transfer of public functions to it.
The project will analyze the improvement of public services along the two lines mentioned, together with its risks, and will propose mechanisms to prevent and mitigate those risks (the use of nudges, organizational and procedural designs, the role of charters of digital rights...).