Data Protection and Digital Technologies

In Europe, irregular migration is approached through the lens of criminal law. This criminalisation of irregular migration is reflected in both EU and national legislation. It blurs the line between immigration policy and security or policing, prioritising a person’s irregular immigration status above all other considerations.

This is reflected in the growing use of personal data and digital technology to facilitate the policing of people who are undocumented, with damaging consequences for their health and safety.

In Europe, personal data is often shared when undocumented people report crime or mistreatment to the police, exposing them to detention and deportation and discouraging them from seeking help. Personal data is also used to “police” undocumented people who access health care, social services, and education. In addition to their harmful effects on people’s health and safety, these practices lead to racial profiling and discrimination. This is why we advocate for the creation of “firewalls”: the separation of immigration enforcement from people’s access to services.

The European Commission has recognised that the “gathering and use of biometric data for remote identification purposes, for instance through deployment of facial recognition in public places, carries specific risks for fundamental rights”. And yet, the EU is actively supporting initiatives to increase the use of these technologies at the border.

These practices persist despite the EU’s strong safeguards on the protection of personal data, which is a fundamental right under the EU Charter of Fundamental Rights and is protected by the General Data Protection Regulation (GDPR), applicable since 2018.

PICUM is gradually expanding its efforts to understand and monitor these fast-moving and far-reaching developments. This webpage assembles various resources that we have prepared on data protection and digital technologies, and sets out our basic recommendations in this area.

Our Report:

Data Protection, Immigration Enforcement and Fundamental Rights

PICUM’s Recommendations

International, European, national and city-level authorities should:
  • Revisit and reform their approach to irregular migration, moving away from a criminal justice model that perpetuates discrimination and inequalities based on class and race, and towards proportionate, humane and sustainable migration policies. Steps are already being taken in this direction, based on a growing international consensus in favour of non-custodial, community-based initiatives and away from the systematic use of immigration detention (incarceration) as a tool of immigration control.
  • Establish “firewalls” to ensure that personal data obtained when undocumented people access health care or social services, or report crime, is not repurposed for immigration control. This protects personal data and safeguards the right to privacy, as well as a host of other rights, including the right to due process, that are the bedrock of our democracies.
  • Closely review the implications of the use of technology in predictive policing and immigration control for communities of colour and other at-risk groups, and develop clear guidelines on policing and the use of personal data and algorithms. These guidelines should be based on meaningful input from and engagement with relevant stakeholders, including law enforcement, digital rights organisations, representatives of affected communities, non-governmental organisations, data protection authorities, and equality bodies. They should address data-driven profiling as a form of discrimination incompatible with fundamental rights, and clarify the strict standards that apply to any derogations.
  • Empower equality bodies, data protection authorities, and other relevant public bodies, and strengthen their capacity, to ensure accountability for the human rights and discrimination implications of digital technology and data processing.