
Travel authorization in the EU: automated processing and profiling

The ETIAS system will include automated tools even though the personal data in its information systems may be inaccurate, in a classic case of garbage in and garbage out.

Elspeth Guild and Niovi Vavoula
12 October 2020
"Joint Inspection and One-time Release" customs inspection in border channels of Hong Kong-Zhuhai-Macao Bridge, October, 2018.
|
Liang Xu/Xinhua News Agency/PA. All rights reserved.

On 12 September 2018, Regulation (EU) 2018/1240 established a European Travel Information and Authorisation System (ETIAS) with the aim of identifying whether the presence of visa-exempt third-country nationals – soon to include UK nationals – in the territory of the Member States would pose a security, irregular migration or high epidemic risk.

To those ends, the ETIAS Regulation foresees that visa-free travellers will be required to apply for a travel authorisation prior to the date of their departure, in a similar manner to the US ESTA. In a previous piece we analysed how the categories of personal data collected are disproportionate for pre-vetting travellers and concluded that the EU falls below the standards expected from a global leader in data protection law.

In this follow-up piece, we expose the flawed tools that will be used for processing the ETIAS applications.

The ETIAS will automatically compare the personal data submitted to: (a) the data already stored in EU information systems such as the Schengen Information System (SIS), the Visa Information System (VIS), Eurodac, the Entry/Exit System (EES), Europol data and certain Interpol databases; (b) an ETIAS watchlist; and (c) specific risk indicators. Such a comparison presupposes the implementation of interoperability amongst EU information systems, for which reason the Commission has adopted a package of two proposals. This package is currently under negotiation, with the report by the European Parliament Rapporteur imminently due.

Automated processing

Automated processing aims to identify whether the applicant’s data are listed in an information system for a reason that may justify refusing the travel authorisation (e.g. the applicant has previously been refused a visa). In this section we unpack concerns regarding the reliability of the data contained in EU information systems and the relevance of ECRIS-TCN, a future database on criminal records, which will be added to the EU information systems consulted during automated processing.
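To make the mechanics of this step more concrete, the sketch below shows, in heavily simplified form, how an automated cross-check against several systems could produce hits that divert an application to manual processing. It is purely illustrative: the names (`Application`, `Hit`, the query functions) are invented and do not come from the Regulation or from eu-LISA’s actual design.

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified representation of an ETIAS application.
@dataclass
class Application:
    surname: str
    first_name: str
    date_of_birth: str       # e.g. "1990-05-17"
    travel_document_no: str
    nationality: str

@dataclass
class Hit:
    system: str   # e.g. "SIS", "VIS", "EES", "ETIAS watchlist"
    reason: str   # e.g. "refusal-of-entry alert", "previous visa refusal"

def cross_check(app: Application, systems: dict) -> list:
    """Query each connected system and collect any matches.

    `systems` maps a system name to a query function. In reality each
    system applies its own matching rules (exact, fuzzy, biometric),
    which are abstracted away here.
    """
    hits = []
    for name, query in systems.items():
        reason = query(app)            # returns a reason string or None
        if reason is not None:
            hits.append(Hit(system=name, reason=reason))
    return hits

def decide(app: Application, systems: dict) -> str:
    """Any hit stops fully automated issuance and sends the file to manual processing."""
    return "manual review" if cross_check(app, systems) else "authorisation issued automatically"

if __name__ == "__main__":
    systems = {
        "SIS": lambda app: None,                      # no alert found
        "VIS": lambda app: "previous visa refusal",   # simulated hit
    }
    applicant = Application("DOE", "JANE", "1990-05-17", "X1234567", "GBR")
    print(decide(applicant, systems))                 # -> manual review
```

The essential point for what follows is that a single record in any of the queried systems is enough to take an application out of the fully automated path, so the quality of those records determines the quality of the outcome.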

In Opinion 1/15 on the agreement between the EU and Canada on the transfer of passenger name record (PNR) data to Canadian authorities, for example, the EU Court of Justice considered that the automated processing of passengers’ data would take place on the basis of pre-determined criteria and cross-checking against various Canadian databases. It held that the databases cross-checked must be reliable and up to date, that the pre-established models and criteria must not be discriminatory, and that they must take account of statistical data and the results of international research.

Reliability and relevance of personal data in information systems

The quality of stored personal data has been a longstanding problem of the currently operational EU information systems. Spelling errors, lack of documentation and insufficient language skills are only some of the reasons why EU information systems may record poor-quality data. For example, in November 2019 the European Court of Auditors stressed that, although eu-LISA performs automated monthly data quality checks on certain SIS alerts, potential data quality issues are not sufficiently addressed at the national level.

If the stored information is not of sufficient quality, automated processing may produce incorrect results and false hits, with significant repercussions for third-country nationals. Concerns are also raised in relation to SIS alerts on individuals who should be subject to discreet checks or specific checks: in France, for example, alerts on discreet checks appear to have been registered en masse in response to the terrorist attacks of 2015.
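As a purely illustrative example of how poor-quality records translate into false hits, consider name matching. Some tolerance for spelling and transliteration variants is unavoidable in systems of this kind, but combined with a misspelled alert it can pull in the wrong person entirely. The names, scoring and threshold below are invented and do not reflect the actual matching rules of any EU system.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1], used here as a stand-in for real matching rules."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

stored_alert = {"surname": "HUSEIN",  "first_name": "MOHAMAD"}   # misspelled record
applicant    = {"surname": "HUSSEIN", "first_name": "MOHAMMED"}  # a different person

THRESHOLD = 0.8  # invented tolerance for spelling variants

score = (similarity(stored_alert["surname"], applicant["surname"])
         + similarity(stored_alert["first_name"], applicant["first_name"])) / 2

print(f"similarity: {score:.2f}")
if score >= THRESHOLD:
    print("false hit: application diverted to manual processing")
```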

The relevance of ECRIS-TCN

A key amendment of the package involves the addition of ECRIS-TCN – a centralised database that will assist in the identification of the Member State(s) holding criminal record information on a third-country national – to the information systems for automated processing of ETIAS applications.

The use of ECRIS-TCN for ETIAS-related purposes allows the data to be processed for purposes other than those for which they were initially collected. It is an example of ‘function creep’, which is difficult to reconcile with the purpose limitation principle.

Further, the need for automated processing against ECRIS-TCN is doubtful in view of the operation of SIS under Article 24 of Regulation 2018/1861 (SIS border checks). SIS includes a category of alerts on third-country nationals who are unwelcome because they have been convicted in a Member State of an offence carrying a penalty involving the deprivation of liberty of at least one year. Whereas Article 21 of Regulation 2018/1861 requires Member States to determine whether a case is ‘adequate, relevant and important enough to warrant an alert in SIS’, terrorist offences are excluded from that proportionality assessment.

In practice, in relation to terrorist offences the overlap between SIS and ECRIS-TCN will be complete: if a third-country national convicted of a terrorist offence applies for an ETIAS authorisation, the automated processing of their data against SIS will necessarily generate a match. As for serious offences, the extent of the overlap is opaque. However, since the purpose of these alerts is precisely to prevent the entry of unwelcome third-country nationals, the cases that warrant an alert in SIS for refusal of entry and stay are, and will continue to be, recorded there. It is regrettable that the Commission did not conduct any prior evaluation of the extent of the future overlap between SIS and ECRIS-TCN; the necessity of the addition must therefore be reconsidered in a future evaluation of the SIS and the ETIAS.

Defining who should be excluded: a new watchlist and an algorithm

Article 34 of the ETIAS Regulation provides for a new EU watchlist of persons suspected of having committed, or taken part in, a terrorist offence or other serious criminal offence. It will also include persons in respect of whom there are factual indications or reasonable grounds to believe that they will commit such offences.

There is no territorial limit to the convictions, nor to the suspicion. The definition of a terrorist offence is cross-referenced to Directive (EU) 2017/541 on combating terrorism and is wide: it would capture, for instance, Hong Kong residents who have campaigned against the imposition of Chinese security legislation in the territory and who have been targeted as ‘terrorists’ by the Beijing regime.

The term ‘other serious criminal offence’ is equally poorly defined, through cross-reference to the list of European Arrest Warrant (EAW) offences for extradition. Using this list for the ETIAS watchlist, with no limitation to the Member States, magnifies exponentially all the criticisms that have been levelled against it. The lack of a common understanding among Member States of the essence of the offences (for instance of ‘rape’, which resulted in the unsuccessful attempt to extradite Julian Assange from the UK to Sweden in 2012) becomes far greater when the conviction could have occurred in any country in the world.

Further, it is feared that refugees will be among the most immediate victims of this list, as they are frequently targeted by their home states as ‘terrorists’ or ‘serious criminals’: for example the Uighurs in China, or the supporters of Venezuela’s alternative president, Juan Guaidó, who are subject to prosecution for serious crimes and/or terrorism in Maduro’s Venezuela. Finally, the purpose of a new watchlist remains poorly justified. As the EU already has the SIS list of persons to be excluded, which includes criminal convictions (at least within the EU), this new watchlist does not appear to add anything except uncertainty.

In addition to a new watchlist, the ETIAS Regulation provides for an algorithm to profile applications against pre-determined risk indicators related to security, illegal immigration or high epidemic risks. The algorithm will rely on machine learning to determine which applications are dodgy, on the basis of EES statistics on abnormal rates of overstaying and refusals of entry, ETIAS statistics on rates of refusals, and statistics on overstaying broken down by profile and Member State.
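What such screening could look like in practice can be sketched as follows. This is a minimal illustration with entirely hypothetical indicator values; the Regulation leaves the definition of the actual indicators to the ETIAS bodies, so nothing here reflects the real rules.

```python
# Purely illustrative sketch of screening against pre-determined risk indicators.
# Each indicator is a combination of attributes (cf. age range, sex, nationality,
# residence, education, occupation); the values below are invented.
RISK_INDICATORS = [
    {"nationality": "XX", "age_range": (18, 30), "occupation": "student"},
    {"country_of_residence": "YY", "sex": "M", "age_range": (20, 40)},
]

def matches(application: dict, indicator: dict) -> bool:
    """True if the application fits every attribute of the indicator."""
    for key, expected in indicator.items():
        if key == "age_range":
            low, high = expected
            if not (low <= application["age"] <= high):
                return False
        elif application.get(key) != expected:
            return False
    return True

def screen(application: dict) -> str:
    """Any indicator hit flags the file for manual assessment rather than automatic refusal."""
    hits = [ind for ind in RISK_INDICATORS if matches(application, ind)]
    return "flagged for manual assessment" if hits else "no indicator hit"

print(screen({"nationality": "XX", "age": 24, "occupation": "student",
              "sex": "F", "country_of_residence": "ZZ"}))   # -> flagged
```

Under the Regulation an indicator hit does not refuse the application outright but routes it to manual processing; the concern, discussed next, is what feeds the indicators in the first place.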

As with all algorithmic tools trained on past decision-making, it will replicate the implicit and inherent biases of those earlier decisions. In August 2020, the UK Home Secretary chose to withdraw an algorithm used in visa applications when challenged by an NGO on the basis, inter alia, of racial discrimination, rather than try to justify its use before the courts.


The ETIAS Regulation actually sets out the specific risk indicators which the algorithm will be programmed to assess: age, sex, nationality, country and city of residence, level of education and current occupation (Article 35). In recognition that this list includes a series of grounds of prohibited discrimination under the 1965 UN Convention on the Elimination of All Forms of Racial Discrimination and the EU Charter, it states that the risk indicators will be proportionate.

It is unclear what that could possibly mean in practice. Further, the provision states that ‘in no circumstances’ should the algorithm target individuals on the basis of information revealing colour, race, ethnic or social origin, genetic features, language, political or any other opinion, religion or philosophical belief, trade union membership, membership of a national minority, property, birth, disability or sexual orientation. Yet the list to be used obviously discriminates on exactly these grounds!

Designating sex may often indicate sexual orientation, city of residence will often reveal ethnicity and colour, and level of education and current occupation will often be an indicator of property and also of trade union membership. The provision is profoundly internally contradictory: on the one hand it encourages the technical designers of the algorithms to insert prohibited grounds of discrimination as measures for screening, while on the other it prohibits the same in the actual operation of the system. But the prohibition is in a subsection well below the rule, a clear indication of hierarchy and the subordination of Charter rights to the replication of existing biases.
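This mechanism of indirect discrimination through apparently neutral attributes can be illustrated with a toy simulation. Everything below is synthetic and invented: ‘district’ stands in for city of residence, ‘group’ for a protected characteristic, and the probabilities are chosen only to make the effect visible.

```python
import random

random.seed(0)

def make_person() -> dict:
    """Synthetic population in which group B mostly lives in district 2."""
    group = random.choice(["A", "B"])
    if group == "B":
        district = 2 if random.random() < 0.9 else 1
    else:
        district = 2 if random.random() < 0.1 else 1
    return {"group": group, "district": district}

people = [make_person() for _ in range(10_000)]

def risk_score(person: dict) -> float:
    """A 'neutral' rule that never looks at `group`, only at `district`."""
    return 1.0 if person["district"] == 2 else 0.1

flagged = [p for p in people if risk_score(p) >= 0.5]

share_b_overall = sum(p["group"] == "B" for p in people) / len(people)
share_b_flagged = sum(p["group"] == "B" for p in flagged) / len(flagged)

print(f"Group B share of population: {share_b_overall:.0%}")   # roughly 50%
print(f"Group B share of flagged:    {share_b_flagged:.0%}")   # roughly 90%
```

A rule that never reads the protected attribute still reproduces it almost perfectly whenever the ‘neutral’ attributes it does read are strongly correlated with it, which is exactly the concern about city of residence, occupation and the other listed elements.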


The ETIAS system is flawed and shows little respect for the privacy rights of applicants. It will include automated tools even though the personal data in its information systems may be inaccurate, in a classic case of garbage in and garbage out.

As if the EU did not have enough ‘watchlists’, ETIAS creates yet another one, which permits, or at least authorises, the refusal of an ETIAS authorisation on the basis of politically motivated convictions by totalitarian regimes.

Finally, the ETIAS Regulation not only appears to condone unlawful discrimination, but hard-wires it into its screening process through the elements on which risk profiles will be constructed for algorithmic tools to ‘weed out’ applicants on dubious grounds such as education, occupation or age.

Regrettably, indirect discrimination hides behind exactly these apparently neutral elements, seeking to disguise the true nature of the profiling process, which if revealed would contravene not only the EU Charter but the Member States’ international human rights obligations.

