Here’s something I’m embarrassed to admit: Even though I’ve been reporting on the problems with facial recognition for half a dozen years, I have allowed my face to be scanned at airports. Not once. Not twice. Many times.
There are lots of reasons for that. For one thing, traveling is stressful. I feel time pressure to make it to my gate quickly and social pressure not to hold up long lines. (This alone makes it feel like I’m not truly consenting to the face scans so much as being coerced into them.) Plus, I’m always getting “randomly selected” for additional screenings, maybe because of my Middle Eastern background. So I get nervous about doing anything that might lead to extra delays or interrogations.
But the main reason I haven’t declined airport face scans is actually very simple: I had no idea I could opt out.
It turns out that saying no is not only doable, but surprisingly easy — at least in theory. Everyone, regardless of citizenship, can opt out when it comes to domestic flights in the US. (For international flights, US citizens can opt out, but foreign nationals have to participate in face scanning, with some exceptions.) Simply stand away from the camera or keep your face covered with a mask, present your ID, and say, “I opt out of biometrics. I want the standard verification process.”
In theory, an officer is then supposed to manually look over your ID and compare it to your face, as they used to do before facial recognition. But in practice, there have been reports of passengers — even a senator — facing resistance or intimidation when they try to go this route.
The Transportation Security Administration (TSA) and Customs and Border Protection (CBP) are also supposed to have clear signs informing passengers of the right to opt out. But at many airports, you have to look really, really hard to spot that message. Be prepared to crane your neck at an unnatural angle or squint at a very small font!
This is why the Algorithmic Justice League, a nonprofit that sheds light on AI harms, launched a campaign this month called “Freedom Flyers” to raise awareness of your right to opt out. The timing is perfect: The TSA recorded its busiest day for air travel on record on June 23, screening nearly 3 million people at the country’s airports as summer vacation season picked up.
Now is the ideal time to make sure you know your rights when you pass through airport security — and understand exactly what’s at stake. The implications go way beyond air travel.
How facial recognition works at the airport
In the US, over 80 airports are currently piloting facial recognition technology. The TSA’s goal is to roll out the tech in all of the more than 430 airports that it covers, arguing that this kind of automation would reduce “friction” at airports — meaning, presumably, how long it takes passengers to move through security.
That should raise some eyebrows, because there are known risks with this AI technology, from the possibility that your face data will be stolen in a breach to the chance that you’ll be misidentified as a criminal suspect — and jailed. Neither of these is a hypothetical scenario: the former has happened due to CBP system vulnerabilities, and the latter has happened at the hands of police. And then, of course, there’s AI bias; facial recognition tech is known to disproportionately misidentify people of color. (A CBP spokesperson insisted that the agency’s facial comparison algorithm “shows virtually no measurable differential performance in results based on demographic factors.”)
But as dangerous as facial recognition can be if it goes wrong, a greater concern could be what happens if it’s seen to work as intended. When I asked Joy Buolamwini, the founder of the Algorithmic Justice League, what worries her about the use of this tech in airports, she said, “The big one for me is normalizing surveillance.”
Buolamwini argued that airport face recognition is a way of acclimating the public to having more and more sensitive information taken. “I see this on a longer trajectory,” she said. “And they’ve shown you the trajectory.”
She was referring to a roadmap released in 2018 by the TSA. It distinguishes between two types of facial recognition: There’s one-to-one matching, where the TSA compares the photo in your passport with the photo they take of you at the airport, to make sure that the photos match. (If you ever use your face to unlock your iPhone, this is the kind of facial recognition you’re using.)
Then there’s one-to-many matching, where your image is compared with images of many other people. One-to-many matching is already in use by CBP and its airline partners, which compare a passenger’s photo to a database of government documents (like US passports) for verification, TSA press secretary Carter Langston told me by email.
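The difference between the two modes is easy to see in code. Here’s a minimal toy sketch in Python — emphatically not the TSA’s or CBP’s actual system. The embeddings, threshold, function names, and gallery are all invented for illustration; real systems derive face embeddings from neural networks and operate at vastly larger scale.

```python
import math

# Toy face "embeddings": in real systems these are vectors produced by a
# neural network from a face image; here they are hand-made numbers purely
# to illustrate the two matching modes.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.9  # arbitrary illustrative cutoff, not a real system's value

def one_to_one(live_scan, id_photo):
    """1:1 verification: does the live scan match THIS document's photo?"""
    return cosine_similarity(live_scan, id_photo) >= THRESHOLD

def one_to_many(live_scan, gallery):
    """1:N identification: which entry in a whole database best matches?"""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(live_scan, embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= THRESHOLD:
        return best_name, best_score
    return None, best_score

# Hypothetical data for the demo.
passport_photo = [0.9, 0.1, 0.3]
live_scan = [0.88, 0.12, 0.31]
gallery = {
    "traveler_a": [0.9, 0.1, 0.3],
    "traveler_b": [0.1, 0.9, 0.2],
}

print(one_to_one(live_scan, passport_photo))  # True: scan matches the document
print(one_to_many(live_scan, gallery))        # best match found in the gallery
```

The key asymmetry: one-to-one matching only needs your document and your face, while one-to-many matching requires a pre-built database of many people’s images — which is exactly why privacy advocates treat the two very differently.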
A particularly worrisome form of one-to-many matching is live biometrics. “Live biometrics is the Minority Report kind of thing — where you’re just walking around and they can identify you,” Buolamwini said. And if everyone’s face becomes fair game for live biometrics, your likeness could one day be checked against a criminal database any time you walk through a drugstore or show up at a protest, which may create a dangerous chilling effect across society.
The TSA’s own 2018 roadmap says they aim to use “live biometrics” in the future. However, Langston disputed Buolamwini’s interpretation of that term. “That interpretation of TSA’s use case is nothing that I have heard anyone involved in the program indicate. TSA’s use case is and continues to be about identity verification,” he told me.
For now, Buolamwini said, “You might hear people say ‘Oh, we’re only doing one-to-one matching. You show us your ID, you show us your face, and we delete the data.’” But, she stressed, the full story is more complicated.
You have the right to opt out of facial recognition tech. Here’s how.
www.vox.com