The Department of Homeland Security will forge ahead with plans to implement its problematic Vehicle Face System (VFS), an AI-powered facial recognition system, at the US/Mexico border.
After years of development, the federal government will install VFS in Texas at the Anzalduas border crossing. Once the system is in place, every person driving across the border will have a photograph of their face taken so their identity can be cross-referenced with various government databases, according to documents obtained by The Verge.
The VFS system was developed to capture images of vehicle occupants through windshields. It can disregard reflections and, using depth sensors and various other sophisticated hardware and AI components, identify the occupants of a vehicle.
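To make the cross-referencing step concrete, here is a minimal sketch of how one-to-many face matching generally works: an embedding computed from a captured photo is compared against a gallery of stored embeddings and only accepted as a match above a similarity threshold. Everything below (the embedding size, the threshold, the random "gallery") is a hypothetical placeholder for illustration; none of it is drawn from DHS documentation about VFS.

```python
# Illustrative sketch only: generic one-to-many face matching against a
# watchlist of stored embeddings. Not a description of DHS's VFS pipeline.
import numpy as np

EMBEDDING_DIM = 128     # common face-embedding size (assumption)
MATCH_THRESHOLD = 0.6   # similarity cutoff chosen purely for illustration


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_gallery(probe: np.ndarray, gallery: dict) -> tuple:
    """Return (best_matching_id, score), or (None, score) if no record
    clears the threshold."""
    best_id, best_score = None, -1.0
    for identity, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = identity, score
    if best_score < MATCH_THRESHOLD:
        return None, best_score
    return best_id, best_score


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-ins for embeddings a real system would compute from photographs.
    gallery = {f"record-{i}": rng.normal(size=EMBEDDING_DIM) for i in range(1000)}
    probe = gallery["record-42"] + rng.normal(scale=0.05, size=EMBEDDING_DIM)
    print(match_against_gallery(probe, gallery))
```

The threshold in a sketch like this is exactly where the trade-off Brackeen describes below lives: set it loose and you get false matches and misidentifications; set it tight and the system fails on poor-quality, through-the-windshield images.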
The government intends to roll out VFS to process images of every passenger and driver in each vehicle crossing the border, in all lanes, in both directions.
We spoke with Brian Brackeen, CEO and Founder of Kairos, a facial recognition technology company, to see what he thought of the rollout. “The recent reports of face recognition surveillance at the US-Mexico border are troubling. And highlights again the human rights implications of selling facial recognition software to governments,” he said.
Despite the fact that his company makes and sells facial recognition technology, Brackeen believes that large-scale surveillance is unethical and has urged companies such as Amazon to stop providing the government with technology it can use to watch us. He continued: “The DHS’s mandate is clear. However, this goes beyond protecting our borders. This is a step closer to omniscient, ‘always on’ surveillance of society. The US government has deliberately designed camera technology for the purpose of peering into vehicles, through windows, to gather facial profiles of drivers and passengers. All without their permission or knowledge. This is HIGHLY intrusive and wrong.”
It appears the US government is absolutely determined to deploy AI solutions that will give it a ubiquitous surveillance state wherein citizens have no right to privacy. Any camera that can see through a windshield can see through a window (and facial recognition through walls is already a reality).
We currently live in a world where billions of people walk around with a camera in their pocket, and we’re subject to having our pictures taken whenever we’re in public. Most people are fine with this, and most of us accept that we’re being recorded in stores, on the streets, and in our places of employment. The difference here is that we know we get recorded at a gas station in case someone tries to rob it: they (the business, law enforcement, courtrooms, etc.) can go back and check the tape.
It’s the same at the border. If law enforcement needs to check the footage from a certain time and date, it has that option now; most border crossings have CCTV cameras. But adding AI and facial recognition to the mix means we have to trust the government. We have to have faith that the federal government isn’t relying on biased data or imperfect algorithms, and isn’t using the data gained from such surveillance for unethical purposes.
Brackeen argues that’s a leap too far. “Now, introduce the very real shortcomings of facial recognition technology, its history of poor match rates in these scenarios, and the misidentification of individuals based on their appearance. Magnify all that by the prejudices and biases that exist in law enforcement and our security agencies—it’s a recipe for disaster,” he said.
DHS didn’t ask permission when it developed the training data for VFS — it took thousands of pictures of people for the purpose of developing an AI for surveillance without informing the general public it was doing so. And it won’t ask your permission in August when it deploys VFS in Texas. It follows, then, that we won’t be in the loop as these technologies continue to pop up all over our country.
In 2014, Edward Snowden, speaking to The Guardian, said, “No system of mass surveillance has existed in any society, that we know of to this point, that has not been abused.”
Read the source article at The Next Web.