“Democracy of the Gullible”: Becoming Invisible in a Constantly Monitored AI World
In a world where we are constantly monitored, computers have become adept at identifying people, from tracking criminals to countering terrorism. Finding ways to beat facial recognition systems is a modern-day challenge. Activists fear for their lost privacy, worried that society might be at the doorstep of a dystopia where Big Brother sees it all. A group of artists is trying to make a statement by becoming invisible.
Facial recognition systems are becoming more pervasive. Until now, most attempts to defeat facial recognition were illustrations or academic projects experimenting with face-jamming. The spread of COVID-19 has fueled increased surveillance. People in the UK are being scanned, misidentified and wrongly stopped by police as a result. Police allegedly stopped people for covering their faces or wearing hoods, and one man was fined for refusing to be scanned. Facial recognition wrongly identifies members of the public as potential criminals 96% of the time.
Facial Recognition Technology Will Change Our Lives
Who is using Facial Recognition?
- U.S. government at airports. Facial recognition systems can monitor people coming and going in airports.
- Mobile phone makers in products. Apple first used facial recognition to unlock its iPhone X, and continues with the iPhone XS. Face ID authenticates the user; Apple says the chance of a random face unlocking your phone is about one in 1 million.
- Colleges in the classroom. Facial recognition software can, in essence, take roll. If you decide to bunk classes, your professor could know.
- Social media companies. Facebook uses an algorithm to spot faces when you upload a photo to its platform, with 98% accuracy.
- Retailers in stores. They combine surveillance cameras and facial recognition to identify suspicious characters and potential shoplifters.
With the number of people wearing face masks, scarves, and bandannas, there is some possibility that these coverings will make facial recognition harder to implement.
Real-time Facial Recognition Challenges our Privacy
Historically, Black people have been disproportionately oppressed and targeted by surveillance technology. Worldwide protests against police brutality in support of the Black Lives Matter movement continue in the wake of George Floyd’s killing.
Protecting oneself against mass surveillance while rallying has become a pressing issue. The prospect is ominous, and civilians are considering ways to resist being tracked and profiled everywhere they go. Even Silicon Valley has admitted that facial recognition technology is toxic.
Clearview AI, an American technology startup, provides facial recognition software used by private companies, law enforcement agencies, universities and individuals. The company has been in talks with state agencies about using its technology to “track patients infected by the coronavirus”. China’s facial recognition software is linked to phone apps that identify people based on their contagion risk and determine when they’re cleared to enter an array of public spaces. Russia’s facial recognition technology is being deployed to track people who violate quarantine orders.
In response to COVID-19, companies are upgrading their facial recognition systems with sophisticated algorithms that can recognize masked faces.
Artificial intelligence companies are exploiting these opportunities, promising benefits that go beyond immediate health concerns, such as crowd management, crime prevention, parental controls, counterterrorism and upgraded transportation.
It is natural to feel fearful and conflicted about surveillance in times of crisis. Lawmakers are pondering how best to ensure public welfare while maintaining our civil rights.
The New York Times has reported:
- Clearview’s Facial Recognition App Is Identifying Child Victims of Abuse
- Video Games and Online Chats Are ‘Hunting Grounds’ for Sexual Predators
We Are a Bunch of Codes – Facts & Fiction
Facial recognition works by mapping facial features – mainly the eyes, nose and chin – by identifying dark and light areas, then calculating the distances between them. That unique facial fingerprint (faceprint) is then matched against others in a database.
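The matching step described above can be sketched at toy scale. This is a minimal illustration, not any vendor’s actual algorithm: it assumes faceprints have already been extracted as small numeric vectors, and the names (`faceprint_distance`, `match`), the landmark values, and the threshold of 0.6 are all invented for the example.

```python
import math

def faceprint_distance(a, b):
    """Euclidean distance between two faceprints (feature vectors)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(probe, database, threshold=0.6):
    """Return the name of the closest enrolled faceprint, or None
    if nothing falls within the matching threshold."""
    best_name, best_dist = None, float("inf")
    for name, enrolled in database.items():
        d = faceprint_distance(probe, enrolled)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical database: each faceprint encodes distances between
# landmarks (eye spacing, nose-to-chin length, ...), scaled to [0, 1].
db = {"alice": [0.42, 0.61, 0.33], "bob": [0.55, 0.48, 0.71]}
print(match([0.41, 0.60, 0.35], db))  # probe close to alice's faceprint
```

Face-jamming techniques like CV Dazzle work by disrupting the earlier stage – landmark detection – so the system never gets a clean vector to feed into a comparison like this one.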
One option that is regularly tossed around is confusing identification systems with clothing, makeup, and accessories that obscure and distort our appearance.
CV Dazzle Makeup Beats Facial Recognition
Fashion designers and artists are coming up with novel ways to confuse intrusive cameras, in the hope of making facial recognition harder to deploy against us.
Emily Roderick and her cohorts from The Dazzle Club walked around London last week with red, blue, and black stripes painted across their faces in an effort to escape the watchful eyes of facial-recognition cameras. Roderick said camera camouflage is helping people reclaim their identities, and she hopes the initiative will spread to other cities around the world.
Altering people’s looks to cheat cameras has become increasingly popular with artists and designers in recent years. Growth in the use of facial recognition has raised fears over privacy, say fashion experts.
Defying Facial Recognition
Fashion Masks and T-Shirts
Many other artists, designers and technologists have been inspired by these attempts to hide without covering the face. Jing-cai Liu, a design student, created a wearable face projector, while Dutch artist Jip van Leeuwenstein made a clear plastic mask that creates the illusion of ridges along the face.
Another online retailer, Adversarial Fashion, sells shirts, skirts and other garments emblazoned with fake licence plates that get picked up by traffic surveillance cameras, “injecting junk data” into the systems used “to monitor and track civilians”.
Chicago-based designer Scott Urban has developed sunglasses that block infrared facial recognition cameras.
A makeup technique known as CV Dazzle, pioneered by artist Adam Harvey, tricks facial recognition software into thinking you are not a human face at all.
Harvey is skeptical about gait recognition but has other concerns about infrared, thermal and polarimetric techniques which measure heat and the polarization qualities of skin.
“Giving too much power to the police or any security agency provides the conditions for authoritarian abuse. It’s like pollution – there’s a cost which we are ignoring. Hopefully this provides an alert to people that it’s so easy – you can turn on the system in one day, and the cost is so low, that there aren’t any built in frictions that discourage people from using it.” – The Guardian
While makeup presents some challenges to facial recognition, it can still be beaten by other technologies, said Dr. Sasan Mahmoodi, a lecturer in computer vision at the University of Southampton. “They cannot do makeup on their ears,” he said. “It’s difficult to change your ear, although you can hide it with a hat or hair.”
Zach Blas’s Facial Weaponization Suite protests biometric facial recognition – and the inequalities these technologies propagate – by making “collective masks”.
There is also gait recognition – people have distinctive walking patterns. Already used by police on the streets of Beijing and Shanghai, gait recognition is part of a push across China to develop artificial-intelligence and data-driven surveillance that is raising concern about how far the technology will go.
Watrix software extracts a person’s silhouette from video and analyzes the silhouette’s movement to create a model of the way the person walks. This technology isn’t new. Scientists in Japan, the United Kingdom and the U.S. Defense Information Systems Agency have been researching gait recognition for over a decade.
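The two-stage pipeline described above – extract a silhouette per frame, then model its movement over time – can be illustrated at toy scale. Everything here is an assumption for the sake of the sketch: the hand-made binary masks, the width-based signature, and the distance measure are invented and bear no relation to Watrix’s actual algorithm.

```python
# Toy sketch of gait analysis: each frame is a binary silhouette mask;
# the silhouette's width over time forms a signal whose rhythm (stride
# alternation) characterizes the walk.

def silhouette_width(frame):
    """Widest horizontal run of foreground (1) pixels in a binary mask."""
    return max(sum(row) for row in frame)

def gait_signature(frames):
    """Sequence of silhouette widths over time, normalized to the peak."""
    widths = [silhouette_width(f) for f in frames]
    peak = max(widths)
    return [w / peak for w in widths]

def gait_distance(sig_a, sig_b):
    """Mean absolute difference between two equally long signatures."""
    return sum(abs(a - b) for a, b in zip(sig_a, sig_b)) / len(sig_a)

# Two tiny 3x4 "videos": legs apart (wide frame) vs. legs together.
wide   = [[0, 1, 1, 0], [1, 1, 1, 1], [1, 0, 0, 1]]
narrow = [[0, 1, 1, 0], [0, 1, 1, 0], [0, 1, 1, 0]]
walker_a = [wide, narrow, wide, narrow]   # alternating stride
walker_b = [wide, wide, wide, wide]       # stiff, constant stride
print(gait_distance(gait_signature(walker_a), gait_signature(walker_b)))
```

Because the signature comes from body shape and motion rather than the face, masks and makeup do nothing against it – which is why gait recognition worries the anti-surveillance artists more than better cameras do.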
Live facial recognition is a mass surveillance tool that scans thousands of innocent people in a public space, subjecting them to a biometric identity check, much like taking a fingerprint. Government agencies and others could have the ability to track you.
What you do and wherever you go might no longer be private. It could become impossible to remain anonymous.
- Facebook allows you to opt out of its facial recognition system.
- Google won’t enable facial recognition until you opt in. The system also allows you to turn face recognition on and off.
Will Hackers Really Want to Steal Your Faceprints?
Privacy matters! Privacy refers to any rights you have to control your personal information and how it’s used. In general, it is smart to be careful about what you share on social networks. Posting too much personal information, including photos, could lead to identity theft. After all, there are few rules governing how facial recognition data may be used.
We have made ourselves so visible in order to hide. The companies selling this technology talk about preventing crime, but that claim is a misnomer: there is no evidence that facial recognition prevents crime. It might sometimes be used after a crime has been committed, yet vendors push the idea that it will make our lives safer.
How Can You Avoid the AI Cameras?
Our rights should really be protected by parliament, lawmakers and the courts. But if they fail us on facial recognition, people will have to protect themselves in the ways described above.
Enjoyed this! Speak your mind below & share it along.