Hotwire – Global ideas news updates


“Democracy of the Gullible.” Becoming Invisible in a Constantly Monitored AI World



In a world where we are constantly monitored, computers have become adept at identifying people, from tracking criminals to countering terrorism. Finding ways to defeat facial recognition systems is a distinctly modern challenge. Activists worry about their lost privacy, fearing that society stands at the doorstep of a dystopia where Big Brother sees it all. A group of artists is trying to make a statement by becoming invisible.

Facial recognition systems are becoming more pervasive. Until now, most attempts at face-jamming were illustrations or academic projects. The spread of COVID-19 has fueled increased surveillance. People in the UK are being scanned, misidentified and wrongly stopped by police; officers have allegedly stopped people for covering their faces or wearing hoodies, and one man was even fined for refusing to be scanned. Facial recognition wrongly identifies members of the public as potential criminals 96% of the time.

What is facial recognition?

Face recognition is a method of identifying or verifying the identity of an individual using their face, by extracting a unique facial signature usually called a faceprint. Face recognition systems can be used to identify people in photos, in video, or in real time. Law enforcement may also use mobile devices to identify people during police stops.

How facial identification works

Source: Iowa Department of Transportation

Face recognition systems use computer algorithms to pick out specific and distinctive details about a person’s face. These details, such as the distance between the eyes or the shape of the chin, are then converted into a mathematical representation and compared to data on other faces collected in a face recognition database. The data about a particular face is often called a face template.

Facial recognition technology will change our lives

Some face recognition systems, instead of positively identifying an unknown person, are designed to calculate a probability match score between the unknown person and specific face templates stored in the database.

Face recognition data can be prone to errors, which can implicate people for crimes they haven’t committed.
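To make the template-matching idea concrete, here is a minimal sketch in plain Python. The tiny three-number templates, the names in the database, and the choice of cosine similarity as the match score are all illustrative assumptions; real systems compare high-dimensional embeddings learned by neural networks.

```python
import math

def match_score(probe, template):
    """Cosine similarity between two face templates; higher means a closer match."""
    dot = sum(p * t for p, t in zip(probe, template))
    norms = math.sqrt(sum(p * p for p in probe)) * math.sqrt(sum(t * t for t in template))
    return dot / norms

# Hypothetical templates: normalized measurements such as eye distance,
# chin width and nose length (real templates are far larger vectors).
database = {
    "person_a": [0.62, 0.31, 0.45],
    "person_b": [0.41, 0.52, 0.38],
}
probe = [0.60, 0.33, 0.44]

# As described above, the system does not give a hard yes/no: it ranks
# every stored template by a probability-like match score.
scores = {name: match_score(probe, tmpl) for name, tmpl in database.items()}
best = max(scores, key=scores.get)
```

Note that `best` is only the closest template, not proof of identity, which is exactly why a low-quality probe image can implicate the wrong person.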

Who is using facial recognition?

  • U.S. government at airports. Facial recognition systems can monitor people coming and going in airports. Police collect mugshots from arrestees and compare them against local, state, and federal face recognition databases. The FBI has a team dedicated solely to face recognition searches, called Facial Analysis, Comparison and Evaluation (“FACE”).
  • Mobile phone makers in products. Apple first used facial recognition to unlock its iPhone X and continues with the iPhone XS. Face ID authenticates users. Apple says the chance of a random face unlocking your phone is about one in 1 million.
  • Colleges in the classroom. Facial recognition software can, in essence, take roll. If you decide to bunk classes, your professor could know. 
  • Social media companies. Facebook uses an algorithm to spot faces when you upload a photo to its platform, with 98% accuracy.
  • Retailers in stores. They combine surveillance cameras and facial recognition to identify suspicious characters and potential shoplifters.
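Apple’s one-in-a-million figure applies to a single comparison. A short back-of-the-envelope sketch shows why the same error rate behaves very differently in a one-to-many database search; the function name and the database size are illustrative assumptions.

```python
def p_any_false_match(fmr, comparisons):
    """Probability of at least one false match across independent comparisons."""
    return 1 - (1 - fmr) ** comparisons

FMR = 1e-6  # Apple's quoted Face ID false-match rate: about 1 in 1,000,000

one_to_one = p_any_false_match(FMR, 1)           # phone unlock: a single comparison
one_to_many = p_any_false_match(FMR, 1_000_000)  # search of a million-face database

# one_to_one stays around 0.000001, but one_to_many climbs to roughly 0.63:
# a million-face search is more likely than not to throw up a false match.
```

This arithmetic is why a false-match rate that is perfectly acceptable for unlocking one phone can still flood a police watchlist search with wrong hits.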

With the number of people now wearing face masks, scarves, and bandannas, there is at least a possibility that these coverings will make face recognition harder to implement.

Real-time facial recognition challenges our privacy

Historically, black people have been disproportionately oppressed and targeted by surveillance technology. Worldwide protests against police brutality, in support of the Black Lives Matter movement, continue in the wake of George Floyd’s killing.

Protecting oneself against mass surveillance while rallying has become a pressing issue. The prospect is ominous, and civilians are considering ways to resist being tracked and profiled everywhere they go. Even parts of Silicon Valley have admitted that facial recognition technology is toxic.

Clearview AI, an American technology startup, provides facial recognition software used by private companies, law enforcement agencies, universities and individuals. It has been in talks with state agencies about using its technology to “track patients infected by the coronavirus”. China’s facial recognition software is linked to phone apps that identify people based on their contagion risk and determine when they are cleared to enter an array of public spaces. Russia’s facial recognition technology is being deployed to track people who violate quarantine orders.

Due to COVID-19, companies are upgrading their facial recognition systems with sophisticated algorithms that can recognize masked faces.

Artificial Intelligence companies are exploiting these opportunities, promising benefits that go beyond immediate health concerns, such as crowd management, crime prevention, parental controls, counterterrorism and upgraded transportation.

It is natural to feel fearful and conflicted about surveillance in times of crisis. Lawmakers are pondering how best to ensure public welfare while maintaining our civil rights.


We are just a bunch of codes – Facts & Fiction

Facial recognition works by mapping facial features – mainly the eyes, nose and chin – identifying dark and light areas, then calculating the distances between them. That unique facial fingerprint (faceprint) is then matched against others in a database.

A regularly floated option is to confuse identification systems with clothing, makeup, and accessories that obscure and distort our appearance.
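A toy sketch can show why such patterns work against detectors built on light/dark contrasts (so-called Haar-like features, of the kind used by classic face detectors). The 6x6 grid, the band positions, and the firing threshold below are all invented for illustration.

```python
def band_mean(img, rows):
    """Average brightness over the given rows of a grayscale image."""
    vals = [v for r in rows for v in img[r]]
    return sum(vals) / len(vals)

def eye_cheek_contrast(img):
    """Haar-like feature: cheek brightness minus eye-band brightness.
    Real detectors threshold thousands of such light/dark contrasts."""
    return band_mean(img, [3, 4]) - band_mean(img, [1, 2])

FIRES_ABOVE = 50  # hypothetical threshold for "this looks like an eye region"

# Toy 6x6 grayscale face: dark eye band (rows 1-2), bright cheeks (rows 3-4)
face = [
    [200] * 6,
    [60, 70, 65, 200, 65, 70],
    [60, 70, 65, 200, 65, 70],
    [190] * 6,
    [190] * 6,
    [200] * 6,
]

# "Dazzle" the face: paint dark stripes across the cheeks
dazzled = [row[:] for row in face]
dazzled[3] = dazzled[4] = [40] * 6

plain_contrast = eye_cheek_contrast(face)       # well above the threshold
dazzled_contrast = eye_cheek_contrast(dazzled)  # negative: feature no longer fires
```

The dark stripes invert the contrast the detector relies on, which is the core intuition behind CV Dazzle makeup.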

CV Dazzle makeup beats facial algorithm

Fashion designers and artists are coming up with novel ways to confuse intrusive cameras, in the hope of making facial recognition harder to carry out.

Emily Roderick and her cohorts from The Dazzle Club walked around London last week with red, blue, and black stripes painted across their faces in an effort to escape the watchful eyes of facial-recognition cameras. Camera camouflage is helping people reclaim their identities, Roderick said, and the group hopes the initiative will spread to other cities around the world.

The Dazzle Club exploring surveillance in public space


The concept was created by the artist Adam Harvey, who coined the term “Computer Vision Dazzle”, or “CV Dazzle”, a modern version of the dazzle camouflage used by the Royal Navy during the First World War.

Altering people’s looks to cheat cameras has become increasingly popular with artists and designers in recent years. Growth in the use of facial recognition has raised fears over privacy, say fashion experts.

How to defy facial recognition?

Fashion masks and T-Shirts

Many other artists, designers and technologists have been inspired by his attempts to hide without covering the face. Jing-cai Liu, a design student, created a wearable face projector, while Dutch artist Jip van Leeuwenstein made a clear plastic mask that creates the illusion of ridges along the face.

Wearable face projector

Wearable face projector to trick facial recognition software into thinking you are not a human

Lens Mask

A lens-shaped mask makes its wearer undetectable to facial recognition algorithms

Transparent Mask

Because of the mask’s transparency, you do not lose your identity, making it possible to interact with the people around you

Decorated Headscarf

By presenting the software with an overload of information, the pattern confuses it, rendering you invisible.

Another online retailer, Adversarial Fashion, sells shirts, skirts and other garments emblazoned with fake licence plates that get picked up by traffic surveillance cameras, “injecting junk data” into the systems used “to monitor and track civilians”.

Adversarial Fashion T Shirt Prints


Chicago-based designer Scott Urban has developed sunglasses that block infrared facial recognition cameras.

infrared glasses

The device is fitted with “a near-infrared LED that appends noise to photographed images without affecting human visibility”. When switched on, a user’s face is no longer recognized as a human face by the AI, as indicated by the absence of green detection boxes.

A makeup technique known as CV Dazzle was among the first to trick facial recognition software into thinking you are not a human.

An artist designed a toolkit of makeup and styling tips that can make faces unrecognizable to facial recognition.

CVDazzle makeup

CV Dazzle – The technique gets its name from a World War I tactic

Harvey is skeptical about gait recognition but has other concerns about infrared, thermal and polarimetric techniques which measure heat and the polarization qualities of skin.

“Giving too much power to the police or any security agency provides the conditions for authoritarian abuse. It’s like pollution – there’s a cost which we are ignoring. Hopefully this provides an alert to people that it’s so easy – you can turn on the system in one day, and the cost is so low, that there aren’t any built in frictions that discourage people from using it.” – The Guardian

While makeup presents some challenges to facial recognition, it can still be defeated by other technologies, said Dr. Sasan Mahmoodi, a lecturer in computer vision at the University of Southampton. “They cannot do makeup on their ears,” he said. “It’s difficult to change your ear, although you can hide it with a hat or hair.”

Zach Blas’s Facial Weaponization Suite protests against biometric facial recognition – and the inequalities these technologies propagate – by making “collective masks”.

Militancy, Vulnerability, Obfuscation, tableau vivant, Performative Nanorobotics Lab, University of California, San Diego, photo by Tanner Cook (2013)


There is also gait recognition – people have distinctive walking patterns. Already used by police on the streets of Beijing and Shanghai, “gait recognition” is part of a push across China to develop artificial-intelligence and data-driven surveillance that is raising concern about how far the technology will go.

Huang Yongzhen, the CEO of Watrix, said that its system can identify people from up to 50 meters (165 feet) away, even with their back turned or face covered. This can fill a gap in facial recognition, which needs close-up, high-resolution images of a person’s face to work.


Watrix software extracts a person’s silhouette from video and analyzes the silhouette’s movement to create a model of the way the person walks. This technology isn’t new. Scientists in Japan, the United Kingdom and the U.S. Defense Information Systems Agency have been researching gait recognition for over a decade.
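The silhouette-based pipeline can be sketched in a few lines, with heavy caveats: the helper names, the toy 5x5 “frames”, and the use of silhouette area as the per-frame feature are all invented for illustration. Real gait systems model limb angles and timing across many high-resolution frames.

```python
def toy_frame(person_pixels, size=5):
    """Hypothetical 5x5 grayscale frame with `person_pixels` dark 'person' pixels."""
    flat = [0] * person_pixels + [255] * (size * size - person_pixels)
    return [flat[i * size:(i + 1) * size] for i in range(size)]

def silhouette_area(frame, threshold=128):
    """Crude silhouette extraction: count pixels darker than the background."""
    return sum(1 for row in frame for v in row if v < threshold)

def gait_signature(frames):
    """Per-frame silhouette area over one stride -- only the skeleton of the idea."""
    return [silhouette_area(f) for f in frames]

def signature_distance(a, b):
    """Mean absolute difference between two gait signatures."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Gallery of known walkers, each summarized by one stride
gallery = {
    "person_x": gait_signature([toy_frame(n) for n in (10, 14, 10)]),
    "person_y": gait_signature([toy_frame(n) for n in (6, 20, 6)]),
}
# An unknown walker is matched to whichever gallery signature is closest
probe = gait_signature([toy_frame(n) for n in (11, 14, 9)])
best = min(gallery, key=lambda name: signature_distance(probe, gallery[name]))
```

Because the signature comes from body shape and motion rather than the face, a mask or turned head does nothing to disrupt it, which is the gap Watrix claims to fill.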

Live facial recognition is a mass surveillance tool which scans thousands of innocent people in a public space, subjecting them to a biometric identity check, much like taking a fingerprint. Government agencies and others could have the ability to track you.

What you do and wherever you go might no longer be private. It could become impossible to remain anonymous.
  • Facebook allows you to opt out of its facial recognition system.
  • Google won’t enable facial recognition until you opt in. The system also allows you to turn face recognition on and off.

Will hackers really want to steal your faceprints?

Privacy matters! Privacy refers to any rights you have to control your personal information and how it’s used. It is smart in general to be careful about what you share on social networks. Posting too much personal information, including photos, could lead to identity theft. After all, there are few rules governing how such data may be used.

We have made ourselves so visible in order to hide. The companies selling this tech talk about preventing crime, but that is a misnomer. There is no evidence that it prevents crime. It might sometimes be used after a crime has been committed, yet they push the idea that it will make our lives safer.

How can you avoid the AI cameras?

Our rights should really be protected by parliament, the lawmakers and the courts. But if they fail us on facial recognition, people will have to protect themselves in ways like these.

 



Enjoyed this! Speak your mind below & share it along.


By Vinit

With a Masters in Business Management, he currently works as a marketer and strategic planner. He loves to travel and write about technology, business trends and the ever-changing world. He is effortlessly funny and consistently engaging.

