Photography Matthew Modoono/Northeastern University

This ugly AF t-shirt blocks facial recognition technology

Researchers at Northeastern University have developed a garment designed to confuse digital surveillance algorithms into thinking you don’t exist

The average Londoner is caught on camera more than 300 times a day – that’s one CCTV camera for every 14 people, and that number’s predicted to rise to one in 11 in the next five years. Thankfully, researchers at Northeastern University, MIT, and IBM have designed a top that makes you invisible to facial recognition technology.

Normally, surveillance algorithms work by recognising a characteristic in an image, drawing a ‘bounding box’ around it, and assigning a label to that object. The t-shirt interrupts this with colourful, pixelated patterns: the clusters of pixels are placed to confuse the AI’s classification and labelling system, making it harder for the technology to map out your features – in effect, tricking it into thinking you don’t exist.
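To get an intuition for how a printed pattern can drag a detector’s confidence down, here’s a minimal toy sketch in Python with NumPy. It is a hypothetical illustration, not the researchers’ actual method: the “detector” is just a linear model scoring an 8x8 patch, and the adversarial pattern is a bounded pixel-wise step against the score’s gradient (which, for a linear model, is simply the weight vector) – the same white-box assumption Lin describes below.

```python
import numpy as np

# Hypothetical linear "person detector" over a flattened 8x8 patch.
# White-box assumption: we know the detector's weights exactly.
rng = np.random.default_rng(0)
w = rng.normal(size=64)       # assumed detector weights
threshold = 0.0               # score above this means "person detected"

def person_score(x):
    """Detector confidence that the patch contains a person."""
    return float(w @ x)

# Start from a patch the detector confidently flags as a person
# (constructed so its score is exactly 1.0).
x = w / (w @ w)

# Adversarial pattern: a small, bounded perturbation that moves every
# pixel against the gradient of the score -- for a linear model the
# gradient with respect to the input is just w.
eps = 0.2
x_adv = x - eps * np.sign(w)

print(person_score(x), person_score(x_adv))
```

With these toy numbers the perturbed patch’s score drops well below the detection threshold, which is the core idea: the pattern isn’t camouflage to the human eye, it’s crafted noise aimed at the classifier’s decision boundary. Real detectors are non-linear, so the actual research has to estimate gradients through a deep network and make the pattern survive fabric deformation, which is far harder.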

“The adversarial t-shirt works on the neural networks used for object detection,” Xue Lin, an assistant professor of electrical and computer engineering at Northeastern, told Wired.

According to Lin, wearing the t-shirt makes you 63 per cent less likely to be detected by digital surveillance technology, but it’s not foolproof. “We still have difficulties in making it work in the real world because there’s that strong assumption that we know everything about the detection algorithm,” she said. “It’s not perfect, so there may be problems here or there.”

In the meantime, people are turning to make-up to throw off facial recognition algorithms. This includes blocking out shapes in geometric patterns, applying flowers to your face to obscure key features such as the eyes or nose-bridge, or applying Juggalo clown face, because fuck the system, I guess?
