Last year, Leicestershire police scanned the faces of 90,000 festival-goers at Download Festival, checking them against a list of wanted criminals across the country. It was the first time anywhere in the UK that facial recognition technology — NeoFace — was used at a public outdoor event.
Privacy campaigners — and Muse frontman Matt Bellamy — expressed their fury at authorities after police casually mentioned the surveillance project on Police Oracle, a police news and information website. Police used no other method to warn festival-goers about the controversial initiative.
Oh, and they keep the data.
"The concept of facial recognition is moving towards a Blade Runner-type future. The question is: did I really give informed and explicit consent to this? Where's the transparency?" Raj Samani, CTO at Intel Security, told Mashable.
"In the case of festivals, it raises a lot of questions around what is done with our data once the event is over," he said.
For facial recognition to be of any use, the data has to be stored. But it's unclear how that data is stored and protected, how long it is retained, or when it's deleted.
It's nearly impossible to find out whom the dataset is shared with or what it's cross-referenced against, Christopher Weatherhead, technology officer at Privacy International, told Mashable.
"For example, is the imagery being compared to law enforcement databases, medical databases, or social media profiles?" he said.