Posted By: Mark Deull May 17, 2019
Britain has privacy laws similar to the U.S., but that didn’t restrain police from stopping and fining a resident for trying to cover his face to avoid being photographed by an AI camera on a public street. Every pedestrian was being photographed and compared against a master database of wanted persons. ⁃ TN Editor
Police fined a pedestrian £90 for disorderly behaviour after he tried to cover his face when he saw a controversial facial recognition camera on a street in London.
Officers set up the camera on a van in Romford, East London, which then cross-checked photos of faces of passers-by against a database of wanted criminals.
But one man was unimpressed about being filmed and covered his face with his hat and jacket, before being stopped by officers who took his picture anyway.
After being pulled aside, the man told police: ‘If I want to cover me face, I’ll cover me face. Don’t push me over when I’m walking down the street.’
It comes just weeks after it was claimed the new technology incorrectly identified members of the public in 96 per cent of matches made between 2016 and 2018.
The cameras have been rolled out in a trial in parts of Britain, with the Met making its first arrest last December when shoppers in London’s West End were scanned.
But their use has sparked a privacy debate, with civil liberties group Big Brother Watch branding the move a ‘breach of fundamental rights to privacy and freedom of assembly’. Police argue they are necessary to crack down on spiralling crime.
Officers previously insisted people could decline to be scanned, before later clarifying that anyone trying to avoid scanners may be stopped and searched.
It was first deployed by South Wales Police ahead of the Champions League final in Cardiff in 2017, but wrongly matched more than 2,000 people to possible criminals.
Police and security services worldwide are keen to use facial recognition technology to bolster their efforts to fight crime and identify suspects.
But they have been hampered by the unreliability of the software, with some trials failing to correctly identify a single person.
The technology made incorrect matches in every case during two deployments at Westfield shopping centre in Stratford last year, according to Big Brother Watch. It was also reportedly incorrect in 96 per cent of matches made across eight uses by the Met from 2016 to 2018.
In Romford, the man was fined £90 at the scene by officers, who also arrested three other people during the day thanks to the technology, according to BBC Click.
After being stopped he asked an officer: ‘How would you like it if you walked down the street and someone grabbed your shoulder? You wouldn’t like it, would you?
The officer told him: ‘Calm yourself down or you’re going in handcuffs. It’s up to you. Wind your neck in.’ But the man replied: ‘You wind your neck in.’
After being fined, the man told a reporter: ‘The chap told me down the road – he said they’ve got facial recognition. So I walked past like that (covering my face).
‘It’s a cold day as well. As soon as I’ve done that, the police officer’s asked me to come to him. So I’ve got me back up. I said to him ‘f*** off’, basically.
‘I said ‘I don’t want me face shown on anything. If I want to cover me face, I’ll cover me face, it’s not for them to tell me not to cover me face.’
‘I’ve now got a £90 fine, here you go, look at that. Thanks lads, £90. Well done.’
Silkie Carlo, the director of civil liberties group Big Brother Watch, was at the scene holding a placard saying ‘stop facial recognition’ – before she asked an officer about the man they had taken aside: ‘What’s your suspicion?’
The officer replied: ‘The fact that he’s walked past clearly masking his face from recognition and covered his face. It gives us grounds to stop him and verify.’
Ivan Balhatchet, the Metropolitan Police’s covert and intelligence lead, said: ‘We ought to explore all technology to see how it can make people safer, how it can make policing more effective.