Facial recognition systems ‘in need of regulation’, says security specialist

Recent legislative development is driven by a heated debate. Since San Francisco announced a ban on the purchase and use of live-feed facial recognition technology by government departments, there has been speculation about which country or region will be next.

San Francisco officials’ concerns included a lack of standards, inadequate knowledge and pervasive public discontent in the context of law enforcement. Many of these concerns are also reflected in the UK, where some police forces are trialling the technology at present.

David Warburton, a UK-based senior threat research evangelist at F5 Labs with over 20 years’ experience in the IT and security industry, says he expects changes in UK legislation covering both the products themselves and how vendors sell facial recognition tech to government. “I am certain legislation is coming. I would be surprised if it happens as quickly as in the next 12 months, in the context of what else is going on in politics at the moment”, he told E&T.

One of the central problems, and where action is needed, is that no quality mark exists that would bind vendors to minimum standards for the software and hardware they sell to government departments. “There is absolutely no watermark or recognised UK or global standard to meet a certain level of anything. Anyone that has a solution and wants to sell it can add an offer to the market. It comes down to whether you can do a good-enough sales pitch and convince someone to buy it”, Warburton said.

Opportunities in the UK to address these weaknesses have been widely missed, according to research by the BBC. Several police forces are testing automated systems that try to identify faces in real time as people pass a camera. This draws a close connection to what happened in San Francisco, according to Warburton: there, the officials’ ruling was specifically on live feeds for facial recognition; nothing was said about capturing the video.

Edward Snowden’s revelations about government mass surveillance come to mind. Warburton says the distinction between using a piece of data to go after an individual or to investigate a particular crime and “to just surveil everybody” is an important one.

He points out that the same software and systems could be used on existing CCTV cameras in the UK. One of the worries is that a switch could simply be flipped on every single CCTV camera, with this facial recognition software constantly running and everyone being profiled and tracked all the time.

That more people are pushing back is shown by the case that began today in Cardiff. Ed Bridges brought the first major legal challenge against South Wales Police after being captured on facial recognition cameras during a shopping trip, which he claims was an unlawful violation of his privacy. During a three-day hearing at Cardiff Civil Justice and Family Centre, Bridges will also argue that it breaches data protection and equality laws.

The lack of regulation and known shortcomings – notably inaccuracies in recognition and gender and race biases – could exacerbate the backlash as the market in the UK grows bigger and more profitable for vendors.  
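How such inaccuracies arise can be illustrated with a minimal sketch. The Python snippet below is purely illustrative – it is not any vendor’s actual algorithm – of a pattern common to these systems: a face ‘embedding’ is scored against a watchlist and a match is flagged when similarity exceeds a tuned threshold. All names and numbers here are hypothetical.

```python
# Illustrative only: watchlist matching by embedding similarity.
# Real deployments use learned, high-dimensional embeddings; the
# threshold trade-off shown here is the same in principle.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(embedding, watchlist, threshold=0.9):
    """Return names whose stored embedding is similar enough to flag."""
    return [name for name, stored in watchlist.items()
            if cosine_similarity(embedding, stored) >= threshold]

# Hypothetical embeddings: a passer-by who merely resembles a
# watchlisted person scores high, so a looser threshold flags them.
watchlist = {"suspect_a": [0.9, 0.1, 0.4], "suspect_b": [0.1, 0.8, 0.6]}
probe = [0.85, 0.25, 0.45]  # an innocent lookalike of suspect_a

print(match_against_watchlist(probe, watchlist, threshold=0.99))  # []
print(match_against_watchlist(probe, watchlist, threshold=0.90))  # ['suspect_a']
```

Lowering the threshold catches more genuine matches but also flags more lookalikes; it is exactly this tuning, and how unevenly it performs across demographic groups, that the criticised trials have left opaque.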

The problem facial recognition has right now is that it is little understood, argues Warburton. “As an individual and member of the public, it is fairly concerning. We are not kept well informed about what has been done and why and what technology is being used and how it is being used and whether it has been managed securely”. 

San Francisco could offer important lessons for the UK and other parts of the world on how to communicate with the public if a region or nation sets about regulating the technology. Legislators in San Francisco did not give an absolute ‘no’ to biometrics and facial recognition. “Instead, they say ‘let’s pause here, this needs to be regulated’. Without any regulation and oversight this data and technology could be misused”, Warburton explained.

To counteract concerns, training and education on how facial recognition is being used is key – and it’s needed as much among law enforcement personnel as among the public. “[Facial recognition] solutions are being provided to [police forces] as a black box, and you wouldn’t necessarily expect police officers to be IT or cybersecurity specialists because they have far more important things to focus on. But if they don’t understand how the technology works and the limits and caveats, then we, as consumers and members of the public, have to put a huge amount of trust into law enforcement and government”.

A lack of knowhow among law enforcement professionals is also a problem, Warburton argues. He recommends that police forces understand the tech and its weaknesses as well as possible. “There will always be situations when something doesn’t work as expected”, he says. Also worrying, in his view, is that during the courting and selling of facial recognition systems, software and hardware vendors are not incentivised to be ‘as honest as possible’. “They won’t make a big deal out of all the areas where the technology doesn’t work.” That is where updates to current legislation would matter most, and soon, he said.

The ethical debate among government departments and the public about facial recognition in law enforcement is a tricky subject. “You get the people that insist that if it catches one bad guy it is worth it. The counter-argument is equally extreme, saying that surveilling everybody all the time is not worth the invasion of privacy,” Warburton said. He thinks there needs to be a lot more honesty and meeting in the middle.

The deployment of facial recognition [in law enforcement] remains fairly expensive, as examples in the UK show. It is worth remembering that accepting the benefits of this technology comes with the requirement to use it only in very controlled and appropriate settings, he said.

As facial recognition is increasingly adopted by Britain’s police forces, there’s a concern that individuals who have ‘something to worry about’ will increasingly find ways to avoid exposure – such as by disguising themselves or hiding their faces in public. The problem of diminishing efficacy over time due to the widespread use of facial recognition is to be taken seriously, agrees Warburton.

He draws parallels to problems in other areas of technology, such as discussions around banning encryption to make law enforcement’s job of catching terrorists and criminals easier, and whether that is ultimately what we want. “Those really going to be hurt are the general public, who do not really have any need to cover themselves up or hide,” he says. People with an interest in committing crimes would not hesitate to cover their faces, use special hats or clothing that can help disguise them, or manipulate images. Career criminals in particular will know how to circumvent this technology.

There are already examples of products that aim to trick smart cameras and recognition systems. Hyphen-Labs’ NeuroSpeculative AfroFeminism is one of them, featuring ‘false-face’ computer-vision camouflage patterns. The 2017 Hyperface project printed patterns onto clothing or textiles, which then appear to have eyes, mouths and other features that a computer can interpret as a face. Other examples include creating an aesthetic of makeup and hairstyling that causes machines to be unable to detect a face.

Judging the efficacy of facial recognition in quantitative terms also remains tough, Warburton said, yet this would be crucial to shed light on whether the return on investment makes it worth the trouble. The first concern is that few vendors are transparent about the shortcomings of their systems; at present at least, Warburton would be surprised if vendors were keen to share information about the quality of their systems and their caveats. “Transparency should be a quality mark, not only to technology buyers but also to the public”, he added.
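The kind of quantitative reporting this transparency would require can be sketched simply. The figures below are hypothetical, not taken from any real trial; the sketch only shows which basic metrics a claim of a ‘successful trial’ should disclose.

```python
# Hypothetical worked example: the minimum figures a trial report
# should state -- how many alerts were genuine, and how often the
# system raised a false alert relative to faces scanned.
def trial_metrics(alerts, true_matches, faces_scanned):
    """Return (precision of alerts, false alerts per face scanned)."""
    false_alerts = alerts - true_matches
    precision = true_matches / alerts if alerts else 0.0
    false_alert_rate = false_alerts / faces_scanned
    return precision, false_alert_rate

# Invented numbers for illustration: 50 alerts, 8 genuine, from
# 100,000 faces scanned during a deployment.
precision, far = trial_metrics(alerts=50, true_matches=8, faces_scanned=100_000)
print(f"precision={precision:.2f}")          # precision=0.16
print(f"false alerts per face={far:.5f}")    # false alerts per face=0.00042
```

Even this crude arithmetic makes the trade-off legible: a trial yielding a handful of arrests may also have generated dozens of false alerts, and without both numbers the public cannot judge whether the return justifies the intrusion.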

It would be essential to convince not only the relevant buyers of a facial recognition system, such as police forces, but also the people whose images may be captured – to “convince them that everything is done securely and according to best practice. We don’t often see that so far”. There are no commercials, adverts or communications to the public that make reassurance about facial recognition use the subject of discussion, he said.

Something else that muddies the waters for the public and regulators relates to the question of what would constitute a ‘successful trial’ of facial recognition software within law enforcement. The measures on which trials should be judged are unclear. You could say people are taking this as a very pervasive instrument that makes them cover their faces, but on the other hand you have three people arrested on suspicion of serious crimes, he says. What is the real value of those arrests enabled by the technology?

In Big Brother Watch’s briefing for a Westminster Hall debate on facial recognition and the biometrics strategy, the independent non-profit organisation urged members of parliament to call on police to immediately stop using live facial recognition surveillance, and call on the Home Office to make a firm commitment to automatically remove the thousands of images of unconvicted individuals from the custody image database. 

Big Brother Watch director Silkie Carlo told TIME that the UK is on its way to adopting surveillance technologies in a style “more typical of China than of the West”. “With live facial recognition, these standards are being surreptitiously occluded under the banner of technological innovation. That’s why the ban in Silicon Valley is so commendable”, Carlo said. “In the UK, we’re dangerously wavering on the precipice of a very different country. I have little hope that our police will make the right decision. But I have great faith that the public won’t tolerate such a loss of liberty. And that’s why we’ll fight police facial recognition”, she said.

Ben Heubl, E&T News

https://eandt.theiet.org/content/articles/2019/05/facial-recognition-regulation-awaited-in-the-uk/