Facial recognition tech's cavalier ways

The British Computer Society (BCS) has warned against the cavalier attitude of organisations using facial recognition technology to monitor crowds. The threat, it says, comes from a combination of flawed technology and a lack of rigorous ethical safeguards around how facial data is captured, stored and processed.

Among the risks the BCS highlights are a lack of diversity in product development teams, which leads to bias in data-dependent products and services; the incorrect structuring of data, which can associate the wrong data with an individual; and the use of analysis methodologies that are invalid in the context they are applied to.
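To make the bias risk concrete, here is a minimal illustrative sketch (entirely synthetic scores and a hypothetical decision threshold; no real system, dataset, or vendor is implied) of how a face-matching model with a single global threshold can produce very different false match rates for different demographic groups:

```python
# Illustrative sketch only: synthetic impostor scores and a hypothetical
# threshold. It shows how one global "same person" decision threshold can
# hide very different false match rates across demographic groups -- one
# of the data-dependent bias risks the BCS describes.
import random

random.seed(42)

THRESHOLD = 0.60  # hypothetical similarity threshold for declaring a match

def synthetic_impostor_scores(mean, n=10_000):
    """Generate similarity scores for impostor pairs (different people)."""
    return [min(1.0, max(0.0, random.gauss(mean, 0.15))) for _ in range(n)]

# Assume the model yields systematically higher impostor scores for
# group B, e.g. because group B was under-represented in training data.
groups = {
    "group_A": synthetic_impostor_scores(mean=0.30),
    "group_B": synthetic_impostor_scores(mean=0.45),
}

for name, scores in groups.items():
    false_matches = sum(score >= THRESHOLD for score in scores)
    rate = false_matches / len(scores)
    print(f"{name}: false match rate at threshold {THRESHOLD} = {rate:.2%}")
```

Under these assumptions, the second group is misidentified far more often even though the system uses identical settings for everyone, which is why per-group evaluation matters before deployment.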

Then there’s the recent University of Essex report, which found significant flaws in the way UK police forces have trialled AI-enabled facial recognition technology.

“If the police can’t get it to work properly, why should we assume that property developers, museums, or music festival organisers can make it work?” says Dr Bill Mitchell, director of policy at BCS.
