Children and facial recognition

By Emeka Forbes.

Disclaimer: This is a guest post. The views, opinions, and positions expressed within the post are those of the author alone and do not represent those of Future Advocacy.

Take a walk along London’s Regent’s Canal from Camden heading south-east and you’ll eventually spot the towering frames of several old gas holders, which now house swanky blocks of apartments. This is the edge of King’s Cross Central, a redeveloped former industrial site behind King’s Cross Station spanning 67 acres and home to high-end shops, cafés, bars and housing. Last year, the site was at the centre of a row over facial recognition technology after the Financial Times revealed that several AI-powered cameras had been installed around the development, prompting an investigation by the Information Commissioner’s Office (ICO).

King’s Cross Central isn’t exactly the last place you’d expect to find facial recognition technology. Private security guards in red baseball caps constantly patrol the site, carefully looking out for signs of disorder. Granary Square, a large open space at the centre of the development furnished with minimalist stone benches and fountains that shoot out of the ground, is kept in near-perfect condition. It has none of the litter or grime usually found in the centre of a large city.

If the prices don’t give it away (£4.50 for a latte), the kinds of products and services on offer certainly do (beeswax candle-making workshops in one store, bespoke terrarium styling services in another): this isn’t a place for everyone. In fact, despite the presence of large areas which feel like public space, the whole development, including the squares and pedestrian walkways, is owned by a consortium of private developers and investment companies. In other words, this is all pseudo-public space, a detail which adds some useful context to the presence of roaming private security guards. The line between private and public is intentionally murky, and people passing through King’s Cross Central could be forgiven for assuming they’re in an area controlled by the council, like the surrounding streets.

After the existence of facial recognition technology was revealed, the King’s Cross Central Limited Partnership released a statement committing to cooperate with the ICO’s investigation and admitting it had used the cameras for a period of almost two years, ending in March 2018. Plenty of privacy campaigners spoke out, warning about the risks posed by these cameras, but children and young people were largely left out of the conversation. On a bright sunny day in Granary Square, the fountains are a bustle of life and laughter. Children run wildly back and forth between jets of water, some getting drenched in the process. If facial recognition cameras were active in the square, did they capture children’s faces? If so, did their parents consent to this, and who had access to the data generated by the cameras?

One of the key hurdles for technologists working on any kind of AI technology is the risk that human bias is inadvertently baked into automated systems. According to an article published in Nature last year, there’s plenty of peer-reviewed evidence to suggest this is a real problem for facial recognition technology, which typically has a much lower accuracy rate when it comes to identifying the faces of people from black and minority ethnic communities. The same article carried a warning that the roll-out of facial recognition technology should be paused until appropriate safeguards are in place to keep it from causing harm. A report published last year by Axon, the world’s largest supplier of police body cameras, issued a similar warning, acknowledging that the current state of accuracy in facial recognition technology could “perpetuate or exacerbate the racial inequities that cut across the criminal justice system”. Seemingly unfazed, the Metropolitan Police announced plans to deploy new facial recognition cameras in targeted areas across London. On the ground, this could mean more children and young people from black and minority ethnic communities being targeted by police in London after being incorrectly flagged as potential suspects by technology driven by underlying bias.

There are other, more serious risks which could arise from the premature roll-out of facial recognition technology. In 2011, a security guard employed by a tourist attraction in Edinburgh was found guilty of assault and placed on the sex offenders register after using his access to conventional CCTV cameras to track the movements of a young female cleaner before sexually assaulting her. Without clear safeguarding measures to ensure that people with access to the far richer data generated by facial recognition cameras don’t misuse it, there’s a real risk we could be handing over new tools to adults intent on causing harm to children.

Until technology companies and governments have dealt with these big problems and figured out how to develop sensible regulation capable of protecting children, young people and the wider public from harm, we should urgently pause the use of facial recognition technology in both public and private spaces. We should take on board lessons from our experience with social media and the explosion of online harms, and think carefully about how harms enabled by digital technology are spilling over into the offline world.

Emeka Forbes is a policy officer for a leading children’s charity based in London. He is the co-founder of Possible Space, a think tank exploring access to public spaces by marginalised communities, and sits on the advisory board for independent campaign group Clean Up the Internet.