Welcome to our Cyber Security News Aggregator.

Cyber Tzar provides a cyber security risk management platform, including automated penetration tests and risk assessments culminating in a "cyber risk score" out of 1,000, just like a credit score.

Exploring facial recognition – beyond the headlines

published on 2025-07-21 10:53:22 UTC by Millie Marshall Loughran

Andy Schofield, CTO, Reliance High-Tech, discusses the differences between facial recognition and authentication and how staff must be trained to operate said systems.

That phrase rang in my ears a few weeks back after reading yet another sensational story about facial recognition – this time from the BBC.

A woman was wrongly denied entry to a retail store. That should have sparked an important conversation about oversight, training and how these systems are managed.

Instead, the headline – “Woman mistaken for thief after shop face scan alert” – blamed the technology itself.

The system worked exactly as it was told to. The mistake was not made by the camera or the algorithm. It was made by a human being.

She was manually added to a watchlist. No biometric match. No clear justification. No path to appeal. And from that moment forward, the system simply did its job.

What failed her was not artificial intelligence. It was the absence of real intelligence in how the technology was governed, implemented and managed. Unfortunately, stories like this are not rare and corrode public trust.

Facial recognition technology

I have spent many years working with facial recognition technology.

As a chief technology officer in the security space and chair of the British Security Industry Association (BSIA) group focused on biometrics and AI, I have had the opportunity to see where these systems succeed and where they stumble.

What I have come to understand is that facial recognition is only as responsible as the environment it operates in.

When it is used to unlock your phone, verify your identity for a payment or pass through an airport gate, people are largely comfortable.

You know it is happening. You have given permission. You see the result instantly. The process is smooth, clear and under your control.

But take that same system and place it in a shopping centre or on a high street or in a law enforcement context – and people get nervous.

Not because the technology is different, but because the experience is.

There is often inadequate signage. No explanation. No ability to opt in or out. And worst of all – no ability to challenge a decision if you’ve been wrongly flagged.

It’s not the facial recognition people fear. It’s the feeling of being judged silently and unfairly by an invisible machine with no one to talk to when it goes wrong.

The semantics around facial recognition are also unhelpful. The way we label things shapes how people feel about them. We say “surveillance” and it sounds ominous.

We say “frictionless identity authentication” and suddenly it feels helpful. The challenge is not just the technology – it is the terminology.

The difference between facial recognition and authentication

Responsible deployment is possible and when done right, it earns trust. What is often misunderstood is the difference between facial recognition and facial authentication.

Recognition scans crowds for faces it has been told to watch for – people who have not consented but have been flagged, rightly or wrongly, as persons of interest.

Authentication, on the other hand, is more like the age-old rule: “If your name is not on the list, you are not coming in.”

It is the digital version of showing your ID when buying alcohol. You opt in, your identity is validated and only then are you granted access.

It is the same principle access control has used for decades – just faster and more secure.
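The distinction above can be sketched in code. This is a toy illustration, not a real biometric pipeline: the embeddings, threshold, and function names are invented for the example. Recognition compares every detected face against an entire watchlist; authentication verifies a single, consented, claimed identity.

```python
import math

def cosine_similarity(a, b):
    """Compare two face-embedding vectors (toy fixed-length examples)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.95  # illustrative match threshold, not a recommended value

def recognise(face, watchlist):
    """Recognition: scan a face against everyone on a watchlist.
    The people on the list have not opted in."""
    return [name for name, ref in watchlist.items()
            if cosine_similarity(face, ref) >= THRESHOLD]

def authenticate(face, claimed_name, enrolled):
    """Authentication: the person claims an identity and consents;
    we verify against that single enrolled template only
    ("if your name is not on the list, you are not coming in")."""
    ref = enrolled.get(claimed_name)
    return ref is not None and cosine_similarity(face, ref) >= THRESHOLD
```

The design difference matters: `authenticate` is a one-to-one check the subject initiates, while `recognise` is a one-to-many search over people who never asked to be searched.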

Think of an office building. I often ask people, “Do you know who is in your building right now?” Most glance at an access control report.

But that only tells you which card was used – not who is actually inside. We would not dream of using that logic for our networks or digital systems.

Facial recognition, when properly implemented, is fast, secure, accurate and contactless.

It’s not about watching people. It is about knowing who is where and when. And in moments of crisis that knowledge becomes vital.

But again, it has to be done well. And “well” means ethically. It means transparently. It means accountably.

I once attended a live facial recognition debate at University College London, hosted by the Metropolitan Police. The backlash was intense. But one exchange has always stayed with me.

A protester stood up and demanded: “What gives you the right to force this technology on us and invade our privacy and rights?”

A police officer stepped forward and said: “Let me be honest – the worst part of my job, which I would not wish on my worst enemy, is knocking on doors and telling families their loved one is not coming home.”

He said: “And I’ve done that knowing there was technology out there that might have prevented it.”

The clearly distressed police officer addressed the protester and said: “To answer your question: We are here to assure you that we’re using this technology strictly within the limits of the law.

“We’re doing everything we can to ensure it is ethical, proportionate and fair. I want to protect your rights just as much as I want to protect people’s lives.

“But when there’s a credible threat – when someone’s safety is truly at risk – do I not have a duty to use every tool available to me to prevent harm? That’s the balance I’m trying to strike.”

BSIA – a code of practice for facial recognition

Yes, like many technologies, facial recognition can do harm if misused, but it can also do extraordinary good when guided by strong ethics and smart governance.

That’s why the work we do now matters so much.

Across the industry, we are beginning to see progress. The BSIA has released a code of practice for facial recognition – a world-first document built not just on legal requirements but also on ethical expectations.

It lays out how systems should be designed, how data should be managed, how mistakes should be corrected.

It does not ask organisations to adopt the technology blindly. It shows them how to do so with integrity.

Because it is not enough for a system to be compliant. It must also be fair. Understandable. Humane. The best organisations in this space are already taking that to heart.

They are involving privacy officers, legal teams and community representatives before deployment, not after.

They are designing systems with human oversight in mind. They recognise that a biometric match is not a conclusion.

It is a question – one that must be answered by people, not just code.
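One way to picture “a match is a question, not a conclusion” is a workflow in which a system match never triggers action on its own, but only creates a review task for a trained operator. This is a hypothetical sketch; the class names, fields, and escalation contact are invented for illustration, not taken from any real deployment.

```python
from dataclasses import dataclass

@dataclass
class MatchEvent:
    """A candidate biometric match raised by the system."""
    person_id: str
    confidence: float
    location: str

def handle_match(event, review_queue):
    """Queue the match as a question for a human, with no automatic
    consequence and a clear path of appeal for the person flagged."""
    task = {
        "event": event,
        "status": "pending_human_review",   # a person, not code, decides
        "appeal_contact": "security-desk",  # hypothetical redress route
    }
    review_queue.append(task)
    return task
```

The point of the design is that the algorithm's output stops at the queue; confirmation, dismissal, and redress all remain human responsibilities.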

We need more of that. We need to shift the culture and focus not just on how we build these tools, but on how we talk about them and how we train the people who use them.

Staff training

Staff need to know how to operate facial recognition systems. But more importantly, they need to know how to question them and how to spot when something does not feel right. And how to escalate without fear of being ignored.

Because when something does go wrong – and inevitably, something will – there needs to be redress.

A real process. A real person. A real answer. Otherwise, the damage spreads. Trust erodes. And all the potential this technology holds starts to slip away.

This technology, when guided by strong values, has the potential to make our spaces safer, our lives easier and our systems smarter.

It can reduce friction for the vast majority of people who simply want to go about their day. It can protect the vulnerable. It can find the missing. It can secure the critical.

The answer is not to reject facial recognition – it is to take responsibility for how we use it.

This technology has the power to enhance safety, streamline access and uphold our duty of care – but only if we deploy it with integrity, transparency and purpose.

Let’s stop blaming the tool because it is easy or banning it because responsible implementation is hard.

Instead, let’s lead. Let’s set the standard. Let’s build systems people can trust and train the people who manage them.

When done right, facial recognition can be as seamless and secure as unlocking a phone – while giving organisations confidence in who is accessing their spaces.

It is time to stop fearing the future and start shaping it – for the better.

Article: Exploring facial recognition – beyond the headlines
https://securityjournaluk.com/when-headlines-miss-the-mark-the-truth-about-facial-recognition/
Published: 2025-07-21 10:53:22
Source: Security Journal UK
