Fraud is changing in ways that can be difficult for organisations and individuals to detect. While many scams still rely on familiar techniques, the way they are delivered has become more targeted, more credible and more closely aligned with everyday communication.

This shift has developed gradually. Between 2024 and 2026, scams have moved from obvious, clumsy attempts into activity that is far harder to separate from legitimate interaction. The change is not purely technological, but structural, with scams now mirroring normal communication so closely that the boundary between genuine and fraudulent activity has become harder to define.

The Credibility Gap Closes

In 2024, the biggest shift was not the emergence of new tools, but better targeting. Scammers began moving away from mass phishing campaigns and towards carefully constructed, personalised approaches.

Instead of guessing, they researched. They pulled fragments of data from social media, leaked databases and public profiles, then used that information to make messages feel contextually correct. A delivery notification that matches a real purchase. A bank message that aligns with spending habits. A workplace request that sounds like internal communication.

A report by BBC News during this period highlighted how fraudsters were increasingly gathering information in advance to make their approaches more targeted and believable. The key change was not sophistication in isolation, but accuracy: removing the small inconsistencies that might normally trigger suspicion.

Artificial intelligence began to support this process. It helped refine tone, generate responses faster and maintain consistency across conversations. However, at this stage, most scams were still human-directed, with AI acting as an assistant rather than the main driver.

The Rise of Scam Infrastructure

By 2025, scams began to look less like individual acts and more like organised systems.

This is where the idea of scam farms became more visible in security reporting. These are coordinated operations where groups of fraudsters work at scale, often using automated tools, rented infrastructure and large datasets of stolen or scraped personal information. Instead of a single scammer managing one victim at a time, these operations can run thousands of parallel conversations simultaneously.

BBC reporting on fraud ecosystems during this period highlighted how scammers increasingly operate in structured networks, often supported by “fraud-as-a-service” models where tools, scripts and access to compromised data are sold or rented to other criminals.

Within these systems, automation plays a growing role. Messages can be triggered based on behaviour, conversations can be partially or fully AI-assisted, and responses can be adjusted in real time. This creates a smoother, more consistent interaction that feels less like manipulation and more like normal communication unfolding.

Voice cloning and early deepfake tools also began to appear in practical use, not always as perfect impersonations, but as interactions "good enough" to establish trust. A short call confirming a payment or a voice note asking for urgency can be sufficient when combined with convincing context.

2026: The Disappearance of the Scam Boundary

By 2026, the most important change is not visibility or scale, but continuity. The moment where a scam becomes identifiable is increasingly difficult to pinpoint.

Interactions are no longer isolated events. They unfold across messages, calls, emails and sometimes video, with each step reinforcing the last. A request for money or data no longer appears as a sudden interruption. It is framed as the logical continuation of an ongoing exchange.

At the same time, scam infrastructure has matured further. The scam farm model described in earlier BBC reporting has evolved into more distributed, semi-automated networks. These systems blend human operators with AI-driven communication tools, allowing fraud to scale while maintaining consistency and realism across thousands of simultaneous targets.

Voice synthesis has improved to the point where short live interactions can feel authentic. Digital identities can be maintained across platforms with consistent history and behaviour, making them harder to flag as fake. Even behavioural cues such as response timing, language style and conversational flow are increasingly replicated by AI systems trained on real human interaction data.

The result is not just better scams, but smoother ones. There are fewer obvious breaks, fewer contradictions and fewer moments that feel “off”.

What Actually Changed

Across these three years, the evolution is best understood as a shift in structure rather than appearance.

  • In 2024, scams improved by becoming more context-aware and targeted.
  • In 2025, they became more industrialised through scam farms and fraud-as-a-service ecosystems.
  • By 2026, they had become continuous systems that blend into everyday communication flows.

The BBC’s reporting on scam operations and fraud networks shows a clear trajectory. What began as isolated deception has become an organised, scalable industry.

Conclusion

Modern scams are no longer defined by obvious trickery. They are defined by integration. They fit into normal life so effectively that detecting them is less about spotting errors and more about questioning continuity, context and trust.

Between 2024 and 2026, fraud has shifted from something people receive to something they experience: quiet, structured and increasingly difficult to separate from legitimate communication.

How AJC Can Help

AJC helps organisations understand and respond to the changing fraud and cyber threat landscape. As scams become more targeted, automated and difficult to identify, businesses need clear governance, practical controls and informed decision-making across both technology and people.

Our consultants support clients in assessing fraud and cyber risks, strengthening internal controls, improving awareness and aligning security measures with real-world business operations. We help organisations move beyond reactive responses and build a more resilient approach to emerging threats.

Contact us on 020 7101 4861 or email us at info@ajollyconsulting.co.uk if you think we can help.

Sources:

Scams have grown more sophisticated, but people are fighting back – BBC News

