Last week, a Reuters special investigation revealed that Meta’s platforms are showing billions of scam ads every day, and that in some cases the company was profiting from advertising it believed was fraudulent. The report was a stark reminder of something many people already feel – the internet can often seem like a place that’s hostile, unfriendly, suspicious. When platforms appear to tolerate, or even benefit from, harm, people draw the obvious conclusion – that digital systems are not built for them.
Safeguarding trust
This is an issue that goes far beyond any single social media platform. It speaks to a much wider problem in the UK’s digital environment. For years, our approach to technology policy has focused on innovation, efficiency, and scale; and rightly so. But we have not given the same weight to trust, safety, and accountability.
The result is that many people now feel the digital world is somewhere they must navigate defensively. They are guarding against scams, impersonation, exploitation, and misuse of their data, rather than experiencing digital services as tools that support their lives. That is exactly the kind of environment that teaches people that digital systems are something to fear, not benefit from.
And this is a serious problem, because the government’s industrial strategy and decade of renewal strategy both rely heavily on digital transformation and the successful adoption of AI. These technologies are the key to driving productivity in the NHS; improving public service delivery; helping businesses grow; and supporting the transition to a more dynamic economy. But this future is only possible if people actually trust the systems they are being asked to use.
If the public comes to associate AI and emerging technologies with fraud, impersonation, extortion, and manipulation, adoption will simply stall. Once trust collapses, it is almost impossible to rebuild. The real risk is not that robots take our jobs, but that people refuse to engage with digital services at all because the environment feels fundamentally unsafe. At a time when the cost-of-living crisis has already pushed people into financial precarity, asking them to take further risks online – without strong protection – is not credible.
We already see signs of this trust deficit in everyday behaviours. Older people, who stand to benefit most from digital inclusion, are giving up on online banking and public service portals because they fear misinformation and scams. Small businesses are stepping back from digital advertising because they cannot compete with fraudulent copycats and impersonators.
Meanwhile, the NHS is struggling to persuade people to use online appointment systems and symptom checkers; in part due to anxieties about data security and misuse. These are not isolated incidents, but rather early indicators of a system that is losing legitimacy.
And when trust erodes, so does our capacity for national digital transformation. AI cannot deliver productivity gains in workplaces where staff distrust the tools. Digital public services cannot succeed if the public fear the platforms that host them. Businesses cannot rely on data-driven operations if customers worry that data will be exploited.
Accountability and standards
To ensure trust in our digital systems, we need to shift how we think about the accountability and qualifications of the people creating and leading them. We should understand standards in the digital environment as a form of national infrastructure – as essential to a functioning economy as roads, power networks and clean water.
We do not rely on organisations’ in-house codes of practice to ensure the safety of our bridges or the quality of our drinking water. Likewise, we should not rely on the goodwill of multinational platforms to determine the safety and legitimacy of the systems that mediate our daily lives.
What is needed is a professional standards framework that treats digital platforms and AI systems as public-impact infrastructure, with clear expectations for safety, reliability, identity assurance and accountability.
This is not about adding regulatory friction. It is about establishing the same kind of professional and ethical codes of conduct that underpin trust in medicine, engineering, law, and finance. In every other domain where the public depends on expertise and systems they cannot directly inspect, we rely on standards, accountability and institutional stewardship to guarantee integrity. The digital economy should be no exception.
Trust is not built by reassurance alone. It is restored when systems are created to operate fairly, predictably and transparently. Treating digital trust as infrastructure, underpinned by recognised professional standards, is how we create the conditions in which digital transformation and AI can deliver lasting public value.
This is not about slowing innovation, but rather about creating the conditions that allow innovation to scale safely and with public confidence. A digital economy that people do not trust cannot grow. A public sector that deploys AI without legitimacy will face resistance. A government that wants to reshape the country must start by ensuring that the systems people interact with every day feel fair, safe, and on their side.
Trust is not a ‘nice to have’ in the digital economy. Trust is the growth strategy.
