Technologist and former British Telecom chief scientist Peter Cochrane joins the Plutopia podcast to talk about his lifelong pursuit of truth and his work on a “truth engine” that used AI to grade the reliability of news sources and authors. Cochrane argues that real truth is hard, costly, and collaborative — unlike social media, which feeds users comforting falsehoods that match their worldview — and warns that losing a shared grip on truth threatens civilization.
Drawing on his career in communications, AI, and cybersecurity, Peter explains how he boosted lie detection rates by tracking sources over time, factoring in bias, and adding linguistic and psychological analysis, pushing accuracy toward 95%. The conversation widens into science as an ongoing search rather than final certainty, the distortions of corporate media, the risks and inevitability of AI-driven systems like driverless cars, and his own experiment living with AI-assisted hearing. Throughout, Cochrane stays optimistic but insistent on building human and machine ethics, noting that technology should be judged by whether it improves on fallible humans and helps us keep truth at the center of society.
Peter Cochrane:
Truth is very expensive. It costs you a lot of time, energy, concentration. You have to have these inner arguments. You have discussions with other people and you gradually zero in on an opinion based on the facts. Whereas on Facebook it is easy. You just believe it. However outrageous it is, it fits your world model. That’s the worst aspect. The whole of social media is tuned to your social or world model, and they just feed you the stuff that reinforces your belief system. I think that can be said of most religions; they do the same thing. They feed you the story from childhood, continually, until it becomes perfect.