Sophie Nightingale: Our Minds on Digital Technology

by Plutopia News Network
Photo of Sophie Nightingale

The Plutopia podcast hosts Dr. Sophie Nightingale, a psychologist at Lancaster University, to discuss how digital technology — especially social media, generative AI, and the constant flow of online information — shapes human memory, judgment, and vulnerability to deception. She explains that people struggle to critically evaluate the sheer volume of information they encounter, so they are more likely to accept content that aligns with their preexisting beliefs, which helps misinformation spread. Nightingale traces her research from early work on how taking photos can impair memory to current studies showing that most people can spot fake or AI-generated images only slightly better than chance, and that even training improves performance only modestly. She and the hosts dig into the limits of AI “guardrails,” the uneven global landscape of AI regulation, the rise of misogynistic online spaces, and the troubling growth of AI-enabled nonconsensual intimate imagery, arguing that legal reform, platform accountability, and public education are all needed to reduce harm.

One of the things that tends to make people quite susceptible is just information overload, purely that we live in an age where we are accessing so much information all the time that we can’t possibly interpret, or critically think about, everything. So we might well just accept things that we wouldn’t otherwise. There’s quite a lot of evidence showing that’s especially the case if that information coincides with your pre-existing beliefs. So for example, if I happen to be a huge fan of Donald Trump, let’s say, and I saw some misinformation around Donald Trump that was positive about him, then I would probably be more likely to believe it than somebody who was not already a fan of Donald Trump, if you see what I mean. So those biases definitely exist. There’s a lot of evidence showing that. And then I think, you know, it kind of comes back as well to — if you want to believe something, you will.
