AI Bias, Sex Robots & The Algorithms Radicalising Your Kids

Episode Summary

Tracey Spicer is one of Australia’s most respected journalists and the author of Man-Made: How the Bias of the Past Is Being Built into the Future. In this episode, Georgie sits down with Tracey for a sharp, funny, and occasionally jaw-dropping conversation about what happens when we treat AI like neutral math instead of what it really is: opinion written in code.

They unpack why algorithmic bias is getting worse in the generative AI era, how recommendation engines can quietly radicalise people (from Andrew Tate pipelines to hyper-performative “tradwife” culture), and why “move fast” without guardrails is a dangerous blueprint. The discussion also goes into the weird and unsettling frontier of humanoid home robots, privacy risks in always-on devices, and what Tracey learned researching sex robots, including the disturbing ways consent is engineered out of the product.

Plus: why Tracey’s favourite AI tool is Claude, what she thinks about Grok and the chaos machine of X, why we are not getting a four-day work week anytime soon, and her case for “regulatory sandpits” to test AI safely before it hits the rest of the world.

Time Stamps

01:10 – Tracey’s TEDx “The lady stripped bare” moment and why it still matters

04:45 – Beauty standards, AI filters, and why expectations on young women have intensified

08:20 – Man-Made and the epiphany that sparked Tracey’s AI obsession

11:10 – The AI arms race, speed, and why we are in the “seatbelt era” of tech

14:30 – Digital natives vs critical thinking: the hallucination blind spot

16:45 – Tracey’s AI stack: why Claude is her daily driver

19:05 – Humanoid home robots: convenience vs surveillance

21:55 – Strength vs security: what actually scares Tracey about robots

24:35 – Sex robots and the consent problem manufacturers do not talk about

28:10 – Algorithms as “opinions in code” and how radicalisation happens

33:10 – Removing bias: conversations, perspective checks, and inclusive design

35:00 – Grok, MechaHitler, and what happens when platforms mirror their owners

36:45 – Deepfake porn, consent, and why regulation is finally catching up

38:10 – No, AI will not magically deliver a four-day work week

41:10 – Future jobs: law, AI assistants, and why juniors still need fundamentals

44:15 – Indigenous knowledge, language revitalisation, and the full-circle AI story

46:50 – Rapid fire: brain chips, Waymo, smart glasses, and AI “snog marry avoid”

49:55 – What we should do now: regulatory sandpits and real guardrails

In the Blink of AI is made possible by our wonderful partners

Stripe

For early-stage, venture-backed founders – Stripe Startups is where to start. Enrol in the program and receive access to credits on Stripe fees, expert insights, and a focused community of other founders building on Stripe.

Apply for Stripe Startups at https://www.dayone.fm/stripe

✨ Connect with Georgie Healy

Linkedin: https://www.linkedin.com/in/georginahealy/

Instagram: https://www.instagram.com/georgina_healy/

Twitter: https://x.com/georgina__healy?lang=en

The Day One Network

In The Blink of AI is part of Day One, the podcast network dedicated to founders, operators & investors.

Sign up to get your weekly insights into the up-and-coming AI startups: https://dayone.fm/newsletter

Mentioned in this episode:

Stripe Startups

