3: Can AI discriminate?
About this audio content
But what about when AI is used for decisions that actually matter? Like whether a person with disabilities gets the support they need to live independently, or how the police predict who is going to commit a crime?
With people’s rights and freedoms on the line, the stakes are much higher – especially because AI can discriminate.
To unpack all of this, we’re joined by Griff Ferris, Senior Legal and Policy Officer at the campaign organisation Fair Trials, to discuss the extent to which AI can discriminate, the impact it has on people who are already marginalised, and what we can do about it.
Mentioned in this episode:
- Fair Trials’ predictive policing quiz
- Fair Trials’ report Automating Injustice
- The HART algorithm used by Durham Police
- The Government’s AI regulation white paper
- Public Law Project’s Tracking Automated Government register