2021 MIRI Conversations

By: Peter Barnett

About this audio content

These are AI-generated podcasts of the 2021 MIRI Conversations: https://www.lesswrong.com/s/n945eovrA3oDueqtq This podcast is a personal project because I like listening to audio, and there weren't good audio versions of the conversations. Please remember that these conversations are from 2021.

Peter Barnett
Philosophy, Social Sciences
    Episodes
    • Shah and Yudkowsky on alignment failures
      Sep 10 2025

      This is the final discussion log in the Late 2021 MIRI Conversations sequence, featuring Rohin Shah and Eliezer Yudkowsky, with additional comments from Rob Bensinger, Nate Soares, Richard Ngo, and Jaan Tallinn.

      The discussion begins with summaries and comments on Richard and Eliezer's debate. Rohin's summary has since been revised and published in the Alignment Newsletter.

      This was originally posted on 28th Feb 2022.

      https://www.lesswrong.com/s/n945eovrA3oDueqtq/p/tcCxPLBrEXdxN5HCQ

      2 hr 46 min
    • Christiano and Yudkowsky on AI predictions and human intelligence
      Sep 10 2025

      This is a transcript of a conversation between Paul Christiano and Eliezer Yudkowsky, with comments by Rohin Shah, Beth Barnes, Richard Ngo, and Holden Karnofsky, continuing the Late 2021 MIRI Conversations.

      This was originally posted on 23rd Feb 2022.

      https://www.lesswrong.com/posts/NbGmfxbaABPsspib7/christiano-and-yudkowsky-on-ai-predictions-and-human

      1 hr 13 min
    • Ngo and Yudkowsky on scientific reasoning and pivotal acts
      Sep 10 2025

      This is a transcript of a conversation between Richard Ngo and Eliezer Yudkowsky, facilitated by Nate Soares (and with some comments from Carl Shulman). This transcript continues the Late 2021 MIRI Conversations sequence, following Ngo's view on alignment difficulty.

      This was originally posted on 21st Feb 2022.

      https://www.lesswrong.com/s/n945eovrA3oDueqtq/p/cCrpbZ4qTCEYXbzje

      1 hr 1 min