Show notes
Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI.

Please support this podcast by checking out our sponsors:
– Linode: https://linode.com/lex to get $100 free credit
– House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
– InsideTracker: https://insidetracker.com/lex to get 20% off

EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky

Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
– Check out the sponsors above; it’s the best way to support this podcast
– Support on Patreon: https://www.patreon.com/lexfridman
– Twitter: https://twitter.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that time.

