Show notes
Seven lawsuits blame OpenAI for enabling a mass shooting. Could the same legal theory come for DeFi?

Thanks to our sponsor!
Coinbase One
Get 20% off the first year of your Coinbase One annual plan
coinbase.com/unchained

Seven families just sued OpenAI in federal court, arguing ChatGPT was a defective product that helped plan a mass shooting. OpenAI's own safety team flagged the risk eight months earlier and did nothing. The legal theory being tested here, that software developers can be held liable for foreseeable misuse of their tools, is the same theory that has been circling DeFi for years. Meanwhile, April ended as the most hacked month in crypto history, with over $600 million stolen in roughly 30 exploits, most of them linked to North Korea and its weapons programs. DeFi United, a $300M relief coalition led by Aave, emerged as the industry's response. KK, Vy, and Jessi unpack what it means when the "code is law" defense starts to crack, why basic operational security is still not standard practice, and how close the Clarity Act actually is to crossing the finish line.

Hosts:
Katherine Kirkpatrick Bos, General Counsel at StarkWare. Previously held senior legal roles across DeFi and centralized exchanges.
Jessi Brooks, General Counsel at Ribbit Capital
TuongVy Le, General Counsel at Veda

Learn more about your ad choices. Visit megaphone.fm/adchoices



