Who’s to Blame When AI Screws Us Over? with Jason Kapadia, Matt Webster and Jag Sharma - E106
When AI goes wrong and signs you up to an awful legal deal, who’s to blame - you, or the agent?
AI is crossing the line from ‘nice-to-have’ to ‘invisible infrastructure’, and once it does there’ll be no turning back.
We dive into the AI-first future, the landmark trial on addictive design in social media, and question whether OpenClaw is already plotting the demise of the ‘meat bags’...
This episode of Disconnected covers:
The blurred line between tool and infrastructure
Accountability when AI fails
Addictive design and teens’ mental health
The rise of agentic AI with a new open-source tool
Episode Highlights:
“When does a tool stop being a tool and start being infrastructure? We tend to talk about AI and a lot of emerging tech as if it's just another productivity layer… but history suggests it's not how these things evolve.” - 1:50 - Jag Sharma
“I'm seeing the UX of how I ask questions change how I go about getting information. I think I'll be looking back at 2025, 2026 and going, ‘Oh yeah, that's when I started to ditch traditional Google.’” - 9:45 - Matt Webster
“Once systems are right enough, most of the time, we as human beings start conserving that cognitive energy and checking less… that gradual over‑trust over time can compound.” - 11:40 - Jag Sharma
“This lawsuit is a shift from harmful content to harmful design. This could be a massive turning point for how social media and AI are treated legally…” - 33:20 - Jag Sharma
“This is kind of legitimately the moment that we ditch the passive chat bot chit chat… and step into agentic AI stuff that actually hustles on your behalf all day.” - 40:15 - Jason Kapadia
“What if your AI agent signs off on some horrendous legal terms? Are you liable for that contract? Can you sue your AI agent like you could a human personal assistant?” - 49:35 - Matt Webster
“What I initially thought was novelty is actually the gateway into turning this kind of activity into social normalisation… we’re trusting AI not just to answer questions, but to take actions we might not double‑check.” - 53:05 - Jag Sharma
Links & references:
Matt Webster:
https://www.linkedin.com/in/mattwwebster/
Jason Kapadia:
https://www.linkedin.com/in/jasonkapadia/
https://www.instagram.com/jasonkapadia/
Jag Sharma:

