
This week we're doing something a little different. Instead of discussing AI's greatest hits, we're pulling back the curtain on its greatest fumbles. Because it turns out that building the technology and fully understanding it are two very different things, and everyone is learning as they go.
Grab a coffee and/or a haggis and fried egg roll (my personal recommendation). This is a good one.
The Big Stories
Air Canada once let its AI chatbot promise a grieving passenger a bereavement discount that didn't exist. The man booked flights, was denied the discount, and took Air Canada to tribunal. He won. The airline's legal defence, and this is genuinely what they argued, was that the chatbot was "a separate legal entity" responsible for its own answers. The judge was not impressed. Neither was anyone else. Air Canada paid up, and presumably had a very long internal meeting afterwards. The lesson: if your AI is making promises, maybe check what it's promising.
Samsung had a rather memorable 2023. Three separate engineers uploaded confidential source code and internal meeting notes to ChatGPT while trying to fix bugs and summarise documents. Efficient, yes. Also: the data was then potentially used to train the model, meaning Samsung's proprietary code was, in a sense, no longer entirely Samsung's. The company banned AI tools internally shortly after. The irony of using AI to accidentally give away the thing you were using AI to protect is something that still deserves quiet appreciation.
IBM promised that its Watson AI would revolutionise cancer treatment at MD Anderson Cancer Center in Houston. By the time the project was shelved in 2017, it had cost around $62 million and had never been deployed for patient care. Watson had been trained largely on hypothetical cases rather than real ones, which is, medically speaking, not ideal. The hospital's own doctors reportedly disagreed with Watson's treatment recommendations in a significant number of cases. A $62 million lesson in the difference between impressive demos and functional products.
Zillow, the American property platform, built an AI to buy and flip houses. The model predicted prices, the company bought thousands of homes, and then the market did what markets occasionally do: it moved differently than expected. Zillow lost roughly $300 million in a single quarter of 2021, shut the entire division down, and laid off 25% of its workforce. In hindsight, trusting an algorithm to time the housing market was perhaps optimistic. But then again, so is the housing market generally.
Amazon ran an AI recruitment tool from 2014 to 2017 before discovering it had taught itself to penalise CVs that included the word "women's", as in women's chess club, women's sports team, and so on. The model had been trained on ten years of successful CVs, which had skewed male, and had simply learned the pattern. Amazon scrapped it. The system had done exactly what it was designed to do, which makes it somehow worse.
The Takeaway
The thread connecting all of these is not that AI is dangerous or that companies are reckless; it's that AI does exactly what it's told, based entirely on what it's been shown. Feed it bad data, and it learns bad lessons. Ask it to do something it wasn't properly built for, and it will still do something. Just not necessarily the right thing.
The companies that lost money here weren't all negligent. Some were genuinely trying to innovate. But they moved faster than their understanding, which is, historically, how most expensive mistakes happen.
Final Thought
Somewhere right now, a very confident AI is generating a very confident answer to a question it doesn't fully understand. Let's hope nobody's put it in charge of anything important.
Enjoy your week🙂,
Jamie
Attio is the AI CRM for modern teams.
Connect your email and calendar, and Attio instantly builds your CRM. Every contact, every company, every conversation, all organised in one place.
Then Ask Attio anything:
Prep for meetings in seconds with full context from across your business
Know what’s happening across your entire pipeline instantly
Spot deals going sideways before they do
No more digging and no more data entry. Just answers.

