AI ruins not only images but storytelling. Teaching and training have always been about storytelling. Even somewhat apocryphal stories could still convey a lesson. I remember telling the story of Lawn Chair Larry flying above L.A. with some weather balloons and a BB gun. And the lawn chair.
Then there was the story of the Gimli Glider. That one's entirely true, which makes it hit even harder. It shows how little mistakes (like units of measure) can end in disaster. How assumptions can be catastrophic. And how serendipity can be salvation. One of the pilots of that fateful flight happened to grow up in the area where their wide-body jet flamed out from fuel starvation. I won't retell the whole story here, though I've sketched the fateful fuel arithmetic at the bottom of this post.

The point of my post is to show how AI ruins s**t; all of these posts are starting to look the same, with the same breathless cadence and emoji: bullet, bullet, bullet. I didn't repost the original (I copy/pasted it below) because I don't want to insult anybody. I think we all should be using AI. But the same people who post the fantastical yellow Class 9 labels are posting this.
————————————
“Fuel quantity is normally a solved problem.
It is measured twice, converted once, and rarely questioned after pushback.
On this flight, nothing failed loudly.
No alarms. No smoke. No fire.
Only a quiet disagreement between numbers that were all internally consistent.
The aircraft departed with what the paperwork said was enough fuel.
The gauges agreed with the calculations.
The calculations agreed with the conversion.
The conversion, however, belonged to a system mid-transition—imperial habits inside a metric future.
At altitude, both engines flamed out within minutes of each other.
Electrical power degraded to what the designers had intentionally preserved: just enough.
Hydraulics followed their designed priorities.
The crew did not troubleshoot. They flew.
The destination was no longer an airport, but a former air force base repurposed as an industrial strip—active that day with ground traffic and spectators.
The aircraft arrived unpowered, fast, and heavy.
The landing gear partially complied.
The nose gear did not.
Only after everything stopped did the event receive its shorthand name: the Gimli Glider, operated by Air Canada.
What remains useful, decades later, is not the drama but the structure:
• Systems can agree with each other and still be wrong together.
• Transitional states—units, standards, cultures—are operationally hazardous long after they are administratively complete.
• Design choices made for rare failures quietly decide what “survivable” means when procedure ends.
• Professionalism often looks like refusing to solve the wrong problem under time pressure.
Most complex organizations don’t fail at the edge of capability.
They fail in the overlap—between old assumptions and new defaults—where everyone is technically correct, and no one is fully aligned.”
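————————————

And since the post never actually gives you the numbers: the whole accident turns on one multiplication. Here's a back-of-the-envelope sketch in Python. The figures are approximate ones from public accounts of Air Canada Flight 143 (they vary slightly between retellings), and the variable names are mine.

# Approximate fuel arithmetic behind the Gimli Glider
# (Air Canada Flight 143, 1983). Figures rounded from public accounts.

LB_PER_LITRE = 1.77    # factor the ground crew used (pounds per litre)
KG_PER_LITRE = 0.803   # factor the metric 767 actually needed (kg per litre)

fuel_required_kg = 22_300   # flight-plan fuel for the leg
dripstick_litres = 7_682    # fuel already in the tanks, measured in litres

# The slip: litres were multiplied by the pounds factor,
# and the result was read as kilograms.
assumed_kg_in_tanks = dripstick_litres * LB_PER_LITRE  # about 13,597, really pounds
uplift_litres = (fuel_required_kg - assumed_kg_in_tanks) / LB_PER_LITRE  # about 4,917 L

# What was actually on board at pushback:
actual_kg = (dripstick_litres + uplift_litres) * KG_PER_LITRE
print(f"Crew believed: {fuel_required_kg:,} kg")
print(f"Actually had:  {actual_kg:,.0f} kg")  # about 10,117 kg, less than half

Every number in that chain agrees with the one before it, which is the one good point buried in the slop: the gauges, the paperwork, and the math were all internally consistent, and all wrong together.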

