
Crash Landings and Silver Linings

5th October 2022

“Mistakes are the portals of discovery.”

James Joyce

Once upon a time, in the not-so-distant past… one of my employees cocked up. Badly.

It was late on a Friday night when – through a series of errors – the young man in question had managed to destroy about £150,000 worth of work. It was almost impressive, truth be told. He had been under-invoicing one of our larger clients and had only just realised that he had been charging them far less than he should have.

He swiftly went through the full seven stages of grief before reaching acceptance and realising that he’d have to own up to the mistake. Sweating through his shirt, he approached me about the issue and began apologising: offering his deepest regrets, floating possible solutions, and – in a roundabout way – asking to keep his job.

My response? I had just paid £150,000 to train one employee never to make that mistake again, and I had learned about a dangerous flaw in my business that could have been even more destructive – why would I fire him? I sent him home with strict orders not to think about the problem at all over the weekend, and to come back refreshed and with a positive attitude on Monday morning.

Whilst this was a problem, the speed with which the young man had owned up to his mistake had stopped it from escalating into a much greater issue. This demonstrates one of the most vital aspects of any work environment: the safety to come forward and admit mistakes without fear of punishment – and to be able to flag issues anywhere in the company without fear of retribution from higher-ups.

This is an example of a concept that has become known as Black Box Thinking.


Mistakes can be costly – after all, £150,000 is not to be sniffed at – but the way in which we approach failures determines how much we can gain from them. By creating an environment in which failure is recognised as a part of doing business, and in which employees can come forward to openly discuss issues without fear of reprisal, you ensure that any issues are addressed as soon as possible, minimising the damage.

Moreover, I didn’t see this as a monumental cock-up, as some might have done – I saw it as a valuable lesson in how we could improve our practices. If this employee – a competent one – was able to make that mistake, then perhaps it was an operational failure by the company.

If there is a vulnerability or flaw in your business that allows catastrophes like this to occur, then it’s not really on the employee in question; it’s on the business. If you leave a vulnerability exposed, it’s inevitable that someone, eventually, will accidentally trigger it.

Black Box Thinking is a mindset that values openness and transparency above fear of addressing a failure. But where does it come from?


No-one understands Black Box Thinking better than the aviation industry. You might have heard the statistic that you’re more likely to be killed by lightning than in a plane crash, and there’s a very good reason for this.

Aeroplanes are – or rather were – death traps. That we could fly through the sky at all in the 1920s is both a marvel of human ingenuity and a testament to our arrogance. In the early years of aviation, crashes were a common part of flying. In 1929 alone there were 51 crashes, which would equate to roughly 7,000 crashes at today’s volume of flights.

In 2017, by comparison, there were only 10 fatal airline accidents – a dramatic improvement in anyone’s book. The improvement in safety was driven by technological updates and carefully planned protocols, and both advances were underpinned by the concept of Black Box Thinking.


A ‘black box’ is a small, nearly indestructible block of sensors and recorders designed to survive the destruction of a plane crash. Its single purpose is to chart and record every action leading up to a disaster. In the tragic event of a crash, the black box can be salvaged to learn what events led up to the incident, providing insight into the altitude, vectoring and even the pilots’ communications.

Over the decades, the aviation industry has used black boxes to learn from each and every crash, implementing new protocols and technologies to ensure that no incident is ever repeated. Year after year, the industry became safer and safer as it learnt from its mistakes.

The concept known as Black Box Thinking has evolved in tandem with the use of black boxes, but it is not as simple as “learning from one’s mistakes”. The term came to be associated with the strict regulations within the aviation industry around whistleblowing and openness.

Due to the incredible risks of flying and the potential for catastrophic loss of life, there is no tolerance for staying silent about issues or ignoring potential failures. Any employee within the industry is expected to speak up immediately if they see anything that might indicate a failure or potential risk. In turn, employees know that any declaration they make will be free of retribution.

If an air steward suspects that their pilot may have had a whiskey or two in the pre-flight lounge, or if an engineer suspects that maintenance hasn’t been carried out to the right standard, they are expected to immediately flag the issue – whilst they in turn expect that they will face no reprisal from the captain or lead engineer for doing so.

If we pause to consider the mistakes we have made throughout our lives, we will find that many would have led to far worse disasters had they been discovered any later. That £150,000 mistake was a blow to my business, sure – but we soon came to realise that, had it not been discovered, the same mistake made later in our operations could have cost us millions. That is the value of Black Box Thinking.

Now, if a company had approached me and told me they could identify and fix a flaw in my business that would eventually cost me £1.5 million, and all it would cost was £150,000? That’s one-tenth of the overall loss – a bargain in anyone’s book.


Black Box Thinking isn’t just about learning from your mistakes, it’s a mindset shift that encourages you to value openness and transparency, and to disregard blame and repercussions.

It’s more than putting on a brave face following a disaster; it’s a fundamental shift in perception: a refocusing on what is actually the biggest problem facing your business – the danger of ignoring key issues. If you can achieve this refocusing, I guarantee that you’ll see the positive effects within a few short years.

£150,000 was probably one of the most expensive lessons I’ve ever been taught, but I’m glad it was one that my company learnt. It’s not only helped us patch a potential threat to the business, but also helped us change our perspective on failure as a collective. In many ways, it was one of the best things that ever happened to us.

That said – he’d better not do it ever again!
