Why AI, Big Data and Machine Learning Aren’t a Silver Bullet: The Bullet Hole Misconception

In WWII, the Allies analyzed the bullet holes in planes returning from missions and concluded they should reinforce the areas where the most bullet holes were found. This may seem like the obvious solution, but it’s dead(ly) wrong. The place to reinforce was where the bullet holes weren’t, because planes taking damage in those areas weren’t returning! It’s the classic case of survivorship bias: the only data available came from the survivors.
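To see how conditioning on survivors inverts the conclusion, here is a minimal simulation sketch (the area names, hit probabilities, and counts are all hypothetical, chosen only to make the effect visible):

```python
import random

random.seed(0)

# Hypothetical loss rates: a hit to the engine is far more likely to
# down a plane than a hit to the fuselage. (Illustrative numbers only.)
P_LOST_IF_HIT = {"engine": 0.8, "fuselage": 0.1}
AREAS = list(P_LOST_IF_HIT)

all_hits = {area: 0 for area in AREAS}       # ground truth, unobservable
returned_hits = {area: 0 for area in AREAS}  # what the analysts actually see

for _ in range(10_000):
    area = random.choice(AREAS)  # each simulated plane takes one hit
    all_hits[area] += 1
    if random.random() > P_LOST_IF_HIT[area]:  # plane survives the hit
        returned_hits[area] += 1

print("Holes visible on returning planes:", returned_hits)
print("Holes across all planes:          ", all_hits)
```

Returning planes show several times more fuselage holes than engine holes, so a naive analyst reinforces the fuselage. The engine holes are scarce in the data precisely because those planes went down.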

Read Daniel Siegel’s excellent article on the bullet hole misconception and the limitations of Big Data here.

The quest for the perfect AI / Data Analytics / Machine Learning tool is today’s Holy Grail. Monty Python would no doubt have an interesting twist on that. But…

While data analytics and AI (and other tools) are useful for finding the right data, that alone is insufficient – we could still use the data foolishly or even harmfully, like reinforcing planes in all the wrong places. Converting data into useful information requires an a priori mental model, one that is then tested and refined against the data we’re given. Building systemic intelligence (SysQ) increases our capacity to generate the most useful mental models. Using it helps us avoid working on the wrong issues, applying ineffective analytical techniques, and reaching erroneous conclusions.
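As a sketch of what such an a priori model buys us, continue the bullet-hole example: if we assume (the mental model) that hits land roughly evenly across areas, then the deficit of holes in an area on returning planes estimates how lethal hits there are. The counts below are the hypothetical ones from the simulation above:

```python
# A priori model: hits land roughly uniformly across areas, so any area
# that is underrepresented on *returning* planes must be losing its planes.
def estimated_loss_rate(returned_hits, hits_per_area):
    return {area: 1 - count / hits_per_area
            for area, count in returned_hits.items()}

# Hypothetical counts carried over from the simulation above
# (each area was hit on roughly 5,000 of the 10,000 sorties):
print(estimated_loss_rate({"engine": 1_000, "fuselage": 4_500}, 5_000))
# Prints loss-rate estimates of about 0.8 for the engine and 0.1 for the
# fuselage: the model recovers the true vulnerabilities that the raw
# bullet-hole counts hid.
```

The raw counts pointed at the fuselage; the same data, interpreted through the model, points at the engine.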
