In Korea in 1960, my father, a decorated U.S. Army colonel and pilot with more than 3,000 flight hours, was one of three pilots tasked with flying VIPs. He sensed they shouldn't fly. The VIPs insisted. On gut instinct, he opted out. The Piasecki helicopter broke up in flight just after takeoff, killing everyone on board.
Twenty years later, my father was assigned to the Joint Chiefs of Staff as senior aviation advisor. He was asked to advise on what would become the Iran hostage rescue mission, Operation Eagle Claw. He knew the mission was ill-fated; the decision was made to use a less experienced pilot anyway. In April 1980, the mission failed catastrophically when helicopters crashed in the Iranian desert. It was during this time that my father convinced me to enlist in the Marines and make a difference.
I did. And by 1983, while completing my MS in Expert Systems and Decision Support Systems at USC, I'd developed an early version of what would become The Compass Protocol. My thesis focused on spatial topology for night-vision flying—how pilots build mental models to navigate 3D space in total darkness. I didn't know it then, but I was studying the structure of expertise itself.
In March 1984, I was assigned to a CH-53 squadron in Pohang, Korea. The mission was RAIDEX, an exercise designed to prove the Marines could execute complex night operations after the Iran failure. Six helicopters were scheduled to lift off at 4 AM into snow on a moonless night. My helicopter was YN666.
My crew invited me to sit in the jump seat. Something in me, the spatial reasoning I'd been studying, now internalized, said: don't add to the risk. I could feel the topology of the decision. I wished them safe flying and declined the invitation.
Thirty minutes later, YN666 crashed. My crew died. Of the six helicopters that lifted off, one never came back. I was the lone survivor only in the sense that I should have been on that bird.
I tried to help afterward. I sent an advisory to the Naval Safety Center with recommendations to prevent future tragedies. Months later, I was called into the CO's office and told in no uncertain terms to stop. I was grounded. No more flying. I spent a year managing simulators before being discharged as a Captain.
I felt I had left my Marines on the hillside. Their families would never know the truth.
But over nearly four decades of research across more than 50 U.S. government agencies, including all 18 members of the Intelligence Community, I collected over 4,000 diagnostic artifacts from Fortune 100 executives, intelligence analysts, government leaders, and business owners across nine countries. And I finally understood what saved me that night.
The systematic thinking protocol I'd been developing accelerated my expertise. It helped me make the right call with fewer than 1,000 flight hours, when my father had needed 3,000 to trust his instinct in a similar situation. My Marines died because they didn't have what I had. And I was silenced when I tried to share it.
Now, after 40 years, I understand it well enough to teach it. This is for them. This is for you. This is what I wish they'd had.
Semper Fidelis.