
APOLLO 11’s almost-aborted lunar landing: lessons learned


“It’s a 1202… What is that? Give us a reading on the 1202 Program Alarm.” 
                -Neil Armstrong, Apollo 11 Commander, minutes before lunar landing

We have just celebrated the 50th anniversary of the lunar landing, arguably the most significant news event of the last century. One relatively small but significant moment during this monumental event was Apollo 11’s 1202 alarm, which occurred at about 3,000 feet above the lunar surface and almost led to an abort of the historic landing. 

The technology (and human) “fail” in brief: The software on the lunar module (LM) had to handle information such as the state vector (location and movement), module performance, and landing parameters. The Apollo Guidance Computer that ran this constellation of programs had only a single processor, so its software was deconstructed into modules that ran one at a time under a priority scheduler rather than truly in parallel. At the time of the landing, spurious radar signals were flooding this limited-capacity computer with extra work, leaving no room for vital programs to run. The alarms (one 1201 and multiple 1202s) signaled this condition, known as Executive Overflow (computer overload). Important lessons learned from this key moment have relevance to AI in medicine and health care:
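The overflow mechanism described above can be sketched in a few lines of code. This is an illustrative Python sketch, not actual AGC software: the names (`Executive`, `ExecutiveOverflow`, the priority values, the number of job slots) are assumptions chosen to mirror the idea of a fixed-capacity, priority-scheduled executive that raises an alarm when flooded with low-priority work and then sheds that work on restart.

```python
# Illustrative sketch of a fixed-capacity, priority-scheduled executive.
# All names and numbers here are hypothetical, chosen only to mirror the
# idea described in the text; this is not the AGC's actual code.

class ExecutiveOverflow(Exception):
    """Raised when every job slot ("core set") is occupied -- a 1202-style alarm."""

class Executive:
    CORE_SETS = 7  # fixed number of job slots (assumed for illustration)

    def __init__(self):
        self.jobs = []  # list of (priority, name); highest priority first

    def schedule(self, priority, name):
        # A new job needs a free slot; if none remain, the executive overflows.
        if len(self.jobs) >= self.CORE_SETS:
            raise ExecutiveOverflow("1202: no free core sets")
        self.jobs.append((priority, name))
        self.jobs.sort(reverse=True)  # keep the highest-priority job first

    def restart(self):
        # Software restart: shed low-priority work, keep the vital jobs.
        self.jobs = [j for j in self.jobs if j[0] >= 20]

exec_ = Executive()
exec_.schedule(30, "guidance")   # vital landing task
exec_.schedule(25, "throttle")   # vital landing task
try:
    # Spurious radar-driven requests flood in faster than they retire.
    for i in range(10):
        exec_.schedule(5, "radar_%d" % i)
except ExecutiveOverflow:
    exec_.restart()              # drop low-priority work, keep flying
```

After the restart, only the high-priority guidance and throttle jobs survive, which is roughly how the real system kept flying through the alarms: the vital landing programs were preserved while lower-priority work was discarded.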

Human oversight is essential. During these alarms, astronaut Buzz Aldrin noticed that a particular request (the 16/68 display, showing range to the landing site and LM velocity) consistently triggered the 1202 alarm. This astute observation enabled an easy solution: that data could be relayed to the astronauts by voice from Houston rather than computed and displayed onboard. In addition, it was the computer specialist Jack Garman in Mission Control who wisely advised a “no abort” call for the lunar landing. This moment illustrates how critical human insight, applied at the right time, is essential to the correct decision. 

Over-preparedness is key. These alarms were unfamiliar to nearly everyone in the program, including the astronauts. But Richard Koos, a NASA simulation supervisor, had trained Garman and CAPCOM astronaut Charlie Duke on the program alarms as part of comprehensive, multidisciplinary training a few weeks before the mission, just to cover that remote possibility. This training accelerated the response to the alarms and helped assure the success of the landing.  

Human-machine synergy is best. Contrary to public perception, there was no possibility that the module could have been landed without computer support. The partnership worked best with Armstrong fine-tuning the final approach while the computer managed the bulk of the process. Human and machine collaborating during the landing were better than either functioning alone. 
