How We Saw Armstrong’s First Steps

Posted on Mon, July 8, 2019

As we approach the 50th anniversary of humankind’s first steps on the Moon, our ability to reflect on those events is thanks in part to how the moment was shared with people around the world. The Apollo 11 mission was not the first time television signals returned from the orbit of the Moon, but the landing in July 1969 was by far the most important to get just right. From 240,000 miles away, millions of people around the world needed to see this incredible moment, and that was just one of the challenges of sharing the experience.

Interior view of the Mission Operations Control Room (MOCR) in the Mission Control Center (MCC) during the Apollo 11 lunar extravehicular activity (EVA). The television monitor shows astronauts Neil A. Armstrong and Edwin E. Aldrin Jr. on the surface of the moon.

Based on the rotation of the Earth and the position of the Moon at the time of the expected first steps, Houston, Texas, the home of NASA's mission control, would be out of range to receive the television signal sent from the lunar module. Instead, receiving stations in California and Australia would coordinate to provide the best transmission possible. On top of that, issues of temperature, size, weight, power use, and light availability complicated the real-world operation of a television broadcast from our natural satellite. And sharing this moment with the world through television was a priority: magazine and newspaper subscriptions were on the decline, and televisions were in 95% of homes in the United States. Despite the challenges, the broadcast was, of course, a major success, with over 500 million viewers globally, about 14% of the world's population of 3.5 billion at the time. Inside the U.S., 53 million households, or 125 million viewers (about 93% of homes with televisions), participated in what is still the most watched television moment in this country's history.

The Apollo 11 mission was not the first time television signals returned from the orbit of the Moon, but the landing in July 1969 was by far the most important to get just right.

How exactly did NASA develop and utilize technology of the time to facilitate such a moment? As with all components of the space program, NASA sought out technologies that would work reliably under the difficult circumstances of space travel. On the advice of the military, NASA contracted with Westinghouse, for $2.29 million, to develop a small black-and-white camera that condensed the capabilities of a large television studio camera into a lightweight, manageable package. The company had recently made advances in low-light camera technology, something NASA knew would benefit them given the complicated lighting in space and on the Moon. Challenges also included finding high-quality lenses and converting the signal to the U.S.-standard NTSC format. While lenses were found quickly, the unconventional solution for the signal format sounds unusual now: engineers simply displayed the slow-scan signal on a monitor and filmed the monitor's screen with a properly formatted NTSC camera.
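NASA's actual conversion was optical, done by pointing a camera at a monitor, but the underlying problem can be sketched in software. The Apollo lunar camera ran at roughly 10 frames per second, while NTSC requires about 30, so each slow-scan frame had to fill roughly three NTSC frame slots. This is only a toy analogy, with the frame rates as stated assumptions, not a description of NASA's hardware:

```python
# Toy analogy of frame-rate up-conversion (NASA's real method was
# optical: filming a slow-scan monitor with an NTSC camera).
# Assumed rates: ~10 fps slow-scan source, ~30 fps NTSC output.

def upconvert(frames, src_fps=10, dst_fps=30):
    """Repeat each source frame to fill the higher output frame rate."""
    repeat = dst_fps // src_fps  # 3 output frames per input frame
    return [f for f in frames for _ in range(repeat)]

slow_scan = ["frame0", "frame1", "frame2"]  # placeholder frames
ntsc = upconvert(slow_scan)
print(len(ntsc))  # 9 output frames from 3 input frames
```

The tradeoff is the familiar one from the broadcast: no new visual information is created, so motion looks stuttery and soft compared with a native 30 fps picture.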

This black and white television camera made by Westinghouse is a replica of the camera used to transmit images of astronauts Neil Armstrong and Buzz Aldrin on the first mission to the Moon. The original camera is still on the Moon.

The Westinghouse camera was stowed for flight in the lunar module's Modular Equipment Stowage Assembly (MESA), a compartment near the ladder that Armstrong climbed down to reach the Moon's surface. To activate the camera, he pulled a handle that released the MESA door. Engineers had mounted the camera upside down to secure it to the door, and it sat tilted at an 11-degree angle because of how the door rested in its final position. Both issues were corrected during retransmission of the signal back on Earth.
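The upside-down correction amounts to rotating every frame 180 degrees before rebroadcast. On Earth this was done in analog hardware, but the operation itself is simple enough to sketch in software, treating a frame as a 2D grid of pixel values:

```python
# Illustrative sketch only: undoing an upside-down camera mount by
# rotating each frame 180 degrees. The real Apollo fix was applied
# in analog equipment during retransmission, not in software.

def rotate_180(frame):
    """Rotate a 2D frame (a list of pixel rows) by 180 degrees."""
    return [row[::-1] for row in frame[::-1]]

frame = [[1, 2],
         [3, 4]]
print(rotate_180(frame))  # [[4, 3], [2, 1]]
```

The 11-degree tilt is a different matter: a small arbitrary rotation requires resampling pixels, which is why leaving it to ground equipment (or simply tolerating it) was the practical choice.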

Anyone watching the original broadcast, whether on July 20, 1969, or today, knows that it looks very fuzzy. Circumstances were working against NASA from the start. The distance the signal needed to travel, along with the conversions and retransmissions required to reach televisions in homes, reduced the quality of the picture, lowering the resolution and causing a mismatch in the signal layers. That is why the astronauts sometimes appear more like translucent ghosts than people.

Apollo 11 astronaut Neil Armstrong climbs down the ladder of the lunar module Eagle to take the first steps on the lunar surface.

So what did viewers see in their homes, at community gatherings, in offices, and at millions of other locations? The best signals were received at the Australian stations at Honeysuckle Creek and the Parkes Observatory, as well as at Goldstone in California. NASA used the Goldstone signal initially, but its contrast was far too high, so just before Armstrong's first steps they switched to the Honeysuckle Creek signal, which had better detail. In the minutes after the steps, they ultimately broadcast the signal received at Parkes, a story told humorously in the 2000 film The Dish. Once received, the signal still needed to travel to Houston, directly from California or via satellite from Australia. Within Australia, however, the broadcast was distributed directly rather than routed through Houston, so viewers there saw it 0.3 seconds sooner than the rest of the world.

Later Apollo missions used better cameras, usually color ones, but none ever drew the same level of attention as Apollo 11. Today we can thank technicians for having recorded the broadcast, though the tapes assumed to be of the highest quality were never located despite searches over the last few decades. What we see now on YouTube or in films are high-definition scans of the best available copies. So while those of us who did not witness Armstrong's steps firsthand cannot relate our own stories about that shared cultural experience, we can continue to share in the memory thanks to the technological choices made in the late 1960s to bring that moment back to Earth.
