What We Need to Get to Mars: Traveling to the Red Planet Will Require a Major Computer Upgrade

May 08, 2017 • Blog Post • By The Atlantic Re:Think


  • Landing on Mars is going to prove far more complicated and demanding than the Moon mission nearly 50 years ago

Traveling to the Red Planet will require a major computer upgrade and HPE is on the path to make that happen

Eight minutes before the lunar module landed, making Buzz Aldrin and Neil Armstrong the first humans ever to touch the Moon, the on-board computer balked.

The two had undocked from the command module, where their colleague Michael Collins remained in orbit, and were as happily anticipatory as any two people hurtling toward the Moon's pitted surface could be. Collins lightheartedly told mission control in Houston that everything was going "swimmingly". Charlie Duke, manning the capsule communicator's console at mission control, sounded buoyant.

Then came the alarm. Just after Aldrin was struck by the view of Earth from his front window, in the middle of the descent sequence, Armstrong told mission control that the computer was registering error 1202. Mission control scrambled to figure out what that meant - should the landing be stopped?

Twenty-two seconds ticked by. "Give us a reading on the 1202 program alarm", Armstrong radioed. Five more seconds passed. Duke told them to stay the course. Four minutes later, 3,000 feet from the Moon, the computer sent up an alarm again. This time the answer from Houston came more quickly. "Eagle, looking great", Duke said. "You're go".

When they landed, Duke let out a relieved exultation. "We copy you on the ground", he told Armstrong. "You got a bunch of guys about to turn blue. We're breathing again. Thanks a lot."

The computer problems, it turned out, were due to a data overload that forced the system into a limited reboot. Data related to the landing sequence had brought the computer to near capacity, and when the lander's rendezvous radar began flooding the system with spurious inputs, overload threatened to botch the mission. NASA forged ahead after Houston decided (in a matter of seconds) that the best bet was for the module to land. But even before the astronauts returned to Earth, engineers started looking at the computer issues that had almost put the whole operation at risk.
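The 1202 alarm reflected a scheduler that, under overload, restarted and kept only the most critical jobs running. The sketch below is a deliberately simplified Python illustration of that idea - not the Apollo Guidance Computer's actual executive - with made-up job names, priorities and costs, showing how low-priority work gets shed when a compute budget runs out.

```python
import heapq

def run_cycle(jobs, capacity):
    """Run as many jobs as fit in one cycle, most critical first.

    jobs: list of (priority, name, cost) tuples, where a lower priority
    number means more critical. capacity: the compute budget for the cycle.
    Jobs that no longer fit are shed, loosely mirroring how the Apollo
    executive dropped low-priority work during the 1202 overload alarms.
    """
    heapq.heapify(jobs)              # min-heap: most critical jobs pop first
    executed, shed = [], []
    budget = capacity
    while jobs:
        priority, name, cost = heapq.heappop(jobs)
        if cost <= budget:
            budget -= cost
            executed.append(name)
        else:
            shed.append(name)        # overload: skip this job this cycle
    return executed, shed

# A hypothetical landing-phase workload: guidance must survive,
# while radar cross-checks can be dropped under overload.
jobs = [(1, "guidance", 40), (2, "throttle control", 30),
        (3, "display update", 20), (4, "rendezvous radar", 25)]
executed, shed = run_cycle(jobs, capacity=90)
# executed keeps the three most critical jobs; "rendezvous radar" is shed
```

The essential design choice is that overload degrades the system gracefully instead of crashing it outright - the same property that let Eagle keep flying while the alarms sounded.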

The progress since then has been exponential. Computing capabilities have made leaps whose figurative distance almost matches the trip to that grey rock. Now, space exploration has turned to an even greater challenge with far more trying conditions: getting to Mars. NASA has said it intends to send manned missions to the Red Planet, and private companies like SpaceX are also developing plans to land there.

"The computing capacity and requirements for a Mars mission will be vastly larger than the Apollo program that landed us on the Moon", said Chris Carberry, the CEO and co-founder of Explore Mars, Inc., a nonprofit created to promote sending humans to Mars. "A pocket calculator - in fact, most washing machines - has more computing power than the computers we used to send Neil Armstrong and Buzz Aldrin to the surface of the Moon."

Getting to Mars is riddled with obstacles that include, not least, how and whether to come home. The technological and engineering hurdles range from the existential to the banal. Just getting off the ground is a significant challenge: any mission to Mars will require a significant amount of equipment that would be impossible, at this stage, to send in one trip. While most mission concepts require sending at least 20 to 40 metric tons to Mars, the largest vehicle that has successfully landed there, the Curiosity rover, weighed only about 1 metric ton. Payloads will have to be as light as possible, and the leaps that have been made in computers' size and weight will play a crucial role in keeping mission load down. At Hewlett Packard Labs, the R&D arm of Hewlett Packard Enterprise, where a new computing architecture called Memory-Driven Computing is in development, the company is advancing several technologies that will make computers lighter and faster - like photonics, which uses photons, or light, to transmit data in place of traditional, heavier, heat-producing copper wires.

But NASA could send equipment to Mars ahead of any potential crew, flying multiple missions to make sure that resources will be available when humans arrive. The real key is making sure that any mission is self-sufficient. That means no calling back to Houston about an error 1202, or for help with mechanical repairs, farming or medical emergencies. When it comes to its Mars strategy, NASA says "astronauts in deep space must be more self-reliant". One crucial element will be computers that can deal with unexpected parameters (one of the snags that caused Aldrin and Armstrong's computer to send out a program alarm) and constantly calculate ahead without needing to relay information back to Earth. This will call for complex analysis that correlates every available data source - a camera, a sensor, the navigation system, or a database of all the weather information ever collected from Mars - to drive insights right then and there.

The data itself will be coming from a variety of sources. Wearable sensors on astronauts' bodies will record and process biometric information, instantly flagging when someone's heart rate accelerates by even a beat. Cameras on board shuttles could also be constantly analyzing astronauts' facial expressions - is someone getting aggressive, or does someone else look overstressed? Meanwhile, navigational data will be streaming into the shuttle's computer, which will constantly need to readjust and account for any number of unexpected events - does the route need to change? It will also need to track the shuttle's functions - is a mechanical part in need of repair? The right computer will be able to answer these questions ahead of time.
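A minimal sketch of what such on-board triage could look like - every sensor name and threshold here is hypothetical, chosen only to illustrate that each check runs locally, with no round trip to Earth:

```python
def triage(readings):
    """Scan a dict of sensor readings and return any alerts, entirely on board.

    The sensor keys and thresholds are invented for illustration; a real
    mission computer would correlate far richer streams in the same spirit.
    """
    alerts = []
    if readings.get("heart_rate_bpm", 0) > 140:
        alerts.append("crew: elevated heart rate")
    if readings.get("stress_score", 0.0) > 0.8:        # e.g. from facial analysis
        alerts.append("crew: high stress indicated")
    if abs(readings.get("course_error_deg", 0.0)) > 0.5:
        alerts.append("nav: course correction needed")
    if readings.get("pump_vibration_g", 0.0) > 2.0:
        alerts.append("mech: pump may need repair")
    return alerts

# One snapshot of (fabricated) telemetry: a racing heart and a drifting course
alerts = triage({"heart_rate_bpm": 152, "stress_score": 0.4,
                 "course_error_deg": 0.7, "pump_vibration_g": 0.3})
# -> ["crew: elevated heart rate", "nav: course correction needed"]
```

The point is not the thresholds but the locality: every question the paragraph raises is answered by the shuttle's own computer, in the same place the data is produced.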

That could make a significant difference for automating routine tasks, but also for avoiding catastrophes. "You can't do any real time kind of computing from Earth", says Kirk Bresniker, chief architect of Hewlett Packard Labs and HPE fellow. "Say something goes wrong with environmental control. By the time you've decided what to do and you've communicated that need, well, they might be dead."
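The delay Bresniker alludes to is easy to quantify: light takes roughly 3 minutes to reach Mars at closest approach and more than 22 minutes at maximum separation, so a single question-and-answer exchange with Houston could take the better part of an hour. A quick back-of-the-envelope check in Python:

```python
SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_minutes(distance_km):
    """One-way signal travel time, in minutes, at the speed of light."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

# The Earth-Mars distance swings widely with the planets' orbital positions.
closest_km = 54.6e6      # approximate minimum separation
farthest_km = 401e6      # approximate maximum separation

print(round(one_way_delay_minutes(closest_km), 1))   # -> 3.0 minutes
print(round(one_way_delay_minutes(farthest_km), 1))  # -> 22.3 minutes
```

At those latencies, "real time" help from Earth is physically impossible - which is exactly why the mission computer has to make the call itself.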

What's more, the computers have to survive unforgiving physical conditions - extreme temperatures, new atmospheres, more radiation. When a computer is responsible for running key systems, ensuring that it stays functional is critical to the survival of the crew. NASA has run tests to harden traditional computer processors against radiation in an attempt to protect their circuitry from cosmic rays that can trigger errors in the chips. The Memory-Driven Computing architecture in development envisions one day swapping copper wires for optical communications in chips themselves - making the computer naturally more resistant to radiation effects in space.

But the most significant contribution of Memory-Driven Computing is its capacity to generate insights from significantly more data, hundreds of times faster than even today's most advanced computers - and to do more complex things with that data with minimal human intervention. "We're designing an architecture that would be very flexible", Bresniker says. "So whatever new tasks arrive, you have the processing power and you have the memory to not only store everything that's happened to every sensor, but also to pre-load information from other missions."

Set farther from Earth than anyone has ever been, missions to Mars will be harrowing in the old-fashioned sense. Once on the Red Planet, astronauts will face the loneliness of the true frontier, out of immediate reach of the rest of mankind. It will be dangerous, and like all explorers throughout history, the pioneers will have to decide what to bring and what to leave behind. But with new computing power capable of maintaining vastly more data than ever, the pioneers might be able to bring more knowledge with them - and be better equipped, with new resources, to adapt to what has never been seen before.

