The science behind the movie 'Ghost in the Shell'

The creators of the movie 'Ghost in the Shell' needed to make an imaginary world feel real by understanding technologies that don't yet exist.

Hard science fiction relies on imaginary technology that operates according to clear rules and derives from current technical possibilities. The story is more arresting and realistic if the far-fetched technology that powers it can be seen to have contemporary antecedents.

In the movie "Ghost in the Shell," and the Japanese manga and anime it is based on, humans have managed to fuse technology and biology. Scarlett Johansson plays the Major, a soldier who fights terrorists using a fully cybernetic body that houses a human brain. In this world, hackers can steal your very proprioception—the physical and psychological sense of where your body ends and other matter begins. 

What would it take, technologically speaking, to create the world of "Ghost in the Shell" in real life? The four technologies necessary to bridge the divide between our world and the Major’s are security against digital intrusion, edge computing, compute power, and interface technology. 

We sat down recently with Kirk Bresniker, chief architect at Hewlett Packard Labs, and Richard Lewington, senior communications manager at Hewlett Packard Enterprise. Both were part of an HPE team that advised the filmmakers on how to make an imaginary world feel real by understanding the demands of a technology that does not yet exist. 



Security against digital intrusion

"In a world where anything can be hacked, including your visual stream, how do you know what real means?" asks Lewington. "In the real world, we have control of our senses, so you can believe what you’re seeing. Drugs and injuries can alter our perceptions but not replace them wholesale. But if all your senses are mitigated and mediated by computers, how do you know?"

One of the suggestions the HPE team made to the filmmakers was based on the security work it did for The Machine, an example of the Memory-Driven Computing architecture that HPE believes will transform IT infrastructure in coming years. 

"If we can make The Machine unhackable, employing a root or chain of trust from the silicon up, perhaps that same ability could be given to the Major," says Lewington. "That ability could be given to the Major by, say, a teammate, who could transfer the code in a tactile manner."

The Machine employs a type of hardware initialization that uses an extra chip attached to the motherboard, which provides what Bresniker calls "self-awareness circuits" that can check for compromised components before they are ever powered on.

"Using attestation, we create a chain of demonstrable proofs," says Bresniker. Before The Machine starts, for example, a technology surveys the entire system and compares it with a ground state that is cryptographically protected from alteration. Anything different from that known truth is suspect. Theoretically, such a system could sift through the memories of a cyborg like the Major in order to distinguish true memories from implanted ones.
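The attestation idea Bresniker describes can be sketched in a few lines: measure each component, compare the measurement against a protected ground state, and treat any difference as suspect. The component names, firmware blobs, and manifest below are hypothetical illustrations, not HPE's actual implementation.

```python
import hashlib

def digest(blob: bytes) -> str:
    """Measure a component by hashing its firmware image."""
    return hashlib.sha256(blob).hexdigest()

# Hypothetical "ground state": digests recorded at provisioning time and
# (in a real system) cryptographically protected from alteration.
TRUSTED_MANIFEST = {
    "memory_controller": digest(b"mc-firmware-v1"),
    "fabric_bridge": digest(b"fb-firmware-v1"),
}

def attest(components: dict[str, bytes]) -> list[str]:
    """Return names of components whose measured digest differs from the
    trusted manifest. Anything different from the known truth is suspect."""
    return [name for name, blob in components.items()
            if TRUSTED_MANIFEST.get(name) != digest(blob)]

# A clean system measures exactly what was provisioned.
print(attest({"memory_controller": b"mc-firmware-v1",
              "fabric_bridge": b"fb-firmware-v1"}))

# A tampered component fails attestation before it is ever powered on.
print(attest({"memory_controller": b"mc-firmware-v1",
              "fabric_bridge": b"fb-firmware-EVIL"}))
```

The chain-of-trust property comes from repeating this check at every layer: the silicon verifies the firmware, the firmware verifies the boot loader, and so on up the stack, each link measured before it runs.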

Edge computing

"Ghost in the Shell" depicts a massively networked world in which literally everything and everyone is exchanging digital information. By definition, such a world must depend heavily on edge computing, an emerging technology that co-locates sensors, storage, and compute at the edge of the network, avoiding the latency incurred by shipping data to a central data center for processing.

Imagine how long it would take, even at lightning-fast processing speeds, if every human being the world over—the overwhelming majority of whom use technological augmentations to their sensoria—had to send every visual data point, every sound file, every haptic input back to a data center for processing and then wait for all those bits and bytes to come back to the sender. 

You would need exponentially more powerful and ubiquitous edge computing to power the world of "Ghost in the Shell." Every sensor would have to be capable of in situ processing, so as to make the augmentations work in real time. 
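The latency argument can be made concrete with back-of-the-envelope arithmetic. The distances, processing times, and 60 Hz frame budget below are illustrative assumptions, not measured figures, but they show why round trips to a distant data center cannot keep up with a real-time sensory stream.

```python
# Why sensory augmentation must be processed in situ: even at the speed
# of light in fiber, a round trip to a distant data center blows the
# budget for one frame of a real-time visual stream.

SPEED_IN_FIBER_KM_S = 200_000   # light in optical fiber, roughly 2/3 of c
FRAME_BUDGET_MS = 1000 / 60     # one frame of a 60 Hz sensory stream

def round_trip_ms(distance_km: float, processing_ms: float = 0.0) -> float:
    """Propagation delay there and back, plus server-side processing."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000 + processing_ms

cloud = round_trip_ms(2_000, processing_ms=5)  # data center 2,000 km away
edge = round_trip_ms(1, processing_ms=5)       # on-body or curbside node

print(f"frame budget:     {FRAME_BUDGET_MS:.1f} ms")
print(f"cloud round trip: {cloud:.1f} ms")   # over budget before any queuing
print(f"edge round trip:  {edge:.2f} ms")
```

And this counts propagation alone; queuing, routing, and congestion in a real network only widen the gap, which is the argument for putting the compute next to the sensor.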


Compute power

The Major wears a thermoptic suit that allows her to become invisible. This suit would require prodigious computing power. So would the film's three-dimensional tactical maps. When the Major responds to a geisha attack in a hotel, she jacks into the hotel's communications and security system and builds a data visualization to assist her in quickly developing a defense strategy.

Such projects require far more computing power and speed than today's computers can deliver. Memory-Driven Computing architectures like The Machine offer a way around this bottleneck by combining massive pools of non-volatile memory with fast memory fabric and task-specific processing. 


Interface technology

We have already made progress in merging technology with the body, including pacemakers, subcutaneous insulin pumps, mind-controlled prostheses, devices to assist in hearing and seeing, and internal support hardware. We have even begun to use biological elements, like the DNA of bacteria, to store information. But we are far from the wholesale integration of biology and technology that you see in the film.

When we asked HPE research scientist Miranda Mowbray to imagine what material computer "brains" are likely to be made of a century into the future, she said, "It'll be biological. They are likely to be formed of huge numbers of genetically engineered E. coli or other simple organisms, interacting with each other and their environment. Squishy computing is the future."

In "Ghost in the Shell," that future has become the present. As we continue to make computers more biological and bodies more mechanical, engineers will explore ways to make that integration more seamless.  

"Ghost in the Shell" is coming soon to Digital HD and Blu-ray.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.