Say goodbye to your keyboard

The human-computer interface is changing: Welcome to a world of holograms, augmented reality, and thought-controlled Word docs.

The way people interact with their computers and devices is set to change a lot over the next five to 10 years. That's right—goodbye keyboard and touchscreens and hello holograms, image-based interfaces, and thought-controlled Word documents. But could communicating more easily with our devices also mean they could communicate more easily with each other about us?

Will our devices be divulging our every move and thought to corporate marketing teams? Will my GPS or autonomous car, while easier to use in the next five years, suddenly only take routes to get me to work that go past a Starbucks?

"It's not just about how we personally will interact with our devices, but how other people and companies will interact with our devices," says Michael Facemire, an analyst at Forrester Research. "Human-computer interactions today aren't natural. We're still so early in the journey of what is a good digital experience."

For a while now, people have been hearing that the way we interact with our laptops, tablets, and smartphones is going to change—a lot.  

We've been hearing that in less than 10 years, we may no longer carry our smartphones in our bags or pockets but instead wear them on our wrists, for example. 

We'll tap out messages on touchscreen images displayed on our skin or on flexible electronics worn on our wrists.

Instead of using Skype to chat with a friend on a tablet or smartphone, people in the not-so-distant future will connect with friends via holograms that look like they're sitting next to them on the couch or in their office.

How soon will we see this new technology? How much will the human-device interface actually change in, say, the next decade? And what are some of the upcoming technologies that may surprise us or at least get us excited about hitting the year 2027? 

Actually, we do need your thought control

"In the next five years, this new human-computer interface technology will start coming into its own, but I think in the next 10 years it will be huge," says Dan Olds, an analyst at OrionX, a management consulting company. "I think we're going to start edging closer to the Holy Grail of human-computer interaction."

So what is this Holy Grail?

According to Olds, it's thought-based device control.

"There's a lot of research in the direction of enabling human thoughts to be able to control computing devices and physical objects too," says Olds. "I always have a lot of writing to do. When I lay down to go to sleep, paragraphs just come to me. Wouldn't you like a device that would capture that flow of thoughts and store it somewhere—put it on a screen, put it in Word?"

For Olds, that would be the human-computer interface Holy Grail.

With advances in computer power, along with artificial intelligence and sensors, this could be a real possibility in five to 10 years, he says.

With a device or sensors attached to the user's head, a flow of thoughts could be sent to a computer or storage device; the user would only need to think a set phrase, such as "OK, begin," to start the transmission.

The technology would also have to be attuned to the user's own brain, so that colleagues or even next-door neighbors couldn't use their thoughts to control someone else's laptop or smartphone.

"I'd be more than happy to wear a headset to make that happen," says Olds.

Interacting with machines as we interact with humans

The natural progression of human-computer interfaces is to get rid of keyboards and mice—and even touchscreens.

That's not a crazy idea, according to Krishna Venkatasubramanian, an assistant professor in the computer science department at Worcester Polytechnic Institute.

How we get there is a multi-dimensional question, says Venkatasubramanian, whose research focuses on how the Internet of Things (IoT) applies to healthcare and wearable devices.

There is a definite movement toward perceptual computing, a computer's ability to be aware of what is happening around it, given the growth of the IoT, which is bringing about refrigerators, toasters, and even wall treatments that can communicate with each other and with users; wearables like fitness trackers and Google Glass; and ingestible sensors.

"We will start to interact with computers and objects the way we interact with other humans—through gestures, through body language, and through speech," says Venkatasubramanian. "If computing devices are going to be everywhere around us, you can't expect to have keyboards and mice everywhere to interact with them. [Computers] will have to adapt to the way humans interact with other humans to be successful."

The future, added Venkatasubramanian, won't be about a new way to connect with your own laptop, for instance. Instead, it will be about the multitude of computers that surround us—whether in our homes, on our bodies, in the office, or out in public places.

"All of these computers will provide a cumulative function," he says. "We'll get to a point where I don't have a smartphone or a laptop. The computers will just be all around us."

New shapes and functionality

Before we get to the place where we don't carry computers or smart devices with us, we'll see those machines take on new shapes and functions.

Mahadev Satyanarayanan is an experimental computer scientist and professor at the School of Computer Science at Carnegie Mellon University.

For Satyanarayanan, changes in the shape and use of our familiar devices will transform our lives over the next five or so years.

For instance, instead of viewing a recipe in a smartphone app, or even in a similar app on a pair of smart glasses, how different would it be to have interactive technology that projects a hologram of a person, acting as a friend or experienced cook standing beside you, while you prep ingredients or work at the stove?

"Today, say I'm wearing something like Google Glass," says Satyanarayanan. "In the little display it will say, 'Stir a cup of this with two teaspoons of that.' Think of the difference of using that compared to having [what seems like] a good friend stand next to you when you're making the dish for the first time. Your friend might say, 'Wait! You have to wait till the oil is sizzling' It will guide you in real time. A simple recipe app won't do that."

And instead of a digital assistant sitting in a user's pocket, in the not-so-distant future it might reside in a headset or even a small device placed in the user's ear to seemingly whisper advice on how to behave in a business meeting being held in a different country and culture.

"This is where wearables meet artificial intelligence," says Satyanarayanan. "Based on the examples we've built here, I believe within five years we'll see the earliest commercial applications. I suspect the first examples will be in the medical field and industrial repair. Commercial use cases, like cooking aides, will be further down the line."

He also notes that the elder care field could become an early adopter for these advanced devices.

If a senior citizen is having memory problems, for example, instead of immediately being placed in assisted living or a nursing facility, they could use a smart headset or earpiece that acts as a cognitive assistant, reminding them to take their medications or to turn off the stove.

The elderly user wouldn't interact with the device via a keyboard, mouse, or even a touchscreen. The device would simply communicate with the user through images or speech.

"At the end of the day, one of the most profoundly helpful use cases will be in elder care," says Satyanarayanan. "Just in the U.S. alone, if you could delay by one month the admitting of elderly people to a nursing home, the annual savings would be $1 billion."

What will our smarter devices share about us? 

Forrester's Facemire warns that smarter devices will bring problems along with convenience. 

"In three to five years, the mobile device will become my digital representation in my ecosystem," he says. "When I take my phone into a room where there's a smart TV, virtual reality device, or Amazon Echo, my phone can tell the other devices who I am and what services I subscribe to and what information I have access to."

He adds, "Every time I interact with a new piece of technology, I don't want to have to tell it who I am and type in all my password information. My phone will tell these devices who I am and that I'm trustworthy."
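One plausible building block for that kind of handshake is a signed identity assertion the phone presents to nearby devices. In the sketch below the token fields and pairing story are invented, but the signature scheme is real (Ed25519 via Python's cryptography package): a TV that learned the phone's public key during pairing could verify who just walked in without ever seeing a password.

```python
# Sketch of a phone vouching for its owner to a nearby device with a
# signed identity token. Token fields are invented; the Ed25519 signing
# uses the real `cryptography` package.

import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The phone holds a long-term private key; nearby devices learn the
# public half once (say, during pairing) and verify every later token.
phone_key = Ed25519PrivateKey.generate()
public_key = phone_key.public_key()

def make_assertion(user, services):
    token = json.dumps({
        "user": user,
        "services": services,      # what the TV or Echo may unlock
        "issued_at": time.time(),  # lets verifiers reject stale tokens
    }).encode()
    return token, phone_key.sign(token)

def verify_assertion(token, signature):
    try:
        public_key.verify(signature, token)  # raises if forged/tampered
        return json.loads(token)
    except InvalidSignature:
        return None

token, sig = make_assertion("alice", ["video-streaming", "email"])
print(verify_assertion(token, sig))        # accepted: dict with identity
print(verify_assertion(b"tampered", sig))  # None: signature check fails
```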

Once the phone can vouch for its owner this way, it won't need a screen of its own, because it will be able to use the screens all around it: TVs, laptops, headsets with small displays that drop over the user's eye, or even screens embedded in our eyes.

"Let's have something embedded in our eyes or attached to the nerves that go from our eyes to our brains that will overlay data there," says Facemire, adding that could be 10 or more years out.

That drive to have our devices identify us to other devices in our vicinity, though, could open up a whole new set of problems.

Would the devices that connect to our advanced smartphones share our information with the companies that made them, as well as with advertisers?

Will our driverless cars and the chips embedded in our bodies send information to nearby restaurants and stores that want to know more about us to capture our business?

"Our smarter apps and devices might get pushier," says Facemire. "What if I ask my device to find me the best steak restaurant in town but that device is communicating with my health device and now it will only show me restaurants that serve salad? Is my device also talking to my insurance company or my employer?"

Not everyone sees the future of human-computer interfaces as so frightening, though.

Both Sun Young Park, an assistant professor at the University of Michigan, and IDC analyst Tom Mainelli see augmented reality and virtual reality opening up people's abilities to communicate with their devices.

Mainelli points out that executives at Meta Co., which makes augmented reality products, already have removed monitors from their employees' desktops.

Employees can use their headsets and augmented reality to send and read email, write code, and search the Internet.

"They're using head-mounted displays to interface with their computers," says Mainelli. "The headsets are creating images in front of your eyes, and you're able to interact with it with your hands. That shows you where we're headed."

Our computer interfaces will begin to adapt to us, instead of us adapting to them.

Mainelli notes that a user who thinks more in images than in words could have a personalized interface that communicates in pictures rather than text.

"Instead of a standard interface that we all adapt to, the interface would adapt to us," he explains. "To make that happen, you've got to have a lot of learning going on in the background. You've got to have machine learning. You've got to have tons of data and people interacting with that interface so it learns. The interface needs to know that Tom tends to do this with his eyes, this with his hands, and this with his body. It will work the way Tom naturally works."

The human-computer interface: Lessons for leaders

  • Expect the way you interact with your laptop, tablet, and smartphone to change a lot in the next five to 10 years.
  • Perceptual computing, a computer's ability to be aware of what is happening around it, is gaining momentum.
  • The ability to control computing devices and physical objects with human thoughts could become real within five to 10 years.
  • Smarter devices will bring privacy problems along with convenience. 

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.