
A real-life 'holodeck' in 10 years? Less far-fetched than you think

InterKnowlogy founder Tim Huckaby believes that within a decade, technology user interfaces will include a functional holodeck and mind-reading machines. Here's why.
Written by Mari Silbey, Contributor

Tim Huckaby can't sit still. During his hour-long presentation on the future of user interfaces at the recent 2013 Consumer Electronics Show (CES), he leapt from demo to demo, his enthusiasm contagious and his constant motion making it difficult for anyone in the audience with a camera to catch him standing still.

Huckaby has good reason to be excited. The way this software expert sees it, we're on the verge of a science-fiction-like future where doctors manipulate molecules in three-dimensional (3-D) space, augmented music players tune into your thoughts, and retailers deliver coupons in real time based on the focus of your gaze across store shelves.

"Imagine a world in retail where my wife has opted in at Nordstrom's, or Macy's, or something like that to be tracked through the store... We can see what you're looking at, and we can push a coupon to you. 'Hey, Kelly, you were in the Seattle Nordstrom's, and you looked at these cute shoes, but you didn't buy them. Now you're in the Las Vegas Nordstrom's. You're looking at the exact same shoes. How about 40 percent off if you buy them right now?' That's the beauty of retail."
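The matching logic behind that pitch is simple to sketch. Here is a minimal, purely hypothetical illustration, assuming an opted-in shopper identifier and per-store gaze events; the names and the 40-percent rule are assumptions for illustration, not any retailer's actual system:

    from dataclasses import dataclass

    @dataclass
    class GazeEvent:
        shopper_id: str  # opted-in shopper
        store: str       # e.g. "Nordstrom, Seattle"
        sku: str         # product the shopper looked at

    # Hypothetical in-memory history; a real deployment would share
    # gaze events across stores through a central datastore.
    history: dict = {}

    def record_gaze(event: GazeEvent):
        """Log a gaze event; return an offer if the shopper looked at
        the same product in a different store without buying it."""
        past = history.setdefault(event.shopper_id, [])
        offer = None
        for earlier in past:
            if earlier.sku == event.sku and earlier.store != event.store:
                offer = f"How about 40 percent off {event.sku} right now?"
                break
        past.append(event)
        return offer

    # Kelly looks at the same shoes in Seattle, then in Las Vegas.
    record_gaze(GazeEvent("kelly", "Nordstrom, Seattle", "cute-shoes-123"))
    print(record_gaze(GazeEvent("kelly", "Nordstrom, Las Vegas", "cute-shoes-123")))

The hard parts in practice are the opt-in identity layer and the gaze tracking itself; the cross-store matching is the easy bit.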

Huckaby is founder and chairman of California-based InterKnowlogy, as well as the current chief executive officer of Actus Interactive Software. Both companies focus on user interface (UI) development, and Huckaby's belief in the coming rapid evolution of the UI field is based on decades of work in emerging technology.

During his recent talk in Las Vegas, Huckaby was tasked with predicting what the interfaces we use to interact with computers and communications technologies will look like in five years. He didn't stick to that time frame, but instead offered multiple examples of where UIs are headed, and how the evolution will unfold.

His predictions for what's possible within the next 10 years are mind-blowing: a functioning "holodeck" (à la the sci-fi classic Star Trek) in which holographic images are displayed; a legitimate neural interface offering a direct pathway between the brain and external devices; and virtual objects that extend into practically every facet of life and behave much as they would in the natural world.

Gestures are just the beginning

Huckaby's forecast is based on work he has already seen, and, in many cases, that he participated in developing.

The Microsoft Kinect device was the centerpiece of several of his demonstrations. Low-cost and readily available, the motion-sensing technology has been a boon for Xbox 360 and Windows developers, many of whom see the gesture-based interface as key to applications that aren't easily navigated with a keyboard or mouse.

In one demo, Huckaby used himself as the model for a physical therapy patient, with a software program tracking his therapeutic exercises and evaluating his progress. In another, he showed how a motion-sensing device could help surgeons in an operating room access X-rays and other files digitally through hand-waving gestures rather than requiring doctors or clinicians to touch any physical objects.
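To give a flavor of the arithmetic such a tracker performs (a sketch of one plausible approach, not InterKnowlogy's actual code): given the 3-D joint positions a Kinect-style skeleton stream reports each frame, an exercise evaluator can compute a joint angle and compare it against the therapist's target range.

    import numpy as np

    def elbow_angle(shoulder, elbow, wrist):
        """Angle at the elbow, in degrees, computed from three 3-D joint
        positions of the kind a Kinect skeleton stream reports each frame."""
        upper = np.asarray(shoulder) - np.asarray(elbow)
        forearm = np.asarray(wrist) - np.asarray(elbow)
        cosine = np.dot(upper, forearm) / (np.linalg.norm(upper) * np.linalg.norm(forearm))
        return float(np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0))))

    # One frame of a curl exercise; the joint coordinates and the
    # 90-degree target below are hypothetical numbers.
    angle = elbow_angle([0.0, 0.45, 2.0], [0.0, 0.15, 2.0], [0.25, 0.15, 1.85])
    print(f"elbow angle: {angle:.1f} degrees (target: bend past 90)")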

These concepts aren't completely new, but they're far more practical now that the cost of sensors has declined and the raw computing power needed to analyze sensor data is more readily available. The combination is driving rapid innovation and creating an easier path toward commercialization.

Mind-reading machines

Huckaby didn't stop with gesture-based UIs. His talk ventured into the territory of the surreal when he started referencing neural-based interfaces, or interfaces that can read your mind.

There are two kinds of neural UIs: conscious and non-conscious. The conscious version is reasonably easy to grasp: an example would be a quadriplegic who can steer his or her wheelchair using specific thought commands. The non-conscious use cases are trickier to contemplate. Huckaby suggests envisioning customer service systems that detect the mood of incoming callers, or music players that adjust playback to match a listener's changing state of mind.
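To make the conscious case concrete, here is a toy sketch that maps an EEG band-power feature to a wheelchair command. The single-channel setup and the threshold are illustrative assumptions; real systems such as Emotiv's rely on trained, per-user classifiers over many channels.

    import numpy as np

    def band_power(samples, rate, low, high):
        """Mean spectral power of an EEG channel within a frequency band."""
        spectrum = np.abs(np.fft.rfft(samples)) ** 2
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
        mask = (freqs >= low) & (freqs <= high)
        return float(spectrum[mask].mean())

    def wheelchair_command(samples, rate=256.0, threshold=1000.0):
        """Toy rule: strong beta-band activity (13-30 Hz), the band
        associated with active concentration, means 'forward';
        anything else means 'stop'."""
        return "forward" if band_power(samples, rate, 13.0, 30.0) > threshold else "stop"

    # One second of synthetic 20 Hz signal standing in for a real EEG channel.
    t = np.arange(256) / 256.0
    print(wheelchair_command(50.0 * np.sin(2 * np.pi * 20.0 * t)))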

Early work in neural interfaces is emerging from companies like Emotiv, whose website looks like something out of the remake of the futuristic film Total Recall. The site includes a video from a 2010 TED talk demonstrating the company's brainwave-reading machine.

If it all sounds faintly ridiculous, remember that the sight of people apparently talking to themselves on the London Underground seemed equally ridiculous circa 1999, when Bluetooth headsets for mobile phones first emerged and made hands-free conversations possible. Fast-forward a mere 14 years to 2013, and it seems almost everyone is prone to holding conversations with invisible people in public spaces.

Given that backdrop, it's easy to contemplate a future a few short years from now, in which we may be regularly waving at signs and interactive kiosks. And within a decade, they just may be waving back at us.

Image credit: Original holodeck image from Moto猫's Flickr photostream

This post was originally published on SmartPlanet.com
