In the future, touchscreens will be obsolete. This lab designs what’s next


Chris Harrison is thinking about the future. His. Yours. Ours. Everyone’s. More specifically, he’s thinking about how the world will be using computers, and what those computers might look like, a quarter-century from now. Since Harrison is 35 years old today, that’s right around the time that he may be contemplating retirement.

It’s Harrison’s job to think about these things. He is director of the Future Interfaces Group at Carnegie Mellon University’s Human-Computer Interaction Institute. Located in a solar-powered, century-old building on the western side of Carnegie Mellon’s Pittsburgh campus, FIGLAB, as it is affectionately called, boasts three studios loaded to the gills with everything from high-tech sensors to CNC milling machines and laser cutters.

Its humble raison d’être is to give us Muggles a tantalizing glimpse into, well, the future.

“I’m definitely a nerd at heart,” Harrison told Digital Trends. “I enjoy thinking about speculative futures and what could be. That’s very much what our research does. I think in some respects we are working in the science fiction domain; we’re trying to think about possibilities that don’t yet exist. Then once we have the idea, we go to work saying, ‘can we cobble together these future technologies out of the Legos of today, meaning the technology pieces that we have [available to us right now?]’.”


The resulting FIGLAB creations veer between the truly inspired and the utterly madcap. Sometimes, like a Schrödinger's cat of interfaces, they're both at once. Conductive paint that turns regular, boring walls into enormous touch-sensitive panels at a cost of $1 per square foot? Of course! A smartwatch that uses laser projection to extend its touchscreen all the way up your arm? No problem! A device for simulating touch in virtual reality by turning humans into living marionettes? You’ve come to the right place!

And these are just a handful of the last couple of years’ worth of creations at FIGLAB, and only the stuff that gets published. There’s a whole lot more where that came from.

The bridge to the perfect interface


It’s easy to look at computer interfaces and think that they are just gimmicks to sell new devices or products. Bad ones are. But a good interface fundamentally changes the way that we use technology. The graphical user interface, or GUI (pronounced “gooey”), with its real-world-inspired metaphors of desktops and files, made computing visual. Multitouch, with its pinch-to-zoom and other hand gestures, made it tactile. Already we have the primordial ooze of gaze-based and emotion-sniffing interfaces, from which other, more sophisticated UIs will doubtless one day crawl.

But there’s no map to follow when it comes to creating user interfaces. It’s a discipline stuck halfway between what the British scientist and novelist C.P. Snow called, in 1959, the two cultures: Science and engineering on the one hand, arts and the humanities on the other.

“Engineering works great when you have a problem like ‘Here’s a bridge; the river is 300 feet wide; build a bridge that spans the gap,’” Harrison said. “It’s easy to build solutions when the problem is well defined. Most of our work is actually trying to find the problems … We have to have that eye, that lens, that looks beyond. Like, what could be even better about [a particular] experience? You have to decouple yourself from reality a little bit. [FIGLAB appeals to] people that are open and creative thinkers, [who are] able to have those kinds of insights.”


Some of this can, Harrison said, be taught. A typical Ph.D. at Carnegie Mellon can take around six or seven years to complete. That’s plenty of time for students to get to grips with the lab’s philosophy and approach to technology. FIGLAB has access to the latest components, often long before they’re accessible to most people. But its approach to them can be dazzlingly subversive: Sure, you created this expensive component to do X, but we’re going to make it do Y because, reasons.

“It often happens where we’re playing with things and we find entirely new ways to leverage them,” Harrison said. “We might get some crazy new sensor that might be for sensing, you know, temperature inside of a steel furnace. We’re like, ‘well, what happens if you flip it upside down and put it in a smartwatch?’ Well, oh my gosh, now you can do authentication based on blood vessels.”

The long nose of invention


It should go without saying that none of this is straightforward. Harrison freely acknowledges that 90% of the prototypes the lab builds (and it nearly always prototypes its ideas) will ultimately end in failure. The technology may not yet be ready. The idea might turn out to be less cool in reality than it was in theory. Or it could just be that the public doesn’t take to an idea. After all, it’s not easy to see into the future.

The future, in some ways, is like fog. Short distances can be seen relatively clearly. Medium distances are fuzzier, but still visible. Try to look much beyond that, though, and you won’t see anything at all. That’s because fog attenuates light exponentially: each unit of distance swallows a fixed fraction of whatever light remains.
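The fog analogy can be made concrete with a few lines of code. This is a minimal sketch, not a physical model of real fog: the per-unit transmission factor of 0.7 below is an arbitrary illustrative value, and `visible_fraction` is a name invented here for the example.

```python
def visible_fraction(distance, transmission_per_unit=0.7):
    """Fraction of light surviving after `distance` units of fog.

    Each unit of distance transmits a fixed fraction of the remaining
    light, so the survival rate decays exponentially with distance.
    """
    return transmission_per_unit ** distance

# Short distances stay fairly clear; farther out, visibility collapses.
for d in [1, 5, 10, 20]:
    print(f"distance {d:>2}: {visible_fraction(d):.4f}")
```

The numbers fall off fast: nearby detail survives, but a few more units of distance and almost nothing gets through, which is why near-term forecasts are easy and long-range ones are hopeless.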

What the team at FIGLAB is doing, however, isn’t trying to predict the future, although there is a bit of guesswork in figuring out what future problems might be. Instead, it’s trying to Terminator the future: to screw around in the present in the hope that some of it pays off years from now.


In 2008, Bill Buxton, a senior researcher at Microsoft, put forward a theory he called the long nose of innovation. The idea, in essence, is that it takes a long time for a product to make its way from the first research lab demonstrations to widespread use: roughly 25 years. For instance, researcher Doug Engelbart’s lab at the Stanford Research Institute came up with the initial concept for the computer mouse in the 1960s. The concept was refined at Xerox PARC during the 1970s, but it wasn’t until the Apple Macintosh in the 1980s that it became a mass-market product. Multitouch has been around since the 1980s, complete with gestures like “pinching.” (A young Steve Jobs actually visited Carnegie Mellon in 1985 for an early demo.) Still, it wasn’t until the iPhone in the 2000s that gestural touchscreens went mass-market.

As Buxton pointed out, the long nose says that any technology that is going to have a significant impact in the next decade is already at least a decade old. Any technology that is going to have a significant impact in the next five years is already at least 15 years old.

What Harrison’s lab is doing, therefore, is laying down the rough starting points of interfaces that, a quarter-century from now, might be commonplace. You probably couldn’t take many of its current projects and roll them out to mass-market success right now. But give it a decade or two and you very well might. As Harrison said, “[Right now people] should be going back to papers from the early 2000s to find out what the next billion-dollar unicorn company is going to be in 2030.”

The right environment


Harrison’s media-savvy approach to user interfaces means that every finished project FIGLAB creates gets its own showcase demo video. These, he said, are often storyboarded long before a single line of code gets written. It’s how the team works out what the compelling use-cases are going to be. It’s also how it garners a whole lot of attention — including from some heavy hitters.

“Often [tech companies will] see it online, or it’ll get passed around the office on some sort of internal social media, and people will get excited and someone will reach out and say, ‘Hey, can we build a demo of that on our platform?’ or ‘Can we come see a demo in person?’”

Companies that have sponsored FIGLAB include Google, Qualcomm, Intel, and others. A recent project, Listen Learner, made it possible for smart speaker owners to ask “what’s that noise?” and have a variety of household sounds positively identified. FIGLAB’s collaborator for that one? The ever-secretive Apple. To Harrison, part of the appeal for these companies is working with a lab so dedicated to experimentation.

“The wonderful and terrible thing about academia is that we have that intellectual freedom,” he said. “That means that very few of our products ship. Probably nine out of 10 of our projects will just disappear into the ether. Never even make a dent. You can’t run an industry lab like that. You have to have more successes to earn your bread. By [our] being decoupled from that reality and being able to cultivate those really eccentric skills and creativity, it’s the right environment to be able to produce these kinds of ideas.”

And, of course, the fact that nine out of every 10 ideas ends up junked doesn’t matter a bit if the 10th turns out to be the next computer mouse or smartphone.

If Harrison’s lab pulls off one of those interface game-changers, any number of short-term flops won’t make a jot of difference. And Chris Harrison will never have to worry about his future again.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…