

Forget AR glasses. Augmented reality is headed to your windshield

Like millions of other kids around the world, Jamieson Christmas, now in his mid-forties, was transfixed the first time he saw director George Lucas’ epic space opera Star Wars. “I’m a child of the ’70s,” he told Digital Trends. “I grew up when Star Wars was first released. [What really fascinated me was] this idea of holography. George Lucas set up this vision of little robots beaming three-dimensional pictures of people. R2-D2 and all that stuff. It had a really tremendous influence on me.”


Jump forward several decades and Christmas, the founder of a U.K.-based company called Envisics, believes that he’s found the perfect use case for real-life augmented reality holograms. (And, spoiler, it’s not pleading for Obi-Wan Kenobi to help rescue a kidnapped princess who has obtained the schematics for an evil, planet-destroying space station the size of a moon.)

What Envisics has developed is a headset-free, in-car holography system that aims to transform the way we view the road. How? By giving your car an AR overhaul more in line with the kind of HUD technology you’d ordinarily find in a fighter jet or a commercial aircraft worth many millions of dollars. Or, heck, an X-Wing.

“The reason it hasn’t happened yet is that it’s really, seriously difficult.”

“Airplanes are the safest mode of transport that there is, because everything about the journey is entirely proceduralized,” said Christmas. “Everything from switching on and checking all the functions in the airplane through to how you communicate, how you set up in your journey, what the flight is going to be, what the flight plan is going to be, all the way through to the other end. And it’s with good reason that it is now the safest mode of transport there is. [What we want to do is to] enhance [people’s experience of driving] and make the journey safer.”

If Christmas has his way, that fancy, sleek, giant iPad-style screen that cars like Teslas come with could suddenly look very, very outdated indeed.

The journey to build in-car holography

Christmas’ journey to build the ultimate in-car holographic technology took him to Cambridge University, where he earned a Ph.D. at the Centre for Advanced Photonics and Electronics. While there, he set about trying to build a holographic display that would work like the one he had dreamed about. The resounding lesson? “What I discovered was that the reason it hadn’t happened yet is that it’s really, seriously difficult,” he said.

After getting his doctorate, Christmas founded a company called Two Trees Photonics Limited. The idea, as with Envisics, was to create dynamic holography for the automotive market. “That was in 2010,” he recalled. “With hindsight, it was a dreadful time to start a company. It was in the wake of the Lehman shock [which triggered the 2008 financial collapse]. There were no venture capitalists that were interested in investing in deep technology businesses. And this is about as deep as it gets.”


Nonetheless, Two Trees grew to a small woodland of 20 employees before being snapped up by Daqri, a U.S.-based startup with designs on taking on HoloLens and Magic Leap. Daqri’s boss, Brian Mullins, described Two Trees’ technology as “stunning to see.” But Daqri ultimately floundered. Despite raising an astronomical $300 million, it failed to deliver the goods and collapsed less than a decade later. Protocol wrote that, while it had no problem hitching a ride on the AR hype train, “Everything went wrong for Daqri.”

Suddenly Jamieson Christmas was back at square one.

Envisics is, in some senses, Two Trees Photonics Limited: Part Deux. The goal of cracking the automotive industry is the same, although the tech has come on in leaps and bounds since then. What Envisics has developed is a device that sits inside the instrument cluster of a vehicle, buried in the dashboard, and projects light through an opening so that it bounces off the windshield and into the driver’s eye. Christmas said that this is no faint reflection, either.

“In-car AR doesn’t have to worry so much about one of the biggest challenges for makers of AR headsets: People moving their heads.”

“We really are the Retina-grade display of the automotive world,” he said. “Our devices typically work at three to four times the resolution of the human eye. You’re left with an image clarity far beyond that which you would normally experience in a vehicle. Our displays can work to tens of thousands of candelas of brightness, which enables you to see this in the most extreme environments.”

The first-generation version of the technology, projecting a virtual instrument cluster, is available in current Jaguar Land Rovers. The second-gen version, which will go significantly further, is slated to appear in GM’s Cadillac Lyriq, currently scheduled to launch in March 2023.

A reason (or several) to be confident

Of course, the question remains: How can Christmas be so confident that AR truly is the way of the future? After all, AR startups may have enjoyed a few years flush with venture cash, but they’re having a hard time convincing ordinary customers that they’re the Next Big Thing sci-fi-loving technologists so desperately want them to be. Some big names, including Daqri and Magic Leap, have stumbled. Big player efforts like Google Glass, while certainly having some exciting technology on offer, proved as questionable as gas station sushi. And the most compelling ideas, like AR contact lenses, are still future propositions that have yet to prove themselves in the marketplace.


But there are reasons, both sociological and technological, to get behind AR in cars. The sociological one is, simply put, the lack of awkwardness factor. “I’ve been in meetings where people were wearing a Google Glass, for example — and it’s quite disconcerting,” said Christmas. “There are times you’re sitting there, and [the wearer is] looking at you, but you also realize that they’re not actually looking at you; they’re looking at something else.”

Our cars, however, remain private spheres where people don’t have to worry about the social faux pas of interacting with AR.

Then there’s the technological part. A portable AR headset has to pack a whole lot of advanced technology into a relatively compact form factor. That likely means underpowered (and underwhelming) AR experiences. A car, on the other hand, has plenty of room to fit the various sensor suites and processors needed.

On top of this, in-car AR doesn’t have to worry so much about one of the biggest challenges for makers of AR headsets: People moving their heads. Sure, a car travels very quickly, but we don’t actually change direction in a car all that often. That makes it easier to overlay relevant, contextual information in a smooth, controlled way, without the driver feeling like they’re trapped in a 1990s-era website full of unwanted pop-ups.

“Is the next left turn the one that looks like it could be a private entrance? How many lanes does a ‘keep right’ instruction actually cover? All of these questions could be answered by markings drawn directly onto the road in front of you.”

However, the biggest reason in-car AR could succeed is this: It solves a problem that actually exists. AR headsets could wind up being the biggest thing since the smartphone, but they’re going to have to convince people that they’ve been living life in a sepia Kansas next to the annotated, technicolor Oz of the augmented world. An AR version of an in-car display, though? That’s something people already have — and have issues with.

“The trend in vehicles at the moment is to have larger and larger, higher resolution displays, most of which are touch-enabled in some form or other,” Christmas said. “If you’re driving along, and you need to change your radio station or change your heating settings in the car, it basically requires you to look away from the road and carry out quite complex hand-eye coordination in order to interact with that display. The longer you look away from the road, the more risky driving becomes. Having the ability to take that distraction away from the driver — to overlay all the information they need upon reality, so that they always have that situational awareness in their peripheral vision — is inherently going to be a good thing.”

How in-car holograms will change driving

But while in-car AR might build on this idea, the same way that graphical interfaces used the metaphor of real-world desktop objects to get a foot in the door, it could extend its usefulness, too. A touchscreen in a car is one of the few cases of a user interface in tech that doesn’t want to be looked at for too long. It’s all about getting users to toggle the option they need, then allowing them to look away again. No such limitation need apply here. Information can be made more granular and involved, since we’re no longer being asked to choose between concentrating on driving and interacting with a display.

 

Christmas explained that there are several compelling use cases for the technology. One is to alert drivers to hazards in the urban environment, whether it’s people crossing the street ahead of you, complex traffic intersections, potentially dangerous weather conditions, or just the behavior of fellow drivers. AR, he said, can help “orchestrate the gaze” of the driver to draw their attention to salient details in the landscape.

Then there’s the navigation issue. Currently, in-car navigation is carried out either by audio prompts or by following instructions on a screen. In both cases, the driver must take an abstracted version of the real world, often with a little bit of lag, and map it onto the physical one. Is the next left turn the one that looks like it could be a private entrance? How many lanes does a “keep right” instruction actually cover? All of these questions could be answered by markings drawn directly onto the road in front of you.

How will this drive toward in-car AR fit with the other big innovation currently taking place in the automotive space: the rise of the self-driving car? After all, if autonomous vehicles are inevitable, is ironing out the challenges with the way humans currently drive a bit like tweaking the interface elements on an app that’s about to shut down? Christmas doesn’t think so. In fact, he suggested that autonomous vehicles sit very comfortably with in-car AR.

All set for the future

In the short term, that’s because in-car AR can help handle the hand-off process when a self-driving car encounters a scenario that requires the human driver to take over.

“I can tell you that needing to transact responsibility for the vehicle is actually a really big problem,” he said. “If the car is in autonomous mode, happily running down the street, [should it need to pass control back to the human driver], how does it give you all the information you need to make informed decisions in a split second? The only way to do that really successfully is if the car can overlay its understanding of the world around you upon reality.”


Even when true autonomy is achieved, Christmas doesn’t see any reason why in-car holography needs to take a (no pun intended) back seat.

“Ultimately, when you end up with level five autonomous driving, [this] becomes an entertainment medium,” he said. “It becomes a mechanism to, frankly, earn revenue. It could stream you information about what the shops are doing, where the sales are. There really are limitless opportunities for this technology.”

At present, Envisics seems to be making all the right moves. This month, it announced a new funding round of $50 million. According to Christmas, this will be used to approximately double the size of the company’s workforce over the next 12 to 18 months, as well as to open new offices in mainland Europe and Asia. “And, of course, we’ll be using the money to accelerate our next-generation technologies that will bring even greater functionality and greater enhancement, and open up the wider market for us,” he said.

Coming soon to a windscreen near you. Hopefully.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…
Waymo faces questions about its use of onboard cameras for AI training, ad targeting

In an iconic scene from the 2002 sci-fi film Minority Report, on-the-run Agent John Anderton, played by Tom Cruise, struggles to walk through a mall as he’s targeted by a multitude of personalized ads from the likes of Lexus, Guinness, and American Express every time hidden detectors identify his eyes.
It was clearly meant as a warning about a not-so-desirable dystopian future.
Yet, 23 years later, that future is at least partially here in the online world, and it threatens to spread to other areas of daily life that are increasingly “connected,” such as the inside of cars. And the new testing grounds, according to online security researcher Jane Manchun Wong, might very well be automated-driving vehicles, such as Waymo’s robotaxis.
On X, Wong revealed an unreleased version of Waymo’s privacy policy that suggests the California-based company is preparing to use data from its robotaxis, including interior cameras, to train generative AI models and to offer targeted ads.
“Waymo may share data to improve and analyze its functionality and to tailor products, services, ads, and offers to your interests,” Waymo’s unreleased privacy statement reads. “You can opt out of sharing your information with third parties, unless it’s necessary to the functioning of the service.”
Asked for comments about the unreleased app update, Waymo told The Verge that it contained “placeholder text that doesn’t accurately reflect the feature’s purpose”.
Waymo’s AI models “are not designed to use this data to identify individual people, and there are no plans to use this data for targeted ads,” spokesperson Julia Ilina said.
Waymo’s robotaxis, which operate on the streets of San Francisco, Los Angeles, Phoenix, and Austin, do contain onboard cameras that monitor riders. But Ilina says these are mainly used to train AI models for safety, to find lost items, to check that in-car rules are followed, and to improve the service.
The new feature is still under development and offers riders an opportunity to opt out of data collection, Ilina says.
But as we all get used to ad targeting based on everything that’s somehow connected to the web, it seems a once-distant vision of the future may be just around the corner.

Waymo’s driverless cars are about to begin an overseas adventure

Waymo’s autonomous cars are about to appear on streets outside of the U.S. for the first time.

The company on Wednesday announced on social media that its autonomous cars will be driving onto the streets of Tokyo, Japan, “soon,” with some reports suggesting the rollout will begin as early as next week.

Buy Now, Upgrade Later: Slate’s $25K Truck Flips the Script on EVs

A new electric vehicle startup, quietly backed by Amazon founder Jeff Bezos, is building something bold in Michigan. Not just a car, but a whole new idea of what an EV company can be. Slate Auto is a stealthy new automaker with one mission: ditch the luxury-first EV playbook and start at the affordable end of the market, which is where most drivers actually shop.
The startup had been operating out of public sight since 2022, until TechCrunch found out about its existence. Of course, creating a little mystery about a potentially game-changing concept is a well-tested marketing approach.
But Slate truly seems to approach EVs in a very different way than most: It isn’t debuting with a six-figure spaceship on wheels. Instead, it’s targeting the holy grail of EV dreams, a two-seat electric pickup truck for just $25,000. Yep, twenty-five grand. That’s less than a tricked-out golf cart in some neighborhoods. Slate is flipping the Tesla model on its head. Tesla, but also the likes of Lucid, BMW, and to a certain degree Rivian, all started with high-end vehicles to build their brands and bankroll future affordable cars. Slate, by contrast, wants to start with the people’s pickup and let it grow with you.
This isn’t just a cheap car. It’s a modular, upgradeable EV that’s meant to be personalized over time. Buy the basic model now, then add performance, tech, or lifestyle upgrades later—kind of like building your own dream ride one paycheck at a time. It’s a DIY car for a generation raised on customization and subscriptions. The company even trademarked the phrase: “We built it. You make it.”
Backing up this idea is an equally bold strategy: selling accessories, apparel, and utility add-ons à la Harley-Davidson and Jeep’s Mopar division. You’re not just buying a vehicle; you’re buying into a lifestyle. Think affordable EV meets open-source car culture.
Slate’s approach isn’t just novel; it’s almost rebellious. At a time when other startups risk folding under the weight of their own lofty ambitions, Slate is keeping things lean, scalable, and customer-focused. The company reportedly plans to source major components like battery packs and motors from outside suppliers, keeping manufacturing costs low while focusing energy on design, experience, and upgrade paths.
Sure, it’s all been kept under wraps—until now. With plans to begin production near Indianapolis by next year, the wraps are about to come off this EV underdog.
While the U.S. market has so far been dominated, at least in spirit, by high-end EVs, Slate’s “start small, scale with you” philosophy might be just the jolt the industry needs.
