What would it take to build a Matrix-level simulation of reality?

The Matrix

Released almost exactly 20 years ago, The Matrix has gone on to become a cultural phenomenon well beyond the science fiction genre. Though it was received as pure science fiction at the time, it helped popularize the Simulation Hypothesis: the idea that we’re all living inside a computer simulation.

While Nick Bostrom’s 2003 paper popularized the discussion in academia and among scientists, it was Elon Musk’s eye-popping declaration about video games at the 2016 Code Conference that really got many of us in the tech industry to take the idea seriously. Musk pointed out that 40 years ago, video games consisted of Pong, basically two squares and a dot, while today we have fully 3D MMORPGs and stunningly realistic VR and AR.

As a video game industry insider and technologist, I’ve started to wonder — what would it take to build something like The Matrix: a simulation that’s so realistic that it’s effectively indistinguishable from physical reality?

Clearly, our technology is not quite there yet, but not in the ways you might think. It’s not just a matter of image resolution, pixel density, or visual realism. Rather, it’s about creating interface technologies that deliver full immersion and record our responses in real time.

The road to the Simulation Point

So how far away are we from the Simulation Point, the theoretical point where we’re capable of creating virtual worlds indistinguishable from physical reality? In my book, The Simulation Hypothesis, I lay out the 10 stages of technology that would be required to create an all-encompassing virtual world like the Matrix. Let’s run through this roadmap, and then we can answer that question.

Stage   Technology                                          Timeframe
0       Text Adventures                                     1970s-1980s
1       Graphical Arcade Games                              1970s-1980s
2       Graphical RPG Games                                 1980s
3       3D Rendered MMORPGs and Virtual Worlds              1990s-2000s
4       Immersive Virtual Reality                           2010s-2020s
5 (*)   Photo-realistic Augmented and Mixed Reality         2020s
6 (*)   Real World Rendering: Light Fields and 3D Printing  2010s-2020s
7 (*)   Mind Interfaces                                     2020s-?
8 (*)   Implanted Memories                                  2030s-?
9 (*)   Artificial Intelligence and NPCs                    2020s-2100?
10 (*)  Downloadable Consciousness                          2040s-2100?
11      The Simulation Point                                2100-?

The Stages on the road to the Simulation Point

Let’s travel down the road any civilization might take to reach the Simulation Point, starting with a brief history of Earth’s video games.

Stages 0-3: From text adventures to MMORPGs

The idea of an explorable “world” inside a computer started with text-based games like Colossal Cave Adventure in the 1970s, and reached its peak with Infocom games like Zork I-III and The Hitchhiker’s Guide to the Galaxy. The first widely available graphical game, Pong, led directly to the arcade and home video console craze of the late 1970s and 1980s, with games like Space Invaders and Pac-Man.


The introduction of 3D perspective and avatars

It wasn’t until the tools of graphical arcade games were combined with elements of text adventures that we really started down the road to the Simulation Point. These primitive RPGs included King’s Quest, The Legend of Zelda, and more. Although they were simple 2D single-player games, they had many of the elements of today’s 3D online worlds like World of Warcraft and Fortnite: worlds that are rendered and can be explored, and characters/avatars that can be moved around.
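
Stripped to essentials, those early RPGs boil down to the two ingredients named above: a world that can be explored and an avatar that can be moved through it. A minimal sketch in Python (every name here is illustrative, not taken from any actual game engine):

```python
# A tiny explorable world: a 2D tile grid plus a movable avatar.
WORLD = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "#####",
]

def is_walkable(x, y):
    """Walls ('#') block movement; floor ('.') is open."""
    return WORLD[y][x] == "."

def move(avatar, dx, dy):
    """Return the avatar's new position, ignoring moves into walls."""
    x, y = avatar
    nx, ny = x + dx, y + dy
    return (nx, ny) if is_walkable(nx, ny) else (x, y)

avatar = (1, 1)
avatar = move(avatar, 1, 0)   # step east -> (2, 1)
avatar = move(avatar, 0, -1)  # blocked by the north wall -> stays at (2, 1)
```

Everything since, from Zelda to World of Warcraft, is elaboration on this loop: richer world state, richer rendering, more players.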

In this sense, Toy Story (1995) and Doom (1993) were landmark events that marked an evolutionary leap forward in 3D graphics and rendering technology. The two were at opposite ends of the spectrum: rendering a movie like Toy Story took many hours per frame, while Doom’s main achievement was that you could move left and right and the scene would shift in real time. Doom’s chief programmer, John Carmack, would later go on to be the CTO at Oculus, which contributed heavily to the modern virtual reality boom. Today we have millions of players interacting through 3D virtual avatars, and we are well on our way to the Simulation Point.
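
The gap between those two ends of the spectrum is easy to quantify: a real-time renderer gets a fixed millisecond budget per frame, while an offline renderer can spend hours. A quick back-of-the-envelope comparison (the two-hours-per-frame figure is a commonly cited rough estimate for Toy Story-era rendering, not an exact number):

```python
# Real-time rendering: the frame budget at common refresh rates.
for fps in (30, 60, 90):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")

# Offline rendering: hours per frame vs. milliseconds per frame.
offline_seconds_per_frame = 2 * 3600   # ~2 hours, a rough estimate
realtime_seconds_per_frame = 1 / 60    # one frame at 60 fps
ratio = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Offline spends roughly {ratio:,.0f}x more compute time per frame")
```

A difference of five orders of magnitude per frame is why real-time 3D in 1993 felt like magic, and why closing that gap has driven GPU development ever since.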

Stages 4-5: VR, AR, MR and approaching full immersion

Building on top of 3D MMORPGs, today’s virtual and augmented reality systems are starting to bring science fiction closer to reality. In last year’s Ready Player One, for example, characters could not only experience VR through a headset, but also use haptic gloves, full body suits, and even omni-directional treadmills to increase the sense of realism. Here in the real world, these devices are already in development, and in many cases are available on the market today.

VR worlds like the OASIS in Ready Player One (Warner Bros. Studios)

Stage 6: Building Star Trek’s Replicators and Holodeck

Stage 6 includes 3D printers and light-field technology, which represent significant leaps forward in making virtual objects physical. In fact, these technologies are starting to look more like Star Trek’s replicators or its Holodeck than any video game. The basic idea of 3D printing is that almost any physical object can be modeled as information and then “printed” as a series of 3D pixels, or voxels. While today’s 3D printers can generally only print using one type of “ink” (usually a single colored thermoplastic), they have already produced a 1/3-scale model of an Aston Martin and a working gun, and recently an Israeli team was able to use the cells of a living patient to 3D print a small-scale heart. If this trend continues, pretty soon, like Captain Picard, you’ll be able to say “Tea. Earl Grey. Hot.” and have it fabricated right before your eyes.
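
The “series of 3D pixels” idea can be made concrete: the object becomes a boolean voxel grid, and material is deposited wherever a voxel is filled, one horizontal slice at a time. A minimal sketch with a hypothetical 4x4x4 grid (real slicer software works on triangle meshes and G-code, which this deliberately glosses over):

```python
# Model a solid 2x2x2 cube inside a 4x4x4 boolean voxel grid.
N = 4
grid = [[[False] * N for _ in range(N)] for _ in range(N)]  # grid[z][y][x]
for z in range(1, 3):
    for y in range(1, 3):
        for x in range(1, 3):
            grid[z][y][x] = True  # True = deposit material here

# "Printing" proceeds one horizontal slice (z-layer) at a time.
for z in range(N):
    filled = sum(cell for row in grid[z] for cell in row)
    print(f"layer {z}: {filled} voxels of material")
```

The point is that the whole object is just information, a grid of bits, which is exactly the property that makes replicator-style fabrication conceivable.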

While today’s AR experiences rely on a physical headset, research underway at BYU and MIT uses light-field technology to simulate how light bounces off objects. This suggests that, within a decade or two, we may be able to create realistic holograms that look like actual objects without the need for headsets.

Stages 7-8: Mind Interfaces and Implanted Memories

Now let’s move beyond where we are today into more speculative areas. One of the main reasons the Matrix was so convincing to humans like Neo was that images were beamed directly into their brains, via a wire attached to the cerebral cortex; the brain was tricked into thinking the experience was real. When Neo woke up in his pod, that same wire was sending images to his brain and recording his responses.

To truly build something like this, we will need to bypass today’s VR and AR goggles and interface directly with the brain to read our intentions and to visualize the game-world.

Advances made in the last decade suggest that mind interfaces are not as far off as we might think. Startups in this field include Neurable, which is working on brain-computer interfaces (BCIs) for controlling objects within virtual reality using nothing but your mind. Another startup, Neuralink (funded by Elon Musk), claims to be developing “high bandwidth and safe” brain-machine interfaces that involve implants, inspired by the neural lace concept from science fiction writer Iain M. Banks.


Recently, a team of researchers from the University of Washington and Carnegie Mellon was able to use skull caps and brain waves to send information about how to move a Tetris piece among three players: two who could see the screen and one who couldn’t. It was, effectively, an electronic form of telepathy.

In 2011 and 2016, researchers from the University of California, Berkeley were able to reconstruct low-resolution versions of what participants had been watching (movie trailers) by measuring their brain activity. This research suggests that recording our dreams may be possible in the near future. And unlike in the Matrix, where Morpheus’ teammates had to read the now-famous stream of green symbols to figure out what was happening in a user’s mind inside the simulation, we could just display it on a screen.

So, we are well on the road to being able to read intentions and interpret them. But what about the opposite: broadcasting into the mind?

Experiments done in the 1950s by Wilder Penfield suggested that memories can be triggered inside the brain by electrical signals. And in what sounds like a scene out of Blade Runner, much newer experiments suggest that memories can also be implanted.

In 2013, a team of researchers at MIT, while researching Alzheimer’s, found that they could implant false memories in the brains of mice, and these memories ended up having the same neural structure as real memories. This was done in a very limited way, but the techniques are promising.

If memories can be falsified, then we may be entering the world that Stephen Hawking warned us about. “The history books and our memories,” he said, “could just be illusions. It is the past that tells us who we are. Without it, we lose our identity.”

Stages 9-10: Artificial, simulated, and downloadable consciousness

A.I. and artificial consciousness are relatively common today, but only in very primitive forms. Take NPCs (non-player characters) from video games, for example. These are artificial beings that can move through virtual worlds and interact with you, but they can’t yet pass the famous Turing Test. Proposed by computer pioneer Alan Turing, the test is basically a game in which a machine passes if its conversation is indistinguishable from a conversation with a human being.
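
The structure of the test is simple enough to sketch: a judge converses with a hidden interlocutor and must guess whether it is human or machine; the machine “passes” once the judge can do no better than chance. A toy version (the responder functions are placeholders, obviously nothing like a real chatbot):

```python
import random

# Two hidden interlocutors. In this toy, the machine mimics the
# human perfectly, so the judge has no signal to work with.
def human_respond(prompt):
    return "Honestly, I'd have to think about that."

def machine_respond(prompt):
    return "Honestly, I'd have to think about that."

def imitation_game(judge_guess):
    """One round: the judge sees only the reply, then guesses who sent it."""
    actual = random.choice(["human", "machine"])
    responder = human_respond if actual == "human" else machine_respond
    reply = responder("What did you think of the film?")
    return judge_guess(reply) == actual

# Against identical responders, any judge is right about half the time.
wins = sum(imitation_game(lambda reply: "human") for _ in range(10_000))
print(f"judge accuracy: {wins / 10_000:.0%}")
```

Real NPCs fail this test because their replies do carry a signal; the whole research program is about driving the judge’s accuracy back down toward 50%.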

Even though we don’t fully understand consciousness, A.I. is one of the most rapidly advancing fields in computer science today. A.I. has already beaten the best human players at traditional games like chess and Go. China’s Xinhua news agency recently introduced virtual news anchors that can read the news like real humans. “Deepfake” photographs generated by A.I. are indistinguishable from real photographs, and a video of an A.I. removing cars from street scenes recently went viral, with pretty astonishing results.

One of the leaders of the transhumanist movement, Google futurist Ray Kurzweil, believes we are approaching the singularity on two fronts: superintelligent A.I., and downloading consciousness to silicon-based devices to preserve our minds forever.

Those who believe this think that all we need to do is duplicate the neurons and neural connections of the brain: on the order of 10^12 neurons connected through 10^15 synapses. While this task seemed insurmountable twenty years ago, teams have already simulated portions of a rat’s brain using a much smaller number of neurons and connections. Kurzweil thinks we’ll be there by 2045.
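
Those exponents make the scale of the task concrete. Even a crude model that stores a single 32-bit weight per synapse, ignoring neuron state and dynamics entirely (a deliberately simplistic assumption), lands in the petabyte range:

```python
neurons = 10**12        # the figure cited above for the human brain
synapses = 10**15
bytes_per_synapse = 4   # one 32-bit weight per synapse, a crude assumption

total_bytes = synapses * bytes_per_synapse
petabytes = total_bytes / 10**15
print(f"~{petabytes:.0f} PB just to store one weight per synapse")
```

That is storage alone; actually simulating the dynamics of those connections in real time is a far larger computational problem, which is why estimates like Kurzweil’s 2045 remain contested.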

Others believe consciousness is more complicated, bordering on philosophical and religious territory. Most of the world’s religions, in both Eastern and Western traditions, already teach a kind of transmission of consciousness: downloading it at birth and uploading it at the death of the body.

The video game metaphor raises the possibility that a simulated world could contain both PCs (player characters), whose consciousness comes from outside the simulation, and NPCs (non-player characters) that are purely artificial.

The Simulation Point and the world as information

Famed Silicon Valley venture capitalist Marc Andreessen once said that “software is eating the world.” But part of the reason I wrote a book about the Simulation Hypothesis is that computer science now seems to be providing a new understanding of, and underpinning for, the other sciences.

Once upon a time, physics and biology were thought of as the study of physical objects. Today, physicists and biologists are coming to the conclusion that information is the key to unlocking their sciences. Genes, for example, are nothing if not a way to store information inside biological computers. Physicist John Wheeler, one of the last to work with Albert Einstein, concluded that there was no material world, that everything came down to bits of information, and coined the phrase “it from bit.”

If everything is information, then current technology trends will lead us to the Simulation Point. Many of these stages will be complete before 2050, but a few, like the downloading of consciousness, may prove more elusive until we understand what consciousness actually is. Even so, my estimate is that within 100-200 years at most, we will have the technical underpinnings required to reach the Simulation Point and build our own version of the Matrix.

In his paper “Are You Living in a Computer Simulation?”, Oxford’s Nick Bostrom argued that if such technology can ever be created, then chances are it has already been created by some advanced civilization somewhere in the universe.

If that’s the case, then who is to say that we aren’t already living inside a giant video game?  As Morpheus said to Neo, “You have been living in a dream world.”

