In the 1966 movie Fantastic Voyage, a U.S. submarine and its crew are shrunk down to microscopic size and injected into the body of a comatose Soviet doctor who has defected to the U.S., in an effort to destroy a blood clot in his brain that threatens his life. Since then, the idea of being able to shrink people down to microscale, often to solve some kind of medical challenge, has made its way into various parts of popular culture. But not, as of yet, into reality.
Sadly, scientists and engineers have yet to develop a real-life shrink ray. But investigators from the U.K.’s University of Cambridge and 3D image analysis software company Lume VR have come up with a method that uses virtual reality to let researchers “walk around” inside individual cells in order to better understand some fundamental problems in biology and, in the process, learn to develop better treatments.
“Biology occurs in 3D, and visualizing 3D data on a 2D screen is restrictive,” Steven Lee, a reader in biophysical chemistry in Cambridge’s Department of Chemistry and leader of The Lee Lab, told Digital Trends. “By implementing data into a VR environment, one can intuitively walk around in their data, viewing all three dimensions in one setting. [Our software], vLUME, instantaneously allows you to visualize artifacts, clusters and various characteristics in VR that would be time-consuming otherwise.”
Being able to, for instance, image an immune cell from your own blood and then look around it in three dimensions is certainly impressive. As weird as the idea of exploring a giant-sized single cell might sound, it can help make the process of understanding less abstract and let researchers do things like watch how antigen-presenting cells trigger immune responses in the body on a hitherto unimaginable scale.
Turning microscope images into a three-dimensional environment
As you might expect, this visualization process is pretty complex. Anoushka Handa, another researcher on the project, who carried out the aforementioned immune cell demonstration, explained that the approach uses a technique called super-resolution imaging, which transforms a flat microscope image into an explorable, 3D one by building up an image one point at a time.
“A typical image contains millions of individual points, called localizations,” Handa told Digital Trends. “This allows us to view biology at higher spatial resolutions than would be possible with conventional imaging. Each of these localizations represents a particular biological molecule of interest, whether that is a single protein [or] single antibody, bound to an individual fluorescent molecule. We [then] use a special optical element that enables you to determine the position of these probes in 3D, from a 2D picture. This is [called] the ‘double-helix point spread function.’”
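To make the double-helix idea concrete: each fluorescent probe shows up in the 2D image as a pair of bright lobes, and the angle of that lobe pair encodes depth. The sketch below is a hypothetical illustration of that principle, not the team's actual code; the function name and the linear angle-to-depth calibration are assumptions made for clarity (real systems measure a calibration curve).

```python
import numpy as np

def localize_dhpsf(lobe_a, lobe_b, z_per_degree=0.01):
    """Estimate a probe's 3D position from the two lobes of its
    double-helix point spread function in a 2D image.

    The lateral (x, y) position is the midpoint between the lobes;
    the axial (z) position is encoded in the angle the lobe pair
    makes with the x-axis. A linear angle-to-depth mapping
    (z_per_degree, in microns per degree) is assumed here purely
    for illustration.
    """
    ax, ay = lobe_a
    bx, by = lobe_b
    x, y = (ax + bx) / 2.0, (ay + by) / 2.0           # lateral midpoint
    angle = np.degrees(np.arctan2(by - ay, bx - ax))  # lobe-pair rotation
    z = angle * z_per_degree                          # assumed calibration
    return x, y, z
```

Repeating this fit for the millions of lobe pairs in a dataset is what turns a flat camera frame into the cloud of 3D localizations described above.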
Once these localizations have been processed and pinpointed, the file can be uploaded into the VR viewing system and opened up in vLUME.
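The hand-off from processing to VR viewing amounts to packaging the localization table as a 3D point cloud. As a rough sketch of that step, the snippet below converts a localization table into a generic ASCII PLY point cloud; the CSV column names and the choice of PLY are assumptions for illustration, since the exact file format vLUME expects is not described here.

```python
import csv
import io

def localizations_to_ply(csv_text):
    """Convert a table of localizations (columns x, y, z) into an
    ASCII PLY point cloud that generic 3D/VR viewers can open.

    The input column names are assumed; a real pipeline would match
    whatever schema the localization software emits.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    header = (
        "ply\n"
        "format ascii 1.0\n"
        f"element vertex {len(rows)}\n"
        "property float x\n"
        "property float y\n"
        "property float z\n"
        "end_header\n"
    )
    body = "".join(f"{r['x']} {r['y']} {r['z']}\n" for r in rows)
    return header + body
```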
“Future directions could include the incorporation of a multiuser tool for numerous users to use vLUME within the same environment,” Lee said. “This enables researchers to quickly interact with their data remotely, which, given the pandemic, is becoming more useful. In addition, we are looking at incorporating advanced computation imaging tools such as focused training methods for machine learning to help better understand complex 3D data.”
A paper describing the work was recently published in the journal Nature Methods.