Black Mirror is a show that imagines hypothetical futures for modern tech. Usually delivered with a dark twist, its scenarios are typically pretty outlandish. Playtest is one of those episodes that seems like pure science fiction. Or, well, it used to.
Playtest follows a young fellow named Cooper, who lands a job as a tester for a game company that specializes in survival horror. At his new gig, he tries out experimental AR technology that warps his reality into the most terrifying experience possible, based on its user's responses. What follows is a series of seemingly real and horrifying events that, unknown to Cooper, are an illusion created by the new AR technology.
VR/AR technology isn’t yet immersive enough to recreate an experience as powerful as Playtest, but an up-and-coming full-body haptic suit could change that. It’s called the Teslasuit.
The future of immersion
Using electro-tactile haptic feedback, the Teslasuit can mimic sensations like bumping into a wall, touching an object, or the impact of a punch in AR/VR settings. We witnessed these sensations first-hand at CES 2019. From the prickly patter of raindrops to growing waves of static, we experienced how the Teslasuit can bring virtual worlds to life.
That’s not all the suit can do. Motion capture, climate control, and biometrics are on its list of capabilities. Features like that make Black Mirror’s Playtest episode a real-life possibility.
Motion capture reveals how a person moves and responds to their environment, while climate control can adjust the temperature inside the suit. Biometric data reveals things like the wearer's current heart rate, stress levels, and overall mental and emotional state.
Combine these features with the haptic feedback system and you get a real-life version of Mushroom, the AR computer that put Cooper through the most terrifying video game of his life in Playtest.
Data meets artificial intelligence
The most important part of this equation, though, is biometrics — the key to creating experiences catered to the user. Teslasuit Co-Founder Dimitri Mikhalchuk explained that the biometric data collected by the suit can determine whether the user is uninterested, tired, nervous, stressed, or scared. That's the kind of information a virtual or augmented reality can use to respond to a user's current physical and mental state.
Mikhalchuk envisions the Teslasuit being used exactly like that in the near future. But it’s up to game developers to step up to the challenge.
“This is very important for the gaming industry. We see that in the future, when we come to the end user market, that we will be able to offer a lot of data sensing for the developers to process, for the AI itself to adjust the game to the player,” Mikhalchuk explained. “We could run alternative endings, alternative scenarios because that suit would actually provide the computer with the knowledge of how the person feels in that specific environment.”
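The loop Mikhalchuk describes — sense how the player feels, then steer the game toward an alternative scenario — can be sketched in a few lines. This is a minimal, hypothetical illustration: the field names, thresholds, and scenario labels are all assumptions for the sake of the example, not the Teslasuit's actual API or any shipping game's logic.

```python
from dataclasses import dataclass

# Hypothetical biometric snapshot; the fields and units are
# illustrative assumptions, not the Teslasuit's real data format.
@dataclass
class BiometricSample:
    heart_rate: int          # beats per minute
    skin_conductance: float  # microsiemens; tends to rise with stress

def classify_state(sample: BiometricSample) -> str:
    """Roughly bucket the player's arousal level from two signals.
    Thresholds here are made up for demonstration."""
    if sample.heart_rate > 110 or sample.skin_conductance > 8.0:
        return "scared"
    if sample.heart_rate > 90 or sample.skin_conductance > 5.0:
        return "nervous"
    return "calm"

def pick_scenario(state: str) -> str:
    """Branch the experience the way Mikhalchuk suggests: the engine
    selects an alternative scenario based on how the player feels."""
    return {
        "calm": "escalate: introduce a new threat",
        "nervous": "sustain: keep the current tension",
        "scared": "relieve: offer a safe room before the next scare",
    }[state]

# A frightened player (high heart rate) gets a brief reprieve.
print(pick_scenario(classify_state(BiometricSample(120, 9.5))))
```

The interesting design question is the feedback direction: unlike static difficulty settings, the game reads the player's body each frame and reroutes the scenario, which is exactly what made Playtest's AR system so unsettling.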
That sounds like the immersive VR experience many have hoped for since its introduction. It also sounds a little bit scary.
The views expressed here are solely those of the author and do not reflect the beliefs of Digital Trends.