Black Mirror is a show that takes hypothetical looks at the future of modern tech, usually with a dark twist, and the scenarios it dreams up are typically pretty outlandish. Playtest is one of those episodes that seems like pure science fiction. Or, well, it used to be.
Playtest follows a young fellow named Cooper, who lands a job as a tester at a game company that specializes in survival horror. At his new gig, he tries out new AR technology that warps his reality into the most terrifying experience possible, tailored to its user's responses. What follows is a series of seemingly real, horrifying events that, unknown to Cooper, are an illusion created by the AR technology.
VR/AR technology isn’t yet immersive enough to recreate an experience as powerful as Playtest, but an up-and-coming full-body haptic suit could change that. It’s called the Teslasuit.
Using electro-tactile haptic feedback, the Teslasuit can mimic sensations like bumping into a wall, touching an object, or the impact of a punch in AR/VR settings. We witnessed these sensations first-hand at CES. From the prickly patter of raindrops to growing waves of static, we experienced how the Teslasuit can bring virtual worlds to life.
That’s not all the suit can do. Motion capture, climate control, and biometrics are on its list of capabilities. Features like that make Black Mirror’s Playtest episode a real-life possibility.
Motion capture reveals how a person moves and responds to their environment, while climate control can adjust the temperature inside the suit. Biometric data reveals things like the wearer's current heart rate, stress levels, and overall mental and emotional state.
Combine these features with the haptic feedback system and you get a real-life version of Mushroom, the AR computer that put Cooper through the most terrifying video game of his life in Playtest.
The most important part of this equation, though, is biometrics — the key to creating experiences catered to the user. Teslasuit co-founder Dimitri Mikhalchuk explained that the biometric data collected by the suit can determine whether the user is uninterested, tired, nervous, stressed, or scared. That's the kind of information a virtual or augmented reality can respond to, adapting to the user's current physical and mental state.
Mikhalchuk envisions the Teslasuit being used exactly like that in the near future. But it’s up to game developers to step up to the challenge.
“This is very important for the gaming industry. We see that in the future, when we come to the end user market, that we will be able to offer a lot of data sensing for the developers to process, for the AI itself to adjust the game to the player,” Mikhalchuk explained. “We could run alternative endings, alternative scenarios because that suit would actually provide the computer with the knowledge of how the person feels in that specific environment.”
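To make the idea concrete, here is a minimal Python sketch of the kind of logic Mikhalchuk describes: readings stream in from the suit, the game buckets the player into a coarse emotional state, and that state picks an alternative scenario branch. Every name, threshold, and label below is an illustrative assumption, not Teslasuit's actual API or data format.

```python
# Hypothetical sketch of biometric-driven scenario selection.
# All field names, thresholds, and scenario labels are assumptions
# made for illustration; they are not part of any real Teslasuit SDK.

from dataclasses import dataclass


@dataclass
class BiometricSample:
    heart_rate_bpm: int      # e.g. from the suit's heart-rate sensor
    skin_conductance: float  # 0.0-1.0, a rough proxy for stress/arousal


def classify_state(sample: BiometricSample) -> str:
    """Bucket the player's readings into coarse labels a game could act on."""
    if sample.heart_rate_bpm > 120 or sample.skin_conductance > 0.8:
        return "scared"
    if sample.heart_rate_bpm > 95 or sample.skin_conductance > 0.5:
        return "nervous"
    if sample.heart_rate_bpm < 65 and sample.skin_conductance < 0.2:
        return "uninterested"
    return "engaged"


def pick_scenario(state: str) -> str:
    """Choose an alternative scenario branch based on the player's state."""
    return {
        "scared": "ease_off",         # back off before the player panics
        "nervous": "sustain_tension", # the sweet spot for a horror game
        "uninterested": "escalate",   # introduce a new threat to re-engage
        "engaged": "continue",
    }[state]


# A racing pulse and high skin conductance would steer the game to ease off.
print(pick_scenario(classify_state(BiometricSample(130, 0.9))))  # ease_off
```

In a real title this decision loop would run continuously, which is what makes the "alternative endings" Mikhalchuk mentions possible: the branch taken at each checkpoint depends on how the player actually felt getting there.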
That sounds like the immersive VR experience many have hoped for since its introduction. It also sounds a little bit scary.