
Calm down! Like dogs, the first Intel RealSense games can sense your fear

“Just breathe,” Erin tells me as static fills the screen. Maybe it’s the cup of coffee I just downed, or perhaps it’s a touch of performance anxiety to be trying out this game in front of the developer, but my heart is apparently racing. And the game I’m playing knows it.

By tracking my pulse with Intel’s RealSense camera, Nevermind isn’t just responding to my hand gestures; it’s responding to my emotions. This is not what I expected when I came to try out gaming applications of Intel’s new 3D camera, but it has me immediately more interested than the expected motion controls. I take a deep breath, the static clears from the screen, and I continue.

If you’re not already familiar, Intel RealSense is effectively a Kinect, but with shorter range and higher resolution, and fitted in the bezel of a laptop or tablet, like a conventional webcam. Where the Kinect takes in the entire living room, RealSense has an effective range of about a meter, but a much greater resolution and responsiveness within that limited range.

Game developers can use RealSense for features like facial detection and gesture tracking, but also for subtler, more intriguing interpretations like emotion tracking. And those are just the built-in options. Inventive game makers could use the raw data for just about anything, potentially opening some exciting doors for gamers.

Related: Hands on: Intel RealSense 3D

I recently had a chance to try the very first RealSense-enabled games at an event sponsored by Intel. While some merely ape the Microsoft Kinect and Nintendo Wii, some truly push the boundaries of what games are capable of. Here’s what they’re like.

All you need is a body

Lego Portal Racers is an endless runner in the vein of Temple Run or Subway Surfers. You speed through endless tracks across various Lego-themed universes, collecting points and power-ups for as long as possible before crashing. Tilting your head side to side steers your character, which makes the game intuitive enough that anyone could sit down and get going with little to no instruction. That was the developer’s goal here, and the team nailed it.

Madagascar: Move it! is similarly accessible and targeted at children. It features the lemur from DreamWorks’ animated Madagascar films, dancing with simple, rhythmic gestures for the player to repeat, increasing in speed and complexity as you move up difficulty levels. It’s very similar to the Just Dance series of rhythm games.

The appeal of these games is the same appeal that made the original Wii so popular with people outside of traditional “gamer” demographics: They don’t have to learn an intimidating controller or keyboard-and-mouse controls. The Wii Remote and Wii Sports provided an intuitive interface that anyone could pick up and play with relative ease. RealSense takes that one step further by cutting out the input device entirely and replacing it with gesture controls, and it’s totally portable. Lego Portal Racers developer Metaio’s Jack Dashwood told me that, in testing, parents really liked being able to set their kid up with a laptop in the back seat of a car or on an airplane.

The future means lasers

Laserlife reaches for slightly older gamers, but still focuses on accessibility. It comes from developer Choice Provisions (formerly Gaijin Games), a studio best known for its Bit.Trip series of rhythm games.

You play as a mysterious, alien entity, represented by a laser beam, that stumbles upon the corpse of a human astronaut drifting through the void. Using your advanced technology, you collect the astronaut’s fragmentary memories to get some sense of who or what they were. Each memory is a level, divided into multiple phases as you collect, reform, and then return the memory. The different phases require different gesture controls, such as grabbing glowing shards as they pass by, or subtly raising and lowering your fingers to guide the laser through glowing gates, all in time with the music.

Related: Intel refocuses on gesture input with RealSense

The level I played was an eerie, underwater environment, culminating in finding a boat. As you complete levels, the objects accrue in the space surrounding the astronaut, adding up to a symbolic tableau of their life. While the developers have an elaborate and specific backstory for the astronaut, they have left it intentionally vague within the game, encouraging players to fill in the details with their imagination.

Controls have also been kept intentionally minimal. Each section requires basically a single gesture that is easy to pick up. Studio co-founder Alex Neuse explained that the levels have been cleverly divided into different sections with different movements in order to alleviate the fatigue from doing the same thing for too long. Although they initially experimented with subtler and more complex gesture controls, Neuse said that they ultimately skewed towards bare minimalism because the technology is so new: “We wanted to make sure we could walk before trying to run.”

Once more, with feeling

The most interesting demonstration of the day, though, was first-time developer Flying Mollusk’s Nevermind, a psychological horror adventure game that uses your fear against you. As you explore the game, it uses the RealSense camera to register your pulse as a measure of anxiety. As players get more anxious, the screen fills with static and the environment changes contextually. For instance, in a weird meat locker/dairy in the demo, the floor would flood with milk, slowing you down as your anxiety rose. If things get out of control, the game teleports you back to a soothing hub area to cool off.

While it’s difficult to achieve the kind of immersive anxiety that horror games thrive on while standing in a busy hotel event room for a demonstration, the effect was nevertheless striking. Designer Erin Reynolds first created Nevermind as an MFA project focused on how games could be used to help people. By providing direct, immediate feedback, the game can improve players’ awareness of how their bodies respond to anxiety, and intuitively teach strategies to help mitigate it. “If we can help get people a little more in touch with themselves, I already count that as a win,” explained Reynolds. Although the game currently reads only pulse, Reynolds hopes to experiment with the additional biometric measures and emotion recognition included in the RealSense SDK. She also sees possibilities for more conventional action games. Imagine, for instance, a game that adds more enemies if you’re getting too relaxed.

While not the most polished demonstration of the day, Nevermind was the most interesting because it pointed toward mechanics that were heretofore impossible. The other games merely used motion controls as input for otherwise conventional experiences. Steering a spaceship by waving your hand around rather than moving a mouse is more gimmick than real innovation, and can be fatiguing in a way that using a conventional gamepad is not. A game that responds to your emotions, however, opens up wholly new and exciting design spaces. The Wii and Kinect have already shown that motion-controlled gaming is feasible, but not as revolutionary as developers had hoped. The killer gaming app for RealSense will have to show us gameplay that we could not have imagined otherwise.