Snapchat is moving forward after a redesign drew some negative reactions. So, what’s next for the social network? Two patents published in mid-March suggest the company is researching ways to create 3D models with a smartphone and to read emotion during a video chat. Both patents were filed in November but only recently became public.
The first patent suggests a method for simplifying the process of creating a 3D model, to the point where anyone with a smartphone can create one. Rather than using a complex camera or a wearable device to create a 3D rendering, the patent instead suggests using a smartphone camera and motion to help detect depth. The idea is that, after mapping out points from one view of an object or a face, tracking movement between points will allow the program to construct depth maps for 3D rendering.
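The idea of turning tracked motion into depth can be illustrated with a toy calculation: if the camera translates sideways between two frames, points that are closer to the camera shift more in the image than points that are farther away. The sketch below is purely illustrative, not the patent's method; the focal length, baseline, and function names are all invented for the example.

```python
# Toy depth-from-motion sketch (hypothetical values throughout).
# Assumes the phone translates sideways by a known baseline between
# two frames, so a tracked point's horizontal shift (disparity)
# is inversely proportional to its depth.

FOCAL_PX = 800.0   # assumed camera focal length, in pixels
BASELINE_M = 0.05  # assumed sideways camera motion, in meters

def depth_from_parallax(x_frame1, x_frame2):
    """Estimate a tracked point's depth (meters) from its pixel shift."""
    disparity = abs(x_frame1 - x_frame2)  # apparent motion in pixels
    if disparity == 0:
        return float("inf")  # no shift: point is effectively at infinity
    return FOCAL_PX * BASELINE_M / disparity

# A point that shifts 40 px between frames is closer than one
# that shifts only 4 px.
near = depth_from_parallax(400.0, 360.0)  # large shift -> small depth
far = depth_from_parallax(400.0, 396.0)   # small shift -> large depth
```

Repeating this for many tracked points across many views is what lets a program assemble the depth maps the patent describes, though a real system would also have to estimate the camera's motion rather than assume it.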
Snap isn’t just looking to make 3D modeling simple by using a smartphone camera, however. The patent also details a user interface that would guide the user in moving the camera around an object or, for a face, in angling the head properly. On-screen instructions would assist with alignment, while alerts would warn of problems — such as moving the camera too fast or shooting in limited lighting — and suggest how to correct them.
The patent doesn’t detail what the technology would be used for, but it’s easy to imagine how quickly created 3D models could feed into an augmented reality platform like Snapchat’s Lenses, potentially placing those models in an AR scene. The Snapchat-owned Bitmoji could be another application for the 3D face mapping.
The patent suggests that the technology could be used on a device with a built-in camera like a smartphone, tablet or computer — or it could even be a stand-alone device.
The second patent also creates a facial map — but this time, to detect emotions during a video chat. The person in the video would first have to approve a request for the emotion mapping. The program would then use object recognition to find the face and build a “face mesh” — essentially a map of the face’s various features. By measuring how that mesh changes over time, the program would look for signs of emotion.
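The "measure changes in the mesh" step can be sketched in a few lines: represent the face as landmark points, then compare distances between landmarks (say, how far apart the mouth corners are) against a neutral baseline. The landmark names, point coordinates, and threshold below are all invented for illustration; the patent does not publish an algorithm.

```python
# Hypothetical face-mesh sketch: flag emotion-related changes by
# comparing landmark spacing against a neutral baseline.
# Landmark names, coordinates, and the threshold are invented.
import math

def distance(p, q):
    """Euclidean distance between two 2D landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def mesh_changes(neutral_mesh, current_mesh, threshold=0.15):
    """Return landmark pairs whose spacing shifted notably vs. neutral."""
    pairs = [("mouth_left", "mouth_right"), ("brow_left", "eye_left")]
    changes = {}
    for a, b in pairs:
        base = distance(neutral_mesh[a], neutral_mesh[b])
        now = distance(current_mesh[a], current_mesh[b])
        if base and abs(now - base) / base > threshold:
            changes[(a, b)] = (now - base) / base
    return changes

neutral = {"mouth_left": (40, 80), "mouth_right": (60, 80),
           "brow_left": (42, 30), "eye_left": (42, 40)}
smiling = {"mouth_left": (35, 78), "mouth_right": (65, 78),
           "brow_left": (42, 30), "eye_left": (42, 40)}

# Mouth corners spread ~50% wider than neutral: a candidate smile
# signal; the unchanged brow-to-eye distance is not flagged.
flagged = mesh_changes(neutral, smiling)
```

A production system would use hundreds of mesh points and a trained classifier rather than hand-picked distances, but the underlying signal — geometric change relative to a baseline face — is the same.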
Interestingly enough, the patent doesn’t propose using the feature to suggest emoji during a video chat between friends, but during a chat between a customer and a customer service representative — because, as the patent says, “customer anger is not always easy to spot.” Snap Inc. isn’t the only company looking into tracking emotions: patents published last year suggested Facebook had researched tracking emotions through a camera, keyboard movement, and a touchpad.
Many patents never turn into anything besides a bunch of legal paperwork, so there’s no guarantee that a future Snapchat could allow you to create 3D models with a phone — or really let that customer service agent see how angry you are. The patents, however, do offer a look at what Snap Inc. researchers are dreaming up.