Samsung’s Galaxy S9 and S9 Plus come with a variety of new camera features including AR Emoji, Super Slow Motion, and a variable aperture. But there’s one more new trick that we’ve never seen built into a smartphone before – the augmented reality makeup tool in Bixby Vision.
Baked into the S9’s camera app, Makeup lets you apply different products to your face with augmented reality. Point the selfie camera at yourself, and the makeup styles you pick are layered over your face, much like Snapchat filters. The benefit is that you’re trying on makeup in the comfort of your own home, with no cleanup needed afterwards. It’s the first time we’ve seen this kind of feature built into a smartphone, but the technology is far from new. Samsung tapped ModiFace, a company with more than a decade of research in this field, to integrate its beauty AR technology into the camera.
From skincare to augmented reality makeup
If you’ve ever used the Sephora app to virtually try on makeup, or Benefit’s Brow Try-On app to test out a new eyebrow shape, then you’ve used ModiFace’s AR technology. The company now powers more than 200 custom augmented reality apps for high-end beauty brands, but it all started with skincare.
Before the smartphone revolution, ModiFace worked with dermatologists and certain brands to help people find solutions for skincare concerns.
People could upload a photo of themselves to a web app, and the company would process it to pinpoint problem areas and suggest improvements to skincare routines.
ModiFace originally used 2D images with the web app, but as technology progressed, the company soon moved to letting users upload video instead. When smartphones took off, ModiFace began testing its technology with augmented reality and added makeup and hair to its roster of supported features.
“With the smartphone boom, it was the perfect opportunity for us to expand because it allowed us to bring AR right to our users in the palm of [their] hand,” Jeff Houghton, ModiFace vice president of Technology, told Digital Trends.
Its current software development kits (SDKs) are the culmination of 10 years of engineering. By working closely with beauty brands, ModiFace is able to provide a lightning-fast, easy-to-follow user experience while also encouraging product discoverability.
Its technology doesn’t stop at smartphone apps; it extends to retail stores as well. Back in November, MAC Cosmetics debuted its MAC Virtual Try-On Mirror at certain locations. The mirror lets customers virtually try on makeup in real time by swiping through the interface, which helps eliminate the need to test a pile of products in the store and narrows the choices down to the styles you like most.
Putting its tech on the Galaxy S9
That same technology and concept is what’s now available on the new Galaxy S9. By collaborating with Samsung, ModiFace was able to optimize the experience.
“Starting in house, we train our Neural Networks on thousands of images to create the base tracking and face analysis for our apps,” Houghton said. “This Neural Network is then embedded inside Samsung’s app. We worked with Samsung to tweak several parameters to make sure we were achieving the effects that brands and end users want to see.”
The partnership between the two companies could kick-start a new trend for phone manufacturers. While beauty technology is still in its infancy, makeup has found its way onto our screens for years through social media. Whether it’s YouTube or Instagram tutorials, people constantly look to their smartphones for new makeup inspiration. With the Galaxy S9, all you need to do is swipe open the camera and you have a catalog of makeup at your disposal.
On the Galaxy S9, you can search through a variety of cosmetic products from both Sephora and Cover Girl, and you can purchase them on the spot. As you scroll through each product, it applies itself to your face like a Snapchat filter.
Using facial tracking and 3D video rendering, the makeup filters are mapped to the face at 30 frames per second. There’s no lag and nothing is misaligned; it works as instantly as tapping the next product you want to try on. You can try on complete looks – which include lipstick, foundation, eyeshadow, blush, mascara, and eyeliner – or you can try each product separately.
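To get a feel for how a per-frame pipeline like this fits together, here is a minimal sketch in Python. It uses the open-source MediaPipe and OpenCV libraries rather than ModiFace’s proprietary SDK, and it paints a single made-up lipstick shade over tracked lip landmarks – an illustration of the general technique, not the code running on the Galaxy S9.

```python
# Minimal virtual-lipstick sketch: webcam frames -> face landmarks -> tinted lip overlay.
# Open-source MediaPipe + OpenCV stand in for ModiFace's proprietary pipeline.
import cv2
import mediapipe as mp
import numpy as np

# Approximate outer-lip outline in MediaPipe's 468-point face mesh.
LIP_IDX = [61, 146, 91, 181, 84, 17, 314, 405, 321, 375, 291,
           409, 270, 269, 267, 0, 37, 39, 40, 185]
LIP_COLOR = (40, 40, 200)  # BGR: a made-up red shade, not a real product
ALPHA = 0.4                # blend strength of the virtual product

cap = cv2.VideoCapture(0)
with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            pts = np.array([(int(lm[i].x * w), int(lm[i].y * h)) for i in LIP_IDX],
                           dtype=np.int32)
            # Paint the lip polygon on a copy, then alpha-blend it back onto the frame.
            overlay = frame.copy()
            cv2.fillPoly(overlay, [pts], LIP_COLOR)
            frame = cv2.addWeighted(overlay, ALPHA, frame, 1 - ALPHA, 0)
        cv2.imshow("virtual lipstick", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

A production renderer layers far more on top of this – per-product textures, finishes, and lighting response – but the basic loop of tracking landmarks and compositing color at camera frame rate is the same idea.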
Trying on foundation can be extremely tricky; it almost always requires going to the store and having an employee help you find your exact shade. There are also different foundation types to take into account – matte, sheer, water-based, or full coverage. To make sure the shades are as accurate as possible, the ModiFace team tediously went through each product and compared how it looked in real life versus how it looked on the smartphone, working with both Samsung and the brands directly to create a matching render.
“I myself was actually involved with that process and it’s kind of a lot of fun,” Houghton said. “You get to try on a lot of different things, and then you get to wipe them off and see how they look in the app. It involves a lot of screenshot comparison and manual work to make sure that the product we’ve included is just perfect.”
But there’s still room for improvement. Using deep learning and image research, the team studies internal data, gathered from those testing the final product, to learn more about a user’s face and better train its algorithms. This helps produce more accurate results for factors like how colors mix onto your lips, how light gets added into those colors, and how your face should look under different lighting when you’re wearing a specific foundation. The data also helps improve the texture and overall coverage of the makeup.
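As a rough illustration of that color-mixing idea – a toy example, not ModiFace’s actual algorithm – a renderer might borrow the product’s chroma while keeping the wearer’s own lightness, so highlights and shadows survive the recolor. In OpenCV’s LAB color space, that could look like the hypothetical helper below.

```python
# Toy color-mixing sketch: keep the skin's lightness (L), borrow the product's chroma (a, b).
import cv2
import numpy as np

def apply_product_color(skin_bgr: np.ndarray, product_bgr, strength: float = 0.6) -> np.ndarray:
    """Blend a product shade onto skin pixels while preserving the skin's own lighting."""
    lab_skin = cv2.cvtColor(skin_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    # Build a solid swatch of the product color, the same size as the skin patch.
    swatch = np.empty_like(skin_bgr)
    swatch[:] = product_bgr
    lab_prod = cv2.cvtColor(swatch, cv2.COLOR_BGR2LAB).astype(np.float32)

    out = lab_skin.copy()
    # Mix only the chroma channels (a, b); leave L (lightness) untouched.
    out[..., 1:] = (1 - strength) * lab_skin[..., 1:] + strength * lab_prod[..., 1:]
    return cv2.cvtColor(out.astype(np.uint8), cv2.COLOR_LAB2BGR)

# Example: tint a lip patch cropped from a selfie with a made-up red shade.
# lips = frame[y0:y1, x0:x1]
# frame[y0:y1, x0:x1] = apply_product_color(lips, (40, 40, 200), strength=0.5)
```

Modeling finish (matte versus gloss), translucency, and how a shade interacts with the underlying skin tone takes far more sophistication, which is where that deep-learning work comes in.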
With this kind of deep learning, the company is able to create new and more realistic effects. Recently, ModiFace partnered with L’Oreal to create 3D hair tracking that can recolor your hair in real time. The company has plans to work with Samsung to bring similar effects to its phones.
As for privacy concerns, ModiFace doesn’t collect or monitor any data from its applications once they’re in users’ hands. Any images you take with the apps never leave your device.