Google is rolling out a new AR effects tool on YouTube Stories. Like the filters available on Snapchat and Instagram, YouTube Stories' AR filters feature 3D hats, glasses, and masks.
The difference, according to Google: YouTube Stories' AR effects are the most accurate filters out there, all thanks to machine learning.
AR Effects And Machine Learning
Google, in a recent blog post, detailed how these AR effects work. According to Google AI research engineers Ivan Grishchenko and Artsiom Ablavatski, rendering AR objects convincingly onto the real world requires perception technology that can track "highly dynamic surface geometry," a property that changes with each individual's facial expressions.
YouTube Stories' AR filters are comparable to Apple's Animoji, but they work without a dedicated depth sensor. They run on a single ordinary camera, using machine learning to infer 3D surface geometry and drive the visual effects. The result is a more precise and refined virtual content overlay.
"Our ML pipeline consists of two real-time deep neural network models that work together: A detector that operates on the full image and computes face locations, and a generic 3D mesh model that operates on those locations and predicts the approximate surface geometry via regression," Grishchenko and Ablavatski explained.
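The two-stage flow the engineers describe can be sketched as follows. This is a minimal illustration of the data flow only, not Google's implementation: `detect_faces` and `predict_mesh` are hypothetical placeholder functions standing in for the two neural networks, and the 468-vertex mesh size is an assumption about the output shape.

```python
import numpy as np

def detect_faces(image):
    """Stage 1 (hypothetical stand-in for the detector network):
    scan the full image and return face bounding boxes as
    (x, y, width, height) tuples."""
    h, w = image.shape[:2]
    # Placeholder: pretend one face occupies the center of the frame.
    return [(w // 4, h // 4, w // 2, h // 2)]

def predict_mesh(face_crop, num_vertices=468):
    """Stage 2 (hypothetical stand-in for the mesh network):
    regress an approximate 3D surface mesh (x, y, z per vertex)
    from the cropped face region alone -- no depth sensor input."""
    # Placeholder: a zero array standing in for the regression output.
    return np.zeros((num_vertices, 3))

def ar_pipeline(image):
    """Chain the two stages: detect faces, then predict a mesh
    for each detected region."""
    meshes = []
    for (x, y, w, h) in detect_faces(image):
        crop = image[y:y + h, x:x + w]
        meshes.append(predict_mesh(crop))
    return meshes

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in camera frame
meshes = ar_pipeline(frame)
```

Splitting detection from mesh regression lets the cheap detector run on every frame while the heavier mesh model works only on small face crops, which is what makes a single-camera, real-time pipeline feasible on a phone.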