Projection mapping creates ‘living makeup’ to transform faces

By combining art and technology, a person's face can be changed in real time through a technique called projection mapping.

Omote is a new projection-mapping demo by Japanese producer and technical director Nobumichi Asai that uses technology to transform a model's face. Projection mapping uses projected light to create the illusion of movement and depth on an object.

Asai was inspired by the Noh mask, used in classical Japanese musical dramas. His work combines digital design with CGI and makeup. To create the projection map, the model's face is laser-scanned so that the system knows its dimensions and contours. Tiny markers placed on the model's face are then tracked so the system can follow its exact motion. Digital imagery is then projected onto the face, staying aligned as it moves.
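As a rough illustration of the two steps described above, the sketch below aligns a pre-scanned face mesh to live marker positions and then maps the aligned mesh into projector pixel coordinates. This is a minimal conceptual sketch, not Asai's actual pipeline; the array shapes, marker count, and projector intrinsics are all illustrative assumptions.

```python
# Conceptual sketch of markerbased projection mapping (not Asai's real system).
# Step 1: align the laser-scanned face mesh to the tracked marker positions.
# Step 2: project the aligned mesh into projector pixel space.
import numpy as np

def estimate_rigid_transform(ref_markers, live_markers):
    """Kabsch algorithm: rotation R and translation t mapping ref -> live."""
    ref_c = ref_markers.mean(axis=0)
    live_c = live_markers.mean(axis=0)
    H = (ref_markers - ref_c).T @ (live_markers - live_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = live_c - R @ ref_c
    return R, t

def project_to_projector(points_3d, K):
    """Pinhole projection of 3D points into projector pixel coordinates."""
    proj = (K @ points_3d.T).T                   # (N, 3)
    return proj[:, :2] / proj[:, 2:3]            # perspective divide -> (N, 2)

# --- illustrative stand-in data ------------------------------------------
face_mesh = np.random.rand(500, 3)               # stand-in for the laser scan
ref_markers = face_mesh[:5]                      # dots placed on the scanned face
t_true = np.array([0.0, 0.0, 0.5])
live_markers = ref_markers + t_true              # what the tracker might report

K = np.array([[1400.0,    0.0, 960.0],           # assumed projector intrinsics
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])

R, t = estimate_rigid_transform(ref_markers, live_markers)
pixels = project_to_projector(face_mesh @ R.T + t, K)
print(pixels.shape)  # (500, 2): where each mesh vertex lands in the projector frame
```

In a real-time system these two steps would run every frame, so the projected "makeup" stays locked to the face as the model moves.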

The result is "living makeup," in which the model's features are transformed in real time to resemble a cyborg. Each time the model moves her head, her features shift, much like those of characters in sci-fi movies.

This technology is commonly used by visual artists in concerts and product demos. Asai previously worked with Subaru to project CGI onto cars and buildings. Projecting graphics onto buildings or rooms has been done before, but those subjects are normally stationary. Now visual artists can have their models move in real time to create mind-blowing facial effects.

It is not yet known how much movement the system can handle.

Asai has used Microsoft Kinect in past work, but it is not yet known whether Kinect is part of this project. Microsoft is also working on a projection project known as IllumiRoom, which extends Xbox One games beyond the TV to cover the entire room.

Video: OMOTE / Real-Time Face Tracking & Projection Mapping, from something wonderful on Vimeo.
