Google experiment mirrors user movements in images


Google has launched a machine learning experiment called Move Mirror. Using a webcam, the system detects the user's movements and mirrors them with matching images from a database. The result can be saved as a GIF.

For Move Mirror, Google uses a machine learning model that detects the user's movements in front of the webcam and compares them with the postures of people in images in a database. The experiment uses PoseNet, a model that can recognize people and their most important joints in photos and videos.
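PoseNet is available as a prebuilt package for TensorFlow.js, so this detection step can run entirely in the browser. The sketch below is a minimal illustration, assuming the @tensorflow-models/posenet package and an existing <video> element on the page; the setupCamera helper is not part of the library.

```typescript
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

// Attach the webcam stream to an existing <video> element on the page.
async function setupCamera(): Promise<HTMLVideoElement> {
  const video = document.querySelector('video') as HTMLVideoElement;
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();
  return video;
}

async function run(): Promise<void> {
  const video = await setupCamera();

  // Load the default MobileNet-based PoseNet model.
  const net = await posenet.load();

  // Estimate the single most prominent pose in the current frame.
  const pose = await net.estimateSinglePose(video, { flipHorizontal: true });

  // Each keypoint is a named joint (nose, leftShoulder, ...) with an
  // (x, y) position and a confidence score.
  for (const kp of pose.keypoints) {
    console.log(kp.part, kp.position.x, kp.position.y, kp.score);
  }
}

run().catch(console.error);
```

Each pose contains 17 keypoints, and the per-keypoint confidence score can be used to discard joints the model is unsure about.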

Move Mirror uses over 80,000 images and works with TensorFlow.js, which allows machine learning models to be run in the browser. The webcam images are processed only locally by the algorithm and are neither sent to a server nor stored.
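Matching the detected pose against the image database amounts to comparing pose vectors. The following rough sketch shows one way this could work, ranking candidates by cosine similarity; the flattenPose and findBestMatch helpers and the simple linear scan are assumptions for illustration, whereas the real system needs a faster nearest-neighbour search to cover its 80,000 images in real time.

```typescript
type Keypoint = { x: number; y: number };

// Flatten 17 keypoints into a 34-element vector and L2-normalize it,
// so poses are compared by shape rather than absolute position or scale.
function flattenPose(keypoints: Keypoint[]): number[] {
  const v = keypoints.flatMap(kp => [kp.x, kp.y]);
  const norm = Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return v.map(x => x / norm);
}

// Cosine similarity between two normalized pose vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}

// Return the index of the database pose most similar to the live pose.
// (Illustrative linear scan; a production system would use an index
// structure instead of scanning every entry per frame.)
function findBestMatch(live: number[], database: number[][]): number {
  let best = 0;
  let bestScore = -Infinity;
  database.forEach((candidate, i) => {
    const score = cosineSimilarity(live, candidate);
    if (score > bestScore) {
      bestScore = score;
      best = i;
    }
  });
  return best;
}
```

Because both detection and matching run in the browser, the webcam frames never have to leave the user's machine.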

Google Creative Lab employees have posted an extensive blog post on Medium explaining the experiment. One aim of the experiment is to show that such techniques are accessible to anyone with a computer and webcam.
