Data in the Wild #5: The Google Face lift

Google Meet’s Latest Treat

Now and then, I hop onto Google Meet for a video call with friends. Why? Honestly, because I’m not a fan of hunting for the perfect phone angle while working on my laptop. I’m all about those all-in-one solutions! Also, the value of screen sharing can’t be overstated!

As the call winds down, I love playing around with filters to see what new features have been added. To my surprise, I came across a curious new option: “Touch up my appearance.” It’s one thing to retouch images for Instagram, but live video adjustments on a platform used for business? Well, I had to try it. I was honestly impressed. The filter subtly softened my face and brightened my shadowy 4:30 p.m. living room without making me look like an airbrushed Disney character! This is the magic of machine learning, face detection technology, and the data-driven brilliance of Google at work.

Data in the Wild

Welcome back to Data in the Wild, the series where we highlight everyday examples of data viz in action. As always, it’s a pleasure to write on this platform for a like-minded audience whose answer to the question “Why does everyone lie about hating pie charts?” is “Because they struggle to keep it 100!” Today, we’re diving into Google’s “Touch up my appearance” filter.

A GIF from Google’s support page showing how the feature works

The Art of Pixel Perfect

In the past, we’ve explored data that goes way back in history: wind vanes, for instance, have been tracking the atmosphere for centuries. But today, we’re zooming into the present to explore how Google turns live video data into AI models that reimagine our faces in real time. Enter Google’s “Touch up my appearance” filter.

So, how does it work? Well, Google hasn’t given us the exact breakdown, but we can make some educated guesses. It comes down to three key data operations:

  1. Face Detection & Landmark Mapping
    Google’s AI model was trained on thousands of faces to learn what makes up a face. It maps out key facial landmarks (your eyes, nose, mouth, and jawline) to get a sense of your structure. With that map, it can adjust your appearance with precision.
  2. Smoothing and Blemish Reduction
    Think of it as a super-slick, automated version of Photoshop. The AI measures the light in the image and builds a histogram of luminance levels. It then softens high-contrast areas (hello, dark bags under the eyes!) and applies a subtle Gaussian blur to smooth out imperfections. All of this happens in the blink of an eye.
  3. Adaptive Lighting
    The lighting technique is the same as the one above, but this time it’s applied to your background: the tool adjusts the lighting around you to create a more balanced, natural look.
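Google hasn’t published its actual pipeline, but steps 2 and 3 can be sketched in a few lines of NumPy. Everything below is illustrative: the kernel size, blend strength, and target brightness are guesses, not Google’s values, and a toy grayscale array stands in for a real video frame.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Normalized 2-D Gaussian kernel for the blur step."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def smooth(frame, strength=0.6):
    """Blend the frame with a Gaussian-blurred copy of itself.

    strength=0 returns the frame untouched; strength=1 is fully
    blurred. The partial blend is what keeps the effect subtle.
    """
    k = gaussian_kernel()
    pad = k.shape[0] // 2
    padded = np.pad(frame.astype(float), pad, mode="edge")
    blurred = np.empty_like(frame, dtype=float)
    for i in range(frame.shape[0]):
        for j in range(frame.shape[1]):
            window = padded[i:i + k.shape[0], j:j + k.shape[1]]
            blurred[i, j] = (window * k).sum()
    return (1 - strength) * frame + strength * blurred

def lift_shadows(frame, target_mean=128.0):
    """Histogram-style brightness lift: gently nudge the frame's
    mean luminance toward a target, clipped to the 8-bit range."""
    gain = 1 + 0.5 * (target_mean - frame.mean()) / target_mean
    return np.clip(frame * gain, 0, 255)

# Toy 8x8 grayscale "frame": mid-gray with one harsh dark patch
# (think under-eye shadow) that the filter should soften and lift.
frame = np.full((8, 8), 100.0)
frame[3:5, 3:5] = 30.0
out = lift_shadows(smooth(frame))
```

A real implementation would run per color channel on the GPU, restrict the smoothing to the face region found in step 1, and use far more sophisticated tone mapping, but the shape of the idea is the same: blur to reduce contrast, then rebalance the histogram.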

Now, this is just a snapshot of the process; there are likely many other behind-the-scenes optimizations. In the end, the system takes raw video feed data, applies computer vision models, and dynamically adjusts the output based on learned patterns, all in milliseconds. It’s real-time data visualization happening right inside your Google Meet call!

Data Is Evolving

Data visualization has been around for thousands of years, yet it keeps evolving to fit new technologies and platforms. So, the next time you turn on your webcam and spot that “Touch up my appearance” feature, remember you’re engaging with data viz: a massive machine learning system trained on thousands of faces, fine-tuned to balance subtlety and realism.

Whether you see it as a handy tool or digital vanity, it’s another example of how data is shaping the way we see ourselves…literally.

Catch you next time as we uncover more Data in the Wild!