Google announced that a brand-new feature is coming to Google Search. The new feature aims to improve the way you shop through Google, particularly when shopping for clothes. The new try-on AI model will let users see real clothes on real models with different body shapes and sizes, making it easier to judge whether an item suits a body shape and size like yours.
AI has been in the news a lot lately, and it has received so much attention in the last few months that prominent individuals, companies, and even politicians are calling for new legislation to help keep the technology under control. While we won’t get into the politics, there are some genuinely beneficial use cases for AI, and this new try-on feature from Google appears to be one of them.
Google is trying to fix a familiar problem: online shoppers can’t get an exact idea of whether a garment will fit them. Whether the issue is size, shape, or fit, it’s still easier to shop in person than online, since a garment can be tried on in a fitting room and the material can be felt. While the feel of a fabric isn’t something current technology can tackle, Google appears to have a solution for the other problem.
The new Virtual Try-On feature will make shopping online easier by showing clothes on models with shapes and sizes similar to the shopper’s own. The feature will also include new filters to help users find exactly what they’re looking for.
“While apparel is one of the most-searched shopping categories, most online shoppers agree: It’s hard to know what clothes will look like on you before you buy them. Forty-two percent of online shoppers don’t feel represented by images of models, and fifty-nine percent feel dissatisfied with an item they shopped for online because it looked different on them than expected.
Now, thanks to our new virtual try-on tool on Search, you can see whether a piece is right for you before you buy it.” says Google in its blog post.
The new virtual try-on for apparel feature will show clothes on a variety of real models. Google selected models ranging in size from XXS to 4XL, representing different skin tones, body shapes, ethnicities, and hair types. Users will be able to see clothing on different models, tailoring the experience and giving them a better idea of how a garment would look on someone most similar to them.
Google says that the virtual try-on feature is already rolling out to US shoppers, starting with women’s tops from brands across Google, including Anthropologie, Everlane, H&M, and LOFT. Users can tap products with the “Try On” badge on Search and select the model that resonates with them.
How does it work?
(Image: The diffusion model sends each image to its own neural network (a U-net) to generate the output: a photorealistic image of the person wearing the garment. Source: Google)
The feature is built on a diffusion model developed by Google. Google explains that diffusion works by gradually adding extra pixels (or “noise”) to an image until it becomes unrecognizable, and then training a model to gradually remove that noise, step by step, until a clean, high-quality image is reconstructed.
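The noising half of that process can be sketched in a few lines. This is a toy illustration of the general diffusion idea, not Google’s actual model; the array sizes and noise schedule are made up for demonstration:

```python
import numpy as np

# Toy sketch of the forward diffusion process: Gaussian noise is mixed
# into an "image" a little at a time until almost nothing of the
# original remains. A trained model would learn to reverse these steps.

rng = np.random.default_rng(0)
image = rng.random((8, 8))              # stand-in for a real photo
steps = 10
betas = np.linspace(0.01, 0.3, steps)   # noise schedule (assumed values)

x = image.copy()
for beta in betas:
    noise = rng.standard_normal(x.shape)
    # Blend in a small amount of noise while shrinking the signal,
    # so the total variance stays roughly constant.
    x = np.sqrt(1 - beta) * x + np.sqrt(beta) * noise

# After enough steps, x is close to pure noise: its correlation with
# the original image is low.
print(np.corrcoef(image.ravel(), x.ravel())[0, 1])
```

A generative diffusion model is trained on the reverse direction: starting from pure noise, it repeatedly predicts and subtracts a bit of noise until a photorealistic image emerges.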
Unlike text-to-image models such as Imagen, which generate realistic images from a text prompt, VTO (Virtual Try-On) is conditioned on a pair of images: one of the garment and one of the person. Each image is sent to its own neural network, and the two networks share information through a process called “cross-attention” to generate the output: a photorealistic image of the person wearing the garment.
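Cross-attention itself can be illustrated with a small, self-contained sketch. The dimensions and feature counts here are hypothetical; the real VTO networks are large U-nets operating on image features:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
d = 16                                        # feature dimension (assumed)
person_feats = rng.standard_normal((32, d))   # 32 person-image feature vectors
garment_feats = rng.standard_normal((48, d))  # 48 garment-image feature vectors

# In cross-attention, queries come from one branch (the person image)
# while keys and values come from the other branch (the garment image).
q = person_feats
k, v = garment_feats, garment_feats

# Each person feature is compared against every garment feature...
weights = softmax(q @ k.T / np.sqrt(d))
# ...and garment information is merged in, weighted by relevance.
fused = weights @ v

print(fused.shape)  # (32, 16): one garment-aware vector per person feature
```

The key point is that the two networks stay separate but exchange information at these attention layers, which is how the garment’s appearance gets transferred onto the person in the generated image.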
Google took hundreds of images of real models with different skin tones, shapes, and sizes, which were used to train the model to produce realistic images of them wearing different garments.
If you’re interested in learning more, we recommend Google’s blog post that goes into more detail about the process and the model used to train and generate these images.
Lots of use cases for similar technologies
While AI raises concerns about jobs and innovation, there’s a reason these tools are so popular nowadays: they help make everyday life easier, and the new virtual try-on feature is an excellent example.
Amazon and other businesses have previously experimented with AR (Augmented Reality) that would let users try on clothes and shoes virtually, just by looking in a mirror or turning on their smartphone camera. While that future doesn’t appear to have reached the majority of the public yet, it’s clearly the next step in personalizing and tailoring the experience further.
Similar technologies could be applied to other products, such as shoes, other wearable items, and furniture. The possibilities are endless, and there’s a lot of potential to offer even more information thanks to the advanced AI models that exist today.
These are the best phones from Google in 2023
Google Pixel 7
The new Google Pixel 7 is powered by the all-new Google Tensor G2 chipset. The device is coupled with 8GB of memory, and it comes in 128GB and 256GB storage tiers. The phone features a significantly improved camera system, it’s more portable than the last generation, and it’s available in Obsidian, Lemongrass, and Snow colors.
Google Pixel 7 Pro
$699 $899 Save $200
The Google Pixel 7 Pro, powered by the new Google Tensor G2 chipset, provides great graphics performance and computing power to let you easily play all of your favorite games and multitask. The device also has a highly capable camera setup that’s backed by a unique post-processing algorithm that helps achieve great results.
Google Pixel 7a
$450 $500 Save $50
The Pixel 7a is the new affordable smartphone from Google, featuring much-awaited features such as a 90Hz responsive display, a more powerful camera setup, and support for wireless charging. The Pixel 7a is powered by the Tensor G2 chipset, and it’s the best device in the A series yet.