Google has introduced a new AI-powered feature that lets users virtually try on clothes using just a single photo of themselves. The tool, currently rolling out in the U.S., is built into Google Search and Google Shopping, allowing shoppers to see how specific clothing items would realistically look on their own bodies.
To use the feature, users upload a full-body photo. Google’s generative AI then simulates how clothes like shirts, dresses, or pants would hang, stretch, or fold based on body shape, pose, and even fabric behavior, down to wrinkles and how a material moves. Unlike typical virtual try-ons that place an outfit over a generic model, this system personalizes everything to your unique frame and appearance.
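For readers curious about the mechanics, the flow described here (one photo of the shopper plus a product image in, a composited preview out) matches the image-conditioned generation pattern used in published virtual try-on research such as Google's TryOnDiffusion paper. The sketch below is a minimal illustration of that general pattern only; `VirtualTryOnModel`, `TryOnRequest`, and every method in it are hypothetical stand-ins, not Google's actual API.

```python
from dataclasses import dataclass
from PIL import Image

# Hypothetical interface sketching an image-conditioned try-on flow.
# None of these names correspond to a real Google API.

@dataclass
class TryOnRequest:
    person: Image.Image    # full-body photo uploaded by the shopper
    garment: Image.Image   # product image of the clothing item

class VirtualTryOnModel:
    """Stand-in for a generative model conditioned on two images."""

    def estimate_pose_and_shape(self, person: Image.Image) -> dict:
        # A real system would run pose and body-shape estimation here so
        # the garment can be fitted to the wearer's frame. Stubbed out.
        return {"pose": "...", "shape": "..."}

    def generate(self, req: TryOnRequest) -> Image.Image:
        body = self.estimate_pose_and_shape(req.person)
        # Conditioned on the person, the garment, and the estimated body
        # parameters, a diffusion-style generator synthesizes the composite.
        # In systems of this kind, drape, stretch, and wrinkles are produced
        # by the generative model rather than a physics engine.
        _ = body
        return req.person  # placeholder: a real model returns a new image

if __name__ == "__main__":
    model = VirtualTryOnModel()
    preview = model.generate(TryOnRequest(
        person=Image.new("RGB", (768, 1024)),   # stands in for the uploaded photo
        garment=Image.new("RGB", (768, 1024)),  # stands in for the product shot
    ))
    preview.save("try_on_preview.png")
```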
The feature launched with a limited set of women’s tops from brands like H&M, Everlane, Anthropologie, and LOFT, but more categories are expected soon. It also supports diverse body types, sizes (from XXS to 4XL), and skin tones, making it one of the more inclusive shopping tools released to date.
Beyond the visuals, Google is pairing the try-on tool with smart shopping features like price tracking, which notifies you when an item hits your target price, and an agentic checkout that can complete the purchase on your behalf, both part of the “AI Mode” experience first teased at Google I/O. This broader update aims to make the entire shopping flow, from discovery to purchase, smarter and more seamless. With virtual previews that mirror reality and AI helping you shop, Google is making a clear move toward redefining how people buy clothes online.