Google’s AI Try-On Feature For Clothes Now Works With Just A Selfie

Google AI try-on now works with a selfie, letting shoppers preview clothes instantly using Gemini-powered virtual fitting technology.
Matilda

Google AI Try-On Now Works With Just a Selfie

Google AI try-on is taking a major leap forward by letting shoppers virtually try on clothes using only a selfie. Announced this week, the update removes the need for a full-body photo, making virtual fashion previews faster and far more accessible. Users can now upload a simple selfie, select their usual clothing size, and instantly see how outfits might look on a realistic, AI-generated version of themselves. The feature is powered by Nano Banana, Google’s Gemini 2.5 Flash Image model, which generates a full-body digital likeness from minimal input. For shoppers who have wondered whether AI try-ons are accurate or worth the effort, the update is Google’s attempt to improve both at once. The rollout begins today in the United States, signaling Google’s growing push into AI-powered shopping experiences. It also reflects a broader shift toward frictionless, mobile-first ecommerce tools.

Google’s AI Try-On Feature For Clothes Now Works With Just A Selfie. Credit: Klaudia Radecka/NurPhoto / Getty Images

How Google AI Try-On Uses Gemini to Create Full-Body Avatars

At the core of the new Google AI try-on experience is Gemini’s advanced image-generation technology. Instead of requiring a carefully framed, full-body photo, the system extrapolates body shape and proportions from a selfie combined with user-selected sizing details. This allows Google to generate multiple full-body images that simulate how clothing might drape, fit, and appear in real life. Users can browse several generated options and choose one as their default try-on image for future shopping sessions. While the AI doesn’t claim to be a perfect mirror, it’s designed to provide a practical visualization rather than a stylized model shot. This approach lowers the barrier to entry for casual shoppers who want quick answers before buying. It also aligns with how people naturally shop on their phones—fast, visual, and low effort.
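Google hasn’t published the internal pipeline behind the feature, but the underlying model is available to developers through the Gemini API. The sketch below is only an illustration of the general idea, not Google’s actual try-on system: it shows how a selfie, a product photo, and a size hint might be combined in a single image-generation request using the public google-genai Python SDK. The prompt wording, file names, and exact model identifier are assumptions.

```python
# Illustrative sketch only: prompting Gemini 2.5 Flash Image ("Nano Banana")
# via the google-genai SDK to compose a try-on-style preview. This is NOT
# Google Shopping's try-on pipeline; prompt text, file names, and the size
# handling are assumptions for demonstration.
from io import BytesIO

from google import genai
from PIL import Image

client = genai.Client()  # reads the Gemini API key from the environment

selfie = Image.open("selfie.jpg")         # a face/upper-body selfie
garment = Image.open("denim_jacket.png")  # a product photo of the item

prompt = (
    "Create a realistic full-body image of the person in the first photo "
    "wearing the jacket from the second photo. Assume the person wears "
    "size M and keep proportions and lighting natural."
)

response = client.models.generate_content(
    model="gemini-2.5-flash-image",  # model name may vary by release
    contents=[selfie, garment, prompt],
)

# Save any returned image parts as candidate try-on previews.
for i, part in enumerate(response.candidates[0].content.parts):
    if part.inline_data is not None:
        Image.open(BytesIO(part.inline_data.data)).save(f"tryon_preview_{i}.png")
```

In practice, Google’s production feature almost certainly layers fit modeling, safety filtering, and Shopping Graph catalog integration on top of raw image generation, but the single-call pattern above captures the basic concept: conditioning a generated full-body image on a selfie plus user-selected sizing.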

Virtual Try-On Still Supports Full-Body Photos and Diverse Models

Despite the selfie-first approach, Google hasn’t removed existing options from its AI try-on feature. Users who prefer more control can still upload a full-body image if they feel it better represents their appearance. Alternatively, shoppers can choose from a range of pre-generated models featuring diverse body types and proportions. This flexibility is important as virtual try-on tools continue to raise questions around representation and inclusivity. Google appears to be positioning its platform as adaptable rather than prescriptive. By offering multiple paths to visualization, the company reduces friction for first-time users while still catering to power shoppers. The result is a more inclusive shopping experience that doesn’t force one “ideal” digital body standard. That balance may prove critical as AI fashion tools become more mainstream.

Where the Google AI Try-On Feature Is Available

The updated Google AI try-on feature is launching first in the United States, continuing the company’s phased rollout strategy. Users can access it through Search, Google Shopping, and Google Images by tapping on supported apparel listings. Once a product qualifies, shoppers simply select the familiar “try it on” icon to activate the AI experience. The feature pulls inventory from Google’s massive Shopping Graph, which aggregates product data from thousands of merchants. This means the try-on tool isn’t limited to a single retailer or brand ecosystem. For ecommerce sellers, it represents another incentive to ensure their product listings are optimized and eligible. For consumers, it creates a consistent try-on experience across platforms they already use daily.

Google’s Bigger Bet on AI-Powered Shopping Experiences

This selfie-based update isn’t happening in isolation. Google has been steadily investing in virtual try-on and AI shopping tools as competition intensifies across ecommerce and social platforms. The company already operates a dedicated app called Doppl, which focuses entirely on visualizing outfits using AI. Doppl allows users to mix and match clothing digitally, helping them imagine complete looks rather than single items. With the latest updates, Google is clearly testing how far AI can guide discovery, inspiration, and purchasing decisions. These tools also generate valuable data about consumer preferences, fit concerns, and browsing behavior. From Google’s perspective, AI try-on isn’t just a convenience feature—it’s a strategic layer in the future of online retail.

Doppl’s Shoppable Feed Expands Google AI Try-On Vision

Earlier this week, Google expanded Doppl with a shoppable discovery feed designed to surface outfit recommendations. Nearly every item shown in the feed links directly to merchants, turning inspiration into instant commerce. The feed features AI-generated videos of real products styled into complete outfits. It also adapts suggestions based on a user’s personal style, browsing habits, and interaction history. While some users remain skeptical of algorithm-driven fashion feeds, the format mirrors what already works on platforms like TikTok and Instagram. Google appears to be betting that familiarity will outweigh hesitation. By blending entertainment-style discovery with practical shopping tools, Doppl complements the broader Google AI try-on ecosystem.

Why Google Thinks AI Try-On Can Reduce Returns

One of the biggest challenges in online apparel shopping is product returns, often driven by fit and expectation mismatches. Google AI try-on aims to address this by setting more realistic visual expectations before purchase. While an AI-generated image can’t fully replace physical try-ons, it provides more context than static product photos. Seeing how a garment might sit on a body similar to yours can influence size selection and confidence. For retailers, fewer returns translate into lower logistics costs and higher margins. For shoppers, it reduces the frustration of waiting, returning, and reordering. Google’s investment suggests it sees virtual try-on as both a user experience upgrade and a practical ecommerce solution.

Privacy and Trust in Google AI Try-On Technology

As with any AI feature that uses personal images, privacy remains a key concern. Google has emphasized that users control what images they upload and can switch between selfie-based avatars, full-body photos, or generic models. The company hasn’t positioned the feature as biometric identification, but rather as a temporary visualization tool. Still, trust will play a major role in adoption, especially as AI-generated likenesses become more realistic. Google’s reputation and existing account infrastructure may give it an advantage over smaller startups. Clear communication around data usage and image handling will be essential. In 2025, transparency is no longer optional for consumer-facing AI.

How Google AI Try-On Fits Into Mobile-First Shopping Trends

The move to selfie-based try-on reflects broader mobile shopping behavior. Most users browse products casually, often in short sessions, without the patience to stage a perfect photo. A quick selfie fits naturally into that flow. Google’s update prioritizes speed, convenience, and visual feedback—three factors that drive mobile conversions. It also aligns with Google Discover-style content, where visual storytelling and instant engagement matter more than detailed specifications. As shopping increasingly blends with content discovery, tools like AI try-on act as bridges between browsing and buying. Google appears to be optimizing for that exact moment of impulse and curiosity.

What This Update Signals About the Future of AI Fashion Tools

Google AI try-on working with just a selfie is a clear signal of where AI-powered shopping is heading. The goal isn’t hyper-realistic digital twins, but practical tools that reduce friction and uncertainty. By integrating AI try-on directly into Search and Shopping, Google removes the need for separate apps or complex workflows. Combined with Doppl’s discovery feed, the company is building an end-to-end visual commerce ecosystem. Whether users embrace AI-generated fashion previews at scale remains to be seen. However, Google’s steady expansion suggests confidence that virtual try-on will become a standard expectation. In the evolving landscape of online retail, convenience may matter just as much as accuracy.
