Generative AI is changing the way people shop. According to Capgemini Research Institute’s annual consumer trends report, published this year, 71% of consumers want it embedded in their purchasing experience.
The report states that the push is led by Gen Z and Millennials, whose demand for hyper-personalised, digital-first journeys is reshaping the retail landscape. Nearly half of consumers are excited about AI’s role in shopping, and more than half now prefer Gen AI tools over traditional search engines for product discovery.
The gap between rising expectations and current execution is still wide, but the direction is clear. AI is no longer an add-on; it is being woven into the shopping experience.
Sanchit Vir Gogia, founder and CEO of Greyhound Research, told AIM, “E-commerce players who avoid AI altogether risk falling behind on critical experience metrics—search precision, recommendation accuracy, and fulfilment agility.”
He added, “Many mid-sized and regional players are exploring embedded AI features offered by cloud platforms, CMS vendors, or specialised SaaS providers.”
Gogia explained that instead of viewing AI as a singular tool, it should be integrated strategically across marketing, operations, and customer support. Leading companies frequently achieve an early competitive edge by closely integrating AI models, particularly when leveraging proprietary data to enhance customer experience or automate supply chain processes.
Last month, OpenAI added an experimental shopping feature to ChatGPT. The chatbot can now help users browse, compare, and buy products, all within the interface. According to OpenAI, these results are not ads but independently selected listings, complete with visuals, reviews, and price tags. It’s an early hint at how shopping may one day be indistinguishable from chatting.
It was also reported that OpenAI plans to integrate with Shopify, an e-commerce platform, to add a shopping experience to ChatGPT.
Pinterest’s latest AI-powered visual search tools let users shop using images instead of words. Beginning with women’s fashion, users can refine results based on style or occasion. “We’re curating a personalised journey of discovery,” said Dana Cho, VP of design at Pinterest, noting a shift from traditional search to visual-first experiences.
Indian e-commerce platform Meesho is in the same boat. With Azure OpenAI and GitHub Copilot in its tech stack, the company uses generative AI to transform discovery, logistics, and support. “There are many teams within Meesho who are successfully deploying LLMs in their applications,” said chief data scientist Debdoot Mukherjee.
According to Mukherjee, AI helps understand complex Indic text, recommend prices, and help sellers improve their strategies. In his words, “It’s fair to say that AI is riding the flight of the company in a very effective manner.”
Meesho is not alone. Flipkart recently launched Immerse, an AI-powered feature combining text and image search.
Perplexity introduced “Buy with Pro,” an AI-powered shopping experience, for its US pro users, enabling them to research and buy products directly on the platform. It also launched “Snap to Shop,” a visual search tool where users can find products by uploading a photo.
Last but not least, Amazon has introduced Rufus, a shopping assistant powered by generative AI. Rufus aims to simplify shopping on Amazon with intelligent assistance, personalised recommendations, and seamless product discovery.

Despite these leaps, Google seems better placed thanks to its ecosystem. It has the search engine, Android as the dominant mobile operating system, a massive user base, and, increasingly, the AI integration to match its competitors’ offerings.
While newer entrants are building ecosystems from scratch or plugging into chatbots, Google’s ecosystem stretches from Search to Gmail, Maps, YouTube, and Android.
At Google I/O 2025, the company introduced an “AI Mode” for shopping, powered by Gemini and Google’s vast Shopping Graph, which the company says spans 50 billion product listings.
The feature aims to act as both stylist and assistant. The system dynamically curates images and product data sourced from various brands to help users browse, refine, and purchase with minimal friction.
For instance, a user can ask AI for a stylish travel bag, and it won’t just serve up generic suggestions. It will visually present tailored results, understand the intent, and refine options based on factors like seasonality and use case. A new right-hand panel updates in real time, letting users filter as they go.
And when it’s time to buy, a new “agentic checkout” promises to streamline the final step. By tapping “track price,” shoppers can set preferences on size, colour, and budget. When the deal hits the mark, Google steps in to complete the purchase directly, adding it to the cart, checking out securely, and using Google Pay behind the scenes.
Google’s new try-on feature even includes a virtual dressing room. By uploading a full-length photo, shoppers can see how a shirt or dress looks on them.
Overall, Google isn’t just surfacing products. It’s reshaping intent, exploration, and checkout into one continuous flow. If the future of shopping lies in knowing what shoppers want before they do, Google may be further down the aisle than it seems.
Where This Is Going
As AI becomes more multimodal and understands text, images, and context, shopping is likely to evolve into something bigger. Rather than navigating a store or website, users might describe a feeling, show an image, or speak an idea, and AI will pull together options, availability, pricing, and recommendations.
Google already offers users a front-row seat to this future. It should be interesting to see how the company plans to transform the shopping experience across its ecosystem.