No part of Amazon is 'unaffected' by AI, says its head of AGI

“There's scarcely a part of the company that is unaffected by AI,” said Vishal Sharma, Amazon’s VP of Artificial General Intelligence, on Monday at Mobile World Congress in Barcelona. He dismissed the idea that open source models might reduce compute needs and deflected when asked whether European companies would change their generative AI strategies in light of geopolitical tensions with the U.S.

Sharma said onstage at the startup conference that Amazon was now deploying AI through its own foundational models across Amazon Web Services — Amazon’s cloud computing division — the robotics in its warehouses, and the Alexa consumer product, among other applications.

“We have something like three-quarters of a million robots now, and they are doing everything from picking things to running themselves within the warehouse. The Alexa product is probably the most widely deployed home AI product in existence … There's no part of Amazon that's untouched by generative AI.”

In December, AWS announced Nova, a family of multimodal generative AI models that includes a suite of four text-generating models.

Sharma said these models are tested against public benchmarks: “It became pretty clear there's a huge diversity of use cases. There's not a one-size-fits-all. There are some places where you need video generation … and other places, like Alexa, where you ask it to do specific things, and the response needs to be very, very quick, and it needs to be highly predictable. You can't hallucinate ‘unlock the back door’.”

However, he said smaller open source models were unlikely to reduce compute needs: “As you begin to implement it in different scenarios, you just need more and more and more intelligence.”

Amazon has also launched “Bedrock,” a service within AWS aimed at companies and startups that want to mix and match various foundational models — including China's DeepSeek. It enables users to switch between models seamlessly, he said.

Amazon is also building a huge AI compute cluster on its Trainium 2 chips in partnership with Anthropic, in which it has invested $8 billion. Meanwhile, Elon Musk’s xAI recently released its latest flagship AI model, Grok 3, using an enormous data center in Memphis that contains around 200,000 GPUs.

Asked about this level of compute resources, Sharma said: “My personal opinion is that compute will be a part of the conversation for a very long time to come.”

Mike Butcher, TechCrunch, and Vishal Sharma, Amazon. Image Credits: Mobile World Congress

He did not think Amazon was under pressure from the blizzard of open source models that has recently emerged from China: “I wouldn't describe it like that,” he said. On the contrary, Amazon is comfortable deploying DeepSeek and other models on AWS, he suggested. “We're a company that believes in choice … We are open to adopting whatever trends and technologies are good from a customer perspective,” Sharma said.