Vishal Sharma, Amazon's VP of Artificial General Intelligence, said at Mobile World Congress in Barcelona, "There are virtually no business units at Amazon that are not impacted by AI." At the same time, he pushed back on the claim that open-source models can reduce computing requirements.

Speaking at the startup conference, Sharma said Amazon is relying on its self-developed foundation models to advance AI deployments across a number of areas, including Amazon Web Services (Amazon's cloud computing arm), warehouse automation robots, and the Alexa intelligent assistant, among others.
"We now have about 750,000 robots, responsible for tasks ranging from picking items to running autonomously.Alexa is probably the world's most ubiquitous home AI product ...... It's safe to say that generative AI has permeated every area of Amazon's business."
In addition, Amazon is working with Anthropic, the AI startup it has invested $8 billion in (note: currently about 58.37 billion yuan), to build large AI computing clusters based on its Trainium 2 chips. Meanwhile, Elon Musk's xAI has also launched Grok 3, trained at a Memphis mega data center housing approximately 200,000 GPUs.
When asked whether the proliferation of AI computing resources will be a long-term trend, Sharma said, "Computing power will remain a central topic of discussion going forward."
Responding to a question about whether the recent influx of open-source AI models from China poses a challenge to Amazon, he said, "I wouldn't characterize it that way." Instead, he pointed to Amazon's willingness to run DeepSeek and other models on AWS. "We push for open choice... We actively embrace new technology whenever it benefits our customers."
Asked whether Amazon was caught off guard by the release of ChatGPT in late 2022, Sharma said, according to TechCrunch, "I disagree with that. Amazon has been deep in the AI space for 25 years. Alexa, for example, runs at least 20 AI models... Our language models have long been backed by billions of parameters. It's not an emerging field; we've been investing consistently."