Amazon’s cloud computing wing, AWS, is reportedly considering AMD chips for its AI needs.
Though Amazon has not made any official statement yet, an AWS executive told Reuters that AMD’s MI300 could be better suited to its needs. Amazon has already declined to use Nvidia’s DGX Cloud platform and may consider alternatives for its cloud computing needs.
Cashing In on the Rise of AI
The rise of generative AI tools like ChatGPT has pushed tech companies to build chatbots of a similar kind. This technology requires specialised hardware, since the AI concerned must be trained on large data sets as part of the process.
In this pursuit, Amazon Web Services (AWS), the world’s largest cloud computing company, is said to be considering AMD chips for its future AI needs. Though it mainly deals in storage services, AWS is also courting clients who want to run their AI services on its servers.
Companies with small AI tools that can’t set up a dedicated server can rely on AWS, which offers flexible storage plans through its extensive data centres. The servers within these centres run chips from Nvidia and AMD, which power the needs of AWS clients.
While Amazon is yet to make an official statement, an AWS executive told Reuters that the company is considering using the new AI chips from AMD for its future needs. The decision has not yet been finalised, but it looks all the more likely given that Amazon has already turned Nvidia down.
Though Nvidia makes better AI chips for running tools like ChatGPT, the company has reportedly been asking cloud providers to take up its DGX Cloud, a web-based platform for running AI tools remotely on Nvidia’s hardware.
While Oracle accepted this offer, AWS declined it, since the company prefers building its servers from the ground up. This also aligns with a statement AMD CEO Lisa Su made in an interview with Reuters, where she outlined an approach of offering customisable services to win major cloud computing customers.