Meta, OpenAI, and Microsoft said at an AMD investor event Wednesday that they will use AMD's newest AI chip, the Instinct MI300X. It is a major sign that tech companies are searching for alternatives to the expensive Nvidia graphics processors that have been essential for creating and deploying artificial intelligence programs such as OpenAI's ChatGPT.

If AMD's latest high-end chip proves capable and cost-effective when it starts shipping early next year, it could lower the cost of developing AI models and put competitive pressure on Nvidia's fast-growing AI chip sales.

AMD CEO Lisa Su acknowledged that "it takes work to adopt AMD," and told reporters the MI300X will have to be cheaper to buy and operate than Nvidia's chips to persuade customers to switch. She also highlighted the chip's standout feature: 192GB of cutting-edge, high-performance HBM3 memory, which transfers data faster and has room for larger AI models. Su said the MI300X can deliver a better user experience by returning quicker responses to queries.
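To illustrate why the 192GB figure matters for holding larger models on a single accelerator, here is a rough back-of-envelope sketch of the memory arithmetic. The figures are illustrative assumptions (16-bit weights, weights only, no activation or overhead memory), not AMD's own numbers:

```python
def model_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory (in GB) needed just to hold a model's weights.

    bytes_per_param=2 assumes 16-bit precision (fp16/bf16).
    Ignores activations, KV caches, and framework overhead.
    """
    return num_params * bytes_per_param / 1e9

# A hypothetical 70-billion-parameter model at 16-bit precision:
print(model_memory_gb(70e9))  # 140.0 GB of weights
```

Under these assumptions, 140GB of weights fits within the MI300X's 192GB, while an accelerator with 80GB would need the same weights split across at least two devices.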
On Wednesday, AMD said it has already signed up some of the companies with the biggest appetite for GPUs. Meta and Microsoft were the two largest purchasers of Nvidia H100 GPUs in 2023, according to a report from research firm Omdia.

Meta said it will use MI300X GPUs for AI inference workloads such as processing AI stickers, image editing, and operating its assistant.

Microsoft's CTO, Kevin Scott, said the company would offer access to MI300X chips through its Azure web service.

Oracle's cloud will also use the chips.

OpenAI said it would support AMD GPUs in one of its software products, called Triton, which isn't a big large language model like GPT but is used in AI research to access chip features.

AMD isn't forecasting huge sales for the chip yet, projecting only about $2 billion in total data center GPU revenue in 2024. Nvidia reported more than $14 billion in data center sales in the most recent quarter alone, although that figure includes chips other than GPUs.

However, AMD says the total market for AI GPUs could climb to $400 billion over the next four years, double the company's prior projection. That shows how high expectations are and how coveted high-end AI chips have become -- and why the company is now focusing investors' attention on the product line.

Su told reporters that AMD does not believe it needs to beat Nvidia to do well in the market.

"I think it is clear to say that Nvidia has to be the vast majority of that at present," Su told reporters, referring to the AI chip market. "We believe it could be $400 billion-plus in 2027. And we could get a nice portion of that."