Lanon Wee

Microsoft Executive Says Supply of Nvidia's AI Chips Is Improving

Kevin Scott, Microsoft's chief technology officer, said the supply of Nvidia's GPUs is improving. ChatGPT runs in the cloud on Microsoft Azure, so better GPU availability could ease delays for the service. Scott said Microsoft has invested in its own silicon over the past several years, but Nvidia's chips have remained the best available option.

Scott, speaking on Wednesday, said that getting access to Nvidia's AI chips has become easier than it was a few months ago. At the Code Conference in Dana Point, California, he said the market for Nvidia's graphics processing units (GPUs), which tightened after OpenAI launched its ChatGPT chatbot late last year, has been loosening. "The demand for GPU capacity was far greater than what the whole ecosystem could produce," Scott told The Verge's Nilay Patel. "But that is gradually improving, and we're getting better news every week rather than not, so it's great."

Microsoft, like other tech companies, has been racing to build generative AI into its products and services. Nvidia's GPUs have been the primary way to train and deploy AI models, and the surge in demand has led to a supply shortage. In May, Scott said he was responsible for monitoring Microsoft's GPU budget, a task he described as "terrible" and "miserable." On Wednesday at the Code Conference, he said the situation is relatively better than it was a few months ago.

Nvidia has forecast that its revenue will grow 170% from the previous year, and its gross margin has risen from 44% to 70% over the past year. Its stock price has surged 190% in 2023, making it a top performer in the S&P 500. The company also expects to increase its supply each quarter over the coming year. Even so, Similarweb reported that traffic to ChatGPT has declined over the past three months.
Microsoft provides Azure cloud-computing services to OpenAI and plans to begin selling access to Microsoft 365 Copilot to large organizations in November. Asked about reports that Microsoft is developing a custom AI chip, Scott pointed to the company's internal silicon work and said it will choose the best available options for building its systems. He declined to confirm any of the speculation, but added, "the best choice during the last few years has been Nvidia."


