Nvidia’s AI Chip Business Booms, Predicts Continued Growth

Nvidia has seen a massive increase in revenue, driven by its data center business, which grew by more than 400% over the past year. The company’s overall revenue more than tripled year over year in its first fiscal quarter. Nvidia is now assuring investors that the businesses spending heavily on its AI chips will also see significant profits.

Nvidia’s Chief Financial Officer, Colette Kress, told investors that cloud service providers are seeing immediate and substantial returns on their investments. This is a crucial point for the company, because there is a limit to how much customers can spend on infrastructure without eventually seeing a profit. Nvidia’s message is clear: its chips deliver a strong, sustainable return on investment, which suggests the AI industry is set for long-term growth.

The major buyers of Nvidia’s graphics processing units (GPUs) include big cloud service providers like Amazon Web Services, Microsoft Azure, Google Cloud, and Oracle Cloud. These providers accounted for around 45% of Nvidia’s $22.56 billion in data center sales for the April quarter. Additionally, new specialized GPU data center startups are emerging, purchasing Nvidia’s GPUs, setting them up in server racks, and renting them out to customers on an hourly basis. For example, CoreWeave charges $4.25 per hour to rent an Nvidia H100, which is vital for training large language models like OpenAI’s GPT.
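To put that hourly rate in rough perspective, here is a back-of-envelope sketch in Python. The utilization figure is an assumption added for illustration; it does not come from the article.

```python
# Back-of-envelope: annual rental revenue from a single Nvidia H100
# at CoreWeave's quoted rate of $4.25/hour. The utilization rate below
# is an illustrative assumption, not a figure from the article.

HOURLY_RATE_USD = 4.25        # CoreWeave's quoted H100 rental price
HOURS_PER_YEAR = 24 * 365     # 8,760 hours in a year
ASSUMED_UTILIZATION = 0.70    # hypothetical share of hours actually rented out

annual_revenue = HOURLY_RATE_USD * HOURS_PER_YEAR * ASSUMED_UTILIZATION
print(f"Implied annual rental revenue per H100: ${annual_revenue:,.0f}")
# -> Implied annual rental revenue per H100: $26,061
```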

Following Nvidia’s impressive earnings report, Kress highlighted that cloud providers see a return of $5 for every $1 spent on Nvidia hardware over four years. She said newer Nvidia products, like the HGX H200, offer even better returns, with every $1 spent potentially generating $7 in revenue over the same period. That system combines eight GPUs and can be used to serve Meta’s Llama AI model, which adds to its value for cloud providers.
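Those ratios are simple multiples on hardware spend. A minimal sketch of the arithmetic follows; the $10 million outlay is a placeholder chosen for illustration, not a figure from the article.

```python
# Illustration of the returns Kress described: roughly $5 of cloud-provider
# revenue per $1 spent on Nvidia hardware over four years, rising to about
# $7 per $1 for the newer HGX H200. The spend figure is a placeholder.

def implied_revenue(hardware_spend_usd: float, return_multiple: float) -> float:
    """Revenue implied over four years for a given spend and return multiple."""
    return hardware_spend_usd * return_multiple

spend = 10_000_000  # hypothetical hardware outlay
for product, multiple in [("Nvidia hardware (general)", 5.0), ("HGX H200", 7.0)]:
    revenue = implied_revenue(spend, multiple)
    print(f"{product}: ${spend:,.0f} spent -> ${revenue:,.0f} over four years")
```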

Nvidia’s CEO, Jensen Huang, noted that companies like OpenAI, Google, and numerous AI startups are eagerly awaiting more GPUs from cloud providers. He emphasized the high demand for Nvidia’s systems, with customers pressuring the company to deliver as quickly as possible. Meta, for instance, plans to invest billions in 350,000 Nvidia chips for its AI projects, despite not being a cloud provider. Meta is expected to use this investment to boost its advertising business or integrate AI features into its apps.

Huang described Meta’s large server cluster as essential infrastructure for AI production, referring to it as an “AI factory.” Nvidia also surprised analysts by announcing an aggressive timeline for its next-generation GPU, called Blackwell, which will be available in data centers by the fiscal fourth quarter. This announcement eased concerns about a potential slowdown as companies wait for the latest technology.

The first customers for these new chips will include Amazon, Google, Meta, Microsoft, OpenAI, Oracle, Tesla, and Elon Musk’s xAI. Following these announcements, Nvidia’s shares jumped 6% in extended trading, surpassing $1,000 for the first time. Nvidia also announced a 10-for-1 stock split after a 25-fold increase in its share price over the past five years.

This summary is based on CNBC’s article by Kif Leswing, published on May 22, 2024.

Hi, I'm Voss Xolani, and I'm passionate about all things AI. With many years of experience in the tech industry, I specialize in explaining the functionality and benefits of AI-powered software for both businesses and individual users. My content explores the latest AI tools, offering practical insights on how they can streamline workflows, boost productivity, and drive innovation. I also review new software solutions to help readers understand their features and applications. Beyond that, I stay up-to-date with AI trends and experiment with emerging technologies to provide the most relevant information.