Nvidia plans to triple production of its H100 processors, which are in high demand for their role in driving the generative AI revolution and in training the large language models behind services such as ChatGPT.
Nvidia's sales continue to soar as demand among tech companies for its highest-end AI chip, the H100, remains extremely high. That demand contributed to 171% year-over-year sales growth and gross-margin expansion to 71.2%, and has driven the company's stock up more than 200% this year.
Nvidia expects to see $16 billion in revenue this quarter, driven by strong demand for its data-center chips used in artificial intelligence applications.
Nvidia, the AI chipmaker, achieved record second-quarter revenue of $13.51 billion, prompting some analysts to predict it will become the "most important company to civilization" in the next decade as reliance on its chips grows.
Nvidia has reported explosive sales growth for AI GPU chips, which has significant implications for Advanced Micro Devices as it prepares to release a competing chip in Q4. Analysts believe that AMD's growth targets for AI GPU chips are too low and that it has the potential to capture a meaningful share of the market from Nvidia.
Nvidia's CEO, Jensen Huang, predicts that upgrading data centers for AI, including the cost of expensive GPUs, will amount to $1 trillion over the next four years, with cloud providers such as Amazon, Google, Microsoft, and Meta expected to shoulder a significant portion of that bill.
Nvidia's impressive earnings growth, driven by high demand for its GPU chips in AI workloads, raises the question of whether the company will face a post-boom slump like Zoom's. With data-center demand still growing and the company focused on accelerated computing and generative AI, however, Nvidia could sustain its growth over the long term.
Nvidia predicts a $600 billion AI market opportunity driven by accelerated computing: $300 billion in chips and systems, $150 billion in generative AI software, and $150 billion in Omniverse enterprise software.
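The segment figures above can be checked with a quick back-of-envelope sum (the segment names here are informal labels, not Nvidia's own reporting categories):

```python
# Nvidia's projected AI market opportunity, in billions of dollars,
# using the three segments cited above.
segments = {
    "chips_and_systems": 300,
    "generative_ai_software": 150,
    "omniverse_enterprise_software": 150,
}

total = sum(segments.values())
print(total)  # 600 -> matches the stated $600 billion opportunity
```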
Nvidia's success in the AI industry can be attributed to its graphics processing units (GPUs), which have become crucial tools for AI development because they can perform parallel processing and complex mathematical operations at a rapid pace. However, the long-term market for AI remains uncertain, and Nvidia's dominance may not be guaranteed indefinitely.
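The parallelism described above comes from the fact that neural-network workloads reduce largely to big matrix multiplications, where every output element is an independent dot product. As a rough illustration (not Nvidia's stack), NumPy's vectorized matmul stands in here for the thousands of parallel lanes a GPU provides; the sizes are arbitrary example values:

```python
import numpy as np

# A single layer of a neural network: multiply a batch of inputs by a
# weight matrix. Each of the batch * d_out output values is an independent
# dot product, so the work parallelizes naturally across GPU cores.
batch, d_in, d_out = 32, 512, 256          # arbitrary example sizes
x = np.random.rand(batch, d_in).astype(np.float32)   # input activations
w = np.random.rand(d_in, d_out).astype(np.float32)   # layer weights

# One vectorized op replaces batch * d_out explicit dot-product loops.
y = x @ w
print(y.shape)  # (32, 256)
```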
Nvidia's head of enterprise computing, Manuvir Das, believes the artificial intelligence (AI) market presents a $600 billion opportunity for the company. With demand for AI technology continuing to fuel its growth, some argue that analysts are overlooking the stock's undervaluation and its potential for exceptional growth in the years to come.
Large language models like Llama 2 and GPT-J perform well on data-center-class computers, with the best able to summarize more than 100 articles in a second, according to the latest MLPerf benchmark results. Nvidia continues to dominate in performance, though Intel's Habana Gaudi2 and Qualcomm's Cloud AI 100 chips also showed strong results in power-consumption benchmarks. Nvidia's Grace Hopper superchip, which pairs a Grace CPU with an H100 GPU, outperformed other systems in various categories, with its faster memory access and additional memory capacity contributing to its advantage. Nvidia also announced a software library, TensorRT-LLM, which doubles the H100's performance on GPT-J. Intel's Habana Gaudi2 accelerator is closing in on Nvidia's H100, while Intel's CPUs showed lower performance but could still deliver summaries at a decent speed. Only Qualcomm and Nvidia chips were measured for data-center efficiency, and both performed well in that category.
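The two throughput claims above compose multiplicatively: if a system already summarizes on the order of 100 articles per second and a software library doubles per-GPU throughput (as claimed for TensorRT-LLM on GPT-J), the same hardware serves twice the requests. A minimal sketch of that arithmetic, using the round figures from the benchmark summary:

```python
# Back-of-envelope for the claims above (round numbers, not exact
# MLPerf submissions).
baseline_articles_per_sec = 100.0   # "more than 100 articles in a second"
software_speedup = 2.0              # TensorRT-LLM's claimed doubling on GPT-J

effective_throughput = baseline_articles_per_sec * software_speedup
print(effective_throughput)  # 200.0 articles per second on the same GPUs
```

The point of the sketch is that software optimizations stack on top of hardware gains, which is why a library release can matter as much as a new chip generation.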