
Nvidia CEO Jensen Huang predicted that someday we’ll have a billion cars on the road and they will all be robotic cars.
It sounds like science fiction, but as Huang has said before, “I am science fiction.” He made the comments in a conference call with analysts about Nvidia’s fiscal fourth-quarter earnings for the period ending January 26, 2025. (Here’s our full report on the earnings). Nvidia’s stock is currently down half a percent to $130.72 a share in after-hours trading.
Colette Kress, EVP and CFO, said in the conference call that the data center business was up 93% from a year ago and 16% sequentially as the Blackwell ramp commenced and the Hopper chip sales also grew. Blackwell sales exceeded Nvidia’s expectations, she said.
“This is the fastest product ramp in our company’s history, unprecedented in its speed and scale,” said Kress. “Blackwell production is in full gear across multiple configurations, and we are increasing supply and expanding customer adoption. Our Q4 data center compute revenue jumped 18% sequentially and over 2x year on year. Customers are racing to scale infrastructure to train the next generation of cutting-edge models and unlock the next level of AI capabilities.”
With Blackwell, it will be common for these clusters to start with 100,000 graphics processing units (GPUs) or more, Kress said. Shipments have already started for multiple infrastructures of this size. Post-training and model customization are fueling demand for Nvidia infrastructure and software as developers and enterprises leverage techniques such as fine-tuning, reinforcement learning and distillation to tailor models. Hugging Face alone hosts over 90,000 derivatives created from the Llama foundation model.
The scale of post-training and model customization is massive and can collectively demand orders of magnitude more compute than pre-training, Kress said. And inference demand is accelerating, driven by test-time scaling and new reasoning models like OpenAI o3, DeepSeek and more. Kress said she expected China sales to be up sequentially, and Huang said China is expected to remain at the same percentage of revenue as in Q4. That is about half of what it was before export controls were introduced by the Biden administration.
Nvidia has driven a 200-times reduction in inference costs in just the last two years, Kress said. She also said that as AI expands beyond the digital world, Nvidia infrastructure and software platforms are increasingly being adopted to power robotics and physical AI development. On top of that, Nvidia’s automotive vertical revenue is expected to grow as well.

Regarding CES, she noted the Nvidia Cosmos World Foundation Model platform was unveiled there, and robotics and automotive companies, including Uber, have been among the first to adopt it.
From a geographic perspective, growth of data center revenue was strongest in the U.S., driven by the initial Blackwell ramp-up. Countries across the globe are building their AI ecosystems, and demand for compute infrastructure is surging. France’s 200 billion euro AI investment and the EU’s 200 billion euro investment initiatives offer a glimpse into the build-out set to redefine global AI infrastructure in the coming years.
Kress said that as a percentage of total data center revenue, data center sales in China remained well below levels seen before the onset of export controls. Absent any change in regulations, Nvidia believes that China shipments of data center solutions will remain roughly at the current level.
“We will continue to comply with export controls while serving our customers,” Kress said.
Gaming and AI PCs

Kress noted that gaming revenue of $2.5 billion decreased 22% sequentially and 11% year on year. Full-year revenue of $11.4 billion increased 9% year on year, and demand remained strong throughout the holiday season. But Kress said Q4 shipments were impacted by supply constraints.
“We expect strong sequential growth in Q1 as supply increases. The new GeForce RTX 50 Series desktop and laptop GPUs are here, built for gamers, creators and developers,” Kress said.