For its first 30 years, Nvidia wasn't a household name unless you were a gamer. Now, some of its original fan base feel left behind as artificial intelligence has made the chipmaker the world's most valuable company.
"The gaming segment is no longer the driving force of the company. There was one point when it clearly was," said Stacy Rasgon of Bernstein Research.
Nvidia popularized graphics processing units, or GPUs, which enable the fast frame rates and rendering that make high-end video game play possible.
To release its first GPU in 1999, the GeForce 256, Nvidia laid off the majority of its workers and approached bankruptcy to make it happen. Gamers snapped up the new type of processor, bringing the company back from the brink.
Now, with demand for AI soaring, nearly all of Nvidia's revenue comes from its products that serve that industry, instead of gaming. And as AI chipmaking shrinks the available memory supply, Nvidia has been forced to make tough decisions about priorities.
In a memory-constrained reality, it's not shocking that Nvidia would prioritize its far more profitable data center GPUs such as Hopper and Blackwell.
Nvidia's operating margins in its compute and networking segment averaged 69% over the past three years, compared to a 40% margin for the consumer-forward graphics segment.
"I understand that they're going to chase that. And that breaks my heart," said Greg Miller, co-founder and host of popular video game podcast Kinda Funny Games Daily in an interview with CNBC.
"Dance with the one who brought you. Gamers have brought you this far," Miller added.
If analyst predictions are correct, 2026 will be the first year in three decades that Nvidia doesn't release a new generation of its consumer-facing GeForce line of GPUs.
Gamers are "hugely important" to Nvidia, according to an email the company sent to CNBC, adding that it's "always innovating, testing and releasing" new gaming-focused technologies.
The current RTX 50 series of GeForce GPU was unveiled at CES in January 2025.
But with CES 2026 and GTC in the rearview mirror, some worry this will be the first year without a new generation, although Nvidia does commonly reveal new hardware as late as September.
While it represents a big strategy pivot, some gamers say it's not a bad move for their budgets.
"It's kind of hard to keep up. You can't upgrade every single year, so having a bit of a break and waiting for a generation to really matter I think is actually in service of the gamers out there," said Tim Gettys, Miller's co-founder of Kinda Funny Games.
AI profits take over
Nvidia's current era of AI dominance started two decades ago with the 2006 launch of its CUDA software toolkit. Suddenly, developers could use GPUs for general-purpose computing instead of just graphics.
Then, in 2012, Nvidia's deep learning capabilities were made clear during what many consider the big bang moment for modern AI. Nvidia's GPUs and CUDA were used to build a neural network called AlexNet that blew away the competition during a prominent image recognition contest.
Although Nvidia didn't stop making gaming GPUs, it signaled a new focus on AI in 2020 when it purchased high-performance networking chipmaker Mellanox Technologies for $7 billion.
The company has been releasing new generations of high-end GPUs ever since, along with full rack-scale systems for AI workloads such as the new Vera Rubin platform, which CNBC got an exclusive first look at in February.
Nvidia doesn't reveal prices for its AI chips, but analysts say one Blackwell GPU costs up to $40,000, while the Futurum Group estimates a full Vera Rubin system will cost up to $4 million.
In contrast, Nvidia sells its RTX 50-series gaming GPUs for between $299 and $1,999.
During the cryptocurrency peaks of 2018 and 2021, Nvidia's GPUs sold in online marketplaces for up to three times their listing price because they were then key to mining Bitcoin and Ethereum.
Although prices fell when mining changed course in 2022, Nvidia's current RTX 5090 GPU is still sold online for up to double the retail price.
Plenty of demand for last year's generation may make Nvidia less motivated to put out a new version this year.
'Hard to get the memory'
But the memory shortage is a more likely culprit for Nvidia's pullback in gaming.
Industry reports suggest Nvidia has made plans to reduce production of its latest gaming GPUs by up to 40% as it faces a major shortage of the general-purpose memory that's necessary for making a GPU.
Dynamic Random Access Memory, or DRAM, enables fast, temporary data storage so the GPU can run parallel tasks.
Personal computers, where Nvidia's gaming GPUs end up, have borne the brunt of DRAM shortages. When memory prices go up, manufacturing a GPU costs more, and that cost trickles down to consumers.
Gartner predicts PC prices will rise by 17% this year, causing PC shipments to decline 10.4%.
"With how expensive all of this has gotten, it's concerning to see prices go up on the gaming side with no signs of ever coming back down, and then Nvidia clearly chasing a completely different category of consumer," Gettys said.
If the entry-level consumer PC market disappears by 2028 as Gartner predicts, the market for Nvidia's entry-level gaming GPUs is likely to contract, too.
Instead, Nvidia is likely saving limited memory inventory for its higher-cost, higher-margin AI chips.
"If there is push-outs or delays on the gaming roadmap, it's probably in large part that they probably can't make the cards anyways because it's hard to get the memory," Rasgon said. "Every bit of memory that's out there, I think is really getting prioritized to AI compute."
Higher-performance GPUs like Blackwell and Rubin are lined with dense stacks of a specific type of DRAM known as High Bandwidth Memory, or HBM. Rasgon said it takes about four times as many silicon wafers to make a gigabyte of HBM as it does to make the same amount of more traditional types of DRAM.
"That dynamic is starving the overall industry of the type of memory that is traditionally used for more consumer type applications. It's just not available," Rasgon said.
Nvidia told CNBC that it's continuing to ship all GeForce GPUs as it sees strong demand, and is working closely with suppliers to maximize memory availability.
"If they're making three times the money and the stockholders are three times happier, then yeah, I do think that they will abandon gaming despite it being what got them there," Gettys said.
'Feels like a slap in the face'
CEO Jensen Huang did make a big gaming announcement at the beginning of his keynote address at Nvidia's annual GTC conference in March, but the gaming community was less than enthused.
Huang announced the next generation of the company's Deep Learning Super Sampling rendering software, or DLSS, coming in the fall. The technology is well known for boosting frame rates by rendering games at lower resolutions and using AI to upscale the image, helping games run more smoothly on less powerful hardware.
The controversy around the new DLSS 5 is that gamers worry it uses generative AI to change the look of games. Huang unveiled DLSS 5 with a sizzle reel of photorealistically enhanced versions of characters in popular games such as Resident Evil Requiem, Starfield, and Hogwarts Legacy.
"I play video games because they're an art form. And so I like to see the thumbprint of the creator in what I'm doing," said Miller of Kinda Funny Games. "That raised a lot of hair on a lot of necks in the video game industry as we deal with so many layoffs, so many studio closures."
As it grapples with a post-pandemic slowdown, the gaming industry has seen studio closures, canceled games, and thousands of job cuts across giants like Epic Games, Microsoft's Xbox, and Sony's PlayStation.
Gettys was a fan of previous versions of Nvidia's DLSS for making gaming more accessible on a lower budget.
"The technology is mind-blowing for what it can do to make games run on lower-end PCs," he said. "But then to add this generative AI stuff, it feels like a slap in the face."
Gettys' big fear is that this is a step toward fully AI-generated games, which he thinks is "100% the goal."
Elon Musk has already addressed the potential for it. In an October post on X, Musk said his xAI game studio will release "a great AI-generated game" before the end of 2026.
"You're literally altering the art created by the developers. And then at a certain point you're replacing the developers and then their studio gets closed down," Gettys said.
Nvidia said in a statement to CNBC, "Games are a creative artform that give developers the opportunity to tell engaging stories and immerse players in incredible worlds. Our RTX technologies are tools that enable game developers to achieve their creative vision: these include rendering techniques such as ray tracing and path tracing, and those enhanced by AI, like DLSS Super Resolution, DLSS Frame Generation, and DLSS 5, all working together to provide the best performance and image quality."
During his GTC keynote, Huang said AI is going to "revolutionize how computer graphics is done."
In a question-and-answer session the next day, Huang responded to assertions from the gaming community that DLSS 5 makes games appear homogeneous.
"They're completely wrong," Huang said.
He emphasized that game developers will still be in control, able to "fine-tune the generative AI" to match their style.
'Clear favorite'
For over a decade, Nvidia has also offered gaming in the cloud through a service called GeForce NOW. The model has evolved to include different subscription tiers, including a free option, that let users stream games they own on services like Steam, running on Nvidia GPUs in data centers rather than on personal devices.
"You see Xbox and you see PlayStation, you see other competitors trying to get the cloud into gamers' hands in a way that actually makes sense. And Nvidia GeForce NOW has really cracked that code," Miller said.
Gettys told CNBC that Nvidia's streaming platform is the best "by a landslide."
"It allows millions more people access to gaming at the highest level, even if they don't have the latest cards and all of that. And it's truly incredible technology," he said.
Advanced Micro Devices is Nvidia's top competitor in gaming, with its Radeon line of GPUs.
But the memory crunch remains a challenge for both.
"If Nvidia can't get the memory, AMD ain't going to get the memory," Rasgon said. "Sentiment-wise, both brands have their fans, and they can be die-hard."
"There's a clear favorite," Gettys said. "If you're playing on PC, you're going to want an Nvidia card."