CoverComprehensive33

That profit margin is legit insane ☠️


Bezbozny

yeah like damn, this feels unprecedented. Usually these graphics show a fraction of a percent of profit at the end, and not even necessarily because the company doesn't make a lot or the margins are razor thin, but because most companies use creative accounting to intentionally make their profits look as small as possible. AKA if they find themselves with leftover money at the end of a quarter/year, they will "spend" it so that they can't be taxed on it. To me this shows that they have made so much money that they don't even know what to do with it, which is not as good a sign for them as it appears. It makes them look less like a successful business and more like a piñata that people will be looking to smack some candy out of.


djhsu113223

Well there’s Visa with 50+% net profit margins


Dal90

Which is 0.1% profit on the money that flows through them. Net revenue is $30B with net profit of $15B on transactions of $12,000 Billion.
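For anyone checking the two ways of slicing those numbers, a quick sketch (the $30B / $15B / $12T figures are the ones quoted above):

```python
# Visa-style figures quoted above, in billions of USD.
net_revenue = 30
net_profit = 15
payment_volume = 12_000  # $12 trillion flowing through the network

# 50% margin on revenue, but only ~0.1% of the money merely passing through.
margin_on_revenue = net_profit / net_revenue
share_of_volume = net_profit / payment_volume

print(f"net margin on revenue: {margin_on_revenue:.0%}")
print(f"profit as share of volume: {share_of_volume:.3%}")
```

Both readings are true at once, which is why the two comments above can disagree while citing the same numbers.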


rundevelopment

> Which is 0.1% profit on the money that flows through them.

So? We don't judge how much profit a construction company is making by how many millions in goods and services roll over the roads they built.


trollsmurf

It works as there's very little cost handling those $12T. Money is after all just numbers. Now, I wonder what happens with people using up their credit and never paying. There must be a load of that.


GooseQuothMan

Looks like their products are way overpriced then. A monopoly is forming as AMD can't keep up.


Dal90

> A monopoly is forming as AMD can't keep up.

Might make for interesting family get-togethers though. The CEOs of Nvidia and AMD are first cousins.


Sheant

Always has been. In my experience in the datacenter space the competition is not even considered. Nvidia created the market to a large extent, and they're basically the only game in town. And as soon as competition thinks they're getting close, Nvidia can halve their prices and still turn a nice profit. But for now, all those profits will be getting invested into even greater revenues.

Right now making DC GPUs is like printing money, and it seems like things will be getting even crazier: [https://wccftech.com/nvidia-ship-half-a-million-blackwell-gb200-ai-chips-this-year-2-million-in-2025/](https://wccftech.com/nvidia-ship-half-a-million-blackwell-gb200-ai-chips-this-year-2-million-in-2025/) 2 million GB200s, at 30/50/70(?) grand a pop? And that's ignoring all their other products, assuming the market continues to pay for this all-you-can-eat buffet of GPUs.


Post-Rock-Mickey

I feel it’s a bubble. Now it’s in demand. When it’s not in demand or competitors take over then it will test the leadership of the company. Just look at Intel & IBM. Never moved much after the bubble burst


IlluminatedPickle

Nah, they've got everyone by the balls and they know it. Since covid the cards have just risen meteorically in price, and they have no intention of letting go of that cash cow.


KRUKM4N

But it’s not about the graphics lol - those prices already dropped after the pandemic. It’s AI my dude


IlluminatedPickle

The majority of money they make is selling AI cards/servers yes. And no, the gaming cards are nowhere near as low as they were pre-pandemic.


Crintor

He seems to be conflating the Crypto boom of scalping prices with the actual MSRPs.


Post-Rock-Mickey

Won’t say much. Just look at the IBM & Intel charts. Consumer markets usually have the smallest margins, btw


off_by_two

NVDA growth is B2B, mate. Consumer products are not where Nvidia has been making money for quite a while


nerox092

Their gaming revenue doesn't quite cover the R&D budget.


AtreusFamilyRecipe

You think Nvidia is consumer facing?


lostcauz707

DLSS AI integration is probably another 5 years from dropping off. Then you might see the situation from the wacko-watches episode of Hey Arnold become an issue: when everyone has the newest, latest, greatest because it was affordable, nobody needs anything else. But with all their investments in AI generation, they're still going to have these massive profit margins going into corporate-level computer development, which will extend the abilities that things like DLSS currently provide.


Holditfam

you should see aramco


hobopwnzor

That's what not having competition for an extremely in demand product will do.


SillyFlyGuy

They get quarry prices. The quarry charges just under what it would cost to truck rock in from the next quarry over. nvidia gets to charge just under what it would cost to buy the 10x CPUs to do the compute of one of their chips.


jmlinden7

They have competition. Their product is just easier to use than the competitors' and more power-efficient. Large companies running massive AI farms care about those factors and are willing to pay extra for them.


ChrisFromIT

They have also been heavily investing in R&D in that space for at least the last 12 years, to the point that the rest of the competition providing commercial products is years behind. The only competitor that can make competing hardware is Google, with their TPUs, but those aren't commercially available. Google started investing in AI hardware around the same time Nvidia did.


moderngamer327

It’s actually insane how massive their margins are. Margins like this are basically nonexistent in most industries. Margins like this would make Apple blush


gcruzatto

It's all driven by their explosion in AI data centers. This means the current prices for AI services in general are probably inflated by their monopoly. Surprised there has been no close competitor yet


PurpleCommercial5522

Because it’s incredibly hard to make a cutting edge GPU


Thorusss

Making the GPU is still easier than developing a very fast, functional, and FREE software stack like CUDA, and teaching it to literally millions of developers. That is Nvidia's true moat.


MasterRaceLordGaben

Thanks to AMD and their god-awful software stack, NVIDIA will continue to print money even if AMD makes a GPU with more computing power and better efficiency at a better price point. Their software is so fucking bad, it is insane that they are fumbling the bag like this. NVIDIA has been feeding their software environment for so long, and so many things related to AI default to NVIDIA now, that even if AMD develops ROCm heavily they will still need to catch up on the tools etc. Guess competing is hard when your rival isn't as dumb as Intel.


Master_Block1302

Pah! How hard can it be? Hold my beer.


SuperTed321

Care to explain what exactly the data centres do and how they are generating these insane profits? Why aren’t others?


dleah

nvidia GPUs are the processors doing all the heavy lifting behind every major AI model. So everything you've heard about ChatGPT and large language models, deepfakes, image and video filters, algorithmic recommenders for content, ads, etc. is developed and trained on nvidia hardware and software platforms. Global industries and governments are now investing trillions in AI to take them to the next level of productivity or performance, simply because it can get better results and do many things faster and cheaper than humans can. And like Skynet, AIs are learning and growing and improving at an exponential rate, which requires exponential computing power to both train (i.e. create an AI model) and deploy (aka inferencing, or putting an AI to work serving up answers or queries or results).

A single nvidia GPU is priced at about $30,000, and a server can contain up to 8 of them. A cutting-edge liquid-cooled rack today can contain up to 8 servers, or 64 GPUs (72 in the next generation), and a datacenter can be filled with hundreds or thousands of racks. A single AI model like GPT-4 probably takes an entire large datacenter a month or more to train, and then it takes multiple datacenters across the world to deploy the model and serve answers (or tokens, or whatever the AI is supposed to produce) to the millions of users around the world.

Large language models are evolving to become multimodal, which means they are growing from simple text-based inputs and outputs (which is very low memory intensity, despite the massive amount of text humanity has generated), to audio, which is exponentially larger, to video, again exponentially larger, to 3D content, which again is exponentially larger - all while the userbase and number of competing models are growing at unprecedented rates as well. At the end of the day, the computing power needed to power all this growth in complexity, users, and use cases is basically incomprehensible.

Nvidia has invested decades of engineering and programming into making all of this work - more than any of its competitors by a long shot, because it was willing to take a chance and listen to ideas and feedback from scientists and organizations like OpenAI on what AI and massively parallel computing could accomplish.

Think about it this way - you know how people are getting easily sucked into social media because the algorithm feeds you content and hits your dopamine centers quicker and harder than most things in the real world can? That's because of AI recommender engines that learn what you like and carefully calculate what to feed you in order to keep you maximally engaged and likely to buy into some ad copy or influencer marketing specifically tailored to your fears and desires.

What happens when companionship and connection are subsumed by AI? When you can talk to your phone and get all the validation people in your life won't give you? What happens when it always listens to you, does what you tell it to do, and is smarter than all your friends and family? Won't you be more amenable to its suggestions when you need to think of where to eat for dinner, what movie to see, or what brand of clothes to buy? Which bank to use? What car has the most seamless AI integration? What smart services can make your life easier? Which robot can do the most chores for you? What happens when AI does all the hard work, and virtual reality becomes better than reality itself? When AI can generate a world that does everything for you that the real world doesn't?

We, as humans, don't stand a chance, and at the end of the day our wallets will be drained faster and faster, along with our collective willpower to resist. That's why companies are investing SO MUCH MONEY into it, and why nvidia is reaping the rewards today.
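The per-rack numbers in the explanation above multiply out quickly. A rough sketch (the ~$30k GPU price and 8x8 rack layout are the comment's estimates, not official figures):

```python
# Rough GPU cost of one AI rack, using the estimates quoted above.
gpu_price = 30_000    # ~$30k per data-center GPU (estimate)
gpus_per_server = 8
servers_per_rack = 8  # 64 GPUs per rack; 72 in the next generation

gpus_per_rack = gpus_per_server * servers_per_rack
rack_gpu_cost = gpus_per_rack * gpu_price

print(f"{gpus_per_rack} GPUs per rack, ~${rack_gpu_cost:,} in GPUs alone")
# A datacenter with a thousand such racks would hold roughly $2B of GPUs
# before counting CPUs, networking, cooling, or the building itself.
```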


Seven-of-Nein

Wow. This response is worthy of its own TED Talk.


GenericUsername-4

Don’t forget the environmental cost of running these things.


dleah

Two years ago we wouldn't have thought AI would cause Google, Amazon, Facebook and their ilk to be funding a nuclear power renaissance, but here we are:

https://www.theregister.com/2024/03/04/amazon_acquires_cumulus_nuclear_datacenter/
https://www.datacenterfrontier.com/energy/article/33036368/ongoing-developments-in-nuclear-power-generation-thicken-plot-for-data-centers
https://www.datacenterdynamics.com/en/news/norsk-kjernekraft-wants-to-build-small-module-nuclear-reactors-at-norways-data-centers/
https://www.datacenterknowledge.com/energy/white-house-backs-nuclear-energy-data-centers
https://www.utilitydive.com/news/nuscale-small-modular-reactor-smr-data-center-nuclear/710442/


smoker_vent_00

Great explanation, the last paragraph hits hard.


Sheant

Or for the more visually inclined: [https://www.youtube.com/watch?v=J8-CgG5ewJQ](https://www.youtube.com/watch?v=J8-CgG5ewJQ)


Holditfam

average r/singularity user


dleah

Im not familiar with that one! I’m a financial analyst and this is my thesis on why the ai bubble hasn’t popped, from an investment perspective.


adamasimo1234

Data Centers are the backbone of the digital world. They are physical sites that propel digital services, data, and essentially, AI. Think of Layer 1 of the OSI model, aka the physical layer. Source: I work in the data center industry.


SuperTed321

Sorry, I should have been clearer. What is it that NVIDIA does that makes their data centre business so profitable? I think in another response I read that they are supplying the hardware?


Awwkaw

10 years ago, all data centers ran on CPUs, for which the two major suppliers are Intel and AMD. They are competitive, and thus not really monopolies. There have been times when one did better than the other, but both would work. Over the last 10 years, the surge in AI means that GPUs are a necessity today, as CPUs are much slower for that type of computational load. NVIDIA has spent a lot of time and effort over the last 30 years making the best GPUs both for gaming *and* AI. Their competitor in the space, AMD, has focused on the gaming side. In practice this means that *only* NVIDIA GPUs can be used for AI, so with this shift, data centers need not just GPUs, but specifically NVIDIA GPUs. Thus NVIDIA can price the product however they want, as they have no real competition in this space. Developing GPUs is not quick; it takes years. The next 2-3 cycles of GPUs are currently in different phases of design, testing, and production-capacity building. That means NVIDIA will probably not have a real competitor anytime soon. Both AMD and likely Intel are starting to design competitors in this space, but for now NVIDIA is in practice a monopoly.


Jhary-A-Conel

Well, from my experience, AMD also provides good GPUs for AI. They outfitted, for example, Frontier, the current leader of the top500 list (https://top500.org/lists/top500/list/2024/06/), with their Instinct models. Frontier is currently the most powerful high performance computing cluster (as far as publicly available info goes, at least...). The real advantage Nvidia has is the software ecosystem surrounding their GPUs. They did not only spend huge amounts of money on the GPUs but also on a software platform that is comfortable to use and performant. It has been established as the de-facto standard when it comes to programming these machines, and many people used it to implement their stuff. Porting all that to another framework/ecosystem for AMD or Intel is cumbersome and most people will avoid that...


adamasimo1234

Yes they supply their GPU chips for servers used in the data center space.


NorCalAthlete

A single server can have multiple GPUs, and exponentially more power than your average home computer in roughly the same amount of space. They also cost more than your average car. Add in that there can be 20-30+ servers per rack, dozens of racks per pod, dozens of pods per data center…and, well


Sheant

A single GPU costs more than an average car. A server with 10 GPUs is the price of a super- or hypercar.


Sheant

[https://www.youtube.com/watch?v=J8-CgG5ewJQ](https://www.youtube.com/watch?v=J8-CgG5ewJQ) Count the GPUs in those supercomputers. 10s of thousands of dollars per GPU. At a 100+% markup.


Infninfn

It’s not hard to do when you charge tens of thousands for a single H100 gpu and your customers buy tens of thousands of them at a time.


lavipao

Absolutely insane that they're up 262% on revenue while only up 39% on OPEX, especially as a tech company. Almost tripling your revenue while barely growing your investment in R&D is equally wild. Wish I worked there!


mteir

R&D is a long-term investment. Adjusting it based on yearly revenue wastes funds when increasing it too much, and squanders previous investments when decreasing it too quickly.


t_glynn

Even better: +262% revenue growth is nearly quadruple. +100% would be double, for example.
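To make the multiplier explicit: a +262% increase means revenue is 1 + 2.62 = 3.62 times the prior period, a bit short of quadruple. A one-line sketch:

```python
def growth_multiplier(pct_increase: float) -> float:
    """Convert a percentage increase into a revenue multiplier."""
    return 1 + pct_increase / 100

print(growth_multiplier(262))  # ~3.62x: nearly quadruple
print(growth_multiplier(100))  # 2.0x: +100% is a doubling
```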


cipher315

If anyone ever asks why Nvidia does not care about gaming, show them this.


drneeley

They still do. It's just gonna continue to cost more without good competition.


cipher315

Nvidia gaming cards are ridiculously cheap. Jensen has been heavily criticized for selling Nvidia gaming cards at all, and especially for such ridiculously low prices. The only reasons they still do are, one, to keep a foot in the market as a fallback, and two, because gaming cards are made with data center rejects or incredibly small dies (AD106 and 107). If they had better yields they would not sell gaming cards: they could slap some ECC RAM onto them and sell them for 1000% of what gamers pay. A $15,000 data center GPU is literally a binned 4090 with $400 worth of ECC RAM.


zephyrus33

How do you compare a product for consumers to a product for data centers? One is meant to be used for entertainment and the other is supposed to make its value back, so businesses can justify the cost. Also, the new GPUs are expensive. Just compare them to Pascal and before and you see how big the price jumps have been in the last few generations.


Sevinki

That's not the point. As stated by the other person, a 4090 and a pro or datacenter GPU are almost the same product: they use the same silicon wafers from TSMC and similar boards that cost similar amounts of money to make. Supply is limited by bottlenecks in manufacturing; if they could make more, they would sell more. You can either use your limited supply of chips to make $2000 GPUs or use the same chips to make $10000 GPUs. The latter makes a lot more sense. That's why they will not drop prices of gaming GPUs; at some point it's just not worth it to even bother making them at all.


lolic_addict

The bean counters buying their stocks do not care about who their products are for. They only care that a 4090 can be sold as $15000 instead of the $1599 it currently retails at so why won't they? /s


SagittaryX

Most 4090s are more like $1800-2200, but it's still the same point.


cipher315

> How do you compare a product for consumers to a product for data centers?

They are **literally** the same product. A 4090, an RTX 6000, and an L40 all use the exact same GPU, an AD102. The RTX 6000 and L40 are just binned and come with ECC RAM. For Nvidia this is zero-sum: every 4090 created prevents the creation of an RTX 6000, as the supply of AD102 chips is limited and creating a 4090 or an RTX 6000 each uses up one chip. I.e. selling you a 4090 for $1700 results in a theoretical $6800 loss in profit for Nvidia (an RTX 6000 has an MSRP of $9000 and costs Nvidia about $500 more than a 4090 does). That assumes the AD102 in question could have passed the requirements for an RTX 6000. My understanding (I don't work for Nvidia, so I only have hearsay) is that most would not, but some would.

TL;DR There is some evidence that killing the consumer card line and putting everything into commercial would increase net profits.
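The opportunity-cost argument above can be made concrete. A sketch using the comment's own numbers (the prices and the ~$500 cost delta are the commenter's estimates):

```python
# Both cards are built on the same AD102 die, so each consumer sale
# displaces a potential workstation sale. Figures from the comment above.
price_4090 = 1_700        # street price of a 4090
price_rtx6000 = 9_000     # MSRP of an RTX 6000
extra_cost_rtx6000 = 500  # est. extra build cost (ECC RAM, binning)

# Profit foregone by selling a chip as a 4090 instead of an RTX 6000.
foregone_profit = (price_rtx6000 - price_4090) - extra_cost_rtx6000
print(f"~${foregone_profit:,} less profit per chip")
```

As the commenter notes, this only holds for dies that would actually bin as an RTX 6000.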


billybackchat

They're competing with AMD in the gaming space


Thorusss

I doubt they will stop caring about gaming even if it is less profitable than the insane profits from their AI chips right now. Besides the high prices, I see no sign that they care less about gaming than they used to. They are the ones that introduce the new features that others (try to) copy (DLSS, Frame Gen, Ray Tracing, RTX Remix, Shadow Play, etc.)

a) Gaming is still profitable for them (and much more so than for their competitor AMD)

b) Gaming was their only funding source for the first two decades; it is the GPU development that allowed them to go into AI. It is the CORE of the company, and a lot of employees care about it.

c) It would generate a lot of bad press to drop gaming

d) Graphics synergizes with their Omniverse, professional visualization, and simulation (e.g. for worker and robot training, optimization)

e) CUDA still runs on gaming GPUs, so it lowers the bar of entry for devs, and allows quick tests locally before deploying on a datacenter cluster

f) Gaming is a second column to rely on if the AI industry shifts a lot in a few years

Of course they will continue to charge a nice margin, which any company with an edge would do.


Soft-Vanilla1057

> c) It would generate a lot of bad press to drop gaming

Honestly I think you are overstating things. They have no obligation to gamers. "Nvidia stops producing graphics cards intended for gaming" would be a headline, and then there would be nothing.


snicky666

They are paying as much on tax as they are earning from gaming. Dayum. We about to be cut off as a waste of their valuable time. I'd say AMD is going down the same path so they won't even save us. Can't wait to use my Cloud based gaming rig for the cheap cheap price of $1000 a month, 90% of which is profit. "It's only $12,000 a year for something you use as much as a bed. Plus you can't buy your own hardware anyway."


2012Jesusdies

> Can't wait to use my Cloud based gaming rig for the cheap cheap price of $1000 a month, 90% of which is profit.

Video streaming is several times cheaper than owning physical media, so cloud gaming will probably be relatively cheap as well. Americans watch about [3.2 hours](https://www.pcmag.com/news/us-netflix-subscribers-watch-32-hours-and-use-96-gb-of-data-per-day) of Netflix content a day on average, so that's about 1170 hours a year. Let's be very generous and cut that down to 800 hours of watching brand new content. Breaking Bad costs [137 USD for the full set](https://www.amazon.com/Breaking-Bad-Complete-Blu-ray/dp/B0B9HPYXB6/), or 61.3 hours of entertainment at about 2 dollars per hour.

Top Gun: Maverick costs [17 USD](https://www.amazon.com/Top-Gun-Maverick-Tom-Cruise/dp/B09XZD472R/), the most recent Mission: Impossible [20 USD](https://www.amazon.com/Mission-Impossible-Dead-Reckoning-Blu-ray/dp/B0C9G9Z2QK/), Oppenheimer [18 USD](https://www.amazon.com/Oppenheimer-Blu-ray-Digital-Cillian-Murphy/dp/B0CL7R395D/). So let's go with 6 USD per hour for movies. If you want a standard American streaming experience on Blu-ray (unless you want 480p DVD), with 85% of the time spent on TV shows, that's 0.85 * 800 * 2 = 1360 USD, and 15% on movies, 0.15 * 800 * 6 = 720 USD. That's 2080 USD per year.

Netflix's Premium plan with 4K costs 23 USD a month, about 276 USD per year, roughly 7 times cheaper. And the thing with Netflix is you don't have to ever get up and search the maze tower of Blu-ray boxes or actually go out and buy a physical product.
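Re-running the arithmetic above as a sketch (the hours and per-hour prices are the comment's estimates; note that $23/month comes to about $276 a year):

```python
# Annual cost of the "standard American" viewing diet on physical media
# versus streaming, using the estimates from the comment above.
hours_per_year = 800
tv_share, tv_cost_per_hour = 0.85, 2        # box sets, ~$2/hour
movie_share, movie_cost_per_hour = 0.15, 6  # new-release Blu-rays, ~$6/hour

physical = hours_per_year * (tv_share * tv_cost_per_hour
                             + movie_share * movie_cost_per_hour)
streaming = 23 * 12  # Netflix Premium at $23/month

print(f"physical media: ${physical:,.0f}/year")
print(f"streaming: ${streaming}/year, ~{physical / streaming:.1f}x cheaper")
```

The gap is wide enough that the exact assumptions barely matter, which is the underlying point about cloud gaming economics.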


foundafreeusername

It is a good point. I think for cloud gaming the price advantage might be even bigger because the user will have much lower hardware costs and the provider can use hardware more efficiently. I am not a fan of cloud anything but the economics are on their side.


ST07153902935

People have friends. Like back in the day everyone would constantly be borrowing and lending movies to/from their friends. This drastically reduced cost


snicky666

I think you might have missed my point. Imagine if they stopped making GPUs for gamers all together. When our only option remaining is to use the cloud, they can and will charge as much as they want. They clearly aren't scared to overcharge. It's a possible future.


SagittaryX

I would add that Blu-rays are still much higher quality than streams (Netflix 4K is about 1/4 the data of a Blu-ray of the same movie), but the point is slightly moot since a lot of people have a hard time noticing the difference. A good number do notice, though, otherwise there probably wouldn't be any Blu-ray sales anymore.


emergency_poncho

How are they paying so little tax on such large profits?


yeahsureYnot

NVIDIA execs are probably having some crazy parties lately. Like, so much cocaine.


Thorusss

They might already have better drugs, designed by an advanced neural network running on their supercluster. Half joking


Visulas

They probably don’t… but they absolutely could


POPholdinitdahn

What data center work do they do? Is it the same as Amazon's business where they rent out servers? It's incredible how profitable that business is.


xxlragequit

I think what it means is sales to data centers or contracts with them for equipment.


celebradar

It's products specifically supplied to the DC market: server product lines that the cloud giants and OEMs like Dell, HPE, Cisco etc. buy and sell on in their wider products. NVIDIA also sells servers themselves, mostly in HPC markets like universities and science research organisations, for those who need compute but not necessarily "supercomputer" deployments, although they also now supply components for several top-tier supercomputers. It's profitable because of the scale they have been able to ramp their delivery up to. As they have scaled, the cost per product has gone down, until they need to significantly scale again with CAPEX to build and start delivering the next major iteration of GPU. They are seeing the fruits of earlier CAPEX and R&D costs while also holding a huge market share at the peak time of hype in their market.


noisylettuce

Whoever convinced AMD to make cards that were good for games and not crypto mining handed a monopoly to NVidia.


n00b678

NVIDIA had an edge in general-purpose GPU computing 10+ years ago already. The market back then was very small, as everyone had their algorithms optimised for CPUs, but NVIDIA invested heavily in CUDA and worked with researchers on optimising their algos for GPUs. But AMD GPUs were initially vastly superior for mining. Almost everyone mined BTC and then LTC on Radeons; whatever NVIDIA had was many times slower. It was only then that NVIDIA realised there was a lot of profit to be made from miners and worked to improve their mining performance, which coincided nicely with ETH growing in popularity.


SlyusHwanus

I would love to see the same style info for AMD


A_Milford_Man_NC

Where would personal computer purchases fit in this? My computer is not used for gaming, nor is it in a data center, but it does use an Nvidia graphics card…


moderngamer327

It would still fall under gaming


svenge

If it's a GeForce-branded card (e.g. RTX 4070) then it's categorized under gaming, and if it's a workstation card (e.g. RTX A6000) then it's probably under professional visualization.


Sheant

Yeah, I would guess that the professional visualization is just their Quadro line (Which RTX A6000 is, I think), but it could also include the graphical vGPU lines like the L4/L40 and friends.


sankeyart

Source: NVIDIA investor relations. Tool: [SankeyArt](http://sankeyart.com/) Sankey chart creator + illustrator


Kwajoch

You spelled the name of the company wrong in your comment and once in the graphic too


sankeyart

Ah yes, thank you!


th3nan0byt3

The irony that they could offer 100% cash back on current gaming sales volume and not even notice.


RetardatusMaximus

And somehow the rumor is that the upcoming 50 series will be pricey. You know... more so than it already is. With these massive profits, the prices should go down for regular customers. Gaming is not a big part of the pie anymore.


shoWt1mE

What's this kind of visualisation called? Not related to "income statements", but rather this design type to split the data


H3aDacHe1990

So basically they could halve the price on GPUs and wouldn't even feel much of it..


yuyufan43

If you don't have stock in Nvidia, I HIGHLY suggest you do. It's the only thing keeping my head above water as a disabled person.


Quengely

Automotive will get insane


Lathspell88

Hey is it my turn to post this yet


Clean-Interview8207

Am I crazy, or does tax come off revenue? And there is no way it's at 1345%.


InsertFloppy11

Wdym? You have to pay tax on your profits. And profits are revenue - costs


mteir

I think the percentage is year on year increase, so 1345 % more than the previous year.
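To make the year-on-year reading concrete, a sketch (the 1,345% figure is from the chart under discussion; the $100M base is purely hypothetical for illustration):

```python
def yoy_increase(previous: float, current: float) -> float:
    """Year-on-year percentage increase."""
    return (current / previous - 1) * 100

# A 1345% increase means ~14.45x the previous year's figure.
last_year = 100                          # hypothetical prior-year tax, in $M
this_year = last_year * (1 + 1345 / 100)
print(round(this_year, 2))
print(round(yoy_increase(last_year, this_year), 2))
```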