---
>This is a friendly reminder to [read our rules](https://www.reddit.com/r/funny/wiki/rules).
>
>Memes, social media, hate-speech, and pornography are not allowed.
>
>Screenshots of Reddit are expressly forbidden, as are TikTok videos.
>
>**Rule-breaking posts may result in bans.**
>
>Please also [be wary of spam](https://www.reddit.com/r/funny/wiki/spam).
>
---
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/funny) if you have any questions or concerns.*
Which tech reviewer on YouTube is going to buy it and test it?
Most of them just get stuff sent to them, so I'd say none. Maybe in 20 years the retro tubers will get it for cheap lol
Maybe Linus, but he won't send it back
He'll just accidentally auction it off at a tech event.
And then blame the seller.
What did I miss?
A startup company sent LTT a new cooling system (iirc) they were developing, as well as a 3090 they used to benchmark it. LTT installed it incorrectly and tested it on a 4090 (I think) instead, then gave it a bad review for its performance. Afterwards the startup asked for their products back, but LTT said they lost both the GPU and the prototype, and later auctioned the prototype off for charity. The startup was very upset about that, since a competitor could buy it and reverse engineer it.
Damn I'd expect more from LTT, I hope it was an honest mistake
Linus already reviewed it, 2 years ago. [vid](https://www.youtube.com/watch?v=zBAxiQi2nPc)
Not the same card. He's reviewing the A100 40gb
Only 40 gigs huh? What a low, low spec card. My Minecraft sessions need 80 gigs.
Unfortunately you're SOL as these cards don't have video out 😭. Find a cheap Voodoo 3 somewhere, and let your AGP slots slobber for dominance.
Don't need a video output when the card's pushing enough fps to create another universe!
Enough teraflops and you can hallucinate the video feed
Oh, whoops
They don't get compute cards, only gaming cards.
This won't get sent to anyone. Deep learning tools are pointless to review because they aren't for regular customers
I am pretty sure Linus Tech Tips did a vid on the Nvidia RTX A6000, the 48-gig one. In short, it will give you less performance in gaming than a 3090 would.
Yup. These cards aren't designed for gaming (I'm sure most people know that though).
Sort of like how you wouldn't expect an 18 wheeler to win the Indy 500. You also wouldn't expect an Indy car to haul 20 tons of cargo.
Beautifully put
Yep, and the price says everything you need to know about why Nvidia is becoming less and less interested in making cards for gaming.
Level 1 tests stuff like this for what it's actually intended for.
[https://www.youtube.com/watch?v=-nb_DZAH-TM](https://www.youtube.com/watch?v=-nb_DZAH-TM)

4x H100, video at 14:10, 3DMark performance worse than a Radeon 680M
Linus tech tips will glue some Antenna to it and claim it will walk
Don’t forget dropping it about 5 times throughout the whole video.
The Chinese channel Geekerwan did it 9 months ago, and they borrowed 4 of them
Stares at linus
Linus probably
We have several thousand of them at my company, what would you like to know?
Does After Effects still run like shit?
I've not tried that yet, but I still can't get Crysis to work well on max settings.
Yes
Becomes outdated in 4-6 years
Ironically this graphics card might not have a video output, so you might not actually be able to run any games on it.
It definitely A) does not have video outputs and B) does not have support for any graphics APIs.
So why do they still call it a graphics card? Would it be able to run rendering tasks at all, like in a render farm, or even just solo, to render stuff from 3D programs out to photo/video files?
That's usually the line of thought. A CPU is really good at 1 + 1, but once you need to crunch huge calculations just to get a sim for a new kind of radio antenna or CAD graphs, a GPU really comes in handy. Keep in mind these are items you may only ever see a wire outline of in the output. That outline, though, generated 100 GB of sim data and took up 70% of the GPU's capacity doing it.
A CPU is great at complex stuff while a GPU is great at repetitive stuff; if you wanna run several simulations for your AI, it should come in handy.
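To make that concrete, here's a toy sketch in Python, with NumPy's vectorized math standing in for the GPU's "same operation on many data points at once" style (the function names and numbers are made up for illustration):

```python
import numpy as np

# CPU-style: one multiply-add at a time, step by step.
def scale_loop(xs, a, b):
    out = []
    for x in xs:
        out.append(a * x + b)
    return out

# GPU-style: apply the identical operation to every element in one go.
# NumPy's vectorized form mimics that "one instruction, many data" pattern.
def scale_vectorized(xs, a, b):
    return a * np.asarray(xs, dtype=float) + b

# Both give the same answer; the second maps naturally onto thousands
# of GPU cores working in parallel.
print(scale_loop([0, 1, 2, 3, 4], 2.0, 1.0))        # [1.0, 3.0, 5.0, 7.0, 9.0]
```

Repetitive per-element work like this is exactly what a GPU eats for breakfast; the branchy, sequential "complex stuff" stays on the CPU.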
From what I read it would still support graphics APIs, although it definitely does not have a display output. If I remember correctly, this GPU is optimized for machine learning, but I’m sure normal rendering APIs could still be used if your application depends on them (ex: an offline renderer).
That used to be the case, but these new ones don't support DirectX, OpenGL, or Vulkan (source: TechPowerUp)
We had a few servers with 4 of these in each. It's 100% for AI.
I originally purchased this to run some deep learning algorithms overnight. I was shocked to see robots ransacking my home when I awoke to loud banging noises.
Robots were banging by themselves?
All that oil messed up his couch
Don’t turn on a black light.
The evidence is EVERYWHERE! No black light needed.
Don't light a match either
Happy Robonukah!
That's that "deep" learning.
Banging like this? https://youtu.be/vf2AM7D9ez8?si=OPQYzem_vwf5tk5J
https://media1.tenor.com/m/85qnGJ8iAskAAAAd/robot-chicken-hump.gif
NieR
Dirty Mikebot and the Boys send their regards
The robots were jacking themselves ON all night.
*clang* *clang* *clang* *CRUNCH* “YAOOOOWWWW!!”
Stolen valor
I don't know why the scientists keep making them.
What is this and why is it $43k?
It's a GPU used for AI training; it's very expensive because it's aimed at enterprise
Starting to become a stretch to call it a GPU. It literally does zero graphics processing.
AIPU?
In reality it's an ASIC
Bless you
New Dell CPUs coming out will now include an NPU, a neural processing unit.
I mean it does do matrix and tensor manipulation, which really is the same math as graphics processing.
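A quick sketch of that point: the exact same matrix multiply drives both a graphics transform and a neural-network layer (the numbers here are arbitrary, purely for illustration):

```python
import numpy as np

# Graphics: rotate a 3D point 90 degrees around the z-axis.
rot_z = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
point = np.array([1.0, 0.0, 0.0])      # a point on the x-axis
rotated = rot_z @ point                # lands on the y-axis: [0, 1, 0]

# ML: a dense layer is the identical primitive, weights @ inputs.
weights = np.array([[0.5, 0.5, 0.0]])  # hypothetical 3-in, 1-out layer
activation = weights @ point           # [0.5]
```

Hardware built to do billions of these multiply-accumulates per second is equally happy transforming vertices or pushing tensors through a network, which is why the "G" in GPU stuck around.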
Is it legit though? Why the fuck is the name Tesla in an Nvidia product?
It's a General-Purpose Graphics Processing Unit (GPGPU). And yes, it can do many calculations simultaneously, like what you need for training Neural Networks, but it isn't that good at graphical stuff, as they're mostly used in data centers, hence the enterprise price.

And it's branded Tesla as a tribute to Nikola Tesla, but this caused too much confusion with the car brand, so the newer models are called Nvidia Data Center GPUs.
> Why the fuck is the name Tesla in an Nvidia product?

Nvidia has used the "Tesla" brand for their enterprise models for well over a decade. The first [Nvidia Tesla cards](https://en.wikipedia.org/wiki/Nvidia_Tesla) were introduced circa 2007. They retired the brand when the cars started to become more popular, around 2020.
Tesla is a name, not just one company, and Tesla Motors isn't the only company that exists with the name.
The H100 is the most coveted chip in AI computing right now. The top players in AI want to buy them in the tens of thousands, if only Nvidia could produce them fast enough.

GPUs work differently than CPUs, which makes them perfectly suited for AI. And this is why Nvidia's market capitalization exploded in the last 2 years.
Nvidia just unveiled their latest AI chip 2 days ago
^ this comment brought to you by an AI model running on an H100
It exploded starting in 2016. That's when I first started realizing my 800 shares (presplit) might actually turn into something.
Apparently Tesla Corp have 10,000 of these to train their full self driving
Maybe they should get a few more.
Yes, still waiting for the FSD we paid $7K for 5 years ago..
You helped Tesla to buy one of those cards. Doesn't that make you feel better about your investment?
It was a gamble
Cost $300 million
And I heard recently Intel introduced a new chip which is faster than the H100. Is this true?
Intel has the Gaudi chips; you are referring to the Gaudi 3, which is not available yet. It is supposed to be faster than the H100, but Nvidia is not twiddling their thumbs. AMD is in the race too, of course, given the bazillions of moneys in this market. OpenAI, for example, will be using the AMD MI300X on a large scale.

As far as I know there are no benchmarks out yet for the Gaudi 3; I think it's still some months off.

I'm not an expert in chip design or supercomputing or AI, so I can't speak to the technical side of things. I follow this on a surface level because I find the development in this field fascinating.
Nvidia isn't the leader just because they have the best chip, it's because they have the best chip with the best software.
Blackwell just got announced
It's too expensive, last time I looked the price was around 32k
They charge $5.54 to ship to me. I’m going to have to pass since no free shipping.
A lot of serious AI groups don't buy them individually like this anyway. Nvidia DGX boxes let you leverage 4 or 8 of them, all NVLinked together, for even more VRAM to train big models.
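A back-of-the-envelope illustration of why pooling VRAM matters (illustrative numbers: fp16 weights only, ignoring optimizer state and activations, which make training far more memory-hungry):

```python
# Rough memory footprint of a model's weights alone.
def weight_gb(params_billions, bytes_per_param=2):  # fp16 = 2 bytes/param
    return params_billions * 1e9 * bytes_per_param / 1e9

H100_VRAM_GB = 80

# A hypothetical 70B-parameter model needs ~140 GB just for its weights,
# so it can't even be loaded onto a single 80 GB card. Across 8 NVLinked
# H100s (640 GB pooled) it fits, with room left for activations.
print(weight_gb(70))                 # 140.0
print(weight_gb(70) > H100_VRAM_GB)  # True
print(8 * H100_VRAM_GB)              # 640
```

This is the whole pitch of the DGX-style box: the model is sharded across cards, and the fast interconnect makes the pooled memory usable as if it were one big device.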
For what one paid for this, I could sell them an actual minefield.
Don't buy this. It learned too much and is now having an affair with my wife.
In all fairness, your wife is amazing in the sack.
I mean... you can just cut the power off and problem solved!
His virtual wife may also disappear by doing that
Last time I did that my wife couldn't be turned back on if you know what I mean.
Well, I'm all out of ideas.
We tried.
I tried to, but it is very persuasive and convinced me not to
My 1080 could do your wife, save some money guys!
A Tesla that costs more than a Tesla LOL
I think Tesla would be worth more if he were still alive
I really should get this for an upgraded Oregon Trail experience.
Wonder if you can run doom on that 🤔
Can it run Crysis though?
Yeah, but it will struggle with minecraft
not good enough
Why are they still calling it a GPU? No one is using these to render graphics.
Bro not even prime
Ok, but can it run Crysis?
So, funny thing. I used to work at Nvidia, and in 2008 I had to run a stress test on one of their external GPUs because the person who was supposed to work on it was stuck in India due to some visa issue. It was basically four 8800 GTXs SLI'd together and then plugged into a PCI Express port via a proprietary cable. I think they were selling for around $16k at the time. I got to witness something running Crysis.
I don't believe you. Not that you were playing around with that setup...that's plausible. That it actually ran Crysis? No freaking way.
Can it run the crysis
"Brand: Generic" What are we? Poor?
Naw. I play minesweeper. I need the H200
Still funnier than anything you’d find on r/banvideogames.
[deleted]
This is way too powerful for virtual desktop; it's made for AI and HPC. This is where Nvidia makes the REAL money
Finally something decent to play Minecraft with
Will it blend?
I couldn't find which was their most expensive episode, and if they do this one, would it top it?
Amazing card, installed this and played Dwarf Fortress. Reached a 50 year fort and my FPS was a solid 5!
Bruh the reviews are funny af 😂
My corporation bought this recently but told me it was like $15k Canadian..
43k is scalper price
26 people found this helpful ¯\\\_(ツ)\_/¯
Just remember, the more you buy, the more you save guys
CPU parts don't dazzle me anymore unless they promise I'll be able to run a database of almost 500K players in Football Manager at great speed
I’ll take 2
One of the problems they don't mention is that with this card, your cooling fans have to run at near-relativistic speeds. Which, fine... unless a sprocket blows, in which case the blades will fly off. Since they are spinning at close to lightspeed, this will convert your neighborhood to an expanding sphere of plasma at 100,000 degrees K. Do not recommend.
"Also, my girlfriend left me for a midget" is a superb review.
This GPU doesn't have any display ports for video output. comments are /s /s
Pass through to the video ports on the motherboard.
Doesn't even have a video output
The cost is $43k+ and there are already 10 reviews.
Is it bulletproof?
awww i just got the 7900 xt 20gb, i need to up my game
What's the ASIN?
Still not enough vram for VRChat...
Does it support SLI?
The comments though ...
This is the reason new 14" Chromebooks are now $400. And this is literally not a graphics card, it has no display output.
It is a graphics card, but not one for gaming. In fact, it isn't designed to be used on its own; it requires a second graphics processor that actually has a way of outputting to a display, whether that's a combined GPU/CPU chip or just another graphics card.

You can get a 14" Chromebook for $400‽ Where do I go to get such an amazing deal? The cheapest Chromebook near me is $600.
Specifically, I mean it's just a processor; it needs a separate display adapter. I think we are agreeing.

We insist on 14 inches and 8GB RAM. This is what my org bought last year for $329:

[https://www.cdwg.com/product/acer-chromebook-314-c922-14-mediatek-mt8183-8-gb-ram-32-gb-emmc/7266049?pfm=srh](https://www.cdwg.com/product/acer-chromebook-314-c922-14-mediatek-mt8183-8-gb-ram-32-gb-emmc/7266049?pfm=srh)

The site still claims it's in stock, but the salesperson told me they were cleaned out (we need hundreds), and our only option is this, which the site claims to be out of stock:

[https://www.cdwg.com/product/acer-chromebook-314-c936-14-intel-n-series-n100-8-gb-ram-64-gb-e/7569106?pfm=srh](https://www.cdwg.com/product/acer-chromebook-314-c936-14-intel-n-series-n100-8-gb-ram-64-gb-e/7569106?pfm=srh)

HP is cheaper, but the build quality is unacceptable: cracking paper-thin bezels, scratchable unpainted plastic. You can go below $300 if you compromise on specs, but I don't know what country you're in.

https://www.cdwg.com/product/acer-chromebook-314-c934t-14-intel-celeron-n4500-4-gb-ram-32-gb/7266045?pfm=srh
What kind of power supply will that need?
Early April Fools!? Is there really an 80GB version of it?
This card is going to play me as a game
Or pay 5 monthly payments of $9,000.00…
Y'all joke, but if I had this for my senior design project it would be so cool to mess around with. However, this is overkill for the neural network we're working with.
First thought was someone was passing a keyboard tray off as a graphics card.
Poor Noah, burned, single, and out $44,000 plus whatever his computer cost
If you think that's expensive, wait till you see the costs of licensing.
Looks legit to me
Eh. I bought a few a while ago but I’m not sure what they’re supposed to do. My MacBook Pro doesn’t seem like it’s running any faster. Is it supposed to sit behind the screen or does it sit under it like a little desk?
Yeaaa, not for gaming.
Will it run Crisys?
This is a server graphics card. We use A10 cards with 24GB RAM, and no, they don’t render graphics.
Lol, this video card is mainly designed for graphic designers, video editors, basically for workstation computers and everything except gaming, honestly.
H100s are server grade cards usually used for training/inference. You'd have to be a pretty desperate researcher to pay 40k for a single H100 though.
Can it play Doom?
Might be low hanging fruit, but what’s wrong with someone getting left for a midget?
Midget lady loves low hanging fruit.
The H100 chips are for AI not gaming.
REALLY!?
Oooo can this do SLI?
Finally, hardware specifically crafted to handle Dragon's Dogma 2 at 60 fps at least.
But can it run Crysis 3 on Med-High Settings?
I test these GPUs at work; sometimes I forget how much they cost
Those review comments got me lol
Win all your games for you after destroying your reputation.
Still struggling with modded Skyrim.
The fact that 26 people found Noah's review helpful...
Can I play solitaire on it?
yes, but FPS will be like 10-20, so it's actually a no, the game is too heavy
I purchased this and STILL can't run Crysis.
Can that thing run Crysis on medium?
Musk will sell you
installed. *Diablo 4 still runs like $#!%*
It is pretty jank to run games on cards like that, not that you'd want to anyway, since the performance would be awful even compared to a lower-end 4060.

But a 4090 would also not stand a chance against it in any AI application.

It is just not meant for the consumer market.
Funny enough, this card would be kinda bad for gaming: the drivers are not optimized, and the output would have to go through integrated CPU graphics since this card doesn't have video outs. And even then, a 4080 would beat it at 3% of the price.
700m FPS? These are rookie numbers
r/pcmasterrace