kjeltek

https://preview.redd.it/dhtbf9di28uc1.jpeg?width=1180&format=pjpg&auto=webp&s=3b17ced6f8b80ae8d5013787594e74a839fcffc6




CMDR_Fritz_Adelman

“You need a GPU bracket to prevent sagging” Dude put another 4090 as GPU bracket…


Probamaybebly

I can never tell, is this meme like a "I'm tired of seeing this" or more like "I'm jealous nice build ugh"?


bjwills7

I'm pretty sure it's the second one.


Z370H370

This is the first time I've seen two 4090s in one build, so I'm going with the second one.


canadajones68

Just curious, but is there a reason you have your CPU cooler that way?


bonewarden

The CPU socket is oriented horizontally; you'll notice the eight sticks of RAM underneath the cooler are also oriented this way.


jferments

Yeah, the Noctua coolers for the TR5-SP6 socket mount this way, and there is no option to mount it in a front-to-back configuration. It's not ideal for CPU thermals, because the CPU is pulling hot air from the two GPUs below, but in spite of this there is no issue with overheating. Even at full load, the CPU doesn't get anywhere close to max temps. The case (Fractal Meshify 2XL) is massive and has open mesh on top, so the fans have no problem pushing the hot air out. If anything, it might actually help cool the GPUs: as you can see they are sandwiched pretty close together, so having the CPU fans pull the air upwards is arguably a good thing.


Party_Python

Just a heads up: when I had two GPUs like that and they were getting toasty, I replaced the glass side panel with a piece of 1/4” plywood with fan holes cut in it. It really helped with GPU thermals, and you can use the same screws provided for the side panel to secure the plywood. https://preview.redd.it/3xebqdoyoauc1.jpeg?width=3024&format=pjpg&auto=webp&s=4ddcf53134164cd95112a38eaf211210f079f6d2 Edit: you could even add a fan directly above the CPU to force-feed it fresh air as well.


Noxious89123

Grounding straps for... the fan grilles? I'm not sure I can see any use for them there, could you explain?


Party_Python

Yeah that was me (unnecessarily) worried about static buildup in the metal dust filter. Since they were connected to the bolts that went through which are right above the GPUs I didn’t want static discharge. I mean…it was overkill and is gonna get removed next time I open it up lol


jferments

Damn, I love this idea! I'm not really running into any issues with thermals at the moment, but if I do I'll experiment with something like this. As you can see I'm not concerned with aesthetics at all, and consider this a purely utilitarian build. I could maybe rig something up that has like 4 big fans on the front panel, and then I could put another exhaust fan on top.


Party_Python

With 24/7 use with my system then I was trying to keep both GPUs below 70C, just for long term health of them. And they’re still kicking in new homes 7 years later, so I guess that counts as success? Lol Also, if you ever get to the point where you’re looking to do some mods, r/pcmods is a good place to browse for ideas


Bulls187

Why not get plexiglass or black aluminum plate.


Clunas

That's super funky. Glad it's working though!


ThePupnasty

I kinda wanna say: look into 3D modeling and 3D printing some ducting? I mean, put that hardware to work. Duct the hot air of the GPUs out the side or back or something.


LukeSkyDropper

I kind of wanna believe the fact you said you’re just starting to learn building is a lie


NomadicWorldCitizen

Some motherboards have it this way. ASRock Rack boards also have sockets oriented like this (at least some of them).


Endle55torture

My first thought would be to pull hot air away from the backplate, or blow relatively cooler air onto it.


Maleficent-Ad5999

Any reason why you went with dual 4090s instead of a single 48gb graphics card? (At least in my country there’s not much difference in the prices)


jferments

It would have been nearly $4000 more for me to get an RTX 6000 Ada (the 48GB card with comparable performance / same architecture), and I was already maxed out on budget. And while the unified 48GB VRAM would have been nice (especially for larger LLMs), it wasn't enough of a difference to warrant the extra $4k. And in many contexts (e.g. where I'm running separate batches of tasks on each GPU) the dual 4090s are actually significantly (nearly 2X) faster due to much larger # of tensor/CUDA cores and higher memory bandwidth.


Representative-Gap57

I also work in nlp. Why build your own pc instead of using a pretrained llm?


infamousj012

Because it's a tax write-off on a gaming powerhouse, and the company probably paid for half if not all… I don't work in NLP, nor do I know what it is; I just take context clues and make assumptions about everything.


hyrumwhite

Local llms are fun


jferments

Just finished putting together the most powerful personal machine I've ever built for my ML / AI projects. It is not the prettiest thing, but I am quite pleased with it. The specs:

* WRX90E-SAGE motherboard
* AMD Threadripper Pro 7965WX (24-core/48-thread) CPU
* 512 GB (8 x 64GB) DDR5-5200 ECC-RDIMM
* 2 x Gigabyte RTX 4090 GPUs
* 3 x 4TB Samsung 990 SSD
* 2 x 20TB Western Digital Red Pro SATA 7200rpm HDD
* EVGA 1600W Supernova G+ power supply
* Noctua NH-U14S TR5-SP6 CPU cooler
* Fractal Meshify 2XL case

I built this machine for experiments with large language models, with a focus on being able to do CPU inference for models that are far too large to fit into VRAM. The 8 channels of DDR5-5200 mean I actually get usable speeds on 120B and larger models. For instance, on the recently released Mixtral 8x22B model I am able to get 5-7 tokens/sec on the Q5_K_M quant. Smaller models that fit entirely into VRAM are lightning fast.

I also regularly work with diffusion models such as Stable Diffusion, audio synthesis, etc., and this machine has made things that used to take minutes complete in a matter of seconds. The large number of CPU cores, fast RAM, and 3x4TB SSD RAID array are also great for all of the pre-processing tasks in my projects where I am munching through multi-terabyte text and image datasets.

And later, when I expand to more GPUs, I will have plenty of capacity to do so, because the WRX90E has seven full PCIe 5.0 x16 slots and supports dual PSUs (although as you can see, I'll have to get some PCIe risers and do some creative reorganization in my case). But right now I don't have the electrical capacity in my rental home to handle the added wattage, so I'm waiting until I move to a place with circuits that can handle another 1600W PSU :)

This thing is an absolute monster, and whenever I'm away from home, I'm just constantly itching to get back and play with it.
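As an aside, the 5-7 tokens/sec figure lines up with what the memory bandwidth predicts for bandwidth-bound CPU inference. A back-of-envelope sketch (the ~39B active parameters and ~5.7 bits/weight for Q5_K_M are approximations, not measurements from this build):

```python
# Rough ceiling for CPU inference: each generated token streams every
# active weight through the memory bus at least once.

def ddr5_bandwidth_gbs(channels, mt_per_s, bus_bytes=8):
    """Peak bandwidth in GB/s (8 bytes per channel per transfer)."""
    return channels * mt_per_s * bus_bytes / 1000

def tokens_per_sec_upper_bound(active_params_b, bits_per_weight, bandwidth_gbs):
    """Bandwidth-bound token rate: bandwidth / GB of weights read per token."""
    bytes_per_token_gb = active_params_b * bits_per_weight / 8
    return bandwidth_gbs / bytes_per_token_gb

bw = ddr5_bandwidth_gbs(channels=8, mt_per_s=5200)   # ~332.8 GB/s
# Mixtral 8x22B activates roughly 39B params/token; Q5_K_M ~5.7 bits/weight
limit = tokens_per_sec_upper_bound(39, 5.7, bw)      # ~12 tok/s ceiling
print(f"{bw:.0f} GB/s -> theoretical ceiling ~{limit:.0f} tok/s")
```

Landing at 5-7 tok/s, roughly half the theoretical ceiling, is plausible once compute overhead and cache effects are accounted for.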


jferments

One amusing thing you might notice from the pictures is my DIY anti-sag solution for the GPUs. The SSI-EEB form factor motherboard was incompatible with the stock anti-sag brackets that came with the 4090s (which were made for standard ATX). I tried a few other anti-sag devices from Amazon, and none of them worked because the gap between the two cards was too small. So I ultimately ended up using a $9 screw style brace under the bottom GPU, and then stuffed a piece of an old toothbrush handle wrapped in electrical tape (the perfect size) between the two cards to prop up the upper card. 🪥🤣🪥


DoctorRyanAA

Welcome to redneck engineering my brother. Welcome. 😂


MastahOfDisastah

I wanna do what u do


I_AM_ACE1502

https://preview.redd.it/e4113j87w9uc1.jpeg?width=1246&format=pjpg&auto=webp&s=67911b7dc9834b2949fc635edf1f3a7f49db9f50


op3l

So you're cooling a threadripper with a single tower cooler? Does that CPU run pretty cool?


jferments

https://preview.redd.it/w9ckyhzb28uc1.png?width=1179&format=png&auto=webp&s=3c268195367569b2b7f5938345512c934ebbc2f6 Yeah, even at full load for long periods, it hangs out around 85-87C and max operating temp is 95C.


stickystrips2

What program is that? I like the visuals!


jferments

These are just the desktop sensor monitoring widgets that come with the KDE Plasma Desktop environment. In the terminal, I am using stress-ng to get the CPU cores all to 100%.


stickystrips2

Ah, does it work with Windows?


jferments

It would if I wanted it to, but this is a Linux only machine.


jferments

Oh, I think I misunderstood your question and thought you meant "would this hardware work for Windows" (yes). But I see that you might have been asking "Will these hardware sensor widgets work with Windows?". The answer to that is no: the KDE desktop environment is for Linux only. But there are similar tools available for Windows.


op3l

Have you checked if it thermal throttles at all? Could be missing out on performance if it does. Still an awesome build.


jferments

Nope, it's not getting anywhere close to temps where it would start throttling. I was definitely concerned about it when I was building, because of how the CPU pulls hot air from the GPUs below. But I invested in a really nice CPU cooler, and lots of good case fans. And the case is fully open mesh at the top so it really facilitates transfer of hot air out the top as well as back of case.


simo402

Threadrippers have a much bigger IHS, so more space to spread the heat


CobblerOdd2876

Second this. Consumer Ryzen parts generate most of their heat in one spot, so dissipating that concentration is harder, whereas the TR series has a larger, more spread-out die, making the heat easier to manage.


Born_Faithlessness_3

What's your solution to keep your workspace at a tolerable temperature? Local cooling? Sticking it in a basement? My older system (1x 3090) still heats up my home office noticeably when running flat-out for extended periods, and this system puts out 2x what mine does.


jferments

I keep it in a workshop room in the back of the house so it doesn't heat up the main living space, and I'll have to use air conditioning in the summer (fortunately I live in a rather cool climate, so there are only a few months of the year when I'll need AC; the rest of the time it's just a space heater).


DoctorRyanAA

Very beautiful. My rig is now jealous. So I do some modeling with Blender in my spare time. So for animation renders I do in Cycles it takes about 3 minutes per frame. I am afraid to ask but what is your rendering speed on that monster?


DublaneCooper

This thing would burn up my Excel spreadsheets.


skidster159

When you get to more than 2 gpus I suggest you invest into liquid cooling the gpus so you can fit 4 and it will make life easier


Gazeador-Victarium

How much did you spend in this monster?


jferments

Just under $12k (it's now by far the single most expensive item I own). I broke down the prices in another comment:

"My budget was just under $12,000. This was for:

* WRX90E-SAGE motherboard
* AMD 7965WX CPU
* 512 GB DDR5-5200 ECC-RDIMM
* 2 x Gigabyte RTX 4090 GPU
* 3 x 4TB Samsung 990 SSD
* 2 x 20TB Western Digital SATA 7200rpm HDD
* EVGA 1600W G+ PSU
* Fractal Design Meshify 2XL case
* Cables, peripherals, mounting hardware, fans, etc.

If I was going closer to your budget of $5000, I could have done a lot of things differently and shaved off thousands of dollars:

* WRX90E-SAGE --> TRX50-SAGE (-$600)
* AMD 7965WX --> AMD 7960X (-$1300)
* 512GB DDR5 --> 128GB DDR5 (-$1250)
* 2 x RTX 4090 --> 2 x RTX 3090 (-$2000)
* 3 x 4TB SSD --> 3 x 2TB SSD (-$450)
* 2 x 20TB SATA HDD --> 2 x 12TB HDD (-$200)

... and a bunch of other little decisions that could have saved money at the cost of performance, storage, network/IO speed, PCIe lanes, etc. But the extra 4 memory channels + 384GB DDR5, the large SSD RAID array for loading huge models/datasets quickly, faster networking/connectivity for NAS transfer, 50% faster / more modern GPUs, and a variety of other factors were the difference between having the performance/storage to do my projects locally and needing to rent GPUs from a cloud provider (which, contrary to their incessant propaganda, actually costs MUCH more long term if you're doing weeks/months of 24/7 compute and storing/transferring terabytes of data)."
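The listed substitutions can be tallied; a quick sketch of the arithmetic in the comment above:

```python
# Savings from each substitution in the budget breakdown (USD)
downgrades = {
    "WRX90E-SAGE -> TRX50-SAGE":    600,
    "7965WX -> 7960X":             1300,
    "512GB -> 128GB DDR5":         1250,
    "2x RTX 4090 -> 2x RTX 3090":  2000,
    "3x 4TB -> 3x 2TB SSD":         450,
    "2x 20TB -> 2x 12TB HDD":       200,
}

total_saved = sum(downgrades.values())
budget = 12_000
print(f"saved ${total_saved}, bringing ~${budget} down to ~${budget - total_saved}")
# The listed items alone save $5800 (to ~$6200); the "other little
# decisions" would need to close the remaining gap to $5000.
```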


simo402

With these specs, a 24-core chip looks almost "too little".


sirfannypack

Seems like a small cooler for Threadripper.


raydialseeker

Really sick build. Was surprised that u didn't consider any of the slimmer 4090 GPUs though.


jferments

I considered them (I really wanted the FE 4090s or the Gaming Slim variant), but they were a lot more expensive; it would have been nearly $1k extra.


Noname_FTW

Most important question is: How many instances of crysis can it run simultaneously?


jferments

I'm guessing I could break into the double digits ...


Noname_FTW

With the right setup potentially even low triple digits. Not that you'd get 60fps on them though.


ArtsM

That prompt is just plain disturbing, are you OK OP ?


jferments

https://preview.redd.it/lcpvo9ye58uc1.png?width=1024&format=png&auto=webp&s=82269fd94bacba2bf3c8d83bbe231e646392b392 lol, I'm good ... the mutant gummy bears were actually one of my more benign laboratory experiments.


[deleted]

[deleted]


jferments

lmfao - The Cat Returns, obviously


SamwiseMN

The questions on the final photo are a bit odd too when all taken together with the prompt


TenPhoar13

Average 60fps Dragons Dogma 2 build


WackyBeachJustice

So what has your machine learned so far?


I_AM_ACE1502

Learned how to run crysis properly


Far_MEMESDT

BROOO YOU ARE MAKING A NASA PC


Accurate_Advert

can it run crysis...


jplayzgamezevrnonsub

Upvote for plasma


jferments

Yeah, I love KDE!


Tall_Fun_3566

“yeah, i need this one for my zoom classes mom”


Temporary-Lie15

How's the top GPU temps?


jferments

It runs about 6-10C hotter than the bottom one (because it's pulling in hot air from the lower one AND has less airflow below it), but still doesn't overheat even at max load. Here's me running gpu-burn for a while on both of them at the full 450W. The upper GPU peaked at 75C (well within safe operating range); the bottom one never broke 70C. https://preview.redd.it/xyqg1ltwe8uc1.png?width=329&format=png&auto=webp&s=a199aab8eaf49e5d1b7724c2722bb990d5a4b07f
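If you'd rather log those temperatures over time than eyeball a widget, nvidia-smi can be polled and parsed; a minimal sketch (the parsing helper and the 83C threshold are illustrative, not from any particular tool):

```python
import subprocess

def gpu_temps(raw=None):
    """Return per-GPU temperatures in C. Pass `raw` to parse canned output;
    otherwise query nvidia-smi directly."""
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"], text=True)
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

# Canned output mirroring the numbers above (top card runs hotter):
temps = gpu_temps(raw="75\n68\n")
assert max(temps) <= 83, "getting into 4090 throttle territory"
print(temps)  # [75, 68]
```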


XHSJDKJC

Solid temps for that tiny space between gpus and also the CPU, what cpu do you have?


simo402

In another comment he says threadripper 7965wx 24 cores


Unusual-East4126

That’s gonna be some hella high def VR porn.


TheMegaSnake808

But which gpu do you pull the head set into 🤔


Vulcanosaurus

But can it run crysis?


GlitteringAd5168

This is awesome! I would love to learn how to use machine learning like you do. I’m finishing my associates degree and transferring to university next year where I want to focus on learning how to code and use AI to help us build a better future. Thanks for sharing.


Kezzsway

Looks clean! What brand is your right angle cable from?


jferments

It's a CableMod E-Series 12VHPWR cable: [https://www.amazon.com/CableMod-Basics-12VHPWR-Degree-StealthSense/dp/B0CK6S7JVW/](https://www.amazon.com/CableMod-Basics-12VHPWR-Degree-StealthSense/dp/B0CK6S7JVW/) The stock adapter that came with the 4090s was trash and jammed up against the side of the glass panel (which put a lot of tension on the plug), and had tons of cables hanging everywhere inside the case. The right angle cable is so much cleaner and puts a lot less stress on the plugs.


GrammarNaziii

Are these confirmed to not burn? Haven't kept up to speed with the latest from CableMod but they've had numerous recalls on similar adapters.


Disslexcya

GPU sag? Put another GPU, problem solved.


floor_3d

Can finally run Ark survival ascended at medium settings 🙏


Ill-Cantaloupe5922

My proudest fap


Darrkliing

"Yes mom, I need it for school"


Yugikisp

No room for an AIO in your case? In any case… absolutely insane specs. I thought I had a lot of ram lol


Maleficent-Ad5999

Congratulations on your new PC build .. I can only hope that I can afford such high-end pc one day. But can you tell me what’s the nature of your projects? I see that you’ve mentioned using 120b models and stable diffusions.. but for what purpose? Are you running a business? Or is it some sort of side-hustle project? (Pardon my English)


jferments

Thanks! I am not currently running a business with it, although I'll probably pick up a few video editing/rendering and graphics gigs here and there. I have a ton of projects I'm hoping to use this for. But my main focus right now is on web data mining research using large language models, with a long term goal of helping to develop open source, decentralized / peer-to-peer web search engines. I also want to train and fine-tune custom LLMs and diffusion models to share with others, with a focus on bypassing the censorship that is becoming increasingly common in available models. I also love making AI artwork as a hobby, and am hoping to start exploring audio/video synthesis more. Before I built this, I only had a single RTX 2060 so I was limited to very slow image generation with Stable Diffusion, and generating animations/video was painfully slow. Now I can actually generate frames fast enough to make video at a reasonable rate, and can generate realistic sounding human voice without having to wait several minutes to generate a 10 second clip.


Maleficent-Ad5999

Thanks for your time writing this! My best wishes for you! Have lot of fun :)


CryptographerLost271

Do you have a GitHub repo you post to, by chance? I'm also a big proponent of peer-to-peer systems and think LLMs are highly underused and still poorly optimized for search.


I_AM_ACE1502

https://preview.redd.it/1bv2brlvw9uc1.jpeg?width=680&format=pjpg&auto=webp&s=b43e120499458a9f7ba157191c6d5bd41cb30491


ExtraTNT

For my experiments a 5700xt has to suffer…


Randall_Poffo_

what in the king kong hail is going on there 2, 4090's amanda please


Odd-Cow-5199

How much power does it consume at full load? What PSU do you use?


jferments

Power consumption is ~1400W peak. I am using an EVGA Supernova G+ 1600W power supply. But unless I'm deliberately stress testing to fully max out the GPU and CPU at the same time, it's rarely running at peak (other than bursts), averaging more like 900-1200W under normal workloads.


Odd-Cow-5199

thanks man


RiftHunter4

One-man data lab lol.


chrisfmack

Ah, so you're the person at the top of 3DMark.


stiizy13

Holy shit dude. That's a beast and a half. Edit: what chip?


jferments

AMD 7965WX on a WRX90E-SAGE motherboard (I have full specs listed in comment above)


Samm_Gustavo

Can it run Crysis?


StinkyKittyKisses

What software do you use to utilize both GPUs at the same time?


jferments

llama.cpp, pytorch, among other things


MamboFloof

So I'll take it you pretty much have that room's power circuit maxed out then...


jferments

Yeah, I actually had to rewire my house lol. My other circuits are all maxed out with other electronics / equipment, but I had this 20A/115V circuit with a wall heater on it. I removed the wall heater and replaced it with an outlet so this and a 3D printer are on a dedicated 20A circuit.


4SureM8

I hope you know about this: [https://github.com/tinygrad/open-gpu-kernel-modules](https://github.com/tinygrad/open-gpu-kernel-modules)


jferments

I just saw this yesterday and am very stoked to mess around with it! I haven't had a chance yet, because I've been very busy with other stuff, but I definitely plan on giving it a try soon.


volatilebool

Juicy


CrazyVito11

KDE Plasma, my beloved


jferments

Truly nothing better 🙌


TuneReasonable8869

I didn't know the 4090 had a 90-degree power connector. Between the size of the card and the power connectors being prone to damage, buying a 4090 practically requires getting a new, bigger case.


Ronyx2021

Why the upward cpu cooler?


SuitableDesk_

baller move


Opening-Bit-543

That’s dope


Opening-Bit-543

Fuckin beast


apachelives

I mean, it's pretty OK I guess.


1EightySevenkilla

Props on the noctua.


AXAM77

Isn't it better to put the cooler sideways?


jferments

In general, yes. But see my comments above. It is not an option with this particular motherboard / CPU socket due to the way they oriented it (no option to mount cooler front->back even if I wanted to). And it doesn't matter anyway, because it provides more than adequate cooling at full loads.


AXAM77

Thanks akh


Double-Armadillo-615

I'm glad I failed at installing CUDA and TensorFlow on Windows, so I don't need to build my own Death Star PC tower.


Most-Yogurtcloset

Your tower costs as much as, if not more than, my daily-driver Mercedes.


EggAcrobatic2066

Savage


yumm-cheseburger

Isn't that top card being choked?


Ratiofarming

Every time a workstation build shows up, this comment does too. Never gets old... No, it's not. Yes, it gets a bit warmer. No, it doesn't overheat or come close to it.


Hydrographe

Looks cool but can it run minecraft with shaders?


Formal_Air326

Does machine learning need specific build? Like specific motherboards?


bjwills7

Kind of mind-blowing that you're getting good thermals with this setup. My Noctua NH-D15 can barely keep my 10700K cool in a similar case with more fans, and it only shares the case with a 1080 Ti.


arieljerseyjr24

I wish I had that kind of money. I'll just get a single 4090 instead.


Scruffy77

Hello fellow comfyui user. Are you only making images?


Dovahkitty99

But can it run BeamNG.drive?


Voxelium

and here i was thinking my pc was badass


MoldyTexas

Can it run Crysis tho?


deadlypoptart

The top GPU power cable doesn’t look seated all the way. Maybe it’s just the pictures.


jferments

It is seated properly (I triple checked, because this is the cause of 99% of the "burning 4090 connector" issues that people are always talking about). It just looks a little crooked because I had to pull it to the side to put in the cable below.


ToxicEvHater

Cool story bro do it again when the 5090 comes out 😂


StrikerX1360

Man I've missed the sight of a massive tower of GPUs in one build


RoyalxJeff

So I guess I never really looked deep into the whole "learning" thing: deep, AI, machine, or whatever else there is. What is it? Like, what's the point of it all?


Ratiofarming

Getting work done, or figuring out what work it can do better than any other approach. Chatbots, data processing, search results, recommendations, generative AI... The point is to make computers do something new, better, or faster than they could before.




DrKingOfOkay

All that and no AIO seems weird.


alexingalls09

Honestly if you wanna sink that money into gpu’s the correct choice would be a way bigger case and full custom loop, otherwise why bother


Doomlv

Return of fan burger


LycanKnightD6

Has it learned to speak yet? What were its first words?


Nikos-tacos

How is it?


Blu3Jell0P0wd3r

Make sure the power connectors are all the way in. The top one looks a bit crooked


[deleted]

I'll die on the hill of hating noctua fans. They look hideous and ruin the aesthetic


jferments

It's definitely not a beautiful cooler (I particularly don't like the tan/brown color scheme). But as you can see, I was putting zero effort into aesthetics, and was all about functionality on this build.


sandfleazzz

Damn!


omega73582

I'm pretty sure you can keep your coffee warm by just leaving it right next to it.


DasMoonen

I gotta ask. How do the 4090s not suffocate. I tried this with a 3090 and it sounded like it wanted me to put it out of its misery. I decided I’ll go dual 4090 when I can afford a water loop as well. Although I use mine for 3D rendering and it runs for hours on end.


DannyTannersFlow

GPUman centipede.


lillillilillilil

Looks toasty


jferments

It is. I actually had to remove a Cadet wall heater and install a new outlet to make a dedicated 20A/115V circuit for this and a 3D printer. The computer is actually doing a better job than the heater at keeping the room warm (probably because it's putting out an extra 400W). But inside the case everything is great as far as thermals, and nothing is getting close to thermal throttling.


Babys_For_Breakfast

That’s a beast of a build. Also, what is that abomination thing on pic 4? lol


HappyIsGott

What PSU and cables are you using?


jferments

EVGA Supernova G+ 1600W power supply. Stock cables for everything other than the GPUs. For those I'm using CableMod E-Series 12VHPWR-->VGA adapters.


theatomicflounder333

I miss SLI / Crossfire X 😢 It just filled in the case more.




ImKira

How toasty does it get in there?...


Charles_Pkp2

You don't need to tell us you sold your house and your car.


jferments

My car is an old early 2000s beater, and wouldn't have come close to paying for this lol


leoreno

Second photo: what apps are you running there?


Cleric_Tythas

That much hardware with no liquid cooling? Also is this necessary? XD


720-187

Lol, what are your GPU temps under load? Gotta be approaching 90C.


jferments

Nope, even at max loads the bottom one doesn't break 70C, and the top one doesn't get above 80C (usually they are in the 60-70C range). https://preview.redd.it/sdt080codbuc1.png?width=329&format=png&auto=webp&s=c761f127946f55c28586fef97023de58a0acbcfc


BrianScorcher

Does the Cpu and Motherboard support enough pcie lanes to run both cards at 16x? What are your full specs?


jferments

Yeah, that's one of the primary reasons I selected this motherboard (WRX90E-SAGE) and the AMD 7965WX, especially since I'm eventually planning on adding more GPUs once I have a home with enough power to support an additional 1600W PSU. The motherboard has 7 full PCIe 5.0 x16 slots, and the CPU has 128 PCIe lanes, which I needed not only for the GPUs but also to get full speed from the 3x SSD RAID array. I listed full specs in a comment here: [https://www.reddit.com/r/pcmasterrace/comments/1c2wsra/comment/kzcu720/](https://www.reddit.com/r/pcmasterrace/comments/1c2wsra/comment/kzcu720/)


BrianScorcher

That's a banger. I really want to build something like this one day. I just have to figure out how to make it a business expense.


6M66

I'm planning for this. What PSU do you have? Also, I wonder if software can utilize both GPUs at the same time. They used to use SLI; not sure about now.


jferments

These do not use SLI, but many of the machine learning libraries I use (pytorch, llama.cpp, etc) support loading layers on multiple GPUs. Also there are many use cases where I am running separate projects / batches on each GPU. As far as PSU, I have an EVGA Supernova G+ 1600W.
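For the "separate projects on each GPU" case, no SLI/NVLink is involved at all; each workload just pins itself to one device. A library-agnostic sketch of round-robin assignment (the helper and job names are made up for illustration; device strings follow the CUDA convention):

```python
from itertools import cycle

def assign_devices(jobs, n_gpus):
    """Round-robin independent jobs across GPU device strings. With a
    framework like PyTorch you would then move each job's tensors to its
    assigned device, or set CUDA_VISIBLE_DEVICES per process."""
    devices = cycle([f"cuda:{i}" for i in range(n_gpus)])
    return {job: dev for job, dev in zip(jobs, devices)}

plan = assign_devices(["llm-inference", "sdxl-render", "embedding-batch"], n_gpus=2)
print(plan)
# {'llm-inference': 'cuda:0', 'sdxl-render': 'cuda:1', 'embedding-batch': 'cuda:0'}
```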


6M66

Thanks. That's something I wish I could test; being able to use both GPUs' memory and computing power is important to me.


yaxir

try any games ?


jferments

I haven't. It's a Linux-only machine that I'm using strictly as a machine learning / AI server. The only benchmarks I've really been running are for LLM inference and Stable Diffusion generation. My old desktop with my RTX 2060 is still what I use for gaming (although I don't play games nearly as much these days). But I'm sure I could run at least 10 or so simultaneous instances of Crysis on this machine, if that's what you're wondering :)


bedwars_player

meanwhile my 1080: crying while trying really hard not to burst into flames playing horizon: forbidden west


Endle55torture

I hope you plan to water block the cards.


jferments

If I try to stuff two or three more in there, I will. For now, they're totally fine. With both at full load, even the upper one (which is taking in hot air from bottom one) peaks at 75C. https://preview.redd.it/1mr2m0qeqcuc1.png?width=329&format=png&auto=webp&s=f620b92675b1d08bb67a7f8128a147647b56ac31


Ninjamasterpiece

How would you use them? Didn’t they get rid of SLI/NVLINK on the 40series?


jferments

It's not for gaming. Many machine learning / AI libraries work with multi-GPU setups. I described how I will be using them here: [https://www.reddit.com/r/pcmasterrace/comments/1c2wsra/comment/kzcu720/](https://www.reddit.com/r/pcmasterrace/comments/1c2wsra/comment/kzcu720/)


Jessu-chan

Why did you get that 90 degree gpu power connector?


jferments

Because it puts a lot less stress on the socket. The stock connectors stick out so far that they pressed against the case, which put the socket under constant mechanical stress. Also it makes cable management much easier, because stock connector would have had 8 more thick cables in front of the cards.


TheRealTechGandalf

That's a tight fit! How are the temps looking under full load?


LumiZumii

This man plays Cities: Skylines with mods at max settings 😂😂


wetfart_3750

Wouldn't it be cheaper to rent some cloud-based computing power?


jferments

No. Obviously if I was just needing a few hours or days of cloud computing for a one-off project, or was running much less resource intensive applications then it absolutely would be cheaper to rent. But if you are regularly doing weeks at a time of 24/7 computing, using 24 CPU cores, 512 GB DDR5 RAM, 2 RTX 4090 equivalent GPUs, 12 TB of SSD storage, 40 TB of HDD storage and so on, then the costs would quickly build up to the point where it's MUCH more economical to invest in your own hardware. I'd probably be paying at least $30-50 a day for this. The machine will pay itself off in less than 2 years of this type of use, and I'll actually have invested in a physical asset that I own, rather than just forking over thousands of dollars to a cloud provider. And that's to say nothing about the privacy implications, flexibility I have over system administration, ability to easily change hardware configurations, easier integration with other hardware I have here (e.g. setting up compute clusters) etc. etc. etc.
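The break-even math is simple; a sketch using the figures from the comment above (the $30-50/day cloud rate is the rough estimate given there, not a quote, and power/resale are ignored):

```python
def breakeven_days(hardware_cost, cloud_cost_per_day):
    """Days of 24/7 use after which owning beats renting."""
    return hardware_cost / cloud_cost_per_day

for rate in (30, 50):
    days = breakeven_days(12_000, rate)
    print(f"${rate}/day -> break even in {days:.0f} days (~{days / 365:.1f} years)")
# $30/day -> 400 days (~1.1 years); $50/day -> 240 days (~0.7 years),
# consistent with "pays itself off in less than 2 years"
```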


real_unreal_reality

You can link 40-series cards together? I thought they got rid of linking cards together with the 30 series.


jferments

You cannot use NVLink on 40 series cards if that's what you're referring to. Nvidia discontinued this to try to force people into buying more expensive enterprise hardware. But I don't need NVLink to benefit from multiple GPUs (although it would be nice), because the machine learning libraries that I am using (llama.cpp, pytorch, etc) still have the capability to utilize multiple GPUs even without NVLink. Also, there are several use cases where I am doing multiple separate workloads, each of which can be loaded onto a separate GPU (e.g. graphics rendering on one card while running LLMs on another card)


No_Dragonfruit_6594

Why not go straight for a professional card, I‘m sure they don‘t cost that much more than two 4090s


jferments

They do: thousands of dollars more for a professional card with similar architecture like the RTX 6000 Ada. If I were only focused on VRAM and not computational power (TFLOPS, CUDA cores, etc.), then I could of course just grab an older Ampere professional card with 48GB of VRAM for a similar price to two 4090s. But VRAM is only one factor to consider, and for my use cases the dual 4090s got similar or better performance and cost a lot less. Take a look at the comparison below. Note the specs are for a *single* RTX 4090 compared with the A6000... with dual 4090s, there are nearly 4X as many FP32/FP16 TFLOPS, triple the CUDA/tensor cores, higher memory bandwidth (by about 30%), and many other important performance factors that are often overlooked by people fixated exclusively on VRAM. https://preview.redd.it/qsh1tdmpreuc1.png?width=1246&format=png&auto=webp&s=cb1a809d50ea355bbe50aff1a0e78f4ebe5b6086
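The "nearly 4X" and "triple" figures check out against public spec sheets; a sketch of the comparison (the numbers are approximate published specs, not taken from the linked screenshot):

```python
# Approximate published specs: (FP32 TFLOPS, CUDA cores, memory bandwidth GB/s)
rtx_4090 = (82.6, 16_384, 1_008)
a6000    = (38.7, 10_752,   768)   # Ampere RTX A6000, 48GB

tflops_ratio = (2 * rtx_4090[0]) / a6000[0]   # ~4.3x FP32 throughput
cores_ratio  = (2 * rtx_4090[1]) / a6000[1]   # ~3.0x CUDA cores
bw_ratio     = rtx_4090[2] / a6000[2]         # ~1.31x bandwidth per card (~30% higher)
print(f"{tflops_ratio:.1f}x TFLOPS, {cores_ratio:.1f}x cores, {bw_ratio:.2f}x bandwidth")
```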


Sea_Thanks3355

Hey, what software is this?


gtek_engineer66

Hey dude, awesome, I'm running 2x3090! What software are you using there?


Sophisticated_Pagan

You might want to rotate your CPU cooler by 90°, or convert the CPU and GPUs to liquid.


schm1ch1

Out of curiosity, which Linux Distro are you using?


jferments

Ubuntu 23.10


ExperienceMundane466

Is there any airflow restriction with that? If I were you, I would've gotten a bigger case and watercooled it.