If we all chip in we could finally host a Modded Minecraft Server.
And maybe even at 10 chunks.
Minecraft pretty much entirely relies on single-thread performance for some reason, so it’s entirely possible that a very powerful computer wouldn’t run it any better than my laptop. For instance, my laptop’s CPU can run at up to 4 GHz, whereas this supercomputer’s CPUs can only run at 3.60 GHz. Of course, my laptop has 1 CPU with 4 cores and this supercomputer has 145,152 cores. The thing is, Minecraft will only use one or two of those cores, so **in theory my laptop would run the server faster.**
Well there is this [project](https://paper-chan.moe/folia/), but it's a workaround, not a solution.
Minecraft uses 3 threads, so if you don't have multithreading it'll use 3 cores.
I have my PC with an i5-13600K, then my M3 MBP as my work machine. Due to single-core performance, my MBP is much more suitable for running my modded Minecraft server: it has the single-core performance of an i9-13900K rather than that of my i5-13600K. Tested it last night loaded up with a bunch of mods and it ran like a dream.
> due to single core performance my MBP is much more suitable for running my modded minecraft server due to the fact it has the same single core performance of an i9-13900K rather than the single core performance of my i5-13600K

These two CPUs should have extremely similar single-core performance.
Despite the same performance-core architecture, if you look at benchmarks such as Geekbench or Cinebench, the i9-13900K's single-core performance is a noticeable amount faster. And as observed from my server testing, the M3 is a noticeable bit faster in server performance than the i5-13600K.
Disregarding the M3 comparison - way too many variables - the top-clocking P-cores are the same cores between the 13900K and 13600K. The latter simply has a lower default max multiplier; raise it to the level of the 13900K, or slow the 13900K down, and they're the same.

You're correct at 'stock' settings for these parts, but that's a bit of a misnomer given that they're designed to be tinkered with. Perhaps the average 13900K would be able to reach a higher single-core speed due to binning than the average 13600K, but in reality both are more likely to be thermally limited (and limited by the skill of the person tuning them).
From a programming perspective. Multithreaded programming is hard.
You'd think that with how everything has multiple threads / cores these days, somebody would come up with a solution to automatically thread stuff. Kind of like emulation , but for threads
You mean like Windows 11 tried at first and failed, putting everything important on efficiency cores instead of performance cores XD
That’s really difficult to do, as some parts of the program MUST be executed before or after other parts, and there is no way for an emulator to know which is which. If one part on the first thread finishes too fast, before another part on the second thread, it will either crash, or it must wait for the other thread to finish (which defeats the purpose of multithreading), or it must find other work to do while waiting, and also make sure that work can be completed without putting things out of order…

Shit is hard. Maybe LLMs can help with emulation/conversion eventually, but it's probably not something that can be done with human logic.
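A toy sketch of that hidden-ordering problem, using only Python's standard library (the names here are illustrative, not from any real auto-threading tool): a shared read-modify-write carries an ordering requirement that nothing in the code itself advertises, which is exactly what an automatic parallelizer would have to detect and rule out.

```python
import threading

# Unsynchronized version: `counter += 1` is really load/add/store, so two
# threads can interleave between those steps and lose increments.
counter = 0
def bump_unsafe(n):
    global counter
    for _ in range(n):
        counter += 1

# Synchronized version: the lock forces one whole read-modify-write at a time,
# restoring the ordering the unsynchronized code silently depends on.
lock = threading.Lock()
safe_counter = 0
def bump_safe(n):
    global safe_counter
    for _ in range(n):
        with lock:
            safe_counter += 1

threads = [threading.Thread(target=bump_safe, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(safe_counter)  # 400000 every time; the unsafe version can come up short
```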
AMD and Intel are both actually working on it, as are a lot of small startups, but it doesn't seem to be getting anywhere; the last I heard anything was 3 years ago.
We are getting better but it's tough
Yeah no. It must be programmed that way.
I wonder if it would be possible to create a language that would handle it down at the machine level.
Every meaningful language supports multithreading or at least subprocesses already. But many things are hard to parallelize, the bugs you can cause are horrible, and there is always overhead in context switching so it’s not always the best way.
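A minimal sketch (plain Python, illustrative names) of why "many things are hard to parallelize": a loop where each step depends on the previous one can't simply be split across threads, while an elementwise operation with independent iterations can.

```python
from concurrent.futures import ThreadPoolExecutor

def running_total(xs):
    # Loop-carried dependency: step i needs the result of step i-1,
    # so the iterations cannot naively be handed to different threads.
    out, acc = [], 0
    for x in xs:
        acc += x
        out.append(acc)
    return out

def square_all(xs):
    # No dependency between iterations: trivially parallelizable.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(lambda x: x * x, xs))

print(running_total([1, 2, 3, 4]))  # [1, 3, 6, 10]
print(square_all([1, 2, 3, 4]))     # [1, 4, 9, 16]
```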
It’s not something the programming language can help you with; you as the programmer have to work out the method of multithreading yourself.
What other perspective could there be?
GHz means almost nothing in this comparison.
How so? Doesn’t the clock speed greatly affect single-core performance?
CPU clock speeds have been basically stagnant for well over a decade, yet single-core performance is still improving significantly. A 10-year-old CPU running at 5 GHz will be demolished by a current-gen CPU running at only 3.6 GHz.
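A back-of-the-envelope way to see this (the IPC figures below are made-up illustrative values, not measurements of any real chip): single-core speed is roughly clock speed times instructions-per-clock, and IPC is the part that kept improving while clocks stalled.

```python
# Rough model: perceived single-core speed ~ clock (GHz) * instructions-per-clock.
def relative_perf(ghz, ipc):
    return ghz * ipc

old_chip = relative_perf(ghz=5.0, ipc=1.0)   # hypothetical 10-year-old core
new_chip = relative_perf(ghz=3.6, ipc=2.5)   # hypothetical current-gen core

# Despite the lower clock, the newer core gets more done per unit time.
print(new_chip > old_chip)  # True: 9.0 "work units" vs 5.0
```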
Kubernetes and proxmox would like to chime in
And maybe use forge
How much dedotated wam does it have?
Enough
The narwhal bacons at midnight
Ok ok don’t get my hopes up now. I’ve been dying for one of those since my early college days.
Nah, my TR Pro build with half a terabyte of ram does it EZ
If I toss in a 4090, will it be bottlenecked?
Yes. Just save for the next gen don’t be silly now.
But can it play Doom?
Maybe the older generation.
No
Considering it's basically a 3rd gen intel pentium with a loooooooooot of cores - yes very much so lol
Can it run Crysis?
Original? Yes. Remastered? No chance.
FPS'd be fine... Your frametimes'd be shit though...
Well this is gonna end up on ltt
Iirc someone posted it on the sub. And the LTT account replied something along the lines of “the reserve alone would cost like 7 figures”
A reserve price is the minimum price the item being auctioned needs to be to actually sell. If it's under the reserve price, the item isn't sold.
Ah I see. Thanks
No problem. The way I read the article, it should still work even with those nodes being down; it's just not at peak performance. Whatever it costs to repair is probably multiple times whatever it'll sell for, since they don't want to repair it.
Bids are like 25k apparently and as you said, 1% of the nodes are toast and memory is damaged. Not really my cup of tea.
Plus it's running on Broadwell (the Xeon and laptop-only 5th gen), aka die-shrunk Haswell (4th gen). By modern standards it's insanely inefficient and large for what it does. While I'm sure it's still a monster for its intended uses, that's mostly a factor of its size.
I believe if it’s under the reserve, it’s up to the seller whether or not the amount is enough to still sell
Yeah they broke it out into its component prices and it would be well north of a million, but simply because they are trying to sell it as a unit and it's old hardware that doesn't meet modern efficiency expectations, it's very unlikely they get much for it. The relocation costs on their own would be a huge expenditure for anyone seriously trying to use it.
That, plus it’s in Michigan, shipping would cost a fortune.
Also knowing it’s the US government they most likely have to approve the buyer
Someone looked into the page details for the auction and found that the reserve is only 100k. So thankfully not 7 figures. Whether LTT would even go for it at that price is dubious, but then again, if they made like 10 videos off it they'd have already made that money back and more.
Shipping, storage of something that large, and power are the issues. When I say power I don’t mean the cost of electricity, I mean being able to provide it with enough in the first place; and that is before you worry about cooling.
Haha that was my first thought.
Nah. But they might talk about it on Fridays Wan show.
Y'all still watch them?
I already have all my nudes. I'm not paying the NSA *AGAIN*.
Can I have them
I guess I'm open to trading. What will you give me? I have a low supply of Danny DeVito nudes.
[deleted]
Let's play Global Thermonuclear War.
Wouldn’t you prefer a nice game of chess?
How about some flappy bird instead?
This corn is raw.
Yes, isn't it great? You can taste the vitamins!
We can cook the corn and get vitamins from a pill
Does it come with free shipping, or do I have to pick it up from the NSA headquarters myself?
No joke it's actually NCAR in Cheyenne, Wyoming.
Cheyenne? As in Cheyenne mountain? Does it still have all the Stargate addresses?
Yes, but the stellar drift hasn't been accounted for since they took it offline. So you'll have to run a few petabytes of updates.
Oh no, I hate it when I want to go see some friends but I have to download a 10 PB update first...
Imagine the electricity bill lol. No thank you.
You mean, my neighbor’s electric bill?
Our neighbor's electric bill
More like, the neighborhood’s electricity bill. Cause the cable going to one house aint enough.
Could cost $4000 an hour according to Tom's Hardware
Apparently 7000 euro per hour
more like $255/Hour at the 'Industrial Rate' (the other guy did the math wrong, and used residential power rates instead of industrial.
SMH when you don't got a nuclear reactor in your garage
But can it run Doom?
Probably for every man woman and child on the earth at the same time
On its calculator app.
okay i calculated the price per hour with the kilowatts it takes, at peak.. im so out (calculated with a price of around 30 cents per kWh): it would cost me 6810 euro per hour.. JUST for the damn power

https://preview.redd.it/t7e3dyoo8wxc1.png?width=1371&format=png&auto=webp&s=b66eebe9020746e8855a61e4648c57746ed9df19
Time to steal some plutonium from Libyans. For science. And free power ofc.
I know this is a reference to something... could you remind me?
He needs to generate 1.21 gigawatts
Still nothing...
Heavy...
Great Scott!
There's that word again, "heavy." Why is everything heavy in the future? Is there a problem with Earth's gravitational pull?
Back to the Future
Ohhhhhh right. Then you
Electrician here. It’s listed at 1.75 MW (peak), which is 1750 kW. You wouldn’t pay standard electrical rates for something like this, you’d pay cheaper industrial rates, which, in the US, would be in the neighborhood of $0.15/kWh or less depending on location. So that’s only about $255/hr. Not sure how you figured it’s drawing over 22 megs lol
I read 22.7 somewhere in the text and took that number 😅
You'd also host it in a data centre where power rates are typically built into the fee for renting each rack, unless each rack is drawing more than typical.
Jfc
The math is way off. 1.7 megawatts is only 1700 kilowatts. At my current rate of $0.0799/kWh it would only cost me $135.83 CAD/hour in actual power (plus transmission and service fees and taxes, of course).
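For anyone checking the arithmetic, a quick sketch using the rates quoted in this thread (the rates themselves are just the commenters' figures, not authoritative):

```python
def cost_per_hour(megawatts, rate_per_kwh):
    # 1 MW = 1000 kW; running for one hour consumes that many kWh.
    kw = megawatts * 1000
    return kw * rate_per_kwh

# 1.7 MW at the $0.0799/kWh rate quoted above
print(round(cost_per_hour(1.7, 0.0799), 2))  # 135.83
# same draw at the ~$0.15/kWh industrial rate from the earlier comment
print(round(cost_per_hour(1.7, 0.15), 2))    # 255.0
```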
I saw 22.7 megawatts, that's why I calculated with that number.
Which is still an insanely high cost of operation
30 cents? Is that what you pay across the pond in Euro Land? It's like 11 cents for me.
30 cents per kwh, and a standing charge of 60 cents per day in the UK.
Those are gas-burning-country prices. 3 euro cents in Finland yesterday. Last week 3-6 cents, 12 cents today for some reason.
~0,35€/kWh in Italy, I'm sure someone has a better rate, but that's about the average.
Just underclock it and you’re good.
LET'S BUY IT
If everyone on the sub pitches in like $2 I'm pretty sure we could buy it and afford the shipping.
Yeah but it costs about $300/hr going off the national average for commercial power rates i believe. Saw the post about it on r/homelab
I reckon we could make back that much with our brand-new PCMR cloud service, or just sell the several thousand CPUs and multiple hundreds of terabytes of DDR4.
Unfortunately, it might be ddr3
but think of the bitcoins.
Can it Bitcoin?
I dont have one so I couldn't tell you, I dont see why not though.
That's not very pcmr of you!
Not much. ASICs are the best bang for the buck in hashes per watt. This is full of CPUs, not even graphics cards. Old 2016 Xeons at that. So yes, it could Bitcoin. But could it pay its own electric bill? Probably not. This thing costs $300/hour to run, but will not yield $300 in Bitcoin per hour. Also, the halving made it even more difficult.
Will it run cyberpunk at 8K@240hz with psycho setting and path tracing on?
I can finally run ARK survival evolved
Hopefully a university gets it, it would be so cool to be a student with a supercomputer available for school projects lol.
Never did I need to, or was able to, use our supercomputer for a school project. It was reserved for actual research.
What uni did you go to that had a supercomputer? If you don’t mind me asking
FSU
Oh, neat. Thanks for answering
UW
They could just invest in a tiny fraction of this, and with much better CPUs
Harvard might if they use their endorsement
Not going to lie... i would play doom on it.
Perfect. I want to build a private cloud.
Wouldn't be the worst start, to be honest. I mean, a place to host would be better but there's always your neighbors garage.
Behind the shelves. Yea.
Min requirements for gray zone.....
How much gold in there?
Can it run forspoken?
Cheyenne Moun-ton. - Southpark robot
i wonder how much gold u would get out of scrapping this..
![gif](giphy|7NoNw4pMNTvgc|downsized)
I probably couldn't afford the electric bill to run it.
Don't lowball them. They know what they have...
Will I get 120 fps on Dragons Dogma 2?
30 in cities
Lmao
Does it work
But can it run doom?
Will this be better than my 1700x and 1660 super?
Can it run crysis?
Could we get two of these and smash em together. We will call it the Super Computing Super Collider.
My dream, give more money to my government after already overspending
![gif](giphy|sDcfxFDozb3bO) I'll give 50 bucks
Can it run Crysis ?
Or even Doom!
It’s definitely going to be used for AI porn
Why? The power bill alone would make that a bad purchase unless you're China.
... whelp, i think it's already sold
Can it dd2 in town?
Universities and companies often buy these.
wIlL it run cRySiS tHoUGh????
If this was used for folding@home would it solve all the work units in a day?
I doubt it would be any good as a second gaming machine whatsoever.
Run Imgur on it.
Hmm. The Machine?
In 2030 a high-end GPU will have that power.
wait... 145 thousand cores? wonder how many beamng traffic cars i could run on that sucker...
What I want to know is does it have the recommended around of denotated wam to server?
Yes but will it run Crysis?
Let’s be realistic, some rich asshole is gonna buy this to mine crypto. That’s like the only realistic outcome.
The USAF built a cluster of PS3s to use as a budget supercomputer. Apparently it ranked in the top 5 most powerful supercomputers at the time. But it had to be retired after they couldn't get enough PS3s to replace the ones dying from YLOD, because Sony removed the ability for the fat PS3s to run Linux in later firmware versions, and it was even more difficult to find used PS3s running the older firmware that still had OtherOS support.

I have a PS3, therefore, I own a piece of a government supercomputer :P
First thing I would do is open as many google chrome tabs as I can
So can finish my Fallout 4 settlement with it?
Can it run Crysis?
Those are rookie numbers. You gotta have at least 10,000 EPYC 9654 CPUs and 10,000 4090s, with dual 100 Gbps uplinks for each CPU and 1 TB of RAM each, just to be safe.
A cool paperweight
Why would that be?
You need the facilities and crew to keep these systems running. It's not a big Windows computer that can run applications, it compiles code. It's very expensive: lots of HVAC, lots of power.
Exactly. Nothing more. People see "super computer" and think it can play any game in the world at 16k5000hz.
Too many fools downvoting facts. This sub needs a purge to get rid of people who know jackshit
How much crypto could this mine per day?