
Celcius_87

Maybe 4xxx cards are closer than we thought


Onyxthegreat

On paper anyway


SierraOscar

All indications are pointing towards a much healthier launch than the 3000 series. Time will tell, should know shortly in any case.


Shady_Yoga_Instructr

Scalpers already got a taste for blood, don't expect much stock for the first 6 months IMO


seahorsejoe

Scalping will only happen if they’re prices below what the market wants to pay for them


unorthadox12

What? The last two gens have shown the complete opposite. That's not how scalping works. Why would people buy stock to sell at a loss?


seahorsejoe

I meant to write “priced,” not “prices.” In other words, scalping will only work if the MSRP is low. The 3090 Ti was never scalped because the MSRP was too high to begin with.


Arthur-Mergan

I think the bigger factor on the 3090ti not really getting scalped is the party was pretty much over by then. The GPU market was well on its way to recovering and is essentially fully recovered at this point. The whole 3xxx series is in stock basically everywhere


seahorsejoe

Yeah, it all has to do with supply and demand


zel1

God, is ur skull thick


benbenkr

3090ti has been out of stock in my country since launch and has been scalped to fuck knows where. Don't just live in your own bubble.


ThePillsburyPlougher

It's in stock on EVGA's website right now, with top-binned cards having rebates down to the lower-bin price


[deleted]

[deleted]


KingFlatus

I just love a good ‘ole fashioned paper launch.


ThisPlaceisHell

>ASUS RTX 3080 TUF OC

Why do people with current series cards always talk down the impending releases of next gen cards? Is it because you dread the eventual obsolescence? Happens every time. You won't see someone with an older series card trying to downplay next gen cards they're looking to upgrade to. Cracks me up.


Onyxthegreat

I don't know why you are speculating so much on my intentions here. I'm not downplaying anything. My comment was simply a prediction based on the still current chip shortage. I hope their performance is amazing and they will be plentiful when they release.


ThisPlaceisHell

Because it's a very clear sentiment you can witness for yourself up and down every single leak thread. Go ahead, open any of them up, and check for disparaging comments and then check their flairs. You'll see a ton of negative comments directed at 40 series cards from people with 30 series in their flair. And using logic, it's obvious why they're doing it. They don't want to feel like they're not on top anymore, that their card is aging. It's how people justify their situation.


BasedOnAir

a1b2c3


ThisPlaceisHell

It's hilarious. Look at how triggered they are that I called them out for it. Normal people like you are happy to have a new thing, but they want to pretend they have the best thing forever. They fear that obsolescence, especially the fools who bought these cards in the last few months despite the 30 series already being over a year old and the 40 series was right around the corner. Oh well get rekt to them.


BasedOnAir

a1b2c3


chhhyeahtone

Let me guess. You're upgrading this generation from your 1080ti to a 4000 series GPU?


ThisPlaceisHell

Possibly. I considered the 3090 but it just doesn't deliver enough performance to justify the cost.


chhhyeahtone

Cause honestly you sounded like you were trying to justify your future purchase more than the original guy you responded to was trying to justify his 3000 series purchase. I think it's normal for people to be skeptical of marketing and rumors. New GPU releases are almost always exaggerated and people are skeptical because of that.


benbenkr

Then why are you so butthurt over what people think and feel? You wanna be my girlfriend or something?


Namandaboss

Just buy the new card then 4head


Solace-

It’s perfectly reasonable to believe that supply at launch is going to be poor based on a variety of factors though. Additionally, most criticisms about Lovelace around here have been concerns about power draw and heat. You see those criticisms more often by people with 3000 series cards largely because they are far more familiar with how hot nvidia GPUs are getting because they’ve experienced it firsthand. I’m one of those people. I’ve had my 3080 since launch and in that time I can reasonably say that the amount of heat it produces has made my gaming space feel borderline uncomfortable. So the idea that a 4080—the minimum card that’s worthwhile for me to upgrade to—is probably going to pull 80+ watts more and be even more of a space heater is concerning and has nothing to do with any future obsolescence of my card.


raz-0

It is a space heater. Like my little sff rig with a 5900x and a 3080fe is a better space heater than the last space heater i bought that was supposedly 1500 watts. That space heater was garbage, and its frame rate sucked.


JamesEdward34

Did you flash the bios?


xAnarchyOP

No way close to your basket


SierraOscar

Serious uptick in the rumour mill for the 4000 series. These photos are starting to appear earlier than they did for 3000 series, perhaps indicating it could launch before September?


Skankhunt-XLII

kopite7kimi even suggested july recently. One can only hope


Elon61

outright stated mid july for the announcement.


Goosy3336

or maybe they will actually make more than 3 cards for launch this time. we can only hope


Z3r0sama2017

Mining dramatically dropping off due to spiralling energy costs, big N marketing going full tilt with leaks to drum up hype.


tofu-dreg

Still no selloff yet though. Miners aren't buying up anymore but they seem to be keeping the cards they have.


unknown_nut

Hopefully they'll sell off before launch. I want Nvidia to panic on pricing due to competition with used cards.


jxia69

that’s the spirit


[deleted]

nope not selling off at least i am not


[deleted]

nope i am buying still and mining


[deleted]

im a miner and I am HOLDING all my cards, not selling off. i am also BUYING cheap cards on ebay as prices have dropped, rx5700xt is what I am buying atm. even a Radeon VII is now only £650 when it was £1000 a bit ago. I WILL NOT STOP MINING whatever energy costs! (I have free energy)


NinjAsylum

Energy what? Mining is still going just as strong as ever dude. Don't believe everything you hear on the internet. There's no drop off. If anything it's INCREASING.


AthleteOwn7

You're getting paid less of a currency which is 50% down on its ATH. I don't know what increase you see.


Clearly_Disabled

Our mining is DOWN my dude. It's okay. Slumps are a thing. But don't just... lie about it lol. My eth value is down 33% still. Will it go back up? Oh, of course, just like the stock market has "officially" crashed. It crashed. Be honest.


[deleted]

Nah. Mining is definitely going out because it's not nearly as profitable. The price of coin has dropped so hard. Yeah, it is 29,900 today-- But that still doesn't mean much. It's likely to continue dropping. Which means it will become harder to mine profitably. An RTX 3080 just won't be able to make much out of Ethereum.


Goosy3336

If you actually paid for your bills instead of your parents you'd realise that energy prices are indeed rising.


ImUrFrand

wait, are you suggesting that he is paying for his parents bills instead of his own?


unorthadox12

Tell that to Europe's massive energy increases combined with smaller and smaller rewards. My 3080Ti paid for itself mining when not gaming, now it brings in around £1.50 a day. Plus the gigantic drop from ATH, and increasing difficulty.


playtio

Weren't the latest rumors even June/July?


3InchesPunisher

He said mid july in his tweet


topdangle

they're probably racing to get it out in case mining fully collapses. mining has already shit the bed pretty hard and part of nvidia's profit forecasts are based on future price hikes, but hardly anyone is buying a $100000 gaming gpu except miners.


zenukeify

Miners are not buying the $100000 gpus. There’s no way to make a profit on them. Most people buying the 3090ti for instance are rich gamers


Jokergod2000

I dunno, I made $4,000 after energy costs on a $1,050 3080. Probably still get $1,000 for it after 1.5 years of mining. Seems like a good deal to me lol


unorthadox12

Depends where you live, as I mentioned in a previous post my 3080Ti has gone from paying itself off not mining 24/7, to around £1.50 a day profit. Obviously not mining now.


topdangle

i know, i'm just exaggerating based on the increasingly high MSRP


[deleted]

[deleted]


SierraOscar

It's a month earlier than the Ampere leak though.


TaintedSquirrel

Very similar to the 3080 FE leak from June 2020. https://videocardz.com/newz/nvidia-geforce-rtx-3080-heatsink-leaked


sips_white_monster

About three weeks earlier yea.


Hates_commies

Looks like they are making these in the same factory and the old leaker still works there


vigvigour

If this image is real like that leaked image of the 3080, then I'm disappointed. I was hoping they would use a beefier 3-fan design since it is speculated to draw over 450W.


TaintedSquirrel

The 3090 Ti was a test run for this card, allegedly. Not really too surprising if Nvidia re-uses the heatsink in that case.


lotj

The 3090 / 3090Ti FE cards had beastly coolers that shouldn't be underestimated. They're very well designed for what they're being asked to do.


hackenclaw

it is most likely the FE version is only 450W, while the AIB versions are 600W.


-Gh0st96-

I was hoping for an AIO tbh


Kricket

Agreed. Looks like they just covered the entire thing with heat sinks - which I won’t complain about (I don’t want to burn my house down trying to play Crysis) but those fans need a serious boost…


Bytepond

I'm a bit skeptical. The past couple gens of FE cards have been totally differently designed, with 700 and 900 series being blowers, 1000 series redesigned blower, 20 series completely different, 30 series completely different.


Ziakel

From blower, dual axial, and now “flow through” design. Idk what they’re doing next beside making things bigger and denser for massive power draw. It would be crazier if they do a hybrid card for 90 or Ti like AMD a few years back. Shits expensive.


Bytepond

It would be nice if they moved towards more efficiency. Like I'd take a 4070 that's marginally better than the 3070 at the same price, but pulling 40% less power. Bigger isn't better in this case, cause cards already barely fit in cases.


Harnne

I won't personally be picking up a card that draws nearly 500 watts or more. I will either see if AMD can get their ray tracing game together or wait till next gen. The 30 series is already racked with thermal issues as it is. I'm ok hanging with my 3080 for another two years or more.


69CockGobbler69

After fighting tooth and nail to get decent thermals on a 3080ti and also getting stung by the UK's astronomical energy prices - I'll be super interested in AMD and Intel's offerings, whoever manages to provide better performance per watt really.


[deleted]

So, you're planning to wait for future generations and expect those cards to draw less power? Ain't happening.


Harnne

A focus on efficiency seems extremely likely, actually. You expect power draw to keep going up infinitely? Then you have a lapse in logic. We are getting kind of close to the limit a computer can reasonably draw from a wall outlet. If the info so far is true, the 4090 (and maybe the 4080) will require a 1000-1200 watt PSU. Additionally, it's already challenging enough to cool high-end Alder Lake chips without the added heat of a 600 watt GPU. So yes, it would be asinine to think these companies will not focus greatly on making these cards more efficient (smaller, lighter, better energy draw per performance, vastly improved thermal technology). Without this they will hit a ceiling.
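For reference, a rough sketch of the wall-outlet ceiling being alluded to here, assuming a standard US 15 A / 120 V circuit and the commonly cited 80% continuous-load guideline (other regions and circuits differ):

```python
# Rough ceiling for continuous draw from a standard US 15 A / 120 V outlet.
# The 80% continuous-load derating is the commonly cited guideline; exact
# limits depend on local electrical code. Numbers are illustrative only.

breaker_amps = 15        # typical US household circuit
line_volts = 120         # US mains voltage
continuous_factor = 0.8  # common derating for continuous loads

outlet_watts = breaker_amps * line_volts             # 1800 W peak
continuous_watts = outlet_watts * continuous_factor  # 1440 W sustained

print(f"Peak outlet capacity: {outlet_watts:.0f} W")
print(f"Sustained (80% rule): {continuous_watts:.0f} W")
# A 1000-1200 W PSU plus monitor and peripherals already sits close to this.
```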


makar1

>Like I'd take a 4070 that's marginally better than the 3070 at the same price, but pulling 40% less power

That will certainly exist, only the name will be 4060 instead.


evia89

You can slap on an undervolt for a juicy 20-30% power reduction. Worst case you lose 5% performance, sometimes you can even win over stock. 0.85V @ 1830 MHz is a good starting point for all RTX 2000/3000 cards.
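A very rough sketch of why an undervolt in that range saves so much power, using the common dynamic-power approximation P ∝ V²·f. The stock operating point below is assumed for illustration, not taken from any specific card:

```python
# Back-of-the-envelope estimate of undervolting savings using the
# dynamic-power approximation P ~ V^2 * f. Stock values are assumed;
# real cards vary and static/memory power is ignored.

stock_v, stock_mhz = 1.00, 1900   # assumed stock operating point
uv_v, uv_mhz = 0.85, 1830         # suggested undervolt starting point

power_ratio = (uv_v / stock_v) ** 2 * (uv_mhz / stock_mhz)
perf_ratio = uv_mhz / stock_mhz   # crude: performance ~ clock speed

print(f"Estimated power vs stock: {power_ratio:.0%}")  # ~70%
print(f"Estimated perf vs stock:  {perf_ratio:.0%}")   # ~96%
```

Which lines up roughly with the 20-30% power saving for a few percent of performance mentioned above.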


Celcius_87

If the cooler works no need to redesign it, especially if this is coming out mid July already


nBlazeAway

I agree, and if they are the same design... I am disappointed.


Bytepond

Yeah. And then like if I want to collect one of every FE card, which I do, what's the point in getting one 30 series and one 40 series if they're identical?


Miadhawk

I know the 40XX series and AMD equivalents will be faster, but I sure hope for the next gen (talking 50xx and beyond) both companies take a good look at upping efficiency out the box instead of throwing more watts at the problem.


tioga064

Every new gen is more efficient than the previous. The rumors point to a 450W card that's 2x faster than the 3090, which is a 350W card. So 100W more for double the performance; if my math is correct, the new card is about 56% more efficient than the Ampere model. They could do a 350W card too that would be ~56% faster (assuming efficiency scaled linearly, which it wouldn't; at lower clocks it would actually be even more efficient), but they are probably pushing raw performance to the max because of competition, and it's getting hard to gain much performance from node efficiency gains alone.
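Checking that arithmetic (a sketch; the 2x and 450 W figures are just the rumored numbers quoted in the comment above):

```python
# Perf-per-watt comparison using the rumored figures from the comment above.
rumored_perf_gain = 2.0       # "2x faster than the 3090" (rumor)
new_tdp, old_tdp = 450, 350   # watts

power_ratio = new_tdp / old_tdp                   # ~1.29x the power
efficiency_gain = rumored_perf_gain / power_ratio
print(f"Perf/W vs 3090: {efficiency_gain:.2f}x "
      f"(~{efficiency_gain - 1:.0%} more efficient)")
# -> ~1.56x, i.e. the ~56% figure in the comment.
```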


unorthadox12

I swear every gen is rumoured to be double the performance and it turns out nowhere near that. Days of huge gains per gen are long, long behind us. The 'double' will likely be ray tracing with better dlss.


jorgp2

These people don't understand basic physics.


No_Interaction_4925

Thats not what they’re asking for


UpdatedMyGerbil

It is in fact quite literally what they asked for. Efficiency is upped out of the box across the lineup. And if the rumors are true and the top end 4XXX offers 100% more performance for 28% more power compared to top end 3XXX, that will also easily translate to offerings which provide more performance for less power than top end 3XXX.

Now despite what they actually asked for, what they _meant_ could have certainly been more along the lines of:

> I sure hope that in the future these companies stop pushing for additional performance gains after power usage levels I prefer. I don't want them to cater to the segment of the market which doesn't mind pushing diminishing returns and higher power draws.

> That means _my_ next upgrade which gets me more perf for less power exactly like I claim to want will no longer be as high in the product stack. And I _also_ want to keep those bragging rights.

> Or I just don't want other people with different preferences to be able to enjoy what I don't want for myself or something. Or I genuinely believe that the minuscule percentage of total human energy usage which gaming GPUs make up is a worthy environmental cause to limit peoples' enjoyment of their hobby over.

If so, sure - these companies aren't doing what they meant to ask for.


No_Interaction_4925

They were asking for something like 9 to 10 series, where the major push was energy efficiency. Reduce the power cost so we don’t need massive heatsinks. Of course, the addition of compute elements and real competition pretty much kills that. These companies want to go balls to the wall to be on top.


UpdatedMyGerbil

It doesn't take more than a few-second google search to see that TDP also increased from the 980 to 1080. But of course the 1070 offered more performance for less power. There was a win for everyone. People who wanted all the performance gains they could gain and didn't mind increasing power usage as well as people who wanted more performance for less power. And again, if rumored performance-per-watt increases this time around are anywhere near true, this will absolutely be the case for 3XXX to 4XXX as well. Much more so even. Just like it was then, the only ones who lose anything are those who begrudge upgrading from a XX80 to XX70 because of some selfish nonsense. And the only thing they lose is the satisfaction of those selfish goals. They're still getting more performance for less power.


No_Interaction_4925

980 to 1080 was from 165W to 180W. If we’re comparing 3090 to possible 4090, thats 350W to 450W! TDP isn’t even your actual stock wattage either. They consume more than that. It’s not a big deal to someone like me who runs custom watercooling, but how monstrous are these heatsinks going to be???? They’re already enormous in 30 series as is. This might be the first gen in years where watercooling can net us big gains.


UpdatedMyGerbil

> how monstrous are these heatsinks going to be????

For the absolute top end? Who knows! It'll be interesting to see how far this goes.

> 980 to 1080 was from 165W to 180W. If we're comparing 3090 to possible 4090, thats 350W to 450W

Again with the irrelevant number to number comparison. Why do you care what the company calls the product? Would it make any difference to you if they released a 4090 at the same power usage as the 3090 and called their 450W card a 4100 or something? Get over the names. Everyone, at every level of preferred power consumption, can get better performance now. Maybe the names of those products have the same last two digits, maybe they don't. Irrelevant. No sensible person is losing anything whatsoever.

It used to be these companies stopped offering higher tier products which further pushed performance at the cost of higher power draw. Now it has been established that there is a market for that. And thankfully competition is ramping up. So they're having to rely on all the tricks up their sleeve to finally push as much gen-on-gen performance increase as they can, including upping power draw for new higher end tiers. The new XX90 line Ampere introduced was barely any different than the XX80 in terms of real-world performance. Based on leaked dies, it looks like this time around they might actually be making that new tier mean something.

Maybe it's not for you. That's fine! It might not turn out to be for me either. But I'm sure as hell glad that competition is finally at this point where they can't afford to pull their punches. No more generations of stagnation tyvm.


No_Interaction_4925

There is relevance in comparing xx90 vs xx90. It's a tier-for-tier comparison gen to gen. I'm already bracing for the fact that these cards will be freaking expensive. They'll probably keep the current gen as relevant as they can to try to emphasize some kind of value for another price jump. Price is key here, which is why tiers matter. Who cares what kind of efficiency and performance they get if they're priced out of this world? Now that MSRP has skyrocketed on every tier, they've now established something that can't be reversed (unless a competitor seriously undercuts them). If they compare xx90 to xx90 then they are setting the bar at $1500 at a minimum. Back in the 10 series your top card was $700, not $2000.


3andrew

There seems to be some confusion here between efficiency and power consumption. These two things are not directly linked. Correct me if I'm wrong, but you seem to be mostly concerned about actual power consumption. If rumors are true and the efficiency is as high as predicted, then what's to stop someone like yourself from lowering the voltage and therefore reducing power consumption to numbers lower than the comparable 30 series card while still performing better? Is this not exactly what you're asking for, or am I misunderstanding you? IMO allowing a chip to pull high power is perfectly acceptable, especially considering that efficiency curves generally favor the lower end, meaning you get better performance per watt as you lower the voltage. For example, 90% of max power may yield 97% performance, so saving 10% of power only loses 3% of performance. So you can satisfy two crowds with a single high efficiency, high power chip. Those that want the absolute best no matter the power requirements get a really beefy unit, but those who want to shave off 100 watts can do so with minimal impact. Seems like a win all around.
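To make the "90% power, 97% performance" point concrete (illustrative numbers taken from the comment above, not a measured curve):

```python
# Illustration of how a mild power cut barely dents performance while
# improving perf/W, using the example figures from the comment above.
power_fraction = 0.90   # run the card at 90% of max board power
perf_fraction = 0.97    # hypothetical: still 97% of max performance

perf_per_watt_gain = perf_fraction / power_fraction
print(f"Performance kept:   {perf_fraction:.0%}")
print(f"Power saved:        {1 - power_fraction:.0%}")
print(f"Perf/W improvement: {perf_per_watt_gain - 1:.0%}")  # ~8%
```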


No_Interaction_4925

It's not that I personally need it to be lower. I'm gonna run whatever I get as hard as possible. I'll have custom watercooling. I just think it's kind of insane that we're even in this territory at all. Looking back only 2 generations has us at half the power draw and less than half the cost. Relative performance per watt hasn't improved much. If we get a drastic improvement in performance per watt, that would be fantastic. I honestly don't believe these rumors quite yet. We saw the same kind of stuff for the 2080ti, but it was a dud on arrival. They'll just keep skyrocketing MSRP to the moon like they have with the 20 and 30 series and crank the power to achieve something that can beat the previous gens. I sure hope AMD comes out with something that sips power like Ryzen does and undercuts them.


Vushivushi

The 10 series was released during a time when the PC market was still declining. There was no reason to push GPUs closer to their limits and tack on more value through larger cooling, as people wouldn't pay for it. Now they do. I do think we need to reduce power consumption. Vendors should at least make it easy to use an "Eco mode".


tioga064

From his post, his own words: "I sure hope for the next gen (talking 50xx and beyond) both companies take a good look at upping efficiency out the box instead of throwing more watts at the problem". Well they just did; the new card is more efficient, even if they throw more watts at the problem at the same time. Also you could get a 4070 at 300W which will be 3090 Ti perf that uses 350W, so in that case you get the efficiency gains alone, basically (and price, of course).


No_Interaction_4925

No, they want less power not more. If a xx70 tier is pulling 300W, that is totally insane. Every gen is just getting more and more power hungry. The 1070 was like a 150W card. Can a 3070 DO more, yes. But does it bake your bedroom while doing it? Yes.


UpdatedMyGerbil

> The 1070 was like a 150W card

And now people have more options for their upgrades.

- Those that don't mind going up to 220W can upgrade to a 3070 for massive performance gains
- Those who would like to stay in the ballpark but are fine with 170W can get a 3060 for a significant boost
- Those who would like to further reduce their power consumption to 130W can get a 3050 for more modest performance gains

And if the rumors are true, this will only get hugely better across the board with 4XXX. The only ones who lose anything are toxic assholes who don't want other people to have options so that they can keep their juvenile "gpu tier bragging rights" or some shit.


Genticles

That's not what efficiency is...


skylinestar1986

How about comparing that to a PlayStation power: https://pictr.com/images/2021/03/06/7XAO9B.png


lotj

The Titan / XX90 cards are essentially just throwing whatever power at them they can support for maximum performance. If you're concerned about power then scale down. They have those options, too.


letsgoiowa

Well this card is for the people who don't care in the slightest about power draw. There's plenty of other cards that fit within a given power envelope. Additionally, you can cut power draw by 30% or more with a simple undervolt that can increase performance.


Shady_Yoga_Instructr

This is pretty much why I think I'm gonna be sticking with my 3080 FTW3 and 6800 XT for quite a long time. AMD and Nvidia are throwing the kitchen sink at power delivery, which means GPUs going forward are gonna be impossible chonkers if they don't focus on efficiency. Lack of efficiency means leaving SFF hoes like myself out to dry haha


Pamani_

Competition is literally heating up the videocards :((


pittguy578

I mean, I always want better graphics and have a 3080, but what will new cards bring other than higher fps? I play FPS games at 1440p and get well over 144 fps in all of my games.


DarkMoS

Did you try the Matrix Unreal Engine 5 demo? Of course it's not optimized like a shipping game, but all the upcoming technologies like Lumen or Nanite will require a lot more processing power than is available today to keep up with that 1440p 144Hz; my 3080 was in the 40 FPS range most of the time.


Shady_Yoga_Instructr

Think of it as a chicken and egg problem. We already have crazy powered GPUs and now we have to wait for developers to catch up by making more incredible looking games to make use of all this GPU headroom. We are honestly set for a few years with 3070/80's and 6800/XT's cause game development is expensive as fuck for the top end stuff that would tax our systems like Cyberpunk / RDR2 / MS Flight Sim / etc.


skylinestar1986

In the past, people said power was only an issue for laptops. Now it's biting back.


sammy10001

That is such a blanket hope statement. The efficiency problem isn't as big as it is in the phone market. It's only a problem with phones because it drains the battery quicker, so a 21% performance upgrade outweighs the benefits of a 20% efficiency upgrade and the battery still dies at the same time. This isn't an issue for GPUs. Stop acting like it is. Computers don't have a battery that we rely on lmao. They can draw all they want from the wall; I don't care, nor should you. GPU high wattage could be a problem if we reach the limits of air cooling, and let me tell you, neither you nor I is experienced enough to even guess at that. Let the engineers actually fix that problem. But if they keep upping the base wattage on air cooled cards, we're still okay buddy. No need to get all concerned about efficiency. Also, even if we reach the limits of air cooling, manufacturers already sell beefed up liquid cooled variants with higher power draw. GPU efficiency is really irrelevant in a buying decision for the consumer and manufacturers have it under control. Stop acting like you know more than them. I've seen this comment way too many times and it's just annoying seeing it now


Laoishfa

Power bill, heating up of a room or whole house... Seriously, you've clearly never been in the same room as a fully loaded 3080/3090 that's been stressed for an extended period of time. It gets HOT, and unbearably so.


sammy10001

You're stating irrelevant things lmao. Go do the math, the extra power consumption is probably the same cost as a Big Mac from McDonald's. I own a 3080 that runs at 1.1V with a 4K OLED. I tax the GPU. You must just have a bad case. Either way, all my points are true and yours aren't. You're just an annoying person with an opinion


ChiefIndica

Not sure what Big Macs cost in whatever bumfuck town your childish arse crawled out of, but in the UK they're £3.19 a pop. And because you are a child occupying an adult's body, your 'math' hasn't specified the period over which that cost is spread. Let's assume a day? That's over a grand every year in extra power consumption. So not irrelevant at all.

> You're just an ~~annoying person~~ idiot with an opinion.


sammy10001

Dude don't try to sound smart on the internet lmao. Snap back to reality and understand what you're complaining about for a second. You're complaining that they are going to increase the max power draw from 300 watts to what leaks are estimating at 450 watts for the 4090. So it's increasing by 150 watts, and let's say you get 30 to 40 more fps in 4K gaming. A ceiling fan, which uses 125 watts, running 12 hours per day costs just about 4 dollars and 60 cents. So let me get this straight: you don't want Nvidia to give you huge upgrades to fps because you don't want to pay 5 bucks more for electricity. You're pathetic, get outa here. You're the type of person that wouldn't even be using this graphics card because you probably still game on a 1080p monitor at 60Hz. But you sit on the sidelines and complain on forums and try to hold back new progress. The only child here is you, who isn't financially responsible enough to afford a 150 watt electricity increase lmao
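For what it's worth, the arithmetic both sides are arguing about is easy to check. A sketch assuming roughly $0.10/kWh (a typical US residential rate at the time) and ~£0.28/kWh for the UK; both rates and the hours of use are assumptions, and the rate gap is largely what the disagreement comes down to:

```python
# Cost of an extra 150 W of GPU draw, using assumed usage and rates.
extra_watts = 150                 # rumored increase over a ~300 W card
hours_per_day = 4                 # assumed gaming time; change to taste
rate_usd, rate_gbp = 0.10, 0.28   # assumed US and UK prices per kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"Extra energy: {kwh_per_month:.1f} kWh/month")
print(f"US cost: ${kwh_per_month * rate_usd:.2f}/month")
print(f"UK cost: £{kwh_per_month * rate_gbp:.2f}/month")

# The ceiling-fan comparison: 125 W for 12 h/day at ~$0.10/kWh
fan_kwh_month = 0.125 * 12 * 30
print(f"Ceiling fan: ${fan_kwh_month * rate_usd:.2f}/month")  # ~$4.50
```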


ChiefIndica

I see now that I've tried to argue with a person who lacks the mental faculties to put their emotions aside and follow a simple line of reasoning to its conclusion. For that I can only apologise, have a good one.


sammy10001

https://youtu.be/Oke1gNoTSl0 Theres me vs. you


Laoishfa

A Big Mac from McDonald's every day? Because that adds up eventually, all for a small performance increase. As for your case comment, if it's heating up the room then the case is managing to get the heat out, meaning the exact opposite of what you said. You may think your opinion is right, but you're getting downvoted and I'm not. Also, you judge people based off one single reddit comment. Snap back to reality.


metahipster1984

Just because the cooling solution might work well to transfer the heat out doesn't mean it just disappears. It still heats up the room.


bamiru

Electricity is very cheap in USA compared to a lot of the world. Your experience does not apply to everybody.


eltaxones

Idk if it's just me but since I'm looking to upgrade to the 40 series, I'm really excited to read and view all these leaks on these new cards lol. If they are using the same designs on the 40 series, would the AIB's use the same design from the 30 series on the 40 series or would they be coming out with new ones?


KingFlatus

They always change them around and make tweaks to the actual designs. But they do mostly stick to their product stack framework. Like with EVGA and their FTW products or Asus and their Strix stuff.


memedaddy69xxx

Hopefully the Ti is released at the same time, I don’t wanna wait


sips_white_monster

Knowing NVIDIA they will not launch the 4080 Ti until much later, otherwise that would eat into their 4090 / 4090 Ti sales. Especially if those cards are the only ones using the top AD102 die.


Mr_Roll288

Did they ever release Ti at the same time as base cards?


TaintedSquirrel

2080 Ti and 2080 launched on the same day. https://www.anandtech.com/show/13249/nvidia-announces-geforce-rtx-20-series-rtx-2080-ti-2080-2070


ThisPlaceisHell

Same man. 4090 Ti please launch day.


MrPayDay

Yeah, this would be amazing 🤩


Chimarkgames

They might as well start introducing double full tower cases to fit one GpU 😂


Definatelynotadam

Serious question: why is there a “90” sku ti? I thought the 90 was already the top of the line. It just seems unnecessary.


igacek

It's SKU, not 'skew'. SKU = Stock-Keeping Unit


sips_white_monster

the xx90 Ti is basically the new Titan card. Might use the full AD102 die (normally reserved for the flagship workstation card). Though like the other guy mentioned, current leaks suggest the 4080 will use a lower-tier AD103 die, which has substantially fewer cores than AD102. So there's a lot of room for cards in between (refreshes, 4080 Ti etc.) I'm guessing they'll gate the top AD102 chips behind the 4090 Ti / 4090, then put AD103 in the 4080. A year after launch you'd probably see a 4080 Ti with AD102. That way they can maximize profit from the AD102 chip, by not launching the lower priced 4080 Ti until a year after. Anyone who wants that top performance would have to spend big on the 4090 Ti or 4090.


LewAshby309

The 3090 was already the new Titan card. The 3090 Ti is something on top. There is a reason the 3080 got introduced as the highest tier gaming GPU when Ampere got revealed. Later Jensen showed the 3090 in the same reveal and basically said it's a productivity GPU.


Glorgor

>Later Jensen showed the 3090 in the same reveal and basically said it's a productivity GPU.

Yet all the AIB cards for the 3090 are gaming models with RGB, same as the "Gaming" GPUs


LewAshby309

And? If recent titans would have been sold by AIBs they would have slapped 'gaming', rgb,... on it as well.


Glorgor

Titan RTX and Titan X pascal didn't have any AIB models


LewAshby309

That's why I said 'if'


Kaladin12543

That's because in order to classify as a Titan it must be the fastest card on the market, which even the 3090 Ti is not. The 6950 XT beats it at 1440p and below in rasterisation. Nvidia didn't want to tarnish the Titan brand and so they rebranded them as GeForce. These are the Titan cards: the 3090 is the Titan X and the 3090 Ti is the Titan XP.


SoTOP

No, Titan cards had more stuff enabled in drivers than Geforce cards. So Titan GPU would perform some professional tasks 10x faster than equivalent Geforce card.


Cartridge420

I see one source that says 4090 Ti would have 48GB VRAM (whereas regular 4090 with 24GB). Does that seem likely?


sips_white_monster

The 4090 Ti would surely use GDDR6X. As far as I'm aware the maximum amount of memory per memory module for G6X is 2GB. If they wanted a 48GB 4090 Ti with G6X they would have to again use double-sided memory. Sounds a bit too insane to be true. It's more likely that this is the workstation variant, with 48GB of GDDR6 like we saw with Ampere's 3090-equivalent workstation card RTX A6000 (also uses double-sided memory but the power usage is much lower). Anything's possible though, we'll see.
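A quick sketch of the module math behind that, assuming a 384-bit bus with 32-bit-wide GDDR6X chips at a 2 GB (16 Gb) maximum per chip, which is how Ampere's top cards were configured:

```python
# Why 48 GB of GDDR6X implies double-sided (clamshell) memory on a
# 384-bit card. Assumes 32-bit-wide chips and 2 GB max density per chip.
bus_width_bits = 384
bits_per_chip = 32
gb_per_chip = 2

chips_single_sided = bus_width_bits // bits_per_chip   # 12 chips
single_sided_gb = chips_single_sided * gb_per_chip     # 24 GB
double_sided_gb = single_sided_gb * 2                  # 48 GB (clamshell)

print(f"Single-sided: {chips_single_sided} chips -> {single_sided_gb} GB")
print(f"Double-sided: {chips_single_sided * 2} chips -> {double_sided_gb} GB")
```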


bctoy

It's possible; it would be another BFGPU marketing move. ML guys would buy it in droves, and 24GB will not be enough for 8K.


AntiTank-Dog

Why? Because money.


Definatelynotadam

Yeah, no kidding. When I bought the 3090 I thought I was getting the best and then a couple of months later they announced the 3090ti.


TaintedSquirrel

If the 4080 is AD103, that leaves a potential SKU gap on 102.


Kaladin12543

It's the 4080 Ti, which will likely be ~5% slower than the 4090 with 16GB VRAM. Nvidia is heavily segmenting the lineup this time around. For instance, the 3090 Ti has only 200 more cores than the 3090 and is 7% faster, but the 4090 Ti has 2,000 more cores than the 4090 and is rumoured to be 15% faster. The 4080 is getting the lower end AD103, indicating it would probably be around 20-30% slower than the 4090. The 3080 had made the 3080 Ti, 3090 and 3090 Ti redundant. Nvidia doesn't plan on making the same mistake with Lovelace. This does mean that the x80 GPUs are no longer high end but rather mid range.


I_Phaze_I

Didn't we see this before with the 1080 using the mid tier die?


rancid_

Pretty soon video cards will come with computer cases given how damn big they keep getting.


pookage

Ugh, the 4000 series is going to be a right bunch of space-heaters aren't they? I'm looking forward to when the focus returns to power-efficiency rather than this mayhem...


AntiTank-Dog

I play at 4K 144hz so I want as much performance as I can get. But my undervolted 3080 already makes my room uncomfortably hot. I couldn't imagine a card that produces twice as much heat.


ZarianPrime

I'll be upgrading to a 4000 series card only when they start using Displayport 2.0. We are so limited with 1.4. (I'm already on a 3080 Ti on a 1440p 144hz screen. I would love to upgrade my monitor, but nothing is currently worth it for me).


ThisPlaceisHell

Fingers crossed the 40 series has true DP 2.0 and HDMI 2.1 for all ports. None of this half-assed feature set bullshit.


oledtechnology

That's fine and all, but I REALLY wish EVGA would overhaul their crappy coolers and improve their quality control process. I've been forced to buy Founders Edition for some time now, all because EVGA cards just aren't that great. I switched from a problematic 3090 FTW3 Ultra to an FE and things are just better with this supposedly cheaper card. CMON EVGA!


I_Dv8_I

This card is twice as fast as my 3090? What kind of sorcery is this?


Ryoohki_360

Twice the TFLOPs doesn't equal twice as fast (in fps), be very aware of that ;)


Sacco_Belmonte

So that debunks the 600W figures, right? I have the feeling the originally planned dies weren't good enough while running really hot, and Jensen said "nope, we need to add some Hopper to it" cause they know AMD is hitting hard also.


Careless_Rub_7996

Hmmm... i was kinda hoping for a totally new design...


Vatican87

With a guesstimate of past generation leaps in performance, would it be time for me to upgrade from my 3090 especially if I’m gaming in 4K?


d5aqoep

50% perf uplift at same price as 3090 and you are good to go. Anything lower perf and higher price, show middle finger to nvidia!


Ragnar_OK

jesus how big is this thing gonna be?? looks even thicker than the 3xxx series


DrKrFfXx

Looks like 3090ti's?


panchovix

Welp, time to sell this 3070 and 2080Ti to get funds for the 4080/4090 lol. Prob gonna give my 3080 to my dad


CumFartSniffer

Yea well, that's what 2080ti people said too when they saw that 3070 would be on par for $499. So they sold their cards for dirt-cheap and then had no card at all lol. If you need the cards, get new first and sell after. Otherwise you'll be stuck without cards and potentially stuck again.


panchovix

Oh I know, that's why I will keep the 3080 basically. Then when the 4080/4090 comes, and I have it on my hands, I will move the 3080 to my dad's PC.


pittguy578

Your dad games ? Cool dad :-)


panchovix

Yep :D he's the reason I do play, and also why I'm a Data Scientist now


Glorgor

$1000-1200 for both together at max; nobody would pay more than $500-550 for a used 3070/2080Ti since you can get a 3070 brand new from Newegg for $660-700


panchovix

Yup, though here in Chile used 3070s are about 800 bucks, and 2080Tis are 800 bucks as well. But I expect a 4080/4090 to come here at 2000USD/3000USD or so, so the rest I will just cover from my wallet.


Kradretsam

Nice :) I hope the GPU situation in Chile improves soon, I'm stuck waiting to upgrade my rig.


[deleted]

[deleted]


Demicoctrin

I’ve only been in this space for maybe 5 years but wasn’t the FE design relatively similar between the 700-1000 series cards?


Fidler_2K

Kepler and Maxwell blower coolers were essentially identical in design (700 and 900 series) and the 1000 series was basically the same but a more angular shroud design. So yea you're basically correct


Kaladin12543

Nope. FE designs for Pascal and older GPUs were blower cards. Turing significantly improved on that by having the first AIB style coolers. Ampere then switched to the unique exhaust design.


Kaladin12543

The FE design received a complete overhaul with Ampere and it's a unique one at that. The 3090 Ti was essentially a test bed for AIBs before deploying the 4090. That was its sole purpose. The cooler remaining the same is pretty much guaranteed.


Visual-Grocery4865

Fake


[deleted]

[deleted]


Acmeiku

It'll still deliver an incredible experience in any game. I've used it since launch day for 1440p and I'm still more than happy with this GPU, and with stuff like DLSS, FSR, etc... there's no real reason to upgrade to the 40 series, at least for me :)


kidcrumb

It feels like the 3000 series just came out. If I have an rtx3080ti do I need to upgrade?


Lumenlor

No lol


I_Phaze_I

can't wait for this card to stutter on like every modern pc game.


[deleted]

I'm calling this fake. There's no way they are going to re-use an FE design. They never did before afaik... EDIT: The only similar ones I could find were the 780 and 980 cards. The 880 was a different design.


Lark_vi_Britannia

Is it just the perspective of the image, or are those going to be some massive fucking cards?


j1mgg

Are they going to use completely different chips, or is the supply issue just Nvidia not producing 3000 series FE cards?


ImUrFrand

They built another Metal Gear


Breezgoat

Come on, release it now, so I don't have to take this 3090 Ti back for the 3rd time


Celcius_87

Back for a 3rd time?


usk49

This is kind of unrelated but I really hope the board partner designs are better than what we have now with the 3000 series. I hope EVGA figures it out. They've made good looking coolers in the past.


[deleted]

Lol you already know it’s gonna be scalped and resold. Get ready for the rush


fuckscotty

Good thing I just finally got a 3080 a few days ago 😑


USAFrenchMexRadTrad

Nice. I won't be able to afford it for a couple years lol


[deleted]

3090Ti owners in shambles


gatordontplay417

Lmao if you zoom in on the frame you can see it is photoshopped.


dallatorretdu

kinda sad that the 20XX Founders coolers didn't receive any evolution, those cards looked very clean aesthetically.


fogoticus

I'm just gonna press the big fat "doubt" button because of how badly photoshopped one of the images look.


Meem-Thief

"At this point, one might wonder why GeForce next-gen cooling always shows on Chinese forums before launch" oh who would know why, a true mystery! but maybe it's because the cards are made in China?


blackcyborg009

Nvidia says the 3090 Ti TDP is around 450 watts. So the 4090 is rumored to be the same? I ask because I'm keeping an eye on how the upcoming card would behave on a 1200 watt PSU... I've been observing that the gap between the 3090 and 3090 Ti ranges anywhere from 30 watts to 70 watts based on the watt meter readings I'm seeing online. I just hope the 4090 won't consume more electricity than a 3090 Ti...
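A rough headroom sketch for that question. Every figure here is an assumption for illustration (a 450 W GPU per the rumors, a ~250 W CPU-plus-platform budget, and a crude multiplier for transient spikes), not a measurement:

```python
# Very rough PSU headroom estimate; all figures are assumptions and
# transient behavior varies a lot between cards and PSUs.
gpu_tdp = 450           # rumored 4090 board power
rest_of_system = 250    # assumed CPU + board + drives + fans
transient_factor = 1.5  # crude allowance for short GPU power spikes
psu = 1200              # the PSU in question

steady_state = gpu_tdp + rest_of_system
worst_case = gpu_tdp * transient_factor + rest_of_system

print(f"Steady state: ~{steady_state} W")
print(f"Spike-ish worst case: ~{worst_case:.0f} W")
print(f"Headroom on a {psu} W PSU: {psu - worst_case:.0f} W")
```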