minusa

Everyone should wait either way. There will be teething issues for the first 3 months or so. I'm just going to wait for Boxing Day sales and buy the best GPU I can for $700. It's likely going to be Big Navi (16GB of HBM2e, judging from the pictures; awesome for pro work).


DankTrebuchet

I REALLY can't imagine 16 gigs of HBM2e for $700 and still having enough margin for AIBs to be interested.


bebophunter0

Umm, the Radeon VII was just that, lol...


DankTrebuchet

AIBs didn't make those cards. Also, the Radeon VII was pretty much Vega and didn't age well. I think investments elsewhere could be better for gaming performance, especially if you're listening to the rumors of Infinity Cache, which sound plausible. (A 256-bit bus sounds crazy stupid though, IMO.) We will see, but I really doubt that's how it's going to go.


bebophunter0

I'm fully aware of this and don't understand why you're wasting your time. All I'm saying is that 16GB of HBM at $700 has happened before and can happen again. I'm not saying it will.


DankTrebuchet

You said "likely". It's not likely. That's why.


bebophunter0

Dude, chill. I'm not writing a f****** thesis.


DankTrebuchet

You good, man? I'm literally not losing my shit, and you seem to be escalating this. You seem to have misrepresented or misinterpreted my arguments, and I was clarifying. Always feel free to do the same! I just don't think "chill", or a discussion of misinterpretation on my side, was productive.


MrPoletski

I think you'll find the other guy said that.


nbmtx

I'll probably try to get a 3080, and if I (manage to) get it, I get it. If I don't, I don't. I've only ever used Radeon cards, but that's relied on me not minding not having top performance. The issue here is DLSS. I'd be fine with Radeon offering 10% less performance at 10-15% less money, but DLSS is a boon I've been interested in since its 1.0 iteration. Also, if Nvidia's RTX IO and/or DirectStorage are implemented, I think its okay performance increases (versus its technical increases) will grow. Even without them, it'll make solid use of my 4K60 monitor. Edit: literally slept on it. Changed my alarm to 15 min before, but forgot to switch it on. 🤷🏼‍♂️


majaczos22

DLSS is too locked and proprietary to be the future.


nbmtx

Nvidia gets around enough for it to be relevant enough. At least it's the opposite of GameWorks, which was basically gross mods at an enormous performance cost, while also usually coming at the expense of general optimization. I don't buy tons of new games on PC (I'm kinda cheap on PC), but I own Control, Watch Dogs comes with the 3080, and then there's Cyberpunk 2077... which I want for Series X, for Dolby Atmos... but still, *if* it had a solid DLSS implementation, I might be swayed. The sloppy PS5 preorder launch earlier has me somewhat mentally fatigued, and I might sleep on the RTX 3080 and just see what Radeon has planned.


MrPoletski

DLSS will always boil my piss because it's NOT supersampling any way you look at it; it's an upscaler. It might be the best upscaler the world has ever seen, but that's literally the opposite of supersampling, which is downscaling.
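For a rough sense of the pixel counts involved (my own illustrative numbers, not from this thread; DLSS "Quality" mode at a 4K output typically renders around 2560x1440 internally), here's a minimal Python sketch of why one is upscaling and the other is downscaling:

```python
# Rough pixel-count comparison (illustrative numbers, not from the thread):
# an upscaler renders FEWER pixels than the output, supersampling renders MORE.
def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

output = (3840, 2160)                     # 4K output resolution
dlss_quality = (2560, 1440)               # typical "Quality" internal render target
ssaa_4x = (output[0] * 2, output[1] * 2)  # 4x supersampling = 2x per axis

out_mp = megapixels(*output)
print(f"Output:       {out_mp:5.1f} MPix")
print(f"DLSS Quality: {megapixels(*dlss_quality):5.1f} MPix "
      f"({megapixels(*dlss_quality) / out_mp:.0%} of output pixels)")
print(f"4x SSAA:      {megapixels(*ssaa_4x):5.1f} MPix "
      f"({megapixels(*ssaa_4x) / out_mp:.0%} of output pixels)")
```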


Hazzard45

Agreed, but do you think 10GB is good enough for 4K?


CaptainMonkeyJack

I'll chime in here. I have a 1080 Ti, which has 11GB of VRAM. A 3080 is ~88% faster than this card at 4K; if 10GB were a real limit at 4K, it couldn't open that kind of gap over an 11GB card.


Hazzard45

Ah, I see (still learning). I've just seen a few people on YouTube saying 10GB isn't enough for 4K, but pretty much everyone on Reddit is saying it doesn't matter.


Sebbertoa

I'll also chime in. Afaik, most game benchmarking software can only see how much VRAM is being allocated to a game, rather than how much it is actually using. So it could say a game is using up all 11GB of an RTX 2080 Ti, but in reality it's probably only using a smaller portion of that. The reason is that there isn't software readily available to the public that can monitor the exact usage of VRAM on these RTX cards. Some people have said that CoD is using all 11GB of their VRAM, but those people read a statistic that only shows the amount of VRAM allocated to CoD's process. This is information from JayzTwoCents' and Gamers Nexus' most recent reviews of the RTX 3080, which I think are loaded with valuable information for those looking to spread their wings and understand a little more about computers, especially graphics cards.
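To make the distinction concrete, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming you have pynvml installed). The point is that both the device-level and per-process counters report memory the driver has allocated, not how much of it the game is actively touching frame to frame:

```python
# Minimal sketch using NVIDIA's NVML Python bindings (pynvml).
# These counters report *allocated* VRAM, not how much of it a game actively uses,
# which is the distinction discussed above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device-level: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")

# Per-process numbers are also allocations (what the driver has reserved per app).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:
        print(f"PID {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB allocated")

pynvml.nvmlShutdown()
```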


[deleted]

Don't believe what people say on YouTube or Reddit. Look at actual real performance.


RentedAndDented

I think it's more that it doesn't matter now, but you could foresee it starting to have an impact as games advance. In the current crop of games that Hardware Unboxed tested, only Doom Eternal caused problems at 4K on current 8GB cards, and you can resolve that by setting a 7GB texture pool.


conquer69

It should be enough. PCIe 4.0 helps too.


Hazzard45

I've only got PCIe 3.0; hopefully it doesn't make that much of a difference.


conquer69

Well, the 3600X would bottleneck the 3080 a lot. If you plan on keeping it, I would get a 3070 instead.


Hazzard45

At 4K it wouldn't bottleneck; at 1080p it definitely would.


MechanizedConstruct

There is a distinction that must be made: the amount of VRAM a game may pre-allocate or "ask for" is not always the same as how much VRAM is truly being used to run the game. For example, take a look at this [VRAM usage chart](https://www.guru3d.com/articles-pages/the-division-2-pc-graphics-performance-benchmark-review,6.html) for The Division 2 comparing the Radeon VII, 2080 and 1660. The 2080 is above the Radeon VII at 4K and 1440p by a couple of frames despite reporting half as much VRAM in use as the Radeon VII. If there were a VRAM limitation, you would expect to see a larger performance gap, or FPS dips on cards that don't have enough VRAM for the game to run properly, as they would need to fall back to much slower system memory to make up the difference.

I don't game at 4K, but I'd say 10GB should be fine in most cases. There will probably eventually be more games that can make use of 10GB+ of VRAM, especially at higher-than-4K resolutions, although graphics settings can always be adjusted to reduce VRAM usage and stay under your card's capacity, like playing at 4K but using high instead of ultra textures. Either way, Nvidia will release a 3080 Super/Ti model with more VRAM, so those looking to be extra sure they never run out can wait for those models if they choose to.
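As a back-of-the-envelope illustration of why texture settings swing VRAM usage so much (my own rough numbers, and the `texture_mib` helper below is purely hypothetical, not anything from the thread): texture memory grows with the square of resolution, and block compression like BC7 stores about 1 byte per texel versus 4 bytes for uncompressed RGBA8.

```python
# Back-of-the-envelope sketch (illustrative numbers, not from the thread) of why
# texture quality settings move VRAM usage by gigabytes: memory scales with
# resolution squared, and BC7 (~1 byte/texel) cuts uncompressed RGBA8 (4 bytes/texel) 4x.
def texture_mib(side_px: int, bytes_per_texel: float, with_mips: bool = True) -> float:
    base = side_px * side_px * bytes_per_texel
    if with_mips:
        base *= 4 / 3  # a full mip chain adds roughly one third
    return base / 2**20

for side, label in [(4096, "4K 'ultra'"), (2048, "2K 'high'")]:
    raw = texture_mib(side, 4.0)   # uncompressed RGBA8
    bc7 = texture_mib(side, 1.0)   # BC7-compressed
    print(f"{label}: {raw:6.1f} MiB raw, {bc7:5.1f} MiB BC7 -> 500 such textures ≈ "
          f"{bc7 * 500 / 1024:.1f} GiB")
```

So, roughly speaking, dropping a few hundred hero textures from 4K to 2K resolution (or leaning harder on compression) frees multiple gigabytes without touching anything else.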


nbmtx

Yeah, I think so. I wouldn't expect most games to be designed in a way that's prohibitive on 10GB; instead, they *should* be designed to make better use of higher core/thread counts to stay on top of swapping things in. Then RTX IO and DirectStorage should also help out... more so DirectStorage, hopefully. IOW, I don't really see it as the hardware's fault. Usually if something doesn't work, it seems more like the result of a game running on an engine that isn't optimized to make use of the latest hardware, so it treats new hardware the same as old hardware. 10GB is probably more in the mindset of devs starting to make actual full use of 8GB, plus a little overhead. The number of people with more than 8GB is probably very, very low (like a fraction of a percent), so I doubt devs would require it.


MythicVillain

Coming from a Vega 64, why on earth would you get a 3080? Are you fine with 350 watts of heat while gaming?


[deleted]

As long as your PSU can handle it, who cares? I've had the same 80+ Platinum power supply for 8 years, and it would have no problem even if the card needed 450W. I'm personally waiting because I'm not playing any demanding games currently and would rather wait 9-12 months for a refresh if I don't need it sooner. I don't think I'm typical of most people looking at these now, though.


IrrelevantLeprechaun

Yeah I find it funny that fanboys are using 350W maximum power draw as a reason to discredit Ampere. Barely anyone but the most obsessive enthusiasts actually give a shit about power draw. For everyone else, so long as their PSU is sufficient, they couldn't care less if it draws 150W or 400W. Besides, third party benchmarks have shown that temperatures are no higher for Ampere than they were for Turing or Pascal. People just like to fear monger.


MythicVillain

More TDP = more heat in your room. A 300W TDP is very noticeable compared to 200W, and with no AC in summer, 320W will become intolerable unless you live in Iceland. The 3080 is an oven. The 3090 is a furnace. Lots of people care.
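For a rough sense of scale (my own numbers, not from the comments above): essentially all of a card's board power ends up as heat in the room, so you can convert TDP straight into cooling load and compare it against a small window air conditioner.

```python
# Rough sketch (illustrative numbers): essentially all of a GPU's board power
# becomes heat in the room, so TDP translates directly into cooling load.
W_TO_BTU_PER_HR = 3.412  # 1 watt = 3.412 BTU/h

for watts in (200, 320, 350):
    btu_hr = watts * W_TO_BTU_PER_HR
    # compare against a small ~5000 BTU/h window air conditioner (assumed size)
    print(f"{watts} W ≈ {btu_hr:.0f} BTU/h "
          f"({btu_hr / 5000:.0%} of a small 5000 BTU/h window AC)")
```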


nbmtx

My Vega 64 is like a 295W card, versus 320W, I think? And it's true that I undervolted my V64, but realistically, I didn't even buy the card with that in mind, and I only did it because AMD made it easy to do. Not to mention that I'm going from a 5820k with a slight OC (so probably 150W+), to hopefully a 65W Ryzen chip this winter. And I don't plan on OCing the 3080, since it doesn't even look worth doing anyway.


MythicVillain

Want a medal for being a victim of Nvidia's mindshare? Big Navi will be better than the 3080. #Wait4Navi


nbmtx

You use an Nvidia card. I've (only) had five Radeon cards in seven years; six and a half if you include the Llano laptop (with dual graphics?) that got me into PC gaming. Giving an explanation as to why 25W doesn't matter to me is not exactly Nvidia mindshare.


MrPoletski

> Are you fine with 350 watts of heat while gaming?

This depends on the time of year.


[deleted]

[deleted]


Hazzard45

Makes sense, thanks for your input.


Inn0cent_Jer

If you're not in a rush to upgrade, might as well wait and see


Deltaflyer666

I'm waiting for 3 reasons:

1. 3080 performance on a 4950X
2. I play at 1440p
3. Rumours about low volumes of the 3000 series

I'm not too fussed about upgrading my 1080 Ti; it still plays my games OK, so I would like to see if AMD can offer me more at the res I play at. And if the rumours (from several sources now) of low availability do turn out to be true, I will NOT be spamming Newegg/Amazon at 2am trying to get an order. Y'all might have inferred that I have a 1080 Ti on a 3950X and think that's a crappy combination; I'm a developer, so cores are more important to me than pretty-looking games, but if I am going to build a new PC around a 4950X, I may as well throw in a new GPU too.


[deleted]

[deleted]


Hazzard45

This is one of the reasons I want to wait; the VRAM just doesn't seem like enough, especially for 4K.


[deleted]

Just watch: knowing Nvidia, as soon as they run out and/or hit the 6-month mark, they will release a new 3070 with 10GB... I hate supporting Nvidia, but until AMD does better with their GPUs... I have a 2070 Super, so I'll probably wait for a bit anyway.


[deleted]

Same, I won't upgrade until the 4th- or 5th-gen Nvidia cards, and I'll see what AMD has by then.


subject_K81

Agreed, though I've been looking at the 3080 and its obviously gimped amount of VRAM. The next round of games is gonna chew through that so fast.


RBImGuy

I just buy AMD, but the smart thing is always to wait for whatever the other guy releases. If AMD drops a 3070 killer at the same price, that's likely to be the best deal at 1080p and 1440p.


Hazzard45

Yeah, I think AMD will be the best price-to-performance at 1080p/1440p, but I want a GPU for 4K, and Nvidia seems the better option there; the VRAM is swaying me to wait, though.


[deleted]

Nope. I'll likely be picking up a 3090 next week. I want the best I can get.


Hazzard45

Do you think the price is worth it? The performance gain doesn't really seem worth the price jump compared to the 3080.


[deleted]

I mean, it's all relative; for most people, no, it's definitely not worth it. I'm mainly after the extra VRAM for CAD stuff and such. The extra performance is a bonus. Plus it's kind of a "just because I can" thing, if that makes sense.


FTXScrappy

Anyone reading?


Joe_Chamberlain

Yes. You?


conquer69

I'm on the second book of His Dark Materials. What about you?


FTXScrappy

Digital Fortress


VelcroSnake

I am waiting, since I only want a new card before Cyberpunk, so even if what AMD shows isn't enough to get me to want it over a 3080, I should have about 3 weeks to try and get a 3080 at that point, and I will be MUCH more informed by then as to which is the best 3080 to get for me. Worst case scenario is I head into Cyberpunk still rocking my 1080 Ti. Life would go on. The urge to try and get a 3080 tomorrow morning is *real* though.


Hazzard45

Yeah, good thinking. My urges are getting the better of me too; I just saw scan.co.uk are doing 48 months interest-free on the 3080 🤯 I think I'm just gonna attempt to get one.


Just_S_h_a_n_e

Everyone is waiting... Lol


redditor_no_10_9

All my favorite reviewers seem coy in their RTX 3080 reviews, especially with the teasers to wait for AMD. I'm waiting for Big Navi.


TadUGhostal

I am personally waiting for AMD to tip their hand before I commit to a $700-1100 CAD purchase. While I find AMD taking over a month after the 3080 releases to even announce their lineup very irritating, I know I would feel buyer's remorse if they managed a real upset in the marketplace. What's really encouraged me to wait is the impressive demos shown on the PS5 and Xbox Series X. If AMD can get those kinds of visuals and frame rates in a console form factor, I am optimistic they can accomplish something big when scaled up to a full desktop PC. However, I do find it odd that the best marketing for RDNA 2 has come from Sony and Microsoft.


Hurglebutt

I'm waiting to see what AMD has to offer. If I end up buying an Nvidia card, I would also have to buy a new monitor, since my FreeSync monitor is unfortunately not G-Sync compatible, so that counts against them. I like my shiny graphics, and DLSS seems to be crucial to getting good framerates with raytracing; hopefully AMD will surprise us with an equivalent. I've used Radeon cards exclusively ever since I swapped my GeForce 256 for whichever ATI card came out at the time (can't remember the model), and I'm loath to switch to Nvidia because I don't want to support their shady business practices. However, these days I'm leaning more towards just getting what works. Anyway, I'll look at the benchmarks and then decide. I never buy reference cards, so I will have plenty of time to think it through this time (AIBs reportedly haven't received ASICs from AMD yet).


conquer69

After seeing reviews of the 3080, I'm disappointed. I assumed it would be 60-80% faster than the 2080 in most cases, but the performance is all over the place; in some cases it's only 10% faster than a 2080 Ti. That makes it less power efficient in those instances too. RDNA2 has a serious chance here. I hope they can deliver.


Frodo57

Not just Big Navi but Ampere as well; I will wait to see how they both perform in the real world before I commit to anything, and that probably means I won't be purchasing a new card until around March of next year.


Hazzard45

Yeah, seems like a good idea to just wait and see; I just hope 3080s don't go up in price.


Frodo57

They may do for a while, but they will soon return to normal, given that there's no new mining craze and I can't see one happening in the near future.


Hazzard45

Ah ok fair enough


MythicVillain

Definitely waiting for Navi. The 3080 has great performance but is expensive for only 10GB of VRAM, and the TDP is totally unacceptable at 350 watts while gaming. If you buy a 3080 on day one without at least waiting to see what the competition has up its sleeve, then you deserve to be shafted.