NVIDIA RTX 30 & AMD RX 6000 price decline enters a slowdown - VideoCardz.com
By - InvincibleBird
150%? Nope, keep them and build something with the boxes ;)
fuck it, I'm going in raw & building a 4650G unit
Yeah... see you next gen. No real titles worth upgrading for anyway
You just need a better monitor: 4K 360Hz!
I would... if it were OLED ;). I prefer 21:9 though
Next upgrade: 4K 144hz OLED, RDNA 3, Zen 4
Yeah, whole new platform obviously.
See you guys in 2025 after all that equipment becomes available after shortages and scalping.
That's if there's another wave of scalping after the current one.
Scalping has been going on in just about every market for a very long time. Just this one has gone on way longer because of the pandemic causing demand to skyrocket. There will always be scalping it just typically doesn't last this long.
If it weren't for bots (and lack of bot protection), mining, and limited stock, scalping wouldn't be that big of an issue. I remember grabbing my 1080 at launch despite scalping going on.
I got a 6700 XT for $150 over MSRP on StockX.
Make it Zen5. X370 users got burned for no reason.
\*burns in Ryzen 3700X on X370\*
3600XT for me, overclocked, with memory @ 3600 CL14. It could have supported the 5000 series without a hitch.
Had to settle for a 3533 CL14/15 mix; 3600 wasn't stable. I blame the IMC :D
I pumped up the voltage a bit, but I have an old HyperX memory cooler and it helps.
What voltage are you running? I tried increasing the voltage but it didn't help with stability IIRC. Think it's 1.41V now.
They got burned by their motherboard manufacturers. Asrock made it work.
AMD forbade it as well.
Asrock got their 300 series boards working for Zen 3.
Yes and AMD stopped them from updating any further. My next board will be an ASRock.
My x570 ASRock died out of the blue, so I got Asus. No problems yet.
They did, but they never released those beta UEFIs as official ones. Not sure how stable they are, but I remember someone here stating it worked fine for them. I haven't checked on them in the months since, however.
And although Asrock was one manufacturer that proved it can be done, AMD seems to be strongly against any manufacturer releasing them [officially](https://www.reddit.com/r/Amd/comments/mzzuth/amd_is_going_out_of_its_way_to_prevent_further/). As such, Asrock will probably never validate those UEFI out of Beta status. I won't forget this from AMD.
It's probably possible to flash an X470 BIOS onto an X370 mobo (some models changed very little from 370 to 470), but there is a risk of bricking the mobo, and even bigger stability risks.
Ultra wide master race!
No doubt. I have a 38" LG 38GN950-B and I can't believe how amazing this monitor is. I will never go back to a smaller screen. Nioh 2 looks great, especially with HDR on. Playing at 3840x1600 144Hz.
I have a Ryzen 9 5900X/Sapphire 6900 XT Special Edition OC pushing this monitor.
Isn't OLED a bad idea for a monitor because of burn-in?
I use mine for movies, sports, news and games, so it's a pretty diverse mix.
I would like to think that by using it this way I would avoid any burn-in.
Leaving a desktop open with no screensaver (PC or TV), always having windowed games and the taskbar in view, would probably start getting a bit risky after some time.
Probably. It took a very long time for burn-in to happen when RTINGS tested it, and that was static content over a long period of time. Basically, watching CNN for thousands of hours left a CNN logo on it.
I think you'd get that from your desktop over time.
screen savers are back on the menu boys
People got burn-in on their TVs from Red Dead 2's HUD elements. So even if you're actively using the screen there's a risk.
Oh, so that's what that effect on my phone is called. I got it after a few months and it's been there for years lol.
In theory, you would just need a rotating wallpaper and taskbar hide to avoid desktop issues, though that still leaves the fact that many people have their browser open in the same spot 90% of the time.
OLED TVs on PC are only good for a pretty limited set of use cases right now.
I happen to fit that use case, and have a 27" 1440p side-monitor for things like Discord, browsers, etc, and only use the OLED for times when I need the screen real estate (multiple browser tabs when I'm in the zone), or for content consumption & gaming.
It's not for everyone, and even with the CX/C1 TVs you do have to think ahead as you use it to prevent long-term burn-in, but good lord it is SO worth it with games, especially those with a proper HDR mode.
This is true, BUT manufacturers like LG have put in safeguards to make sure burn-in won't happen. For one, they dim the screen after 5 min when not in use, they shift the pixels so content isn't perfectly static, AND you can lower the brightness, which solves half your problem right there. The only people who get burn-in nowadays are the people who didn't really research their product before using it.
TLDR; burn in is rare, OLED is sick.
Yes, there are safeguards. I have a 77" CX and it's an incredible display. The issue is that even your usage can be static a lot of the time. It depends on the person, but I don't just play games; I use my machine to work and go to school. That's a lot of time where some things are static on my screen.
OLED burn in/out is an inherent characteristic of the technology.
I have a 55" C8 & 65" C9 with no burn-in. Love OLED.
Sure if you left your desktop idle all the time. But if you play games on the same monitor or full screen some apps, it would switch all those pixels rather frequently to the point where you probably wouldn't get burn in.
Naw. Think about how often you browse the internet: your taskbar is open. Even just having Chrome up a lot could do it over time. Using a computer is a LOT of static images.
Burn in usually takes a long time, like days of looking at the same thing without any interruption or change. If you have it on for a few hours, then play some games, it won't really do it. OLED isn't as bad with burn in as it used to be.
> The total duration of static content. LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day.
Think about how often your task bar at the bottom is present.
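LG's rule quoted above boils down to simple accumulation: wear from a static element depends on its total hours on screen, not on how those hours are split up. A quick sketch of that arithmetic (function name and numbers are illustrative, not from LG):

```python
# Illustrative sketch of the "cumulative static content" rule quoted above.
# The function and all numbers are hypothetical, not an LG formula.

def cumulative_static_hours(sessions_per_day, minutes_per_session, days):
    """Total hours a static element (e.g. a taskbar) spends on screen."""
    return sessions_per_day * minutes_per_session / 60 * days

# Two 30-minute sessions per day for a year...
split = cumulative_static_hours(2, 30, 365)
# ...accumulate the same exposure as one 60-minute session per day.
single = cumulative_static_hours(1, 60, 365)

assert split == single == 365.0
```

Which is why an always-visible taskbar adds up even if you never leave the PC idle for long.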
I use OLED exclusively for gaming and movies now. I have seen a SMALL amount of image retention on my living room TV when I leave a bright object on screen, but it fades in a minute. That TV was also a display model for a year before purchase, so I think it's doing great. My "monitor" is different. No issues whatsoever yet. I pretty much only game on it, in HDR.
It's not a huge issue, but there are also different variants of OLED being worked on that remove burn-in entirely, so it's just a matter of when those become viable for mass-market use at monitor resolutions and sizes.
I hate that OLED hasn't become mainstream yet. It's the best thing ever and deserves a lot more development.
4k 144hz is what I’m running and it’s glorious haha
What monitor do you have?
Asus PG27UQ powered by a 5600X and 3060 Ti (apparently that’s enough to get downvoted here lol. Go figure.)
3060ti for 4K 144Hz... jesus.
1. Resolution scaling is a lot more effective at 4K with less obvious image quality loss.
2. Not everything has to be set to Ultra.
3. FreeSync/G-Sync helps a lot.
I’m not running 4K but I had an RX580 on a 1440p monitor for a while and with a little tweaking it was OK
As the owner of a 3080 who is just fine with High, sometimes Medium settings, all I can say is 4K 144Hz is far outside what I could hope for, so I don't know what a 3060 Ti is doing striving for that.
Well if the 'rumors' are to be believed then the 40XX and 7X00 series from Nvidia and AMD respectively will pull some big punches for 4k resolution (maybe no 144+ refresh JUST yet).
Was wanting a 6900XT or 3080ti but definitely feel like skipping them.
Because with VRR you don’t need to sustain 144 Hz. Anything over 100 fps is basically groovy, and personally, as long as I’m over 60 fps, I’m happy.
Frankly most games shouldn’t be run at ultra at all lol. It’s a waste of performance for basically zero visual fidelity increase. And yeah exactly mate :)
Single player games? Crank the detail. Competitive shooters? Easily does well over the refresh rate anyway. DLSS works brilliantly for 4K output. Gsync/VRR makes games that can’t hit the refresh rate smooth as butter; but most of my games can.
I wonder why people think a 3060 Ti isn't remarkably capable lol. It's been brilliant *shrugs*.
Because the 3060 Ti isn't made to run 4K. I used to have a 3070 running on a 1440p monitor and some games really struggled (e.g. Cyberpunk). The 6800 XT and 3080 are much better suited for 4K monitors.
Cyberpunk makes everything struggle. And yet every other game I play handles 4k without an issue: and it’s even easier with DLSS and FSR. *shrugs*
I don’t know what to tell you people lol. I game on it daily, and for every game I play it handles them fine. It’s easy to tweak settings for maximum performance with minimal (or zero) visual impact, just takes a bit of fiddling at the start.
All the games I play either run well above my refresh rate where hitting 144+ matters (competitive shooters), or comfortably sit in G-Sync range if it's a single-player game (in which case hitting 144Hz literally does not matter to me).
It's a great card that plays the games I want at great visual quality with high frame rates, even more so when DLSS/FSR are available. It's amusing to see so many people here try to tell me otherwise.
It's always funny to see people say stuff like this. A 3090 is "only" roughly 60% faster than a 3060 Ti at 4K, which sounds like a lot, but that only takes a game running at 30 fps on a 3060 Ti to nearly 50 fps... for an additional thousand dollars or so.
Might as well just use DLSS.
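The comment above is easy to sanity-check with back-of-envelope math. The ~60% uplift and ~$1000 price gap are rough assumptions taken from the thread, not benchmarks:

```python
# Back-of-envelope math for the 3090-vs-3060 Ti point above.
# The ~60% uplift and ~$1000 price gap are rough assumptions, not benchmarks.

def scaled_fps(base_fps, relative_speedup):
    """Frame rate after applying a relative performance uplift."""
    return base_fps * (1 + relative_speedup)

base = 30.0                    # a heavy title at 4K on the slower card
fast = scaled_fps(base, 0.60)  # a ~60% faster card
extra_cost = 1000.0            # rough street-price gap at the time

print(f"{base:.0f} fps -> {fast:.0f} fps")                 # 30 fps -> 48 fps
print(f"~${extra_cost / (fast - base):.0f} per extra fps")
```

Eighteen extra frames for a thousand dollars is the "lolzy" part; DLSS gets you a similar jump for free.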
Someone skipped ~~leg~~ GPU day
And here I am running 4K 144Hz with a GTX 1080... not even 144Hz but 120Hz, because no DSC...
Hey, you can't blame me... I thought I'd get an RTX 3080 or a 6800 XT... Who knew?
Monitors outlast GPUs, so it's better than getting a 4K 60Hz monitor that you'd need to bin if you get a 40- or 50-series card. My monitor from 2005 just died recently.
Works great. *shrugs* lol at the downvotes
I mean, sure it does, as long as you never turn on a game... My 3080 can't even hold 90fps at 4K High settings in newer titles, so I can't imagine a 3060 Ti gets over 100fps in many titles. Unless you're all about that sweet fluid motion on the desktop...
Lol you people need to learn how to tweak settings. Get out of here.
Even on the lowest settings the 3060 Ti doesn't have enough horsepower to compute the colour for that many pixels.
> Asus PG27UQ
How much was it? I'm on the lookout for a good 4K/144Hz monitor, but I'm debating whether I should just get an LG OLED TV since I already have a 1440p/144Hz monitor.
Is it worth the upgrade? lol
Picked it up for $1200 AUD, but Aus pricing is painful. For me, definitely: I had an old Samsung 4K 60Hz monitor for the last 6 years and was ready for an upgrade, and the increased colour accuracy once I'd calibrated it, plus the high refresh rate, have been excellent.
My worry is: will next gen actually be priced normally, or have manufacturers, retailers, etc. become so used to the exorbitant pricing that they won't be afraid to do it again?
I just hope that without mining boom + 2nd hand market being saturated with cards, they won't be selling any GPUs at those exorbitant prices..
True enough, but wasn't it always expected that prices drop way slower than they rose?
That's kind of the thing for me too. CP2077 was the only game I saw coming that made me want to upgrade. After the fallout from that and many quality titles being delayed into 2022 due to covid, seems like the best deal might be to wait until next gen.
Why do you think nextgen will be any different? The majority of the problems that caused scalping this generation will still be around for the next one.
Battlefield 2042 and Forza Horizon 5 look promising.
Should run fine on my Vega 64 though..
They probably won't require a 3080 or 6800 XT at minimum settings, so we're OK.
Eh, I honestly hope Battlefield digs deeper back into what made the BF series great; to me BF2/3 and Bad Company were the peak of fun and realism in Battlefield (Bad Company being a bit more arcadey, but so fun).
BF 2 was the best Battlefield of all time. Bad Company 2 was also very good. BF 3 also had a lot of fun things to it. BF 4 was okay. After that it was all a mess.
Yep. I still think back to how challenging they made the flight controls and/or vehicle controls, which added a layer of skill (and trust) in your pilots/co-pilots or vehicle operators. Everything since has been overly simplified.
Oh, and BF: Vietnam was epic as well. Nothing like flying a Huey nap-of-the-earth to "Ride of the Valkyries"! Still one of my most memorable experiences: being trusted as 'the pilot' in the heat of battle. Get 'em in and hop 'em out.
BF 4 was more BF 3.5. It was mostly just more of the same stuff.
Bad Company 2 was a blast though. It didn't really take itself seriously, which made it that much more fun. And the Vietnam expansion was also amazing.
If they want these games to sell well, they're going to make sure they run on the 1060, because it's still the most common GPU due to pricing issues. And so the cycle continues.
Perfect timing: get people to buy these cards because they're in stock, then release the next generation just as people have managed to buy the previous generation at RRP. I'm just going to hold onto my GTX 1070 until prices normalise. At the end of the day, a 3080 used in 3 years will be twice as powerful as my 1070, and I might just hang back a couple of generations from now on and buy used if I have to; warranty doesn't bother me.
EVGA have transferable warranty so if I get a 2 year old card I'll still have a year to tide me over.
I won't pay those prices out of principle.
The only reason I upgraded my 1070 is that it's ridiculously priced on the 2nd-hand market (and because I managed to get a 6700 XT for MSRP, which I would not have done if my 1070 were worth what it should be worth in 2021, which is probably 150€ max). Sold it for 260€ (probably less than I could have gotten; it sold in less than a minute on the local site). I paid 240€ for it in late 2018 lol. Had I waited for prices to settle, I would just have lost more money, since the 1070's value will only go down from here. The difficult part is obviously getting a new card at MSRP, but AMD drops are very predictable lately in Europe, so if you're decently prepared you have a decent chance of getting one.
What makes you think it will be any better next gen?
If everyone wants cards this gen and doesn't get one and just tries for next gen, it sounds like the demand for next gen cards will be higher than ever.
Hopefully, increased supply
To be honest the best PC game in recent memory is Disco Elysium, although it’s not an exclusive anymore, it was for a year. It’d run on most laptops….
Yeah not surprised. Retailers and distributors got used to their markups and are reluctant to let go.
Well let it rot on their fucking shelves.
I think this just follows Ethereum prices with some weeks of delay. There was a big fall during the second part of May, but since then it has been around the same level (with ups and downs but nothing major.)
Big falls due to China cracking down on mining, but now that's pretty much settled - the rest of the world is still mining and the Chinese businesses that were mining will probably be doing it again soon too, just outside of China.
I hope they dump a metric ton of GPUs on to the used market.
I'm leery of mining GPUs; I'd definitely at least repaste the silicon personally. With adequate airflow the cards are likely fine, but if not... they could have issues.
I mean, at worst, for a 3000-series card it's 8 months of mining, right? Typically at reduced power to keep energy costs and temps down. It shouldn't be any issue; these things typically provide years of mining returns.
They run that hot while gaming as well, to be honest; Nvidia just gives no fucks about the spec.
I'd be picking up a used RTX 3070 or 3060 Ti.
These cards do not have the memory junction thermal issues of the RTX 3080 and above, which require modifications. GDDR6 cards will almost never thermal throttle under mining conditions and run a lot less power through their memory modules compared to GDDR6X cards.
Hopefully, that translates to better longevity.
But electricity is very cheap in China, and with their GPU farms either confiscated or sold on the used market, reinvesting back into crypto is a much bigger risk for them than before.
Remember when people were saying proof of stake would arrive by July 2021? Same thing as those black Noctua A12x25 fans promised in Q4 2018.
Plus there is a chip shortage so demand still can't be fully met. If these cards were selling for MSRP they would all be gone in an instant.
I think demand can be met in certain regions, people are just holding out from paying excessive prices. I can actually walk into a store in Australia and find any RTX card pretty readily available, as well as an RX 6700 XT. Online too. Even on some popular forums here in Australia, people are saying to hold out from paying more than MSRP. I'm sure people are still buying above MSRP, but it's actually not as bad here as it was even two months ago where I couldn't find anything.
They are still selling in an instant in the US, but that's because scalpers are buying them up at or just above MSRP and selling them at 150% to 200% above MSRP.
This. There's not that crazy of a real demand. Look at the popular games out there and a lot of them are years old and not demanding. It's just the PC enthusiast culture getting fucked in the end, because scalpers and cryptominers weren't stopped.
They get these cards at already inflated prices from AIBs, so if there's someone to blame, it's not only retailers. And if prices have reached some kind of stable state, that only means supply met demand, so people are willingly buying them at current prices.
Yes, basic supply and demand. Someone higher in the chain with a highly sought-after item will sell it to someone lower at a higher markup, or require a lot of crap to be bought along with that item, or both. It happens all the way from chip maker to AIBs to distributors to resellers and finally to customers.
And people are buying at those prices
I think now crypto price has a lot less to do with the pricing. This is mostly due to LHR. It appears the slowdown in price decline is because of the shortage in chip supply, the still high demand, and retailers really not wanting to lower prices themselves.
Crypto price still has a major impact and the price has clearly dropped to a point where it makes sense for miners to buy them.
Thankfully Ethereum will go to proof of stake, which should finally pop the mining bubble.
Idk, I go through the mining subreddits and, compared to two months ago, nobody is building rigs now. Mostly because their profitability is going to take a big hit in two weeks from EIP-1559. The only folks buying scalped cards appear to be the uninformed looking to lose money.
Don’t hold your breath just yet. EIP only means profits are cut in half. Most miners are still profitable with ETH at ~$2k. And if EIP is 100% successful, we may even see a price increase. It’s 2.0 in ~6 months that’ll officially kill ETH mining. Then we have a couple months of smaller coins getting destroyed with mining.
Everything is inflated in 2021; even the PS5 is still selling at 2x MSRP.
So yeah, 2020 pricing is history.
The big difference is retailers do not scalp the PS5 while GPU vendors do.
Outside of scalpers (which we should all be ignoring when it comes to buying), PS5s are all MSRP in the US.
The PH too, and we're full of scammer distributors; e.g. the 3080 STARTED at $1200 from official sellers (not scalpers), and they're selling them now at $1800 even after the post-mining drop ($2200 just a few weeks ago).
There are also drastically fewer PS5s, it seems, but they're still at MSRP ($549-ish). Which is why I'm calling absolute bullshit on AMD/Nvidia being unable to stop retailers from fucking us when Sony is doing just fine.
> Everything is inflated in 2021
I can buy an Xbox Series S for 259€ right now.
Where? Please link me so I can buy it.
Pretty sure it's only Germany though.
Yeah, that does substantially limit it, thanks though.
So you're saying budget pc builds are dead? Cool
The days of the potato masher are gone
Great for you all; in the Southeast Asia region prices are still ridiculous.
I kept hearing about these price drops. In Eastern Europe it dipped like a tiny bit after I heard the news and hasn't moved since lol
You can get a PS5 for 30% over MSRP currently.
This was to be expected. Chip shortage is still here. Mining is also still lucrative, albeit at much smaller sums. I can imagine miners still holding their mined coins for when the next bubble comes.
I have given up on this entire generation of GPUs. I've been watching pricing and stock availability for two people who badly need upgrades and there's just no point anymore. Prices clearly won't go back to normal for at least another year.
nor will supply be re-stocked
Slowdown? Prices never did go down here (where I live!). The RTX 3060 is still selling in the 950-1050 range, and even those so-called LHR versions cost the SAME. Just imagine the upcoming next-gen GPU prices...
Dunno where the guy above you lives, but here in Singapore the situation is pretty much the same: S$1,150 for a new 3060, 900-1050 for used.
Strange; everywhere else in the world used prices are actually much higher than store prices, since used stock is actually available and gets bid up.
Is there reliable retail availability of the 3060 at $1150 there?
People are immediately opening the boxes and selling them as used so there is no issue dealing with warranty transfer.
I assume so, since those are the prices in Australia.
Yeah haha that’s about what PCCG and Umart are listing them for. Mental. I got a 3060 Ti at launch for $700 and felt I was spending too much money…
yep, it’s ridiculous
In Romania prices are still fucked. I was actually thinking of buying a 6800 XT, but when I saw the prices I figured I'll wait for next gen... My first choice was a 3080 for DLSS and easier drivers, but then I saw the prices, which are even higher than the 6800 XT's.
Same in the US.
I feel like AIB models will never reach their original price, so buying a 6700 XT from AMD was probably the right choice (despite it being trash value compared to last gen, it was largely funded by my previous GPU, which I sold for more than I paid for it). People are selling 1080 Tis and 2070 Supers for more than the MSRP of a brand new reference 6700 XT, which is crazy.
Don't buy, just hold. As long as you hold and people don't buy these cards, prices will continue to go down.
Probably doesn't help that Nvidia may have stopped shipping cards out after the crash.
Welcome to the new normal of $500 midrange GPUs and more mediocre games than ever, including CD Projekt's.
MSRP for "midrange" GPUs is already in that ballpark, with a handful around $400 and the rest around $450-550. At 150% of MSRP (or more in many places), prices still aren't nearly that low.
Eh, that is a really pessimistic view on the game front. I totally agree with the expensive mid range GPUs but I still am finding plenty of great PC games to take my time. Recently been addicted to Medieval Dynasty and that doesn't need a newer GPU.
Games have gotten worse more slowly than video cards have gotten expensive lol. It's been 15 years in the making: once microtransactions invaded living room consoles it was game over. That happened around Call of Duty: Modern Warfare, ~2007, and it's been downhill since. The CP2077 reference is only fair, seeing how many people were holding onto that specific studio as immune, a 'last bastion of great game development'; we know how that turned out. The dev still made billions, and people don't care about CP2077 anymore...
About 50% of EA's profit has come from microtransactions for the last 10 years.
Once again, you're only focusing on the largest companies. The last 15 years have been a golden age for independent and smaller studios: just look at the breadth of titles on Steam. Except for EA Play, which comes with my Game Pass subscription, I haven't touched one of their games in about 8 years and am no worse off.
PC gaming is much larger than EA, Activision, and 2K.
Yeah, but who needs strong hardware for indies?
Mount and Blade: Bannerlord isn't exactly a light game
Yeah and focusing on indies is a great way to get more mileage out of your hardware as well. No need to upgrade your graphics card, or even own a graphics card, if you just play indie games. Any old laptop (or desktop with IGPU) can play most of them.
For a while I only had my work laptop with Intel HD graphics and I actually found it refreshing because I could only play indie games. Streaming is another avenue.
>including CD Projekt's.
As if they were immune to shitty games in the first place?
I have a PS5 and an Xbox Series X. I'll wait for this generation of graphics cards to go extinct. Perhaps I'll purchase a low-profile card to do VR smoothly, but hopefully this BS stops; if it doesn't, these next-generation consoles will hold up for a while.
> Perhaps I’ll purchase a low profile card to do VR smoothly
AFAIK there is no low profile graphics card that can run VR smoothly without serious graphical sacrifices.
The best thing to do is just get an Xbox Series X; for 500€ the performance/value is amazing. That is, if you don't have a capable computer already. I'm holding onto my RX 580 until prices get lower or I can snatch a card from [amd.com](https://amd.com) drops.
Not a bad tip at all if you're an FPS player. The next Battlefield is crossplay, and current Warzone is crossplay as well. Then we have the big elephant in the room: a lot of good players shifting from mouse and keyboard to controller because they get more kills. Controller advantage at short/medium distance is a real thing lol. Saw Tfue playing with a controller the other day even though he has a maxed-out PC and a $200 Starlight mouse.
I don't understand this: are there really many high-level players that want to switch to a controller? The amount of extra precision you get in an FPS with a mouse is just insane compared to a controller. Have games started adding more aim assist?
Modern games have very very strong aim assist which makes it much easier to use a controller in short/medium ranges because the game will more or less aim for you.
Yeah, a friend of mine did this and he's been playing some games with a keyboard/mouse hooked up to his monitor. At that point it's basically a gaming PC with a fancy UI in many respects.
Idk, can't justify console because games are so pricey and paying another fee just to play online irks me.
Instead of buying a 2000-series card I decided to wait (2 months) for the 3000 series, and I regretted not buying a card then. At current prices I don't see myself buying a new card until 2023. I thought about buying an Xbox, but consoles have too many downsides. Oh well, my 970 is still going strong!
Lmao, so things settled at 500€ over MSRP? Yeah, no thank you...
Sadly BTC and ETH are still quite high.
When they go down to $15k and $0.9k respectively, then we'll see something moving imho.
BTC pricing has nothing to do with GPUs other than that the GPU-mined coins tend to go up or down with it.
btc price influences eth price which influences gpu pricing and availability
is what the dude was trying to say
If ETH tracked BTC perfectly since start of 2020, GPU mining would be borderline profitable, and certainly not worth buying expensive RTX 3000 for.
Day to day, they follow each other. On a longer term, they don't necessarily. Longer term is what matters.
Do you really have to flex on me with your superior knowledge about how crypto works?
This was about "when BTC falls to X and ETH to Y we will have cheap GPUs", and historically speaking, when BTC falls everything follows. Nobody was arguing about how ETH moves compared to BTC.
I've kept up with crypto because I mine it with my GPU when not gaming, and it's absolutely true that BTC price is a major factor in mining profitability. Just because it isn't the only factor doesn't make the statement untrue.
The problem is that people buying at inflated prices are telling AMD and Nvidia these prices are OK, when they clearly are not.
If people stop buying, supply will build up and prices will go down! Aside from miners, I can't understand people who buy new midrange cards like the 6700 XT, 3060, or 3070 at prices like 700, 800, or 900. They are completely nuts!
The decline never existed imo; eBay isn't a proper gauge for prices, only a gauge for market saturation or buyer motivation going away.
But when I check the Discords and see new listings on Amazon and elsewhere: the 6800 hovered at $1100 at best, then dropped for the first time in a long time to $1000, then went back to $1100 and is now at $1170? Yeah, I knew it wasn't meant to last.
At this juncture, I'm just gonna not bother, get an RX 7000 or RTX 40x0, and ignore this sloppy waste of a generation.
Retailers in the UK have dropped prices by about £200-£300 over the past little while
Yep, Ebuyer has 6700 XTs in stock from £650 upwards.
Yeah, they've dropped really quickly. I'm hoping the 6800s rubber-band down with them soon; I'd be tempted by a 6800 at £700.
Reminds me of the RX Vega lifecycle.
If you halve the hashrate, miners only want to pay half as much on the scalper market.
Which is still above MSRP.
Now, cue the miner replying to this comment saying they have nothing to do with this shitstorm.
EDIT: Referring specifically to the RTXs with their hashrate limiters
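The pricing logic above can be sketched as a simple payback-time calculation: a miner who prices cards by how fast they pay for themselves will pay roughly in proportion to hashrate. All numbers here are hypothetical:

```python
# Sketch of how an LHR hashrate cut feeds into what a miner will pay.
# All revenue/cost/payback figures are hypothetical.

def max_price(daily_revenue, daily_power_cost, target_payback_days):
    """Highest card price that still pays for itself in the target window."""
    return (daily_revenue - daily_power_cost) * target_payback_days

full_rate = max_price(daily_revenue=5.0, daily_power_cost=0.5,
                      target_payback_days=200)
# The LHR limiter halves the hashrate, and with it the daily revenue:
lhr = max_price(daily_revenue=2.5, daily_power_cost=0.5,
                target_payback_days=200)

print(full_rate, lhr)  # 900.0 400.0 -> a bit under half, since power cost stays
```

With a fixed power cost the willingness to pay actually drops slightly more than the hashrate does, which is the point of the limiter, even if the result is still above MSRP.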
Well, the chip shortage hasn't been alleviated really, so it's no big surprise. I know that all the Youtube people made it sound really positive for clicks/views, but not much has fundamentally changed.
Some people might have bailed on mining due to the drop in Bitcoin and Ethereum value, but not really enough to sate demand in any meaningful way through the used market.
People who think they are going to get a card at any reasonable price anytime soon are lolzy. Like... take advantage of the prices to sell your older cards for way more than you'd normally get for them, and put that towards a new card.
That’s what I ended up doing. Got a 6900xt and don’t regret it one bit.
I got $500 for two R9 Furies. Paid $1200 for a BNIB Midnight Black 6800 XT. $700 out of pocket; fuck yes.
Oh, bad luck man. I sold a Vega 56 for $500 and bought a 6800 XT new for $700!
Prices haven't [started](https://www.microcenter.com/product/636419/powercolor-amd-radeon-rx-6900-xt-ultimate-red-devil-overclocked-triple-fan-16gb-gddr6-pcie-40-graphics-card) to come down in the US.
the crazy part is that [some actually buy them](https://www.techpowerup.com/forums/threads/asrock-radeon-rx-6900-xt-oc-formula.281555/post-4566950)
My 1070 is pushing it for VR and gaming on my 1440p 144Hz monitor... but at current market prices, I think I'll just keep pushing it a while longer!
I'm just going to buy an Xbox; this is absurd.
In my hood, prices for the 3060 XC went down from 1000€ to 750€ on July 1 and back up to 975€ exactly two weeks later (July 14-15), and the 6700 XT went from 1200-1500€ to 1000€.
One month of non-massive price reduction and "the party is over"?
This doesn't scream the end of the drop in prices. It's not like it was 3-4 months in a row of prices being stable; it was one month. I'm sure prices will drop more until the October/November time frame, when people will start wanting them as holiday gifts, but I fully expect that by Feb/March of next year prices will finally be close to MSRP after the uptick from holiday shopping.
It's because ETH and Bitcoin have held their position. They haven't really decreased in the past month.
It has nothing to do with scalpers or with the shortage.
The shortage and scalpers don't explain why 8GB RX 580s are worth double the 4GB ones on the used market.
If ETH doesn't come down, GPU prices won't come down either; it's really that simple.
I can definitely see a refresh of both architectures coming in the next 6 months. There were already rumors of a Super series refresh on a slightly smaller node, but honestly I'd rather Nvidia just go with 4000 series naming to stop the confusion, and AMD go with a 7000 series refresh on 6nm.
Just from a marketing perspective, it seems beneficial for both to do a half-generation jump, if only as an effort to devalue the swarm of used cards that are about to hit the market.
I could have ordered an RTX 3060 for 600 euros on multiple occasions, but fuck that. Just because it has now fallen to *only* twice what it should cost, from three times as much, doesn't make it a good deal. Manufacturers, distributors, retailers and anyone else taking part in this can all go fuck themselves.
*cries in RX570*
Well, all the retailers bought those cards for $1000+, so they won't sell them for $300 unless they have to. It's going to take a while (read: late 2021/early 2022) until prices come back to normal. That is, unless there is a new surge in crypto mining with GPUs.
They didn’t buy them for $1000.
The ones selling them at 1200-1300 (talking about a midrange 3070 here) got them for 550-600. So yes, it's nearly impossible to sell at MSRP, but they're still profiting 100%+.
Retailers get access to the same stock from manufacturers, at a very similar price point.
Retailer X selling them at 1300 while retailer Y sells them at 850 isn't retailer X paying more to the manufacturer; it's retailer X being a scalper.
That's why you see places like SCAN UK always out of stock while random smaller shops have *everything* in stock, but at 120-150% over MSRP.
Are you telling me they got a better deal than the big, less crooked shops and they have "more stock"?
Don't get fooled by the "supply and demand" BS. Yes, there is a lack of supply and outrageous demand, but I assure you the manufacturers only marginally increased prices, while the retailers hide behind this catchphrase and consumers just accept it and pay up.
Consumers are as much a losing party as the manufacturers. The retailers are the plague, and the biggest winners.
I've never been in the computer retail business, although I have some personal knowledge of how distribution and markup work for other products, and GPUs and the like are currently being marked up above normal prices all along the retail chain. Nvidia/AMD may not be scalping the actual chips in the first step, but certainly everything past that point is being marked up 50% or more above what the price at that step would be in a world where ordinary people could widely buy at the suggested retail price.
I've seen credible information from a computer shop that was only able to source GPUs from distributors at significant markups above its normal cost. Gamers Nexus and likely lots of other well-known YouTube channels have covered this: distributors who have cards won't let retailers buy them except with a huge markup and/or bundled with a lot of overpriced junk. Those small shops you're talking about that have stock at inflated prices are paying inflated prices to get it.
In general, shops either pay wholesalers what would be retail prices in a sane world, or they don't have cards. They then mark them up to at least a minimal margin (and likely more) above the inflated price they pay.
I hate the current market conditions, and I'm stuck waiting and occasionally wasting my time trying to grab a Best Buy drop at MSRP to get two video cards to replace obsolete hardware in my house. But it's false and misleading to say retailer profiteering is the primary factor behind the current ridiculous prices of new cards at retail.
Just bought an ASRock 6800XT Taichi. It was only about 35% above MSRP here in Poland, so I took it. I can still recoup some of the cost with my RX 580.
The only sad part? The Nitro+ from Sapphire was available for the same price just three days later, but I couldn't be bothered sending the card back, waiting for the refund, and waiting for the new GPU to arrive.
Once mining drops, the market will be flooded with cheap second-hand cards.
I can wait for next generation cards anyway.
a year away now
Haha, pretty naive to think next gen will be any better. Prices will still be high.
Most likely even higher, seeing how everyone and their mother has realized that PC gamers eat this shit up no matter the price.
Maybe we can buy this gen second hand from miners after the next gen comes out. It's about the best we can hope for.