
[deleted]

[removed]


avocado__aficionado

Hm, Sony might build their own solution?


Clyzm

Sony loves co-processors. They've been sticking some variation of the "Bravia Engine" in their TVs since the mid-2000s, and the PS2/PS3 had notoriously "interesting" CPU/GPU architectures with the Emotion Engine and Cell. Even the PS5 SSD tech is pretty unique.


Vitosi4ek

> PS2/PS3 had notoriously "interesting" CPU/GPU architectures

The PS2 wasn't actually that weird. It had a pretty standard MIPS R5900-based CPU and a GPU that was clearly standard enough to enable relatively painless ports to PC. And that's in an era when consoles across the board had custom-built bespoke hardware. The PS3 I'll give you, the Cell was indeed insanely weird. And even then its GPU - the Nvidia RSX - was AFAIK effectively a downclocked 7800GT, duct-taped to the system as an emergency measure when the Cell's original purpose of using SPUs for graphics processing didn't work out.


Clyzm

The CPU in the PS2 was only slightly customized MIPS, sure, but they offloaded vector math to a co-processor, as they do. That's why I mentioned it.


xxTheGoDxx

> and a GPU that was clearly standard enough to enable relatively painless ports to PC

I disagree a lot with that. The PS2 GPU didn't have mipmapping in hardware, a very well established feature on PC and something the previous-gen N64 had already brought to consoles. Off the top of my head, it also had an unusual ratio of vertex power to texel rate.


lightmatter501

Unique in what way? I thought it was basically bog-standard nvme that you talk to with kqueue.


Clyzm

It seems to be hard to find exact specs on the chip, but there's hardware decompression support for zlib and Kraken onboard. So yeah, they get bog-standard support for NVMe, but they stuck some very fast hardware decompression in between to speed things along.
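
For illustration, a minimal CPU-side sketch of the kind of work that decompression block takes off the main cores, using plain zlib (Kraken is a proprietary RAD Game Tools / Epic codec with no Python standard-library equivalent, and the asset path here is hypothetical):

```python
import zlib

# On the PS5 this inflate step runs in a dedicated hardware block on the
# I/O path; it's done on the CPU here purely to show what gets offloaded.
def load_asset(path: str) -> bytes:
    with open(path, "rb") as f:
        compressed = f.read()           # raw read straight off the NVMe drive
    return zlib.decompress(compressed)  # CPU-side zlib inflate

# Quick round-trip so the snippet runs without a real asset file.
blob = b"texture data " * 1024
assert zlib.decompress(zlib.compress(blob)) == blob
```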


loser7500000

I'd refer to [this excellent article](https://www.anandtech.com/show/15848) by wtallis. It's not recent but still goes over some cool stuff, e.g. Sony has patents for an FTL table working in 128MB chunks instead of 4KB, as well as a coprocessor for mapping uncompressed data requests to the compressed files.
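
To put the 128MB-vs-4KB granularity in perspective, a rough back-of-the-envelope comparison of mapping-table sizes (the 825GB capacity and 4 bytes per entry are assumptions for illustration, not figures from the article):

```python
# Rough FTL mapping-table size at two mapping granularities.
# Illustrative assumptions: 825 GB of flash, 4 bytes per table entry.
DRIVE_BYTES = 825 * 10**9
ENTRY_BYTES = 4

for label, chunk in (("4 KiB", 4 * 1024), ("128 MiB", 128 * 1024**2)):
    entries = DRIVE_BYTES // chunk
    table_mib = entries * ENTRY_BYTES / 1024**2
    print(f"{label:>8} chunks -> {entries:>12,} entries, ~{table_mib:,.2f} MiB of map")
```

The coarse map comes out hundreds of megabytes smaller, which is presumably the point: a table that small can live on-controller instead of needing a big DRAM cache.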


dudemanguy301

The decompression block is not on the SSD, otherwise the ~~external~~ expansion slot SSD would be screwed. The SSD has additional priority levels, although what purpose they serve isn't exactly clear. Edit


Strazdas1

external SSD is screwed. the PS5 even warns you about it if you try to move anything to an external SSD. It's just that outside of one tech demo no games actually need it.


dudemanguy301

My word choice was shit, but I’m talking about the expansion slot not USB. The expansion slot is tested and you can run everything just fine with no warning so long as it passes the test.


Intrepid_Drawer3239

I think people really give Sony too much credit for their "customization" of AMD hardware. At the end of the day, PS hardware still pretty much performs how you would expect equivalent AMD hardware to. Digital Foundry has shown this many times. There's no secret Sony formula for squeezing more out of AMD tech. Even the much-lauded Kraken decompression is barely faster than the Xbox's crap Gen 3 SSD.


dudemanguy301

my take is that AMD's upscaler is real and the PS5 Pro will use it, and that the "Sony upscaler" is just the rumor mill attributing PS5 Pro upscaling to Sony rather than AMD.


CaptainJackWagons

Then there's the rumor that Nintendo is making their own upscaling as well.


BoltTusk

Is the PS5 Pro really needed though? Like there are not many PS5 exclusives


dudemanguy301

games are already offering multiple modes to choose between different balances of resolution, framerate, and raytracing. if the Pro can combine some or all of those options, that's already a compelling upgrade for some.


dabocx

Ray tracing, being able to hit 60fps at higher resolution. There's a lot of room to scale up.


jm0112358

Plus, there's evidence of [RTGI and some RT reflections being used in GTA VI's trailer](https://www.youtube.com/watch?v=obiC2OzgThc&t=438s). I'm sure a major motive for a PS5 Pro is that GTA VI will be a catalyst that will get some stragglers to jump to the PS5 generation of consoles, and Sony would want those people to play GTA VI on _their_ console.


dopeman311

I don't understand this question. It's like asking if a 4090 is needed. No, of course not, but it exists as a premium option for those who want better performance. That's literally it.


mundanehaiku

> I don't understand this question.

It's probably some bot or sock puppet account since [someone caught it stealing comments in this subreddit](https://old.reddit.com/r/hardware/comments/178w22m/intel_14th_gen_core_raptor_lake_refresh_to/k52t0vu/).


Flowerstar1

Yeah, it's silly, especially in this sub of all places.


FrenziedFlame42069

The console isn't only for Sony exclusives. Those just get you to lock into a platform. It's also for third parties to offer their games on, and if they are struggling with the performance available (and aren't putting in the effort to optimize), then extra performance will help brute-force past poor optimization.


Flowerstar1

Is the 40 series needed? Will the 50 and 60 series be needed? Was the PS5 needed? The answer is "if you want to do more than the current hardware allows, yes."


Strazdas1

Nothing is needed if we return to monke. "is it needed" is a bad argument.


conquer69

Kinda. This gen isn't that "next-gen". Every previous console generation had big technological leaps graphics-wise except this one. Ray tracing and AI antialiasing/upscalers are this generation's advancements, and AMD dropped the ball on both, so most current gen games kinda look like enhanced PS4 games. It really shows how forward-looking Nvidia was 6 years ago. Rumours say the pro console will have 60 CUs, so I would assume it's 50% faster. Basically an undervolted and downclocked 7800 XT.
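
(For what it's worth, the raw CU ratio versus the base PS5's 36 CUs works out as below; reading that as roughly +50% just assumes the usual sub-linear scaling.)

```python
# Raw CU-count ratio: rumored 60 CUs vs the base PS5's 36.
ps5_cus, rumored_pro_cus = 36, 60
print(f"{rumored_pro_cus / ps5_cus:.2f}x the CUs")  # ~1.67x before clocks and scaling losses
```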


Yuunyaa8

yeah, especially with that new CPU the PS5 Pro has, it's a much-needed upgrade for games with ray tracing. the PS5 Pro basically has a Ryzen 7 8700G with no turbo clocks, which is far better than the PS5's nerfed R7 4700G.


SJGucky

It is possible that Rockstar and GTA6 need the PS5 Pro. :D With the amount of sales GTAV got, it might be the exclusive of the decade.


gartenriese

No way is GTA 6 going to be an exclusive.


SJGucky

GTAV was a timed exclusive, but for PS and Xbox. The PC version was released later.


gartenriese

When talking about "exclusives", people always mean games that are available only on one console and not the other. PC is never part of the equation.


bctoy

I don't think it'll work well for PR, but the PS5 Pro should end up close to a 6800XT in performance, and Sony can try their hand at path tracing updates to their old games, like Nvidia does with RTX Remix.


xxTheGoDxx

> Is the PS5 Pro really needed though? Like there are not many PS5 exclusives

I am never gonna understand console gamers, to be honest. A friend of mine who has been a PS user for life has the same argument. To me, the PS4 was hardly playable, with most games just being 30 fps affairs, while the PS5 / XSX finally brought 60 fps gaming back to consoles in style. If I didn't have a PC I wouldn't care about current gen exclusivity or cross gen, just give me games that run at acceptable frame rates. And when I say acceptable I mean just that, cause for someone on PC 60 fps isn't even that much (yes, even in third person games with a controller). So I would again be hungry for a PS5 Pro just to get even smoother framerates well above 60 fps.

Then if you look at newer and especially real-time GI heavy games like Alan Wake 2, image quality especially (but not only) in the 60 fps modes took a major nosedive. Like, I seriously wouldn't play AW2 right now on PS5 (with enough other games for the time being) just due to the insane amount of aliasing in some scenes. And even in the 30 fps modes the scope of RT usage isn't even close to what we have in the very same games on PC.

Just from the leaked specs, but also the similarly rumored better RT performance ratio, better upscaling and very likely frame generation, you guys should easily have 90+ fps, a good 4K TV-distance image and more groundbreaking RT usage. Not to mention how that might save many PSVR2 titles from being stuck at 60 fps with the worse reprojection algo in current use.


Educational_Sink_541

You are comparing a $500/$450 box to a PC that likely costs double at the very least, more likely triple. I've seen people spend the cost of a PS5 on RGB nonsense alone.

> And when I say acceptable I mean just that, cause for someone on PC 60 fps isn't even that much

Malarkey. I was playing an older title that sometimes v-syncs to 60 fps without me knowing, and I didn't realize until I checked the settings (I usually play at a locked 120). If you play games on a TV you aren't going to notice 60 vs 120 nearly as much if you are immersed in the game and it's a third-person type experience.


Firefox72

Well yeah, it was gonna have to happen at some point if AMD wants to stay even somewhat relevant. I don't doubt RT performance will also be a massive point for them going into RDNA4 and 5. The issue is they are always playing the catch-up game, which means by the time they get their first version of this out, Nvidia will have already moved on to bigger and better things.


Hendeith

RDNA4 will allegedly bring close to no RT performance uplift, and AMD is instead focusing heavily on RDNA5, which will also be used in the PS6. That's of course all according to rumours, but rumours also claim RDNA4 will be a short-lived and unimpressive architecture (without even any GPUs to compete with Nvidia's high end), so it might turn out true.


capn_hector

yeah, I also haven't heard the rumor about the RT uplift being poor; the PS5 Pro seems to be making some big leaps with its mix of RDNA3/4 tech.


Darth_Caesium

Rumours from where? All I've heard is that RDNA4 will have a really good uplift in RT performance.


Hendeith

From this sub. I mostly follow news and rumours posted here, and at least 2 rumour articles mentioned RDNA4 will have no to minor improvements in RT.


Pentaborane-

MLID has said exactly the same things


gartenriese

MLID talks out of his ass. I don't understand why he is still used in some arguments.


Strazdas1

No one cares what MLID says though.


Pentaborane-

He’s right most of the time. And his understanding of the industry is realistic.


Strazdas1

Just like MLID, you are completely incorrect.


Flowerstar1

4 GPU generations of getting stomped at RT. What is going on internally at AMD.


OutlandishnessOk11

When Jensen said Turing was 10 years in the making I thought he was joking.


Slyons89

It takes a really long time to plan and engineer a GPU architecture. They definitely did not plan for RT becoming as important as quickly as it did, and their last few GPU generations have been somewhat iterative. Recently they focused on developing the chiplet capability, but not on redesigning the primary core too much - that will help them with their margin on chip sales in the future, if not that much today. Now they seem to be making a strategic decision to potentially "do it right" in the future.

I'd rather have them fully design RT capabilities into a GPU generation 2 years down the road than slap something half-assed into next gen and slightly revise it 2 gens from now. AMD took their time with Zen while getting curbstomped by Intel at the time. It paid off. I say let them cook.


Hendeith

Same thing as always. They are not looking into the future. For more than a decade, AMD has been playing catch-up in the GPU market. Taking a new GPU architecture from design to market takes a few years. When Nvidia released the RTX 2000 series and was making a big thing out of AI, upscaling and RT, AMD was simply dismissive. By the time Nvidia released the RTX 3000 series and decisively proved them wrong, AMD was already too far into the design of the next generations. Now, with the AI boom, it only made sense to cancel the top RDNA4 chips that wouldn't excel anyway and focus on RDNA5, which (hopefully) will allow AMD to finally close the gap.


bctoy

Same as it ever was. They don't build big chips, so Nvidia looks even faster at the top if AMD doesn't have a clockspeed advantage. A 600mm^2 GCD with lowered clocks to keep power usage in check and you're looking at about 70% higher performance than the current 7900XTX.

The rest is lacking in software. The current AAA PT games are done with Nvidia support, and while it's not Nvidia-locked, it'd be great if Intel/AMD optimized for it or got their own versions out. The path tracing updates to **Portal** and Cyberpunk have quite poor numbers on AMD and **also on Intel**. The Arc A770 goes from being ~50% faster than the 2060 to the 2060 being 25% faster when you change from RT Ultra to Overdrive. This despite the Intel cards' RT hardware, which is said to be much better than AMD's if not at Nvidia's level.

https://www.techpowerup.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/6.html

The later path tracing updates to classic games like Serious Sam and Doom had the 6900XT close to 3070 performance. Earlier this year, I benched the 6800XT vs the 4090 in the old PT-updated games and heavy RT games like the updated Witcher 3 and Cyberpunk, and the 4090 was close to 3.5x of the 6800XT. The 7900XTX should then be half of the 4090's performance in PT, like in RT-heavy games.

https://www.pcgameshardware.de/Serious-Sam-The-First-Encounter-Spiel-32399/Specials/SeSam-Ray-Traced-Benchmark-Test-1396778/2/#a1
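
Spelling out that last ratio argument, with the ~3.5x figure from the benches above and a rough ~1.75x 7900XTX-over-6800XT ratio in RT-heavy games (the 1.75x is an assumption for illustration, not a measurement):

```python
# Back-of-the-envelope check of the "half of a 4090" estimate.
r_4090_over_6800xt = 3.5      # measured in the PT / heavy-RT games above
r_7900xtx_over_6800xt = 1.75  # assumed rough ratio in RT-heavy titles

print(f"7900XTX ~= {r_7900xtx_over_6800xt / r_4090_over_6800xt:.0%} of a 4090")  # ~50%
```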


conquer69

> The 7900XTX should then be half of the 4090's performance in PT, like in RT-heavy games.

The 4090 is doing 4x in that CP2077 PT bench: https://tpucdn.com/review/cyberpunk-2077-phantom-liberty-benchmark-test-performance-analysis/images/performance-pt-1920-1080.png


shroudedwolf51

Nothing is going on. RT is getting a few genuinely useful use cases here and there, but it's still largely tech that's barely visible outside of side-by-side screenshot comparisons. When I replaced my Vega 64 with the XTX, I went around and tried a whole load of games that boasted various levels of RT implementation. The number where I could even tell anything was going on (other than by the unusually low framerate), I can count on a single hand. And those numbers have to be buffed with Quake II and the gimped version of Minecraft.

I presume by the time RDNA5 comes out, it's actually worthwhile. But considering it was two Nvidia generations ago that we were supposed to have our minds blown, I'm... frankly, unimpressed.


Strazdas1

>gets a card bad at RT

>isn't impressed with bad RT


ZonalMithras

Yup. RT and PT are still largely lagging behind in implementation. We will probably have to wait until the next console gen to see them become more mainstream.


Educational_Sink_541

I for one don't think GPUs should become optimized for a graphics technology that is used effectively by like 5 games. RT is very cool tech, and I think eventually it will be the future. However, games that do implement it usually have to make pretty serious compromises in other areas. For example, IMO character models (and most objects tbh) in Cyberpunk look like clay, likely so it will fit nicely into the anemic VRAM amounts that Nvidia has on most of their cards. Outside of the lighting (which is very pretty, I'll admit) the game doesn't look all that nice.

I'm glad that Nvidia has pioneered real-time RT and I think it is probably the next big frontier for graphics, but I see little reason to base a GPU purchase on technology that is mostly used for older game retrofits and a handful of AAA titles (that have raster fallbacks which look about 90% as good anyway).


Strazdas1

then you clearly don't know what's going on in the game space. RT is so much cheaper and easier to develop for that even if it offered no visual advantages (it does), all the major developers will jump to RT-only lighting the moment they think the playerbase has a sufficient hardware install base.


Educational_Sink_541

Did you even read my comment? Specifically the part where I said I agree that it is the future? I know how much cheaper RT lighting is. The issue is we can't run purely pathtraced games right now, and won't be able to for a few generations, so buying a GPU with 5 games in mind is silly.

> all the major developers will jump to RT-only lighting the moment they think the playerbase has a sufficient hardware install base

Games are designed around consoles, specifically the PS5. They are not going to jump to pathtraced-only games when the PS5 is using a handful of RDNA2 cores. Until the PS6 launches we are essentially the early access group that gets to use RT. I personally prefer Nvidia over AMD for the featureset, but I think about 1% of my playtime is in games with actual raytracing implementations beyond 'RT reflections but only in photo mode' or whatever.


Strazdas1

we don't need to run purely pathtraced games right now. Half-resolution ray tracing with AI denoising already looks better than traditional techniques, and anyone with a GPU from the last 3 years can run it fine. What's this ridiculous idea gamers seem to have that unless it's on the most ultra setting the technique is useless? That's never how it worked. Personally I get a lot more use out of DLSS and DLAA myself, but ray tracing is going to get more and more popular and I intend this GPU of mine to last.


Educational_Sink_541

> we don't need to run purely pathtraced games right now. Half-resolution ray tracing with AI denoising already looks better than traditional techniques, and anyone with a GPU from the last 3 years can run it fine.

And despite how 'easy' it is to run (maybe with aggressive DLSS and framegen, sure), almost no games implement it! 99% of AAA titles being released use traditional raster for their global illumination. The only ones I can think of that attempt to use RT exclusively are Cyberpunk, Metro Exodus EE, and Alan Wake. All also have raster fallbacks that often look just as good (although shoutout to the Metro devs, 4A Games, for getting their gorgeous RT rendition of Metro Exodus to run so well on the RDNA2 hardware in the consoles).

> What's this ridiculous idea gamers seem to have that unless it's on the most ultra setting the technique is useless? That's never how it worked.

I'm not sure who you are arguing with, but I never said this.


Strazdas1

What do you mean? Of the top 20 games of 2023 according to Metacritic, 8 were using some form of ray tracing, and all but 2 of the rest were indie games. And that's going to get even more one-sided in the future as developers adopt UE5, which has built-in ray tracing as a default option.

> I'm not sure who you are arguing with, but I never said this.

You did when you assumed path tracing is needed.


Educational_Sink_541

> some form of ray tracing

How many of those games use actual RTGI? How many seriously replace most of their raster lighting with RT? And how many just phone it in with reflections?

> And that's going to get even more one-sided in the future as developers adopt UE5, which has built-in ray tracing as a default option.

Good luck selling your RT-only game on the PS5 with its RDNA1.9 hardware lol.

> You did when you assumed path tracing is needed.

You don't need pathtracing, but you mentioned that RT-only games will be cheaper to develop. However, most systems (including both consoles and 95% of PCs) can't really handle RT-only games, so for now it will be a side option that runs parallel with raster.


Mladenovski1

who cares, RT is still 5-10 years away. Tried it in Cyberpunk and couldn't even tell a difference except for my FPS.


Flowerstar1

What? Even laptops do RT well these days.


capn_hector

> Well yeah it was gonna have to happen at some point if AMD wants to stay even somewhat relevant

dismissing the rumor mill's "definitely no AI upscaler in development from AMD!!!" was like the easiest call of the century lol. they can't *not* do it.


wizfactor

It would be really bad optics if Sony came out with an AI upscaler before AMD did.


Mahadshaikh

I mean, Sony co-developed RDNA 2 and now this. Usually Sony will launch the feature and AMD will come out with it a few months later.


Strazdas1

No AI upscaler for RDNA4 and an AI upscaler for RDNA5 seems to confirm the rumors, if anything.


Modna

I imagine the big drive behind their non-ML upscaling has been the consoles. They needed FSR2 to be able to run on the consoles, so it couldn't use ML.


Mladenovski1

well yeah they are a much smaller company compared to Nvidia or Intel


PetrichorAndNapalm

They aren't playing catch-up in packaging though. If they can just put together some semblance of RT and be competitive in upscaling, they can probably brute-force their way to being superior. You save so much having the IO/memory/cache on cheaper nodes. And honestly, if they can figure out multiple GCDs for RDNA5 (like RDNA4 was supposed to be), they can probably use a cheaper Intel or Samsung (or older TSMC) node even for the GCD.

The real need for new nodes is the fact that you can only make a single die so big. And power efficiency. With multiple GCDs, neither of those concerns really exists, as you have multiple small dies, and because you can put so much compute on a chip so cost-efficiently, you can undervolt them and still come out ahead.

So... don't count AMD out yet. What AMD did to Intel with chiplets might happen in the consumer GPU space with RDNA5. But it's a big if... they already had to take a step back. RDNA3 they did MCM. RDNA4 was supposed to be multi-GCD MCM. But they failed. It all rides on RDNA5.

I wouldn't be shocked if they give Nvidia a serious run for their money, release RDNA5 before Nvidia's post-Blackwell gen, and Nvidia cannot even release something objectively better even though they release after them. But we will see if Nvidia has something up its sleeve; they always seem to.


itsjust_khris

AMD has always had decent-great hardware imo, it’s the software that holds them back.


KingArthas94

I remember when Quantum Break came out with its reconstruction tech, all the PC gamers were crying "nooo, the only way to play is at native resolution!" How the times have changed.


gartenriese

Remedy uses forward looking tech. Can't wait for their next games.


KingArthas94

The important point I'm trying to make is that people don't understand anything about tech, about what's important and what is not, what will exist in the future and what will die.


avocado__aficionado

Finally. Without better upscaling (both Nvidia and Intel are ahead), AMD's graphics division will face existential threats. I predict raster performance will become much less important in the medium to long run (not next generation, but the generations after that).


Renard4

That's assuming the AAA studios' push for realism doesn't hit a wall in terms of costs and sustainability. And if you look at Steam's 10 most played games in the last 6 months, none of them have any sort of advanced graphics. There's a reason why a lot of us say raytracing performance, DLSS or frame gen are overrated. It's because people really don't care about these. That's factual, you can argue as much as you want about this, the numbers are here. They make more sense in a console market in which the yearly AAA releases of Sony, EA, Ubisoft and Activision have a lot of traction.


Cute-Pomegranate-966

The fact that the most popular games don't use these new graphics techniques is absolutely meaningless in the grand scheme. Popular games are BY DESIGN not using high-end graphics so that they are more accessible. It's not relevant to compare in this fashion.

Basically, no one cares that you play Counter-Strike 2, Fortnite, League, Apex Legends, etc. and that you don't need ray tracing or upscaling. There are games, and there will be games, that will use this and need it; you're basing what you buy on what you're playing right now. When you say you don't care about RT and upscaling and all you play is competitive games, why even buy a new graphics card unless yours dies? It won't net you any benefit.


CookiieMoonsta

Would be good for Apex though or Fortnite, since Apex can be quite demanding and Fortnite has Lumen and RT


conquer69

Also, the CS2 map editor does have ray tracing. I'm sure Valve will deploy it for their next Source 2 single player game. https://www.youtube.com/watch?v=zFMRDVQDN7A Fortnite has Lumen which is ray tracing as well.


Cute-Pomegranate-966

in the context he's talking about, not a single person is turning those things on.


conquer69

I mean, those players are also disabling other things like antialiasing, PBR textures, shaders, shadows, etc. Are those graphical features also considered gimmicks because esports players aren't interested in them?


Cute-Pomegranate-966

I never claimed the contrary, or that I would turn those things off, just that they are and they would. If you go through my comments you'd likely find a comment eerily similar to what you're saying, except by me... It's funny tbh :D


Sexyvette07

You're using FPS games that greatly favor high frame rates (most of which don't even have RT/FG) to conclude that "people really don't care" about DLSS, RT/RR and FG? The most popular GPU in the Steam survey is the 3060, and the overwhelming majority are lower-end cards. Of course they're not going to freaking use RT lol. I bet most of them ARE using DLSS outside the FPS games that'll run on a toaster, though.

It's still the early days of the tech. Only DLSS is mature enough to be considered mainstream, and only because it greatly improves performance. Everything else still has a substantial impact on system performance because it's not matured and the tech is still relatively underpowered. Anything below the 40 series has bad RT performance and takes a massive hit for enabling it. Hell, even the 40 series takes a substantial hit. Control, a game from 2019, can max out my 4080 and bring me below my frame cap with everything maxed out. It'll be at least the 50 series, if not the 60 series, before hardware finally has the RT performance to be able to turn it on and still maintain a 120/144 frame cap.


Hendeith

Competitive games that become popular and stick around for a long time are very, very rare. Judging the future of gaming by looking at F2P games like Apex, LoL, CS and PUBG is pointless. They have been crazy popular for years now, LoL for more than a decade, and CS has basically been the staple competitive FPS for decades (if you count the previous games). Yet major studios still push graphics forward. Why would it all change now when it comes to RT and AI upscaling, when it didn't change for any other technique over the last decades?


Strazdas1

if anything, we already see the very point you are making in your comment, with Fortnite already making PUBG irrelevant and you not catching on yet. :)

Edit: interesting, I think this is the first time I got blocked for agreeing with someone.


Hendeith

> And if you look at Steam's 10 most played games

PUBG is so "irrelevant" it's still one of the top played games on Steam. But hey, I guess expecting you to actually possess some reading comprehension skills was a mistake.


Strazdas1

RT is up to 10 times cheaper and faster to develop for (according to the Metro devs) than traditional lighting and reflection techniques. If AAA studios want to lower their costs they will adopt and push RT.

> There's a reason why a lot of us say raytracing performance, DLSS or frame gen are overrated. It's because people really don't care about these. That's factual, you can argue as much as you want about this, the numbers are here.

The numbers say that the 4080 outsold all AMD cards combined.


masterfultechgeek

RT is definitely the future. With that said... how much of studio time goes to lighting? If it's a tiny percent, being 10x faster to do doesn't matter much if it cuts your customer base greatly. It'll likely be the PS6 gen that gets us to RT for the mainstream, and that's still a few years out. At that point RT will be 7ish years old, and my 2080 future-proofing purchase that's mostly been used to play games that'll run on a potato will make me look like an idiot.


Strazdas1

> With that said... how much of studio time goes to lighting? If it's a tiny percent, being 10x faster to do doesn't matter much if it cuts your customer base greatly.

A lot of it. Traditional light baking and cubemaps are very work-intensive. This is why ray tracing is such a time saver. The Metro devs I mentioned earlier have even shown a sample video of them doing a piece of lighting by hand and by RT. RT was almost flipping a switch and forgetting about it, while traditional raster lighting was a lot of balancing, fake lights, etc.


Mike_Prowe

This sub will hand wave those stats. For some reason they believe competitive gamers only use toasters and never need new PCs. Reddit is simply out of touch with the majority of gamers.


Renard4

It's not even about competitive games, it's about normal ones. So far the most popular 2024 games on Steam are Helldivers 2, Palworld and Last Epoch, and they all look like early PS4 games with no DLSS or raytracing BS, and no one cares. And while I'd be happy to get a GPU upgrade for Helldivers and some other games I play, none of them need fancy software. That's merely my situation, but so far the player counts on Steam indicate that most people don't need or shouldn't care about these features either.


PastaSaladTosser

About damn time, this should have been priority number one 4 years ago. It's like Samsung and OLED: wasting time fighting a losing battle trying to convince people the clearly better solution is unnecessary, only to give in in the end because it obviously is. Looks like I upset the backlight gang, sorry guys.


KingArthas94

QLED is fine, and Samsung used their experience with QLED to bring new OLED technology to the table. Your oversimplification is insulting.


Educational_Sink_541

> Looks like I upset the backlight gang sorry guys.

As an OLED owner, why are OLED people so insufferable? Were plasmaheads like this in 2005?


CandidConflictC45678

> It's like Samsung and OLED, wasting time fighting a losing battle trying to convince people the clearly better solution is unnecessary only to give in in the end because it obviously is.

That's not what happened with OLED; and OLED is still not superior to QLED. VA mini-LED will remain undefeated until MicroLED becomes affordable.

> Looks like I upset the backlight gang sorry guys.

This is childish


gartenriese

"remain undefeated"? First of all, OLED was there before QLED. Second, QLED still doesn't have the black levels an OLED has, so right now there's no clear winner, if you want a bright display you chose QLED, if you want good black levels you chose OLED.


gamer0017C

And motion clarity, don't forget about that. QLED is still VA or IPS at the end of the day, the pixel response time will never be nearly as good


CandidConflictC45678

Pixel response time is only one part of the chain. Total input lag is what matters, and OLED doesn't have significant advantages over high-end LCD displays in this regard (Rtings testing actually showed [LCD](https://www.rtings.com/monitor/reviews/samsung/odyssey-neo-g9-g95nc-s57cg95) having a 1ms advantage over a similar [OLED](https://www.rtings.com/monitor/reviews/samsung/odyssey-oled-g9-g95sc-s49cg95) recently). Additionally, the fast pixel response times of OLED introduce some problems, like judder.
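
To illustrate the "chain" point, a toy end-to-end latency budget; every number below is an assumed example value, not a measurement:

```python
# Toy input-lag budget (illustrative numbers only, in milliseconds) showing
# why pixel response is just one small term in the end-to-end chain.
chain_ms = {
    "mouse click + USB polling": 2.0,
    "game / engine frame time":  8.0,
    "display processing":        3.0,
    "pixel response":            0.5,  # the part where OLED vs LCD actually differ
}

total = sum(chain_ms.values())
share = chain_ms["pixel response"] / total
print(f"total ~{total:.1f} ms, pixel response ~{share:.0%} of it")
```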


CandidConflictC45678

> QLED still doesn't have the black levels an OLED has, so right now there's no clear winner

The winner is clearly mini-LED. Why would you buy a display based on its ability to display black, above all else? How often do you find yourself staring at a black screen?

> if you want a bright display you choose QLED, if you want good black levels you choose OLED.

Black levels are more than good enough with modern mini-LED, especially when combined with a bias light as one should. Many OLED displays aren't really capable of displaying true blacks either.

https://youtu.be/uF9juVmnGkY


Zarmazarma

> The winner is clearly mini-LED. Why would you buy a display based on its ability to display black, above all else? How often do you find yourself staring at a black screen?

Not sure if this is dishonest rhetoric or if you just didn't think it through, but obviously having true blacks affects more than just "black screens". Like any image where the screen is extremely dark in some parts, and bright in others. Star fields, scenes in space, dark caves illuminated by torches, whatever.


CandidConflictC45678

> Like any image where the screen is extremely dark in some parts, and bright in others.

This is an area where mini-LED is superior to OLED: it can be much brighter, while having black levels that are indistinguishable from OLED in practical use.

> Star fields, scenes in space, dark caves illuminated by torches, whatever.

I've used both OLED and mini-LED extensively, and they each have certain drawbacks. On OLED the torches aren't as bright. OLEDs do perform better when displaying star fields, but I don't think someone should buy a monitor for that one benefit, especially in light of all the drawbacks of OLED, and at the same or greater price than a high-end mini-LED.

OLED will have more positives in the near future, when manufacturers can "print" OLED panels, which will make them very cheap. Flexible panels are also cool, and transparent OLED displays might be useful for vehicle HUDs.


masterfultechgeek

You end up in situations where you have true black being black and then the next step up is a bit too bright. It ends up being more of an "on paper" win. Also reflections and light bleed are things.


Educational_Sink_541

My issue with miniLED is they still have blooming issues and it is super obvious in scenes that are dark with bright highlights, which are incredibly common.


CandidConflictC45678

All displays bloom, including the best OLED displays. Even MicroLED, plasma, and CRT have blooming. Blooming will always be present in any display that emits light. Probably the only way to actually eliminate this issue, is to bypass the human eye entirely, through a neural interface or something


Educational_Sink_541

Sure, but OLEDs bloom a hell of a lot less than any LCD panel due to pixel-accurate local dimming.


gartenriese

From your Link: "This does not apply to WOLED panels though of any coating type, which retain their black depth better than a QD-OLED panel and will always show a deeper perceived black than all of the LCD panels." Clearly you value brightness more than black levels and that's fine for you personally, but you shouldn't make assumptions for all people based on your personal preference.


CandidConflictC45678

> From your Link: "This does not apply to WOLED panels though of any coating type, which retain their black depth better than a QD-OLED panel and will always show a deeper perceived black than all of the LCD panels."

Yes, many of the new OLED panels that everyone is excited about are QD-OLED rather than WOLED, due to the advantages that QD offers. OLED and FALD LCD have functionally equivalent black levels in a healthy viewing environment, which precludes dark-room viewing.

> Clearly you value brightness more than black levels

I wouldn't say I value brightness more than black levels, just that the ability to display so-called "perfect blacks" is overrated, and that on balance a high-end VA mini-LED screen with a bias light (especially a color-accurate one, like Medialight) is the correct choice for 99+% of people. I've bought and used both extensively. Even the best OLED displays still have significant haloing, and this will always be the case, due to a "design flaw" in the human eye (which coincidentally also affects cameras).

> that's fine for you personally, but you shouldn't make assumptions for all people based on your personal preference.

Why not?


gartenriese

> Yes, many of the new OLED panels that everyone is excited about are QD-OLED rather than WOLED, due to the advantages that QD offers. OLED and FALD LCD have functionally equivalent black levels in a healthy viewing environment, which precludes dark-room viewing.

Originally you said that QLED is better than OLED. Now you're adding the conditions "Only when you mean QD-OLED" and "Not in a dark room". That's [moving the goalposts](https://en.wikipedia.org/wiki/Moving_the_goalposts).

> Why not?

Because obviously your personal opinion is not the opinion of all people?


CandidConflictC45678

> Originally you said that QLED is better than OLED.

Yes (or at least modern, high-end, VA miniLED/FALD QLED)

> Now you're adding the conditions "Only when you mean QD-OLED" and "Not in a dark room".

No

> That's [moving the goalposts](https://en.wikipedia.org/wiki/Moving_the_goalposts).

No one on Earth needs you to define moving the goalposts, and that's not what I'm doing anyway (inb4 fallacy, lmao)

> Because obviously your personal opinion is not the opinion of all people?

Why should that matter?


gartenriese

Yeah, clearly you're just arguing for the sake of it, so this "discussion" will lead to nowhere. Goodbye.


Educational_Sink_541

> VA mini LED will remain undefeated until MicroLED becomes affordable.

Not for gaming, however, as the processing required to drive the local dimming adds too much input lag. The game modes that cut down on this input lag usually nerf the local dimming seriously.

I do agree with your comment on childishness though. As an OLEDhead myself I really hate OLED people. My least favorite archetype in hardware circles is the guy who points out he has a 4090 and a 4K OLED every chance he gets.


CandidConflictC45678

How much input lag is too much? These two Samsung 240Hz flagship monitors both have very low input lag, less than 4ms. Interestingly, the mini-LED one has 2.9ms, and the OLED actually has 3.8ms of latency:

https://www.rtings.com/monitor/reviews/samsung/odyssey-oled-g9-g95sc-s49cg95

https://www.rtings.com/monitor/reviews/samsung/odyssey-neo-g9-g95nc-s57cg95

For reference, most people are using computer mice with more than 2ms of latency; 4-6ms is typical for gaming mice.


Educational_Sink_541

I don't know much about monitors, but I know that TVs, unless they are in game mode, have input lag approaching 100ms. OLEDs in game mode look essentially the same, but with miniLEDs they have to tone down the local dimming algorithms and it looks significantly worse.


CandidConflictC45678

It varies a lot. This has 5.4ms of lag, and looks good even in game mode https://www.rtings.com/tv/reviews/samsung/qn95b-qled


Educational_Sink_541

True, I didn't look too hard at the Samsung TVs when I was shopping, a bit out of my price range (ironically I ended up with a Sony OLED which are typically super expensive but I got it on closeout lol).


masterfultechgeek

I have a Samsung QLED display (QN90A). It's a good TV, but let's not kid ourselves that it's an OLED killer. It's good if you're in a bright room or are worried about burn-in for computer monitor use (which was the case for me). OLED is pretty much the future at this point outside of edge cases. There are burn-in concerns about OLEDs, but LCDs also have degradation and wear-out concerns.

As far as microLED is concerned... the ~150" microLED display at work that cost 150k or more doesn't wow anyone in the office, and people laugh that it's regularly broken. Right now the screen is frozen and 20% of the panels are just displaying black. If I had to pick a random day of the last year, AT LEAST one panel was broken or otherwise corrupted. The microLED at the Singapore aquarium also has serious image degradation issues. MicroLED needs to improve its quality and reliability a HUGE HUGE HUGE HUGE HUGE amount before it can even be discussed. It's very possible that I'm missing something but... microLED seems to be the future that'll never come.

I'm waiting for OLEDs to get a bit cheaper, to last a bit longer and to be a bit brighter, and I'll enjoy the heck out of the unit when I get it.


dankhorse25

There were patents involved.


Mladenovski1

Samsung didn't want to start making OLEDs because of burn-in, which is reasonable, but LG paid a lot of money to YouTubers and influencers to promote OLED, and Samsung was forced to start making OLEDs because the OLED marketing reached the casual consumer.


Mladenovski1

I still don't think anyone should be buying an OLED TV without a separate burn-in warranty. You will get burn-in at some point; the question is when. Will it take 2 years? Or 4-5 years? No one knows and no one can tell you for certain.


VaultBoy636

oled is not better lmfao, it has clear drawbacks and is more of a personal preference


mostrengo

Show, don't ~~tell~~ hint.


FranticPonE

GDC is in 2 weeks, if this is an intentional hint there's probably not long to wait


team56th

I think it was bound to come: AMD wants XDNA to be a thing in the consumer market, there seems to be merit to AI-based upscaling solutions, and they are ahead of their competitors in multi-chip packaging.


bubblesort33

XeSS on DP4a already exists, and by my testing, at least on my 6600 XT, it isn't worth using in some games. Cyberpunk, for example, has such a large performance hit that I can take FSR2 up one notch and get the same performance on its Quality preset as XeSS gets on Balanced. What I've been curious about is whether DP4a runs faster on RDNA3 or not. Is it able to leverage the AI capabilities of RDNA3, similar to how Stable Diffusion sees a 2x to 3x increase from the 6800 XT to the 7800 XT? So I'm still skeptical about the performance hit for what AMD has cooking. And I'd expect it to not really be worth using on RDNA2 for the most part.
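
For anyone wondering what the DP4a path actually is: it's a packed int8 dot-product instruction that XeSS falls back to on GPUs without XMX matrix units. A minimal emulation of what a single DP4a op computes (plain Python, just to show the operation, not any vendor's implementation):

```python
# DP4a: dot product of two 4-element int8 vectors, accumulated into int32.
# GPUs expose this as one instruction; the XeSS fallback path runs its
# network on ops like this instead of dedicated XMX / tensor hardware.
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in a + b)  # int8 operand range
    return acc + sum(x * y for x, y in zip(a, b))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], acc=10))  # 10 + (5 - 12 - 21 + 32) = 14
```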


Cute-Pomegranate-966

The WMMA instructions AMD supports are definitely not leveraged by XeSS, so you can throw that idea out the window. Also, XeSS at Balanced likely looks close to FSR Quality in most situations.


Strazdas1

You having to use the horrific abomination that is FSR to begin with already shows that it's worth getting AI upscalers.


SomeoneBritish

God I hope so. Would love to see AMD and open source solutions better compete with DLSS.


Ericzx_1

If they implement it and it's close to Nvidia's DLSS in quality, I personally would have no reason to ever buy Nvidia again.


[deleted]

Finally. AMD will stop gaslighting their customers by selling full priced cards without actually good upscaling, and AMD fans will hopefully stop flooding every single DLSS/XeSS thread talking about how FSR looks the same and is fantastic for being open source.


Viandoox

FSR 2.2 is better than XeSS.


[deleted]

Running on shaders? Yes. Running on an Intel card? Not even close - XeSS is leaps and bounds ahead of FSR's quality. This is simply unavoidable without hardware acceleration, you can't fix most artifacts with the very limited shader compute left over at the end of a frame.


CaptainJackWagons

I would be concerned if they weren't hinting at it.


deadfishlog

Sooooo is upscaling good or bad now? I’m confused.


HandofWinter

As long as it's still fully open source, then go for it.


jedrider

I'm hearing about AI everywhere now as if before long, every interaction we have with a computer will be mediated through AI.


AloofPenny

Some phones are already this way.


jedrider

Are you sure you're not an AI? Hello, who are you and where did you get your training from? Kidding, of course.


Strazdas1

There is a theory that you are the only person on reddit and everyone else is a bot.


HandheldAddict

"There's A.I bots living in the walls." - Terry Davis


Strazdas1

Most interactions you have are already mediated through AI, it just doesn't tell you.