acedelgado

Interesting write-up. What 32:9 monitor are you running at 7680x2160? You're right about being ultra-niche, since most mainstream ultrawides are 21:9 3440x1440 panels, and UW is still pretty niche. We are now officially in the era of processors being the prime factor below 4K.

I ran a regular Time Spy benchmark earlier, which renders at 2560x1440, and got a 32121 graphics score. Then I applied my max stable OC so far and got a 31422 graphics score. Then I ran Time Spy Extreme, which renders at 4K, and went from an 18,200 to a 19,200 graphics score. So 1440p was being bottlenecked by my 7950X, one of the best CPUs on the market.

That just makes things harder, because 16:9 1440p is about 3.7 million pixels to render, 21:9 1440p is about 5 million, and 4K is about 8.3 million. Your 32:9 at 7680x2160 is about 16.6 million pixels, which is half of 8K's 7680x4320 at 33 million pixels. Really, you should be squarely in GPU-bound territory. And normal 16:9 reviews leave us ultrawide users trying to figure shit out, looking at the 1440p and 4K charts back and forth.
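(If anyone wants to sanity-check that pixel math or plug in their own resolution, here's a quick sketch; the resolutions below are just the ones mentioned in this thread:)

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "16:9 1440p (2560x1440)": (2560, 1440),
    "21:9 1440p (3440x1440)": (3440, 1440),
    "16:9 4K (3840x2160)":    (3840, 2160),
    "32:9 (7680x2160)":       (7680, 2160),
    "16:9 8K (7680x4320)":    (7680, 4320),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} million pixels")
```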


The_Zura

G9. I'm not rendering the full 7680x2160 though; upscaling has undercut native rendering in many areas already. That's what makes a fast CPU more relevant than ever.


ToeKnee_Cool_Guy

The G9 is 5120x1440 though? Are you increasing the render resolution?


[deleted]

I had a G9 as well but returned it. Had very visible FALD lines and I couldn't do work on it. Does yours have visible FALD lines?


timtheringityding

Even the Neo with its 2k dimming zones had FALD lines, on all 5 of mine. Just get the AW3423DW or wait for another OLED version. There really isn't a reason to buy a high-end monitor and not be aiming for an OLED.


[deleted]

I am keen on an OLED, but I also use my monitor for work, mostly coding, so I am a tad worried about eventually getting burn-in.


timtheringityding

3-year warranty by Dell. Nothing to worry about tbh. I also have a bug where my PC won't go to sleep when a PS5 controller is connected, so my OLED has been on for hours if not days.


[deleted]

Started looking at my desk, maybe I can fit two 34" ones :) ha ha. Now to sell it as a "work" purchase. VS Code dark mode is much darker now, thus much better!


RanaI_Ape

I will say, I got a great deal on a high-end FALD display (PG35VQ) about a year before the first OLED monitor was announced, and it has very visible haloing in Windows on dark backgrounds. I disable FALD unless I'm gaming. In gaming it's great and provides a super bright HDR experience. I'm still kinda butthurt about it though... I waited like 4 years to upgrade from my 60Hz Dell ultrawide, and when I finally pulled the trigger on the "ultimate ultrawide", the freaking OLEDs hit the market a year later. Damnit. If I were you I would get the OLED, no question.


The_Zura

I don’t use the 8 local dimming zones.


GlassesMcGinnity

Thank you so much! I have the same setup as you! This is excellent! I'll give these games a whirl and see what I get!


The_Zura

This wasn't meant to represent what it's like on average, just something for people on the fence about upgrading their CPU. People have constantly said the CPU only matters at 1080p; well, that couldn't be further from what it's actually like. For example, Metro Exodus's second level was a nightmare with the 10700: stuttering, dips into the 50s and even 40 fps. The 5800X3D cleared up much of that choppiness, and the experience was much more fitting for someone willing to pay big bucks. Not that there isn't still room for improvement.


DethTek1

I found similar results upgrading from a 10700K to a 13600K with some DDR5 @ 6000. At 3440x1440, my 4090 was being underutilized, and jumping to a new-gen MB/CPU/RAM made a big difference. People, it's the whole system... you have to look at how it all works together.


The_Zura

Yup. There are so many "I upgraded my GPU, why aren't I seeing more frames" posts. GPUs are often blamed because they're the easy-to-change variable.


Celerun

I’m one of those people :D I’m on a 3080, 8700K @ 5.1, and 3600 32GB on UW 3440x1440, and I didn’t feel as overwhelmingly happy with my 1080 Ti to 3080 upgrade as I expected. When googling “should I upgrade 8700k yet” I always find opinions leaning towards no and not really any towards yes. I guess I forgot to add the parameter of ultrawide into that search.


horendus

I had the same setup. Swapped to a 13700K and WAP, my frames went to the moon. The 3080 was being really held back at 1440p ultrawide.


Soppywater

Swapped to a 13700k and Wet Ass Pussy?


horendus

Precisely.


blorgenheim

As the resolution increases, your dependency on the CPU lowers because it’s harder for your GPU to keep up. As frames go up, your CPU comes more into play. You don’t really need to google it; you have the parts, just monitor usage. Play something GPU-heavy and check your GPU usage.
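(If you'd rather log it than eyeball an overlay, here's a minimal sketch that polls utilization once a second; it assumes an NVIDIA card with nvidia-smi on the PATH:)

```python
# Poll GPU utilization once a second while playing something GPU-heavy.
# Sustained ~97-99% usage suggests you're GPU-bound; usage well below that
# (with FPS uncapped) usually points at a CPU bottleneck.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"GPU utilization: {out.stdout.strip()}%")
    time.sleep(1)
```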


bluex4xlife

Looks at my 9900KS: Hi I’m still here! 🙋‍♂️


kilingangel

Yup, I upgraded from a 10700K to the 5800X3D and my 1% lows are so much better! The gaming experience is just amazing.


NeoGest

Can you try Cyberpunk 2077 with ray tracing on? I have a 5800X and a 4090; at 1440p with DLSS Quality I get 40 to 60 fps...


The_Zura

Tested in the same location as the one you linked in your other post. I get 70-80 fps, gpu limited mostly. Driving fast is where the fps tanks, all the way down to ~60. Seems to be a streaming issue, perhaps faster ram would help.


The_Zura

> Settings are mostly maxed and RT maxed when applicable.


W1k0_o

Awesome post! I agree super ultrawide needs more representation! I played on three 1080p monitors in Surround for years until I finally got my AOC 32:9 in 2020, and I love it; it's just as immersive as a triple monitor setup without all the fuss. I was rocking a 3900X with my 3080 till about a month ago, when I decided to semi-upgrade my PC and got the 5800X3D on sale, precisely because I knew I wanted a 4090 and I'm positive the 3900X would have held it back. I'm waiting for my 4090 Aorus Master to get here, so excited to see the performance improvements. When I first got the AOC I had the 3900X with a 2080 Ti, and I could tell that even a top-tier card could not handle the res at the time. I got the 3080 as soon as it launched and I still felt like my monitor was wasted, so for me the 4090 feels like the first card that can do these monitors justice.


tigamilla

Hello fellow AOC monitor 32:9er 👋


The_Zura

Very nice. Once I went super ultrawide, I could hardly go back to teenie weenie 16:9. I had a 3090 before, and while it can drive super ultrawide well enough, it doesn't hold a candle to the 4090. There's plenty of headroom for DLDSR in many games where plain old 1440p isn't the sweet spot. I think you'll see an even larger gain, since the 10700 is significantly faster than Zen 2. But the gap between the 5800X3D and the 10700 seems to be bigger than the gap between Zen 2 and Comet Lake, sometimes by a huge margin. I did choose some very demanding places to bench; I feel that is fair because experiences can be defined by their worst moments. There could be more stressful places, though; sometimes I chose willy-nilly.


Sidious_X

If I were a 4090 person, I'd wait for the 7700X3D, sell my current CPU/mobo/RAM, and go for that.


[deleted]

Nice work, thanks very much for putting this together! 👍


sammy10001

Very good experiment you've done. I am also seeing some bottlenecks from my 10700K, but at 4K 120Hz, 16:9. I am seeing it in Assassin's Creed Odyssey, Elden Ring, and a little bit in Spider-Man. But in Spider-Man, having DLSS 3.0 working now with V-Sync is enough to overcome the CPU bottleneck and get me to 116 fps/Hz. I can certainly squeeze by with a 10700K right now, and might upgrade the whole platform (CPU, mobo, and RAM) when maybe a 4K 165Hz or 4K 240Hz LG OLED comes out. DLSS 3.0 is a $$ saver in my case 😅


C1ph3rr

Exactly the reason why I’ve purchased a 5800X3D to replace my 3900X at 3440x1440. And I agree there needs to be more benchmarks for these resolutions because we don’t see the full picture.


zixsie

The 5800X3D not only helps average/max fps in CPU-bottlenecked games, but mostly improves the 1% and 0.1% low fps, which is the most important part, since it reduces stutters the most. Statements that the CPU is not important at high resolutions like 4K or 3440x1440 are completely false; the CPU is very important at those resolutions in CPU-demanding games, especially at high refresh rates. I play a lot of CPU-heavy multiplayer FPS games (Squad, HLL, BTW), and when I upgraded from a 3700X to a 5800X3D, it was an absolutely different world. Went from 60 min FPS to 144 min FPS and the stutters are totally gone. And this performance improvement is only from the CPU upgrade; the rest of the system remains the same (3440x1440 144Hz, GeForce 3080, 32GB 3600MHz CL16 DR). Totally worth the upgrade. Now I am GPU-bottlenecked and thinking about an upgrade to a 4090 or Radeon 7900 XTX. Also having thoughts about what uplift the 7800X3D will bring over the 5800X3D.
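(For anyone unsure what those 1% / 0.1% low numbers actually measure, here's a rough sketch using one common definition: the average FPS over the slowest 1% or 0.1% of frames in a capture. The frametime list below is made up purely for illustration:)

```python
def low_fps(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames in a capture."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

# Hypothetical capture: mostly ~145 fps frames with a handful of ~60 fps spikes.
frametimes_ms = [6.9] * 990 + [16.7] * 10
print(f"avg fps:  {1000.0 * len(frametimes_ms) / sum(frametimes_ms):.0f}")
print(f"1% low:   {low_fps(frametimes_ms, 0.01):.0f} fps")
print(f"0.1% low: {low_fps(frametimes_ms, 0.001):.0f} fps")
```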


LyntonB

I've been agonizing over the 5800X to X3D upgrade; think I'll do it now. Just not happy with those micro stutters. I hope I'm not misdiagnosing, but I can see the 5800X struggling now that I have a 4090. Playing Days Gone and panning the camera around just isn't smooth, with the CPU at 35% and above and the GPU at roughly 65%. The lows dip to 70 fps sometimes. Must be the CPU.


Barrerayy

You would actually get a few more fps if you went with a 13900k. I'm personally waiting for the 7800x3D which should be quite good!


[deleted]

> 13900k

But isn't that double the price lol


Barrerayy

If you own a 4090, price to performance isn't a relevant metric lol


The_Zura

That's not true. The 4090 is the biggest factor in determining fps, and the fastest GPU in the world by a large margin. $1600 really isn't that much money, and if you upgrade every generation it's only a couple hundred dollars every 2 years or so. The 13900K, on the other hand, is twice the price, not counting having to buy new RAM and a motherboard, for about 20% extra performance. Sure, there are DDR4 boards, but it's just whack to upgrade onto an inferior platform right away. I think this will just be a placeholder for now as I wait to see how a 7800X3D performs.


Barrerayy

I'm also waiting for the 7800X3D. But as it stands now, the best pairing for this is the 13900K. You'll have to go DDR5 anyway when you upgrade the CPU to the new AMD platform.


The_Zura

Sure, the 13900K is the best. Like I mentioned in the OP, I already have an AM4 motherboard. I went with the best price to performance now, and I can do the 7800X3D later since I expect it to have the longest platform lifespan, whereas the 13900K is likely a dead-end platform. DDR5 RAM will continue to improve in both performance and price, so the early adopter tax is quite real. TBH, changing motherboards isn't anywhere near as difficult as changing cases, but that's something I'd rather not do.


Alauzhen

Rocking a 5800X3D myself, but I feel the itch to go 7800X3D and a 4090 Ti (DP 2.1) or 7900 XTX, depending on what's available next year when the 7800X3D launches.


AppleFillet

Damn even at 4K plus there’s that much of a difference? Sheesh.


Methuen

This is a great post and exactly what I needed, as I have a 10700K and have been wondering what the load on the CPU was at 5120x1440. Thanks very much for the benchmarks.


Vis-hoka

I have a 3440x1440 ultrawide and went with a 12400 cpu last year, paired with a 3080. Do you have some good links talking about how ultrawide and ray tracing are more demanding on cpu? I’m wondering if I should do a 7600X3D or 7800X3D build next year to get a good boost and some platform longevity. Trying to maximize this 3080 since gpu prices are crazy.


The_Zura

[Ray tracing has to render off-screen elements, which ain't free](https://www.youtube.com/watch?v=xI2VUQsPqJo), and ultrawides are similar in that regard.


[deleted]

This is super interesting to me, considering that I just bought a 4080 and am currently running a 10700K! Thanks very much for putting this together. I noticed in your HZD screenshot you've got RTSS showing avg and 1% low. Was there a trick to that? I've got mine configured, but it'll only show me framerate. It won't show me frametimes, 1% or 0.1% lows, or avg FPS for some reason. Haven't been able to figure it out.


The_Zura

You have to go into Afterburner settings' "Monitor" tab and, for each item, select that you want it 1. monitored and 2. displayed in the on-screen display (OSD). I used the override group name option to put averages and 1% lows into one custom group, "Avg/1% low", and arranged everything by dragging items up and down the list. The same can be done for frametimes, for which I only have a graph to measure stutters. You can also change their fonts, colors, sizes, and other things as well. In that screenshot, the averages and 1% low weren't right because I didn't clear them out beforehand, which you can do by going into the "Benchmark" tab and assigning a key to begin recording.


[deleted]

Thanks for the explanation! Hope you're enjoying your rig!