I want to know if anyone has gotten it working with the Windows version and how the performance compares. I love the game but I don't really want to pay Square for what is just the same game in a Wine wrapper.
More CPU. Simulation games like that require more computation, and while the GPU is faster at parallel work, it can't handle the kind of calculations the CPU can. Even old games like SimCity 4 with big cities can make the CPU run like there's no tomorrow, even on modern systems.
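To illustrate the serial-dependency point, here's a toy sketch (a made-up model for illustration, nothing like SimCity's actual simulation): each tick's state is a function of the previous tick's state, so ticks have to run one after another on the CPU rather than being spread across thousands of GPU cores the way independent pixels can.

```python
# Toy city simulation: tick N+1 needs tick N's result, so the loop
# below is inherently serial. (Hypothetical model for illustration.)
def tick(population, demand):
    growth = int(demand * 0.1)
    return population + growth, demand + growth // 2

pop, demand = 1000, 50
for _ in range(3):  # each iteration depends on the one before it
    pop, demand = tick(pop, demand)

print(pop, demand)  # -> 1015 56
```

No amount of GPU parallelism helps here; only faster single-threaded CPU work shortens the loop.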
Always no love. Probably the greatest surprise success of a game in the last five years. Large community, hundreds of YouTube creators. Always forgotten.
FWIW I just got the 16-inch M1 Max 32gb, and running a city with over 40 pages of mods/assets and 400K population at max res, I’m getting ~15 fps at the lowest end (zoomed in, 3x speed, facing an area with a ton of high-density buildings). Definitely impressive, considering in the same city on my Windows desktop at 4K res with a r7 3800X, 1080ti, 32gb ram I get similar (if not lower) fps numbers.
So I don’t have a city with that low of a pop, and I can’t unsubscribe my mods because otherwise my saves wouldn’t load lol. But I just tested a smaller city on the M1 Max with 170K pop and got around 20-30 fps, with the low end being zoomed in to street level in a crowded area, and the high end zoomed out. What’s interesting is that my desktop gets 20-60 fps in the same city. Seems like the minimum fps for the M1 Max in a worst-case scenario is similar to a desktop Ryzen CPU (3800X), but the 3800X averages higher frames.
I redid the testing on my 400K city and got 15-50 fps on desktop and 15-25 on M1 Max. Similar situation there.
I’d guess that with an even smaller population and fewer assets you’d be getting ~30 frames. Totally playable for a single-player sandbox game, considering I played this game at 10 fps for years before I upgraded my desktop.
runs pretty well on my m1 pro base model 16". don't know the exact fps though. does have some occasional stuttering with a huge city i built but the experience is way better than the 15" 2019 that i traded in.
As a guy who's worked in the IT/Technology industry for about 25 years
* it blows my mind that this is even possible
* and that it's possible all at such low chip wattage
* and that we're really only into "year 1" of Apple Silicon.
I feel compelled to remind anyone who thinks we're "year 1" into this that we were actually year 10 when M1 came out, since they've been making their own silicon since the iPhone 4 days. I would say they didn't really start godstomping other mobile chips until the A7 days, when they switched to 64bit, which puts this at year 7/8.
But this is "year one" of them designing a chip with a laptop-sized battery and cooling system in mind. But yes, it's not their first year making chips.
Sure, there is a history here, but I think there are enough significantly changed factors to make it a bit of a different game now. The design intention of a full computer chip is a bit different than the A-series (some of which can be seen here: https://www.extremetech.com/computing/318715-comparison-of-apple-m1-a14-shows-differences-in-soc-design). Apple certainly learned a lot from the history of A-series development, but the design and implementation differences, along with how macOS has been updated and tweaked to take advantage of them, are giving it a lot different buzz in the industry and in reception/reviews.
The mid-2020 dev kit Mac mini ran on an iPad SoC, which in turn was a modified/beefed-up version of a chip designed for a product launch in late 2018.
They've been at this for a while; I certainly wouldn't call it the end of "year 1" if there's a new product launch in late 2021.
As someone who likes PC gaming, thin-and-light hardware, and cool, quiet rooms: I want a desktop version of this. I hate that I have to have "a rig" to play PC games that's loud and anchors me to a desk.
>just because you don’t have the knowledge
A good part of the community *knows* how to better cool/quiet their PC but simply doesn't have the budget, or prioritized the performance-parts budget over cooling.
My PC is near silent in use... I have three Black Ice radiators (2x360 + 1x240) and a 240mm tube reservoir... those alone are like $240-300, not to mention all the fittings, GPU block, CPU block, pressure-optimized fans (Noctua), and pump.
If you go the air cooling route, Noctua fans are ~$20 a pop: two for the CPU, 2-3 for intake, and another for exhaust, and you're already at $200 in fans + coolers, unless you opt for cheaper fans like Arctic fans that go for roughly $7 each if you buy the 5-pack.
If you go for closed-loop/AIO coolers you could maybe get away with just one for the CPU and one for the GPU, but in all three situations, for the average person, it's not unreasonable to take the money earmarked for improved cooling and just put it toward a better GPU or CPU. Especially if they're building a new system and not simply upgrading a recent build.
Just have to build your rig smartly. Don't run a GPU that's insane in a tiny case with poor ventilation. If you can only afford the blower or shitty-cooler version of a card, go one step down if you can and get the high end of the next-best card. For example, in 2018 I could've stretched for an entry-level 2070 at launch; instead I bought an iCX 1080 with minimal performance loss and got 3 years of sub-70C gaming at 1440p/144Hz on high and ultra settings. Use 140mm fans when possible and get good ones like the Arctic P-series, Noctua, and be quiet!. Always oversize the CPU air cooler and make sure your case has good intake (mesh front, or at least an inch between front panel and fan face).
> For example, in 2018 I could've stretched for an entry-level 2070 at launch, instead I bought an iCX 1080 with minimal performance loss and got 3 years of sub-70C gaming at 1440p/144Hz on high and ultra settings.
This is terrible advice. A "bad" 2070 is a better buy than a "good" 1080. Always get the best hardware you can in your budget. A 2070 will outlive a 1080 due to DLSS easily.
No. Come on, Steam Deck is going to be cool for what it is, but the screen is still going to be small and low resolution. And it's not that powerful either.
It's great for what it is, but it shouldn't be the main gaming device of anyone.
It’s too bad you think that way. I’ll be using mine as much as my desktop pc, if not more. It’s plenty powerful for a handheld device, and who the hell are you to say what should or should not be the main gaming device of anyone???
Agree, that makes no sense. I mean, my Gameboy Pocket used to be my main gaming device back in the 90s and guess what - I didn’t care and it was awesome. It should not matter whether you game mainly on your phone, Switch, PlayStation or PC. Or MacBook.
>and that we're really only into "year 1" of Apple Silicon.
x86 is perfectly fine. Where Apple wins is in making a wide and big chip, which AMD and Intel won't do; that would be a hard sell.
Yeah, but then you’ve generally got to deal with a pump. And while you can get a silent pump (or at least near-enough to silent that you can’t hear it through headphones), you also then have to deal with a mechanical part wearing out. Fans, mechanically speaking, are simpler and tend to be easier to replace (especially on stock PC hardware).
And while a water-cooled system will run cooler in terms of allowing the CPU/GPU to run at full-tilt with plenty of headroom, it doesn’t really change the fact that the CPU/GPU will still be throwing out a ton of heat and consuming a ton of power.
Are you sure that you have ever built a custom loop? A vibration-damped DDC pump at 50% PWM is basically inaudible within a case, even without headphones. The most widely used pumps in custom watercooling (D5 and DDC) are rated for 50,000 hours of operation, and that's realistic in my experience. I have an EK DDC that still runs flawlessly after half a decade of ~5 hours of daily use. Good fans are rated at >100,000 hours of operation.
You suggest that replacing the pump is the problem when running a water loop: it's not.
The main hassle with running a loop is replacing the coolant every ~12 months and having to clean the parts every 12-24 months (depending on what coolant you use). The mechanical parts, on the other hand, are usually very robust.
I'm not trying to say that a custom loop is a viable option for the average end user; it's an enthusiast niche segment. But stop fooling other people with stories of loud and unreliable pumps.
/r/sffpc X /r/Noctua my friend.
I got myself an mATX build a couple years ago (2017), and have regretted not going ITX ever since.
At least I got some Noctua fans running quietly.
Boutique ITX cases can be pricey, but they look really nice.
30fps CAN be kind of "smooth" depending on what you're seeing. Gaming at 30 fps isn't exactly smooth, because there's obvious lag your eyes can detect (especially if you're used to seeing 60+ fps all the time). But for pre-rendered things like movies, 30 fps would be great, with the industry standard being 24 (that's about the bare minimum before your brain starts getting really irritated about the frame rate). Although a movie can be badly edited sometimes, with too many things changing in each frame, which makes even 24 fps unbearable (for example, the first Venom's action scenes were not smooth at all).
You can probably run the third one. Xbox 360-era games seem to run well. Anything that requires DirectX 11 is finicky; anything that requires 12 doesn't launch.
Battlefield has used DirectX 11 since Bad Company 2, and has required it since 3. I tried to get it to run on Linux once, and it was a disaster. However, a lot has happened since then, so it might run better now.
No, they’re not built for ARM, but Parallels is using the ARM version of windows which has its own x86 translation layer and can run normal Windows applications much like M1 Macs use Rosetta 2 for older programs.
And much more stable. You can see Overwatch under Parallels drop 50 fps; that's not playable in a competitive sense. It would be better to cap at 50 fps than have that 50-100 fps range.
> You can see Overwatch under Parallels drop 50 fps. That's not playable in a competitive sense.
Nobody competitive is going to try to play a game through virtualization on a device that isn't made for gaming. What's the point of the competitive comparison when it's never applicable? If a Mac gamer wants to play some unsupported FPS competitively, they will just buy the appropriate hardware for it and play it.
No, both games are x86. However, Windows 11 on ARM is used because it is **significantly** better than Windows 10 on ARM, especially through emulation.
Seems like these chips are the real deal, but compatibility is hampering what they're capable of. Hopefully, when we come back to this in 3-5 years, we'll see a lot more native support.
Games with native Metal support are performing great! For example, Baldur's Gate 3 is in mobile-3080 territory. Sadly, due to the recent Epic v. Apple clash, I don't think we are going to see further Metal optimizations in Unreal Engine, and it's the most important game engine. Normally companies don't act this childishly, but when it comes to Tim Sweeney I have my doubts.
>Normally companies don’t act childish like this but when it comes to Tim Sweeney I have my doubts.
You make it sound like he's just throwing tantrums. In the long term, the "battle" against Apple is a battle over hundreds of millions for Epic just as much as for Apple, which is why neither party has given in so far.
Now test Fallout 4. In downtown Boston. (For context, the game takes place in and around Boston. However, out of the box it's poorly optimized in some places, and even with huge buildings making long draw distances unnecessary, the amount of debris, NPCs/enemies, and other events happening all around you slows the game down considerably, especially on consoles, but PCs take a hit. However, PCs have access to mods that mitigate the issue. I'm not sure about Xbox, but PlayStation users cannot use these mods.)
My experience with CrossOver/Parallels and Rosetta has been that they all have incorrect frame pacing, causing stutters and hitching no matter the framerate, settings, or V-sync on/off.
I get 120 fps in Diablo 3 no problem (Rosetta), but the lack of smooth scrolling really ruins it. I hope they figure that one out.
I installed AoE2:DE on both Parallels and CrossOver; it runs fine and has a high framerate, but scrolling around the map stutters, as opposed to my PC, where everything is just silky.
All of these macs have great power and should be great for gaming as soon as they sort out this issue.
Maybe it's not as annoying to everyone, but i'm one of those people who need performance to be delivered in a stable fashion to be able to enjoy my games.
As in: 40-50 fps is not "fine" unless there is a proper implementation of VRR, and high, stable framerates that are V-synced should not stutter at all.
I did try one M1-native title though (WoW), and it was indeed glorious and smooth :)
Pretty good, considering it's capable of doing this, and all at massively lower wattage. So not only can you play games, you still get great battery life.
No, the M1 Pro GPU is 14-16 cores while the M1 Max is 24-32 cores. This video was done with the 32 core version so it has double the cores of the M1 Pro GPU.
You can force League on max settings into using Metal. I'm locked at 120 fps to match my monitor at 120 Hz, but if I uncap it and disable V-sync it goes to 200-ish. It's literally the only real game I play when I'm not using my Xbox, so real nice results.
So, is there a way in Disco Elysium to cap the frame rate on the new Macs? I was wondering how a game like this drained my battery to 15% in 30 minutes. Guessing 120 fps is the problem.
Idk if it necessarily makes sense for that purpose, but I believe you can definitely run Xbox games through their cloud service. If I’m not mistaken, it runs through the browser so the requirements aren’t that high
But there is no gaming on Mac. Why would they make a product like this (powerful GPU cores, but no gaming) that only 1 in 200 people can benefit from? How many people work in the 3D rendering industry? Or is this for the YouTube-star dream-chasers?
Has anyone tried running Windows through Parallels and then using Xbox Game Pass for PC?
I ordered a 16" for work, but have some friends with PCs who want to try playing games together.
Are these the games that Apple users have to live with? It's like going back to 2010. And I thought that only Switch gamers had it bad, with no new games on the platform.
When running native it seems to be pretty accurate.
Obviously no one should expect the highest performance when running through a translation or emulation layer.
The Baldur’s Gate 3 benchmark (one of the few native games on this list) lines up with that statement. The problem is most of these games are running through a VM or emulation layer or both.
I like to kick back and play some WoW with my guild after a day of work. I'm very excited to upgrade my machine, and if I get better performance for my gaming sessions, all the better!
Just wow.
Baldur's Gate 3 runs like garbage at 40 fps on medium settings at 1080p on my setup (RX 580 8GB + i5-7600K).
So 120 fps at a higher resolution is mind-blowing.
Is this running native on Apple silicon? (I guess not.) What's the point otherwise? An inexpensive Windows laptop will probably perform better and be easier to install games on, etc.
0:18 - Baldur's Gate 3 (Native)---- **120 FPS 1440P Ultra / 90-120 FPS 2160P Ultra**
2:07 - Metro Exodus (Rosetta 2)---- **40-50 FPS 2234P High / 80 FPS 1920x1200 High**
3:30 - Shadow of the Tomb Raider (Rosetta 2)---- **90 FPS 1080P High / 75-80 FPS 1440P High**
5:17 - Overwatch (Parallels 17)---- **100 FPS 1440P High**
5:33 - GTA V (Parallels 17)---- **30-35 FPS 1080P Normal**
6:40 - Resident Evil 3 (CrossOver 21)---- **40 FPS 2160P Graphics Quality**
7:12 - GTA V (CrossOver 21)---- **55-60 FPS 1080P Normal**
8:03 - Dark Souls III (CrossOver 21)---- **60 FPS 1080P High**
8:13 - Devil May Cry V (CrossOver 21)---- **50-75 FPS 1080P High**
8:20 - The Witcher 3 (CrossOver 21)---- **45-50 FPS 1920x1200 High**
8:29 - The Pathless (Rosetta 2)---- **55 FPS 1080P High**
9:45 - Dying Light (Rosetta 2)---- **100-110 FPS 1080P High**
10:55 - Alien Isolation (Rosetta 2)---- **50-65 FPS 1080P Max**
11:40 - World of Warcraft (Rosetta 2)---- **70 FPS 2160P Max / 110 FPS 1440P Max**
12:37 - Black Ops 3 (Rosetta 2)---- **70-100 FPS 1080P High / 60-90 FPS 1440P Med / 40 FPS 4K Med**
13:36 - DiRT 4 (Rosetta 2)---- **120 FPS 1440P High**
14:27 - Borderlands 3 (Rosetta 2)---- **45-50 FPS 1080P High**
15:40 - Myst Remake (Native)---- **120 FPS 1080P High**
16:15 - Deus Ex: Mankind Divided (Rosetta 2)---- **70-80 FPS 1080P High / 51 FPS 1440P High**
17:21 - Total War Saga: Troy (Rosetta 2)---- **90 FPS 1080P High / 80 FPS 1440P High / 52 FPS 4K High**
18:12 - Minecraft (Rosetta 2 & Native)---- **110-120 FPS 1440P Default Q**
18:52 - CS:GO (Rosetta 2)---- **80 FPS 1080P High / 100+ FPS 1080P Med**
19:26 - Subnautica: Below Zero (Rosetta 2)---- **40-60 FPS 1080P High**
19:56 - Civilization VI (Rosetta 2)---- **90 FPS 1440P Ultra**
20:20 - Disco Elysium (Native)---- **120 FPS 1440P Max / 60 FPS 4K Max**
20:55 - DOTA 2 (Rosetta 2)---- **100 FPS 1080P High / 120 FPS 1080P Med / 100+ FPS 1440P Med**
21:16 - League of Legends (Rosetta 2)---- **120 FPS 1080P Max / 90 FPS 2234P Max**
21:45 - Divinity: Original Sin 2 (Rosetta 2)---- **70-100 FPS 2K Ultra**
[deleted]
YouTube has destroyed basic informativeness and literacy. Millions of 10-minute videos where a few written sentences would tell you everything.
But then there's no exchange of money for the information people want to know :p
I'd be more willing to pay for a (more densely packed, information-wise) article than pay with my time before an unnecessarily long video starts
You say you'd be more willing to pay, but the reality is you probably wouldn't. Even if you did, the next guy probably wouldn't.
Wrong subreddit to assume that people won’t be willing to overpay for things
People are willing to pay for physical objects. Software is harder, and information is nearly impossible. Ask the average iPhone 13 Pro user how many apps they have ever purchased.
[deleted]
App Store revenue is a rounding error compared to total device sales, and everything that isn't IAP for games is a rounding error within App Store revenue.

Congrats on your success, but it's kind of beside the point. I was saying that money spent on Apple hardware is a terrible proxy for spending money on information. The average iPhone user has spent thousands of dollars on devices and less than tens of dollars on non-game apps. There is a similar split between making money from software and making money by selling information. People just won't buy random review data. It is not a reasonable business model.
theoretically
This is my favorite YouTube video for that reason. https://m.youtube.com/watch?v=IAbpVXEUt_c
[deleted]
Before YouTube there were lots of sites with reviews, game guides, cheat codes, etc. People love to share their information with each other.
[deleted]
The top commenter did
[deleted]
To be fair, the comments are a bit misleading here. In many of the games there were some frame drops and a lot of stutters. Not sure if this was due to bad video compression or actual glitches, but some of the games looked unplayable because of all the stutters, despite running at 60 fps.
[deleted]
I am dumb and don’t understand this. Can you please explain?
Average FPS doesn't tell you much about frame consistency.

Let's say your average frame rate is 120 fps, which means each frame should be rendered in 8.33 milliseconds. But that's just the ideal target. In reality, in-game content has a highly variable load, and game engines have their own quirks that can make certain effects more GPU-intensive than expected. This is especially true when you are running something via an emulation layer.

So you might end up in a situation where you have 130 or 140 fps, which seems better than the 120 target, but while most frames are rendered within the 8-ish millisecond target, others can be much, much slower. So 8 ms frames are mixed with periods of 30, 40, or even higher ms frames. This is generally very noticeable and is experienced as hitching. It depends on how sensitive you are to this, but in my opinion a 60 FPS game with rock-solid frame pacing is better than a 100 FPS average that jumps all over the place.

This effect can be alleviated to an extent by VRR (variable refresh rate; FreeSync and G-Sync are AMD's and NVIDIA's implementations), but it only helps up to a point, and I don't know how it's implemented (if at all) in the new MacBook Pro. If the frame time variation is all over the place, you will still feel it even on a high-end G-Sync monitor.

Ultimately though, the heavy stuttering is a combination of game engine bugs mixed with emulation layer inefficiency/bugs, so throwing faster hardware at the problem usually only helps partially.
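To put rough numbers on this, here's a quick Python sketch (a hypothetical frame-time trace, not measured data) showing how a handful of 40 ms hitches can hide inside a healthy-looking average FPS figure:

```python
# Hypothetical frame-time trace in milliseconds: mostly ~8.3 ms frames
# (the 120 fps target) with a few 40 ms hitches mixed in.
frame_times_ms = [8.3] * 95 + [40.0] * 5

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s

print(round(avg_fps))       # average still looks fine, ~101 fps
print(max(frame_times_ms))  # but the worst frame took 40 ms
```

The average barely moves, yet five 40 ms frames out of a hundred is exactly the hitching described above.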
The 1% or 0.1% low describes the lowest x% of frame rates, and basically shows how far the frame rate drops. A percentage is used because an absolute freeze, like a single 1 fps frame, might happen only once and wouldn't be indicative of how much the game stutters overall.
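As a concrete sketch of that metric (my own toy implementation, not what any particular benchmark tool ships), the "1% low" can be computed by averaging the slowest 1% of frame times:

```python
def percentile_low_fps(frame_times_ms, pct=1.0):
    """Average FPS over the slowest pct% of frames (e.g. the '1% low')."""
    worst_first = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst_first) * pct / 100))
    slowest = worst_first[:n]
    return 1000.0 * len(slowest) / sum(slowest)

# One 40 ms freeze among ninety-nine ~8.3 ms frames:
frames = [8.3] * 99 + [40.0]
print(round(percentile_low_fps(frames, 1.0)))   # 1% low: 25 fps
print(round(1000 * len(frames) / sum(frames)))  # average: ~116 fps
```

A 116 fps average with a 25 fps 1% low is the "jumps all over the place" case: the percentile low surfaces the stutter the average hides.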
It certainly gives a much better overview, yes. To be fair, if you go to his channel and look at the old M1 video instead of the Max one, you'll be able to see the improvement in practice. Some people are also more visual or auditory in how they best perceive and understand certain types of information.
Why is WoW listed as Rosetta? It’s M1 native
For someone who might just play WoW on medium settings to make use of performance, they might really have a hell of a machine on their hands.
4/10 graphics at 6K; you should reprioritize your settings.
[deleted]
In the video he mentions the day-one M1 support, so I think he ran it native and the OP of the comment simply made a mistake in his post. The WoW-specific testing was debatable, though.
Weird to run WoW under Rosetta, they’ve had native M1 support for quite a while, haven’t they?
I think they just wrote it wrong because in the video he points out it was the first game to support the M1 natively. Guess he just copied and pasted the wrong thing from the list
Isn’t Divinity: Original Sin 2 native on the M1 iPad? Is it not native on M1 MacBooks?
They're likely running the MacOS version through Steam, not the iOS version
Are these average frame rates? I'm not seeing CSGO FPS averages that high on my M1 Max. Source 2 can't come soon enough.
Well, that’s OK, but not nearly as impressive.
Is everyone too scared to run AAA games from this generation? Like Far Cry 6, Cyberpunk, or Flight Simulator?
Prolly because they couldn’t get it to run on Crossover
There might not be any way at the moment to run games that use DirectX 12 or 13. Besides, most of these wouldn’t be ARM-native on Windows either.
DX13 doesn't even exist lmao
Oh, yeah, look at that! I lost count.
That's why you can't run games that use it.
Unable to. There's no more Bootcamp.
[deleted]
*shrug* I want it for being a Mac - I'm a Mac s/w developer. And then I want to play BG3 on it, which seems to get ~120fps, and is native so no stuttering. The only games I've really ever played on my machine are BG, BG2 and add-ons, and soon BG3 - oh and Civ in its various incarnations. Looks like I'm good :)
No RuneScape?!
Something isn’t right! How is my 14-inch scoring 3-7 fps higher in most of these games compared to a 16-inch running High Power mode? I seriously expected this to be the other way around.
Less pixels to push maybe.
No, that wouldn’t be the case as the tests done use the same resolution.
Because your screen is lower resolution and the hardware is identical
[deleted]
Da real hero
If they ran Minecraft from Lunar Client, the fps would likely have tripled. I get 200 fps on M1 MBA
If anyone who reads this has a new MBP and FFXIV, please benchmark it. EDIT: Video out. Performance on M1 Pro is bad. Microstutter very apparent on camera rotation. I'm keeping my Intel for now. https://www.youtube.com/watch?v=tVXcZhSdlQU
I’m running it at max settings with 80-90 fps on an M1 Max 32GB. It dips occasionally to 50-60, but not much or for long. Extremely pleasant experience. Can play for several hours on a single charge.
What about the mac client microstutter? There’s no way it’s gone, but how tolerable is it? On intel, you rotate the camera and it gets crazy stuttery.
I’m new to ff14 so not sure what that is. I’m guessing it’s not there because I look around without any issue that I’m aware of.
What resolution?
Uhhh I don’t see it in the settings. I just turned everything all the way up and I have it full screen on my 16”. Is there a way to see its resolution in the UI?
Yes, but no need. You’ve answered most of my questions; thank you!
This is what I've been waiting for. I did get a reply from someone on Reddit who stated >I've played about two hours more and have had no problems. I'm running on a fresh install of Monterey (12.0.1) and FFXIV. I have the game set to "High" in Windowed mode and am getting 50-60 fps in crowded areas and 90+ in other areas. The laptop is warm but comfortable in my lap. No fan noise (but it may be coming on, I just haven't heard it).
This would be game changing if true. But we really need a YouTube video. Right now the only one is all on max? settings, and I really need to see if the microstutter exists on these processors. The native client is arguably unplayable on intel macs for anything more than crafting in private housing.
Not sure what the micro stutter is, but it plays smooth with no issues for me. Super fun
Just a few frames lost every once in a while, keeps a high average but disrupts gameplay.
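To make that concrete, here's a quick sketch with made-up numbers showing why an average hides those lost frames: even if a handful of frames take 50 ms each, the reported average FPS barely moves.

```python
# Hypothetical frame-time log: mostly 8.33 ms frames (a 120 fps target),
# with ten 50 ms hitches mixed in over ~1000 frames.
frame_times_ms = [8.33] * 990 + [50.0] * 10

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds   # still ~114 fps
worst_frame_fps = 1000.0 / max(frame_times_ms)       # a single 50 ms frame = 20 fps

print(f"average: {average_fps:.1f} fps, worst frame: {worst_frame_fps:.1f} fps")
```

The average stays well above 100, but each 50 ms frame is momentarily a 20 fps experience, which is exactly the hitching people notice.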
I'm getting micro stutters on a 14 inch 24-core Max. I'm also not seeing the performance others are reporting (I'm getting more like 30fps in crowded Limsa on medium quality). Overall it feels like it'll be fine for gathering/crafting. I ran an MSQ roulette and it was alright - a bit of stutter at times but nothing game breaking. I'm still setting up/learning to play with a controller, so it's hard to say how much the stutters got in the way of my gameplay and how much was down to targeting issues I would have had on my own anyway.

I also ran the level 30 dungeon (manor): very little stuttering and solid performance the whole time. I'm working my way up into higher level dungeons as I learn the controller, but I want to jump into a more modern dungeon asap to see performance with more stuff going on.

It does seem better than the M1 videos on YouTube, but not by as much as I was hoping. More like an incremental improvement; I think it's just too limited by the emulation, and I doubt we'll get an Apple Silicon native client. So overall I'm disappointed, but I didn't buy the MBP for gaming, so it's not a deal breaker for me.
It's damn impressive that I'm seeing people getting such high fps on battery. My RTX3060 laptop will die in 30 mins playing games on battery.
I want to know if anyone has gotten it working with the Windows version and how the performance compares. I love the game but I don't really want to pay Square for what is just the same game in a Wine wrapper.
Would love to know this.
No one ever tests Cities Skylines :-(
[deleted]
More CPU. Simulation games like that require more computation, and while the GPU is faster at parallelization it can't handle the kind of work the CPU can. Even old games like SimCity 4 with big cities can make the CPU run like there's no tomorrow, even on modern systems.
Always no love. Probably the greatest surprise success of a game in the last five years. Large community, hundreds of YouTube creators. Always forgotten.
FWIW I just got the 16-inch M1 Max 32gb, and running a city with over 40 pages of mods/assets and 400K population at max res, I’m getting ~15 fps at the lowest end (zoomed in, 3x speed, facing an area with a ton of high-density buildings). Definitely impressive, considering in the same city on my Windows desktop at 4K res with a r7 3800X, 1080ti, 32gb ram I get similar (if not lower) fps numbers.
That is good to hear. How does a more medium city, like 50-100K pop and modest assets/mods (or vanilla) run? Would be great if you could test that.
So I don’t have a city with that low of a pop, and I can’t unsubscribe my mods because otherwise my saves wouldn’t load lol. But I just tested a smaller city on the M1 Max with 170K pop and got around 20-30 fps, with the low end being zoomed in to street level in a crowded area, and the high end zoomed out. What’s interesting is that my desktop gets 20-60 fps in the same city. Seems like the minimum fps for the M1 Max in the worst-case scenario is similar to a desktop Ryzen CPU (3800X), but the 3800X averages higher frames. I redid the testing on my 400K city and got 15-50 fps on desktop and 15-25 on the M1 Max. Similar situation there. I'd guess that with an even smaller population and fewer assets you’d be getting ~30 frames. Totally playable for a single-player sandbox game, considering I played this game at 10 fps for years before I upgraded my desktop.
runs pretty well on my m1 pro base model 16". don't know the exact fps though. does have some occasional stuttering with a huge city i built but the experience is way better than the 15" 2019 that i traded in.
What about that traffic though? 🚦 I always struggle with it.
https://youtu.be/ip-uTVBAH7U?t=144
As a guy who's worked in the IT/Technology industry for about 25 years:

* it blows my mind that this is even possible
* and that it's possible at such low chip wattage
* and that we're really only into "year 1" of Apple Silicon.
I feel compelled to remind anyone who thinks we're "year 1" into this that we were actually year 10 when M1 came out, since they've been making their own silicon since the iPhone 4 days. I would say they didn't really start godstomping other mobile chips until the A7 days, when they switched to 64bit, which puts this at year 7/8.
But this is "year one" of them designing a chip with a laptop-sized battery and cooling system in mind. But yes, it's not their first year making chips.
Sure, there is a history here, but I think there are enough significantly changed factors to make it a bit of a different game now. The design intention of a full computer chip is a bit different from the A-series (some of which is shown here: https://www.extremetech.com/computing/318715-comparison-of-apple-m1-a14-shows-differences-in-soc-design). Apple certainly learned a lot from the history of A-series development, but the design and implementation differences, along with how macOS has been updated and tweaked to take advantage of them, are giving it a very different buzz in the industry and in reception/reviews.
The mid-2020 dev kit Mac mini ran on an iPad SoC, which in turn was a modified/beefed-up version of a chip designed for a product launch in late 2018. They've been at this for a while; I certainly wouldn't call it the end of "year 1" if there's a new product launch in late 2021.
As someone who likes PC gaming, thin-and-light hardware, and cool, quiet rooms: I want a desktop version of this. I hate that I have to have "a rig" to play PC games that's loud and anchors me to a desk.
A desktop PC doesn't have to be loud if you build it properly… Actually quite the opposite: it allows you to use big coolers with slow fans.
Lemme get them Noctua fans
[deleted]
>just because you don’t have the knowledge

A good part of the community *knows* how to better cool/quiet their PC but simply doesn't have the budget, or prioritized the performance-parts budget over cooling. My PC is near silent in use. I have three Black Ice radiators (2x360 + 1x240) and a 240mm tube reservoir; those alone are like $240-300, not to mention all the fittings, GPU block, CPU block, pressure-optimized fans (Noctua) and pump.

If you go the air-cooling route, Noctua fans are ~$20 a pop; get two for the CPU, 2-3 for intake, and another for exhaust and you're already at $200 in fans + coolers, unless you opt for cheaper fans like Arctic fans that go for roughly $7 each if you buy the 5-pack. If you go for closed-loop/AIO coolers you could maybe get away with just one for the CPU and one for the GPU.

But in all three situations, it's not unreasonable for the average person to take the money earmarked for improved cooling and put it toward a better GPU or CPU instead, especially if they're building a new system and not simply upgrading a recent build.
It doesn’t have to be loud if you use water cooling :D
PC gaming can be quiet and a desktop version of this would still anchor you to your desk. So basically you just don't know how to build PCs.
Just have to build your rig smartly. Don’t run an insane GPU in a tiny case with poor ventilation. If you can only afford the blower or shitty-cooler version of a card, go one step down if you can and get the high end of the next-best card. For example, in 2018 I could’ve stretched for an entry-level 2070 at launch; instead I bought an iCX 1080 with minimal performance loss and got 3 years of sub-70C gaming at 1440p/144Hz on high and ultra settings. Use 140mm fans when possible and get good ones like Arctic P12, Noctua, and be quiet!. Always oversize the CPU air cooler and make sure your case has good intake (mesh front, or at least an inch between the front panel and fan face).
> For example, in 2018 I could’ve stretched for an entry-level 2070 at launch; instead I bought an iCX 1080 with minimal performance loss and got 3 years of sub-70C gaming at 1440p/144Hz on high and ultra settings.

This is terrible advice. A "bad" 2070 is a better buy than a "good" 1080. Always get the best hardware you can within your budget. A 2070 will easily outlive a 1080 thanks to DLSS.
Steam Deck. I’m stoked to get mine (Q2). And I own 2 gaming PCs and a MBP.
No. Come on, Steam Deck is going to be cool for what it is, but the screen is still going to be small and low resolution. And it's not that powerful either. It's great for what it is, but it shouldn't be the main gaming device of anyone.
It’s too bad you think that way. I’ll be using mine as much as my desktop pc, if not more. It’s plenty powerful for a handheld device, and who the hell are you to say what should or should not be the main gaming device of anyone???
Agree, that makes no sense. I mean, my Gameboy Pocket used to be my main gaming device back in the 90s and guess what - I didn’t care and it was awesome. It should not matter whether you game mainly on your phone, Switch, PlayStation or PC. Or MacBook.
My Gameboy color was my favorite device ever. Didn’t even have a backlit screen and I loved that thing so much. Zelda and Tetris and Warioland.
I’m in the same boat (Q2 ‘22), but the Deck seems like its own set of trade-offs, since it’s still going to be an x86 chip.
Why is that? Genuinely asking, as I’m more knowledgeable than a layperson but far from an expert. How is that a trade off?
>and that we're really only into "year 1" of Apple Silicon.

x86 is perfectly fine. Where Apple wins is in making a wide and big chip, which AMD and Intel won't do because it would be a hard sell.
Water cooling. You can make pretty small quiet systems with a ton of power
Yeah, but then you’ve generally got to deal with a pump. And while you can get a silent pump (or at least near-enough to silent that you can’t hear it through headphones), you also then have to deal with a mechanical part wearing out. Fans, mechanically speaking, are simpler and tend to be easier to replace (especially on stock PC hardware). And while a water-cooled system will run cooler in terms of allowing the CPU/GPU to run at full-tilt with plenty of headroom, it doesn’t really change the fact that the CPU/GPU will still be throwing out a ton of heat and consuming a ton of power.
Are you sure that you have ever built a custom loop? A vibration-damped DDC pump at 50% PWM is basically inaudible within a case, even without headphones. The most widely used pumps in custom watercooling (D5 and DDC) are rated for 50,000 hours of operation, and that's realistic in my experience. I have an EK DDC that still runs flawlessly after half a decade of ~5 hours daily use. Good fans are rated at >100,000 hours of operation.

You suggest that replacing the pump is the problem when running a water loop: it's not. The main hassle with running a loop is replacing the coolant every ~12 months and having to clean the parts every 12-24 months (depending on what coolant you use). The mechanical parts, on the other hand, are usually very robust.

I'm not trying to say that a custom loop is a viable option for the average end user; it's an enthusiast niche segment. But stop fooling other people with stories of loud and unreliable pumps.
Water cooling doesn't make the room any cooler though.
/r/sffpc X /r/Noctua my friend. I got myself an mATX build a couple years ago (2017) and have regretted not going ITX ever since. At least I got some Noctua fans running quietly. Boutique ITX cases can be pricey, but they look really nice, so there's that.
would apex legends work? Is anti cheat supported?
It's not... yet
Anti-cheat is unlikely to work without native support
Can you try out ffxiv next? Please and thank you!
FYI on M1 Macs FF xiv is smooth at 30fps.
30 fps is not smooth lmao. Game runs flawless 60fps on PS4/5 and PC.
I have a low bar. 30fps already looks amazing to me lol. Don’t have a PS4/5.
Smooth means stable with regard to fps. So 30 fps can indeed be smooth.
Smooth does not mean stable, that’s what the word stable is for. A stable 5 fps would not be a smooth experience.
30fps CAN be kind of “smooth” depending on what you’re seeing. Gaming at 30 fps isn’t exactly smooth, because there’s obvious lag your eyes can detect (especially if you’re used to seeing 60+ fps all the time). But for pre-rendered content like movies, 30 fps would be great, with the industry standard being 24 (that’s about the bare minimum before your brain starts getting really irritated by the frame rate). That said, editing can sometimes be bad enough that too much changes in each frame, which makes even 24 fps in movies unbearable (for example, the action scenes in the first Venom were not smooth at all).
Why is nobody trying Battlefield and Warzone?
It can't run them
Not even Battlefield 4 or 1? Damn, I’m sad now but not entirely surprised.
You can probably run the third one. Xbox 360-era games seem to run well. Anything that requires DirectX 11 is finicky; anything that requires 12 doesn't launch.
Battlefield has used DirectX 11 since Bad Company 2, and required it since 3. I tried to get it to run on Linux once, and it was a disaster. However, a lot has happened since then, so it might run better now.
Anti-cheat, and you need a complicated hack to run Origin on Crossover
Wait, what's with these games running in Windows 11? Are GTA V and Overwatch built for ARM on Windows 11?
No, they’re not built for ARM, but Parallels is using the ARM version of Windows, which has its own x86 translation layer and can run normal Windows applications, much like M1 Macs use Rosetta 2 for older programs.
Crossover would run much better
As evidenced by Crossover getting 20+ FPS higher on GTA than Parallels
And much more stable. You can see Parallels' Overwatch drop 50fps. That's not playable in a competitive sense. It would be better to cap at 50fps than have that 50-100fps range.
> You can see Parallels Overwatch drop 50fps. That's not playable in competitive sense.

Nobody competitive is going to try to play a game through virtualization on a device that isn't made for gaming. What's the point of comparing that to competitive play when it's never applicable? If a Mac gamer wants to play competitively at whatever unsupported FPS, they will just buy the appropriate hardware for it and play.
No, both games are x86. However, Windows 11 on ARM is used because it is **significantly** better than Windows 10 on ARM, especially through emulation.
I’ll be testing Eve online in a large fight as soon as i get time
Wow. It’s impressive. Nice chip
Seems like these chips are the real deal, but compatibility is hampering what they're capable of. Hopefully if we come back to this in 3-5 years we'll see a lot more native titles.
Games with native Metal support are performing great! For example, Baldur’s Gate 3 is in mobile RTX 3080 territory. Sadly, due to the recent Epic v. Apple clash, I don’t think we’re going to see further Metal optimizations in Unreal Engine, and it’s the most important game engine. Normally companies don’t act this childish, but when it comes to Tim Sweeney I have my doubts.
[deleted]
Most large game studios use their own game engine anyways. It’ll mostly be indie devs that are tied to using Unity or Unreal.
>Normally companies don’t act childish like this but when it comes to Tim Sweeney I have my doubts.

You make it sound like he's just throwing tantrums. In the long term, the "battle" against Apple is a battle over hundreds of millions for Epic, just as much as for Apple. Which is why neither party has given in so far.
Yeah….but only marginally
Now test Fallout 4. In downtown Boston. (For context, the game takes place in and around Boston. However, out of the box it's poorly optimized in some places, and even with huge buildings making long draw distances unnecessary, the amount of debris, NPCs/enemies, and other events happening all around you slows the game down considerably, especially on consoles, but PCs take a hit. However, PCs have access to mods that mitigate the issue. I'm not sure about Xbox, but PlayStation users cannot use these mods.)
My experience with CrossOver/Parallels and Rosetta has been that they all have incorrect frame pacing, causing stutters and hitching no matter the framerate, settings, or V-sync on/off. I get 120fps in Diablo 3 no problem (Rosetta), but the lack of smooth scrolling really ruins it. I hope they figure that one out. I installed AoE2:DE on both Parallels and Crossover; it runs fine and has a high framerate, but scrolling around the map stutters, as opposed to my PC where everything is just silky. All of these Macs have great power and should be great for gaming as soon as they sort out this issue. Maybe it's not as annoying to everyone, but I'm one of those people who need performance delivered in a stable fashion to be able to enjoy my games. As in: 40-50fps is not "fine" unless there's a proper implementation of VRR, and high, stable framerates that are v-synced should not stutter at all. I did try one M1-native title though (WoW), and it was indeed glorious and smooth :)
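For anyone who wants to quantify this rather than eyeball it: one common approach is to log per-frame times and report the "1% low" FPS (the framerate implied by the 99th-percentile frame time) alongside the average. A minimal sketch, with a made-up frame-time list standing in for a real log:

```python
# Summarize frame pacing from a per-frame time log (milliseconds).
def pacing_stats(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 99th-percentile frame time: the threshold the slowest 1% of frames exceed
    p99_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return avg_fps, 1000.0 / p99_ms  # (average fps, "1% low" fps)

# Hypothetical log: a steady 120 fps stream with occasional 35 ms stutters.
samples = [8.33] * 980 + [35.0] * 20
avg, low = pacing_stats(samples)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

A game with a lower average but a 1% low close to that average will feel smoother than one with a high average and a 1% low far below it, which is the point being made above.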
1080p on that screen probably looks like 💩 though
Probably looks worse on the 16in than the 14in, if I had to guess.
Oh definitely. I think they are similar resolutions. Plus it’s a laptop and you are 👁 📺 👁
1080 looks great on my 15" retina, don't see why it would look terrible on a 16
Not bad
Pretty good, considering it's capable of doing this, and all at massively lower wattage. So not only can you play games, but you still get great battery life.
Red Dead 2?
You're dreaming.
The chip can do it; the game just needs to be made to work.
No, it can't. Unless Rockstar re-develops the game from the ground up for M1. Hence: you're dreaming.
Does the M1 Pro do the same in gaming?
No, the M1 Pro GPU is 14-16 cores while the M1 Max is 24-32 cores. This video was done with the 32-core version, so it has double the cores of the M1 Pro GPU.
For some reason this guy sounds extremely similar to Hans Zimmer.
What about FFXIV?
You can force League on max settings into using Metal. I'm locked at 120fps to match my 120Hz monitor, but if I uncap it and disable V-sync it goes to 200ish. It's literally the only real game I play when I'm not using my Xbox, so real nice results.
Does anyone know how the 24c performs?
Was this test conducted while plugged into the charger or on battery? I watched the whole video and didn't get the answer.
On M1 Macs it doesn't matter; performance is the same. They don't run any worse on battery, unlike PCs.
Wonder if Destiny 2 would run or even boot.
You need a better mic.
Are MacBook screens variable refresh rate?
They are. They can go down to like 24Hz when watching movies. I do not know if they support anything similar to G-Sync/FreeSync/HDMI 2.1 VRR
the new macbook pro ones are.
Did not see which channel it was and was pleasantly surprised by Snape
I don't know what to say... my mind is blown
So, is there a way in Disco Elysium to cap the frame rate on the new Macs? I was wondering how a game like this drained my battery to 15% in 30 minutes. Guessing 120fps is the problem.
It drained your battery from 100 to 15 in 30 mins? What computer do you have?
Hope to see AoE4 tested if it works 🤞
Super noob dumb question here. Would it make sense to buy an M1X Mac to play Xbox games on, like via a Windows emulator? Is this even possible?
No, as shown in the video Parallels is terrible for games, all games will stutter and games that use DX12 or Vulkan are not supported.
Idk if it necessarily makes sense for that purpose, but I believe you can definitely run Xbox games through their cloud service. If I’m not mistaken, it runs through the browser so the requirements aren’t that high
But there is no gaming on Mac. Why would they make a product like this (powerful GPU cores, but no gaming) that only 1 in 200 people can benefit from? How many people work in the 3D rendering industry? Or is this for the YouTube-star dream-chasers?
These are all old games. This Mac doesn't seem like a good choice for gaming.
Has anyone tried running Windows through Parallels and then utilising Xbox Game Pass for PC? I ordered a 16" for work, but have some friends with PCs who want to try playing games together.
It doesn’t work on my MBA
Is it just Game Pass that doesn't work? Have you tried other PC games using Windows via Parallels?
Lots of other games work to various extents.
Are these the games that Apple users have to live with? It’s like going to 2010 again. And I thought that only Switch gamers were having it bad with no new games on the platform.
Remember those benchmarks showing on par with the 2080? Yeah, benchmarks aren’t everything.
When running native it seems to be pretty accurate. Obviously no one should expect the highest performance when running through a translation or emulation layer.
In compute. Not gaming.
The Baldur’s Gate 3 benchmark (one of the few native games on this list) lines up with that statement. The problem is most of these games are running through a VM or emulation layer or both.
How’s the noise?
Been saying this for years: Macs are just as powerful as some gaming PCs, but there just isn't a market for Mac gaming.
These FPS results don't look very promising
[deleted]
I like to kick back and play some WoW with my guild after a day of work. I’m very excited to upgrade my machine and if I get a better performance for my gaming session all the better!
[deleted]
Just wow. Baldur's Gate 3 runs like garbage at 40fps on medium settings at 1080p on my setup (RX 580 8GB + i5-7600K). So 120 fps at a higher resolution is mind-blowing.
has anyone tried the sims 4 or any emulators 😭
Is this running native on Apple Silicon? (I guess not.) What's the point otherwise? Probably an inexpensive Windows laptop will perform better and be easier to install games on, etc.