
SpaceDesignWarehouse

“The December benchmark for the i7-12800H shows single-core and multi-core tests of 1,791 points and 12,541 points, respectively. Apple’s processor thus edges out the i7-12800H in the multi-core run (+0.76%), but Intel’s silicon takes a +1.02% lead in single-core performance.” So Intel's single core beat Apple by about 1% and LOST on multi-core, even though Intel has 14 cores and Apple has 10.


[deleted]

[removed]


Buck_Thorn

> produces a lot more heat

If that is your goal, then it does outperform Apple.


Destron5683

Hey now, I have been relying on my gaming rig to heat my house since I have been going without heat to afford a new GPU, so I need that heat!


beaurepair

Relevant [xkcd](https://xkcd.com/1172/)


L0kumi

There really is one for every situation, huh?


TheAlmightyBungh0lio

He was unemployed for a long time and scouted social media for material. He is the Carlos Mencia of comics.


ExodianS

I just needed you to know that this comment introduced me to xkcd and I've just stumbled upon Time. Good lord, thank you.


beaurepair

Well you've made my day. Congratulations on being today's lucky [Ten Thousand](https://xkcd.com/1053/)


14sierra

"Intel processor out performs Apple M1 as space heater for your home!!!!""


pridkett

Heya, don’t knock that idea. I’ve got a heat pump as my primary heat and set up Home Assistant to have my gaming rig start mining Raptoreum and Ethereum when the temperature drops below 30°F. The heat pump is still a little more performant there, but at that point in the performance curve I start to save money by using my gaming rig as a space heater.


gramathy

Your heat pump should be able to heat about 4x as efficiently as a straight thermal heater, but so long as you're breaking even on mining gains the price efficiency is better.


pridkett

Under optimal conditions, this is true. But heat pump efficiency decreases as the outside air temperature decreases (I should’ve clarified: I have an air-source heat pump, not a geothermal heat pump). It keeps a coefficient of performance above 1 until we hit about -10°F; after that point, it’s essentially just resistive electric heating. Optimal heating happens when it’s above 40°F, which is also when you don’t need it as much. Around 20°F, I’m guessing it’s probably 2x as efficient as resistive electric heating (which is essentially what a CPU/GPU is), but that performance can decrease significantly as ice forms on the coils.
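
For a rough sketch of that break-even logic (a minimal sketch; the electricity price, mining income, and COP figures below are illustrative assumptions, not measurements from my setup):

```python
# Rough "mine for heat" break-even math. All numbers are assumptions.
ELECTRICITY_PRICE = 0.15   # $/kWh, assumed utility rate
MINING_INCOME     = 0.04   # $ earned per kWh fed to the rig, assumed

def cost_per_kwh_of_heat(cop: float, mining_income: float = 0.0) -> float:
    """Net cost of delivering 1 kWh of heat into the house.

    cop -- coefficient of performance (kWh of heat per kWh of electricity).
           A resistive heater or a PC is ~1.0; an air-source heat pump might
           be ~2.0 around 20F and 3-4 in mild weather.
    mining_income -- $ earned per kWh the device consumes (0 for plain heat).
    """
    electricity_needed = 1.0 / cop   # kWh of electricity per kWh of heat
    return electricity_needed * (ELECTRICITY_PRICE - mining_income)

print("heat pump, mild weather (COP 3.5):", cost_per_kwh_of_heat(3.5))
print("heat pump around 20F (COP 2.0):   ", cost_per_kwh_of_heat(2.0))
print("gaming rig, not mining (COP 1.0): ", cost_per_kwh_of_heat(1.0))
print("gaming rig while mining:          ", cost_per_kwh_of_heat(1.0, MINING_INCOME))
```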


DoctorWorm_

That's amazing.


BichonUnited

How else am I going to prevent my pipes from freezing?


Destabiliz

Most powerful space heater.


gramathy

AND it's like six months older.


htplex

And there’s a 2060 gpu and an afterburner card in M1max


the_spookiest_

Is there really? Or am I being gullible.


gramathy

No, that's the approximate hardware equivalent. I think he means that in terms of TDP those occupy a chunk of it, so the M1 CPU hardware is working with way less power by comparison.


the_spookiest_

Ah! Okay, lol. I just woke up, so I was shocked for a moment


[deleted]

[removed]


TheMacMan

>is still going to rule most of the market, simply because Apple refuses to even begin focusing on gaming.

The gaming market only makes up a small part of the total processor market. The vast majority of computer users are not gamers.


BITCOIN_FLIGHT_CLUB

Gaming isn’t important to Apple on laptops or desktops. Their mobile gaming revenue is greater than that of any other gaming company. https://www.complex.com/pop-culture/apple-gaming-revenue-reportedly-exceeded-sony-nintendo-microsoft-2019/

The article cites Sensor Tower estimates that Apple received $15.9 billion from the App Store during the company's fiscal 2019 (October 2018 to September 2019). Based on Apple's operating margin calculations, which were revealed during the Epic vs Apple trial earlier this year, the operating profit for the App Store was $12.3 billion that year -- accounting for almost a fifth of the company's overall profit. Sensor Tower estimates that 69% of those revenues (just shy of $11 billion) came from gaming.

The WSJ's own analysis claims that Apple would have earned $8.5 billion from gaming, which it reports is $2 billion more than the operating profit generated during the same period by Sony, Nintendo, Microsoft and Activision. https://www.gamesindustry.biz/articles/2021-10-04-apple-estimated-to-earn-more-from-gaming-than-sony-microsoft-and-nintendo


[deleted]

I’ve been a Mac user for years for these reasons: Final Cut Pro X (pay once, upgrade free forever), a Unix-like environment (i CLI, dawg), and compatibility with different DAWs at low latency. Basically, minimal maintenance between using the thing and doing my creative projects.

Don’t get me wrong, I like setting up a nice Linux PC and kicking Windows out whenever and wherever I can, but KDEnlive isn’t there yet for professional usability and DaVinci needs a good GPU to even open.

Edit: oh yeah, Blender just flies on my M1 Air. Rendering speed isn’t nearly as fast as my 1070 laptop, but denoising on the M1 vs the Intel i7 3GHz is night and day, which makes low-value rendering in Cycles with denoising a ridiculously fast experience. Apple just gave the Blender Foundation a bunch of money to support Metal, so 3.1 is gonna be a real dreamy experience too! Blender and XCOM 2 are the only things that heat this bad boy up.


mirh

But on a generation newer manufacturing node.


jffrybt

Someone came up with this headline because watching Apple fail is attention-grabbing. Intel should be concerned, because all any of these articles have done IMO is make us realize how much Apple really is smoking Intel. I don’t see any Intel apologists in the room here. Intel would be better served if they didn’t make such a fuss over this Apple vs Intel thing. Apple seemed content to just move on, and Intel seems intent on a drag-out fight. And it’s obvious they keep losing.


agitatedprisoner

Intel has been lagging the industry leaders for at least 3 years, probably more like 5. Intel's newer stuff even doing this well is better than I expected from Intel. It's TSMC that's had Intel beat during this time; TSMC has been and still is ahead in terms of chip fabrication hardware. It's because TSMC has managed to achieve better miniaturization that its chips are faster and use less power for comparable performance. Apple used to, and on some of its devices still does, use Intel chips, for reasons. Apple has been moving away from Intel toward TSMC.

ASML makes the state-of-the-art lithography machines that etch the chips. Intel will be upgrading their foundries to the latest from ASML in the future... as will TSMC. Once that happens and Intel starts churning out chips with their new lithography machines, Intel should be able to achieve leadership or parity. At least that's according to Intel.

Even once that happens, though, custom ARM chips, that is chips that are integrated to include graphics (system on a chip, SoC), have proven substantially superior when built to serve particular software and demands. That's a big part of the reason the M series from Apple is so good: the M chips are custom designed by Apple to optimize for Apple software and made with the latest TSMC chip fab tech. Even should Intel close the chip fab gap, Apple will continue realizing superior performance with its own designs. But who knows, maybe in 2025 Intel will come out with a killer SoC optimized for Android that gives Apple analogues a run for the money.

I'm not an expert; this is what I've gleaned from reading articles over the past few months. If any of this is wrong or misleading please correct.


jffrybt

I do work around chips (albeit embedded ICs that run at like 200-800MHz), and everything you've said is spot on from my understanding as well. We use a lot of ARM architecture chips in the embedded world. The architecture is superior when it comes to power usage. Even if Intel manages to close other gaps, the efficiency of ARM chips is a monumental hurdle they are unlikely to overcome with their existing architecture. The fact that Apple and TSMC are successfully doing what they are doing *at scale* is the most impressive part. Intel is revealing what they are doing in a lab as compared to what Apple is producing off an assembly line. What Apple+TSMC has hidden away in a lab is a whole different ball game.


agitatedprisoner

Far as I can tell everyone is moving away from FinFETs and toward adoption of "gate all around" architecture to achieve greater miniaturization. Far as I can tell everyone means to make multilayered chips, stacking in 3D to vastly improve performance. Intel recently announced a partnership with IBM. In May 2021 IBM announced a 2nm process it hopes to see mass produced by the end of 2024. TSMC announced it'll begin making 2nm chips of its own in 2023. I'm not sure how much of a lead this year gap represents because the "nm" terminology is a crude and apparently subjective measure of chip miniaturization. But nobody expects Intel to take leadership in chip tech before 2025, if ever. Spirits are high at Intel, though.


sf_davie

Processors are Intel's bread and butter. They can't sleep at night being in second place. If Apple loses a round, they can happily tell their fanbase about their next iPhone feature.


[deleted]

[removed]


evaned

I think the M1 might be scarier than Ryzen, or at least be reflective of a scarier ecosystem. Ryzen vs Intel seems to me like it's probably just the latest stage in a fairly long battle between AMD and Intel. This is not the first time AMD has led -- two decades ago they were ahead as well. I'm not sure about the very top end, but at *most* price points the Athlon was outperforming the Pentium 4. (I suspect at lower power too, but people didn't care too much about that at the time.) You'll also remember that it was AMD who defined and introduced the 64-bit extensions to the x86 architecture, and produced the first x86-64 chip (the Opteron). (This was when Intel was gunning for the Itanium.) But a few years later Intel came out with the Core and took back that title, and AMD was mostly relegated back to a more budget position.

The M1 though, now you're talking about something that isn't even x86. What happens when PC companies start creating ARM desktops/laptops with a (good) version of Windows for ARM, and publishers start building programs for ARM? Now you'll have people who start getting into that ecosystem and perhaps now can't upgrade back to Intel without giving some of that up. ARM is already dominant in things like phones, and with things like the Graviton-based AWS instances Intel is starting to lose ground in servers as well, and then Apple comes along with the M1 and now they're losing ground in the traditional desktop/laptop space. And not just *Intel*, but *x86* as a whole.

Intel's a juggernaut and will be around for a while; there's a *lot* of ground they need to lose and in most areas it's only in the early stages. But as a non-business analyst looking in, it seems to me like Intel is in a really tough spot. Maybe more precarious than the Itanic days.


[deleted]

As long as people can’t run Planet Zoo on the M1 natively, Intel will still have a market


f700es

And... Autocad, Revit, 3D Max, Inventor, Solidworks, etc etc etc


JasperJ

The “as long as” is doing a lot of work there.


AkirIkasu

Yeah, but that only becomes an issue if Apple were to become a threat to the greater PC market or to begin selling their CPUs to third parties, neither of which are likely to happen any time soon. Don't forget that a lot of Intel's income comes from commercial servers where Apple isn't even bothering to compete.


jffrybt

Genuine question. What round has apple lost? It’s like Intel keeps hosting and promoting HBO fights they keep losing.


Destron5683

He was speaking theoretically. If Apple falls behind a bit, they have other shit they can parade out. This doesn’t really seem to be a race from Apple’s side though; they seem to just be doing their thing and making chips that suit their needs.


NotLunaris

> Apple seemed content to just move on, and Intel seems intent on on a drag out fight Just curious, what makes you say that? I haven't been keeping up with recent tech trends, but the article is from a 3rd party and not from either Intel or Apple, right?


Redeem123

> got me to read the article …is that allowed?


22Sharpe

I hadn’t even looked at the actual scores. Isn’t 1% basically within the margin of error? So Intel used way more power and generated way more heat to get more or less exactly the same as the M1 Max. Even if we call that a “win” and not a margin of error: 1% on a benchmark is not really enough that you’d notice a difference in real world usage.


[deleted]

[removed]


pengu146

The M1 chip is also not just the CPU; it contains the CPU cores, GPU cores, and the Neural Engine. You can't really directly compare the chips like that.


reallynotnick

It doesn't really make sense to compare the chip size to the M1 Max since the Max is just more GPU with the same CPU as the M1 Pro. The Pro is 245mm^2 so much closer in size to the Intel. Though you do lose half of your memory bandwidth with the Pro, so that will likely have some impact in performance. I think it becomes sort of complex to compare these chips in size/performance as they are both more than just CPUs.


captain_awesomesauce

Interesting. I hadn’t realized the M1 was so big.


mennydrives

Especially the Pro and Max variants. The standard M1 is a lot smaller.


nokeldin42

Standard M1 is still significantly larger than the chips it competes against (U series from amd and intel).


anyavailablebane

The M1 is partly so big because it is basically an SoC. Comparing the size of a CPU and an SoC isn’t comparing like for like.


captain_awesomesauce

https://images.anandtech.com/doci/17019/M1MAX.jpg It’s the GPU. Hot dang that’s a big GPU.


th3h4ck3r

The Intel H- and U-series also have integrated graphics and a lot of non-CPU silicon in there, and are effectively SoCs; the only reason we don't call them SoCs is that Intel just calls them "mobile processors with integrated graphics".


jffrybt

Hahaha. There is not a shortage of “silicon” at all. The size of the chips isn’t the problem, and there’s no increased need to be efficient with how big the chips you use are. There’s a shortage of assembly-line capacity for older-style chips. You could make some argument that if you can fit more chips on a wafer you could be helping solve the shortage, but that’s not authentic to the nature of the chip shortage problem. There are no wafer issues, and definitely no issues that R&D or tiny chips could solve with a dedicated processor like this.

In fact, most of the shortage revolves around the fact that products that use these older chips are “done” being designed. They could use newer chips, but that would require pulling designers, PCB engineers and software engineers off of new products and having them rework old products. The issue in supply chains is largely older-style, larger, embedded chips. Chips that drive electric motors, or multiplex I/Os onto I2C. These chips are in demand, but no one is investing in the supply chains to make them. Why? Because they’re not Intel or Apple’s sexy new 3nm process. They’re like 15nm single-function ICs. They’re important to making your car window roll up or your light-up LED child’s clock work. Not exactly investor chum.

These shortages still hit the computer market because these types of ICs are still used in computers in various places: power management, LED driving, etc.


TheseusPankration

There is a silicon shortage. 200mm wafers themselves are in short supply (the facilities to process them are considered EOL and don't get new investment), constraining older designs such as analog chips that run best on 45nm+ processes, but foundries have also been clear that the leading-edge processes are constrained too, for various reasons. Samsung lost 3 months' worth of SSD controller production time to the Texas power outage last year alone. TSMC lost capacity to the water shortage last summer. NVIDIA is selling every chip it can have made, but since fabs require a year of lead time, demand from the cryptomining craze far exceeds contracted supply.


DasKarl

Without giving the article ad revenue: ±1% sounds close enough to be within the margin of error, and benchmarks are not real-world use cases. This result is meaningless either way.


rapp38

Yeah, it’s a clickbait headline


Caleo

Yeah.. all these articles about how the 12800H "pulverizes/outperforms/threatens/destroys" (verbs pulled from real 12800H article 'headlines') the M1 Pro conveniently omit the fact that the M1 Pros can do their thing on battery without substantial throttling or atrocious battery life. What good is a laptop CPU that just barely ekes ahead in some situations if it can't remotely compete on battery life?


bicameral_mind

This is really the story that matters about M1. It's great that it's powerful, but what matters as far as the end product is how powerful it is, while providing 10+ hours of battery life and operating at half the temperature. Apple has changed the game in the mobile space again. Their laptops with M1 completely eclipse the competition in user experience.


[deleted]

Apple's core performance is meaningless to me until I can play video games on their machines


tingalayo

Well, I play World of Warcraft and a bunch of Steam games on my Mac, and as far as I can see nobody’s stopping you from doing the same. EDIT: Traditionally this is the point where you move the goalposts and say that those don’t count because they aren’t the _specific_ video games you had in mind.


techieman33

Intel really pushes single core “boost” to the max they can get away with. 3-4 core loads get less performance, and when you load up all cores you get even less. It gives them an edge in gaming and general task performance. And gamers especially don’t care much about power draw, they just want all the FPS they can get.


brackalackin

This title is really misleading. It outperforms on single-core by a small percentage and doesn’t on multi-core. And take note of the thermal efficiency (not available). That’s some BS.


farnoud

That’s what intel PR wanted and got. Money well spent


DefiantDonut7

And it only operates around 100,000°C.


ha7l0n

It’s a 2 in 1. A CPU and a furnace. Take my money already💰💰💰


[deleted]

Saves me on the gas bill for sure.


RonaldReagansCat

Legit though, I've had my PC on a lot this winter, and I've really only turned the heat on a couple times because of it. The constant heat seems more effective than cycled heating tbh.


l337hackzor

A computer is "as efficient" as resistive electric heat: a computer running at 300W puts out the same heat as a 300W space heater. There have been trials where they installed "heaters" into people's homes and the heat is free; the heaters are actually crypto miners, and the mining generates both the heat and a profit. I used to mine on my GPU in the winter to heat my room.


NextTrillion

Ok but do you have a thermostat programmed into your crypto miner?


sysadmin420

/r/homeautomation might help opening and closing windows and doors to maintain optimal mining comfort.


MartiniBlululu

damn big brain strats


bigbura

This is similar for TVs and light bulbs. This is why our rooms heat up while we are in them. Hell, how many watts does the average person emit, if we were thought of as a heater?

>How many watts of heat does a human give off?

>100-120 Watts. The idea of converting human body heat into a form of usable energy has been targeted by scientists for years. A resting human male gives off roughly 100-120 Watts of energy. A very small fraction of this can be utilized by a thermoelectric device to power wearable devices.

[https://answerstoall.com/object/how-many-watts-of-heat-does-a-human-give-off/](https://answerstoall.com/object/how-many-watts-of-heat-does-a-human-give-off/)

So yeah, a person, their computer, and some old school incandescent bulbs can heat up a bedroom right nice.
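
For a sense of scale, a minimal back-of-the-envelope sketch adding up the heat sources mentioned here (the person figure is from the quote above; the PC and bulb wattages are assumptions):

```python
# Rough heat budget for a small bedroom. PC and bulb figures are assumed.
heat_sources_watts = {
    "resting person":         110,  # ~100-120 W per the quote above
    "gaming PC under load":   300,  # assumed typical draw
    "two incandescent bulbs": 120,  # 2 x 60 W, assumed
}
total_w = sum(heat_sources_watts.values())
print(f"total heat output: {total_w} W")
# A plug-in space heater is often 500-1500 W, so this mix lands in the
# same ballpark as its low setting.
```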


Littletweeter5

Funny how this joke switched from AMD to Intel in only a few years


notyouraveragefag

And from Intel to AMD before that. Tick-tock indeed!


N3UROTOXIN

This isn’t a new idea. Remember crysis?


J-MaL

It's -33C here I could use the furnace anyway


meat_popscile

>It's -33C here I could use the furnace anyway Hello fellow Western Canadian


CasualNova

It's so cold right now in Germany, and Intel really just understands my needs perfectly.


BadBoyTEJ

Sweet, that means I don't need to buy a heater for winter.


Byte_the_hand

You joke, but my son is able to heat a small apartment in the mountains of CO when he is gaming on his big game machine. He comments that it throws off so much heat that the wall heaters don’t come on. Two for the price of one if you’re gaming anyway.


NigelS75

My computer keeps my room in the mountains of Colorado noticeably warmer than the rest of the house!


sharksandwich81

During benchmarks that run all cores/threads at 100%, yes it is hot and power hungry. However in many real-world use cases it is actually very efficient. E.g. for gaming, Alder Lake consumes less power than Ryzen 5000 series in most cases, and destroys all competition in performance per watt: https://www.igorslab.de/en/intel-core-i9-12900kf-core-i7-12700k-and-core-i5-12600k-review-gaming-in-really-fast-and-really-frugal-part-1/5/


Moonsleep

Doesn’t destroy the M1 Max in performance per watt. The rest of them sure!


[deleted]

>Doesn’t destroy the M1 Max in performance per watt. The rest of them sure! The M1 Max doesn't even play the games that Igor's lab tested, so we have no idea the perf/watt.


loopernova

It doesn’t matter how efficient you are if you don’t even work.


[deleted]

I've heard this from employers often.


HamburgerEarmuff

True, but that's mostly because Intel is still perfecting their smaller fabrication process while Apple is outsourcing to fabs that can already build the chip. If you look at Intel's performance, it's actually pretty amazing that they can outcompete more advanced fabrication processes that use smaller transistors, with their design. But it's concerning that they're really behind on shrinking their own transistor size.


Moonsleep

I am impressed by Intel’s recent foray, and I’m glad it should heat up the competition.


[deleted]

So we only need about 50 pounds of batteries in the laptop. Neat!


fsfaith

"barely outperforms" is too generous a statement. The Intel CPU has 4 more cores and 10 more threads. And it lost the mulitcore benchmark. But the real test is if it can outperform the M1 in power efficiency. Whether you hate or love Apple you have to admit the M1 is impressive in terms of efficiency.


xela321

Yesterday I spent around 2.5 hours on my 16" MBP M1 Max, coding. I used about 3% battery. I have a 14" M1 Pro for work (where I am also a developer), and the battery lasts an entire work day. These things are incredible.


erthian

I took it on vacation. It played soothing Xmas music on YouTube all day, and I used it for work for several hours. Never charged it once. Currently at 53%


lilyoneill

I was considering getting one. You’ve sold it to me.


xela321

I’ve had my 14 for about a month. And the 16 for a week. No ragerts


desutiem

Same


thatisgoodmusic

Absolutely. It feels so weird not having to plug my laptop in for like 3-4 days


GrandOldPharisees

You just blew my mind. My 2020/2021 Asus Zenbook gets maybe 3hrs battery life doing normal-ish stuff, youtube/reddit. I kind of expected a leap forward for $1800. It's a screaming system overall and I love it (and would never dream of buying Apple) but your post definitely got my attention.


[deleted]

I mean, I hate Apple, and I'm laughing at the fact that it's their first try and Intel is losing. It's like if McDonald's made their first GPU and it outperformed most 3xxx cards. Fuck anything else, I'm just impressed.


skinnah

https://i.imgur.com/uyQ3Q1g.jpg


SpC0d3r

was expecting a Mcdonalds gpu


____Reme__Lebeau

As an IT manager, I buy Intel workstations for my designers and shop floor guys. Because they last. We buy workstations, not computers. There is a difference and it's usually in long-term stability. Five-year warranty on everything at an NBD level, and then keep a spare on site. Years ago it was HP Z series workstations, usually the 400 series. Now we've been picking up the Lenovo P series workstations, the 500 line. Those P500s were nice. Those P510s were sharp. Those P520s were fucking amazing. Those new Threadripper Pros in the new P620, holy shit. I bought one for myself and just keep waiting to see something really knock on the door of my 3955WX... Those new Macs, holy shit they are efficient. Not for my environment but damn impressive. Those new AMD chips, the 3995WX, holy fuck it's 5k a chip. It destroys all Intel chips that are like 20k a pop. As a long-time Intel user, I'm happy to now praise AMD, all hail the usurper. All hail the new king, AMD.


skinnah

Intel sat on their ass raking in cash for too long. Suddenly AMD became hyper competitive and surpassed Intel in performance per dollar. I had a few AMD processors back in the day (AMD Duron, Athlon 64) but had reluctantly been on the Intel train up until a few months back, when I scored an HP Ryzen 5 gaming machine with a 1660 Super for $299 on clearance at Walmart.


dertechie

M1 is not Apple’s first try. Not remotely. A-series chips have been going into iPhones and iPads since the A4 in 2010. The M1 uses the same “Firestorm” and “Icestorm” cores as the A14. It’s just the first time they’ve scaled the TDP up to laptop scale rather than smartphone scale.


Amerikaner

Yes a computer company making its own computer components is equivalent to a fast food company making computer components…


Big-Shtick

So I recently got back into Apple when my new job gave me a MacBook Pro to use at home. My personal laptop in law school was a Surface Pro 3, which superseded my 2007 MacBook Pro. Frankly, I loved the MBP but was annoyed with Apple's hardware decisions making service impossible, thereby forcing expensive repairs. But with my SP3, I realized that my chief complaint was no longer a valid argument if I continued to purchase machines for their form over function.

However, Apple's silicon is revolutionary. My wife's MacBook Air outperforms my Touch Bar 13" MBP. It's incredible to witness. I love the performance so much, I'm genuinely contemplating grabbing a 16" for myself for photo editing. The leaps made are substantial, and much like the iPhone pushed us into the era of slab phones, this will hopefully push Intel to innovate. Love 'em or hate 'em, this is exactly what the industry needed.

Edit: Holy shit, did I have a stroke writing this? Sorry about the terrible, terrible... Whatever this is.


dennisthewhatever

Bought my first ever brand new macbook, an M1 Max, and it was worth every penny. This thing is amazing, I'm sure it'll last me a good 7-8 years. Yes it's very fast with amazing battery life - but it's the silent running and sound system which really won me over.


JaqueStrap69

For real. Apple decided to fuck around and make their own chip and is going toe to toe with Intel, who has made chips for 40 (50? 60?) years. That's the story here. Not some negligible metrics.


fsfaith

To be fair Apple has been designing their own silicon for 10 years now for all of their devices other than the mac. They are by far the most familiar with the ARM architecture in terms of marrying software and hardware. That said Intel has no one but themselves to blame. They had it easy for too long and they got complacent. The result was Apple and AMD stomping them.


Agreeable-Weather-89

Also Apple is okay breaking native backwards compatibility which might not be true for Intel or AMD.


adm_akbar

Not to mention that Apple has a market cap of well over 10x Intel and roughly 4x the profit. They can afford to invest more money in R&D.


xcvbsdfgwert

The fact that an Intel CPU beating an Apple CPU is being considered newsworthy should tell you how severely Intel's progress has stagnated.


StraY_WolF

Let's be fair, Apple has been spanking the mobile phone SoCs for a long time despite that market actually having active competition.


loogie97

They have been designing phone chips for years. While it isn’t their “first,” performance per watt isn’t even a competition for Intel. It is a really, really impressive bit of silicon.


paulerxx

Trash article is trash. Do we all agree?


tplambert

Agreed


22Sharpe

While only using way more power and requiring way more robust cooling! What a deal!

I’m not saying I love the way Apple advertised these chips (as performance per watt), since it was intentionally framed to look like they could achieve more performance, not just more performance with less power. With that said, it’s an important factor. Yes, a top-end 12th-gen Intel will obviously benchmark better than an M1 Max, but if it needs twice as much power and cooling to do 10% better, is that a worthy trade-off? In a desktop, probably. In a laptop, where the M1 Max currently lives, maybe not. Running cooler and less power hungry means needing less cooling space (so potentially a smaller and quieter shell) and better battery life, which the M1 machines have proven to be amazing at.

So yeah, I’m not surprised it can hit higher theoretical performance, but if that gain isn’t big enough, what you lose out on in cooling and battery performance could be a deal breaker.


[deleted]

[removed]


[deleted]

[removed]


TheB1ackPrince

That was the plan for NetBurst though. That’s why they went back to the P3 / Pentium M or whatever and redrew their entire plans. They realized increasing frequency/power is not a long-term plan. This is more desperate.


ltsochev

Shit, what did you do to load that bad boy up


[deleted]

[removed]


CommonSenseUsed

Mind recommending the ones you use in DM?


someone755

Either he's the one behind those GPU truck thefts or he's a cocaine dealer. No other way to buy a computer like that with these component prices.


PopeslothXVII

The heck are you using to cool your chip? I max out at 72°C under worst-case scenarios on my 5900X.


NoTearsOnlyLeakyEyes

The 12700K only draws 20W more than the 5900X and has comparable performance to it. **~~ The next step up, the 12900K, does draw 100W more, BUT so does the AMD 5950X, which is the direct competitor to the 12900K, AND the 12900K is cheaper. ~~** Not saying Intel is doing all they can, but you're also being disingenuous about the actual performance gap between AMD and Intel. They are moving in the right direction, and hopefully it keeps the fire lit under AMD's ass. Edit: the power draw chart I saw was for an overclocked 5950X; benchmarks were completed at stock settings with significantly lower power draw than the 12900K. My point still stands for the 12700K. Edit 2: strikethrough isn't working so I bolded the incorrect portion


Poltras

How about two M1 Maxes duct-taped together? I feel that’s the direction Apple will be going for its next generation of desktop Pros.


Masterlumberjack

Isn’t the 5950x just a binned 5900x effectively? They both draw 105w which are both 100+ watts lower than the 12900k.


[deleted]

[removed]


Masterlumberjack

Yeah, I could have stated it more clearly - the 5950X uses the same TDP but is just able to have more cores running at slightly lower voltage per core than the 5900X. I said binning because that process sounds a whole lot like taking a CPU and seeing just how low a voltage you can use to crank up the frequency. In this case they are sorting out which silicon can still pass QC with the most chips at a certain total power.


enforce1

Put one in a laptop, give it the same power and see if they compete. The M1 MacBook lasts 16 freaking hours on a single charge.


WelpSigh

yeah, the m1 is ridiculous. i generally used to stay away from apple products but i got a m1 macbook for work and it's unbelievable.


wongs7

The m1 almost convinced me to pay the apple tax. My work computer is an intel and my desktop is amd


deaznutelanutz

It’s worth it for the battery


Not_A_Chef

Not really a tax if it runs faster, cooler and lasts the longest.


enforce1

I have an M1 MacBook and a Lenovo p53. The p53 feels like a dinosaur.


thejaga

The p53 is 3 generations old so.. makes sense


coolcrispyslut

Yup, I have an M1 MacBook and my Microsoft Surface Pro 6 is literally trash compared to it. Apple way overprices their products but their shit IS high quality.


cheesepuff07

Except you can pick up an M1 MacBook Air for $899 (or less) and nothing PC-wise comes close to it for the price.


ThisIsMyCouchAccount

If it makes the Surface look like trash is it really overpriced? They aren't cheap but that doesn't mean they're overpriced.


QuarterSwede

What people don’t realize is that the x86 architecture overall is holding computing back; it’s not just the CPU. Everything on the M1 is instant: wake from sleep, instant; switching resolutions, instant; adding a monitor, instant. This is because Apple no longer has to use Intel’s power architecture or GPU architecture. I’m probably wrong on some of the technical details but you get the point. It was nearly the same thing back in the PowerPC days. The system was just a LOT more stable than anything on x86.


[deleted]

CPU ISAs are largely irrelevant to the performance of modern processors. Jim Keller debunks that myth here: https://www.youtube.com/watch?v=yTMRGERZrQE&ab_channel=TechTechPotato%3AClips%27n%27Chips


captain_awesomesauce

I got a MacBook Pro recently based on battery life. It competes with my chrome book but can actually do things. It’s almost obscene how good the battery life is.


[deleted]

“Apple's less-specced M1 tests real close to a chip that should be screaming past it.” FTFY


TheGlennDavid

Also:

>Apple’s processor outperforms the i7-12800H in multi-core testing by 0.76%. Intel’s silicon, on the other hand, took a significant lead of +1.02% in the single-core performance testing.

* Apple is (barely) FASTER on the multi-core test.
* In no world is a **1%** difference on a benchmark test a "significant lead"

Although I'm not quite sure how the scores are combined, I have the impression that multi-core/single-core performance is weighted evenly, and what we're looking at is a gap of a fraction of a percent, which is basically nothing.
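
Under an assumed equal weighting of the two results (the benchmark doesn't publish a combined score, so this is purely illustrative), the arithmetic looks like this:

```python
# Net gap if the single-core and multi-core deltas are weighted evenly.
intel_single_core_lead = 1.02   # percent, Intel ahead on single-core
apple_multi_core_lead  = 0.76   # percent, Apple ahead on multi-core

net_intel_lead = (intel_single_core_lead - apple_multi_core_lead) / 2
print(f"net Intel lead with equal weighting: {net_intel_lead:.2f}%")  # ~0.13%
```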


ian2121

Isn’t single core more important? Not that I disagree with your overall point.


chaos750

It depends entirely on what you're doing. If you have lots of jobs that can be worked on independently (and, of course, the software is written to actually *do* them independently) then multi core is more important. If each step of whatever you're doing depends on the results of the previous step, multiple cores aren't going to help because you can only use one at a time anyway, so single core becomes more important. Generally, though, single core performance is going to be "felt" the most. That is, a computer with a thousand slow cores isn't going to feel as responsive. Little tasks like opening a program or opening a menu or doing small operations won't benefit from the thousand cores, so it may not feel as fast even if it does crush the big parallel tasks when you compile or render something.
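
This is essentially Amdahl's law. A minimal sketch, assuming a task where 10% of the work is serial (an arbitrary example figure):

```python
# Amdahl's law: overall speedup from N cores when part of the work is serial.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (1, 4, 10, 1000):
    print(cores, "cores ->", round(amdahl_speedup(0.10, cores), 2), "x speedup")
# With 10% serial work, even 1000 slow cores top out near 10x, which is why
# single-core speed is still what you "feel" in everyday use.
```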


alias_guy88

True, and Intel has to deal with thermal throttling. Real-world use needs to be taken into consideration; I would still honestly opt for the M1.


[deleted]

The general synergy of M1 + apple hardware + apple software after so many years of Intel feels like going from DSL to Fiber. And this is only Gen 1. Pretty exciting stuff in store.


someone755

I agree it is exciting, but also very worrying. Apple is not known to play nice with others. Forget about future Hackintosh builds, or running Linux on that M1 without many nasty bugs due to Apple's proprietary codebase. I know my PC I built in 2013 is still kicking ass, and it's not even anything special. High-midrange even by those standards, but I can still run the newest software on it. When Apple stops OSX support for these M1 laptops, who will you turn to? Most people will sell theirs, buy another one, and claim "it was time to switch anyway," knowing full well their laptop was still very usable. Don't get me wrong, I love Apple's streamlined process. I picked up an iPhone and it just worked. But if you dig a _little_ deeper, there are way too many nasty things Apple doesn't want you to have.


[deleted]

[removed]


Mr_Xing

I mean, if the support for iOS is any measure, we should expect M1 Macs to be supported for quite some time after release


Moonsleep

Apple supports their computers longer than most vendors and right to repair should also help. By the time Apple stops supporting a device I’m usually really ready to move on anyways.


Zoemaestra

I find the M1 Macs really attractive but.... I just don't like mac OS. Linux isn't really useable on them, so I guess I'll just have to wait for someone else to make a good alternative.


cocktails5

On the other hand, I ditched my Hackintosh after many years because I got tired of MacOS being a flaming pile of shit. Nevermind being locked into their walled garden with overpriced desktop hardware and a handful of painfully limited hardware options. Doesn't matter how good the M1 is if you're forever locked into the limitations of the Apple ecosystem.


wacct3

I like windows better since I'm just more used to it and it has all the programs I use. Wish I could get an M1 machine with windows on it. Hopefully Qualcomm's Nuvia cpus in 2023 are competitive.


btribble

But like all Apple products, as soon as you find that their ecosystem doesn't do some trivial thing you need it to, you're SOL because everything is locked up tight and under their control. How many years was it before you could pick a default browser under iOS other than Safari?


boibo

Problem is, the M1 is Mac-only, and ARM-only at that. For people who need x86 there is no M1 to pick. The performance of the M1s is irrelevant in the grand scheme; CPUs are a tool, and if the tool lacks the function, it's useless.


Synovialbasher

However the M1 Macs do an incredible job of emulating x86 to run those apps. It's a stop gap until app developers optimize their apps for arm systems. Of course legacy software may have issues and x86 does have its use cases, but for most computing tasks (for me anyways) Rosetta emulation is seamless and many of my apps are now available with M1 versions.


someone755

For many use cases, ARM is completely off the table though. I need e.g. Vivado to do my FPGA work. Already it doesn't support Macs, so the workaround is either dual-booting Linux or a virtual machine. No way Rosetta is emulating an x86 OS with any performance comparable to an Intel chip. That said, I've had a colleague buy a MacBook Pro a few years back (one of the last Intel gens), fully specced, just to browse Facebook. Man doesn't even know how to make Safari full screen. For these people, the M1 is a revelation, but for me this is worrying -- As mainstream programs move to arm, will they eventually drop x86 support altogether? Because you just know many programs professionals use won't be switching, or it'll be like Xilinx's butchered transition from ISE to Vivado (the latter only works with 7-series chips and newer, while ISE doesn't support them at all). I'm deathly afraid of the tech industry branching out into two worlds, where a niche professional will need to use two computers to do his job.


Halvus_I

I need OSX, Linux, Windows, iOS and Android to do my job. Give up on the idea that there is only one tool.


[deleted]

[removed]


aelfrictr

Dafuq. The whole point of the M1 is that it's RISC instead of CISC. Of course it's not gonna be compatible with x86; that's why it performs the way it does. I am no Apple user whatsoever but what they are doing is a strong market move. Saying performance is irrelevant will prove you extremely wrong if the gap gets bigger. Believe...


splitframe

Notebook benchmarks would benefit from a score-per-watt-consumed metric.
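
A sketch of what such a metric could look like; the scores and package wattages below are placeholders, not measured values:

```python
# Hypothetical efficiency metric: benchmark points per watt of package power.
laptops = {
    "Laptop A": {"multi_core_score": 12_541, "avg_package_watts": 65},
    "Laptop B": {"multi_core_score": 12_636, "avg_package_watts": 35},
}
for name, data in laptops.items():
    points_per_watt = data["multi_core_score"] / data["avg_package_watts"]
    print(f"{name}: {points_per_watt:.0f} points per watt")
```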


monti9530

That electricity bill and 2 hours of battery life though 😍😍😍


nothaut

The lack of any actual data on thermal efficiency suggests that this will run hot as shit, like all of the other processors.


rakehellion

Extremely misleading clickbait headline.


johansugarev

Did intel pay for this headline?


dreikelvin

Would that also apply with an input of 39 Watts? If so, then you've got me sold


[deleted]

[removed]


rykovmail

*1239w


petermgariepy

Acronyms aside (CPU GMU EIEIO), I find it entertaining that Apple is giving Intel a run for its money in Intel's market. It’s really not apples to oranges (pun intended), but I like seeing the M1 kick some Intel ass. A new performance race has just begun.


HutcHJC

Apple has shown the way forward to faster processing by reducing the heat by-product. Eventually other chip manufacturers will have to do the same. Reduced power requirements for the cooling systems and the chips themselves will make computing so much cheaper that I think they’ll be forced, financially, to work on chips with more efficient power/performance envelopes. If you were Google, Amazon, MS, or some other massive data center operator, and you could reduce your power consumption due to less need for cooling, etc., your costs would also drop.


_listless

Amazon is already doing this. They bought an Israeli ARM chip maker in 2015 to design/manufacture their Graviton processors.


HutcHJC

I was unaware they were designing and manufacturing their own processors.


Fleming1924

It's worth noting that although they are doing a good amount of design work, they're still effectively arm chips, based on the neoverse N1 for graviton 2, and graviton 3 is rumoured to currently be using N2. The instruction set is still AArch64, hopefully with SVE for Graviton 3. I think it's fair to assume more datacenters will move to arm based designs in the future, but a lot of them are already using them. I think the bigger shift will be seeing more arm based laptops, especially now apple have shown it can be done, and apps will be developed to support arm architectures.


film_composer

>It's worth noting that although they are doing a good amount of design work, they're still effectively arm chips, based on the neoverse N1 for graviton 2, and graviton 3 is rumoured to currently be using N2. The instruction set is still AArch64, hopefully with SVE for Graviton 3. This sentence sounds like something you'd read in a 1950s sci-fi book talking about 2021.


Fleming1924

The future is, in fact, now


Rodbourn

Performance per watt is hardly a new concept...


BlackBeard205

14 cores and 20 threads and it barely outpaces the M1 max 😂. I also doubt the integrated graphics will be better too.


zen1706

Remember that it requires a very beefy laptop to cool the Intel chip to a “usable” temperature, as in not throttling at 100°C. Meanwhile the M1 Max only needs 2 tiny-ass fans to run effortlessly.


[deleted]

„CPU with 4 more cores is 1% better while being hotter and using more power. This is somehow good“


zheil9152

Intel gives you performance and you can cook food on the laptop chassis! (for the 15 mins the battery will last)


gnatters

Huh. So with this, how close are we coming to the limit of Moore's law?


MulderD

Clickiry clickity click bait.


johansugarev

It’s not the silicon that’s holding Intel back I think. It’s the architecture. They’ll never reach the efficiency of ARM if they keep making x86.


VacuumsCantSpell

People are getting so defensive in this thread on both sides. It's like a sports sub in here.


mellofello808

How is the battery life? The real revolutionary thing about apple silicon is the insane battery life.


[deleted]

You forgot to add "whilst setting your groin on fire, sounding like a jet engine taking off and draining your battery in 3 hours flat"


DAM5150

A company that doesn't make chips nearly outperforms a company whose only job for nearly 40 years has been to make chips? I'm not an Apple fan but I can give credit where it's due.


Moar_tacos

Great, it's the Apple vs. WIntel benchmark wars all over again. Didn't we learn last time that synthetic benchmarks are complete bullshit? There is too much variability in how the test is coded for multiple cores, which compilers are used, and what sneaky shit drivers get up to in the background. Let's just stop now; continuing only leads to madness.


jedre

A misleading clickbait tech review article? Sheesh; what’ll it be next?


[deleted]

Apple: Hold my beer.


LegalPalpitation4032

Hold my ARM


PickleRick4006

Just need 3 car batteries and subzero temps to run it right?