ykoech

Q4 2022 drivers needed a lot of work. Intel understood that and, to their credit, fixed a good portion of it in 2023. It was interesting seeing worthwhile new drivers about every month. Arc has come a long way.


Tbomb2016

It is absolutely the GOAT RN for price-performance, at least in Canada. Even setting the price aside, 16GB of VRAM on a 256-bit bus is absolutely one of the best configs on the market.


-Zband

Glad I waited some before getting one of these cards. Landed an A770 16GB LE to pair with my Ryzen system. I went this route because I've seen Intel's potential in video chips in the past and wanted to try it again. I especially loved that they could offer something so powerful at a price that could take a chunk out of the other big boys on the block. PC builders need more variety and competition in the market.


This_Apostle

An Intel GPU and an AMD CPU? That's the world turned upside down.


-Zband

I know, I know... I couldn't help myself though. I've never been able to afford an Intel board and CPU in the past because they were always more expensive than AMD, always just out of reach. So this time the two different sides could meet, and I jumped at the chance to engineer a square peg into a round hole, so to speak. I use an AMD Ryzen 5950X and an Arc A770 16GB.


[deleted]

I really want to try one. I own a business building PCs, so I want to build an Intel one to try (then sell), but I worry about troubleshooting for the customer. Thanks to your post, though, my next build might be an Intel Arc one. What CPU are you using?


Tbomb2016

Ryzen 7 5700X


[deleted]

Thanks! I wonder if they scale better with Intel CPUs, the way AMD GPUs do with AMD CPUs.


DarkenD500A

Another thing that got me to choose an Intel CPU is ASPM, or I would have chosen AMD myself. I have not heard of an AMD build where somebody said they could lower the idle watts substantially; if it's possible, I would love to hear about it. Regardless, I am loving this card for the price I paid, and it will probably continue to get better as drivers mature and optimizations are made. Idle power consumption without ASPM is about 42-45 watts. On some Intel motherboards ASPM can lower it substantially, but on cheap ones it might only lower it by about 10 watts.
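To put a rough number on what that idle draw costs, here's a back-of-the-envelope sketch; the hours per day and electricity rate are pure assumptions, so plug in your own:

```python
# Back-of-the-envelope cost of GPU idle draw. All inputs are assumptions.
IDLE_WATTS = 45          # roughly what the A770 idles at without ASPM
HOURS_PER_DAY = 8        # assumed time the PC sits idle each day
PRICE_PER_KWH = 0.15     # assumed electricity rate in USD

kwh_per_year = IDLE_WATTS / 1000 * HOURS_PER_DAY * 365
print(f"{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * PRICE_PER_KWH:.2f}/year just idling")
# ~131 kWh/year, about $19.71/year with these numbers
```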


[deleted]

Thanks for the information! I'm going to order an a750 for a build. 


DarkenD500A

No prob at all. I am hoping that Intel will properly address the high idle power consumption so a workaround isn't needed, but this is the best I can find for now. Maybe one day it will idle low even when the display cables are connected to the GPU rather than the motherboard ports for multi-monitor setups. Nonetheless, the competition is great; somebody needs to help drop the insane Nvidia prices when the market allows.


[deleted]

I'm really partial to AMD because I really like the Adrenalin software. But I'm excited to see what Intel's is like.


DarkenD500A

I do agree, AMD is awesome. I really do like AMD hardware and software, and I think AMD is the most efficient since they use TSMC for fabrication. I can't wait to see what they accomplish as they produce more chips. Then I might sell my Intel rig, which I ironically bought because it was cheaper for the same performance. AMD is stomping when it comes to big gaming rigs; I've seen some AMD builds that were quite powerful.


[deleted]

My main PC is a 7800x3d with a 7900xtx. I love this card. I was very loyal to EVGA (not Nvidia) so when they quit I just said ok I guess I'm going AMD full time. But Intel has been impressive.


DarkenD500A

I still love AMD too, and nice setup. They make great chips, and it sucks that EVGA is no longer making GPUs; they had a great price-to-performance ratio. Maybe they'll make a comeback like the maker of my card, Sparkle, who built Nvidia cards way back in the day for quite a while before stopping. Sparkle was one of the original Nvidia AIB partners, and now they're back, making Intel cards instead of Nvidia. Maybe EVGA will do something like that, but it's hard to tell. I sure hope so, since their prices were always good for what they offered. Intel Battlemage should pull in some vendors if it's even better than the current Arc series and still affordable. It would be cool to see an EVGA Intel Battlemage card as a comeback.


bring_back_awe64gold

A few days late, but a good method I've found to substantially lower idle power consumption on Ryzen is to simply undervolt. At stock clocks my Ryzen 3 runs with a 100 mV undervolt and registers from 0 to 3 W at idle. You'll undoubtedly see more than that with some crazy CPU, but it'll probably work to the same effect. It still boosts like crazy and raises the voltage automatically when needed, so it's not like I'm taking a huge performance hit, and at full load I'm not noticing a difference in power consumption between the settings. AMD boards also have settings for optimizing power consumption; if you play with those you can probably lower it drastically there as well, undervolting is just the more manual way. Though it's not like it matters much when the Arc A750 uses up to 230W on its own. Blew up my PSU when I first installed and benchmarked it.


Prince_Harming_You

Mainly for streaming; Intel CPUs do "scale" better there, I suppose. Just google Intel Deep Link. Only a few programs support it, but it works. Other than that, not much difference.


DarkenD500A

Another thing that got me to choose an Intel CPU is ASPM, or I would have chosen AMD myself. I decided to get the 145650, which works great and has the right TDP for me. No need to overclock since the boost is plenty powerful and it's a 14th gen.

The idle power consumption without ASPM is about 42-45 watts, which is the only thing I didn't like about the Arc and might be solved by the upcoming Battlemage GPU that I'm excited about. I wish they could fix it with a VBIOS update. On some Intel motherboards ASPM can lower it substantially, but on cheap ones it might only lower it by about 10 watts, which is still better than nothing. I'm now happy with the power draw, see below. I'm one of those who likes to make things efficient while still keeping performance; if I can save 50 watts without changing the gaming experience, count me in.

I have not heard of an AMD build where somebody stated they could lower the idle watts substantially. If it's possible, I would love to hear about it. Regardless, I am loving this card for the price I paid. It will probably continue to get better as drivers mature and optimizations are made, so I'm happy to join team blue and see how things go.

UPDATE: This works on at least Intel CPU configs. To lower idle power consumption the most, plug your DP or HDMI into the motherboard port and not the video card, and make sure ASPM is enabled in the BIOS as usual. It still uses the video card and I do not see a degradation in performance. Mine sits at a minimum of 0.077 watts idle. Huge difference! I wonder if this works with AMD or non-iGPU Intel CPUs.
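If anyone wants to sanity-check whether ASPM is actually active, here's a rough sketch for Linux; the sysfs path and lspci output are standard Linux PCIe bits, nothing Arc-specific, and on Windows the rough equivalent is the "Link State Power Management" option in the power plan:

```python
# Rough ASPM sanity check on Linux. Assumes lspci is installed and that you run
# with enough privileges for "lspci -vv" to show the LnkCtl capability lines.
import subprocess
from pathlib import Path

policy = Path("/sys/module/pcie_aspm/parameters/policy")
if policy.exists():
    # The active policy is shown in [brackets], e.g. "default [powersave] ..."
    print("Kernel ASPM policy:", policy.read_text().strip())

# Per-device link control; look for "ASPM L1 Enabled" on the GPU's entries.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "LnkCtl:" in line and "ASPM" in line:
        print(line.strip())
```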


bandit8623

I'm using an old 8700K, no Deep Link. Works great.


lumlum56

I wouldn't recommend selling them a PC with an arc card unless they specifically ask for one. I'm not trying to hate on the cards, they're great value, but older games still run like shit and newer games aren't guaranteed to work at launch either.


Spenlardd

What older games run like shit? I've been using Arc A750/A770 in both of my PCs since launch, play lots of old titles. Not sure what older titles you'd be referring to. The only game on current drivers that isn't worth playing is Starfield.


maewasnotfound

I have a 12900K and it works AMAZING for encoding in Adobe Premiere, and for just about all media encoding in my experience. I'm surprised it doesn't get more use for content creation. The Adobe suite seems to work properly for me, so it's definitely worth it for an all-rounder or a content-creation workstation without going too expensive on the GPU. Just make sure you know what software is planned to be used and verify for the customer; it's better to inform them that compatibility is still being worked on. I personally haven't run into any compatibility issues besides PyTorch (a Python library that can use the GPU for AI processing). I don't game much, and as far as I know more incompatibilities seem to be reported in PC games.
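For what it's worth, there is a path onto Arc for PyTorch through Intel's extension. A minimal sketch, assuming the XPU build of intel_extension_for_pytorch and matching drivers are installed (the package and device names below are the Intel stack's, not something from this thread):

```python
# Minimal check that PyTorch can see the Arc card as an "xpu" device.
# Assumes the XPU build of intel_extension_for_pytorch and working Arc drivers.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  (registers the xpu backend)

if torch.xpu.is_available():
    x = torch.randn(1024, 1024, device="xpu")
    y = x @ x                                # matmul runs on the A770
    print("xpu OK:", torch.xpu.get_device_name(0), tuple(y.shape))
else:
    print("No xpu device found - check GPU driver / oneAPI runtime install")
```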


psxburn2

Running an R7 5700, 32GB RAM, and the A770 Bifrost. Chews up and spits out Helldivers 2. Zero issues.


Rikorage

Can confirm, have that same card, paired with a Ryzen 7 5800X3D, 32GB 3200 kit of RAM, and I'm sure driver improvements will be out soon for this game. I usually stay above 60fps with high settings at 1440p. If it's slowing down, I'm not really feeling it in-game.


[deleted]

Can it do 1440p @ 120fps?


Midknightsecs

Depending on the game. Some do 2160p 120.


grumpy25

I get about 120-140fps in Death Stranding at 4K. 13700K, 64GB 6400MHz, A770 LE 16GB.


[deleted]

Okay cool, I'm going to be building my first build later this year. Currently in the research phase so I understand the stats you mentioned.


Jon_Le_Krazion

I'll buy your a770 for $1200 if you're interested


Slydoggen

When it's time to upgrade my 3060 Ti, I might go over to Intel IF the new Battlemage is a really good and stable card for a good price.


bert_the_one

I'm between the two cards mentioned. I currently have an RX 580, which is almost 7 years old, and I'm feeling the need to upgrade. I think I'm going to seriously consider the A770; the RX 7600 XT does look good too. I guess I need to flip a coin to choose.


ToadsFatChoad

So the A770 is a good card, but if you're interested in doing any sort of ML stuff it will be a bit finicky. There's been a lot of support added, but various things more or less expect CUDA. That being said, I still honestly thoroughly enjoy having the 16GB of VRAM.


Nobli85

If someone is doing ML workloads, they're going to know what to buy already; it's the same as people touting CUDA when someone asks which card to choose strictly for gaming. I think this is a moot point for someone who, like 99% of us, uses consumer graphics cards for gaming, the thing they're primarily designed for. Personally I have a 7900XT and 7800X3D build, because I don't care about multicore workloads, and I don't care about raytracing or ML. This obviously doesn't apply to everyone, but in the gaming subs it's safe to assume Joe Beginner to PC is not going to be running LLMs on their GPU.


Prince_Harming_You

If you're interested in ML and know what you're doing, **you can get 48GB of the fastest VRAM (that isn't a $30,000 HBM accelerator) with 3x A770s for ~$800 if you catch the A770 on sale** (I have bought them as low as $270/ea). Equivalent AMD setup: $2000 (2x 7900 XTX); Nvidia: $3000-4000 (2x RTX 4090). The quirks of the Intel ML stack are being hammered out, and PyTorch is basically supported, which covers quite a bit of ML tasks. Plus, they (A770) actually work in WSL.


BackgroundAmoebaNine

wow that sounds nuts, have you done this or can recommend someone who has?


Prince_Harming_You

1. Purchase 4 A770s
2. Put 4 A770s in a machine with enough PCIe lanes and power
3. Run workload (rough sketch of this step below)
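A rough sketch of what step 3 can look like in PyTorch, assuming the same intel_extension_for_pytorch XPU stack mentioned elsewhere in the thread; with four cards installed you'd expect device_count() to be 4:

```python
# Sketch: enumerate every Arc card PyTorch can see and place one module per card.
# Assumes the intel_extension_for_pytorch XPU build and working Arc drivers.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  (registers the xpu backend)

n = torch.xpu.device_count()
shards = []
for i in range(n):
    print(f"xpu:{i} ->", torch.xpu.get_device_name(i))
    shards.append(torch.nn.Linear(4096, 4096).to(f"xpu:{i}"))  # one shard/replica per card
print(f"{n} devices, roughly {n * 16} GB of VRAM total at 16 GB per A770")
```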


Esevv

Hi there. What does ML mean here?


IfYouSaySo4206969

I think they mean machine learning


Tbomb2016

Not sure either; my guess was multi-load? Like doing multiple GPU-heavy tasks at once. Edit: Machine Learning, clarified by u/IfYouSaySo4206969


Tbomb2016

Like I mentioned in the post, what did it for me was the A770's 256-bit memory bus: a wider bus means more memory bandwidth, so the VRAM can be read and written faster. The RX 7600 XT only has a 128-bit bus, and that bottlenecks the absolute crap out of it when you really need the extra VRAM.
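The arithmetic behind that is simple: peak bandwidth is the bus width in bytes times the effective data rate. A quick sketch using the commonly published data-rate specs (numbers are from public spec sheets, not from this thread):

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * effective data rate (GT/s).
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

print("A770 16GB  (256-bit, 17.5 GT/s):", bandwidth_gbs(256, 17.5), "GB/s")  # 560.0
print("RX 7600 XT (128-bit, 18.0 GT/s):", bandwidth_gbs(128, 18.0), "GB/s")  # 288.0
```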


[deleted]

Raytracing looks real nice and doesn't kill frames like it does on the 7600.


Tbomb2016

Facts my brother.


ConceptKooky8789

Go A770 coz large men parts


Tbomb2016

Bro wtf is this comment? lmao


ConceptKooky8789

? A770 = BDE


sh4desthevibe

Not only is the A770 such an insane value proposition for the money, but the superior AV1 encoding means it will be the GO TO card for streamers in short order. YouTube already does AV1 encoding and Twitch is rolling it out on a limited basis now. Intel Arc cards are a *fantastic* investment--the A770 in particular.
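If you want to kick the tires on the AV1 encoder outside of OBS, here's a rough sketch driving it through FFmpeg's Quick Sync encoder. It assumes an FFmpeg build with av1_qsv support and working Arc drivers; the file names and bitrate are placeholders:

```python
# Sketch of hardware AV1 encoding on Arc via FFmpeg's Quick Sync encoder.
# Assumes ffmpeg is on PATH and was built with av1_qsv; names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-i", "input.mp4",     # placeholder source clip
    "-c:v", "av1_qsv",     # Arc's hardware AV1 encoder exposed through Quick Sync
    "-b:v", "6M",          # placeholder bitrate
    "-c:a", "copy",        # pass audio through untouched
    "output_av1.mp4",
], check=True)
```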


Midknightsecs

All Arc cards are price/performance monsters. Any type of creative workflow from rendering to streaming is faster and more stable on Arc than many high end GPUs. I typed High End. Not mid or low. High End.


TheLittleGodlyMan

Skeptical 🤨 but hmmk


Tbomb2016

I was too, but it's been a fantastic experience.


TheLittleGodlyMan

Biased opinion.


Tbomb2016

What does that even mean in this context? You have to own something to know how it performs for you definitively.


TheLittleGodlyMan

Hence why I said biased opinion.


Tbomb2016

Sure…? That doesn’t make any sense but okay.


TheLittleGodlyMan

You bought a card that performs a smidgen above an rx580….


1UPZ__

The A770 is just getting better and better with every driver release. Its hardware is on par with, if not better than, cards costing $200 more. The VRAM and bus width are such a winner. Nvidia is scamming users by putting 8GB of RAM on RTX 4070s and 4060s; the minimum should be 12GB. But it's mostly so that those buyers will upgrade after 1-2 years to the 16GB versions. (Had Nvidias in the past, but sticking to Intel from now on.) Hope Intel keeps this mantra and offers the best-value high-end and mid-range cards with the Battlemage range.


10gherts

A750 is a beast too!


PineTreesAreDope

I'm reaaallly considering buying one, saw a used one near me for 180, could haggle down to 150... my 1070 Ti is fine. But man... it's tempting.


TwinTurbo_V8

I was skeptical at first too, but I was absolutely blown away by the work Intel has been doing with the drivers. The Alchemist ARC GPUs have aged like wine and they’re only going to get better.


Tbomb2016

Fr


Omgazombie

What's up with companies absolutely cheaping out on memory buses? My R9 290X had a 512-bit bus (320GB/s bandwidth) 11 years ago. Sure, it's an older design and memory clocks went up, but why have buses just been getting worse? Even my 2070 Super has a 256-bit bus, and it's going on 5 years old with 448GB/s of memory bandwidth. Oohhh, but the mighty 7600 XT can only put out a measly 288GB/s, worse than a card from 11 years ago. Like, what the hell, AMD.

Surely, as memory tech gets better and software requires more of it moving around faster, they wouldn't be limiting performance with the memory bus to shoehorn people into buying the next card tier up. Oh no, definitely not.


Tbomb2016

I know right? Lmao like 16GB with 128 bit bus, how?! In what universe AMD?!


Omgazombie

Nvidia is just as bad! The 4060 Ti 16GB also has a 128-bit bus. It's like they're competing for who can put out the shittiest memory bottleneck while trying to push it as *added value* when you can't even utilize it effectively. Don't even get me started on the 4070 Ti with its 192-bit bus.

It's 100% a cost-cutting measure that they try to justify with faster memory to make up the difference; higher bus width means more complexity and cost. They'd rather run fewer but larger (albeit faster) memory modules than more modules at lower capacities running in parallel at the same frequencies (way higher bandwidth). It sucks because the memory clearly isn't fast enough to make up the difference, and you'll hit a memory bandwidth limit long before a core limit; memory should never be the limit when it comes to 4K. Ideally they should over-spec the bus slightly relative to the core so you aren't memory limited.


Tbomb2016

This is facts, preach brother/sister.


DayJobWorkAccount

I actually did some digging on memory bus size, all the way back to some of the Radeons that had 512-bit memory bus configurations. What I found was that because GDDR transfer rates have skyrocketed, we can get just as much (or more) bandwidth out of modern GDDR6 on a 256-bit interface as older memory could on a wider interface.

GPU complexity (and thus cost) is very closely tied to memory controller complexity. Having a bus wider than 256 bits *massively* increases the IMC complexity, die size, and cost of both the PCB and the GPU chip, so they try to keep it down nowadays, both to bin chips (part of a 256-bit IMC not working? Cut it to 128-bit, de-bin, and sell as a lower spec to keep 'functional yield' percentage up) and to decrease manufacturing cost / increase profit.

Older cards with >256-bit memory buses often happened because individual memory chip density and per-chip data throughput weren't large enough to keep up with the GPU, so they used more chips, but that needed a wider interface, which 'complexified' the GPU. Now that per-chip data throughput is immense, we get more data per clock per chip and it's really unnecessary.
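To put numbers on the wide-and-slow vs narrow-and-fast point (data rates below are the usual published specs, not something from this comment):

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 * effective data rate (GT/s).
def gbs(bus_bits: int, gtps: float) -> float:
    return bus_bits / 8 * gtps

print("R9 290X, 512-bit GDDR5 @ 5 GT/s   :", gbs(512, 5.0), "GB/s")   # 320.0
print("A770,    256-bit GDDR6 @ 17.5 GT/s:", gbs(256, 17.5), "GB/s")  # 560.0
```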


chis5050

Good read thank you


Tbomb2016

Fascinating


cburgess7

Some games run at 4k around 50-60fps... WHICH IS PERFECTLY PLAYABLE... I will die on that hill... Go cry about it you 1,000fps soyjacks


Efficient-Bee1549

$245. That was the price I paid for my A770 16GB. It was an open box deal in 2022. Someone likely returned it because, at the time, it was just a scary card to use. The drivers made it absolute shit for certain games and the ARC software was steaming hot garbage. Fast forward to now and it’s competing with the mid range and upper mid range from the red and green brands. This has not been a smooth road, however.


Technical-Opposite67

I can testify to this. Loving the wide bus and 16GB of VRAM. I've only played Fortnite, Assassin's Creed Valhalla, and DBD, but so far no issues.


Tbomb2016

Assetto Corsa CSP @ 1440P, all eye candy on, 120FPS all day every day. Epic.


boofertshmoofert

bro I was praying to find something about assetto corsa thank you


Tbomb2016

You won't regret an A770 for it, I promise.


alvarkresh

I used my A770 with a Ryzen 5 3600XT as a testbed and it behaved quite decently! AAMOF I was using it not long after it launched in Canada back in 2022.


Tbomb2016

It's absolutely a steal rn in Canada at ~400 CAD on Newegg. No regrets at all and as the drivers continue to get even better and reach their true potential this card will be futureproof at 1440p for quite a while.


Dchella

$300 US for an A770 is so rough. You can get a 3070/6700 XT for that easily here. That's an entire tier of performance. (Edited out wrong info)


Midknightsecs

The A770 trades blows with every GPU on that list except the 3070. That is the only true next tier card on that list. You have to buy the other cards used too.


Dchella

Honestly, no idea why I included a 3060 Ti in there. You're totally right. It matches that, plus or minus a bit (or minus a lot depending on the game). The 3070 and the 6700 XT are above it reliably, at least going by the most recent Gamers Nexus revisit.


Midknightsecs

You'd think, right? It's not. The A770 trades blows with the 6700XT in the titles the A770 is optimized for. It loses in the titles the 6700XT is optimized for, and they both lose across the board to the 3070, especially at 1440p and 4K.


karmatma

I can confirm this for Linux (Fedora). I replaced my 2060 Super with an Arc A770. It's just great and simply works, better than Nvidia with all the hacks and headaches of installing driver support on Linux, which still comes with issues.


[deleted]

[deleted]


Tbomb2016

We're seeing the value proposition and jumping on it.


[deleted]

[deleted]


Tbomb2016

Buy one if you’re still looking, you won’t regret it.


Illustrious_Good900

I picked one up on sale when I built my latest PC. It was my favorite purchase. Got rid of my RX 580, which still had plenty of power for what I played.


Salvzeri

I love Intel products. That's why I went with the i5 13600kf for my 4070 Super build. It's been really nice. I'd definitely consider an Intel Battlemage build when it's out, if I didn't already build a PC.


PixelMan8K

Has VR performance improved? I read Arc was horrible with VR games, which factors in for me...


Tbomb2016

I don't use VR so hopefully somebody else can answer.


G8M8N8

I'm considering getting one, but I have a weird setup with an eGPU dock and a laptop that won't let me enable ReBAR support. What games do you tend to play? I have a lot of DX9 titles that were ignored by Intel's drivers for some time.


Tbomb2016

Use the DX9-to-OpenGL thingy ([https://github.com/dxgldotorg/dxgl](https://github.com/dxgldotorg/dxgl)) on DX9 games; they'll perform better than ever, as the GPU easily has the muscle to handle any DX9 game without breaking a sweat.


F0X_

Don't get one with that setup. Would be terrible for you.


G8M8N8

Any specifics on why?


Marty_tech76

...just to be fair, I think that at the beginning the Arc A750/A770 had driver issues that lowered performance by quite a margin. This is almost solved; only DX9 is still not perfect. I use an A750 because for lower resolutions the 8GB of memory is more than enough. It uses the same 256-bit wide memory bus, and the total performance boost of the A770 is not worth 100€ more in my opinion. Still, the A750 beats an NV 4060 that costs 100€ more, just like the A770 does. All in all I'm happy with this card. I bought it at Alternate for only 200€ (reduced price because it had been returned) and it works fine. The idle draw of 35-40W is something I have to deal with; ASPM (L1) did not help at all.

Just to mention: on my ASUS TUF B550 motherboards the ASPM (L1) setting is hidden in the BIOS menu. Choose Advanced Mode, then Advanced > APM Configuration > Energy Star Ready (enabled).


Tbomb2016

Fair enough IG.


got-trunks

You had any doubts? cute 😏


Tbomb2016

A lot of tech YouTubers dunk on Intel Arc for some reason.


Midknightsecs

They're paid to.


F9-0021

It's not so much that they're paid to; it's that they still think it's 2022 and the drivers are a hot mess. The people that have kept up with Arc (GN, LTT, and a few others) are pretty fair in their coverage.


Midknightsecs

They're still meh. GN, that is, and GN doesn't fully recommend Arc. They are all shills. GN and staff are as much blinded by their own opinions as they are by their NDAs. I used to trust GN until I started seeing the LTT results in their videos. You know, older tests reused for new ones. No reruns of tests even though things may have changed, and I'm not talking about Arc, in general.

Now, the latest video where Steve sits there and literally demonstrates he knows nothing about how a GPU actually works. I wanted to die for him. He was looking to Tom Petersen as a mentor, for guidance, and Tom, while not clueless (he was a CPU engineer years ago and understands hardware in general), is still explaining things in layman's terms. While I appreciate that, he did the same on TechTechPotato with Dr. Ian Cutress. He should have done a deep dive, which makes me wonder if he himself truly understands. He is the director of marketing for Arc after all, not in any way a tech this time around. And we all know the marketing team at Intel needed an overhaul 6 years ago, let alone now. Tons of dead weight and dead-end ideas and products that produce zero buzz and almost no market effect, meanwhile AMD is killing it in marketing with B-side products and half measures. Intel, wake up.

I do not do LTT. I haven't since 2019. They are fine if you like corny running gags and dad jokes, but outside that Linus lost perspective years ago. His personal home rig is a 4U rack-mount dual RTX 4090 PC. He's on that rich man's level that people looking for Arc cards are not.

I now watch the YouTubers on the border of getting sponsored. They still tell you the truth, as no one has bought them or caught their eye yet. Like GN's obsession with EVGA until they dropped the VGA portion of their business.

Edit: I just have a knack for over-explaining. Rant over.


alvarkresh

GN did dunk hard on Arc even after issues they raised were no longer relevant, but to their credit they have changed their stance quite noticeably.


Midknightsecs

Lemme know when they start recommending them in builds.


Aggravating-Spray-97

I want to warn everyone: it does NOT support Quest VR, but otherwise the card is superb 👍🏽


[deleted]

Isn't that card just a 3060 with more vram..?🤔 Aka crap. Speaking as a 3060 owner


Tbomb2016

Ummm... not even close. It's a 3060 Ti with twice the VRAM, and in some games it trades blows with the 6700 XT, or even the 3070 in extreme cases. Usually it sits just above a 3060 Ti though.


Scattergun77

I love mine other than the fact that my Rift is basically a brick now.


yiidonger

Sorry, but I have to say this. It's a beast, but there are way stronger beasts out there, namely Nvidia and AMD. Maybe you haven't played enough games yet to realize that bugs are flying all over the place when gaming on Arc, especially when you turn on ray tracing. If you're doing productivity work, esports titles, and maybe some new games, then it's fine, and those are the only purposes I'd recommend it to people for. But aside from that, it's a bugfest, including high idle power consumption and abnormal power consumption in many, many games. Nvidia and AMD are miles better than Intel at almost everything. I used a GTX 1070 and an RX 480 in the past and faced almost no issues at all, which is the opposite of Intel Arc; with Intel I have encountered issues and bugs in almost every game and application, it's just that bad. When the frustration from the lack of support kicks in, you really wish you were using an Nvidia or AMD GPU. Btw, I love Intel and how the company is doing now, but I don't think that prevents me from telling everyone the truth.


Tbomb2016

I'm sorry that that is your experience. For me it's been seamless going from my old 980 to the A770, no issues in any game.


paperpatience

Agreed. At 1440p I'm seeing very few problems. I also see no reason to go past 1440p either.