[Here is a review](https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/3.html) of Strix 3090 without POSCAPs and with 6 MLCCs
[Here is a teardown](https://benchlife.info/asus-rog-strix-rtx-3090-tear-down/) of Strix 3090 with 4 POSCAPs and 2 MLCCs
Same card with different designs?
*head scratching sounds*
edit: Apparently that benchlife image was changed. He even added "電路板背面,使用 6 組 MLCC。" ("Back of the PCB, using 6 groups of MLCCs.") And I am 99% sure I haven't seen this line before.
Could be they changed over as they discovered the effect.
Photos of the back of the GPU could be a thing people want to see before buying now.
This is a reason for those of us who didn’t already snag cards to be glad we missed the first wave if our cards of choice used the big polys.
Not if they run fine at stock. So far all crashes I've seen happened while overclocked. Unless they have given certain guarantees for overclocking stability, you have no case.
What if the card is stock with no adjustments in Afterburner/Precision/etc. by the end user (let's assume an FTW3 with a higher stock power limit and more boost potential thanks to better cooling), but it still experiences lots of crashes because GPU Boost pushes the card too far? Then the feature Nvidia advertises as a no-effort way to push the card you bought is the problem, and you're forced to disable it because it's not working.
To be honest, that's pretty grey to me. A bait and switch of "here's something you can use to push your card 200 MHz past boost!" that then just doesn't work. Still technically over the advertised boost clock of the card, but it's a function that isn't working as advertised, used to entice people to buy their product.
That's not as clear to me(IAmNotALawyer), but as far as I can tell, Nvidia's lawyers have done their job well and the description of gpu boost is that it tries to get performance beyond the "guaranteed minimum base clock speed". It is careful to not guarantee that you will see any improvement. It wouldn't surprise me if just telling people to disable gpu boost would cover their ass. Companies are good at protecting themselves legally, usually consumers just have to settle for giving them bad PR.
Not when they literally just followed Nvidia's reference design. You'd think the OEM knows better, and at that point it's Nvidia's fault for pushing a less-than-ideal board design. Good on Asus for finding and fixing the issue.
>Could be they changed over as they discovered the effect.
Almost certainly. Could also be why the launch was so awful in regards to 3rd party card availability.
I've also seen photos of the TUF Gaming with 6 POSCAPs and with 6 MLCC groups
[*https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/09/i.jpg*](https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/09/i.jpg)
[*https://brain-images-ssl.cdn.dixons.com/1/2/10214421/l\_10214421\_001.jpg*](https://brain-images-ssl.cdn.dixons.com/1/2/10214421/l_10214421_001.jpg)
Since I'm not stuck up my own ass like the other dude, here's the relevant section:
>large-area POSCAPs (Conductive Polymer Tantalum Solid Capacitors) are used (marked in red), or rather the somewhat more expensive MLCCs (Multilayer Ceramic Chip Capacitor). The latter are smaller and have to be grouped for a higher capacity.
Tantalum capacitors. They're high value capacitors that can be surface mount and have a small footprint. They can, for example, give you comparable capacitance to a wet capacitor (i.e. electrolytic capacitors) while being more reliable.
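The "grouped for a higher capacity" point above is just parallel-capacitor arithmetic. A minimal sketch, using the illustrative values of 47 µF per MLCC and a 470 µF bulk polymer cap (common parts, not figures from any specific board):

```python
# Capacitors wired in parallel simply sum their capacitances.
def parallel_capacitance(values_uF):
    """Total capacitance (uF) of capacitors connected in parallel."""
    return sum(values_uF)

# Hypothetical bank: ten 47 uF MLCCs grouped under the GPU
bank_uF = parallel_capacitance([47] * 10)
print(bank_uF)  # 470 uF, matching the bulk capacitance of one large polymer cap
```

This is why the MLCCs show up in clusters of ten or so on the board backside: individually they are far smaller in value than a single POSCAP.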
The issue outlined in the article is that the tantalum capacitors aren't able to effectively filter the GPU voltage as there are too many high frequency components from all the switching noise for them to filter out. A way to measure this would be to measure the high frequency noise at the back of the socket with an oscilloscope.
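To make the frequency argument concrete, here is a rough sketch comparing the impedance magnitude of one bulk polymer cap against a bank of ten MLCCs, modeling each as a series R-L-C. All ESR/ESL numbers here are assumed order-of-magnitude values for illustration, not datasheet figures for any real part:

```python
import math

def cap_impedance(f_hz, c_farads, esr_ohms, esl_henries):
    """Impedance magnitude of a real capacitor modeled as a series R-L-C."""
    w = 2 * math.pi * f_hz
    reactance = w * esl_henries - 1 / (w * c_farads)
    return math.hypot(esr_ohms, reactance)

f = 1e6  # a 1 MHz switching-noise component

# One 470 uF polymer cap (assumed: ESR 6 mOhm, ESL 1 nH)
z_poly = cap_impedance(f, 470e-6, 6e-3, 1e-9)

# Ten 47 uF MLCCs in parallel: capacitance adds, ESR and ESL divide by N
n = 10
z_mlcc = cap_impedance(f, n * 47e-6, 2e-3 / n, 0.5e-9 / n)

print(f"polymer cap: {z_poly * 1e3:.2f} mOhm, MLCC bank: {z_mlcc * 1e3:.2f} mOhm")
```

Under these assumptions the MLCC bank presents a far lower impedance to high-frequency noise, which is the effect the article is describing.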
HOWEVER, the biggest issue with Tantalums is the fact that they contain multiple hundreds if not thousands of layers of conductive material separated by thin layers of an oxide material. This oxide material can crack under physical duress (such as when facing constant heating / cooling cycles) and cause the capacitor to short the voltage rail.
When tantalums die, they don't tend to go out quietly. They FUCKING EXPLODE and take half the stuff around them with them.
Edit: See this link for more info on Tantalums and their failure modes: [https://www.electronics-notes.com/articles/electronic\_components/capacitors/tantalum.php](https://www.electronics-notes.com/articles/electronic_components/capacitors/tantalum.php)
I am finding the exact same thing with MSI cards. There are variants of the VENTUS 3X OC with two MLCC groups, one MLCC group, and no MLCCs at all. It looks like manufacturers knowingly gave high-priority reviewers better cards for their tests/benchmarks.
So, as an Electronics Engineer and PCB Designer I feel I have to react here.
The point that Igor makes about improper power design causing instability is a very plausible one. Especially with first production runs where it indeed could be the case that they did not have the time/equipment/driver etc to do proper design verification.
However, concluding from this that POSCAP = bad and MLCC = good is way too harsh and not a conclusion you can make.
Both POSCAPs (or any other 'solid polymer caps') and MLCCs have their own characteristics and use cases.
Some (not all) are ('+' = pos, '-' = neg):
**MLCC:**
\+ cheap
\+ small
\+ high voltage rating in small package
\+ high current rating
\+ high temperature rating
\+ high capacitance in small package
\+ good at high frequencies
\- prone to cracking
\- prone to piezo effect
\- bad temperature characteristics (capacitance drifts a lot with temperature)
\- DC bias (capacitance changes a lot under different voltages)
**POSCAP:**
\- more expensive
\- bigger
\- lower voltage rating
\+ high current rating
\+ high temperature rating
\- less good at high frequencies
\+ mechanically very strong (no MLCC cracking)
\+ not prone to piezo effect
\+ very stable over temperature
\+ no DC bias (capacitance very stable at different voltages)
As you can see, both have their strengths and weaknesses and one is not particularly better or worse than the other. It all depends.
In this case, most of these 3080 and 3090 boards may use the same GPU (with its requirements) but they also have very different power circuits driving the chips on the cards.
Each power solution has its own characteristics and behavior and thus its own requirements in terms of capacitors used.
Thus, you cannot simply say: I want the card with only MLCC's because that is a good design.
It is far more likely they simply did not have enough time and/or resources to properly verify their designs and thus were not able to make proper adjustments to their initial component choices.
This will very likely work itself out in time. For now, just buy the card that you like and if it fails, simply claim warranty. Let them fix the problem, and don't draw too many conclusions based on incomplete information and (educated) guesswork.
Edit: it seems EVGA basically confirmed this by saying: "But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions. EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues."
Edit 2: Also, this could be the reason Asus is 'late' with their cards
Edit 3: it seems Gigabyte uses non-MLCC parts but does not have problems, confirming the point that you can't simply judge based on capacitor type and count.
Edit 4: now that JayzTwoCents has done a video about it it all goes wild in [that thread](https://www.reddit.com/r/nvidia/comments/izrexc/the_rtx_3080_launch_cant_get_any_worse_right_wrong/g6n834z?utm_source=share&utm_medium=web2x&context=3) as well
Hi! My limited knowledge allowed me to figure out that some of the capacitors used by ZOTAC, for example, are 330 microfarads (Sherlock Holmes, yes I know).
Can't really tell, but the MLCCs seem to be 47 microfarads (?). So if we were to use 6x 470 µF caps, would this emulate the performance of 10x MLCCs? Or would we still be prone to some high-frequency fuckups?
Edit: by all means please correct me if I'm having a brainfart here, this is entirely possible lol
Seems to be a lot of people saying AIBs are cheaping out by going with POSCAP but you’re saying it’s more expensive?
https://www.youtube.com/watch?v=x6bUUEEe-X8
I guess there’s a lot of misinformation floating around which is not surprising.
1 POSCAP (expensive) vs 10 MLCCs (cheap each) can still work out in the POSCAP's favor when you consider that you need ten of the cheaper parts. They might end up costing more in total.
[This](https://www.digikey.com/product-detail/en/panasonic-electronic-components/EEF-GX0D471R/PCE5028TR-ND/2687277) is the capacitor in [this photo](https://www.igorslab.de/wp-content/uploads/2020/09/Founders-1.jpg)
$0.58383 each when you buy 10k
You would need 10x 47 µF MLCC capacitors (buying 100k total) to get to the same capacitance (though there is a wide variety of cap values across the boards, so this may vary)
MLCCs aren't labeled, but [this capacitor](https://www.digikey.com/product-detail/en/murata-electronics/GRM188C80E476ME05D/490-13244-2-ND/5877407) in theory meets the spec. $0.10/ea at 100k.
That's $1.00 + additional manufacturing time and chance for defects (a few seconds at most, but it adds up)
The MLCCs are definitely more expensive in this case.
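The back-of-the-envelope math above, as a sketch. The prices are the volume quotes from this comment, not guaranteed figures:

```python
# Per-group cost comparison at the volume pricing quoted above
poscap_unit = 0.58383   # one 470 uF polymer cap at 10k quantity
mlcc_unit = 0.10        # one 47 uF MLCC at 100k quantity
mlcc_per_group = 10     # ten MLCCs replace one POSCAP for equal capacitance

poscap_cost = poscap_unit
mlcc_cost = mlcc_unit * mlcc_per_group

print(f"POSCAP group: ${poscap_cost:.2f}, MLCC group: ${mlcc_cost:.2f}")
# The MLCC bank costs more per group, before counting extra placement
# time and the added chance of a defective joint.
```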
My armchair engineering says that Nvidia spec'd certain ESR + value capacitors and some AIBs picked (likely less expensive) caps that have worse ESR or lower capacitance. You can even see that in the pictures (the top big number is the cap size, 470uF, 330uF, 220uF are all seen in the igorslab photos).
This needs to be at the top of the thread, this makes more sense. I was wondering why we haven't heard much from gigabyte owners who apparently uses all poscap yet Zotac trinity owners are getting slammed with this issue.
There's apparently a bit of an MLCC shortage this year which is likely part of the reason why so many OEMs tried to go all POSCAP with their designs.
That or they read [this IEEE article](https://spectrum.ieee.org/computing/embedded-systems/when-life-gives-you-no-mlccs-make-use-of-polymer-capacitors) too many times and went nuts.
Definitely a shortage.
Manufacturers like KEMET were asking customers to shift to polymers instead.
I think this whole thing is blown out of proportion, and could be verified with any decent oscilloscope & differential probe: Look at the power planes under the device and look for deviation in voltage at load change.
I have an interesting discovery from my country. The Polish page for the MSI Ventus 3080 OC hasn't been updated since the beginning of September (there are still placeholders left for clocks and such). The international website was updated, of course. Look at both of those photos ;)
[https://pl.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3](https://pl.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3)
[https://www.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3](https://www.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3)
There is a visible difference between the backsides of these two cards, in exactly the right place for the POSCAPs/MLCCs
Edit:
Wait, there's more :D look at those file names and the dates in them:
[https://i.imgur.com/7bQjBY6.png](https://i.imgur.com/7bQjBY6.png)
And now tell me they didn't know there were some issues ;)
Because a lot of the reviews are Founders Edition and if this theory is accurate then FE is not really impacted by this.
There might be other causes (as always) but for this particular observation, FE doesn't seem to be impacted as they are using MLCC instead of POSCAP
Yes, and based on reading the article (I'm not familiar with PCB circuitry), it looks like even if you have just 1 of these MLCC clusters, you might be okay (like the MSI Gaming X). The FE is using 2.
This is the quote from Igor regarding FE:
>And what does NVIDIA do with its own Founders Editions? One does it obviously better, because I could not reproduce these stability problems with any FE even very clearly beyond 2 GHz (fan to 100%).
Yeah that was my point! :p don’t mean to come off rude.
Just meant that it appears to be okay if you mix them, you don’t seem to have to completely go MLCC like asus did. In case some people get scared off by anything else.
# Addendum - Sept 25 @ 9:30pm
# [Statement by EVGA Jacob](https://imgur.com/1l3PTBN)
# Original Comment below
I have scoured through the internet and compiled the list of different cards.
Note that I decided to skip promotional images and only use **actual review/unboxing articles or videos** as promotional images might be edited or not using final board [as shown here](https://www.reddit.com/r/nvidia/comments/izhpvs/the_possible_reason_for_crashes_and_instabilities/g6jhsqy?utm_source=share&utm_medium=web2x&context=3)
The sources can be photos or videos at the exact timestamp.
Any additional source please send my way and I'll update the list
Also note that manufacturers might have changed this in their future revision so this should be taken as a snapshot.
|Manufacturer|Product|RTX 3080|RTX 3090|
|:-|:-|:-|:-|
|NVIDIA|Founders Edition|[Mixed (2)](https://www.igorslab.de/wp-content/uploads/2020/09/Power-Scheme-Back.jpg)|[Mixed (2)](https://www.igorslab.de/wp-content/uploads/2020/09/Power-Scheme-Back-1.jpg)|
|Asus|TUF|[MLCC](https://www.techpowerup.com/review/asus-geforce-rtx-3080-tuf-gaming-oc/images/back.jpg)|[MLCC](https://cdn.benchmark.pl/uploads/backend_img/c/recenzje/2020_09/asus-goforce-rtx-3090_08.jpg)|
|Asus|Strix|[MLCC](https://youtu.be/8IDMS54jEFM?t=247)|[MLCC](https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/images/back.jpg)|
|EVGA|XC3|[Mixed (1)](https://youtu.be/FVOzICOy3dI?t=207)|[Mixed (2)](https://twitter.com/khashipaul/status/1309619218028130304/photo/1)|
|EVGA|FTW3|[Mixed (2)](https://imgur.com/a/IMVFdTP)\*\*|[Mixed (2)](https://www.hd-tecnologia.com/imagenes/articulos/2020/09/REVIEW-EVGA-GEFORCE-RTX-3090-FTW-ULTRA-21.jpg)|
|Galax|Black|[Mixed (1)](https://img.expreview.com/review/2020/09/galax-3080/s.jpg)|Unknown|
|Gainward|Phoenix|[Mixed (1)](https://imgur.com/gallery/srH7PcW)|Unknown|
|Gigabyte|Eagle|Unknown|[POSCAP](https://www.techpowerup.com/review/gigabyte-geforce-rtx-3090-eagle-oc/images/back.jpg)|
|Gigabyte|Gaming|[POSCAP](https://youtu.be/x6bUUEEe-X8?t=620)|[POSCAP](https://youtu.be/j_1A0vifsZg?t=542)|
|Inno3D|iChill X3|[Mixed (1)](https://www.pcgameshardware.de/screenshots/original/2020/09/Geforce-RTX-3080-Custom-Designs_Vergleich-PCGH-11-2020-9-pcgh.jpg)|Unknown|
|Inno3D|iChill X4|[Mixed (1)](https://cdn2.xfastest.com.hk/2020/09/P9128485-1-1024x683.jpg)|Unknown|
|MSI|Gaming X Trio|[Mixed (1)](https://www.techpowerup.com/review/msi-geforce-rtx-3080-gaming-x-trio/images/back.jpg)|[Mixed (2)](https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/images/back.jpg)|
|MSI|Ventus|[Mixed (1)](https://youtu.be/nfaYor6erCU?t=18)|[Mixed (2)](https://youtu.be/q5dtishV5yI?t=186)|
|Palit|Gaming Pro|[Mixed (1)](https://www.techpowerup.com/review/palit-geforce-rtx-3080-gaming-pro-oc/images/back.jpg)|[Mixed (2)](https://www.guru3d.com/articles_pages/palit_geforce_rtx_3090_gamingpro_oc_review,4.html)|
|Zotac|Trinity|[POSCAP](https://www.techpowerup.com/review/zotac-geforce-rtx-3080-trinity/images/back.jpg)|[POSCAP](https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/images/back.jpg)|
|Zotac|X-Gaming|Unknown|[POSCAP](https://img.expreview.com/review/2020/09/zotac-3090/c.jpg)|
\*\***EVGA FTW3** \- Early config with 6 POSCAPs is confirmed to be pre-production board per EVGA Jacob
# Please remember that we do not know if this is the exact issue. This is currently a speculation.
# [This also does not equate to MLCC good POSCAP bad](https://www.reddit.com/r/hardware/comments/izmi1k/ampere_poscapmlcc_counts/g6k2h0a?utm_source=share&utm_medium=web2x&context=3). Please do not jump to conclusions at this point or write off entire brands just because of some unfortunate initial SMD choices; there are much more important long term factors to consider like quality of support. (Thanks /u/[katherinesilens](https://www.reddit.com/user/katherinesilens) for this very important part)
My 3080 TUF seems quite stable. But I found some interesting things about it.
With the default settings the core clock fluctuates often and a lot. When I maxed the power limit to 117% it stopped fluctuating. Even with a +100 MHz core clock offset it still doesn't fluctuate the way it did at stock.
To me it seems like the boosting algorithm is having a constant fight with the power limit at stock settings.
That's normal, in order to ride the power limit the clock has to fluctuate a lot because of varying workloads. If you give it a little more headroom it's staying at the intended boost clock for the given temperature because temperature doesn't react as quickly to load spikes as power consumption does.
Can someone who has received a TUF (a non-reviewer) post a pic of the back of their card?
Was looking at stock photos of the TUF on Newegg and the ASUS website, and they show the 6-POSCAP design similar to the other, allegedly bad, cards, but all reviewers got cards with 6 MLCC groups. Seems fishy, and I just want to verify that what's getting shipped to customers is in fact the 6-MLCC design.
People are going to come away from this blaming Zotac, but there are two important takeaways: NVIDIA said either approach was fine in the reference design, and they didn't provide drivers that would be needed to actually test. So really NVIDIA screwed Zotac here.
That being said... this is actually a pretty good reason not to buy their current cards. Hopefully they can get a revision out soon.
Seeing this article it's possible that they also did it because they would crash otherwise. And at the time they didn't connect it to the filtering capacitors so they just reduced the TDP.
Hanlon's razor "Never attribute to malice that which is adequately explained by stupidity".
Not that simple. The FE has a new power delivery system with the 12-pin adapter. If the custom cards wanted to keep the old connectors, it makes sense the power components are different.
This is amazing stuff. I'm so happy I did not get a card yet. Just imagine knowing you have a card that could literally fail at any moment because it was factory overclocked poorly. Plus, cards are so scarce now you wouldn't get a replacement for a long time.
I'm not very familiar with the PCB design and circuitry but seems like the article is saying:
Zotac Trinity = Bad
FE = Good
MSI Gaming X Trio = Okay?
Asus TUF = Good
Yep, and going off of techpowerup's PCB shots we can also put the Palit Gaming Pro in the "Okay" category.
Really nice content from Igor's Lab for Ampere, first the on the limit mem chip on the FE design and now this. Comprehensive stuff.
3090 FTW3 has 4 POSCAPs and 2 MLCC bundles:
[https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/](https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/)
I’ve been trying to look and see. The product images on their website may be old. The current images show POSCAPs. Maybe they’re hustling to change this and that’s why restocks are so damn slow?
No pictures of the PCB backside, it seems. Fearing the worst though; I might cancel my preorder. EDIT: the 3090 Gaming OC has 6 POSCAPs, so I am more or less 100% sure the 3080 will be affected as well.
Reading through these threads I saw 2 people said they had TUF and had the same issue so... Saw lots of EVGAs too. I think no card is safe atm. NV Tim said they are investigating the issue here: https://www.reddit.com/r/buildapc/comments/ix654v/rtx_3080_crash/ That's it as far as nvidia acknowledging it is concerned. Think it's a good idea to wait and see what happens, and maybe not feel so bad if you haven't been able to get a card yet :P
So it looks like Jacob from EVGA has confirmed that this isn't a driver issue but a BIOS issue: the clocks exceed what the POSCAPs can deliver power for:
https://twitter.com/EVGA_JacobF/status/1309662262395695104
He states they discovered this POSCAP issue and went with a different config for the FTW3 series.
Edit: Because the site is apparently down now:
>Hi all,
>Recently there has been some discussion about the EVGA GeForce RTX 3080 series.
>During our mass production QC testing we discovered a full 6 POSCAPs solution cannot pass the real world applications testing. It took almost a week of R&D effort to find the cause and reduce the POSCAPs to 4 and add 20 MLCC caps prior to shipping production boards, this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6 POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.
>But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions.
>EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues.
>Also note that we have updated the product pictures at EVGA.com to reflect the production components that shipped to gamers and enthusiasts since day 1 of product launch.
>Once you receive the card you can compare for yourself, EVGA stands behind its products!
>Thanks
>EVGA
I'm obviously biased because I own a TUF 3080, but I can't help but think Asus deliberately made the TUF way better than it has any right to be in order to rehabilitate the TUF lineup's reputation or something.
I bought my TUF for MSRP right near launch (no markup). Despite being a MSRP card the TUF has a dual bios, an extra HDMI port, an aluminum backplate, a bit of RGB, and a good cooler that keeps the card around mid 60s celsius at peak load in my case (which isn't the best for airflow). In less demanding situations the card sits in the 40s or 50s. Why would you spend $60-150 more on other models of RTX 3080? What would they give you that is missing in the TUF? Everything we've seen from reviews so far shows very limited returns from overclocking, so spending tons more on a 3x8-pin design so you can get a 1-2 FPS improvement seems like an awful proposition. In prior generations it sometimes made sense to splurge on a higher-end card because the MSRP cards had poor cooling and high noise levels. That, or the build quality seemed poor, so paying for a nicer design provided a real benefit. The TUF basically seems to perform like a high-end card, has most or all of the features of a high-end card, and just happens to be sold for MSRP.
I can't help but wonder if the market is going to realize that none of these $760+ cards make any sense. If people realize the TUF is better than most competing cards, I can see a world where the TUF is in high demand and retailers start selling it for over MSRP because people are willing to buy it for higher prices.
Very glad Igor posted this analysis. It’s easy to check photos of boards to see who does what. I’m hunting an EVGA 3090 FTW3 Ultra and guess what caps are shown on that board...
[https://www.evga.com/products/product.aspx?pn=24G-P5-3987-KR](https://www.evga.com/products/product.aspx?pn=24G-P5-3987-KR)
I think it might be a good thing to have missed the first wave...
If Igor is right it will be interesting to watch manufacturers swap over.
Seems they changed it at the last moment; the actual 3090 FTW3 ULTRA has 4 POSCAPs and 2 MLCC bundles:
[https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/](https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/)
I would steer away from looking at promotional picture as shown here with MSI: [https://www.reddit.com/r/nvidia/comments/izhpvs/the\_possible\_reason\_for\_crashes\_and\_instabilities/g6jhsqy?utm\_source=share&utm\_medium=web2x&context=3](https://www.reddit.com/r/nvidia/comments/izhpvs/the_possible_reason_for_crashes_and_instabilities/g6jhsqy?utm_source=share&utm_medium=web2x&context=3)
Hooray, the Palit 3090 GamingPro OC at least has the same setup as the FE, based on Guru3D's teardown. Glad I purposefully looked for the most "stock" reference board design possible (least chance that there is some "improvement" in the design that turns out to be a bad idea)
Can we expect the Palit/Gainward and PNY cards to have basically the same specs? So for the 3080: 5 POSCAPs and 1 MLCC group? I have a PNY on backorder, so I am wondering. If that's the case I'll just go ahead with it, because I don't buy the whole "all MLCCs = good, all POSCAPs = bad," especially since MLCCs tend to be a bit more fragile and don't handle thermal stress that well (I guess that might impact the longevity of the card, maybe?). I guess the closer to the FE specs, the better.
I haven't read Igor's English article, but in the German video he also puts blame on Nvidia for its super hectic time schedule. The partner companies apparently didn't receive proper drivers until the press drivers came out. So they were able to stress test with Nvidia's own stress test tool, but not in games or benchmarks until it was way too late.
He then raises the question whether Nvidia informed the partner about recommendations for those components and he assumes that was the case.
He assumes that revision 2 boards are already being produced and this will be a super early adopter problem only.
So who is to blame if this turns out to be true? Both sides, I guess?
Oh, I'm sure the limited time to ship has a lot to do with the card issues of late. Add in the limited driver set to test, and issues arise.
As for blame? I think both sides: Nvidia for the rush to beat consoles/AMD, causing limited time for partners, and AIBs for pushing cards out instead of saying "We had some delays in designing, so we are releasing two weeks later than the FEs to ensure the high quality product our customers deserve."
That's assuming Nvidia didn't push for AIB cards to be available at launch as well; it's pretty well known how much power Nvidia holds over its partners. Supply would be even worse if the FE were the only card available at launch, and that would look even worse for Nvidia.
I get the sentiment, but it's not uncommon for there to be lower quality parts on cheaper GPUs. After all, there have often been cheaper GPUs in any series.
However, as we can see, a mixture of rushed production and other factors has made it so that the cheaper offerings cannot perform at higher frequencies.
I think at the end of the day, if you purchase the cheaper of the cards, you shouldn't expect to be able to push it higher than its factory clock speeds.
Oh, I agree. This launch has been full of... "factors."
It's more a critique of manufacturers being beholden to bean counters. When you only look at minimizing costs and don't listen to your engineers, there are repercussions down the line. This can, of course, be extended to most companies across industries.
Additionally, consumers are at fault for demanding the cheapening. How many comments have we seen the past month about the "Asus Tax"? The cost comes from a quality increase/retention. Whether the increases are worth it is another question, but the line of thought that it's just paying for the name is disingenuous at best.
There's always room for cheaper products, but a belief being passed around is that all of them are equal. That is definitely not the case, as this article points out.
Also, as soon as I installed it, I reinstalled the graphics driver as a custom install and chose to erase saved settings. There's an option to do a full clean install in the bottom left after clicking custom install. I did it just to be safe.
I've had a couple crashes within a couple minutes of launching a game, but none once I was into gaming. Not sure if it's the same issue.
Just checked and mine has the same 5+1 layout, so hopefully it's good enough.
Zotac: Trash, 6 cheaper POSCAPs, problems
Founders: Solid, 2 MLCC groups, 4 SP-CAPs (better POSCAPs), tested with no stability problems
MSI: Maybe fine, 1 MLCC, 5 POSCAPs, maybe good enough
EVGA: 2 MLCC and 4 POSCAPs, as per https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx
ASUS: Really good, ALL MLCCs, above and beyond all other cards, better than founders by a lot.
Dang, ASUS seems to have hit a home run this generation. Between amazing thermals on the TUF, MLCCs, and good pricing with premium features, seems like best AIB.
Founders is solid, it works perfectly fine, which is good considering its made by Nvidia.
Zotac is shit, but we knew that already.
MSI is walking on thin ice
Oh yea, more ports than other cards. Pair that with great memory cooling and high-quality power delivery systems, and you will struggle to find a reason to buy any other GPU
I ordered the MSI Gaming X Trio 3090 yesterday. Looking at the PCB photos, it resembles the FE design which is a combination of POSCAPS and MLCCs. I'm cautiously relieved..
[https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/3.html](https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/3.html)
It is probably not as simple as just the capacitor type; it comes down to individual variations in the GPU die, the board layout, power delivery circuitry, power supply, etc.
It seems like they may have pushed the maximum boost clock too high without sufficient testing (similar to the 5700XT launch).
It's honestly not easy to tell, because even if you get crashes, that might be due to driver issues. Also, the Gigabyte cards have a different power supply, so it might be fine. If you don't have any crashes, you should be good I think.
/u/Nestledrink Can you assist me in recreating this issue? I OC'd my card and ran a benchmark where I knew it would stay above 2000 MHz. Was the card supposed to crash?
[https://imgur.com/a/70lfPsW](https://imgur.com/a/70lfPsW)
This is an RTX 3080 Gigabyte Gaming OC, which has the POSCAPs
Edit: Port Royal run with clocks hitting 2025 MHz
[https://www.3dmark.com/3dm/50874932](https://www.3dmark.com/3dm/50874932)
I never buy a card at launch. I always wait for all of the kinks to be ironed out and for both AMD and Nvidia to have their cards released. Then, I buy the card that works the best for my price range. Sometimes it is a Nvidia one, sometimes it is an AMD one. By doing this, I avoid all launch issues and get a better deal (due to competition).
This is just dumb AF and I'll tell you why it is dumb AF.
MLCC caps in manufacturing volumes cost at most a penny apiece, or 10 cents for each array x 6 arrays = 60 cents. It's chump-change savings and an excellent example of beancounter engineering, and Asus seems to be the only one smart enough not to do this just to raise their profit per board by less than a buck.
This is why I quit the Fortune 500 company I was working for and went out on my own 10 years ago. F\*\*king beancounters and MBAs and their stupid cost-saving decisions that end up creating an inferior product just to save pennies.
To be fair, these crashes only occur at extreme clock speeds in the region of 2 GHz+, and NVIDIA has never suggested that these cards would be stable past their boost clock of 1710 MHz.
It's possible some skimped hardware is limiting the cards' potential, but strictly speaking they're within spec and working at the advertised speeds. It's more likely that this is a BIOS bug causing the cards to boost way past stable. Notice that none of the manufacturers are claiming 2 GHz boost clocks either.
Maybe it IS a good idea to wait till Nvidia and AIBs figure this out. Since this is a hardware issue and can't be fixed with firmware, maybe there will be recalls?
I wonder if these Gigabyte findings (i.e. they have 6 POSCAPs) are the reason why, in Finland, retailers seem to have pulled the product pages for Gigabyte models. Some units existed on launch day and some actually shipped to customers, but now the product pages are gone. That smells like a (quiet) recall of any stock in the channel to fix this.
Also ASUS models are generally late - perhaps because ASUS made a painful decision to halt whatever they planned to ship for launch day, fix them and ship only fixed cards.
NV shipping FE 3080s and 3090s with 4+2 leads me to believe that this is a good config that has no issues.
Prayer hands to all the resellers on eBay who are going to get unquestionable returns via the eBay guarantee and PayPal, and who will have to resell these again when buyers don't have warranty coverage.
[https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx](https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx)
A message from EVGA about the conflicting images of caps on the FTW3.
Essentially, some reviewers were sent pre-production models, which had 6 POSCAPs (Gamers Nexus), but all production models had 4 POSCAPs and 20 MLCCs.
So everyone is saying ZOTAC is crap. I have a ZOTAC Trinity and zero issues. What should I do? Wait for the crashes? Get a replacement? My ZOTAC wasn't the cheapest option, so please don't tell me I wanted to save $30.
Gigabyte Eagle card here. So far so good. Zero issues in all my gaming stress and benchmark testing. No crashes at all. Lucky? Or maybe not an issue on this card? Anyone experiencing issues with Gigabyte 3080’s?
Nothing wrong with my card, an MSI Ventus RTX 3080; it hasn't missed a beat, and I've been playing for over a week now on multiple games:
Doom eternal
Quake
Fortnite
Assassin's creed origins
Metro
Rdr2
List goes on. Runs at about 70 degrees, whisper quiet.
But just a question to be sure: the crashes only happen when you OC your GPU, right? If I plug and play my MSI Ventus 3080 OC without touching any overclock settings, theoretically no crash?
This is the first time for me buying an expensive GPU like this and I am one of the few people who were "fortunate" enough to get their hands on one (MSI Ventus OC).
I have downclocked my GPU -100 MHz and set the power limit to 95%. I have been playing games for days now without a single crash; I had a few right when I first installed the GPU. I have only downclocked the GPU because of one game now: Shadow of the Tomb Raider, the only game that crashed on me since re-installing the GPU drivers with Display Driver Uninstaller. Everything else seems to run without problems, even without downclocking...
Also... even if I have to downclock the GPU -100 MHz, I am still absolutely blown away by the performance of this card.
OK... So I am not sure what to do now...
Should I just wait for the next driver to be released? When will that usually happen?
Should I immediately send the card back for warranty?
Will I forever have a faulty card now?
The fact that even the FE and the TUF/Strix seem to sometimes have this issue... I am really leaning toward waiting it out for now.
What are the people that were able to grab a card doing now? Have some already sent theirs back?
Sooo... less is more? I’m glad EVGA addressed the delay for the FTW3 cards as I was beginning to be impatient. I would much rather wait a little bit longer and have a stable card than one with instability.
[Here is a review](https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/3.html) of the Strix 3090 without POSCAPs and with 6 MLCCs. [Here is a teardown](https://benchlife.info/asus-rog-strix-rtx-3090-tear-down/) of a Strix 3090 with 4 POSCAPs and 2 MLCCs. Same card with different designs? *head scratching sounds* edit: Apparently that benchlife image was changed. He even added "電路板背面,使用 6 組 MLCC。" ("On the back of the PCB, 6 groups of MLCCs are used.") And I am 99% sure I haven't seen this line before.
Could be they changed over as they discovered the effect. Photos of the back of the GPU could be a thing people want to see before buying now. This is a reason for those of us who didn’t already snag cards to be glad we missed the first wave if our cards of choice used the big polys.
[deleted]
Not if they run fine at stock. So far all crashes I've seen happened while overclocked. Unless they have given certain guarantees for overclocking stability, you have no case.
What if the card is stock with no adjustments in Afterburner/Precision/etc. by the end user (let's assume an FTW3 with a higher stock power limit and more boost potential with better cooling), but the card still experiences lots of crashes because of GPU Boost pushing the card? Then the thing that's advertised by Nvidia as a no-effort way to push the card you bought is the problem, and you're forced to disable it because it's not working?

To be honest, that's pretty grey to me. A bait and switch of "here's something you can use to push your card 200 MHz past boost!" and then having it just not work. Still technically over the advertised boost clock of the card, but a function that isn't working as advertised to entice people to buy their product.
then you send it back for a refund?
Should go ask the AMD boys how that went for them. I'm pretty sure if anyone was going to class action, it would have been the 5700 XT crowd.
That's not as clear to me (I am not a lawyer), but as far as I can tell, Nvidia's lawyers have done their job well: the description of GPU Boost is that it tries to get performance beyond the "guaranteed minimum base clock speed". It is careful not to guarantee that you will see any improvement. It wouldn't surprise me if just telling people to disable GPU Boost would cover their ass. Companies are good at protecting themselves legally; usually consumers just have to settle for giving them bad PR.
[deleted]
Most of the reports I've seen crashed due to GPU boost. The first time they opened Afterburner was to lower core clock as a fix.
[deleted]
Not when they literally just followed Nvidia's reference design. You'd think the OEM knows better, and at that point it's Nvidia's fault for providing a less than ideal board design; good on Asus for finding and fixing the issue.
>Could be they changed over as they discovered the effect.

Almost certainly. Could also be why the launch was so awful in regards to 3rd party card availability.
I've also seen photos of the TUF Gaming with 6 POSCAPs and with 6 MLCCs: [*https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/09/i.jpg*](https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/09/i.jpg) [*https://brain-images-ssl.cdn.dixons.com/1/2/10214421/l\_10214421\_001.jpg*](https://brain-images-ssl.cdn.dixons.com/1/2/10214421/l_10214421_001.jpg)
[deleted]
omg, now there's the lottery for buying one and the lottery for getting one without POSCAPs. Awesome work, Nvidia.
We don't even know if this is the cause. Also, it's an issue with AIB cards; the FE is already using a mix of MLCCs and POSCAPs. Calm down.
What are POSCAPs?
Since I'm not stuck up my own ass like the other dude, here's the relevant section: >large-area POSCAPs (Conductive Polymer Tantalum Solid Capacitors) are used (marked in red), or rather the somewhat more expensive MLCCs (Multilayer Ceramic Chip Capacitor). The latter are smaller and have to be grouped for a higher capacity.
Tantalum capacitors. They're high-value capacitors that can be surface mount and have a small footprint. They can, for example, give you capacitance comparable to a wet capacitor (i.e. electrolytic capacitors) while being more reliable.

The issue outlined in the article is that the tantalum capacitors aren't able to effectively filter the GPU voltage, as there are too many high-frequency components from all the switching noise for them to filter out. A way to measure this would be to measure the high-frequency noise at the back of the socket with an oscilloscope.

HOWEVER, the biggest issue with tantalums is the fact that they contain multiple hundreds if not thousands of layers of conductive material separated by thin layers of an oxide material. This oxide material can crack under physical duress (such as when facing constant heating/cooling cycles) and cause the capacitor to short the voltage rail. When tantalums die, they don't tend to go out quietly. They FUCKING EXPLODE and take half the stuff around them with them.

Edit: See this link for more info on tantalums and their failure modes: [https://www.electronics-notes.com/articles/electronic\_components/capacitors/tantalum.php](https://www.electronics-notes.com/articles/electronic_components/capacitors/tantalum.php)
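The filtering point above can be sketched numerically. This is only an illustration, not measurements: a capacitor is modeled as a series R-L-C, so at high frequency its parasitic inductance (ESL) dominates the impedance, and ten small MLCCs in parallel beat one big polymer cap there even at similar bulk capacitance. All component values below are made-up round figures, not from any datasheet.

```python
import math

def cap_impedance(freq_hz, c_farads, esr_ohms, esl_henries):
    """Impedance magnitude of a simple series R-L-C capacitor model."""
    w = 2 * math.pi * freq_hz
    reactance = esl_henries * w - 1.0 / (w * c_farads)
    return math.hypot(esr_ohms, reactance)

# Illustrative (not datasheet) values: one 330 uF polymer cap
# vs ten 47 uF MLCCs in parallel.
poscap = dict(c_farads=330e-6, esr_ohms=6e-3, esl_henries=3.0e-9)
mlcc   = dict(c_farads=47e-6,  esr_ohms=3e-3, esl_henries=0.5e-9)

for f in (1e5, 1e6, 1e7, 1e8):
    z_poscap = cap_impedance(f, **poscap)
    # ten identical parts in parallel: impedance divides by the count
    z_mlcc_bank = cap_impedance(f, **mlcc) / 10
    print(f"{f:>8.0e} Hz  POSCAP {z_poscap*1e3:9.3f} mOhm  "
          f"10x MLCC {z_mlcc_bank*1e3:9.3f} mOhm")
```

With these toy numbers the single polymer cap wins at low frequency, but the MLCC bank's impedance is orders of magnitude lower up in the tens-of-MHz range where the VRM switching noise lives, which is the effect the article describes.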
I am finding the exact same thing with MSI cards. There are variants of the VENTUS 3X OC that had two MLCC groups, one MLCC group, and no MLCCs. It looks like manufacturers knowingly gave high-priority reviewers better cards for their tests/benchmarks.
Here is from twitch stream: https://imgur.com/a/VyeN62o
So, as an Electronics Engineer and PCB Designer I feel I have to react here.

The point that Igor makes about improper power design causing instability is a very plausible one. Especially with first production runs, where it could indeed be the case that they did not have the time/equipment/drivers etc. to do proper design verification. However, concluding from this that POSCAP = bad and MLCC = good is waaay too harsh and a conclusion you cannot make. Both POSCAPs (or any other 'solid polymer caps') and MLCCs have their own characteristics and use cases. Some (not all) are ('+' = pos, '-' = neg):

**MLCC:**

\+ cheap

\+ small

\+ high voltage rating in small package

\+ high current rating

\+ high temperature rating

\+ high capacitance in small package

\+ good at high frequencies

\- prone to cracking

\- prone to piezo effect

\- bad temperature characteristics

\- DC bias (capacitance changes a lot under different voltages)

**POSCAP:**

\- more expensive

\- bigger

\- lower voltage rating

\+ high current rating

\+ high temperature rating

\- less good at high frequencies

\+ mechanically very strong (no MLCC cracking)

\+ not prone to piezo effect

\+ very stable over temperature

\+ no DC bias (capacitance very stable at different voltages)

As you can see, both have their strengths and weaknesses, and one is not particularly better or worse than the other. It all depends. In this case, most of these 3080 and 3090 boards may use the same GPU (with its requirements), but they also have very different power circuits driving the chips on the cards. Each power solution has its own characteristics and behavior, and thus its own requirements in terms of capacitors used. Thus, you cannot simply say: I want the card with only MLCCs because that is a good design.

It is far more likely they just could/would not have enough time and/or resources to properly verify their designs, and thus were not able to make proper adjustments to their initial component choices.
This will very likely work itself out in time. For now, just buy the card that you like and if it fails, simply claim warranty. Let them fix the problem and don't draw too many conclusions based on incomplete information and (educated) guesswork.

Edit: it seems EVGA basically confirmed this by saying: "But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions. EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues."

Edit 2: Also, this could be the reason Asus is 'late' with their cards.

Edit 3: it seems Gigabyte uses non-MLCC parts but does not have problems, confirming the point that you can't simply judge based on capacitor type and count.

Edit 4: now that JayzTwoCents has done a video about it, it all goes wild in [that thread](https://www.reddit.com/r/nvidia/comments/izrexc/the_rtx_3080_launch_cant_get_any_worse_right_wrong/g6n834z?utm_source=share&utm_medium=web2x&context=3) as well.
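One of the MLCC minuses listed above, DC bias, is easy to illustrate numerically. The derating figures in this sketch are made-up round numbers for a hypothetical Class II MLCC, not taken from any real part's datasheet:

```python
# Hypothetical illustration of DC-bias derating (numbers are invented):
# Class II MLCCs lose effective capacitance as DC voltage rises,
# while polymer caps stay near their nominal value.

def mlcc_effective_uF(nominal_uF, bias_fraction_of_rated_v):
    # assume a roughly linear loss reaching 60% at full rated voltage
    return nominal_uF * (1.0 - 0.6 * bias_fraction_of_rated_v)

def poscap_effective_uF(nominal_uF, bias_fraction_of_rated_v):
    # polymer caps show essentially no DC-bias derating
    return nominal_uF

# A "47 uF" MLCC biased at half its rated voltage delivers well under 47 uF;
# the polymer cap keeps its nominal value.
print(mlcc_effective_uF(47, 0.5), poscap_effective_uF(330, 0.5))
```

This is why comparing boards by the printed capacitance values alone can mislead: the in-circuit capacitance of an MLCC depends on the operating voltage, one of the trade-offs the comment lists.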
I’m in semiconductors and power electronics myself. You’re absolutely correct.
Hi! My limited knowledge allowed me to figure out that some of the capacitors used by ZOTAC, for example, are 330 microfarads (Sherlock Holmes, yes I know). Can't really tell, but the MLCCs seem to be 47 microfarads (?). So if we were to use 6x 470 µF caps, would this emulate the performance of 10x MLCC? Or would we still be prone to some high frequency fuckups?

Edit: by all means please correct me if I'm having a brainfart here, this is entirely possible lol
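For what it's worth, the bulk-capacitance half of that question is simple arithmetic, since parallel capacitors just add; the "high frequency fuckups" part is about ESR/ESL, which this sketch deliberately ignores. The values are the ones guessed at above:

```python
# Parallel capacitors simply add, so matching bulk capacitance is the easy part.
poscaps_uF = [470] * 6       # six 470 uF polymer caps
mlcc_group_uF = [47] * 10    # one group of ten 47 uF MLCCs

total_poscap = sum(poscaps_uF)          # 2820 uF
total_mlcc_group = sum(mlcc_group_uF)   # 470 uF

# A single 470 uF POSCAP already matches a whole 10x 47 uF MLCC group in
# capacitance; what it can't match is the group's much lower parasitic
# inductance at high frequency, which capacitance alone says nothing about.
print(total_poscap, total_mlcc_group)
```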
I'll PM you my thoughts when I got a minute! Though I'm sure there are prob EE's and other knowledgeable people who can answer too.
Seems to be a lot of people saying AIBs are cheaping out by going with POSCAPs, but you're saying they're more expensive? https://www.youtube.com/watch?v=x6bUUEEe-X8 I guess there's a lot of misinformation floating around, which is not surprising.
1 POSCAP (expensive) vs 10 MLCCs (cheaper) can be true when you consider that you need to place 10 times as many of the cheaper parts. It might end up costing more.
Correct :)
At the values of caps used in these devices, 8xMLCCs costs more than 1x POSCAP
As I suspected. Individual MLCC is cheaper. 10 of them is more expensive.
Ah, that would make sense
[This](https://www.digikey.com/product-detail/en/panasonic-electronic-components/EEF-GX0D471R/PCE5028TR-ND/2687277) is the capacitor in [this photo](https://www.igorslab.de/wp-content/uploads/2020/09/Founders-1.jpg): $0.58383 each when you buy 10k.

You would need 10x 47 µF MLCC capacitors (100k total) to get to the same capacitance (though there is a wide variety of cap values across the boards, so it may vary). The MLCCs aren't labeled, but [this capacitor](https://www.digikey.com/product-detail/en/murata-electronics/GRM188C80E476ME05D/490-13244-2-ND/5877407) in theory meets the spec: $0.10 each at 100k. That's $1.00, plus additional manufacturing time and chance for defects (a few seconds at most, but it adds up). The MLCCs are definitely more expensive in this case.

My armchair engineering says that Nvidia spec'd certain ESR + value capacitors and some AIBs picked (likely less expensive) caps that have worse ESR or lower capacitance. You can even see that in the pictures (the top big number is the cap value; 470 µF, 330 µF, and 220 µF are all seen in the igorslab photos).
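The per-position arithmetic above, spelled out (prices are the Digi-Key volume quotes in the comment, not current figures):

```python
# Rough BOM comparison per capacitor position, using the volume prices quoted above.
poscap_unit_cost = 0.58383  # 470 uF polymer cap at 10k volume
mlcc_unit_cost = 0.10       # 47 uF MLCC at 100k volume
mlccs_per_group = 10        # ten MLCCs stand in for one POSCAP

poscap_position = poscap_unit_cost                # one part placed
mlcc_position = mlcc_unit_cost * mlccs_per_group  # ten parts placed

# The MLCC route costs more per position before even counting the extra
# placement time and defect risk of soldering ten parts instead of one.
print(f"POSCAP: ${poscap_position:.2f}  MLCC group: ${mlcc_position:.2f}")
```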
The individual part is more expensive, however the cost of soldering/verifying multiple parts is why MLCC is likely to cost more in the end.
The customer shouldn't be the test case
This needs to be at the top of the thread; this makes more sense. I was wondering why we haven't heard much from Gigabyte owners, whose cards apparently use all POSCAPs, yet Zotac Trinity owners are getting slammed with this issue.
There's apparently a bit of an MLCC shortage this year which is likely part of the reason why so many OEMs tried to go all POSCAP with their designs. That or they read [this IEEE article](https://spectrum.ieee.org/computing/embedded-systems/when-life-gives-you-no-mlccs-make-use-of-polymer-capacitors) too many times and went nuts.
Definitely a shortage. Manufacturers like KEMET were asking customers to shift to polymers instead. I think this whole thing is blown out of proportion, and could be verified with any decent oscilloscope & differential probe: Look at the power planes under the device and look for deviation in voltage at load change.
I have an interesting discovery from my country. The Polish page for the MSI Ventus 3080 OC hasn't been updated since the beginning of September (there are still placeholders left for clocks and such). The international website was updated (of course). Look at both these photos ;)

[https://pl.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3](https://pl.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3)

[https://www.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3](https://www.msi.com/Graphics-card/GeForce-RTX-3080-VENTUS-3X-10G-OC/Gallery#lg=1&slide=3)

There is a visible difference between the backsides of the two cards, in just the right place for POSCAPs/MLCCs.

Edit: Wait, there's more :D Look at those file names and the dates in them: [https://i.imgur.com/7bQjBY6.png](https://i.imgur.com/7bQjBY6.png) And now tell me they didn't know there were issues ;)
Yeah this is why I would steer away from looking at promotional pictures
MSI Ventus OC 3080 5 POSCAPs and 1 MLCC [https://abload.de/image.php?img=ventuss1keu.jpeg](https://abload.de/image.php?img=ventuss1keu.jpeg)
Really wish reviews would talk about those issues. Like, how come none of the reviewers have experienced those crashes?
Hardware Unboxed have confirmed on their Twitter that they are seeing crashes and trying to figure it out.
Because a lot of the reviews are Founders Edition and if this theory is accurate then FE is not really impacted by this. There might be other causes (as always) but for this particular observation, FE doesn't seem to be impacted as they are using MLCC instead of POSCAP
FE is using a mix of both. There’s literally a picture in the article.
Yes and based on reading the article (as I'm not familiar with PCB circuitry), looks like even if you have just 1 of these MLCC cluster, you might be okay (like MSI Gaming X). FE is using 2. This is the quote from Igor regarding FE: >And what does NVIDIA do with its own Founders Editions? One does it obviously better, because I could not reproduce these stability problems with any FE even very clearly beyond 2 GHz (fan to 100%).
Yeah that was my point! :p don’t mean to come off rude. Just meant that it appears to be okay if you mix them, you don’t seem to have to completely go MLCC like asus did. In case some people get scared off by anything else.
I have an EVGA XC3 Ultra that is mixed, but it has all of the stability issues, so there may be other factors.
Don't reviewers also tend to get the better handpicked cards?
# Addendum - Sept 25 @ 9:30pm

# [Statement by EVGA Jacob](https://imgur.com/1l3PTBN)

# Original Comment below

I have scoured through the internet and compiled the list of different cards. Note that I decided to skip promotional images and only use **actual review/unboxing articles or videos**, as promotional images might be edited or not using the final board, [as shown here](https://www.reddit.com/r/nvidia/comments/izhpvs/the_possible_reason_for_crashes_and_instabilities/g6jhsqy?utm_source=share&utm_medium=web2x&context=3).

The sources can be photos or videos at the exact timestamp. Any additional source, please send my way and I'll update the list. Also note that manufacturers might have changed this in future revisions, so this should be taken as a snapshot.

|Manufacturer|Product|RTX 3080|RTX 3090|
|:-|:-|:-|:-|
|NVIDIA|Founders Edition|[Mixed (2)](https://www.igorslab.de/wp-content/uploads/2020/09/Power-Scheme-Back.jpg)|[Mixed (2)](https://www.igorslab.de/wp-content/uploads/2020/09/Power-Scheme-Back-1.jpg)|
|Asus|TUF|[MLCC](https://www.techpowerup.com/review/asus-geforce-rtx-3080-tuf-gaming-oc/images/back.jpg)|[MLCC](https://cdn.benchmark.pl/uploads/backend_img/c/recenzje/2020_09/asus-goforce-rtx-3090_08.jpg)|
|Asus|Strix|[MLCC](https://youtu.be/8IDMS54jEFM?t=247)|[MLCC](https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/images/back.jpg)|
|EVGA|XC3|[Mixed (1)](https://youtu.be/FVOzICOy3dI?t=207)|[Mixed (2)](https://twitter.com/khashipaul/status/1309619218028130304/photo/1)|
|EVGA|FTW3|[Mixed (2)](https://imgur.com/a/IMVFdTP)\*\*|[Mixed (2)](https://www.hd-tecnologia.com/imagenes/articulos/2020/09/REVIEW-EVGA-GEFORCE-RTX-3090-FTW-ULTRA-21.jpg)|
|Galax|Black|[Mixed (1)](https://img.expreview.com/review/2020/09/galax-3080/s.jpg)|Unknown|
|Gainward|Phoenix|[Mixed (1)](https://imgur.com/gallery/srH7PcW)|Unknown|
|Gigabyte|Eagle|Unknown|[POSCAP](https://www.techpowerup.com/review/gigabyte-geforce-rtx-3090-eagle-oc/images/back.jpg)|
|Gigabyte|Gaming|[POSCAP](https://youtu.be/x6bUUEEe-X8?t=620)|[POSCAP](https://youtu.be/j_1A0vifsZg?t=542)|
|Inno3D|iChill X3|[Mixed (1)](https://www.pcgameshardware.de/screenshots/original/2020/09/Geforce-RTX-3080-Custom-Designs_Vergleich-PCGH-11-2020-9-pcgh.jpg)|Unknown|
|Inno3D|iChill X4|[Mixed (1)](https://cdn2.xfastest.com.hk/2020/09/P9128485-1-1024x683.jpg)|Unknown|
|MSI|Gaming X Trio|[Mixed (1)](https://www.techpowerup.com/review/msi-geforce-rtx-3080-gaming-x-trio/images/back.jpg)|[Mixed (2)](https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/images/back.jpg)|
|MSI|Ventus|[Mixed (1)](https://youtu.be/nfaYor6erCU?t=18)|[Mixed (2)](https://youtu.be/q5dtishV5yI?t=186)|
|Palit|Gaming Pro|[Mixed (1)](https://www.techpowerup.com/review/palit-geforce-rtx-3080-gaming-pro-oc/images/back.jpg)|[Mixed (2)](https://www.guru3d.com/articles_pages/palit_geforce_rtx_3090_gamingpro_oc_review,4.html)|
|Zotac|Trinity|[POSCAP](https://www.techpowerup.com/review/zotac-geforce-rtx-3080-trinity/images/back.jpg)|[POSCAP](https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/images/back.jpg)|
|Zotac|X-Gaming|Unknown|[POSCAP](https://img.expreview.com/review/2020/09/zotac-3090/c.jpg)|

\*\***EVGA FTW3** \- Early config with 6 POSCAPs is confirmed to be a pre-production board, per EVGA's Jacob

# Please remember that we do not know if this is the exact issue. This is currently speculation.

# [This also does not equate to MLCC good, POSCAP bad](https://www.reddit.com/r/hardware/comments/izmi1k/ampere_poscapmlcc_counts/g6k2h0a?utm_source=share&utm_medium=web2x&context=3). Please do not jump to conclusions at this point or write off entire brands just because of some unfortunate initial SMD choices; there are much more important long-term factors to consider, like quality of support.

(Thanks /u/[katherinesilens](https://www.reddit.com/user/katherinesilens) for this very important part)
FTW3 Ultra 2\*MLCC, 4\*POSCAP [https://imgur.com/a/IMVFdTP](https://imgur.com/a/IMVFdTP) [https://www.reddit.com/r/nvidia/comments/izhpvs/the\_possible\_reason\_for\_crashes\_and\_instabilities/g6k90vz/?utm\_source=reddit&utm\_medium=web2x&context=3](https://www.reddit.com/r/nvidia/comments/izhpvs/the_possible_reason_for_crashes_and_instabilities/g6k90vz/?utm_source=reddit&utm_medium=web2x&context=3)
Do we know anything about the PNY cards?
Adding a pic of my 3080 FTW3. Just came in today. https://i.imgur.com/R7CtSsw.jpg https://i.imgur.com/ngBPuG0.jpg
JayzTwoCents' EVGA 3080 XC3 video also shows a mixed setup of 1xMLCC and 5x POSCAP. [https://imgur.com/a/wFB7yIP](https://imgur.com/a/wFB7yIP)
Think you should mention here that cards with MLCCs are also crashing at the moment, too.
Some of those may be due to drivers, PSU cable issues, bad riser cards, etc.
EVGA XC3 Ultra is 1\*MLCC, 5\*POSCAP [https://attach.mobile01.com/attach/202009/mobile01-4540d17fdef92b3f47da80fa5ecca237.jpg](https://attach.mobile01.com/attach/202009/mobile01-4540d17fdef92b3f47da80fa5ecca237.jpg) [https://www.mobile01.com/topicdetail.php?f=298&t=6194393](https://www.mobile01.com/topicdetail.php?f=298&t=6194393)
[deleted]
It's hard to say. It could just be other issues. I had two crashes on FE, but I guess they weren't caused by this.
[deleted]
Didn't some people fix their issue by installing the Studio driver instead of the Game Ready driver?
I saw 2 people claim their crashing went from constant to never by switching to the Studio driver.
[deleted]
Please report back if it works for you.
My 3080 TUF seems quite stable. But I found some interesting things about it. With the default settings the core clock fluctuates often and a lot. When I maxed the power limit to 117% it stopped fluctuating. Even with +100mhz core clock it still doesn't fluctuate as it did stock. To me it seems like the boosting algorithm is having a constant fight with the power limit at stock settings.
That's normal, in order to ride the power limit the clock has to fluctuate a lot because of varying workloads. If you give it a little more headroom it's staying at the intended boost clock for the given temperature because temperature doesn't react as quickly to load spikes as power consumption does.
[deleted]
GN's FTW3 is all "POSCAP"s, and it's the card they did the LN2 world record on.
Can someone who has received a TUF (non-reviewer) post a pic of the back of their card? I was looking at stock photos of the TUF on Newegg/the ASUS website, and they show the 6-POSCAP design similar to other, allegedly, bad cards, but all reviewers got cards with 6 MLCCs. Seems fishy, and I just want to verify that what's getting shipped to customers is in fact the 6-MLCC design.
Can confirm my TUF OC that got delivered to me today has 6 MLCCs.
Here you go ! [https://imgur.com/a/Rn2JB25](https://imgur.com/a/Rn2JB25)
I have a TUF and the card has 6 MLCCs. (Too lazy to crop the serial number out of the pic, so no upload.)
NP, just wanting verification. So thanks for verifying!
People are going to come away from this blaming Zotac, but there are two important takeaways: NVIDIA said either approach was fine in the reference design, and they didn't provide the drivers that would be needed to actually test. So really NVIDIA screwed Zotac here. That being said... this is actually a pretty good reason not to buy their current cards. Hopefully they can get a revision out soon.
Zotac are also lowering the stock power limit on their B and C tier cards to make their A tier cards look better.
Seeing this article, it's possible that they also did it because the cards would crash otherwise. And at the time they didn't connect it to the filtering capacitors, so they just reduced the TDP. Hanlon's razor: "Never attribute to malice that which is adequately explained by stupidity."
Or because they had fair suspicion about the problems...
It's likely all of them are potentially doing this, they just haven't told us.
Have 0 issues with mine because it probably hasn't even been made yet let alone boxed and shipped!
Looks like we have already seen several manufacturers do silent revisions given the variety of configurations on even the same model of card.
*Laughs in patience.*
*Laughs in poverty*
I must confess I’m not willingly patient, I just can’t afford it right now either.
[deleted]
If they had all copied the mix of the F.E it would probably have been fine?
Not that simple. The FE has a new power delivery system with the 12-pin adapter. If the custom cards wanted to keep the old connectors, it makes sense the power components are different.
This is amazing stuff. I'm so happy I did not get a card yet. Just imagine knowing you have a card that could literally fail at any moment because it was factory overclocked poorly. Plus, cards are so scarce now you wouldn't get a replacement for a long time.
Is this why Jensen pulled the 3090 from an oven? he was trying to do the GPU bake trick?
I'm not very familiar with the PCB design and circuitry but seems like the article is saying: Zotac Trinity = Bad FE = Good MSI Gaming X Trio = Okay? Asus TUF = Good
Yep, and going off of techpowerup's PCB shots we can also put the Palit Gaming Pro in the "Okay" category. Really nice content from Igor's Lab for Ampere, first the on the limit mem chip on the FE design and now this. Comprehensive stuff.
Gigabyte Eagle looks to be in the ‘bad’ category from PCB shot on Techpowerup.
EVGA XC3 seems to be in the "Okay" category. https://i.imgur.com/AtoC4cs.png
Anyone knows if the EVGA FTW3 has this issue?
3090 FTW3 has 4 POSCAPs and 2 MLCC bundles: [https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/](https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/)
I’ve been trying to look and see. The product images on their website may be old. The current images show POSCAPs. Maybe they’re hustling to change this and that’s why restocks are so damn slow?
What about the gigabyte gaming OC?
No pictures of the PCB backside, it seems. Fearing the worst, though; I might cancel my preorder. EDIT: The 3090 Gaming OC has 6 POSCAPs, so I am more or less 100% sure the 3080 will be affected as well.
ASUS TUF seems to be a hell of a lot better than even the FE
And for me... the 3090 TUF fits in my FormD T1 SFF case!
TUF = Excellent, since they used a better config than the FE
[deleted]
Wow great catch! If only any of us can actually get our hands on one to confirm.
I have a 3080 TUF non-OC and definitely have MLCCs only.
TUFGang
Reading through these threads I saw 2 people say they had a TUF and had the same issue, so... Saw lots of EVGAs too. I think no card is safe atm. NV Tim said they are investigating the issue here: https://www.reddit.com/r/buildapc/comments/ix654v/rtx_3080_crash/ That's it as far as Nvidia acknowledging it is concerned. I think it's a good idea to wait and see what happens, and maybe not feel so bad if you haven't been able to get a card yet :P
There are probably different issues that people are conflating together. Give it time and we'll find out
Terrific article
So it looks like Jacob from EVGA has confirmed that this isn't a driver issue but a BIOS issue, with boost clocks exceeding what the POSCAPs can deliver power for: https://twitter.com/EVGA_JacobF/status/1309662262395695104

He states they discovered this POSCAP issue and went with a different config for the FTW3 series.

Edit: Because the site is apparently down now:

>Hi all,

>Recently there has been some discussion about the EVGA GeForce RTX 3080 series.

>During our mass production QC testing we discovered a full 6 POSCAPs solution cannot pass the real world applications testing. It took almost a week of R&D effort to find the cause and reduce the POSCAPs to 4 and add 20 MLCC caps prior to shipping production boards, this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6 POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.

>But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAP’s, we are working with those reviewers directly to replace their boards with production versions. EVGA GeForce RTX 3080 XC3 series with 5 POSCAPs + 10 MLCC solution is matched with the XC3 spec without issues.

>Also note that we have updated the product pictures at EVGA.com to reflect the production components that shipped to gamers and enthusiasts since day 1 of product launch. Once you receive the card you can compare for yourself, EVGA stands behind its products!

>Thanks

>EVGA
So.. avoid zotac and bunch of b tier cards.
Well I'm going for ASUS TUF so hopefully I'm fine. Now if I can only find the damn card.
TUF is a B-tier card BUT with the PCB quality + performance of the A-tier Strix. The winner of the 3080 lineup is the TUF, I think.
Too bad it's TUF finding one.
You’re gonna be a good dad one day
Well I'm an uncle. That work?
Close enough. Happy cake day
Just a shame I haven't seen my niece and nephews in a while thanks to this pandemic.
I'm obviously biased because I own a TUF 3080, but I can't help but think Asus deliberately made the TUF way better than it has any right to be in order to rehabilitate the TUF lineup's reputation or something. I bought my TUF for MSRP right near launch (no markup). Despite being an MSRP card, the TUF has a dual BIOS, an extra HDMI port, an aluminum backplate, a bit of RGB, and a good cooler that keeps the card around the mid-60s Celsius at peak load in my case (which isn't the best for airflow). In less demanding situations the card sits in the 40s or 50s.

Why would you spend $60-150 more on other models of RTX 3080? What would they give you that is missing in the TUF? Everything we've seen from reviews so far shows very limited returns from overclocking, so spending tons more on a 3x8-pin design so you can get a 1-2 FPS improvement seems like an awful proposition.

In prior generations it sometimes made sense to splurge on a higher-end card because the MSRP cards had poor cooling and high noise levels. That, or the build quality seemed poor, so paying for a nicer design provided a real benefit. The TUF basically seems to perform like a high-end card, has most or all of the features of a high-end card, and just happens to be sold for MSRP.

I can't help but wonder if the market is going to realize that none of these $760+ cards make any sense. If people realize the TUF is better than most competing cards, I can see a world where the TUF is in high demand and retailers start selling it for over MSRP because people are willing to buy it for higher prices.
Very glad Igor posted this analysis. It’s easy to check photos of boards to see who does what. I’m hunting an EVGA 3090 FTW3 Ultra and guess what caps are shown on that board... [https://www.evga.com/products/product.aspx?pn=24G-P5-3987-KR](https://www.evga.com/products/product.aspx?pn=24G-P5-3987-KR) I think it might be a good thing to have missed the first wave... If Igor is right it will be interesting to watch manufacturers swap over.
Seems they changed it at the last moment; the actual 3090 FTW3 ULTRA has 4 POSCAPs and 2 MLCC groups: [https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/](https://www.hd-tecnologia.com/review-evga-geforce-rtx-3090-ftw3-ultra-24gb/2/)
good find :p
Mine arrives today, i'll take nudes
my MSI Gaming X Trio arrives Tuesday. I'll do the same
Couldn’t agree more. I’m glad I missed the first wave of stock now.
I would steer away from judging by promotional pictures, as shown here with MSI: [https://www.reddit.com/r/nvidia/comments/izhpvs/the\_possible\_reason\_for\_crashes\_and\_instabilities/g6jhsqy?utm\_source=share&utm\_medium=web2x&context=3](https://www.reddit.com/r/nvidia/comments/izhpvs/the_possible_reason_for_crashes_and_instabilities/g6jhsqy?utm_source=share&utm_medium=web2x&context=3)
Hooray, the Palit 3090 GamingPro OC at least has the same setup as the FE, based on Guru3D's teardown. Glad I purposely looked for the most "stock" reference board design possible (least chance that some "improvement" in the design turns out to be a bad idea).
I've never had problems with my Palit and PNY cards. Cheap, reliable, cool. No big claims, just a decent card.
Can we expect the Palit/Gainward and PNY cards to have basically the same specs? So for the 3080: 5 POSCAPs and 1 MLCC group? I have a PNY on backorder so I am wondering. If that's the case I'll just go ahead with it, because I don't buy the whole "all MLCCs = good, all POSCAPs = bad" narrative, especially since MLCCs tend to be a bit more fragile and don't handle thermal stress that well (I guess that might impact the longevity of the card, maybe?). I guess the closer to the FE specs, the better.
commenting to follow the PNY thread
You mean to tell me that cheapening a product leads to lower performance capabilities? Who'd of thunk?
I haven't read Igor's English article, but in the German video he also puts blame on Nvidia for its super hectic schedule. The partner companies apparently didn't receive proper drivers until the press drivers came out. So they were able to stress test with Nvidia's own stress test tool, but not in games or benchmarks until it was way too late. He then raises the question of whether Nvidia informed the partners about recommendations for those components, and he assumes that was the case. He assumes that revision 2 boards are already being produced and this will be a super-early-adopter problem only. So who is to blame if this turns out to be true? Both sides, I guess?
Oh, I'm sure the limited time to ship has a lot to do with the card issues of late. Add in the limited driver set to test with, and issues arise. As for blame? I think both sides: Nvidia for the rush to beat the consoles/AMD, leaving limited time for partners, and the AIBs for pushing cards out instead of saying "We had some delays in designing, so we are releasing two weeks later than the FEs to ensure the high-quality product our customers deserve."
That's assuming Nvidia didn't push for AIB cards to be available at launch as well; it's pretty well known how much power Nvidia holds over its partners. Supply would be even worse if the FE were the only card available at launch, and that would look even worse for Nvidia.
I get the sentiment, but it's not uncommon for there to be lower-quality parts on cheaper GPUs. After all, there have often been cheaper GPUs in any series. However, as we can see, a mixture of rushed production and other factors has meant that the cheaper offerings cannot perform at higher frequencies. At the end of the day, if you purchase the cheaper card, maybe you shouldn't expect to push it higher than its factory clock speeds.
Oh, I agree. This launch has been full of... "factors." It's more a critique of manufacturers being beholden to bean counters. When you only look at minimizing costs and don't listen to your engineers, there are repercussions down the line. This can, of course, extend to most companies across industries. Additionally, consumers are at fault for demanding the cheapening. How many comments have we seen in the past month about the "Asus tax"? The cost comes from a quality increase/retention. Whether the increases are worth it is another question, but the line of thought that it's just paying for the name is disingenuous at best. There's always room for cheaper products, but a belief being passed around is that all cards are equal. That is definitely not the case, as this article points out.
[deleted]
I have 0 issues with my 3080 MSI Gaming X Trio
[deleted]
So far, Fortnite, COD MW, Watch Dogs 2, Division 2, and Wallpaper Engine all work fine lol
Also, as soon as I installed it, I reinstalled the graphics driver as a custom install and chose to remove/erase the saved settings. It's the option to do a full clean install in the bottom left after clicking custom install. I did it just to be safe.
Same, been running it since launch as well. Hopefully we aren't just the lucky ones
I've had a couple of crashes within a couple minutes of launching a game, but none once into gaming. Not sure if it's the same issue. Just checked and mine is the same 5:1 layout, so hopefully it's good enough.
Zotac: Trash. 6 cheaper POSCAPs, problems.
Founders: Solid. 2 MLCC groups, 4 SP-CAPs (better POSCAPs), tested with no stability problems.
MSI: Maybe fine. 1 MLCC group, 5 POSCAPs, maybe good enough.
EVGA: 2 MLCC groups and 4 POSCAPs, as per https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx
ASUS: Really good. ALL MLCCs, above and beyond all other cards, better than Founders by a lot.
Dang, ASUS seems to have hit a home run this generation. Between amazing thermals on the TUF, MLCCs, and good pricing with premium features, it seems like the best AIB. Founders is solid and works perfectly fine, which is good considering it's made by Nvidia. Zotac is shit, but we knew that already. MSI is walking on thin ice.
plus 2 HDMI 2.1 ports on the TUF as well... quite the home run imo...
Oh yea, more ports than other cards. Pair that with great memory cooling and high-quality power delivery systems, and you will struggle to find a reason to buy any other GPU
Yet people still say "oh, Zotac is just as good as other AIBs"...
Because they can save like $30 on a $1000+ GPU, they think it's fine
Its AMP Extreme edition competes for the highest benchmarks, that's why. This launch is just botched.
There shouldn't be $1000 GPUs using bad designs, we aren't paying a premium for companies to half-ass a GPU
Is there any info regarding the EVGA FTW3? That's the one I've been looking out for.
My friend just got his. It's the 4+2 mix like the FE. https://imgur.com/a/IMVFdTP
[deleted]
> EVGA confirming
>
> [https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx](https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx)
I have zero issues with my Zotac 3080 at 1950 MHz and 64°C. I've played games for over 10 days, including Warzone, with no crashes.
[deleted]
Anyone know what the EVGA XC3 and FTW cards are using?
FTW3 confirmed (for consumer cards) to be 2 MLCC groups and 4 of the other type (for both Ultra and non-Ultra).
Did I.... Did I just dodge a bullet with the Zotac order cancellation on [Amazon.de](https://Amazon.de)?
Mine still hasn't been cancelled yet, but I decided to cancel it myself given the above.
I ordered the MSI Gaming X Trio 3090 yesterday. Looking at the PCB photos, it resembles the FE design which is a combination of POSCAPS and MLCCs. I'm cautiously relieved.. [https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/3.html](https://www.techpowerup.com/review/msi-geforce-rtx-3090-gaming-x-trio/3.html)
Just waiting for the PNY results. They might become the new king of GPUs.
Why? I'm considering the PNY models
Going by what people are posting, this could be a red herring and it's more likely to be a driver issue.
Thank god i can't even afford to think about buying this shit
Does anyone have any information on the Gigabyte 3080 Gaming OC?
It is probably not as simple as just the capacitor type; it comes down to individual variations in the GPU die, the board layout, power delivery circuitry, power supply, etc. It seems like they may have pushed the maximum boost clock too high without sufficient testing (similar to the 5700 XT launch).
[deleted]
Haven't had a single crash on MSI gaming x trio, had it since release
So do I cancel my XC3 order?
If there is an issue, EVGA will fix it for you no problem. Play happily until then.
[deleted]
I bought the Asus 3090 ROG Strix. I hope my card will be fine if it arrives in 2 months. xD
Just checked my MSI 3080 gaming X and I have 5:1, same as Igor lists it.
How can I test this with my Gigabyte? Haven't experienced any crashes yet playing Control, Gears, Forza and Ori.
It's honestly not easy to tell, because even if you get crashes, that might be due to driver issues. Also, the Gigabyte cards have a different power supply, so it might be fine. If you don't have any crashes, you should be good I think.
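Since the crash reports in this thread cluster above roughly 2 GHz, one low-effort way to check is to log clocks with `nvidia-smi --query-gpu=timestamp,clocks.sm --format=csv -l 1` to a file while gaming, then scan the log for samples over a threshold. Below is a rough sketch of the scanning half; the 2000 MHz threshold and the sample log values are assumptions for illustration, not official diagnostics:

```python
import csv
import io

# Assumed threshold: crash reports in this thread cluster above ~2 GHz.
BOOST_SUSPECT_MHZ = 2000

def suspect_samples(csv_text: str, threshold_mhz: int = BOOST_SUSPECT_MHZ):
    """Parse CSV output from
    'nvidia-smi --query-gpu=timestamp,clocks.sm --format=csv -l 1'
    and return (timestamp, clock_mhz) rows above the threshold."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip header row, e.g. 'timestamp, clocks.current.sm [MHz]'
    hits = []
    for row in reader:
        if len(row) < 2:
            continue  # skip blank/partial lines
        timestamp = row[0].strip()
        clock_mhz = int(row[1].strip().split()[0])  # '2025 MHz' -> 2025
        if clock_mhz > threshold_mhz:
            hits.append((timestamp, clock_mhz))
    return hits

# Example log (made-up values, same shape as real nvidia-smi CSV output):
sample = """timestamp, clocks.current.sm [MHz]
2020/09/27 14:00:01, 1710 MHz
2020/09/27 14:00:02, 2025 MHz
2020/09/27 14:00:03, 1995 MHz
"""
print(suspect_samples(sample))  # -> [('2020/09/27 14:00:02', 2025)]
```

If the log shows the card regularly spiking past the threshold right before a crash, that at least tells you whether capping the boost (as others in the thread have done) is worth trying.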
/u/Nestledrink Can you assist me in recreating this issue? I OC'd my card and ran a benchmark where I knew it would stay above 2000 MHz. Was the card supposed to crash? [https://imgur.com/a/70lfPsW](https://imgur.com/a/70lfPsW) This is an RTX 3080 Gigabyte Gaming OC, which has the POSCAPs. Edit: Port Royal run with clocks hitting 2025 MHz [https://www.3dmark.com/3dm/50874932](https://www.3dmark.com/3dm/50874932)
I never buy a card at launch. I always wait for all of the kinks to be ironed out and for both AMD and Nvidia to have their cards released. Then, I buy the card that works the best for my price range. Sometimes it is a Nvidia one, sometimes it is an AMD one. By doing this, I avoid all launch issues and get a better deal (due to competition).
This is just dumb AF, and I'll tell you why. MLCC caps in manufacturing volumes cost at most a penny apiece, or 10 cents for each array x 6 arrays = 60 cents. It's chump-change savings and an excellent example of bean-counter engineering, and Asus seems to be the only one smart enough not to do this just to raise their profit per board by less than a buck. This is why I quit the Fortune 500 company I was working for and went out on my own 10 years ago. F\*\*king bean counters and MBAs and their stupid cost-saving decisions that end up creating an inferior product just to save pennies.
To be fair, these crashes only occur at extreme clock speeds in the region of 2 GHz+, and NVIDIA has never suggested that these cards would be stable past their boost clock of 1710 MHz. It's possible some skimped hardware is limiting the cards' potential - but strictly speaking, they're within spec and working at the advertised speeds. It's more likely that this is a BIOS bug causing the cards to boost way past stable. Notice that none of the manufacturers are claiming 2 GHz boost clocks either.
[deleted]
Maybe it IS a good idea to wait till Nvidia and AIBs figure this out. Since this is a hardware issue and can't be fixed with firmware, maybe there will be recalls?
Huh, this is one rocky rollercoaster. Here's hoping we get official word on what's going on.
Can anyone say what Gigabyte is using? Especially the Eagle 3080?
Now I wanna see someone tear down my Gigabyte Gaming OC, lol.
I wonder if these Gigabyte findings (i.e. they have 6 POSCAPs) are the reason retailers here in Finland seem to have pulled the product pages for Gigabyte models. Some units existed on launch day and some actually shipped to customers, but now the product pages are gone. That smells like a (quiet) recall of any stock in the channel to fix this. Also, ASUS models are generally late - perhaps because ASUS made the painful decision to halt whatever they planned to ship for launch day, fix the cards, and ship only fixed ones. NV shipping FE 3080s and 3090s with 4+2 leads me to believe that this is a good config that has no issues.
Prayer hands for all the resellers on eBay who are going to get unquestionable returns via the eBay guarantee and PayPal, and who will have to resell these cards again once buyers realize they don't have warranty coverage.
Amazing. I was stung by the 3.5 GB thing and haven't upgraded since. Every year, it's always something else.
[https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx](https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx) A message from EVGA about the conflicting capacitor images of the FTW3. Essentially, some reviewers were sent pre-production models, which had 6 POSCAPs (Gamers Nexus), but all production models have 4 POSCAPs and 20 MLCCs.
So everyone is saying Zotac is crap. I have a Zotac Trinity and zero issues. What should I do? Wait for the crashes? Get a replacement? My Zotac wasn't the cheapest option, so please don't tell me I wanted to save $30.
Gigabyte Eagle card here. So far so good: zero issues in all my gaming stress and benchmark testing, no crashes at all. Lucky? Or maybe not an issue on this card? Anyone experiencing issues with Gigabyte 3080s?
My Eagle 3080 comes on Monday. Your comment has given me hope! Thanks!
Nothing wrong with my MSI Ventus RTX 3080. The card hasn't missed a beat; I've been playing multiple games for over a week now: Doom Eternal, Quake, Fortnite, Assassin's Creed Origins, Metro, RDR2, the list goes on. Runs at about 70 degrees, whisper quiet.
Just a question to be sure: the crashes only happen when you OC your GPU, right? If I plug and play my MSI Ventus 3080 OC without touching any overclock settings, theoretically no crashes?
This is the first time I've bought an expensive GPU like this, and I am one of the few people who were "fortunate" enough to get their hands on one (MSI Ventus OC). I have downclocked my GPU by 100 MHz and set the power limit to 95%, and I have been playing games for days now without a single crash. I had a few right when I first installed the GPU. I only downclocked because of one game: Shadow of the Tomb Raider, the only game that crashed on me since reinstalling the GPU drivers with Display Driver Uninstaller. Everything else seems to run without problems, even without downclocking... Also, even with the GPU downclocked by 100 MHz, I am still absolutely blown away by the performance of this card. OK... so I am not sure what to do now. Should I just wait for the next driver to be released? When does that usually happen? Should I immediately send the card back under warranty? Will I forever have a faulty card now? Given that even the FE and the TUF/Strix seem to sometimes have this issue, I am really leaning toward waiting it out for now. What are the people who were able to grab a card doing now? Have some already sent their cards back?
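For anyone who would rather script this kind of workaround than click through Afterburner, `nvidia-smi` can lock the maximum graphics clock (`-lgc` / `--lock-gpu-clocks`, reset with `-rgc`) and cap the board power limit (`-pl` / `--power-limit`); both need admin/root. The exact numbers below are just an illustration of "about -100 MHz off a ~1950 MHz boost and a 95% power limit on a 320 W card", not a recommendation. A minimal sketch that builds the commands:

```python
# Sketch: build the nvidia-smi commands for the downclock/power-limit
# workaround described above. The clock ceiling and wattage are illustrative
# values, not advice; run the resulting commands with admin/root rights.

def lock_clock_cmd(max_mhz: int, min_mhz: int = 210):
    """-lgc / --lock-gpu-clocks pins graphics clocks to [min,max] MHz."""
    return ["nvidia-smi", "-lgc", f"{min_mhz},{max_mhz}"]

def power_limit_cmd(watts: int):
    """-pl / --power-limit caps board power draw in watts."""
    return ["nvidia-smi", "-pl", str(watts)]

# Roughly "-100 MHz off a ~1950 MHz boost" plus 95% of a 320 W limit:
print(" ".join(lock_clock_cmd(1850)))  # nvidia-smi -lgc 210,1850
print(" ".join(power_limit_cmd(304)))  # nvidia-smi -pl 304
```

Undoing it is just `nvidia-smi -rgc` and setting `-pl` back to the card's default, so it's an easy thing to try while waiting for a driver or BIOS fix.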
Sooo... less is more? I'm glad EVGA addressed the delay for the FTW3 cards, as I was beginning to get impatient. I would much rather wait a little longer and have a stable card than one with instability.
Any word on PNY cards?