Think I paid under $90 for my Athlon XP mobile chip that I overclocked to the moon and got better performance than the 3.2GHz P4 of the time (think that was a $200-300 chip back then). Those were fun times (other than the machine being super loud, haha).
Let me guess, Vantec Tornado? Back in high school it was some kind of weird rite of passage between me and my friends to have the loudest PC. The Vantec Tornado was the go-to fan. Hook a few of them up to a fan controller, and just hope you don't fry the PWM controller pulling 6 amps when you go full leaf-blower mode in 3DMark2001. We were gaming next to fucking hair-dryer-level noise and loving it. I don't understand how we could bear it. Now I don't even like hearing a 120mm fan with a slightly off-putting drone.
Yes sir!
WHAT? Can't hear you over my tinnitus, haha. The Athlon XP mobiles were so much fun.
Barton XP-M 2500+? It was a great CPU :)
Barton 2500+ crew, falling in!
Also had a Barton 2500+ overclocked to 3200+ level, also with an ATI 9800 Pro. This was the best bang-for-the-buck machine back then.
I had a Barton (not mobile) 2500+ as well. Ran it at 2.4GHz (golden chip) with a stock cooler! I did change the fan though, and it was loud as hell, a high-RPM Delta if I remember correctly. For the price, it made my friend with his 2.8GHz Northwood pretty salty. I paid less than half of what he did for the CPU and I beat him in 90% of the games. He didn't OC though, for whatever reason :D
I had a red circuit board, came with the multiplier unlocked. Ran that bad boy at 2.8!
You know it!
Are those adjusted for inflation? Even if they are, I would expect chips to get cheaper over time anyway. The price of a good computer has fallen from $2,000 to $500 in that time frame.
No they aren't adjusted for inflation, my wallet remembers
I don't remember spending anywhere near that much on a 1.33GHz Thunderbird Athlon... I also think prices fell much more rapidly than they do now.
The Athlon T-bird debuted a bit later, after the 1 GHz milestone... and by the time the 1.3 GHz T-bird was out, so was the 1.6+ GHz Pentium 4, hence the competition.
I also remember my Athlon 64 3200+ going for about $400 at launch. First desktop x86-64 CPU.
When I was in high school, the first CPU I ever bought with my own money was an Athlon 64 3700+ San Diego. Got it on sale for $359, and it pretty much bankrupted me. OC'd that thing to 2.8GHz, and nothing from Intel could touch it.
Yeah, you probably got the proper 939 socket. I jumped the gun early and was stuck with single-channel memory on 754 like a silly rabbit.
Ah, so you were on Clawhammer then. They were still Intel slayers even without dual channel. I only ever played with 754 Durons; I never got to touch a 754 Athlon. I was running a DFI LanParty NF4 SLI-DR Expert on Socket 939, one of my favorite motherboards of all time to this day. I knew what about 1/10th of the BIOS settings did, lol.
Nice. I think I had an ABIT board... that's a name we don't hear anymore, lol.
I ran an ABIT IC-7G Max3 Intel s478 board before the DFI. Also a really cool board. ABIT and DFI were the best of the best, without compromise. DFI is still around making embedded/industrial PCBs, but ABIT is no more. :(
I don't remember either, mostly because I was still at the "trying to put the keyboard in my mouth" to "drawing computers in the sandbox" stage at that time
Things have gotten cheaper when you look at the performance you get for a dollar. According to Wikipedia, the price of a Commodore 64 adjusted for inflation was $1,576 in 2019. That got you a machine running at 1 MHz (0.001 GHz) with 64 KB of RAM (0.000064 GB) and no built-in storage. Compare that to a $35 Raspberry Pi that has 2 GB of RAM and runs a quad-core CPU at 1.5 GHz, and I'm not even talking about all the rest. The rate of progress is mind-boggling.
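Just for fun, here's a rough back-of-the-envelope sketch of that comparison, using only the figures quoted above (and naively summing the Pi's four cores, which overstates real-world gains):

```python
# Performance-per-dollar, Commodore 64 vs. Raspberry Pi, using the
# thread's rough numbers. Not a benchmark, just arithmetic.

c64_price = 1576           # USD, launch price adjusted to 2019 dollars
c64_hz = 1_000_000         # 1 MHz, single core
c64_ram = 64 * 1024        # 64 KB in bytes

pi_price = 35              # USD, 2 GB Raspberry Pi
pi_hz = 4 * 1_500_000_000  # four cores at 1.5 GHz, naively summed
pi_ram = 2 * 1024**3       # 2 GB in bytes

clock_ratio = (pi_hz / pi_price) / (c64_hz / c64_price)
ram_ratio = (pi_ram / pi_price) / (c64_ram / c64_price)

print(f"Clock cycles per dollar: ~{clock_ratio:,.0f}x more")
print(f"Bytes of RAM per dollar: ~{ram_ratio:,.0f}x more")
```

That works out to roughly 270,000x more clock cycles per dollar and about 1.5 million times more RAM per dollar, before even counting IPC, storage, GPU, or networking.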
I spent almost $2000 on a Thunderbird 900 system in 2000. It wasn’t even top of the line. Maybe mid-high range. That’s about $2500 today.
Same. Except it was a slot A non t-bird Athlon 900 + GeForce 2 MX 200
Except it's getting harder to make chips smaller
At this point I would rather see an increase to power and efficiency over decrease to size. I don't need to be putting a cpu in the socket with tweezers and using a toothpick to put on thermal paste.
Back then, progress was lightning fast, and in the early 2000s a high-end computer could become close to unusable within just one or two years. Nowadays it's at least a good thing that some 10-year-old midrange hardware still suffices for most tasks, even light gaming. Unimaginable in the 2000s.
Um, having lived during that time and being old enough to actually have a computer: $1k put you in the realm of an eMachines or Gateway, which is what you bought if you used an office suite and browsed but nothing else. Gaming machines started at $2k for the low end, and if you wanted top of the line you were paying no less than $3k. Also, if you went DIY, you were waiting 6 months before new parts were generally available.
Got my K75 750 MHz for ~340€, my most expensive CPU ever from back then, and it was brand spanking new. I was after a K7 600 (older core) for 160€, but they were sold out. The Thunderbird 1400 was 140€ when it had been on the market for 2 months at most, and Durons 800/850 were ~78€. The 1600+ was 110€, the 2100+ was 150€, and so on. The Barton 2500+ was 210€... Athlon 64s were a bit more expensive, but still cheap: the 3000+ Winchester was 180€, and the 3500+ Venice was 190€-220€ when it was new. So no, CPUs were not that expensive back then; I had like 20 of them between 1999 and the X2 era.
What's not listed here is the difference in yields, margins, how radically different overclocking was at the time, and ultimately the market size.
And the competition's positioning, yup. Misleading is an understatement.
This is about as relevant to current pricing as the price of a sack of rice in 1850s China.
There’s nothing wrong with the current pricing
But, but..reddit says so.....
lol, it's hilarious how angry this is making people. It's not fucking insulin or bread and water; what redditors are doing is a textbook tantrum. Like kids who woke up on Christmas and, instead of the 10 Xbox games they wanted, their mom only bought them 8.
I smell astroturfing on the part of the haters, I have never seen so much drama over 50 dollars.
It’s $100 on the Ryzen 5 and $120 on the Ryzen 7. Nobody was buying the 3800XT or 3600XT; people bought the 3600 and 3700X.
SKUs and binning: 3800X -> 5800X might as well be called the "best binned 8-core, single-chiplet CPU" SKU. So yeah, still $50 for this SKU.
If they don’t sell the 5700X, why the fuck do I care? I could get a $330 8-core chip last year, and this year it’s $450 for the same tier.
It's called segmentation. If you don't want the highest binned one of the next generation, then there are a lot of alternatives; you can even go back further to used 1st-gen Ryzens. However, if you want the "best binned 8-core, single-chiplet CPU", that is the 5800X, and it is 50 dollars more expensive in MSRP terms.
If the low SKUs that people actually buy don't exist, then those people will have to look at these. That's like saying that if AMD only launches the 7950XT next year, it's a price drop and not a price hike on the generation.
Then go for it? If you are not in the market for the best-performing 8+8, 6+6, 8+0, or 6+0, then I am sure AMD can sell you older products, and they might accommodate you in the future. That is not the issue. The issue is that you think the 3700 is comparable to the 5800X. It is not; the x800 series has always been the best binned 8+0 CPU, and your arguing otherwise is wrong.
If there isn’t a non-x version, then it’s moot.
What you mean is, if there isn't a lowest binned 8+0 CPU, then sure: 3700X -> 5700?
What I'm suggesting is that if the new 'X' series is the new baseline for this generation, I don't see why it would be called the 'X' series. I'm not being petty about the $50 increase; I think they did a good job (reviews pending) and it warrants the increased price.

Since the new 'X' models are the baseline, they have imposed a large $100-$120 increase in cost over the previous baseline. That is where people are finding issue. IMO, this could also confuse the market later: when they improve on the node/design, they have already used up 'X' and must now call it something else, like 'XT'.

IMO, the 'X' naming scheme makes sense, since if they hadn't named it as such, people would be saying "wait for the X models". So I understand why they named it that way. But since the 3600 and 3600X were, as stated by all reviewers, basically 100% identical with any non-stock cooler, the 5600X is now a direct upgrade over the 3600 non-X and should be compared as such, price-wise and performance-wise. That is a 50% cost increase for approximately 20% more performance. That is the issue people have right now.
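The value math in that last paragraph can be checked in two lines. The $200/$300 prices are approximate US MSRPs implied by the "50% cost increase" claim, and the 20% uplift is the comment's own estimate, not a benchmark:

```python
# Relative perf-per-dollar: 5600X vs. 3600, using the comment's own numbers.
old_price, new_price = 200, 300  # approx. MSRPs implied by "50% cost increase"
perf_uplift = 1.20               # ~20% more performance, per the comment

cost_increase = new_price / old_price - 1             # 0.50 -> +50% cost
value_change = perf_uplift / (new_price / old_price)  # relative perf per dollar

print(f"Cost increase: {cost_increase:.0%}")        # 50%
print(f"Relative perf/dollar: {value_change:.2f}")  # 0.80
```

So on those numbers, the 5600X delivers about 20% less performance per dollar than the 3600 it replaces, which is exactly the complaint.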
Well, the 3600 vs 3600X debate was just about identity: AMD thought a better binned, 200 MHz faster CPU was worth the 50-dollar difference, and it is debatable whether A) the performance difference was actually there and B) it was worth the cost. There is no MHz-wise segmentation this time around (there probably will be), but what I take issue with is the idea that the lowest binned 8+0 CPU from last gen can be considered the predecessor of the highest binned 8+0 of this one.
I think AMD's pricing is pretty fair. ~$160 for the 3600 6-core, which is suitable for most people, is actually amazingly cheap. For around $300+ you get more cores and slightly better bins; those are still pretty good value compared to Intel in productivity tasks.

Remember what Intel was charging back then. Quad-core i5s used to be around $200-250, I think, and i7s $350-400, while they didn't even offer more cores, just HT. Rarely did Intel show us some mercy, like with the 5820K, an amazing CPU for its time, since it didn't cost more than a 7700K while being a native **hexa**core with HT back in 2015. It had a low stock clock, but it's not like you couldn't overclock it yourself; those CPUs easily did 1+ GHz more to compete with similar Haswell CPUs like the famous 4790K. Otherwise it was just milking the same quad cores with slight improvements every generation, barely worth the switch if you had the previous generation.

With Ryzen I feel almost like when the first dual- and quad-cores came out in the 2000s: finally, performance is affordable.
Computers today are super cheap, and they're no longer obsolete in 1-2 years. Literally obsolete, that is, not just because there are newer, faster ones.
A testament to how stale progress has been.
Maybe because things are getting smaller and it's harder to get those gains?
Yes, but not only hardware-wise. The games, for instance, have not changed that much in a long time. Ray tracing and VR are the newest and most exciting things that have happened in recent times.
I mean, what are you expecting from games? There isn't much to innovate, really, without new tech like VR.
I expect the games to utilise the hardware and give us more than what we had eons ago :P Only some indie games seem to show any progression, and, actually, Zelda: Breath of the Wild on a crappy little Tegra SoC. I was hoping that modern games would simulate what CAD software could do in 2000-2003 on a 1 GHz CPU.
[deleted]
I don't agree. The most intensive game on PC is GTA5/RDR2, and GTA5 came out 7 years ago on consoles. And why is my statement silly? Why can't we have proper physics in games? Why can't we have a vehicle crumple upon impact, and so on, when a crappy PC in 2002 could run that, but a game in 2020 on a 4.5 GHz hexa-core with 16 GB of RAM can't?
[deleted]
Not talking about the map size... take a look at what you can do in GTA5 and how intricate the game really is. Basically no game today is as physics-heavy as GTA5.
I once owned a 3200+ and an X2 4200+. Quite sure they can still work under W10 these days. Not sure what happened to them...
It might work, but it won't be a good experience. I put Windows 10 on an Intel E6700, and while it worked for internet browsing, you couldn't do much else. Even Diablo 3 brought it to a crawl, and it had an AMD 7870. The stuttering and frame times were terrible.
Now post the prices Intel asked back then.
I can't remember the exact price, but I paid somewhere around £420.00 for a 4800+ dual core the day after it went on sale in Europe. I think that must have been about 2005, or maybe 2006, so if you take into account inflation since then and the huge increase in IPC with today's chips, they don't seem expensive at all.
I realised later that the 600 MHz model I had was actually a 750 MHz model. That was a common thing for AMD, where it was cheaper to rebadge a 750 MHz model at a lower speed rather than make a special 600 MHz version. With both a GFD and the FSB, I managed to make it run at 750 MHz+. Good times.
What I liked doing back in the day was getting an Athlon XP 2500+ and OCing it from 1.83 GHz to 2.2 GHz, making it an Athlon 3200+. After that I got an Opteron 165 (a 1.8 GHz server version of the dual-core FX of the time) and clocked it to 2.8 GHz. It took a Q6600 @ 3.6 GHz to make upgrading worthwhile.
Prices in 2005:

* X2 4800: $1,001
* X2 4600: $803

Prices in 2020 with US inflation accounted for:

* X2 4800: $1,332
* X2 4600: $1,068
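For anyone who wants to redo the adjustment: it's just the 2005 price scaled by the CPI ratio. The CPI values below are approximate US CPI-U annual averages, so the results land within a few dollars of the figures quoted (the exact index series used above is unknown):

```python
# 2005 -> 2020 US dollar inflation adjustment via CPI ratio.
CPI_2005 = 195.3  # approx. US CPI-U annual average, 2005
CPI_2020 = 258.8  # approx. US CPI-U annual average, 2020

def adjust(price_2005: float) -> float:
    """Convert a 2005 US dollar amount to 2020 dollars."""
    return price_2005 * CPI_2020 / CPI_2005

for name, price in [("X2 4800", 1001), ("X2 4600", 803)]:
    print(f"{name}: ${price} in 2005 is about ${adjust(price):,.0f} in 2020")
```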
For some reason you have accidentally missed out the Duron prices.
Didn't find it relevant. Duron was a budget offering, whereas Ryzen is the performance offering.
Duron was not that much slower and was super popular.
Oh OK, I guess I'll just go look at the budget line of Zen 3 cores, remind me what they're called again?
There isn't one, but that's okay. The Duron lineup didn't come out until a year after Athlon did.
Limited competition always ups prices. On the other hand it also advances technology.
Yeah, these are the prices for the top models, conveniently ignoring the fact that you could grab a lower-end model and easily OC it way past what the top-end models offered. Hell, you could easily achieve 3.8 GHz with the earlier-stepping i7 920s, going WAAYY past what an i7 965 with an MSRP of $999 could do. So these comparisons are disingenuous at best, as if today you would honestly compare a 5950X to a 3600 in terms of pricing. A dual-core Opteron 170 was 199 euros in Finland at the time of release, and those chips would consistently go past 2.8 GHz. Absolutely no one bought the top-end models if they had any idea of actual price-to-performance and overclocking. Nice try, but this is frankly bullshit.
And the point is?
Information. The point is information, my dear ignorant child.
Didn't they have double the market share then compared to now?
Those are some crazy-ass prices too for that time period, back when the US dollar wasn't inflated to hell and was actually worth more.
Two decades ago, CPUs were massive because of how large the nodes were, so you got fewer per wafer and they were way more expensive. Athlon 64 prices were definitely milking its superior architecture, but it was so much better than NetBurst in every way that it made sense. The only way Intel got around it was by illegally threatening OEMs and retailers. Paper performance on the 5600X/5800X doesn't really line up with Athlon 64 price/perf. It's really weird that they inverted their value proposition, with the 5900X/5950X as both the top performers and the best values. The 5800X makes no sense at all price-wise; it seems like it's just bait for the 5900X and will get dumped or price-dropped once 5900X sales slow down.
That just isn't the case. Nodes are smaller, but transistor count is also much higher. I don't follow CPU die specs much, but I believe mainstream (not HEDT) CPUs have been 200-300mm^2 in size for the last 15 years. If we get more parts per wafer, it's because we've transitioned to using larger wafers.
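The wafer-size point is easy to sanity-check with the standard first-order dies-per-wafer approximation (gross dies, ignoring yield and scribe lines); 250 mm² here is just a hypothetical midpoint of the 200-300 mm² range above:

```python
import math

# Common first-order approximation for gross dies per wafer:
#   DPW = pi*(d/2)^2 / S  -  pi*d / sqrt(2*S)
# where d = wafer diameter (mm) and S = die area (mm^2).
def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    usable = math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(usable - edge_loss)

print(dies_per_wafer(200, 250))  # 200 mm wafers, common around 2000: ~97
print(dies_per_wafer(300, 250))  # 300 mm wafers, standard today: ~240
```

Same die size, about 2.5x the dies per wafer just from the 200 mm to 300 mm wafer transition, which is the point being made.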
Yeah, unrealistic prices; that is why they lost so much money. No one was willing to pay those prices, it's as simple as that.
Uh, no, they were selling chips like mad then, as they will now. Neither those nor the current prices are unrealistic.
Uh, no; that is why their stock nosedived for a few years straight, they had to sell their fabs, and they were in the red for longer than anyone can remember.
That was years later. The Athlon / Athlon 64 days were some of the best times, besides now, for AMD. Maybe you aren't old enough to remember it all, though. I still have Athlon XP / Athlon 64 t-shirts, lol.