
Sunday, May 30, 2010

Intel kills Larrabee



I'm laughing my ass off right now.

Years ago, when I was knee-deep in 3d visual effects as a hobby, I was much more into hardware than I am now. Hardware really mattered. No matter how much you had, it was never enough to get the job done. As a result I tracked improvements in the state of the art very closely. Don't get me wrong, I've always loved hardware. I've always been a detail guy in this respect, but never so much as when I was doing a lot of 3d every day.

Something like 3 or 4 years ago, Intel basically announced that they were going to kill Nvidia and ATI. They were going to introduce their own 'discrete graphics processor'. The project was code-named Larrabee.

They had reasons to plan such a move. ATI was on the verge of merging with AMD. AMD, arch-rival of Intel, was about to bring one of the two premier graphics vendors in-house. Also, Intel's integrated laptop graphics were the laughing stock of the industry. We hardware enthusiasts could not say enough bad things about Intel's so-called GPUs.

Every time Anandtech or Tomshardware.com published a GPU round-up, the benchmark results clearly showed Intel at the very bottom of the heap. It wasn't even close. Intel routinely turned in numbers between 6% and 30% of comparable Nvidia or ATI solutions.

When Intel made the Larrabee announcement, a lot of hardware fanboys shook their heads in disbelief. "Not again!" we all said. The last time Intel entered the discrete-GPU market, it was with a notorious market failure called the i740. That chip was a pussy, and it remains the core architecture of their present integrated solution, which is still a pussy. Quite simply, none of us believed that Intel had the nuts to accomplish this mission.

It turns out we were right. Here we are, something like 3 or 4 years later, and the announcement has come: "Intel has shelved the Larrabee project." We knew it. I told you so.

The announcement does quite a bit of spin-doctoring. They make constant allusions to a subject I will soon blog about: the arrival of super-compact computers-on-a-chip, the sort of thing that makes Android 2.2 and the iPhone possible.

They also make near-direct references to AMD's upcoming Fusion processor. This is a new hybrid chip that will incorporate both x86 instruction processing and full-bore GPU processing on a single die. It is the most exciting project in development right now. If AMD can bring it off, Intel is in a lot of trouble.

Excuses, excuses! We know you failed. I know your internal benchmarks of the prototype demonstrated that you could not compete with Nvidia and ATI. You knew you were going to introduce another spectacular market failure and take a bath. You cut your losses.

The mere attempt to enter the discrete market messed up Intel's relationship with Nvidia, which was warming for the first time in history at that moment. It turned the two of them into phony competitors, and spectacular amounts of HTML were wasted covering that phony rivalry.

I think this announcement augurs bad things for Intel. Next time I blog, I'm going to talk about the sub-super-compact little computers we are carting around in our hands these days. AMD's Fusion processor has the potential to make that market really explode. AMD has been working hard on this project for something like 4 years. Right now, Intel does not have an answer for it. The failure to plan for competition in this segment, and the failure to correctly track its evolution, just might lock Intel out of the most spectacular growth segment of the market.

That would be a bad thing... A really bad thing.

Thursday, May 14, 2009

So AMD gained share on Intel this past quarter?

AMD saw its share of total CPU shipments increase to 22.3%. AMD gained 4.6 points of share; Intel lost 4.7. Ergo, AMD's gain came directly from Intel's loss. You can read about it here.

When you read the analysis, one thing becomes absolutely clear: this is a famine story. This is a story of global recession, shrinking budgets, and less discretionary income. AMD won share because AMD is cheaper. The entire Phenom II platform is cheaper than the Core 2 Quad and Core i7 platforms. People are spending less money on new computers, ergo they buy AMD.

Truth be told, Phenom II is more than fast enough for all of today's games and all of today's conventional software.  If you are a developer and want to download a copy of Netbeans 6.7, Maven2 and Scala 2.8, a Phenom II will work just fine.  It will run Glassfish and MySQL at the same time without worries.  You will enjoy crisp performance.  Visual Studio & Microsoft SQL Server are a piece of cake.  Not to worry.  You do killer gaming?  No problem.  Just make sure you have a good GPU also.

But Phenom II cannot hold a candle to Core i7. I know; I have one. After very careful evaluation of the full spectrum of benchmark info, I came to the conclusion that Phenom II is roughly a peer of Core 2 Quad. I already owned a Core 2 Quad, and I had owned it for almost 2 years when I sold it. Since Phenom II was only just competitive with my 2-year-old machine, it constituted no upgrade for me. Further, Phenom II--just like Core 2 Quad--delivers about 64% of the processing power of a Core i7 {when you compare MHz for MHz, core for core, watt for watt}. That makes Core i7 more than 50% faster than either of them.
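
To see how those two numbers hang together, here is a quick back-of-the-envelope check (just a sketch using my own rounded 64% estimate, not an official benchmark figure):

```python
# Rough sanity check of the relative-performance claim above.
# Assumption: Phenom II / Core 2 Quad deliver roughly 64% of a Core i7's
# throughput (my own rounded estimate, not a published benchmark number).

phenom_fraction = 0.64             # Phenom II throughput as a fraction of Core i7
i7_speedup = 1 / phenom_fraction   # Core i7 throughput relative to Phenom II

# Prints "Core i7 is about 56% faster", which is where "more than 50%" comes from.
print(f"Core i7 is about {(i7_speedup - 1) * 100:.0f}% faster")
```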

Who cares? Well, truth be told, there is one thing in the world that is still preposterously CPU-intensive: 3d visual effects and HD compositing. If you make HD movies and like overlaying some 3d spice, you will bring any rig to its knees in no time. No matter how much CPU you get, it is never enough. No 3d or HD rendering process runs in real time on any machine. None of these software packages are interactive. The CPU is the bottleneck in the process, not the user. The computer does not wait for the user; the user waits for the computer.

I love the 3d arts. I am not a good artist, but this is my hobby and I love it. I use Modo, Vue 7 and ZBrush. Each of these packages wants a lot of RAM and a lot of CPU. The more the better. This is why I chose Core i7.

Industrial Light & Magic (George Lucas's firm) used to have a render farm they called the Death Star Render Cluster. This was a huge server farm, packed out with 600 to 800 server blades. Each blade sported 2 Opteron processors {back in the day}. Each of those Opterons had 1, 2 or 4 cores. They went through several generations of blades quickly. Just about all the visual-effects shots in your favorite movies of the past 8 years were rendered on this cluster. The CPUs did all the work. GPUs from Nvidia and ATI did nothing vis-a-vis rendering.

You might ask why they chose AMD. At the risk of quoting Sarek of Vulcan, at the time it was the logical thing to do. People don't remember anymore, but for three years, between 2004 and 2006, AMD ruled the world. In the middle of that span, AMD brought out a processor called the Athlon 64 X2. I was first in line to buy the 4400+. I had it a day before the official release and paired it with a GeForce 7800. In May of 2005, this was the finest platform any guy could want. It consumed 89 watts tops, vis-a-vis Intel's 125 watts. That made a big difference to your heat production, heat sink, and noise levels. In terms of horsepower, it utterly devastated Intel's single-core, 32-bit approach.

This was a major inflection point in the history of computing. It was like the ENIAC, like the DEC VAX, like the Apple II, like the IBM PC, like the Mac, like the Amiga. It changed the entire course and flow of history. Before the Athlon 64 X2, we had a single processor core on a chip, and it was 32 bit... for the most part. Intel's objective was to increase the raw clock frequency, which increased single-threaded execution speed. Suddenly, in May of 2005, we had two cores, and they were both 64 bit. That offered a far better experience for a highly multitasking OS like Windows XP. This chip is now the pattern for everything in the industry. It broke the entire flow and direction of Intel's corporate road map and forced them to go in a different direction.

In the summer of 2005, Intel fell so far behind AMD {technologically speaking} that most of us could not see the bottom. Nobody wanted to buy Intel. So powerful was this demand-force that even the most recalcitrant vendor in the world (Dell) was forced to introduce AMD systems. Some were wondering if AMD would become the major supplier of x86 chips and lay a Zilog whupping on Intel. Intel was in crisis. The Pentium 4 had failed. Worse still, the central strategy of the Intel corporation had failed. AMD had handed them their ass. There is no other way to describe it.

It took more than a year of devastation and shame to climb out of that hole, but in the fall of 2006 Intel delivered the greatest counter-punch in the history of the computer industry. It was a haymaker TKO. They called it the Core 2 Duo. It was so powerful, it sunk AMD's ship. It was an overnight sensation. Intel could righteously claim that Core 2 Duo was the best processor for every application. There was no benchmark it didn't win. The average margin of victory was 27%. Worst of all for AMD, it consumed a meager 65 watts tops. AMD's finest were at that wattage, but they were much slower. Intel won every server and laptop design contest at every major manufacturer. We got the finest desktops and laptops we had ever owned.

Since then, AMD has been in a downward spiral. AMD merged with ATI, which most of us considered a mistake. They got completely obsessed with 'true quad-core', which meant sticking 4 cores on a single piece of silicon. This was of little consequence to the end user, and proved to be a first-class red herring. Intel just popped two Core 2 Duos into a single processor package and had an instant quad core. AMD stayed on their own personal wild-goose chase for more than a year and introduced Phenom I at the end of it. This was a processor that was not competitive in wattage or processing power with the best Intel offered. People got fired. AMD trudged on.

Many of us were depressed about AMD's plight. A lot of us liked AMD better than Intel, but the objective ones among us understood that AMD was not competitive. It wasn't that Intel was anti-competitive. Rather, AMD failed to compete. AMD did not answer the bell. AMD did not counter-punch.

Nothing has really changed since then.  We are into the 3rd year of total domination by Intel.  Intel crushes everybody in terms of performance.  There is no competitor on the horizon for Core i7.  We only wait to see what Intel will do next.  Right now nobody is strong enough to carry the weight of Intel's jockstrap.

There is a faint hope that Sony, IBM and Toshiba can deliver a dual-core, 64-bit Cell which runs at 7GHz. There have been rumors about this for some time. There is a hope that the Sony PlayStation can be extended into a full-blown general-purpose platform with this 7GHz processor. But this is just hope against hope. Everything is x86 and AMD64 now. Even the Java platform is heavily centered on x86 and AMD64. That is the point of focus where the entire platform is best tested and most robust.

Until quantum computing arrives, it is x86 and AMD64 all the way down to the turtles. When quantum computing arrives, I have a sinking feeling we will use it to run x86 and AMD64 code in emulation mode.

In view of all this, you can understand why I was surprised to read that AMD gained share against Intel. I am happy about it. It means AMD won't die soon. I am glad. However, technologically speaking, it is surprising. The reason is famine: global recession, liquidity crisis, credit crunch, and de-leveraging. People want to pay less. People want to buy in cash. Intel is a bit pricey for this. AMD can hit the budget price point.


Tuesday, May 12, 2009

Does the government have any idea of which direction it should shoot?


Once in a while I read something that makes my blood boil.  Yesterday, around noon, a flurry of articles were published indicating that the Department of Justice intends to begin stricter enforcement of antitrust laws. Assistant Attorney General Christine Varney said the Justice Department is abandoning legal guidelines established by the Bush administration in September 2008.

So who is in the cross-hairs of the sniper rifle this time? Surely it's Citibank, Bank of America, JP Morgan and Wells Fargo, right?

Wrong!  Google, Intel and Microsoft are the main targets.  Obama is shifting to the European approach to anti-monopoly action.  Since Google, Intel and Microsoft are constantly in trouble with European regulators, we can expect the same from the good old Federal Government.

Now, as much as I like small competitors like Borland (now dead), Yahoo (dying) and AMD (also in trouble), I know better than to think that monopoly power did it. AMD simply failed to compete with Intel. There were some very bad decisions there, such as the merger with ATI. I know Yahoo has not been competitive with Google in terms of apps & hard information research for years now. That isn't Google's fault. Yahoo wanted to be a celebrity tabloid machine. They decided to move towards People Magazine on their own, and now they are fucked. Borland died because it didn't seem to know how to market to corporate America. They had awesome persuasive powers with true programmers, but they had no idea that they needed to market directly to corporate officers. The CIO needed to be in the crosshairs. They also had no idea that they needed to recruit MS Office power users. It is a horrible thing that Borland basically died, but market-wise, they were stupid.

So, instead of breaking up the banks, who fucked the entire world, the Justice Department intends to prepare antitrust proceedings against Google, Intel and Microsoft. These are the three most powerful and important firms in the high-tech sector of the world. Incidentally, the tech sector is the only thing keeping our American economy afloat right now. Boy, these supreme assholes in the Justice Department really don't have a fucking clue, do they?

I have nothing against Trust Busting. Teddy Roosevelt did it all the time. He is one of my favorite guys, and I don't much like presidents and politicians. If you are going to bust some trusts, bust the dangerous ones. Don't go after the healthy, useful and benevolent ones. For crying out loud! Shoot straight, will ya!?!?!?

This new policy statement indicates that the Government does not know the difference between its ass and a hole in the wall. This has to be the most AFU announcement I have heard in some time.