I'm laughing my ass off right now.
Years ago, when I was knee-deep in 3D visual effects as my hobby, I was much more into hardware than I am now. Hardware really mattered. No matter how much you had, it was not enough to get the job done. As a result, I tracked improvements in the state of the art very closely. Don't get me wrong, I've always loved hardware. I've always been a detail guy in this respect, but never so much as when I was doing a lot of 3D every day.
Something like 3 or 4 years ago, Intel basically announced that they were going to kill Nvidia and ATI. They were going to introduce their own 'discrete graphics processor'. The project was code-named Larrabee.
They had reasons to plan such a move. ATI was on the verge of merging with AMD, and AMD, Intel's arch-rival, was about to bring one of the two premier graphics vendors in-house. On top of that, Intel's laptop graphics accelerators were the laughingstock of the industry. We hardware enthusiasts could not say enough bad things about Intel's so-called GPUs.
Every time Anandtech or Tomshardware.com published a GPU roundup, the benchmark numbers put Intel at the very bottom of the heap. It wasn't even close. Intel's parts routinely benchmarked at between 6% and 30% of the performance of comparable Nvidia or ATI solutions.
When Intel made the Larrabee announcement, a lot of hardware fanboys shook their heads in disbelief. "Not again!" we all said. The last time Intel entered the discrete-GPU market, it was with a notorious market failure called the i740. That chip was a pussy, and its core architecture lives on in Intel's present integrated solutions, which are still a pussy. Quite simply, none of us believed that Intel had the nuts to accomplish this mission.
It turns out we were right. Here we are, something like 3 or 4 years later, and the announcement has come: "Intel has shelved the Larrabee project." We knew it. I told you so.
The announcement does quite a bit of spin-doctoring. It alludes constantly to a subject I will soon blog about: the arrival of super-compact computers-on-a-chip, the sort of thing that makes Android 2.2 and the iPhone possible.
It also makes near-direct references to AMD's upcoming Fusion processor, a new hybrid chip that will put x86 instruction processing and full-bore GPU processing on a single die. This is the most exciting project in development right now. If AMD can pull it off, Intel is in a lot of trouble.
Excuses, excuses! We know you failed. I know your internal benchmarks of the prototype demonstrated that you could not compete with Nvidia and ATI. You knew you were going to introduce another spectacular market failure and take a bath, so you cut your losses.
The mere attempt to enter the discrete market soured Intel's relationship with Nvidia, which had been warming for the first time in history at that very moment. It turned the two of them into phony competitors, and the tech press wasted spectacular amounts of HTML covering the feud.
I think this announcement augurs bad things for Intel. Next time I blog, I'm going to talk about the super-compact little computers we are carting around in our hands these days. AMD's Fusion processor has the potential to make that market explode. AMD has been working hard on this project for something like 4 years, and right now Intel does not have an answer for it. The failure to plan for competition in this segment, the failure to correctly track its evolution, just might lock Intel out of the most spectacular growth segment of the market.
That would be a bad thing... A really bad thing.