Showing posts with label Intel.

Tuesday, June 1, 2010

Why Apple Inc is a cult for idiots



Folks, I have heard words lately I never wanted to hear in my life. It turns out my little sister is going away to college and I have been asked to finance a Mac for her. This is just the wrong request to make of a guy like me.

I was shown a CSUSF bookstore offering for an outdated and outmoded laptop that was current circa 2007. It was a Core 2 model running at 2.26GHz, loaded with 2GB of RAM, and equipped with a 250GB disk. The price tag was over $900. Allegedly she was catching a break: that figure already reflected a good-sized student discount.

Anyone who knows anything about laptop pricing these days will be laughing his ass off right about now. That is a clearance model. The exact equivalent from HP, Dell, Lenovo or Sony fetches no more than $599 today.

To make matters worse, the current Core i5 laptops from Apple run right around $1,800. I can show you a better Core i7 from HP for a meager $800. That is less than half the price; more like 45% of it.

Now that Apple makes 100% pure Intel machines, they have no place to run to and no place to hide. Perfect head-to-head comparisons are now possible. Other firms use the exact same mobos, CPUs, GPUs, and Samsung LED LCD screens that Apple... er... Foxconn uses to build Apple notebooks.

For all practical purposes, Apple laptops and Windows laptops are identical now. The visual styling may be a tad different, but that is of no importance. The operating system is the one key difference. It boils down to OSX versus Windows 7. That's the end of the story.

OH BUT THE TOUCH PAD MOUSE WITH GESTURES IS SO MUCH BETTER!

That is now an option on most PC laptops. The option adds about $100 to the price tag. It cannot explain a more-than-twofold difference in price.

Pray tell, why should OSX more than double the cost of the machine? There is no answer. OSX development costs a lot, and Apple cannot spread that cost over the huge market Windows enjoys. Even so, it does not come close to explaining the doubling of the Apple price tag.
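Run the back-of-envelope numbers yourself. These figures are loudly hypothetical, chosen purely to show the shape of the amortization argument:

    # Back-of-envelope amortization with hypothetical numbers: even a huge
    # OS development budget spread over a modest unit volume adds tens of
    # dollars per machine, not the $900+ gap we actually see.
    os_dev_cost_per_year = 500_000_000    # hypothetical annual OSX budget
    macs_shipped_per_year = 10_000_000    # hypothetical unit volume
    per_unit = os_dev_cost_per_year / macs_shipped_per_year
    print(f"OS development cost per machine: ${per_unit:.2f}")  # $50.00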

There is but one solution: Apple demands a very handsome profit on each and every one of their items. They have never wanted to sell in volume. They have never tried to corner the market. They have always fostered an elitist mentality. They have always wanted to be a Rolls Royce vendor.

They could afford to sell their products at a much cheaper price. They simply don't want to.

I could afford to spend the money. I simply don't want to. In fact, I would be a complete idiot if I did. A computer is not like a Le Creuset enameled cast iron pan. I will never be able to pass it down to my grandkids. The laptop I buy today is guaranteed to be undesirable rubbish in 3 years or less. A man would have to be a fool to buy an heirloom-quality computer.

The farce goes even deeper than this. There is almost no reason to prefer Mac OSX over Windows 7 except for prejudices, usually rooted in creative subcultures such as sound mixing. Anything OSX can do, Windows can do better. Windows can do anything better than Mac. I don't care whether it is simple office work like word processing, spreadsheets, and project plans, or more complicated tasks such as 3d visual effects. Windows 7 does it all.

Unless I were totally dedicated and committed to developing software for the UNIX environment (something I am not), I can see no earthly reason why I would want OSX. OSX is the Rolls Royce of UNIX. This I will gladly admit. If I were a LAMP stack developer, I would want Mac OSX. Otherwise, I simply have no use for the critter.

So why do fools buy into the idea of Macintosh? Let me reply with a question. Why do fools buy into the idea of expensive enameled cast iron from France when cheap stainless steel gets it done, and is preferred by expert chefs? You can have an entire set of stainless steel cookware for about the price of a single 7.25 quart Le Creuset French Oven.



I am afraid it goes deeper than this. Cookware doesn't quite have the dimension of religion that the Mac carries with it.

You see, Apple has always been cultic in its orientation. They have always attempted to breed a false sense of morality into their users. It is immoral and unethical to buy anything but Apple. Your Mac is like your wife; having a PC is like cheating on your wife. You are violating the ethics of your religion. Buy one and you are committing heresy, like an apostate. It will make you an antichrist in our community.

Creative types enforce the laws of Macintosh. They insist you be a bad-ass rebel against Windows 7, and they insist you rebel in exact conformity with their demands. If you go any other way, the community will more or less lock you out.

I myself have witnessed sound engineers feigning difficulties with FAT32-formatted hard drives, unwilling to accept audio recorded on a Windows PC. I have seen these greazie basterds waste my brother's precious studio time and money, pretending to fight technical difficulties that did not exist.

I have personally humiliated such drippy-hippies by showing them how to use their Macs to open these files without any difficulty. I have witnessed these humiliated drippy-hippies lecturing my brother about how he really should own a Mac, or he can't really share and participate in the sound engineering community... This after I have shown said hippy that there are no real problems in using Windows sound files.
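For what it's worth, the files themselves prove the point. A WAV recorded on Windows is a plain RIFF container; the bytes are identical no matter which OS wrote them, and a Mac reads FAT32 out of the box. Here is a minimal sketch (the filename is hypothetical) that inspects one on any platform using nothing but Python's standard library:

    # A WAV file is a RIFF container; the bytes don't know or care which
    # OS wrote them. The standard wave module reads it on Windows, Mac,
    # or Linux alike.
    import wave

    with wave.open("session_take01.wav", "rb") as w:  # hypothetical file
        print("channels:   ", w.getnchannels())
        print("sample rate:", w.getframerate(), "Hz")
        print("bit depth:  ", w.getsampwidth() * 8, "bits")
        print("duration:   ", w.getnframes() / w.getframerate(), "seconds")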

Mac is a fascistic intellectual orthodoxy enforced among certain creative types. As is often the case with orthodoxy, the law has no real merit; it's just the law.


Sunday, May 30, 2010

Intel kills Larrabee



I'm laughing my ass off right now.

Years ago, when I was knee deep in 3d visual effects as my hobby, I was much more into hardware than I am now. Hardware really mattered. No matter how much you had, it was never enough to get the job done. As a result I tracked improvements in the state of the art very closely. Don't get me wrong, I've always loved hardware. I've always been a details guy in this respect, but never more so than when I was doing 3d every day.

Something like 3 or 4 years ago, Intel basically announced that they were going to kill Nvidia and ATI. They were going to introduce their own 'discrete graphics processor'. The project was code-named Larrabee.

They had reasons to plan such a move. ATI was on the verge of merging with AMD, meaning AMD, arch-rival of Intel, was about to bring one of the two premiere graphics vendors in-house. Also, Intel's laptop accelerators were the laughing stock of the industry. We hardware enthusiasts could not say enough bad things about Intel's so-called GPUs.

Every time Anandtech or Tomshardware.com published a GPU round-up, the benchmarks clearly showed Intel at the very bottom of the heap. It wasn't even close. Intel routinely scored between 6% and 30% of comparable nVidia or ATI solutions.

When Intel made the Larrabee announcement, a lot of hardware fanboys shook their heads in disbelief. "Not again!" we all said. The last time Intel entered the discrete-GPU market, it was with a notorious market-failure called the i740. That chip was a pussy, and it remains the core architecture of their present integrated solution, which remains a pussy. Quite simply, none of us believed that Intel had the nuts to accomplish this mission.

It turns out we were right. Here we are, something like 3 or 4 years later, and the announcement has come: "Intel has shelved the Larrabee project." We knew it. I told you so.

The announcement does quite a bit of spin-doctoring. It makes constant allusions to a subject I will soon blog about: the arrival of super-compact computers-on-a-chip, the sort of thing that makes Android 2.2 and the iPhone possible.

It also makes near-direct references to AMD's upcoming Fusion processor. This is a new hybrid chip that will incorporate both x86 instruction processing and full-bore GPU processing on a single die. It is the most exciting project in development right now. If AMD can pull it off, Intel is in a lot of trouble.

Excuses, excuses! We know you failed. I know your internal benchmarks of the prototype demonstrated that you could not compete with nVidia and ATI. You knew you were going to introduce another spectacular market failure and take a bath. You cut your losses.

The mere attempt to enter the discrete market messed up Intel's relationship with nVidia, which was warming for the first time in history at that moment. It turned the two of them into phony competitors, and the tech press wasted spectacular amounts of HTML covering a rivalry that never materialized.

I think this announcement augurs bad things for Intel. Next time I blog, I'm going to talk about the super-compact little computers we are carting around in our hands these days. AMD's Fusion processor has the potential to make that market really explode. AMD has been working hard on this project for something like 4 years, and right now Intel does not have an answer for it. The failure to plan for competition in this segment, the failure to correctly track its evolution, just might lock Intel out of the most spectacular growth segment of the market.

That would be a bad thing... A really bad thing.

Thursday, June 18, 2009

Is Intel going to lose its manufacturing advantage?


Whatever happened to 32nm chips?

It was a mere 4 months ago that Intel first demo'd working 32nm processors in both mobile and desktop systems. Those chips were to become available soon. Soon is still not here yet.

Now, all of a sudden, I hear news that TSMC will be moving to a 28nm fabrication process early in 2010, and that Toshiba will be mass-producing 28nm chips in the same time frame.

This has implications. Unless both TSMC and Toshiba are vastly over-optimistic about their time frames, Intel is going to lose the lead in manufacturing. Think it through. Intel has still not shipped its first 32nm chip. They have a way to go before they completely convert to 32nm; there will be 45nm transistors in Intel's lineup for some time. Yet TSMC and Toshiba plan to ship 28nm chips early in 2010.

This news is like a ship flying a pirate flag rising over a watery horizon in front of a harbor town. A sense of dread must be rising in hearts all over Intel Corp.

Is it actually possible? Do you think they will actually do it? Can it be? TSMC vowed several years ago they would take the manufacturing process lead away from Intel. Now it appears they are on the verge of doing it. But Toshiba also?

Thursday, May 14, 2009

So AMD gained share on Intel this past quarter?

AMD saw its share of total CPU shipments increase to 22.3%. AMD gained 4.6 points. Intel lost 4.7. Ergo AMD gained directly from Intel's loss. You can read about it here.

When you read the analysis, one thing becomes absolutely clear: this is a famine story. This is a story of global recession, shrinking budgets, and less discretionary income. AMD won share because AMD is cheaper. The entire Phenom II platform is cheaper than the Core 2 Quad and Core i7 platforms. People are spending less money on new computers, ergo they buy AMD.

Truth be told, Phenom II is more than fast enough for all of today's games and all of today's conventional software.  If you are a developer and want to download a copy of Netbeans 6.7, Maven2 and Scala 2.8, a Phenom II will work just fine.  It will run Glassfish and MySQL at the same time without worries.  You will enjoy crisp performance.  Visual Studio & Microsoft SQL Server are a piece of cake.  Not to worry.  You do killer gaming?  No problem.  Just make sure you have a good GPU also.

But Phenom II cannot hold a candle to Core i7. I know. I have one. After very careful evaluation of the full spectrum of benchmark info, I came to the conclusion that Phenom II is almost a peer of Core 2 Quad. I already owned a Core 2 Quad, and I had owned it for almost 2 years when I sold it. Since Phenom II was barely competitive with my 2-year-old machine, it constituted no upgrade for me. Further, Phenom II--just like Core 2 Quad--delivers about 64% of the processing power of a Core i7 {when you compare MHz for MHz, core for core, watt for watt}. That makes Core i7 more than 50% faster than either of them.
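If you want the arithmetic behind that last claim spelled out, here it is:

    # If Phenom II and Core 2 Quad deliver 64% of a Core i7's throughput,
    # the i7's relative speedup is the reciprocal of that ratio.
    relative_throughput = 0.64
    speedup = 1 / relative_throughput                       # 1.5625
    print(f"Core i7 is {(speedup - 1) * 100:.0f}% faster")  # ~56% faster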

Who cares? Well, truth be told, there is one thing in the world that is still preposterously CPU intensive: 3d visual effects and HD compositing. If you make HD movies and like overlaying some 3d spice, you will bring any rig to its knees in no time. No matter how much CPU you get, it is never enough. No 3d or HD rendering process runs in real time on any machine. None of these software packages are interactive. The CPU is the bottleneck in the process, not the user. The computer does not wait for the user; the user waits for the computer.

I love the 3d arts. I am not a good artist, but this is my hobby and I love it. I use Modo, Vue 7 and ZBrush. Each of these systems wants a lot of RAM and a lot of CPU. The more the better. This is why I chose Core i7.
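Core count matters so much here because rendering is embarrassingly parallel: every frame is independent, so a renderer can farm them out to every core you own. Here is a minimal sketch of the idea; the render_frame body is a hypothetical stand-in for the real ray tracing Modo or Vue does:

    # Frames are independent, so a multi-core CPU (or a whole render farm)
    # can chew on them in parallel. render_frame() is a CPU-bound stand-in
    # for a real renderer's per-frame work.
    from multiprocessing import Pool

    def render_frame(frame):
        acc = 0.0
        for i in range(1, 2_000_000):   # pure number crunching
            acc += (frame * i) % 97
        return frame

    if __name__ == "__main__":
        with Pool() as pool:            # one worker per CPU core by default
            for frame in pool.imap_unordered(render_frame, range(1, 33)):
                print(f"frame {frame:02d} done")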

Industrial Light & Magic (George Lucas's firm) used to have a render farm they called The Death Star Render Cluster. This was a huge server farm, packed out with 600 to 800 server blades. Each blade sported 2 Opteron processors {back in the day}, and each of those Opterons had 1, 2 or 4 cores. They went through several generations of blades quickly. Just about all the shots in all of your favorite visual effects movies over the past 8 years were rendered on this cluster. The CPUs did all the work; GPUs from Nvidia and ATI did nothing vis-a-vis rendering.

You might ask: why did they choose AMD? At the risk of quoting Sarek of Vulcan, at the time it was the logical thing to do. People don't remember anymore, but for three years between 2004 and 2006, AMD ruled the world. In the middle of that span, AMD brought out a processor called the Athlon 64 X2. I was first in line to buy the 4400. I had it a day before the official release and paired it with a GeForce 7800. In May of 2005, this was the finest platform any guy could want. It consumed 89 watts tops, versus Intel's 125. That made a big difference to your heat production, heat sink, and noise levels. In terms of horsepower, it utterly devastated Intel's single-core 32-bit approach.

This was a major inflection point in the history of computing. It was like the ENIAC, it was like the DEC VAX, it was like the Apple II, it was like the IBM PC, it was like the Mac, it was like the Amiga. It changed the entire course and flow of history. Before the Athlon 64 X2, we had a single processor core on a chip, and it was 32 bit... for the most part. Intel's objective was to increase the raw clock frequency, which increased single-threaded execution speed. Suddenly, in May of 2005, we had two cores, and they were both 64 bit. That made for a far better experience on a heavily multitasking OS like Windows XP. This chip is now the pattern for everything in the industry. It broke the entire flow and direction of Intel's corporate road map, and it forced them to go in a different direction.

In the summer of 2005 Intel fell so far behind AMD {technologically speaking} most of us could not see the bottom.  Nobody wanted to buy Intel.  So powerful was this demand-force that even the most recalcitrant vendor in the world (Dell) was forced to introduce AMD systems.  Some were wondering if AMD would become the major supplier of x86 chips and lay a Zilog whupping on Intel.  Intel was in crisis.  The Pentium 4 had failed.  Worse still, the central strategy of the Intel corporation had failed.  AMD had handed them their ass.  There is no other way to describe it.

It took more than a year of devastation and shame to climb out of the hole, but in the fall of 2006 Intel delivered the greatest counter-punch in the history of the computer industry. It was a haymaker TKO. They called it the Core 2 Duo. It was so powerful, it sunk AMD's ship. It was an overnight sensation. Intel could righteously claim that Core 2 Duo was the best processor for every application. There was no benchmark it didn't win, and the average margin of victory was 27%. Worst of all for AMD, it consumed a meager 65 watts tops. AMD's finest were at that wattage, but they were much slower. Intel won every server and laptop design at every major manufacturer. We got the finest desktops and laptops we had ever owned.

Since then, AMD has been in a downward spiral. AMD merged with ATI, which most of us considered a mistake. They got completely obsessed with 'true quad-core', which meant sticking 4 cores on a single piece of silicon. This was of little consequence to the end user, and proved to be a first-class red herring. Intel just popped two Core 2 Duos into a single processor package and had an instant quad core. AMD stayed on their own personal wild-goose chase for more than a year and introduced Phenom I at the end of it: a processor that was not competitive in wattage or processing power with the best Intel offered. People got fired. AMD trudged on.

Many of us were depressed about AMD's plight. A lot of us liked AMD better than Intel, but the objective ones among us understood AMD was not competitive. It wasn't that Intel was anti-competitive. Rather, AMD failed to compete. AMD did not answer the bell. AMD did not counter-punch.

Nothing has really changed since then.  We are into the 3rd year of total domination by Intel.  Intel crushes everybody in terms of performance.  There is no competitor on the horizon for Core i7.  We only wait to see what Intel will do next.  Right now nobody is strong enough to carry the weight of Intel's jockstrap.

There is a faint hope that Sony, IBM and Toshiba can deliver a dual-core 64-bit Cell which runs at 7GHz. There have been rumors about this for some time, along with hope that the Sony PlayStation can be extended into a full-blown general-purpose platform with this 7GHz processor. But this is just hope against hope. Everything is x86 and AMD64 now. Even the Java platform is centered on x86 and AMD64; that is the focal point where the entire platform is best-tested and most robust.

Until quantum computing arrives, it is x86 and AMD64 all the way down to the turtles. When quantum computing arrives, I have a sinking feeling we will use it to run x86 and AMD64 code in emulation.

In view of all this, you can understand why I was surprised to read that AMD gained share against Intel. I am happy about this. It means AMD won't die soon. I am glad. However, technologically speaking, it is surprising. The reason is famine: global recession, liquidity crisis, credit crunch, and de-leveraging. People want to pay less. People want to buy in cash. Intel is a bit pricy for that. AMD can hit the budget point.


Tuesday, May 12, 2009

Does the government have any idea which direction it should shoot?


Once in a while I read something that makes my blood boil.  Yesterday, around noon, a flurry of articles were published indicating that the Department of Justice intends to begin stricter enforcement of antitrust laws. Assistant Attorney General Christine Varney said the Justice Department is abandoning legal guidelines established by the Bush administration in September 2008.

So who is in the cross-hairs of the sniper rifle this time? Surely it is Citibank, Bank of America, JP Morgan and Wells Fargo, right?

Wrong!  Google, Intel and Microsoft are the main targets.  Obama is shifting to the European approach to anti-monopoly action.  Since Google, Intel and Microsoft are constantly in trouble with European regulators, we can expect the same from the good old Federal Government.

Now, as much as I like small competitors like Borland (now dead), Yahoo (dying) and AMD (also in trouble), I know better than to think that monopoly power did it. AMD simply failed to compete with Intel; there were some very bad decisions there, such as the merger with ATI. Yahoo has not been competitive with Google in apps and hard information research for years now, and that isn't Google's fault. Yahoo wanted to be a celebrity tabloid machine. They decided to move towards People Magazine on their own, and now they are fucked. Borland died because it didn't know how to market to corporate America. They had awesome persuasive powers with true programmers, but no idea that they needed to market directly to corporate officers. The CIO needed to be in the crosshairs. They also had no idea that they needed to recruit MS Office power users. It is a horrible thing that Borland basically died, but market-wise, they were stupid.

So, instead of breaking up the Banks, who fucked the entire world, the Justice Department intends to prepare anti-trust proceedings against Google, Intel and Microsoft.  These are the three most powerful and important firms in the high-tech sector of the world.  Incidentally, the tech sector is the only thing keeping our American economy afloat right now.  Boy, these supreme assholes in the Justice Department really don't have a fucking clue do they?

I have nothing against trust-busting. Teddy Roosevelt did it all the time. He is one of my favorite guys, and I don't much like presidents and politicians. If you are going to bust some trusts, bust the dangerous ones. Don't go after the healthy, useful and benevolent ones. For crying out loud! Shoot straight, will ya!?

This new policy statement indicates that the Government does not know the difference between its ass and a hole in the ground. This has to be the most AFU announcement I have heard in some time.