Wow... shocking...
A month or two ago, I was listening to an episode of the .NET Rocks! podcast where Anders Hejlsberg slagged Software Transactional Memory (STM): "It's the gift that keeps on giving... in terms of complexity. The overhead is terrible also. We're looking at two-fold and four-fold increases in processing time even in the best case scenario."
For those who have no idea what I am talking about, STM was considered the silver bullet for the problem of parallel programming. Guys like me have serious problems constructing highly parallel applications. I can do some threads. I've tried PLINQ. I am even fooling around with the Task Parallel Library. However, none of this gets me (or anyone else) to the point where you can construct extremely parallel systems. That is a very dark and arcane art for super-geniuses.
STM was supposed to be the closest thing we had to a silver bullet for this problem. In theory, it would have enabled a bunch of dudes like me to collaborate and build a seriously parallel system in pretty much the same way we have always coded. Supposedly the performance was exceptionally good, too. I remember hearing Microsoft researchers in Cambridge, England stating this very thing.
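To give you a taste of why we were all drooling: the Cambridge guys actually shipped STM in Haskell's GHC compiler, where it lives in the Control.Concurrent.STM library to this day. Here is a minimal sketch of the idea using that library (the accounts and amounts are my own toy example, not anything from their papers). Note what is missing: locks.

```haskell
import Control.Concurrent.STM

-- A shared balance is a transactional variable (TVar).
type Account = TVar Int

-- Move money between two accounts. No locks, no lock ordering:
-- the runtime tracks every TVar the block touches and retries the
-- whole transaction if another thread commits a conflicting change.
transfer :: Account -> Account -> Int -> STM ()
transfer from to amount = do
  balance <- readTVar from
  writeTVar from (balance - amount)
  modifyTVar' to (+ amount)

main :: IO ()
main = do
  alice <- newTVarIO 100
  bob   <- newTVarIO 0
  -- 'atomically' runs the transaction: either every write
  -- commits or none of them do.
  atomically (transfer alice bob 40)
  a <- readTVarIO alice
  b <- readTVarIO bob
  print (a, b)  -- prints (60,40)
```

The selling point is composability: transfer is itself a transaction, so you can glue it into bigger transactions and the runtime still guarantees all-or-nothing behavior. Try doing that with two lock-based routines without inviting a deadlock.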
Well... no....
Anders says it performs like shit. He said that execution times go up two- to four-fold even in the best case scenario. That is not good performance.
Evidently, Anders had the ammo to back it up. According to recent reports, Microsoft's experiments with STM have ended, and they produced no engineering results. That means Microsoft will not be introducing any new products based on STM technology, or incorporating STM into any existing product lines.
Damn... I am so disappointed...
I think there are two things we can take away from this monumental moment:
- This is another case where the academic computer-science world has made huge promises--like artificial intelligence--and come up with shit in its hands.
- Witness the power of Anders Hejlsberg, tech geek in a basement, and his ability to shut down Ph.D.-minted Cambridge scientists.
Years ago, I decided not to major in computer science because the fuck-heads running the UCLA Department of Computer Science did not know what the hell they were doing. I could recount their reaction to Visual Basic 1.0, but I have already blogged about that.
Of course, AI has been the perpetual waste of billions in research money. We have gotten little or nothing out of this research. The best we have done is a couple of medical "expert systems" written in Prolog that help doctors diagnose really rare and difficult problems. According to many doctors, those tools aren't that good either.
Now we see the apparent demise of STM. I am so disappointed to learn that our boys in Cambridge were ivory-tower academics with their feet firmly planted in mid-air. I thought these guys were practical.
Do you remember what Dr. Stantz said to Dr. Venkman in Ghostbusters? "You don't know what it's like out there! I've worked in the private sector. They expect results!"
Yeah, we expect results. Researchers work under ideal conditions, and they don't have to produce results. Engineers work under real conditions, and they had better produce results. It looks like Cambridge's claims of outstanding performance were based on ideal conditions, not real ones.
Disgusting! I've been taken in by academic clowns! Bastards!
Second of all, this event speaks volumes about Anders's power as a good ol' tech geek. Anders and Bill Gates have similar origins. Anders was the better-looking guy, and the smarter guy, but Bill was more ambitious and a better businessman. Both of them were tech geeks in a basement who never finished their academic degrees. Now these guys arguably have more power over the real software world than just about anybody else.
Wow man... The Cambridge guys had me so convinced... If I had heard it from any source other than Anders, I would have discounted it as a crackpot remark. Anders has a ton of street cred with me.