
Thursday, June 10, 2010

So Microsoft's experiments with Software Transactional Memory have ended, aye?



Wow... shocking...

A month or two ago, I was listening to an episode of the .NET Rocks Podcast where Anders Hejlsberg slagged Software Transactional Memory. "It's the gift that keeps on giving... in terms of complexity. The overhead is terrible also. We're looking at two-fold and four-fold increases in processing time even in the best case scenario."

For those who have no idea what I am talking about, STM was considered the silver bullet for the problem of parallel programming. Guys like me have serious problems constructing highly parallel applications. I can do some threads. I've tried P-Linq. I am even fooling around with the Task Parallel Library. However, none of this gets me (or anyone else) to the point where you can construct extremely parallel systems. This is a very dark and arcane art for super-geniuses.
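For anyone who wants to see what those tools look like in practice, here is a minimal C# sketch of the three I just named: a raw thread, a Task from the Task Parallel Library, and a PLINQ query. It is purely illustrative; nothing here solves the hard coordination problems, it just shows the level we mere mortals operate at.

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ParallelBasics
{
    static void Main()
    {
        // A raw thread: the low-level, do-it-yourself flavor of parallelism.
        var worker = new Thread(() => Console.WriteLine("work on a background thread"));
        worker.Start();
        worker.Join();

        // Task Parallel Library: the work gets scheduled onto the thread pool.
        Task<int> sum = Task.Factory.StartNew(() => Enumerable.Range(1, 1000).Sum());
        Console.WriteLine(sum.Result);

        // PLINQ: AsParallel() fans the query out across the available cores.
        long squares = Enumerable.Range(1, 1000000)
                                 .AsParallel()
                                 .Select(n => (long)n * n)
                                 .Sum();
        Console.WriteLine(squares);
    }
}
```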

STM was supposed to be the closest thing to a silver bullet for this problem. In theory, it would have enabled a bunch of dudes like me to collaborate and build a seriously parallel system in pretty much the same way we have always coded. Supposedly the performance was exceptionally good as well. I remember hearing Microsoft researchers in Cambridge, England, stating this very thing.
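To make the pitch concrete, here is a hedged sketch of the classic bank-transfer example. The lock-based half is how we actually write this today, with all the deadlock potential that hand-rolled lock ordering implies; the commented-out block is roughly what the STM papers promised. The `Atomic.Do` name is my own illustration, not a shipping .NET API.

```csharp
using System;

class Account
{
    private readonly object _sync = new object();
    public decimal Balance { get; private set; }

    public Account(decimal opening) { Balance = opening; }

    // Today's reality: hand-rolled locks. If another thread transfers in the
    // opposite direction and takes the locks in the opposite order, you deadlock.
    public static void Transfer(Account from, Account to, decimal amount)
    {
        lock (from._sync)
        {
            lock (to._sync)
            {
                from.Balance -= amount;
                to.Balance += amount;
            }
        }
    }

    // What STM promised (hypothetical API, purely illustrative): wrap the whole
    // thing in an atomic block and let the runtime detect conflicts and retry.
    //
    //   Atomic.Do(() =>
    //   {
    //       from.Balance -= amount;
    //       to.Balance += amount;
    //   });
}
```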

Well... no....

Anders says it performs like shit. He said that execution times increase 200-400% in the best case scenario. That is not good performance.

Evidently, Anders had the ammo to back it up. According to recent reports, Microsoft's experiments with STM have ended, and they produced no engineering results. That means Microsoft will not be introducing any new products based on STM technology, or incorporating STM into any existing product lines.

Damn... I am so disappointed...

I think there are two things we can carry away from this monumental moment:
  1. This is another case where the academic computer world has made huge promises--like artificial intelligence--and come up with shit in their hands.
  2. Witness the power of Anders Hejlsberg, tech geek in a basement, and his ability to shut down Ph.D.-minted Cambridge scientists.
Years ago, I decided not to major in computer science because the fuck-heads running the UCLA department of computer science did not know what the hell they were doing. I could recount their reaction to Visual Basic 1.0, but I have already blogged on that.

Of course, AI has been a perpetual waste of billions in research money. We have gotten little or nothing out of this research. The best we have done is a couple of medical "Expert Systems" written in Prolog that help doctors diagnose really rare and difficult problems. According to many doctors, those tools aren't that good either.

Now we see the apparent demise of Software Transactional Memory (STM). I am so disappointed to learn that our boys in Cambridge were ivory-tower academics with their feet firmly planted in mid-air. I thought these guys were practical.

Do you remember what Dr. Stantz said to Dr. Venkman in Ghostbusters? "You don't know what it's like out there! I've worked in the private sector. They expect results!"

Yeah, we expect results. Researchers work under ideal conditions, and they don't need to produce results. Engineers work under real conditions, and they had better produce results. It looks like Cambridge's claims of outstanding performance were based on ideal conditions, not real ones.

Disgusting! I've been taken in by academic clowns! Bastards!

Second of all, this event speaks volumes about Anders's power as a good ol' tech geek. Anders and Bill Gates have similar origins. Anders was the better-looking guy, and the smarter guy, but Bill was more ambitious and a better businessman. Both of them were tech geeks in a basement who never really finished their academic degrees. Now these guys arguably have more power over the real software world than just about anybody else.

Wow man... The Cambridge guys had me so convinced... If I had heard it from any source other than Anders, I would have discounted it as a crackpot remark. Anders has a ton of street cred with me.

Monday, June 22, 2009

Visual Basic is destined to die

Visual Basic is going to die because there are five specific things that kill parallelism:

1. Shared, global, mutable state.
2. Side effects all over the place.
3. Mutable data in just about any scope.
4. Synchronous communication between threads.
5. A highly imperative approach to coding.

All of these things are the bread and butter, the day and night, the warp and woof of life for a Visual Basic programmer. I challenge you to show me any sizable Visual Basic app currently in use in any department of any corporation that does not manifest all five of these elements. Element 4 is the only one that might possibly be missing, and if so, it is only because the application is absolutely single-threaded, with no split between interface and worker threads at all.
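If you want to see why those five items matter, here is a tiny C# sketch (the same pattern translates line-for-line into VB.NET) of what happens the moment a Global.bas-style variable meets more than one core. This is an illustration I made up, not code from any particular app.

```csharp
using System;
using System.Threading.Tasks;

class GlobalStateDemo
{
    // The moral equivalent of a variable in Global.bas: shared, global, mutable.
    static int _counter = 0;

    static void Main()
    {
        // A classic imperative loop body whose only job is a side effect on shared state.
        Parallel.For(0, 1000000, i => { _counter++; });

        // Almost always prints less than 1,000,000: the ++ is a read-modify-write,
        // and parallel iterations trample each other's updates.
        Console.WriteLine(_counter);
    }
}
```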

For these reasons, I think Visual Basic is going to go the way of dBase and FoxPro. I am not alone. There are many within Microsoft who do not believe that Visual Basic can be saved. Better and less biased minds know that the culture of Visual Basic would have to be violently altered before parallel processing could be widely deployed across this language community.

Bill himself does not like the notion of VB landing in the EOL category. Bill may personally see to it that the system lives a little longer than it should.

How would Bill save Basic and do a really good job of it at the same time? The only way to achieve this is for Microsoft to (once again) make violent changes to the language and to the overall software patterns of Basic. This happened once before. When VB.Net hit the market, VB programmers almost became violent over the loss of COM objects (such as DAO and ADO) and the kind of performance degradation their favorite sloppy language constructs produced. They screamed their lungs out.

So Bill could order the construction of PB--Parallel Basic--a language that would forbid Global.bas files, enforce class encapsulation, and push immutable data and statelessness. But can you imagine how much more upset VB programmers would be if and when they discovered that Microsoft had removed the Module, the Global.BAS, the ability to float global variables, and the ability to write functions outside classes? Can you imagine what would happen if every DIM statement produced a value that was immutable by default?
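PB is pure speculation on my part, so take the following C# sketch as nothing more than a picture of the discipline such a language would be enforcing: no modules, no globals, everything behind a class, and data that cannot be changed once it is built.

```csharp
using System;

// Roughly the shape of code a hypothetical "Parallel Basic" would force on you.
sealed class Customer
{
    // No setters: once constructed, the object cannot be mutated,
    // so it can be handed to any number of threads without a lock.
    public string Name { get; }
    public decimal CreditLimit { get; }

    public Customer(string name, decimal creditLimit)
    {
        Name = name;
        CreditLimit = creditLimit;
    }

    // "Changing" the data means building a new value, not scribbling over the old one.
    public Customer WithCreditLimit(decimal newLimit)
    {
        return new Customer(Name, newLimit);
    }
}
```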

I believe the typical VB programmer would be livid to the point of heart attack and stroke.

It might just be easier for Microsoft to place Visual Basic in the EOL category, and offer no further upgrades to this language. EOL means End of Life. They did it to FoxPro. They did it with their Fortran product (which was excellent). They did it to VBScript & ASP.

This brings us to the subject of Oslo and M. Already there is a theory that Oslo, also known as the M language, is being groomed by Microsoft as the declarative and thread-safe replacement for Visual Basic. According to the poop sheet, M is going to be an extremely parallel language system. Lots and lots of parallelism is going to happen under the hood whether you know it or not (as is the case in SQL). More will be available if you simply learn and implement a few elementary patterns of development. It remains to be seen whether a recalcitrant and stubbornly lazy VB community will even be willing to learn this new programming system.

At this point the Visual Basic programmer is probably screaming his head off: "AS IF ANY OF THIS IS REALLY NECESSARY?? WHAT THE FUCK IS THE BIG DEAL ABOUT THREADING?" I have stood nose to nose with 54-year-old VB programmers paid $95K+ as they insisted that that sort of programming is categorically unnecessary in an LOB departmental programmer's toolkit. "Yeah, but we'll never have to do that kind of thing around here! Why would it ever be necessary? That just isn't necessary."

Make no mistake about it: we are all going to have to program in parallel. We are going to have to use threads and PLinq and everything else that PCP throws at us. This is the only way our apps will be able to handle the terabytes of data we will be required to process in the next decade. The world changed in June of 2005, and most programmers have still not accepted this fact. Processors are not going to be getting that much faster in single-thread execution mode. Processors are going to become massively more parallel. We are only going to get faster by exploiting multiple cores at a time. We can only process increasing volumes of data by abandoning the single threaded and dual threaded application architecture. This means you will not be able to continue making a living programming in single threaded or dual threaded mode. Parallelism is the new God and maximum imperative of programming.
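As a small, made-up illustration of the point about data volume: the same query, once sequential and once parallel. Nothing clever is going on; PLINQ simply spreads the work across the cores, which is the only lever we are going to have left.

```csharp
using System;
using System.Linq;

class DataVolumeSketch
{
    static void Main()
    {
        // Stand-in for the flood of records we will be asked to chew through.
        var readings = Enumerable.Range(0, 10000000)
                                 .Select(i => (double)(i % 1000));

        // Single-threaded: one core does all the work, no matter how many you bought.
        double sequentialAverage = readings.Average();

        // PLINQ: the identical query, spread across every core the machine has.
        double parallelAverage = readings.AsParallel().Average();

        Console.WriteLine(sequentialAverage + " == " + parallelAverage);
    }
}
```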

I find that most programmers in most languages are in a vehement state of denial about the gravity of our current CPU architecture. Nowhere is that denial more stridently expressed than in the VB world. VB programmers are strident because they have great reason to fear. Many of the better VB programmers have tried their hand at threading. Most of them discovered that threading caused all manner of problems in their applications. Basically, threading broke their existing application architectures, and it broke them precisely because of shared, global, mutable state, uncontrolled side effects all over the place, synchronous communication between threads, and a highly imperative approach to coding.

So what about C# and Java? Java and C# programmers have a tendency to thread more in their apps, but these languages may not survive either. C# is a bit better off than Java. C# has absorbed many features of functional programming, although it is much more difficult than it should be to do immutable data. Also, F# is not a very good challenger for C# on the .NET side; many question whether it was ever intended to be. Java, on the other hand, has not really absorbed anything of the gospel of functional programming. Java still preaches the old lock-based, synchronous-communication threading model. Java is also faced with a serious threat of replacement by Scala, which is surging in popularity all over the world. There is no real doubt that Martin Odersky intends Scala to be the general-purpose replacement for Java on the JEE platform. He has explicitly testified to this in interviews.
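A quick illustration of the "harder than it should be" complaint about immutable data in C#: everything below has to be spelled out by hand, where an F# record or a Scala case class gives you the same thing in a line or two. This is my own toy example, not anyone's production code.

```csharp
using System;

sealed class Point
{
    private readonly double _x;
    private readonly double _y;

    public Point(double x, double y) { _x = x; _y = y; }

    public double X { get { return _x; } }
    public double Y { get { return _y; } }

    // Every "modified copy" method is hand-written boilerplate.
    public Point WithX(double x) { return new Point(x, _y); }
    public Point WithY(double y) { return new Point(_x, y); }
}
```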

Neither Java nor C# is guaranteed survival in the coming marketplace, but I firmly believe Visual Basic is dead.