I spent a good part of today, not unlike most days, searching, writing, testing, backtracking and just generally feeling my way around the .NET Framework (fx). There can be no doubt that it's an impressive piece of work and, it has to be said, I love it. But the purist in me wonders what I'm losing (nay, throwing away) by being so lazy.
As part of my thinking process, I still refer back to my days (well, yeah ok, nights) of building my C programs on the Amiga. The environment I had trained myself to use was strictly command line and self-rolled scripts. I thought it was pretty flash at the time. And yet the .NET Visual Studio environment next to it is like the finest ice cream to a teaspoon of sugar. But that ain't necessarily a good thing.
It took me a while to realise that learning a programming language isn't the hard part; it's learning the API or object model of the environment that takes the real work. I spent many hours trawling through the Amiga OS docs, marvelling at all those functions and looking forward to the day when I'd have a project that would require me to learn how to use them. Going from VBA v5 (Excel/Access 97), which at the time I thought insurmountable, to VBA/VB v6 to .NET VB/C# has been a continuous, seemingly vertical learning 'curve'. The funny thing is, I discovered today that it takes just two lines of VB.NET code to open and post to a system-wide Event Log of my own creation. A few days ago, I found out that it takes about the same amount of code to find out the name of the function that called the current function. These are not simple undertakings, or at least I wouldn't have thought so. So what are we losing?
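For the curious, here's roughly what those two discoveries look like. This is a sketch from memory rather than the exact code I wrote; the log and source names ("MyAppLog", "MyApp") are made up, and the custom source has to be registered once (with sufficient rights) before anything can be written to it.

```vb
Imports System.Diagnostics

Module LogDemo
    Sub Main()
        ' One-time registration of a hypothetical custom source and log.
        If Not EventLog.SourceExists("MyApp") Then
            EventLog.CreateEventSource("MyApp", "MyAppLog")
        End If

        ' The two lines: open the log, post an entry.
        Dim log As New EventLog("MyAppLog", ".", "MyApp")
        log.WriteEntry("Hello from the fx")

        WhoCalledMe()
    End Sub

    Sub WhoCalledMe()
        ' Walk one frame up the call stack to name the caller ("Main" here).
        Dim caller As String = New StackFrame(1).GetMethod().Name
        Console.WriteLine("Called by: " & caller)
    End Sub
End Module
```

Compare that with walking a raw stack or talking to a system logger by hand on the Amiga, and you can see why the fx feels like cheating.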
Back on the Amiga, I started out programming in AmigaBASIC. It didn't take me long to dump that and head for C, a much more satisfying and malleable tool. 68K assembly made a brief showing too, the two seeming to be very closely related and never far away from each other. Of course that could be because of Matt Dillon's excellent DICE compiler, but I digress. I didn't move from BASIC because of speed issues, although that would have been a valid reason. I moved to get closer to the hardware and have more control over the actual code that was run. In fact, I usually took great pains to make sure things ran as efficiently as possible. It took me a long time to come across to the PC, partly because of the bloat of the programs, which seemed to be simply reeking of inefficiency. On the Amiga, I could write a "hello world" program, in C, that compiled and linked to a 400-byte executable. The smallest program I could write on a PC was something like 16KB. Is it any wonder a 7 MHz Amiga felt like a 33 MHz 386?
Is David (a colleague) right to continue to write and debug his code in an environment that is little more than a fancy shell around a basic text editor? Is the frustration I feel when I see him racking his brains trying to hunt down a bug that simply would not happen in a more sophisticated environment proportional to the level to which I've been spoilt by the bells and whistles? Is he really disadvantaged by using an environment that does not dangle under his nose the spoils of the fx, such as Remoting and TimeSpan types? I wonder. Unbeknownst to him, is he not 'paying his dues'?
In my defence, I think I've paid most of my dues, and discovering all these new 'toys' in the fx and CLR has renewed my enthusiasm. These days, I feel more like a conductor than one of the instrumentalists as far as programming is concerned. In fact, I think the term 'programmer' should be struck from the current language, 'software developer' being much better suited to the actual work. And besides, I can fire up WinUAE anytime and relive those days when you got two fifths of fuck all for free but the rewards were perhaps more valued.