When I used to lecture at uni on programming concepts and computer history, I was often amused by students' insistence on only using the newest way of doing something, because it was new and therefore obviously better than the old way. I'm not having a go at anyone. Everyone is entitled to their own opinion, and theirs is no less valid than mine; if we all agreed on everything there would be no room for discussion and the world would be an even more boring place than it already is.
I'm not suggesting that anyone actually use the historical techniques I describe, and I never would; as a lover of programming history, I post them because understanding old techniques can give you ideas for improving your newer code. The current example might make you think about where you could remove conditionals or redesign your code to speed up execution. In OO you certainly don't need to modify running code to do that, and where operating system protections now prevent it, it's impossible anyway. But knowing that it was once possible can make you think.
I would ask new students a question: "A client has asked you to write a large business application, so what are you going to write it in?" Invariably I'd get whatever was the latest and greatest programming language at the time, when in fact the correct answer was, "whatever tool is best suited to meeting the client's requirements and getting the project completed in the desired time frame, regardless of the language." If you needed to use a RAD tool built on, to the students' horror, BASIC or Pascal, then that's what you'd use.
When I moved on to a company called QCOM in 1999, at that point Australia's oldest software house at over 30 years old, I was responsible for coding some of Queensland Rail's train monitoring systems, as well as for graduate recruitment. QCOM had a lot of very well-thought-out QA and coding procedures that many new graduates couldn't cope with, because those procedures relied on old, tried, tested and proven ways of doing things. Some graduates soon left because they didn't want to learn the old ways. In 2000 QCOM was acquired by Unisys, and I spent almost the next 20 years being contracted out to code for some big names; I can honestly say we didn't always use the latest techniques or technologies.
Now, it's no secret that I'm not much of a lover of object-oriented coding. I like procedural code, and I particularly like zero-level assembler, where everything is global: there is no scope, no dynamic memory management, really no memory management of any kind. All variables are static, with their addresses and storage allocated at compile time. The only dynamic memory you have is the stack, which you can push items onto and pop them off at will, and if the heap grows until it crashes into the stack, everything goes down really quickly.
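As a minimal sketch of that model, in C rather than assembler purely for readability: every variable lives at an address fixed at compile/link time, and the call stack is the only storage that comes and goes.

```c
#include <stdio.h>

/* Everything global and static: the linker fixes these addresses
 * before the program ever runs. No heap, no malloc, no free. */
static int  counter;
static char buffer[256];

static void tick(void)
{
    /* The stack is the only dynamic memory: 'scratch' is pushed on
     * entry and popped on return, much like PUSH/POP in assembler. */
    int scratch = counter * 2;
    counter = scratch + 1;
}

int main(void)
{
    for (int i = 0; i < 3; i++)
        tick();
    printf("counter = %d, buffer lives at %p\n", counter, (void *)buffer);
    return 0;
}
```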
Now I'm reminded of an interesting example of choosing an old method over a new one. In 1999 id Software released Quake III Arena. By then we had Pentium II, Pentium III, Celeron and Xeon processors, all with on-chip floating point units, and the 486DX had been released back in 1989, so consumer-level FPUs had been around for a decade. So why did id use a software implementation of a fast inverse square root instead of the FPU? Broadly, speed: an x87 divide and square root each cost tens of cycles, while the trick needs only an integer shift, a subtraction and a few multiplies, and an error of around one percent is fine when you're normalising thousands of lighting vectors per frame. The FISR algorithm, as it's now known, is legendary. It works, strangely, by reinterpreting the bits of a floating point number as an integer, shifting that right by one and subtracting it from the hexadecimal constant 0x5F3759DF, then reinterpreting the result as a float to get an estimate of the reciprocal of the square root, which a single Newton-Raphson step then sharpens up. If you look into the history, no one is one hundred percent sure who wrote the code; id attributed it to one person, but the algorithm can be traced back to SGI around the introduction of the Silicon Graphics Indigo in about 1991, and no one knows who came up with the constant. If you're interested in a really good mystery, reading up on the FISR algorithm is fascinating. You could also watch a video about it on the YouTube channel Dave's Garage, hosted by Dave Plummer, an ex-Microsoft programmer responsible for such things as MS-DOS 6.2, Task Manager, Zip Folders, the dreaded Windows activation code and much, much more…
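Here's the routine itself, essentially as it appears in the released Quake III source, with two small modernisations on my part: the original's pointer-cast bit tricks (and its `long`, which is no longer 32 bits everywhere) are replaced with `memcpy` into a `uint32_t`, which compiles to the same thing without the undefined behaviour.

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Fast inverse square root: estimate 1/sqrt(x) from the float's bit
 * pattern, then refine with one Newton-Raphson iteration. */
static float q_rsqrt(float number)
{
    const float threehalfs = 1.5f;
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);            /* read the float's bits as an integer */
    i = 0x5F3759DF - (i >> 1);           /* the magic constant: first estimate  */
    memcpy(&y, &i, sizeof y);            /* back to a float                     */

    y = y * (threehalfs - (x2 * y * y)); /* one Newton-Raphson step             */
    return y;
}

int main(void)
{
    /* Compare the approximation against the library answer. */
    const float tests[] = { 0.25f, 1.0f, 2.0f, 10.0f, 123.456f };
    for (size_t k = 0; k < sizeof tests / sizeof tests[0]; k++) {
        float v = tests[k];
        printf("x = %8.3f  FISR = %.6f  libm = %.6f\n",
               v, q_rsqrt(v), 1.0f / sqrtf(v));
    }
    return 0;
}
```

With that single Newton step the relative error stays under about 0.2 percent, which is more than enough for normalising lighting vectors.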
As you can tell, I'm more than a little passionate about how the old ways can teach us new ways of doing things. I hope this little trip through some of my past, and the reasons I am the way that I am, hasn't been too tiring for everyone…
TC