Technology life-cycle (interpret – compile – interpret…)


I’ve been involved with technology long enough to recognize that most of it follows what appear to be well-defined cycles.

With the exception of machine language, which I used to enjoy working with because it allows for intimate contact with the underlying hardware, software development, or more specifically computer languages, seems bound to follow cyclic patterns.

When I started, back in the late ’70s, it was either BASIC (most people don’t know that it stands for “Beginner’s All-purpose Symbolic Instruction Code”) or machine language. The former was interpreted on the fly; at the time, at least in the case of the Apple II, the interpreter was an integral part of the operating system, which, in turn, lived in the computer’s ROM. The latter was, well, machine language; not much you could do about or with it, as it operated the CPU directly.

Back then, software seemed to advance more rapidly than hardware, and as a result, interpreting a high-level language in order to run a program was noticeably slower. By slower I mean that the user noticed, and this is important: once speed became a consideration in human terms, there was ample incentive to figure out ways to speed things up as much as possible, and that’s how compiled languages gained popularity over interpreted code in no time. The ability to hide the original source code (the crown jewels of programmers all over) was another reason for the spike in popularity of compiled programs.
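To make that speed gap concrete, here is a minimal sketch, using Python as a stand-in (a language the post doesn’t mention, and the function names are my own, purely for illustration): the same summation done by an explicitly interpreted loop versus a builtin whose loop runs as compiled C inside the interpreter.

    # Hypothetical illustration: interpreted loop vs. compiled builtin doing the same work.
    import timeit

    def interpreted_sum(n):
        total = 0
        for i in range(n):      # every iteration is dispatched by the interpreter
            total += i
        return total

    def compiled_sum(n):
        return sum(range(n))    # the loop runs inside the compiled sum() builtin

    n = 1_000_000
    print("interpreted loop:", timeit.timeit(lambda: interpreted_sum(n), number=10))
    print("compiled builtin:", timeit.timeit(lambda: compiled_sum(n), number=10))

The interpreted loop typically comes out several times slower; on late-’70s hardware, that kind of gap was exactly what users felt directly.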

Recently (in the last two or three years, maybe), however, interpreted languages have started regaining popularity.

I think this is due to two factors. First, hardware is now fast enough that we humans can no longer notice any delay in response or user-interface rendering, which takes care of the first reason compilers became so popular in the ’80s and ’90s. Second, many of these interpreted languages are used to drive web-based or SaaS applications, and in those cases the code is inherently hidden from prying eyes (i.e., competitors).

According to Robert Cringely (link to article), this is bound to change once solid-state storage becomes pervasive enough. With access speeds several orders of magnitude higher than current technology (spinning hard drives), delays, he implies, will once again become noticeable enough to prompt programmers to vie for speed. But I digress.

It’s one thing to notice a menu being drawn on-screen. It’s a whole ’nother thing to wait five seconds instead of one for a web site to respond to a database query.

While the first is obviously unacceptable from a design perspective, cell phones have shown, indisputably, that users do have patience after all. I don’t know how else to explain why NO ONE complains about the fact that 20 years ago the damn things powered up immediately, while now it takes close to a minute before we can place a call.

Of all the gadgets that surround us, the only two I can think of that are really usable moments after powering them on are cameras and cars (there are probably more, but it’s late and I’m tired). For everything else, we wait.
