DAVE'S LIFE ON HOLD

Hardware Culture for the Software Guy

Some days I realize I'm one of the lucky few who get to play with both hardware and software at a very low level. Most application programmers these days are so divorced from the reality of the hardware they touch every day that they forget, or simply remain unaware of, the magic behind modern electronics. And by magic, I mean data sheets!

Data sheets like this one for a family of operational amplifiers are amazing pieces of documentation. They include more than enough information to predict how a hardware component will function in production across a wide range of inputs and outputs. Not only that, but the data sheets provide reliability guidelines indicating the operating conditions within which a particular module is most reliable. Finally, because hardware must contend with the messy realities of manufacturing processes and quality control, each component has its attributes quantified with tolerances.

Software documentation doesn't even begin to compare. Most code is not only undocumented, but entirely uncharacterized. In software, we have a "benchmarks" industry that really only supports marketing. You can't go to market with a set of software modules from different vendors and, based on the available benchmark data, expect anything but a complete disaster. It is easy enough to test and measure the behavior of software components with respect to a number of measurable quantities, yet no suppliers do this work. The absence of this culture of data sheets in software is why there is no real market for "software component makers".
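
To make that concrete, here's a minimal sketch of the kind of characterization harness a component vendor could ship alongside a module. Everything in it is hypothetical: process() is a stand-in for whatever component is under test, and the input sizes and iteration counts are arbitrary.

    public class ComponentDataSheet {
        // Time a workload and report operations per second.
        static double opsPerSecond(Runnable workload, int iterations) {
            long start = System.nanoTime();
            for (int i = 0; i < iterations; i++) workload.run();
            return iterations / ((System.nanoTime() - start) / 1e9);
        }

        // Hypothetical component under test: a stand-in for a vendor module.
        static int process(byte[] input) {
            int checksum = 0;
            for (byte b : input) checksum += b;
            return checksum;
        }

        public static void main(String[] args) {
            // Characterize throughput across a range of operating conditions,
            // the way a data sheet quantifies a part across its input range.
            for (int size = 1_000; size <= 1_000_000; size *= 10) {
                byte[] input = new byte[size];
                double ops = opsPerSecond(() -> process(input), 1_000);
                System.out.printf("input %,9d bytes: %,12.0f ops/s%n", size, ops);
            }
        }
    }

Run that across a matrix of input sizes, heap sizes, and thread counts, and the output is the beginning of a tolerance table. Nothing about this is hard; it just isn't done.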

Imagine a world in which Object Oriented programming won. Imagine a world in which a hugely popular VM became a standard platform across which many different languages ran. Imagine a world where instrumentation existed to dynamically trace resource utilization of each software component. Imagine a world where software could be dynamically optimized and de-optimized for a given processor architecture based on heuristics at run time.

Ok?

That world has been with us since 1995. We have all of the technology necessary to build highly reusable software components. Yet when was the last time you saw a piece of Java code with documented performance metrics and runtime characterization? Look at the documentation for JSON in Java. Find a single table concerning runtime characteristics. Find a single performance graph. Find a single reference to I/O bandwidth, transactions per second, GCs per GB of JSON, CPU cycles per MB of JSON, RAM per MB of JSON. You'll see nothing, nothing at all. The reference implementation of JSON for Java has no characterization data at all. It is a reference in spirit only, a yardstick in name only.
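
Producing those numbers wouldn't even be hard. Here's a minimal sketch of a characterization run, assuming the org.json reference implementation is on the classpath; the sample document and iteration count are arbitrary, and a real data sheet would use a proper harness like JMH rather than this naive loop.

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;
    import org.json.JSONObject;

    public class JsonDataSheet {
        // Sum collection counts across all GC beans (a bean may report -1
        // if its count is undefined, so skip those).
        static long gcCount() {
            long total = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                long n = gc.getCollectionCount();
                if (n > 0) total += n;
            }
            return total;
        }

        public static void main(String[] args) {
            // Build a sample JSON document of roughly 1 MB.
            StringBuilder sb = new StringBuilder("{\"items\":[");
            while (sb.length() < 1_000_000) sb.append("{\"id\":1,\"name\":\"widget\"},");
            sb.setLength(sb.length() - 1); // drop the trailing comma
            sb.append("]}");
            String doc = sb.toString();
            double mb = doc.length() / 1e6;

            long gcBefore = gcCount();
            long start = System.nanoTime();
            for (int i = 0; i < 100; i++) new JSONObject(doc); // parse and discard
            double seconds = (System.nanoTime() - start) / 1e9;
            long gcDelta = gcCount() - gcBefore;

            System.out.printf("parse throughput: %.1f MB/s%n", 100 * mb / seconds);
            System.out.printf("GC cycles per %d MB parsed: %d%n", Math.round(100 * mb), gcDelta);
        }
    }

A handful of numbers like these, published per release, would be the beginnings of a real data sheet.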

I was loading a website the other day from a cafe, and the site kept failing to load. A little digging revealed that I was getting ~1Mbps download speeds (134kB/s), and as a result I was only able to download the first 68k lines of the file (~1.3MB). Using curl I was able to download the full 2MB file, all 78k lines of JavaScript. To put that in perspective, the source code to DOOM was 59k lines of C, including headers, sound card drivers, serial drivers, and IPX network drivers. Even assuming the web application is at least as complicated as DOOM (highly unlikely), the fact that such an error condition could be reached shows a basic misunderstanding of the interaction between file size, bandwidth, and time. It becomes self-evident that the software developers not only fail to measure critical performance criteria, but are unaware of the conditions that can cause their applications to fail. This is an example of "works for me" at its finest.
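
The back-of-the-envelope math makes the failure obvious. Assuming a steady 134 kB/s:

    2 MB   / 134 kB/s ≈ 15 seconds to fetch the whole file
    1.3 MB / 134 kB/s ≈ 10 seconds before the transfer died

A connection that only survives ten seconds on a cafe network is not an exotic failure mode, and anyone who had done this division would have seen it coming.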

Much of what passes for software engineering is nothing of the sort. The reasons for this farce lie in some of the popular myths of software development. First, there is an economic model which treats software development as a sunk cost, which leads to a fallacious escalation of commitment based on prior investment. This is why most people continue to run Windows even when recent versions of Windows no longer support a considerable amount of software that is only 10 years old. (Try to run a DirectX 8 app on Windows 8 or newer.) The second reason lies in social inertia: the more popular an idea becomes, the more difficult it is to change course over time. This is why most programming languages look like Algol (C is just Algol with more squiggles); there is too much cultural inertia to overcome the Algol family tree heritage. The third reason lies in the mistaken notion that software is ephemeral. If software is ephemeral, why does the NYC MTA still run IBM OS/2 Warp 3.0, which is 21 years old? Why is the FAA running 40 year old air traffic control systems? The answer is that software isn't ephemeral; it is as long lived as the system of which it is a part. For example, NASA still maintains the software on Voyager 1, which is older than almost half of all professional programmers!