Would an omnipotent God create a bad design? That’s a question that creationists who know something about biology should ponder. After all, there is no end of design flaws in biological organisms.
However, I’m going to write about bad design decisions made by humans, and in technology in particular, because that’s where we see examples every day.
I build my own computers, so one classic example of bad design for me was the way USB motherboard headers were laid out. Even today, motherboards have header pins on them so that builders can connect extra ports to the front or back of computer cases.
USB headers have always been arranged in pairs. One USB port has 4 connections: 5V, two data signals (D+ and D-) and ground. A pair can be set up either in parallel, which makes board design easier (you only need to route the signal connections separately), or in opposition (i.e. the 5V on one is next to the ground on the other and vice versa), which makes it impossible to connect the plug incorrectly (no matter which way you insert it, every wire lands on the right kind of pin). Fortunately no one ever set them up in series (the two headers in a straight line rather than side by side)!
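To make that concrete, here is a minimal Python sketch (my own illustration, not anything from the USB specification) that models the two layouts as 2×4 grids of pin functions and checks where the wires of a plug inserted backwards, i.e. rotated 180 degrees, would land:

```python
# Illustration only, not a real USB spec: model a dual-port header
# as a 2x4 grid of pin functions and see where a reversed plug lands.

def rotate_180(grid):
    """Rotate a 2-row pin grid 180 degrees (swap rows, reverse columns)."""
    return [list(reversed(grid[1])), list(reversed(grid[0]))]

def check(name, header):
    reversed_plug = rotate_180(header)  # a correctly wired plug mirrors the header
    mismatches = [(h, p)
                  for h_row, p_row in zip(header, reversed_plug)
                  for h, p in zip(h_row, p_row)
                  if h != p]
    print(name, "safe" if not mismatches else f"DANGER {mismatches}")

parallel = [["5V", "D-", "D+", "GND"],   # both ports wired identically
            ["5V", "D-", "D+", "GND"]]
opposed  = [["5V", "D-", "D+", "GND"],   # second port rotated 180 degrees
            ["GND", "D+", "D-", "5V"]]

check("parallel:", parallel)  # every 5V wire lands on a GND pin, and vice versa
check("opposed:", opposed)    # each wire lands on its own kind; the ports just swap
```

In the opposed layout a reversed plug merely swaps the two ports, which is electrically harmless; in the parallel layout every power wire lands on a ground pin.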
In the first case, which was distressingly common, putting the plug in the wrong way would damage the motherboard. At the very least you would burn out the USB circuit. The second design was clearly superior but it was rarely used.
The first case could have been corrected by putting a plastic shroud around the header pins, then putting a “key” on the plug that would fit into a notch on the shroud. This would have added perhaps a penny or so to the manufacturing cost, so naturally you only saw it on high-end motherboards.
Because both designs existed, case designers had to allow for either. There were two ways of handling this. The better, and more common, way was to split the connection into two separate 4-pin plugs, one per port, so you could orient each one correctly. Less common, but sometimes used, was to simply put a plug on each individual wire, so the user had to attach 8 individual wires to the USB header. WTF!
It took a very long time, but eventually the industry came up with a solution. Board designers added a second ground pin to their designs so that the two rows of the header were asymmetrical: one row was one pin longer than the other. Case manufacturers switched to a 10-hole plug with one hole blocked, so the plug could only fit over the header one way.
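The same kind of toy model (again my own illustration, not the actual header specification) shows why the blocked hole acts as a key: the plug fits only when every populated pin position meets an open hole, and rotating the plug moves the blocked hole onto a live pin:

```python
# Illustration of mechanical keying: a 2x5 header with one position
# unpopulated, and a 10-hole plug with the matching hole blocked.

HEADER = [True] * 10   # True = a pin is present at this position
HEADER[8] = False      # one position left vacant, making the rows asymmetrical
PLUG = [True] * 10     # True = the hole is open
PLUG[8] = False        # the matching hole is blocked

def rotate_180(flat):
    """Rotate a flat, row-major 2x5 grid by 180 degrees."""
    rows = [flat[:5], flat[5:]]
    return list(reversed(rows[1])) + list(reversed(rows[0]))

def fits(pins, holes):
    """The plug fits if every pin meets an open hole."""
    return all(hole for pin, hole in zip(pins, holes) if pin)

print(fits(HEADER, PLUG))              # True: correct orientation
print(fits(HEADER, rotate_180(PLUG)))  # False: the blocked hole hits a live pin
```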
I doubt this was significantly cheaper to design or manufacture than the 8-pin methods used earlier, but it allowed everyone to adopt a new standard without branding either of the older designs as wrong. It “saved face” for everyone.
A slightly different problem arose when video cards started requiring more power than could be provided through the motherboard slots. The original solution was to use the increasingly underused floppy disk power connector. This connector was on every power supply (there were almost always 2 of them) and floppy drives were becoming obsolete so there was no real downside to using one.
This was a short-lived solution as the power demands of video cards climbed. Floppy power connectors were soon replaced by a shortened version of the motherboard power connector. Instead of 20 lines, it had only 6 – 3 power and 3 ground. Thanks to the design of the motherboard connector, it couldn’t be plugged in the wrong way. Like the motherboard connector, the +12V connections were away from the locking clip.
Over time this has increased in size to an 8-pin version.
Unfortunately, the demands for power on the motherboard were also climbing. This was met by extending the motherboard power connector by 4 pins and also by adding a separate 4-pin power connector (usually plugged in near the CPU). Like the video card power connector, this too was a shortened version of the motherboard power connector. Over time it was also extended to an 8-pin version.
Unlike the other two connectors, however, this new connector had the +12V connections on the same side as the locking clip.
You can see where this is going – two connectors with the same physical size and shape, capable of being plugged into two different sockets with different electrical expectations. Plug the wrong one in and you fry the circuitry.
This was, in effect, a monumentally stupid design decision. Moreover, there was no reason for it. The original connectors were simply providing power and had no sensing capability. Had they delivered power with the same pinout, no harm would have been done to either the video card or the motherboard.
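Here is a sketch of the conflict, following the orientation convention described above (+12V opposite the latch on the video card plug, +12V on the latch side of the CPU plug). The real connectors also carry sense pins and their details varied over time, so treat this as an illustration rather than wiring documentation:

```python
# Model each 8-pin plug or socket as two rows of four wires, keyed by
# whether the row is on the latch side. Deliberately simplified.

PCIE = {"latch_row": ["GND"] * 4, "far_row": ["+12V"] * 4}  # video card power
EPS  = {"latch_row": ["+12V"] * 4, "far_row": ["GND"] * 4}  # CPU power

def conflicts(plug, socket):
    """Positions where the wire voltage disagrees with what the socket expects."""
    return [(row, i)
            for row in ("latch_row", "far_row")
            for i, (have, want) in enumerate(zip(plug[row], socket[row]))
            if have != want]

print(conflicts(PCIE, PCIE))  # [] - matching plug and socket, no conflicts
print(conflicts(EPS, PCIE))   # all 8 positions conflict: +12V lands on ground pins
```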
You can see examples of the various power connectors at http://www.playtool.com/pages/psuconnectors/connectors.html. The yellow wires are +12V. You can do a web search for explanations of the various connections.
Let’s turn our attention now to something more familiar – television boxes. The first high definition (HD) television standard was rolled out back in 1996. It used 1080i (interlaced), which is still the broadcast standard for HD. Prior to that, the standard was 480i (SD), which had been set back in the 1940s. While there are some 4k transmissions today, any HD TV has to be able to handle 1080i.
DVDs have been available since the standard was set in 1995, and they specify 480i, the same as SD television. The frame is 720×480 pixels, a 1.5:1 grid slightly more square than the 1.77:1 HD ratio; widescreen discs stretch that grid anamorphically to fill a 16:9 screen.
There is also 1080p (progressive scan), which delivers twice the data rate, painting a complete screen 60 times per second instead of the 30 full frames per second (drawn as 60 interlaced fields) of 1080i. Few people can detect the difference, but when Blu-ray became the de facto HD disc standard, it allowed 1080p.
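The factor of two is simple arithmetic, ignoring blanking intervals and colour encoding:

```python
# Rough pixel rates for the two 1080-line formats.
width, height = 1920, 1080

rate_1080i = width * height * 30  # 60 interlaced fields/s = 30 full frames/s
rate_1080p = width * height * 60  # 60 complete frames/s

print(f"1080i: {rate_1080i / 1e6:.1f} Mpixels/s")  # ~62.2
print(f"1080p: {rate_1080p / 1e6:.1f} Mpixels/s")  # ~124.4, double the data
```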
Recapping: every TV ever made must be able to handle 480i, because that is what DVD players output and what is still used in a minority of television broadcasts. Every HD TV must be able to handle 1080i, because that is still the broadcast standard.
So let’s look at some “smart TV” adapters – small boxes with internet capability (wired or Wi-Fi). A lot of them give you two output options – 720p and 1080p. Some even add 4k (2160p). This is another WTF moment. Only the 4k option even matches a format TVs are required to handle, and that only applies to very new TVs. The vast bulk of TVs in use aren’t 4k.
So owners of standard-definition and HD TVs frequently can’t use these boxes to turn their televisions into smart TVs. Conversely, almost all TVs sold today already have smart capability, so they don’t need these boxes. In particular, finding a 4k TV without smart capability is hard to do.
It leaves me scratching my head about the market research the manufacturers conducted. They seem to have deliberately relegated themselves to a niche market: adding smart capability to those few TVs that handle 720p or 1080p but don’t already have it. The far larger number of 480i and 1080i TVs is ignored.
Modern TVs have adopted a standard developed for computer monitors (EDID) whereby the display notifies the device it is attached to of the video formats it can handle. This doesn’t always work, but when it does, it makes life easier for the end user.
Let’s look at what happens when it doesn’t work. I recently acquired a used PVR to connect to my home theatre system. Most of the system is far from state of the art. My main speakers are around 40 years old, while my AV receiver pre-dates HDMI and uses component video instead. (Component video is as good as HDMI for HD material, but uses a more complicated cable that can be connected wrong if you don’t follow the colour coding. It also doesn’t carry audio, so you need a stereo audio cable as well.)
Fortunately the PVR box has component video output. In theory I should just be able to plug it in and have it work (after getting it authorized by my cable TV provider). Unfortunately, the designers of the box decided to default to having the box negotiate the output resolution with the TV. The result was that I got nothing useful on my screen because, when negotiation failed, the box defaulted to a non-standard mode that my 10-year-old television didn’t support.
Neither did a newer, small TV that I had set up in my exercise room.
Both of these TVs were from quality manufacturers (Panasonic and Sony), so the problem isn’t that they were made on the cheap. The problem is that auto-negotiation isn’t reliable. The connecting device needs to be able to fall back to something every TV can display.
I’m not sure whether the smart TV adapter and the PVR fell back to 720p or to 1080p, but neither format is supported by all TVs.
The obvious solution would be to fall back to 480i, a format supported (as I mentioned earlier) by every North American TV ever sold, and certainly by everything made since the DVD standard was set in 1995. So why drop back to a mode that cannot be guaranteed to be available?
In fact, why go through auto-negotiation at all? Why not simply display a 480i picture and ask the user to set the output mode? You could even default to an “auto” setting that falls back to 480i if no confirmation arrives within 15 seconds, the way some computer operating systems do.
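As a sketch, the logic could look something like this (a hypothetical pseudo-driver of my own devising, not any real set-top box API):

```python
import time

SAFE_MODE = "480i"       # the one mode effectively every TV can display
CONFIRM_TIMEOUT_S = 15

def pick_output_mode(negotiate, show_prompt, confirmed):
    """negotiate() returns a mode string, or None on failure; confirmed()
    polls for a user keypress acknowledging that the picture is visible."""
    mode = negotiate() or SAFE_MODE       # negotiation failed? go safe
    if mode == SAFE_MODE:
        return mode
    show_prompt(f"Keep {mode}? Press OK within {CONFIRM_TIMEOUT_S} seconds")
    deadline = time.monotonic() + CONFIRM_TIMEOUT_S
    while time.monotonic() < deadline:
        if confirmed():                   # the user can actually see the prompt
            return mode
        time.sleep(0.2)
    return SAFE_MODE                      # no answer, so the screen is probably blank
```

Had my PVR shipped with anything like this, it would have come up at 480i on both of my TVs instead of showing nothing.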
In the end, I had to connect my PVR to a TV that handled 1080p so I could change the display setting from auto to 1080i. Fortunately the PVR kept that setting while powered off as I moved it between TVs.
Modern automobiles are another case in point for bad design. In recent years they have replaced the myriad knobs, buttons and switches with a small touch-screen control console – usually in the centre of the dash.
It doesn’t take a genius to realize that the last thing you want is for the driver to take their eyes off the road to adjust something. Yet that is exactly what any function controlled by a touch screen requires. In my car I can switch the A/C on or off, or change its temperature, with the push of a button or the turn of a knob, needing only a fraction of a second to make sure my finger is on the right control; modern cars require you to navigate menus before you even find the right control.
Moreover, there is no tactile feedback to let you know you succeeded. You need to keep your eyes on the screen to verify that the setting was properly changed.
Star Trek fans will recognize this. The captains and navigators had physical buttons and levers; only the people not responsible for the speed and direction of the ship used touch displays. Apparently sci-fi writers did a better job of designing controls than automotive designers, even though the writers’ decisions carried no potentially fatal consequences.
Intelligent design of technology requires giving a little thought to your design decisions and then testing them before embedding them in products. Apparently human intelligent designers aren’t too different from the cosmic intelligent designer creationists believe made all those design errors in biological organisms.