In my last article, I explained my view on why we need to constantly upgrade our computers. One vital thing was missing from that view, however: the effect of "backwards compatibility" on the evolution of computing. To explain further, we have to go back in time, so let's fire up the old TARDIS and take a look-see (vworrrp, vworrrp, vworrrp...)
In the Eighties, the world (in computing terms) was split into mainframes (think cupboard-sized computers), minis (sideboard-sized computers), and micros (desktop computers). Micros were regarded mostly as toys, so the main development was in the larger platforms, but home users started to get in on the act, and in the early Eighties, 8-bit machines like the BBC Micro and Sinclair ZX81 were the norm. By the mid Eighties, things had changed significantly, with the arrival of two things. The first was the IBM PC, a 16-bit personal computer (limited by an 8-bit data bus), designed for business by the biggest mainframe maker in the business - IBM. The IBM PC started off life as little more than a "dumb terminal" to connect to IBM mainframes, but companies soon realised that this machine could be more productive, and not long after its inception, software began to appear for it - business software that would eventually lead to a revolution in computing, and eventually give us the most powerful monopoly next to Disney - yup, you guessed it, Microsoft... As time progressed, the Intel 8086 processor used as the PC's "brain" had extra bits bolted on to give more power (and a new name each time: 80286, 80386, 80486, Pentium and so on), whilst still keeping the ability to run software designed for the older parts - this backwards compatibility became the Intel/Microsoft combination's biggest selling point...
The second event of the Eighties was much more significant - the development of processors with a 32-bit internal architecture (initially paired with a 16-bit bus). The PC ran a chip by a company called Intel, but other manufacturers were developing much more powerful chips that were far easier to program than the stupid Intel 8086. Motorola gave us the 68000 range of chips, the power of which can be seen if you compare the IBM PC with the home computers of the time - the Atari ST, Commodore Amiga and Apple Macintosh, all of which were far superior and could graphically out-perform a PC with ease. A war ensued in the late Eighties and early Nineties between machines based on the far superior Motorola chips and the Intel-based computers...
As we move forward, the outcome of that war is evident - Intel/Microsoft became dominant across the world, as did the 8086 and its progeny, all the way up to the Intel dual-core and quad-core processors used today. The only survivor of the Motorola camp was Apple, which continued to use Motorola-based processors until recently, when (for cost reasons) it switched to the same Intel processors.
Why did Intel/Microsoft win the war? It certainly wasn't through technical superiority - the Motorola chips were much more powerful, and easier to program, than Intel's. No, the outcome of the war came down to the fact that, for the first time in history, the SOFTWARE / USER combination was the driving force. In the beginning, the two camps were split by usage: the Intel camp was purely business based, while the Motorola camp was mostly home based, with some business use (mainly in areas that required advanced graphics capabilities). At first, business users didn't need that kind of graphical power - what they wanted was a number cruncher and word processor that could store its information. The IBM PC had a neat trick up its sleeve - some of its parts were modular, which meant that you could upgrade it easily. Need better expansion ports? Add a card! Need a better graphical display? Change the graphics card! Need more storage? Change the hard disk! This expansion capability led to something else - other manufacturers could use the same parts to create their own version of an IBM PC! Manufacturers like Compaq and Packard Bell were soon making the first IBM PC "clones", which ran the same operating system (DOS), ran the same programs, and even used the same expansion boards! Business users kept pushing the envelope of what was possible, and what was wanted. Many more clones appeared...
Over at the technically superior Motorola camp, there was no equivalent business push. There was plenty of innovation and technical wizardry (much of which would eventually be picked up by the business market, but implemented on Intel-based PCs). Many of the Motorola camp's computers of the time were staggeringly good, but the business market looked at what it already had (many more Intel-based PCs around, and many more en route), and market forces went the "PC" way. The Intel/Microsoft camp's expandability, combined with its software's backwards compatibility, triumphed over a technologically superior force...
As we materialise back into the present, we realise that ALL PCs (including Macs) now use chips based on the original Intel 8086 design, albeit a design that has been significantly added to, with new capabilities bolted on almost every day - a design that is acknowledged as "difficult to program" compared to most other chip architectures. In summary, then, the majority of us are using computers based on a chip design that has thrown nothing away, and is bloated, inefficient, and difficult to program...
It boggles the mind!
(By the way, if you want to see how inefficient your own computer is, I suggest you take a look at the Raspberry Pi, a credit-card-sized computer that can do much of what a PC can, and do it more cheaply.)
However, things may well soon change. Recently, the same software/user combination that created the PC-dominated world we live in has started to evolve. No longer is the business market the driving force; instead, the individual user market is starting to take hold. Single-user devices such as smartphones, music players and tablets are coming to the fore like never before, and unlike the PC, these devices are not reliant on old chip technology - instead, they use a new breed of modern CPUs that are cheaper to manufacture as well as powerful. Software is slowly moving away from the monolithic ideas of the past, and is slowly morphing into the "app" model - cheap, powerful programs that are just a click away. Where the computer was once the focus, now it is the user who is the focal point. People still argue over which is better - PC or Mac - but that war is over. Another war is on its way: the war between Apple's iOS and Android. And who will win this one? The technically superior Apple line, or the more adaptable Android camp? I know who my money's on...