Wednesday 19 December 2012

Raspberry Pis, Android, and the (mis)shape of things to come...

In my last article, I explained my view on why we need to constantly upgrade our computers.  One vital thing was missing from that view, however, and that was the effect of "backwards compatibility" on the evolution of computing.  To explain further, we have to go back in time, so let's fire up the old TARDIS and take a look-see (vworrrp, vworrrp, vworrrp...)

In the Eighties, the world (in computing terms) was split into mainframes (think cupboard-sized computers), minis (sideboard-sized computers), and micros (desktop computers).  Micros were regarded mostly as toys, so the main development was in the larger platforms, but home users started to get in on the act, and in the early Eighties, 8-bit machines like the BBC Micro and Sinclair ZX81 were the norm.  By the mid Eighties, things had changed significantly, with the arrival of two things.  The first was the IBM PC, a 16-bit personal computer (limited by an 8-bit bus), designed for business by the biggest mainframe maker in the business - IBM.  The IBM PC started off life as little more than a "dumb terminal" to connect to IBM mainframes, but soon companies realised that this machine could be more productive, and not long after its inception, software began to appear for it - business software that would eventually lead to a revolution in computing, and eventually give us the most powerful monopoly next to Disney - yup, you guessed it, Microsoft...  As time progressed, the Intel 8086 processor used as the PC's "brain" had extra bits bolted on to give more power (changing its name along the way - 80286, 80386, 80486, Pentium, etc.), whilst still keeping the ability to run software designed for the older parts - this backwards compatibility became the Intel/Microsoft Windows combination's biggest selling point...

The second event of the Eighties was much more significant - the arrival of processors with a 32-bit architecture (the Motorola 68000, for instance, was 32-bit internally, with a 16-bit external bus).  The PC ran a chip by a company called Intel, but other manufacturers were developing much more powerful chips that were far easier to program than the stupid Intel 8086.  Motorola gave us the 68000 range of chips, the power of which can be seen if you compare the IBM PC with the home computers of the time - the Atari ST, Commodore Amiga and Apple Macintosh, all of which were far superior, and could graphically out-perform a PC easily.  A war ensued in the late Eighties / early Nineties between machines based on the far superior Motorola chips and the Intel-based computers...

As we move forward, the outcome of that war is evident - the Intel/Microsoft combination became dominant across the world, as did the 8086 and its subsequent progeny, all the way up to the Intel Dual-Core/Quad-Core processors used today.  The only survivor of the Motorola camp was Apple, which continued to use Motorola-based processors (the 68000 line, and later the PowerPC) until 2006, when (for cost reasons, among others) they switched to the same Intel processors.

Why did Intel/Microsoft win the war?  It certainly wasn't through technical superiority - the Motorola chips were much more powerful, and easier to program, than Intel's.  No, the outcome of the war was due to the fact that, for the first time in history, the SOFTWARE / USER combination was the driving force.  In the beginning, the two camps were split by usage - the Intel camp was purely business-based, the Motorola camp mostly home-based, with some business use (mainly in areas that required advanced graphics capabilities).  At first, business users didn't need that kind of graphical power - no, what they wanted was a number cruncher / word processor that could store its information.  The IBM PC had a neat trick up its sleeve - some of its parts were modular, which meant that you could upgrade it easily.  Need some better expansion ports?  Add a card!  Need a better graphical display?  Change the graphics card!  Need more storage?  Change the hard disk!  This expansion capability led to something else - other manufacturers could use the same parts to create their own version of an IBM PC!  Manufacturers like COMPAQ and Packard Bell were soon making the first IBM PC "clones", which ran the same operating system (DOS), ran the same programs, and even used the same expansion boards!  Business users kept pushing the envelope of what was possible, and what was wanted.  Many more clones appeared...

Over at the technically superior Motorola camp, there was no equivalent business push.  There was lots of innovation and technical wizardry (which would eventually be picked up by the business market, but implemented at a business level on Intel-based PCs).  Many of the Motorola-camp computers of the time were staggeringly good, but the business market looked at what it already had (many more Intel-based PCs around, and many more en route), and market forces went the "PC" way.  The Intel/Microsoft camp's ability to be expandable, combined with its software "backwards compatibility", triumphed over a technologically superior force...

As we materialise back into the present, we realise that ALL PCs (including Macs) are now using chips based on the original Intel 8086 design, albeit a design that has been significantly added to, with new capabilities bolted on nearly every day.  A design that is acknowledged as being "difficult to program", compared to most other chip architectures.  In summary then, the majority of us are using computers based on a chip design that has thrown nothing away, is bloated, inefficient, and difficult to program...

It boggles the mind!

(By the way, if you want to see how inefficient your own computer is, I suggest you take a look at the Raspberry Pi - a credit-card-sized computer that can do much of what a PC can, and do it cheaper.)

However, things may well soon change.  Recently, the same software/user combination that created the PC-dominated world we live in has started to evolve.  No longer is the business market the driving force; instead, the individual user market is starting to take hold.  New single-user devices, such as smartphones, music players and tablets, are coming to the fore like never before, and unlike the PC, these devices are not reliant on old chip technology - instead, they use a new breed of modern CPUs that are cheaper to manufacture, as well as being powerful.  Software is slowly moving away from the monolithic ideas of the past, and is slowly morphing into the "app" model - cheap, powerful programs that are just a click away.  Where the computer was once the focus, now it is the user who is the focal point.  People still argue over which is better - PC or Mac - but that war is over.  Another war is on its way, the war between Apple's iOS and Android, and who will win this one?  The technically superior Apple line, or the more adaptable Android camp?  I know who my money's on...






Tuesday 18 December 2012

Progress - why you HAVE to buy a new computer every so often...

Do you remember the HOVIS advert?  Cue the brass band...

"'Ee were a great computer, my Mainframe!  He could run t'payroll, t'stock system, and t'online transactions, print bloody great reports that consumed reams of green & white paper, and do all that in 256k of memory.  Kids these days - don't know they're born..."  etc, etc....

You may laugh, but this is all true - I know, I was that child.   In 1983, I started my first job - a mainframe computer operator for a large seed grower in East Anglia - and the mainframe I operated was an ICL ME29, with 256k of memory, that did all of the above.  My friend at the time (hi Tony) used a Dragon 32 computer at home - state of the art.

Fast forward to 1991 - I was still operating mainframes (although I was now a Shift Leader, in charge of other operators, as well as doing some COBOL development), and I was operating ICL VME-based systems (3960).  At home, I was using an Atari STFM with 512k of memory - a great little machine that had a graphical interface, ran great games, and let me create a newsletter for a club I was with.  In short, a brilliant little all-rounder, that had twice as much memory as my old mainframe...

...1996 - my Atari was a fond memory in the attic, and my main machine was a 486 based PC running Windows 3.11, with a mahoosive 4Mb memory...

...and now, I'm writing this on a Dual-Core PC with 8Gb of memory (that's over 2,000 times as much memory as my PC in '96 - 16 years ago).

Am I doing work that is significantly different to what I was doing 16 years ago?  No.  Do I perform my work any quicker?  No.  Is my computer any quicker?  No.  So the question needs to be asked - why, when technology has developed so far, am I no more efficient than I was 30 years ago?  I'm using more computing power now than the Apollo Space Program / Shuttle Program had at its disposal, so why am I STILL screaming at my computer to "GO FASTER, DAMMIT!"?

The answer my friends, is one simple word, and that word is...

LAZINESS.

You see, back in the old days - the Eighties and earlier - much software was written at a very low level, often hand-coded and optimised to suit the hardware available.  Try any of the computers created during this "golden" age, and you'll find that they seem to run just as fast as anything now - yes, their graphics might be a bit blocky, and their floppy access is SLOW, but you can run a game on an Atari STE now, and it'll seem just as snappy and responsive as a modern game.  This is because of optimisation.

In the late Eighties, a programming paradigm became popular - that of code libraries.  The idea is simple - say you write a piece of code that opens a text file.  You'd like to use that code again in another program, so you create a library - a repository of common code that you can use, again and again.  Libraries are a great idea, but they do have a drawback - they promote "library sprawl", a condition where the number of libraries you use grows out of control.  This sprawl is most noticeable in situations where a program needs features added quickly, and who is the worst offender when it comes to library sprawl?  Yup, it's our old friend Microsoft Windows - again...  :(
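The library idea can be sketched in a few lines of Python (the file and function names here are my own invention, purely for illustration):

```python
# A tiny illustration of the "code library" idea: write a helper once,
# then call it from any program that needs it, instead of re-writing it.

def read_text(path):
    """Open a text file and return its contents as a string."""
    with open(path, encoding="utf-8") as f:
        return f.read()

def write_text(path, text):
    """Save a string to a text file."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)

# Two different "programs" can now share the same code:
write_text("demo.txt", "reused code, written once")
print(read_text("demo.txt"))
```

In practice, these helpers would live in a separate module that dozens of programs import - and each library you pull in drags its own dependencies along with it, which is exactly how the sprawl begins.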

As an example, I was recently looking into moving the data from one user's PC to another, and I did a quick check to see how much disk space he was using.  Windows itself was taking nearly 14Gb - that's nearly 1,000 times the size that Windows 3.11 took up on my machine in 1996.  Reliance on existing code gives faster development turnarounds, but leads to mahoosive computer systems that require massive amounts of processing power, just to process these libraries.

To be fair, most commercial operating systems have the same problem - this includes OSX and Linux.  Some would say that software grows to fit hardware, but over the years, I've come to the conclusion that this is not true.  We HAVE to upgrade our systems because the developers of new software are under tremendous pressure to deliver quickly, which means they HAVE to rely on existing libraries - they have no choice.  It's a sad but true fact - our software is LAZY, which means we have to throw more computing power at it to get the speed and results we expect.

Think of it in these terms.  A fat man can't run the 100 metres, but instead of losing weight, he gets a bike - but still eats what he likes.  Two years later, he can't ride the bike any more, because he's too fat, but still needs to do the 100 metres, so he gets a motorbike - but yup, he's still tucking into those doughnuts!  Two years further on, he's now so fat that he can't keep his balance on the motorbike any more, so he gets a Ferrari-powered dumper truck to do the 100 metres in - stopping on the way to get a super-sized McHeartbreaker Burger with thick shake...

Lazy - oh yes...



Apple Vs PC - Pointless & Stupid...

In my last article, I mentioned why I wouldn't buy a Mac with my own money.  Some people have commented that I'm anti-Apple, or that I'm in the "PC Camp" in the Apple/PC Fanboi debate.

What a load of old tosh!

I've been in the IT Industry for nearly 30 years, I've used mainframes, PCs, Macs - in short, been there, done that. I use PCs at work (because that's the environment I work in), I use PCs at home (because I like the platform, and the value for money it gives me), but I don't always run Windows, because I prefer Linux.  

I've dealt with this type of technology long enough to know that every platform has good and bad points, but if you're aware of the good points, you should also not be blind to the bad.  I believe I have a balanced view on the subject, because I don't pretend that one platform is better than the other - they all have their strong points.  Apple make some lovely kit, but they do charge a bundle for it, and sometimes it doesn't play nicely with anything that isn't Apple - that doesn't mean you shouldn't buy it, just be aware.  In the same vein, PC kit can vary in quality because it's so varied, but it offers better value for money, combined with cheaper repair costs - though it can be hampered by an operating system that has evolved in strange ways, leading to sometimes weird and wonderful problems - again, that doesn't mean you shouldn't buy it, just be aware.  If you don't like Windows, try something else!

Any debate about Apple vs PC is a stupid waste of time.  If you like one, stop wasting time trying to evangelise its benefits - enjoy it for what it is: a tool.


Thursday 13 December 2012

Why I won't be buying an iMac...

As many of you know, I'm a cartoonist, as well as an IT Specialist, and many of my cartooning friends use Macs - in fact, many creatives use Apple Macs (in various guises) and will continue to do so, no matter what.  Many have tried to convince me to convert, and many times have I thought about it - and declined.

"But why do you decline?" you may ask.  "Surely their stuff is great!"

Mmmnn...

Let me start by linking a review of the new Apple iMac 21.5" 2012 version - this is for balance (!)

Ok - Apple make good products.  They are well engineered, look stunning, perform well, and provide a great user experience.  They are also expensive, difficult to repair or upgrade, and somewhat proprietary in their nature.  Back in the Nineties, I was using Atari STs at home, PCs at work.  I loved my Atari - it was quick, slick and a joy to use, but it suffered from the same blight as the iMac, as it was expensive to upgrade, and proprietary.  By 1996, I was solely PC-based, and have been ever since.

PCs may be a bit clunky, a bit large, and not as sexy as Apples, but they are cheap to upgrade and repair, and cheap to purchase - I can buy a PC that will out-perform an iMac for less money.  And for those of you who would argue that buying a PC means you're locked into that user experience, I would point out that not only can you run Windows on a PC, but you can also run Linux (my personal favourite is PCLinuxOS) or even OSX (although you're not supposed to...)

In short, a PC is cheaper to buy, cheaper to run, and just as good.  I liken the Apple/PC choice to that of buying a Ferrari or a Honda - both will get you where you want to go, but it'll be cheaper and more reliable in the Honda (and yes, I DO drive a Honda)...