In my last article, I explained my view on why we need to constantly upgrade our computers. One vital thing was missing from that view however, and that was the effect of "Backwards Compatibility" on the evolution of computing. To explain further, we have to go back in time, so let's fire up the old TARDIS and take a look-see (vworrrp, vworrrp, vworrrp...)
In the Eighties, the world (in computing terms) was split into mainframes (think cupboard-sized computers), minis (sideboard-sized computers), and micros (desktop computers). Micros were regarded mostly as toys, so most development happened on the larger platforms, but home users started to get in on the act, and in the early Eighties, 8-bit machines like the BBC Micro and Sinclair ZX81 were the norm. By the mid Eighties, things had changed significantly, with the arrival of two things. The first was the IBM PC, a 16-bit personal computer (limited by an 8-bit bus), designed for business by the biggest mainframe maker in the business - IBM. The IBM PC started off life as little more than a "dumb terminal" to connect to IBM mainframes, but companies soon realised that this machine could be more productive, and not long after its inception, software began to appear for it - business software that would eventually lead to a revolution in computing, and eventually give us the most powerful monopoly next to Disney - yup, you guessed it, Microsoft... As time progressed, the Intel 8086 processor used as the PC's "brain" had extra bits bolted on to give more power (and which changed its name, to 80286, 80386, 80486, Pentium and so on), whilst still keeping the ability to run software designed for the older parts - this backwards compatibility became Intel and Microsoft Windows' biggest selling point...
The second event of the Eighties was much more significant - the arrival of processors with a 32-bit architecture. The PC ran a chip by a company called Intel, but other manufacturers were developing much more powerful chips that were far easier to program than the stupid Intel 8086. Motorola gave us the 68000 range - a 32-bit design (initially sitting on a 16-bit external bus) - the power of which can be seen if you compare the IBM PC with the home computers of the time: the Atari ST, Commodore Amiga and Apple Macintosh, all of which were far superior, and could graphically out-perform a PC easily. A war ensued in the late Eighties / early Nineties between machines based on the far superior Motorola chips and the Intel-based computers...
As we move forward, the outcome of that war is evident - the Intel/Microsoft combination became dominant across the world, as did the 8086 and its subsequent progeny, all the way up to the Intel Dual Core / Quad Core processors used today. The only survivor of the Motorola camp was Apple, which continued to use Motorola-based (and later PowerPC) processors until recently, when (for cost reasons) they switched to the same Intel processors.
Why did Intel/Microsoft win the war? It certainly wasn't through technical superiority - the Motorola chips were more powerful and easier to program than Intel's. No, the outcome of the war was due to the fact that, for the first time in history, the SOFTWARE / USER combination was the driving force. In the beginning, the two camps were split by usage - the Intel camp was purely business based, the Motorola camp was mostly home based, with some business use (mainly in areas that required advanced graphics capabilities). At first, business users didn't need that kind of graphical power - no, what they wanted was a number cruncher / word processor that could store its information. The IBM PC had a neat trick up its sleeve - some of its parts were modular, which meant that you could upgrade it easily. Need some better expansion ports? Add a card! Need a better graphical display? Change the graphics card! Need more storage? Change the hard disk! This expansion capability led to something else - other manufacturers could use the same parts to create their own version of an IBM PC! Manufacturers like COMPAQ & Packard Bell were soon making the first IBM PC "clones", which ran the same operating system (DOS), ran the same programs, and even used the same expansion boards! Business users kept pushing the envelope of what was possible, and what was wanted. Many more clones appeared...
Over at the technically superior Motorola camp, there was no equivalent business push. There was lots of innovation & technical wizardry (which would eventually be picked up by the business market, but implemented at a business level on Intel-based PCs). Many of the Motorola-camp computers of the time were staggeringly good, but the business market looked at what it already had (many more Intel-based PCs around, and many more en route) and market forces went the "PC" way. The Intel/Microsoft camp's expandability, combined with its software "backwards compatibility", triumphed over a technologically superior force...
As we materialise back into the present, we realise that ALL PCs (including Macs) are now using chips based on the original Intel 8086 design, albeit a design that has been significantly added to, with new capabilities bolted on nearly every day. A design that is acknowledged as being "difficult to program" compared to most other chip architectures. In summary then, the majority of us are using computers based on a chip design that has thrown nothing away, is bloated, inefficient, and difficult to program...
It boggles the mind!
(By the way, if you want to see how inefficient your own computer is, I suggest you take a look at the Raspberry Pi, a credit-card-sized computer that can do the same things a PC can, and do them far more cheaply.)
However, things may well soon change. Recently, the same software/user combination that created the PC-dominated world we live in has started to evolve. No longer is the business market the driving force; instead, the individual user market is starting to take hold. New single-user devices, such as smartphones, music players & tablets, are coming to the fore like never before, and unlike the PC, these devices are not reliant on old chip technology - instead, they use a new breed of modern CPUs that are cheaper to manufacture, as well as being powerful. Software is slowly moving away from the monolithic ideas of the past, and is morphing into the "app" model - cheap, powerful programs that are just a click away. Where the computer was once the focus, now it is the user who is the focal point. People still argue over which is better - PC or Mac - but that war is over. Another war is on its way, the war between Apple's iOS and Android, and who will win this one? The technically superior Apple line, or the more adaptable Android camp? I know who my money's on...
Wednesday, 19 December 2012
Tuesday, 18 December 2012
Progress - why you HAVE to buy a new computer every so often...
Do you remember the HOVIS advert? Cue the brass band...
"'Ee were a great computer, my Mainframe! He could run t'payroll, t'stock system, and t'online transactions, print bloody great reports that consumed reams of green & white paper, and do all that in 512k of memory. Kids these days - don't know they're born..." etc, etc....
You may laugh, but this is all true - I know, I was that child. In 1983, I started my first job - a computer mainframe operator for a large seed grower in East Anglia, and the mainframe I operated was an ICL ME29, with 256k of memory, that did all of the above. My friend at the time (hi Tony) used a Dragon 32 computer at home - state of the art.
Fast forward to 1991 - I was still operating mainframes (although I was now a Shift Leader, in charge of other operators, as well as doing some COBOL development), this time ICL VME-based systems (a 3960). At home, I was using an Atari 512k STFM - a great little machine that had a graphical interface, ran great games, and let me create a newsletter for a club I was with. In short, a brilliant little all-rounder that had twice as much memory as my old mainframe...
...1996 - my Atari was a fond memory in the attic, and my main machine was a 486 based PC running Windows 3.11, with a mahoosive 4Mb memory...
...and now, I'm writing this on a Dual-Core PC with 8Gb of memory (that's roughly 2,000 times as much memory as my PC had in '96 - 16 years ago).
Am I doing work that is significantly different to what I was doing 16 years ago? No. Do I perform my work any quicker? No. Is my computer any quicker? No. So the question needs to be asked - why, when technology has developed so far, am I no more efficient than I was 30 years ago? I'm using more computing power now than the Apollo and Shuttle programmes had at their disposal, so why am I STILL screaming at my computer to "GO FASTER DAMMIT!"?
The answer, my friends, is one simple word, and that word is...

LAZINESS.
You see, back in the old days, in the Eighties and earlier, much software was written at a very low level, often hand coded & optimised to suit the hardware available. Try any of the computers created during this "golden" age, and you'll find that they seem to run just as fast as anything now - yes, their graphics might be a bit blocky, and their floppy access is SLOW, but you can run a game on an Atari STE now, and it'll seem just as snappy and responsive as a modern game. This is because of optimisation.
In the late Eighties, a programming paradigm became popular - that of code libraries. The idea is simple - say you write a piece of code that opens a text file. You'd like to use that code again in another program, so you create a library - a repository of common code that you can use again and again (see the little sketch below). Libraries are a great idea, but they do have a drawback - they promote "library sprawl", a condition where the number of libraries you use grows out of control. This sprawl is most noticeable in situations where a program needs features added quickly, and where is the worst offender of library sprawl? Yup, it's our old friend Microsoft Windows - again... :(
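To make the library idea concrete, here's a minimal sketch in Python (my own illustration - the file name and function are made up): the routine is written once in a shared module, and any program that needs it simply imports it instead of re-writing it.

    # textutils.py - a tiny "library": one shared routine, written once.
    def read_text_file(path):
        """Open a text file and return its contents as a string."""
        with open(path, encoding="utf-8") as f:
            return f.read()

    # Any other program can now reuse the routine instead of re-coding it:
    #
    #   from textutils import read_text_file
    #   notes = read_text_file("notes.txt")

Multiply that by thousands of routines spread across hundreds of libraries, each pulling in others, and you have the sprawl described above.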
As an example, I was recently looking into moving the data from one user's PC to another, and I did a quick check to see how much disk space he was using. Windows itself was taking nearly 14Gb - that's roughly 1,000 times the size that Windows 3.11 took up on my machine in 1996. Reliance on existing code gives faster development turnarounds, but leads to mahoosive computer systems that require massive amounts of processing power just to wade through all these libraries.
To be fair, most commercial operating systems have the same problem - this includes OSX and Linux. Some would say that software grows to fit hardware, but over the years I've come to the conclusion that this is not true. We HAVE to upgrade our systems because the developers of new software are under tremendous pressure to deliver quickly, which means they HAVE to rely on existing libraries - they have no choice. It's a sad but true fact - our software is LAZY, which means we have to throw more computing power at it to get the speed and results we expect.
Think of it in these terms. A fat man can't run the 100 meters, but instead of losing weight, he gets a bike, but still eats what he likes. Two years later, he can't ride the bike any more, because he's too fat, but still needs to do the 100 meters, so he gets a motor bike, but yup, he's still tucking into those doughnuts! Two years further on, he's now so fat, that he can't keep his balance on the bike any more, so he gets a Ferrari-powered dumper truck to do the 100 meters in - stopping on the way, to get a super sized McHeartbreaker Burger with thick shake...
Lazy - oh yes...
Apple Vs PC - Pointless & Stupid...
In my last article, I mentioned why I wouldn't buy a Mac with my own money. Some people have commented that I'm anti-Apple, or that I'm in the "PC camp" of the Apple/PC fanboi debate.
What a load of old tosh!
I've been in the IT Industry for nearly 30 years, I've used mainframes, PCs, Macs - in short, been there, done that. I use PCs at work (because that's the environment I work in), I use PCs at home (because I like the platform, and the value for money it gives me), but I don't always run Windows, because I prefer Linux.
I've dealt with this type of technology long enough to know that every platform has good and bad points, but if you're aware of the good points, you should not be blind to the bad. I believe I have a balanced view on the subject, because I don't pretend that one platform is better than another - they all have their strong points. Apple make some lovely kit, but they do charge a bundle for it, and sometimes it doesn't play nicely with anything that isn't Apple - that doesn't mean you shouldn't buy it, just be aware. In the same vein, PC kit can vary in quality because it's so varied, but it offers better value for money combined with cheaper repair costs - though it can be hampered by an operating system that has evolved in strange ways, leading to sometimes weird and wonderful problems - again, that doesn't mean you shouldn't buy it, just be aware. If you don't like Windows, try something else!
Any debate about Apple vs PC is a stupid waste of time. If you like one, stop wasting time trying to evangelise its benefits - enjoy it for what it is - a tool.
Thursday, 13 December 2012
Why I won't be buying an iMac...
As many of you know, I'm a cartoonist, as well as an IT Specialist, and many of my cartooning friends use Macs - in fact, many creatives use Apple Macs (in various guises) and will continue to do so, no matter what. Many have tried to convince me to convert, and many times have I thought about it - and declined.
"But why do you decline?" you may ask. "Surely their stuff is great!"
Mmmnn...
Let me start by linking to a review of the new 21.5" Apple iMac (2012 version) - this is for balance (!)
Ok - Apple make good products. They are well engineered, look stunning, perform well, and provide a great user experience. They are also expensive, difficult to repair or upgrade, and somewhat proprietary in nature. Back in the Nineties, I was using Atari STs at home and PCs at work. I loved my Atari - it was quick, slick and a joy to use, but it suffered from the same blight as the iMac: it was expensive to upgrade, and proprietary. By 1996, I was solely PC based, and have been ever since.
PCs may be a bit clunky, a bit large, and not as sexy as Apples, but they are cheap to upgrade or repair, and cheap to purchase - I can buy a PC that will out-perform an iMac for less money. And for those of you who would argue that buying a PC means you're locked into that user experience, I would point out that not only can you run Windows on a PC, you can also run Linux (my personal favourite is PCLinuxOS) or even OSX (although you're not supposed to...)
In short, a PC is cheaper to buy, cheaper to run, and just as good. I liken the Apple/PC choice to that of buying a Ferrari or a Honda - both will get you where you want to go, but it'll be cheaper and more reliable in the Honda (and yes, I DO drive a Honda)...
Monday, 22 October 2012
Watchdog - Why you should TALK to your engineer
Something got me quite riled this week. The BBC's "Watchdog" programme did a piece on a computer support company that ripped off its customers, and as part of the "undercover investigation", they nobbled a PC so that it could be fixed by an engineer. So far, so good, until they showed how they nobbled the PC in the first place.
They chose to remove a jumper from a disk drive.
I take issue with this test for several reasons. Firstly, it's not a "natural" fault that would occur during normal use; instead, it's a fault that should only occur if a) the user has been fitting hardware, or authorised someone else to fit hardware (like a new disk), or b) on the day the computer was built. Secondly (and perhaps more damningly), the "user" (an actor in this case) was instructed to tell the engineer some cock and bull story about the machine getting slower and slower before it stopped altogether - at no time mentioning that the case had been "opened".
I'm sorry Watchdog, but you're out of order. I don't doubt that the company concerned were a bunch of cowboys (you only had to see the prices they were charging), but this evidence misleads the public. Engineers rely on the information that the user gives them to make an accurate diagnosis of the problem - in this case, your information was not just incorrect, but deliberately misleading. A computer engineer is like a doctor - both have to make a diagnosis based on reported symptoms and tests. It's like swallowing lots of laxatives, experiencing a bad case of Bombay Belly, complaining to the doctor that you feel unwell and you wee a lot - and then showing surprise when he says you might have diabetes...
Tuesday, 28 August 2012
This is what a disk crash looks like...
If you've ever wondered what goes on inside your hard disk, now's the time to find out. I recently had to try and recover some data from a disk, but with no success. I decided it might be useful to see what had happened, so I opened the disk up - by the way, these pictures are quite high res, so they may take some time to load, but it's worth it...
A bit of explanation is called for here. Your data is magnetically stored on the platter (the large silver disc) by the read/write head, which is moved over the surface of the disk by the arm. In the next two pictures, you can see the circular scratch on the platter, caused when the head smashed into it...
End result? Damaged head, damaged surface, loss of data. Moral of the story? ALWAYS back up your data to an external hard drive or DVD.
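If you want to put that moral into practice, here's a minimal sketch in Python (the folder and drive letter are hypothetical - point them at your own documents and your own external drive):

    # Copy a documents folder to an external drive, stamped with today's date.
    import shutil
    from datetime import date
    from pathlib import Path

    source = Path.home() / "Documents"               # what to protect
    target = Path("E:/") / f"Backup-{date.today()}"  # the external drive
    shutil.copytree(source, target)                  # the actual copy
    print(f"Backed up {source} to {target}")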
Thursday, 21 June 2012
BRAAIIINNSSSS!
...
...
"MY GOD - I DON'T UNDERSTAND ANY OF THIS GARBAGE! DAMN YOU! DAMN YOU ALL TO HELLLLLL!"
... and sink slowly to your knees, weeping for your lost innocence....
Have no fear! It's honestly not all that bad, and I'm here to help you. When buying a computer, the three most important things are processor, memory and disk space. As I've dealt with the last two subjects before, I think it's time we looked at the biggy - the Central Processing Unit (CPU, or processor) - the brain of the computer. So, without further ado, here we go...
Intel & AMD are the main manufacturers of the processors (CPUs) used in home/business computers (note: for the sake of this article, I'm ignoring Apple - no offence, iFans...)
Intel produces several ranges of CPU - in descending order of computing power (and age) we have...
i7 - most powerful and newest (up to 6 cores)
i5 (up to 4 cores)
i3 (2 cores)
Pentium (1-2 cores)
Celeron - least powerful and oldest. (1 core)
AMD have their own chip range....
FX - most powerful and newest
A-series
Phenom II
Athlon II
Sempron - least powerful and oldest.
Cores: In computing terms, a "core" is one processor. Years ago (back in the 90s) all CPUs had a single core. As technology has progressed, CPU designs have evolved, and many now contain more than one core inside the same physical chip - the advantage being that the computer can genuinely run more than one task at the same time (older, single-core CPUs couldn't do this). On top of that, many Intel chips add "hyper-threading", where each core presents itself to the operating system as two logical processors - so a quad-core i7 with hyper-threading appears to the operating system as eight logical processors.
Quad Core: A CPU with 4 cores
Dual Core: A CPU with 2 cores
The more cores (and logical processors) you have, the better your computer can "potentially" multitask - although how well depends ENTIRELY on the operating system installed (Windows 7, for example)...
This, combined with the move to 64bit processing, means that computers these days are incredibly powerful beasts compared to 10 years ago...
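If you're curious how many logical processors your own machine reports, here's a small Python sketch of my own (not from any manufacturer's documentation) that asks the operating system and then spreads four CPU-hungry tasks across the cores:

    import os
    from multiprocessing import Pool

    def busy_sum(n):
        """A deliberately CPU-bound task: add up the first n integers."""
        return sum(range(n))

    if __name__ == "__main__":
        print("Logical processors reported:", os.cpu_count())
        # On a multi-core CPU, these four tasks can genuinely run at once.
        with Pool(processes=4) as pool:
            results = pool.map(busy_sum, [10_000_000] * 4)
        print(results)

All other things being equal, the more cores you have, the sooner those four jobs finish - which is the whole point of the core count.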
For buyers, here are my recommendations...
1. Buy an Intel-based chip. There's nothing wrong with AMD chips (I've used lots of them in the past, and they're great), but Microsoft work hand-in-glove with Intel, which means that Windows will not only be better supported, but will run better too.
2. Buy the best processor you can afford - i.e. if you have the choice between an i3 and an i5, choose the i5 - you can always upgrade memory and disk space later...
Thursday, 14 June 2012
ASP, Access, ODBC & 80020009, 80004005 errors
Just a quickie here, for my own benefit. When trying to get ASP pages working with VB DLLs that access MS Access databases, you may get 80020009 and 80004005 errors. You've checked the permissions on the database folder, and the IUSR account has the right access - yet you still get the errors.
Check your date & time settings - especially if you've created this using a VM. Make sure that your regional settings are as you expect.
Saturday, 31 March 2012
PDF - the most AB-used file format on the planet...
In days of old, when knights were bold, before the O/S war,
You couldn't share your documents, and see them as before,
Then the Knight Adobe, presented PDF to the King,
Who rejoiced and said "At last! An open documenty-thing!"
Alas Adobe cautioned his royal Haughtiness,
"It's read-only your Majesty - sorry for the mess..."
We all love PDF files. In the world of information, the Portable Document Format means that everyone can read your document on whatever device you wish - and the document will look the same on each device. At the time of its inception, PDF was perceived as the Holy Grail by many folk, and the format made Adobe a household name.
However, not all is rosy in the world of PDF, and the wise old silver surfer should be aware of all the pitfalls, misconceptions and stupidity that go on within it, so here is a quick guide for those of you with your own teeth - especially if they're on the sideboard...
1. PDF is open source
Yes and no. PDF was created by Adobe, who released the details of the format to the IT community. The format is now controlled by the International Organization for Standardization (ISO).
2. Once you've created a PDF for sharing, no one can edit it.
Wrong - ANY file format that can be read can be re-created. PDFs do contain security features, but in essence, if you can read a PDF, then its data can be extracted by a third-party tool and reused.
3. All text in a PDF is true text - that is, it's a set of characters
Wrong - any text you see in a PDF can be either character text, or a bitmap representation of text (a picture).
4. PDFs contain either character text or bitmap pictures.
Wrong - they can contain vector images too...
5. PDFs are small
WRONG! The size of a PDF depends on what's in it. A PDF containing a simple character-text version of "War & Peace" is going to be MUCH smaller than a PDF created by scanning a printed copy of "War & Peace" (which will be mostly bitmaps).
6. You NEED Adobe Acrobat Reader to read PDF files, and Adobe Acrobat to create them...
Wrong - Foxit make a good replacement reader, and free utilities like Ghostscript or CutePDF can create them (a tiny example follows)...
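To back up that last point, here's a minimal sketch of creating a PDF with no Adobe software anywhere in sight - it assumes Python with the free ReportLab package installed, which is just one of many options:

    # Create a one-page PDF containing a single line of text.
    from reportlab.pdfgen import canvas

    c = canvas.Canvas("hello.pdf")   # the output file
    c.drawString(72, 720, "Hello from a PDF made without Acrobat!")
    c.showPage()                     # finish the page
    c.save()                         # write hello.pdf to disk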
Tuesday, 13 March 2012
Caricatoons is LIVE!
Hi Everyone!
I've just put my new online shop live. If you're looking for a truly unique gift for that someone special, why not take a look?
http://www.caricatoons.co.uk
Thursday, 23 February 2012
Abuse of power - why Windows Security Fails...
Windows has one very nasty habit that it has kept since the first days of its creation. It's a habit so deadly that Windows has never really been able to ditch it (despite several attempts), and rather like a junkie always needing one last desperate fix, Windows defends its use, and will continue to use it, until the bitter end... I am, of course, referring to default administrative rights.
Back in the old days of DOS (that's the operating system before Windows - some of us over-40s are quite fond of it), networks existed only in business, and one user / one computer was the norm. In those days, the idea of multiple users was more for large-scale computing, such as big business or universities. From these hot-beds came the idea of networks, multiple users on one machine, and user rights. In short, a human needed a user account to use a computer, and that account needed access rights to use files, printers and resources - all this controlled by an Administrator: the one person deemed trustworthy enough to organise these resources on the users' behalf.
Microsoft had already created Windows - not as an Operating System at first, but as a Window Manager - that is, a nice, graphical way to view your files, edit notes etc. Windows was not the first, but it became the most popular, and in the way that popular things do, it created demand. There was now a demand for Windows to have network access to files, printers and resources, but Windows was a single user environment.
Enter "Windows for Workgroups" - Microsoft's answer to the network problem. Windows could now access network files, share network resources, and assign user rights - right? Wrong - Windows could assign network rights to other people and computers, but it was still a single-user environment, and that user had FULL Administrator access to the computer. Microsoft got it right with Windows NT, however - a version of Windows DESIGNED to be a server. It had users, printers, rights... but the client computers that used its resources were still running WFW - single user, full Administrative access to their own machine. The users HAD to have Administrative access to install printers, device drivers, anything that was actually useful. This problem would continue for a VERY long time...
In 1995, Microsoft tried again, and released Windows 95. 95 was Windows for Workgroups with a new front end, as well as some improvements to the back-end engine, but it was still essentially the same Windows/DOS combination that WFW was, and it shared the same single-user philosophy. Yes, you could have separate user profiles, but each user was still "the Administrator". We also now had "Windows Update" - Microsoft could keep your computers up-to-date for you, but they could only do it if the user had full access to the machine...
Then came Windows NT4, and NT4 Workstation. These were versions of Windows designed for full network use, and they actually had true user accounts. Now, only the Administrator could make system changes to the machine, and everyone else could be users of lesser power... except that users expected to be able to do things the way they were used to. People now had PCs running Windows at home, and the "single user" experience was what they were used to. The users also had to keep the machine up to date with the latest patches, to stop the growing virus threat, so the average user STILL had to have administrative access. Savvy network admins locked down the machines as best they could, but those who didn't know better still allowed standard users full Administrator access - is it any wonder that Windows is responsible for more viruses spreading than any other operating system?
In '98 we had Windows 98, and then Windows ME, and once again Microsoft dropped the ball, allowing the single-user ethos to continue. 2000 brought Windows 2000 / Windows 2000 Server, and for the first time Windows on both client AND server had the notion of access rights. This should have been the turning point for Microsoft - their poor record of virus spreading and compromised security should have stopped here... but it didn't. In fact, it spread like wildfire. By default, new users in the client environment were MADE administrators!!! This madness carried on, and by the time of Windows XP, was commonplace.
Windows Vista tried to do something about it, by alerting the user when administrative access was required - but all the user had to do was click a button and say "ALLOW". Give a user a button that says "ALLOW", and they'll always press it...
Windows 7 went one step in the right direction, by disabling the built-in Administrator account by default - but not before granting the first user set up FULL ADMINISTRATOR ACCESS...
In the UNIX/Linux world, we have the ROOT account. You can do anything with the root account, but it is never used as the primary account for users. This is the primary reason why UNIX/Linux computers are inherently safer than Windows ones.
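To show what "least privilege" looks like from a program's point of view, here's a small Python sketch of my own (not from any Microsoft or UNIX documentation): a script that checks whether it has been launched with elevated rights, and complains if it has.

    import os
    import sys

    def running_elevated():
        """Return True if this process has administrator/root rights."""
        if os.name == "nt":
            # Windows: ask the shell whether the current user is an admin.
            import ctypes
            return bool(ctypes.windll.shell32.IsUserAnAdmin())
        # UNIX/Linux: the root account has an effective user id of 0.
        return os.geteuid() == 0

    if __name__ == "__main__":
        if running_elevated():
            print("Warning: running with full administrative rights - be careful!")
            sys.exit(1)
        print("Running as an ordinary user - exactly as it should be.")

Most well-behaved software on UNIX/Linux takes exactly this attitude: do the everyday work as an ordinary user, and only ask for root when there is genuinely no other way.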
Below are two links that you may find useful at some point - the first tells you ways you can manually remove spyware from your computer (always useful knowledge); the second explains why running as the Administrator is such a bad idea...
http://www.codinghorror.com/blog/2007/06/how-to-clean-up-a-windows-spyware-infestation.html
http://www.codinghorror.com/blog/2007/06/the-windows-security-epidemic-dont-run-as-an-administrator.html
Running as Administrator is like Arnie armed to the teeth, walking through a Nursery class - a friendly fire incident waiting to happen...
Thursday, 9 February 2012
Social Networking - why text is not the way to talk...
"Too be or knot 2b - that is the question?"
I'll admit that I am a social networking Luddite. I blog (you're reading one now) and I use forums, but I seldom tweet, and I never Facebook. Why, you ask? Because text is impersonal.
Take the following statement:
I think that cheese is really cheesy.
This statement can be taken several ways. It could mean that the speaker thinks that cheese has a very cheesy taste, or that cheese is a very old fashioned subject. We can't tell, because we're missing the human element of conversation, namely inflection. We can't tell what the original speaker means, because the text can't reflect how the speaker has said it.
When you have a conversation with someone face to face, the importance of "body language" cannot be overstated. A person's body language can tell you if they are being coy, sardonic, humorous, or a thousand other possible emotional subtleties that go along with the words you hear to create the overall impression you receive.
Text does not do this well.
I have lost count of the number of SMS, email, Twitter, forum & Facebook arguments I have witnessed, caused by simple misunderstandings. For your own sake, keep your text conversations simple, and if possible, read them through before posting. If you think what you have written may be taken the wrong way, then rephrase it - because you can guarantee that someone will...