The mighty handful

From the microprocessor to the internet... Five of the most influential electronic technologies of the past 40 years - advances that have transformed our business world - are counted off.

by Andrew Saunders
Last Updated: 31 Aug 2010


The microprocessor

Microprocessors are the unsung silicon workhorses of modern life, installed in their billions in everything from toasters to MRI scanners.

You could say it's cheating to include them here, as none of the other things on the list would work without them, but in a way that's exactly the point. If every existing microprocessor stopped working tomorrow, the world would instantly be flung backwards 100 years - not bad for a device that has been available for only 35 years.

The first single-chip microprocessor, the Intel 4004, hit the market in 1971, packing an unprecedented 2,300 transistors onto a silicon wafer measuring 3mm by 4mm.

The 4004 and its descendants went on to power any number of pocket calculators, digital watches and, of course, computers. A 4004 is even said to have been the on-board brain of 1972's Pioneer 10 spacecraft, last heard from in 2003, when it had travelled 7.6 billion miles.

Intel co-founder Gordon Moore formulated what became known as 'Moore's Law', the most famous axiom in the IT business. This states that the complexity of integrated circuits doubles every 24 months. It has held true to this day: there are now hundreds of millions of transistors on a top-of-the-range chip, and the market for chips is worth £30 billion a year.
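The compounding is easy to check for yourself. A minimal sketch, starting from the 4004's 2,300 transistors in 1971 and doubling every 24 months (the function name and parameters are illustrative, not from the article):

```python
def moores_law_estimate(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count by compounding one doubling per doubling_years."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# 35 years is 17.5 doublings: thousands of transistors become hundreds of millions.
print(round(moores_law_estimate(2006)))
```

Seventeen and a half doublings turn the 4004's 2,300 transistors into roughly 400 million, in line with the chips of the mid-2000s.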

A more imminent and prosaic problem faces Intel, however: its customers' needs are changing. For many modern devices, battery power is the critical factor. BlackBerries, mobiles, MP3 players - all these require simpler, more efficient (and cheaper) chips that make the most of battery life.

Unless Intel can adapt its Rolls-Royce chips to this Ford Fiesta world, it could find itself speeding in the wrong direction.


The mobile phone

The mobile phone has probably done more to popularise personal technology than any other single gadget - including the legendary Sony Walkman and its more recent incarnation, the Apple iPod. For once, here is something that really deserves to be called a high-tech marvel. Only a decade or two ago a self-contained phone small enough to fit in a pocket was too much even for James Bond. Now there are two billion of them.

Admittedly, the first-generation mobiles were strictly voice-only and hardly pocket-sized. But by the time digital 2G or GSM phones arrived in the early '90s, the stage was set for the slinky mobiles we know today.

The insatiable desire for ubiquitous conversation has created some of the most successful British companies of recent years - enterprises such as Vodafone. Mistakes have been made (will 3G ever make money?), but by and large Vodafone's leaders have steered an adroit course. The troubles it now faces in the City have more to do with shareholders experiencing pain at the close of the growth phase than any fundamental problems in the firm's model. Walk down any high street pretty much anywhere in the world and you will see people talking on their mobile phones. How can that be a bad business to be in?


The internet and e-mail

The greatest double act in technology, this dynamic duo hit the mainstream in the mid-90s. Nowadays more than a billion people use the net and, despite the burgeoning attractions of the world wide web, e-mail remains its most popular feature.

With about 60 billion messages sent daily - that's nearly 10 e-mails for every person alive on the planet - it's the original killer app. And all in little more than a decade of mega-growth. But various early iterations of the net had existed since the late '60s, mostly in the US, as a means to allow two or more computers to communicate with each other (if that doesn't sound exactly revolutionary, we can't all be Larry Page, can we?).

E-mail emerged at around the same time, as a way of leaving messages for other users of the same computer. Pioneering researcher Ray Tomlinson deserves to be remembered as the person who, back in 1971, put the @ symbol into e-mail addresses.

If e-mail drove the early adoption and growth of the internet, then the world wide web has made it what it is today. British computer scientist Tim Berners-Lee, who invented the web in 1989, made a huge contribution in 1993 when he and CERN decided to allow free use of the idea rather than trying to make money from it.

Now the talk is all of Web 2.0, a new era of more flexible and powerful internet services. The future will be dominated by handheld and smart mobile devices with fast wireless net connections and by further blurring of the lines between the internet, telecoms and computing.


Electronic trading

Traders in bonds, shares, securities and other financial instruments have always valued rapid, accurate information - the London and New York stock exchanges were early customers for both the electric telegraph and the telephone. But the biggest change in the history of the London Stock Exchange (LSE) was the Big Bang of 1986, which coupled computerised share trading with liberalisation of the rules governing the operation of the exchange.

Outside firms were allowed to own exchange members for the first time and the old trading floor ceased to exist - although the anarchists who invaded it in 1999 seem not to have realised this. Instead, stocks were bought and sold via computer screen and telephone, a sort of prototype cyberspace dedicated to making money. At least one computer system still used by the LSE - the share price system SEAQ (Stock Exchange Automated Quotations) - dates from the Big Bang.

The City was transformed from a clubby English institution into the international financial powerhouse that it is today, and one of the UK economy's primary wealth-generators, to boot.

International electronic share trading, together with technologies for the instantaneous transfer of huge sums of money across the world, also laid the foundations for the huge growth in cross-border trading and M&A activity of recent years - as well as making it much easier for all of us to get credit cards, loans, ISAs and offset mortgages.


The personal computer

When MT first made its appearance in 1966, the idea of a computer on every desk was pure science fiction. In those days, computers were mainframes - hugely expensive rarities. Unless you worked for a very large private company or a big government agency, you had probably never even seen one.

By the late '70s, a few pioneering companies had started to exploit advances in micro-electronics to produce smaller, cheaper and more accessible machines, the best known of which was the Apple II. It made Apple's founders, the Steves Jobs and Wozniak, very happy and pretty rich.

But the market was fragmented, with lots of small, clever companies doing their own thing - including Acorn and Apricot here in the UK. Enter US giant IBM, not remotely small and barely clever enough for the task in hand.

In 1981 Big Blue decided to do what it had done very successfully in many other business machine markets, and make the industry-standard personal computer. Its 5150 cost $1,595 (about £1,800 in today's money) for the entry-level model and $4,500 for the business version. It was neither the best nor the cheapest, but it was the most widely available. IBM expected to sell 250,000 units in five years - it actually shifted more than a million.

The machines that we use today - about a billion of them - are direct descendants of that first IBM 5150. But the decision to buy in operating software from a 25-year-old geek rather than write its own was a crucial strategic blunder on IBM's part.

The operating system chosen, MS-DOS, would set the geek in question - one Bill Gates - well on the road to becoming the world's richest man.

He was happy to leave the business of making the boxes to others, but Gates did learn at least one lesson from Big Blue: you don't have to be best to be biggest. Microsoft's enormous success has been based on the (correct) assumption that industry standards are set not by the most technically advanced products but by those that are good enough and very widely distributed.

The next generation of PC-like boxes will be 'home hubs' - networked storage and access devices designed to manage domestic entertainment, information and communication needs - opening the door to companies such as Samsung, LG Electronics and Sony, whose core expertise combines product design, technology and understanding consumers.
