UK: Mega memories are made of this.


by Rhymer Rigby.
Last Updated: 31 Aug 2010

Who could have predicted that anything so tiny as the microchip, invented in the late 1950s, would now underpin every computer in the world?

They got it wrong: in the 1950s, science fiction writers and futurologists confidently predicted that, by now, we'd be travelling regularly, if not to the stars, then at least to the nearby planets, and that our spaceships would have to be big. Not only because (obviously) bigger was better, but because we'd have to take our computers with us, and all those vacuum tubes would take up a lot of space. Of course, the future has proved much smaller and rather less romantic than envisaged: a Psion organiser is more powerful than the computers they dreamt of, and the next man on the moon will probably be a Japanese tourist.

What they failed to foresee was the invention and subsequent breakneck development of the microchip, which began at Shockley Semiconductor in California in the late 1950s. In 1956, Gordon Moore, a chemistry PhD from Caltech, joined the company, where he met Robert Noyce, an MIT graduate. Along with their fellow scientists, they were messing around with silicon, trying to go beyond the transistor, in whose invention the eponymous Mr Shockley had played a role. But Shockley was, by all accounts, something of a shocker to work for, and a disgruntled group of eight, including Noyce and Moore, left the company to set up on their own.

They turned to the investment banker Arthur Rock, who arranged a meeting with the Fairchild Camera and Instrument Company. It was sufficiently interested to set up a division called Fairchild Semiconductor in 1957. Two years on, Noyce, who shares credit with Texas Instruments' Jack Kilby for inventing the chip, managed to put an array of transistors on a single piece of silicon. This was the first, very rudimentary integrated circuit, and it went down well. Before long, Fairchild's chips were displacing the bulky discrete components in computers.

But Moore saw far more in the invention, especially the potential to cram ever greater numbers of transistors onto a chip. In 1965, he made the famous prediction now known as Moore's law: that the number of transistors on a chip would keep doubling at a steady rate, popularly quoted as every 18 months.
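To get a feel for what that doubling implies, here is a back-of-the-envelope sketch in Python. It is purely illustrative: it assumes the popularised 18-month doubling period and takes as its starting point the 2,300 transistors of the Intel 4004 mentioned later in this piece (the real pace has varied over the decades).

```python
# Back-of-the-envelope Moore's law arithmetic (illustrative only).
# Assumes the popularised 18-month doubling period, starting from the
# 2,300 transistors of the 1971 Intel 4004 quoted later in this piece.

def projected_transistors(year, base_year=1971, base_count=2_300,
                          months_per_doubling=18):
    """Project a transistor count by compounding the doubling period."""
    doublings = (year - base_year) * 12 / months_per_doubling
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(year, f"{projected_transistors(year):,.0f}")
# 1971 -> 2,300; 1981 -> ~234,000; 1991 -> ~24 million; 2001 -> ~2.4 billion
```

Crude as the assumptions are, the compounding is the point: a steady doubling turns thousands of transistors into billions within three decades.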

Meanwhile, a young Hungarian-born PhD from the University of California, Berkeley, joined the outfit as Moore's assistant. His name was Andy Grove. Fortune smiled on Fairchild and, as silicon became the switch of choice, the company's turnover rose to $130 million. All was not as rosy as the figures suggested, however.

Noyce and Moore wanted to investigate areas of interest to them, principally memory. But the Fairchild parent was cautious and reluctant to give its flakey West Coasters that much rope. In 1968, Moore and Noyce famously met one weekend and decided to go it alone. Armed with a good track record and some $250,000 apiece, they went back to their money man, Rock, who managed to raise $2.5 million. Grinnell College, with which Noyce was affiliated, chipped in a further $300,000 and, on 18 July 1968, Intel was born - initially called NM Electronics. More employees, Grove included, were recruited from Fairchild, and the company began grappling with the memory problem.

At the time, computer memory was chunky and magnetic - but relatively cheap. The silicon alternative, though much smaller, was more expensive, although Moore's law, which had thus far held good, said it was destined to become the cheaper option before long. Noyce, Moore and co also faced a dilemma. If they came up with a really simple device, copyists would have replicas on the market within months. Conversely, if they aimed at something too ambitious, they would probably run out of money before they had a product at all. That tension partly explains why Intel's first-year revenues were a shade under $3,000. But 1970 was better, when the business brought out its first success - a DRAM (dynamic random access memory) chip, which stored one kilobit (by contrast, your desktop computer now has around 32,000 kilobytes of memory). The next year, the company trumped this with the EPROM (erasable programmable read-only memory).
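The bet on silicon rested on that same compounding. Here is a minimal sketch of the cost-crossover logic; the starting prices and the 18-month halving period are illustrative assumptions, not historical figures.

```python
# Minimal sketch of the cost-crossover logic: if silicon memory starts
# out dearer than magnetic core but its cost per bit halves every 18
# months while core stays flat, silicon must undercut core eventually.
# Both starting prices below are assumptions, not historical data.

core_cost_per_bit = 1.0     # magnetic core, cents per bit (assumed, flat)
silicon_cost_per_bit = 4.0  # early silicon memory, cents per bit (assumed)
halving_months = 18         # popularised Moore's-law halving period

months = 0
while silicon_cost_per_bit >= core_cost_per_bit:
    silicon_cost_per_bit /= 2
    months += halving_months

print(f"Silicon undercuts core after about {months} months")
# With these assumed prices: about 54 months.
```

However wide of the mark the starting prices, the conclusion is the same: a flat cost curve always loses to a halving one, which is why Noyce and Moore judged the gamble worth taking.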

Meanwhile, in 1969, the business had been asked by a Japanese calculator maker, Busicom, to produce a set of chips to beef up calculator performance. Eventually, engineer Ted Hoff's design became the Intel 4004 processor, launched in 1971. It had 2,300 transistors, could perform 60,000 operations a second and was the world's first microprocessor.

Effectively, its descendants underpin every computer in the world. But Intel realised it would have to keep running simply to hold its position. In 1972, it launched the 8008 processor and, in 1978, ushered in the modern age of computing with the 8086 chip. Intel's position became secure when, in 1981, IBM opted to use its 8088 chip in its first PC.

Although vastly more powerful, the current generation of Pentiums are direct descendants of those long-ago chips. But, for silicon at least, the point where Moore's law breaks down is approaching - and even a man of Grove's resources cannot make atoms any smaller.
