How a Microprocessor Chip for a Computer is Made
A microprocessor is a type of integrated circuit built on a tiny piece of silicon. Today's microprocessors contain hundreds of millions of transistors, which are interconnected through extremely fine copper wires. The transistors work together to store and manipulate data so that the microprocessor can perform a wide variety of functions.
Making microprocessors is a complex, demanding process involving more than 300 steps. Microprocessors are built by layering materials on top of thin rounds of silicon, called wafers, through various processes using chemicals, gases and light.
Moore's Law - How Powerful Are Microprocessor Chips?
The increase in the capacity of microprocessors is described by Moore's Law. Gordon Moore is widely known for formulating Moore's Law, in which he predicted that the number of transistors the industry would be able to place on a microprocessor chip would double every year. In 1975, he revised his prediction to a doubling every two years. While originally intended as a rule of thumb in 1965, it has become the guiding principle for the industry to deliver ever-more-powerful semiconductor chips at proportionate decreases in cost.
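The doubling described above is easy to see with a little arithmetic. The sketch below is a hypothetical illustration only, assuming a doubling every two years and using the Intel 4004's roughly 2,300 transistors in 1971 as a starting point:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under Moore's Law.

    Assumes a clean doubling every `doubling_years` years starting
    from `base_count` transistors in `base_year` (illustrative values,
    not actual industry figures for any given chip).
    """
    doublings = (year - base_year) / doubling_years
    return int(base_count * 2 ** doublings)

# Project forward a few decades from the starting point.
for year in (1971, 1981, 1991, 2001):
    print(year, f"{transistors(year):,}")
```

Under these assumptions, a hypothetical 2,300-transistor chip in 1971 would grow to tens of millions of transistors within three decades, which matches the order of magnitude of real chips from that era.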
The History of the First Microprocessor Chip
In November 1971, Intel publicly introduced the world's first single-chip microprocessor, the Intel 4004 (U.S. Patent #3,821,715), invented by Intel engineers Federico Faggin, Ted Hoff, and Stanley Mazor. The Intel 4004 took the integrated circuit one step further by placing all the parts that make a computer think (i.e., the central processing unit, memory, and input and output controls) on one small chip.