Microprocessor chronology
1970s
The first chips that could be considered microprocessors were designed and manufactured in the late 1960s and early 1970s, including the MP944 used in the Grumman F-14 CADC. Intel's 4004 of 1971 is widely regarded as the first commercial microprocessor.
Designers predominantly used MOSFET transistors with pMOS logic in the early 1970s, switching to nMOS logic after the mid-1970s. nMOS had the advantage that it could run on a single voltage, typically +5 V, which simplified power supply requirements and allowed it to be easily interfaced with the wide variety of +5 V transistor-transistor logic (TTL) devices. nMOS had the disadvantage that it was more susceptible to electronic noise generated by slight impurities in the underlying silicon material, and it was not until the mid-1970s that these impurities, sodium in particular, were successfully removed to the required levels. At that point, around 1975, nMOS quickly took over the market.
This corresponded with the introduction of new semiconductor masking systems, notably the Micralign system from Perkin-Elmer. Micralign projected an image of the mask onto the silicon wafer without ever touching it, which eliminated the earlier problem of the mask lifting photoresist off the surface when it was removed, ruining the chips on that portion of the wafer. By reducing the proportion of flawed chips from about 70% to 10%, it cut the cost of complex designs like early microprocessors by a similar factor. Processors made with contact aligners cost on the order of $300 in single-unit quantities, whereas the MOS 6502, designed specifically to take advantage of these improvements, cost only $25.
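To make the yield arithmetic concrete, the short sketch below computes the cost of each working chip from an assumed wafer cost, die count and yield; the figures are purely illustrative round numbers, not historical values.

```python
# Illustrative sketch of how die yield drives the cost of each usable chip.
# The wafer cost and dies-per-wafer figures are assumed round numbers chosen
# only to show the arithmetic; they are not historical values.

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_fraction):
    """Cost of one working chip when flawed dies must be discarded."""
    good_dies = dies_per_wafer * yield_fraction
    return wafer_cost / good_dies

# ~30% of dies good (contact aligner era) vs ~90% good (projection aligner era)
contact = cost_per_good_die(wafer_cost=1000, dies_per_wafer=100, yield_fraction=0.30)
projection = cost_per_good_die(wafer_cost=1000, dies_per_wafer=100, yield_fraction=0.90)

print(f"~30% yield: ${contact:.2f} per good die")     # ~$33.33
print(f"~90% yield: ${projection:.2f} per good die")  # ~$11.11, roughly a third
```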
This period also saw considerable experimentation with various word lengths. Early on, 4-bit processors like the Intel 4004 were common, simply because a wider word could not be implemented cost-effectively in the die area available on the small wafers of the era, especially when the majority of chips would be defective. As yields improved, wafer sizes grew, and feature sizes continued to shrink, more complex 8-bit designs such as the Intel 8080 and 6502 emerged. 16-bit processors appeared early in the period but were expensive; by the decade's end, low-cost 16-bit designs like the Zilog Z8000 were beginning to appear. Some unusual word lengths were also produced, including 12-bit and 20-bit, often matching a design that had previously been implemented in multi-chip form in a minicomputer. These had largely disappeared by the end of the decade as minicomputers moved to 32-bit formats.
1980s
As Moore's Law continued to drive the industry towards more complex chip designs, the expected widespread move from 8-bit designs of the 1970s to 16-bit designs almost didn't occur; instead, new 32-bit designs like the Motorola 68000 and National Semiconductor NS32000 emerged that offered far more performance. The only widespread use of 16-bit systems was in the IBM PC, which had selected the Intel 8088 in 1979 before the new designs had matured.
Another change was the move to CMOS gates as the primary method of building complex CPUs. CMOS had been available since the early 1970s; RCA introduced the COSMAC processor using CMOS in 1975. Whereas earlier systems used a single transistor as the basis for each "gate", CMOS used a complementary pair of transistors, essentially making it twice as expensive to build. Its advantage was that its logic depended not on the voltage of a transistor relative to the silicon substrate, but on the difference in voltages between the two transistors, which was detectable at much lower power levels. As processor complexity continued to grow, power dissipation had become a significant concern and chips were prone to overheating; CMOS greatly reduced this problem and quickly took over the market. This was aided by the uptake of CMOS by Japanese firms while US firms remained on nMOS, giving the Japanese industry a major advantage during the 1980s.
Semiconductor fabrication techniques continued to improve throughout the decade. The Micralign, which had "created the modern IC industry", was obsolete by the early 1980s. It was replaced by new steppers, which used reduction optics and extremely powerful light sources to project a large mask onto the wafer at ever-smaller sizes. This technology allowed the industry to break below the former 1 micron limit.
Key home computers in the early part of the decade predominantly used processors developed in the 1970s. Versions of the 6502, first released in 1975, powered the Commodore 64, Apple II, BBC Micro, and Atari 8-bit computers. The 8-bit Zilog Z80 (1976) was at the core of the ZX Spectrum, MSX systems and many others. The 8088-based IBM PC, launched in 1981, started the move to 16-bit, but was soon passed by the 68000-based 16/32-bit Macintosh, then the Atari ST and Amiga. IBM PC compatibles moved to 32-bit with the introduction of the Intel 80386 in late 1985, although 386-based systems remained expensive for some time.
In addition to ever-growing word lengths, microprocessors began to incorporate functional units that had previously been optional external parts. By the middle of the decade, memory management units (MMUs) were becoming commonplace, appearing on designs like the Intel 80286 and Motorola 68030. By the end of the decade, floating-point units (FPUs) were being added as well, first appearing on the Intel 486 in 1989 and followed the next year by the Motorola 68040.
Another change that began during the 1980s was a shift in overall design philosophy with the emergence of the reduced instruction set computer, or RISC. Although the concept was first developed by IBM in the 1970s, the company did not introduce powerful systems based on it, largely for fear of cannibalizing sales of its larger mainframe systems. Market introduction was instead driven by smaller players with designs such as MIPS, SPARC and ARM. These companies did not have access to high-end fabrication like Intel and Motorola, but were able to introduce chips that were highly competitive with those firms' offerings at a fraction of the complexity. By the end of the decade, every major vendor was introducing a RISC design of its own, such as the IBM POWER, Intel i860 and Motorola 88000.
1990s
The 32-bit microprocessor dominated the consumer market in the 1990s. Processor clock speeds increased more than tenfold between 1990 and 1999, and 64-bit processors began to emerge later in the decade. During the decade, the processor and the RAM also stopped sharing a single clock speed. Processors gained a front-side bus (FSB) clock speed used for communication with RAM and other components, while the processor itself typically ran at a multiple of the FSB clock speed. Intel's Pentium III, for example, had an internal clock speed of 450–600 MHz and an FSB speed of 100–133 MHz. Clock speeds cited in this chronology refer to the processor's internal clock.
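As a rough illustration of that relationship, the sketch below treats the core clock as the FSB clock times a fixed multiplier; the multiplier value is an assumption for the example, not a catalogue of actual Pentium III parts.

```python
# Minimal sketch of the core-clock / FSB relationship described above.
# The multiplier is an assumed example value; real Pentium III parts shipped
# with a range of fixed multipliers.

def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """Internal CPU clock expressed as a multiple of the front-side bus clock."""
    return fsb_mhz * multiplier

print(core_clock_mhz(100, 4.5))   # 450.0 MHz core on a 100 MHz FSB
print(core_clock_mhz(133, 4.5))   # 598.5 MHz, i.e. roughly a 600 MHz part
```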
2000s
64-bit processors became mainstream in the 2000s. Microprocessor clock speeds hit a ceiling imposed by heat dissipation. Instead of implementing expensive and impractical cooling systems, manufacturers turned to parallel computing in the form of the multi-core processor. Overclocking had its roots in the 1990s, but came into its own in the 2000s: off-the-shelf cooling systems designed for overclocked processors became common, and the gaming PC emerged as a distinct category. Over the decade, transistor counts increased by about an order of magnitude, continuing the trend of previous decades. Process sizes decreased about fourfold, from 180 nm to 45 nm.
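As a back-of-the-envelope check on those figures (a rough approximation that ignores changes in die size and architecture), a fourfold linear shrink corresponds to roughly a sixteen-fold gain in transistor density:

```python
# Back-of-the-envelope sketch of the scaling figures quoted above.
# Transistor density scales roughly with the square of the linear shrink;
# real processors also changed die sizes and architectures, so this is only
# a rough approximation, not a precise model of the decade's chips.

start_nm, end_nm = 180, 45
linear_shrink = start_nm / end_nm      # 4x smaller features
density_gain = linear_shrink ** 2      # ~16x more transistors per unit area

print(f"Linear shrink: {linear_shrink:.0f}x")
print(f"Approximate density gain: {density_gain:.0f}x")
```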
2010s
A new trend appeared: the multi-chip module built from several chiplets, that is, multiple smaller monolithic dies combined in a single package. This allows higher levels of integration using several smaller chips that are easier to manufacture than one large die.
2020s
See also
Moore's law
Transistor count per chip, chronology
Timeline of instructions per second – architectural chip performance chronology
Tick–tock model, and its successor:
Process–architecture–optimization model