The History Of Microprocessors

On November 15, 2021, the Intel 4004, widely regarded as the first true commercial microprocessor, turned 50: here is how the chip market has changed from then to now. On November 15, 1971, with an advertisement published in Electronic News, Intel launched its first commercial single-chip microprocessor on the market: the Intel 4004. Its internal design was greatly simplified yet, for the time, remarkably capable: a family of just four integrated circuits with which all the essential operations of a computer could be performed.

It has been a long time since then, and Intel’s 4004 looks like a dinosaur today. Yet the significance of this chip in the history of information technology and electronics is now widely recognized by specialists in the field. It is precisely with the Intel 4004 that the history of the microprocessor must begin.

Intel 4004: The First Chip

The Intel 4004, truth be told, was not born in 1971 but two years earlier: in 1969, the Japanese company Busicom asked Intel to create a new chip for its latest calculator model (the “personal computer” did not exist yet). Intel’s revolutionary idea was to put the integrated circuits needed for the calculations into a single chip, the 4004, made up of 2,250 transistors.

This chip was made with a 10-micrometer (i.e., 10,000-nanometer) manufacturing process, consumed about 1 watt of power, and could address up to 4 KB of memory. It had a single core, which ran at 740 kHz (i.e., 0.74 MHz, or 0.00074 GHz) and could execute only one thread at a time. The Intel 4004 needed three support chips: the Intel 4001 (the ROM), the Intel 4002 (the RAM), and the Intel 4003 (a shift register for I/O). Busicom liked the idea and used the chip exclusively for two years. In 1971, the exclusivity agreement was not renewed, and Intel was free to sell its 4004 to everyone. The revolution had begun.

From 4004 To 80286

The Intel 4004 was a 4-bit chip, and the Santa Clara company immediately understood that the idea of the integrated microprocessor had to be developed further, with more complex architectures. Just five months after the 4004 went on sale, in early 1972, Intel introduced the 8-bit 8008, which was an immediate success and largely laid the foundations of the home computer market: the “personal computers” that everyone has at home today were born.

The Intel 8008 was soon followed by the 8080 (still 8-bit) in April 1974 and by the 16-bit Intel 8086 in 1978. Mark this date: it was a turning point, as we will explain shortly. The 8086 was the first microprocessor of Intel’s 16-bit family, followed by the Intel 8088, Intel 80186, Intel 80188, and Intel 80286, all 16-bit but increasingly complex and increasingly powerful. The Intel 80286, launched in 1982, pushed the 16-bit architecture to its technical limits: in its fastest version it had a clock frequency of 12 MHz and, thanks to its 24-bit address bus, could manage up to 16 MB of RAM, which, at the time, was a lot.

The Microsoft MS-DOS operating system, born in 1981, ran in the processor’s real mode and could not address that much memory directly; reaching RAM above the 1 MB boundary required a logical construct called “extended memory.” Still, the hardware and software duo formed by the Intel 80286 (or, as everyone called it, “the 286”) and Microsoft MS-DOS marked the beginning of the iron-clad alliance between Intel and Microsoft that continues today and that has shaped the history of information technology over the last forty years.
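
As an aside on that memory limit: below is a minimal sketch, in Python, of the real-mode segment:offset arithmetic that capped MS-DOS at roughly 1 MB. The function name and printout are illustrative only; the addressing scheme itself is the one the 286 inherited from the 8086.

    def real_mode_address(segment: int, offset: int) -> int:
        """Translate a 16-bit segment:offset pair into a physical address."""
        # Real mode forms the address as segment * 16 + offset,
        # which can never exceed 21 bits, i.e., just above 1 MB.
        return (segment << 4) + offset

    print(f"Real-mode ceiling:  {2 ** 20:,} bytes (1 MB)")
    print(f"80286 address bus:  {2 ** 24:,} bytes (16 MB)")
    print(f"FFFF:FFFF maps to:  {real_mode_address(0xFFFF, 0xFFFF):#x}")

The gap between those two ceilings (2^20 versus 2^24 bytes) is precisely the memory that “extended memory” techniques were invented to reach.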

Intel 80386 To Present

With DOS running on the 286, affordable personal computing was born, within reach of the middle class (American first, global shortly after), and Intel established itself as the electronics giant on which not only the present of the time (the eighties) but also the future (up to our days) would be built. Investment in research accelerated as never before in the electronics sector. In 1985, Intel brought its first 32-bit processor to market: the Intel 80386, which, in its fastest versions, would eventually reach a clock frequency of 40 MHz.

Intel’s 32-bit history starts with the 386, passes through the 80486, and then makes another massive leap in performance, first with the Pentium and later with the Core lines. The list of Intel microprocessors from the 386 onward, not counting variants (such as the “DX” and “SX”) and the low-power notebook versions, is very long:

  1. 80386
  2. 80486
  3. Pentium
  4. Pentium Pro
  5. Pentium II
  6. Mobile Pentium II PE
  7. Celeron
  8. Pentium III
  9. Pentium III-M
  10. Pentium 4
  11. Pentium 4-M
  12. Mobile Pentium 4
  13. Pentium 4 EE
  14. Celeron D
  15. Pentium D
  16. Pentium EE
  17. Core 2 Duo
  18. Core 2 Quad
  19. Core 2 Extreme
  20. Core i3
  21. Core i5
  22. Core i7
  23. Core i9

But we must not make the mistake of confusing the history of the microprocessor with the history of Intel: other companies have also played a fundamental role, and the one that most deserves a mention is certainly AMD (Advanced Micro Devices), the first company to bring a CPU with a 1 GHz clock frequency to market.

AMD Vs. Intel

While AMD has long been Intel’s number-one challenger, and in many ways still is, it is also true that Intel would not have grown into the giant it is today without AMD. To understand why, we have to go back to the Intel 8086. In 1976, AMD signed a contract with Intel to license the 8080 chips; the agreement also allowed each company to use some of the other’s patents. In 1978, however, Intel had no intention of renewing the contract with AMD for the 8086.

However, IBM, Intel’s primary customer at the time, required all its chip suppliers to have a “second source”: the same chip also had to be produced by a second company, so that IBM would not face supply problems in the event of batches of faulty processors or a production stoppage at one supplier (a “second source” policy that, today, might have prevented the famous chip crisis). To keep IBM as a customer, Intel was forced to renew the agreement with AMD, and AMD continued for several years to produce Intel’s chips from the same designs, until it realized it could go it alone, and even challenge Intel.

AMD’s first real challenge to Intel dates back to the days of the 486 CPUs: AMD produced several models that could be used as an upgrade to the 386 without changing the motherboard or other components. This allowed AMD to gain market share and profits right in the golden age of computer sales. The second challenge came in 1997 with the K6 processor, fully compatible with motherboards designed for the Intel Pentium, with very similar performance but a much lower price. The third and decisive challenge came in 1999 with the AMD Athlon, the first commercial processor to reach (in 2000) a clock frequency of 1 GHz.

It was a massive setback for Intel and the beginning of a dark period that lasted years. AMD often managed to stay a step ahead, offering CPUs that were better in at least one fundamental respect: cost, clock speed, or power consumption. From that moment on, to put it in plain rather than technical terms, Intel and AMD have fought hard both in the market and in the courts. The winners, however, have always been consumers, who have seen increasingly advanced, powerful, and cheap microchips arrive on the market at an impressive pace.
