Early computers were big machines, built in factories and delivered by truck, which loaded their programs from drum memories. Their bulk created a demand for small, affordable processors. The Central Processing Unit (CPU) is the main component of a computer and acts as its brain, and the microprocessor was what was introduced to solve the problem of the big, cumbersome machines. The heart of any modern computer is the microprocessor: a chip consisting of hundreds of thousands of transistors and other elements arranged into distinct functional units. It fetches and executes the instructions given to the computer, whether they come from the keyboard, the mouse, or a stored program. This article critically examines the history of the CPU from its inception to the designs still to come.
The Intel 4004, released in 1971, was the first commercial microprocessor; Intel designed it for the Japanese calculator maker Busicom. It handled data in chunks of four bits, which over time proved insufficient. The need for wider words led Intel in 1972 to the eight-bit 8008, a different architecture rather than a simple scaling-up of the 4004. The address space of this microprocessor was limited to just 16 kilobytes, more RAM than people of the time could afford anyway. Two years later, Intel introduced the 8080, with a larger memory capacity than the 8008: a 64-kilobyte address space and roughly a tenfold increase in execution speed. In the same period, Motorola introduced the 6800, whose performance was comparable to the Intel 8080's. Serious microcomputers used the 8080 as their core, a lineage that led to the Intel 8088 later used in the IBM PC, while Motorola's 6800 line influenced the MOS 6502 that powered the Apple II (Drinkypoo, 2002).
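The paragraph above notes that four-bit chunks "over time proved insufficient." The practical consequence is that a 4-bit machine can only hold values 0 through 15 in one register, so any larger number must be processed one nibble at a time, with the carry propagated by software or microcode. The following is a minimal illustrative sketch of that idea in Python; the function name and data layout are invented for the example and are not from any 4004 manual.

```python
# A 4-bit word ("nibble") holds only the values 0..15.
NIBBLE_MASK = 0xF  # low four bits

def add_nibbles(a_nibbles, b_nibbles):
    """Add two numbers stored as lists of 4-bit digits,
    least-significant nibble first, propagating the carry
    across steps the way a 4-bit ALU must."""
    result, carry = [], 0
    for a, b in zip(a_nibbles, b_nibbles):
        total = a + b + carry
        result.append(total & NIBBLE_MASK)  # keep the low 4 bits
        carry = total >> 4                  # carry into the next nibble
    if carry:
        result.append(carry)
    return result

# 200 = 0xC8 -> nibbles [8, 12]; 100 = 0x64 -> nibbles [4, 6]
# 300 = 0x12C -> nibbles [12, 2, 1]
print(add_nibbles([8, 12], [4, 6]))  # -> [12, 2, 1]
```

An eight-bit design like the 8008 halves the number of such steps for the same data, which is one concrete reason the move from four to eight bits mattered.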
Use of the Intel 8080 grew through the 1970s until 1979, when Intel launched the 8088, the processor the first IBM PC would use. This 16-bit microprocessor changed the face of computing; the PC became a tool for mainstream business. The related Intel 8086 had an advantage over the others in its one-megabyte address space, which let larger documents be read from disk and held in RAM for quicker access and manipulation. But as address spaces kept expanding and microprocessor cores kept getting faster, keeping memory up with them became a problem that had to be addressed. Because memories that are large and inexpensive cannot run as fast as RAM chips that are small but fast, computer engineers resorted to inserting a small, fast memory between the large main RAM and the microprocessor so that the fastest CPUs could run at full speed. This smaller memory is what is referred to as the cache RAM, and it allows the microprocessor to execute instructions at full speed whenever the data it needs is already cached (Davis, 2005).
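The cache idea described above can be sketched in a few lines: each chunk of main memory maps to one slot in a much smaller, faster memory, and an access is a fast "hit" if the slot already holds that chunk, or a slow "miss" that fetches it from big RAM. The sketch below is illustrative only; real caches are hardware, and the sizes and direct-mapped policy here are assumptions chosen for brevity, not details from the sources cited.

```python
# Minimal sketch of a direct-mapped cache; sizes are invented for the demo.
CACHE_LINES = 4    # number of cache slots (assumption)
LINE_SIZE = 16     # bytes covered by one slot (assumption)

class DirectMappedCache:
    def __init__(self):
        self.tags = [None] * CACHE_LINES  # which memory block each slot holds
        self.hits = self.misses = 0

    def access(self, address):
        line = (address // LINE_SIZE) % CACHE_LINES  # slot this address maps to
        tag = address // (LINE_SIZE * CACHE_LINES)   # which block of memory
        if self.tags[line] == tag:
            self.hits += 1        # fast path: data already in the small memory
        else:
            self.misses += 1      # slow path: fetch the line from big RAM
            self.tags[line] = tag

cache = DirectMappedCache()
for _ in range(2):                # a loop re-reading the same 64 bytes
    for addr in range(0, 64, 4):
        cache.access(addr)
print(cache.hits, cache.misses)   # -> 28 4
```

Because programs tend to re-use recently touched data, most accesses hit the small fast memory, which is why the cache lets the CPU run near full speed even though main RAM is slower.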
The 1980s, the dawn of the digital age, were when many things in the history of computers happened: almost all the chip families used now were hatched then. These include the 286, arguably the most crippled chip Intel has ever made; the 68020, the first fully 32-bit member of Motorola's 68000 family; the ARM CPUs; and the 386 and 486, which brought PCs into the 32-bit era. This decade also saw the first clones of the Intel CPUs that had been introduced in the 1970s. Many other processors were designed in this decade, some of which never even made it to market (Drinkypoo, 2002).
The nineties were the period when home computers became popular. The MIPS R4000 came into use in workstations, especially in the movie industry for film production. This is also when IBM and Motorola, with assistance from Apple, joined forces to develop their new PowerPC architecture. Intel, meanwhile, produced big hits: the Pentium, followed by the Pentium MMX, Pentium II, and Pentium III. This period saw AMD enter with a number of CPUs built on RISC-style cores that translated x86 instructions (the K5, K6, and Athlon), setting AMD and Intel battling for CPU supremacy. Machines and chips produced during this time included IBM's RS/6000, introduced in 1990 and built on the POWER architecture that later formed the basis of PowerPC; its POWER CPU was among the first superscalar processors, capable of executing multiple instructions at once. Intel's 486SX, produced in 1991, was a 486 processor with no onboard FPU, introduced as a low-cost budget part. AMD's K6-III, the last revision in the K6 line, improved the speed of multimedia functions and made new clock rates available. And these are just a few of them (Drinkypoo, 2002).
The race for more efficient CPUs continued into the 21st century, where we have seen AMD and Intel competing directly and strongly. Both have 64-bit designs with instruction sets based on x86. Nearly everyone has gone 64-bit nowadays, and those who have not are planning to, meaning the 64-bit era has well and truly arrived. CPUs introduced during this time include Intel's Pentium 4, which is less efficient per clock than the Pentium III but runs at considerably higher clock rates, with bus speeds reaching as far as 533 MHz to compete with AMD's Athlons; and the V-Dragon, a 32-bit RISC design developed in China with help from IBM, clocking about 200 to 260 MHz; among many others produced by different companies (Krazit, 2006).
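The widening address spaces that run through this history, from the 8080's 64 kilobytes through the 8086's one megabyte to today's 64-bit designs, follow directly from the number of address bits: each extra bit doubles the reachable memory. A small Python sketch of that arithmetic (the function name is invented for illustration):

```python
# Bytes of memory addressable with a given address width. The widths
# chosen match chips from the text: the 8080 used 16-bit addresses
# (64 KB), the 8086 used 20-bit addresses (1 MB), and modern designs
# use 32 or 64 bits.
def addressable_bytes(bits):
    """Return how many distinct byte addresses fit in `bits` bits."""
    return 2 ** bits

for bits in (16, 20, 32, 64):
    print(f"{bits}-bit addresses reach {addressable_bytes(bits):,} bytes")
```

The jump from 32 to 64 bits multiplies the address space by over four billion, which is why the move to 64-bit designs removed memory capacity as a practical ceiling for the foreseeable future.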