Processors for Computers
A processor is the part of the computer system that manipulates data. The first computer processors of the late 1940s and early 1950s performed three main functions and had three main components. They worked in a cycle to fetch, decode, and execute instructions, and they were made up of the arithmetic and logic unit, the control unit, and a small set of storage locations called registers. Today, most processors contain these components and perform these same functions, but since the 1960s they have developed different forms, capabilities, and organization. As with computers in general, increasing speed and decreasing size have marked their development.
The early computing machines of the twentieth century relied on varied and mechanically complex means of handling data. The first digital computers, such as the ENIAC built for the U.S. Army and completed in 1945, relied on processing units constructed from thousands of thermionic valves (or vacuum tubes) plugged into connectors. Valves were connected in circuits and transferred electronic signals to enable mathematical and logical operations. The thermionic valve increased the speed of a computer's calculations. However, the valves required large amounts of power and were very expensive; they were approximately the size of small light bulbs and made computers very large and difficult to maintain and operate.
The data processing of the ENIAC was hampered not only by the fragility, size, and cost of the valves but also by its inability to store a program and data. In 1945, mathematician John von Neumann synthesized the research he and his colleagues conducted on the ENIAC and outlined the design of a new computer in his seminal paper First Draft of a Report on the EDVAC, in which he proposed the stored-program concept as an answer to the ENIAC's computing problems. The paper also described the basic components and functioning of the processor. While this paper articulated the operating concepts of the modern processor, its physical manifestation would take several more years of work, and valves would dominate computer construction into the late 1950s.
The EDVAC processor required circuitry based on binary logic. Precedents for computing machines based on binary logic existed earlier but were not widely known. Machines built in the late 1930s and early 1940s by pioneers such as Konrad Zuse, Alan Turing, John Atanasoff, and Clifford Berry employed a binary system. Claude Shannon, a researcher at the Massachusetts Institute of Technology, also noted the applicability of a computer's system of ones and zeros to the Boolean logic values of TRUE and FALSE. His 1938 paper, A Symbolic Analysis of Relay and Switching Circuits, was an analysis of this relationship between computer logic and Boolean algebra, a mathematical system devised in the mid-nineteenth century by British mathematician George Boole.
Binary logic offered a more efficient construction for computer processors. In the first computers, "logic gates," which were groups of valves in early machines and transistors in later ones, were arranged to form circuits and operated together under a binary system in which information was transmitted as ones and zeros. The information traveled as electric current, and the voltage of the signal determined whether it represented a one or a zero. Each gate received input signals and, according to its configuration, produced a new output signal. The complex circuitry formed by these gates allowed computers to process instructions given by the user. Most early processors contained sets of circuits known as accumulators, registers, and control units: accumulators performed simple arithmetic and stored sums; registers provided temporary storage for data and instructions; and control units directed the processor's operations.

Basing their work on these early processing units, many designers in the 1950s and 1960s developed processors for the large computer companies such as IBM, Honeywell, General Electric, Burroughs, Univac, and Digital Equipment Corporation. However, the physical form and capabilities of the computer processor were most dramatically changed by the invention of the transistor and the integrated circuit. In 1947, at Bell Laboratories in New Jersey, the work of three physicists, John Bardeen, Walter H. Brattain, and William Shockley, resulted in the first working transistor. It was, however, nearly a decade before transistorized computers appeared for commercial use. The U.S. military supported the construction of the first fully transistorized computer, TRADIC (transistor digital computer), built by Bell Laboratories in 1954. Due to the early transistor's cost and unreliability, it was not until the late 1950s that companies devised sturdier devices and built commercial computer processors using transistors.
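The gate-and-circuit scheme described above can be sketched in a few lines of modern code: basic gates operate on ones and zeros, and composing them yields arithmetic circuits. The half-adder example below is illustrative only, not drawn from any particular historical machine.

```python
# Basic logic gates acting on one-bit values (0 or 1).
def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

def NOT(a):
    return 1 - a

def XOR(a, b):
    # XOR composed from the simpler gates, as early circuits did.
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two one-bit inputs; return (sum, carry)."""
    return XOR(a, b), AND(a, b)

# Enumerate the full truth table of the circuit.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Chaining such adders bit by bit is, in essence, how an accumulator performs multi-bit arithmetic.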
Following the construction of commercial transistorized computers, research continued on making electronic components smaller and faster. In 1958 and 1959, Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, working separately to improve microelectronics and circuit design, invented and filed patents on a device called the integrated circuit, or "chip." Integrated circuits were made up of transistors etched onto the surface of a semiconductor, most commonly a thin wafer of silicon. Processors were now made up of groups of integrated circuits placed onto printed circuit boards, which improved their speed and decreased their size. After the introduction of an improved semiconductor technology called MOS (metal oxide semiconductor), several engineers and researchers worked to fit all of a processor's components onto a single integrated circuit. This effort resulted in the single-chip central processing unit (CPU), which heralded a new generation of computer processors and the development of the personal computer.
Intel, a company founded by integrated circuit pioneers Robert Noyce and Gordon Moore, commercially produced the first single-chip CPU, or microprocessor, in 1971. This microprocessor, named the 4004, was designed and constructed by Intel employees Ted Hoff, Stan Mazor, and Federico Faggin. The 4004 included 2,300 transistors and contained the registers and the arithmetic and control units of a basic general-purpose computer processor, and it executed about 60,000 instructions per second.
Increases in speed were constant after the introduction of the Intel 4004, and several companies competed with Intel to produce smaller and faster microprocessors. In accordance with the popularly termed "Moore's law," derived from Gordon Moore's observation that the number of transistors that could be fitted on an integrated circuit doubled roughly every eighteen months to two years, the transistor count of microprocessors increased significantly with each new model. For example, the Pentium 4, introduced by Intel in 2000, contained 42 million transistors and performed 2 billion instructions per second. These increases in processor speed, in conjunction with improvements in computer memory, allowed more and larger software applications to run on a computer. The decreasing cost of manufacturing processors and their increasing compactness have made possible very powerful personal computers.
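Using only the figures given above (2,300 transistors in the 1971 Intel 4004 and 42 million in the 2000 Pentium 4), the implied doubling rate can be checked with simple arithmetic; this is an illustrative back-of-the-envelope calculation, not a fit to historical data.

```python
import math

# Transistor counts and dates as stated in the text.
t_4004, year_4004 = 2_300, 1971        # Intel 4004
t_p4, year_p4 = 42_000_000, 2000       # Pentium 4

# Number of doublings separating the two chips.
doublings = math.log2(t_p4 / t_4004)

# Average months per doubling over the 29-year span.
months = (year_p4 - year_4004) * 12
months_per_doubling = months / doublings

print(f"{doublings:.1f} doublings, one roughly every "
      f"{months_per_doubling:.0f} months")
```

The result, about one doubling every two years, sits at the slower end of the commonly quoted eighteen-month to two-year formulations of Moore's law.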