In order to overcome the drawbacks of mechanical computers, Lee De Forest (1873-1961) invented the triode vacuum tube in 1906, which permits the switching of electrical signals at speeds exceeding those of any mechanical device. Computers built from vacuum tubes were later referred to as first generation computers.
A computer generation refers to a stage of improvement in the development of computers. In each new generation, a major technological development fundamentally changed the way computers operate, resulting in miniaturization, decreased cost, and increased speed, memory, power, efficiency and reliability.
In 1946 two Americans, J. Presper Eckert and John Mauchly, built the ENIAC (Electronic Numerical Integrator and Computer), which used vacuum tubes instead of the mechanical switches of the Mark I.
The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just as light bulbs do. The ENIAC led to other vacuum tube computers like the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC (UNIVersal Automatic Computer).
The first generation computers were made from bulky vacuum tubes that dissipated large amounts of power as heat. Thus, they required elaborate cooling systems. However, even with huge coolers, vacuum tubes still overheated regularly. These first generation computers were often undependable, huge, slow, and expensive to operate and maintain. By present-day standards, the vacuum tubes were also unreliable. These shortcomings hindered further development of the first generation computers.
In 1948 three scientists, John Bardeen, William Shockley, and Walter Brattain, working at AT&T's Bell Labs, invented what would replace the vacuum tube forever. This invention was the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals. Transistors were a tremendous breakthrough in advancing the computer. Also, ferrite cores and magnetic drums replaced cathode-ray tube memories. Computers built with these technologies were referred to as second generation computers.
Due to advances in semiconductor knowledge, the availability of silicon, reduced cost, and mass production methods, the second generation computers became much smaller, cooler and more reliable than the first generation computers. In the quest for further improvement between 1965 and 1974, several transistors were put together to form the integrated circuit (IC). The integrated circuit, sometimes referred to as a semiconductor chip, packs a large number of transistors onto a single wafer of silicon: 1 to 10 transistors – Small Scale Integration (SSI) – and 10 to 100 transistors – Medium Scale Integration (MSI). Robert Noyce of Fairchild Corporation and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.
Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled approximately every two years, shrinking both the size and cost of computers even further and further enhancing their power. Most electronic devices today use some form of integrated circuits placed on printed circuit boards – thin pieces of Bakelite or fiberglass that have electrical connections etched onto them – sometimes called a motherboard. This generation of computers was referred to as third generation computers.
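The doubling trend described above (popularly known as Moore's Law) can be illustrated with a short calculation. The sketch below assumes a hypothetical starting count of 2,300 transistors (the figure usually quoted for the Intel 4004 chip mentioned later) and doubles it every two years; it is an illustration of the trend, not a precise historical record.

```python
def transistors_after(years, initial=2300, doubling_period=2):
    """Project the transistor count after a number of years,
    doubling once every `doubling_period` years."""
    doublings = years // doubling_period
    return initial * 2 ** doublings

# After 20 years there have been 10 doublings:
# 2300 * 2**10 = 2,355,200 transistors.
print(transistors_after(20))
```

Running this for 20 years already yields over two million transistors, which shows why chips moved so quickly from hundreds of transistors to the very large scales described next.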
These third generation computers could carry out instructions in billionths of a second. The size of these machines dropped to the size of small cabinets. In 1975, the microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip: 100 to 1000 transistors – Large Scale Integration (LSI) – and thousands of transistors – Very Large Scale Integration (VLSI). What in the first generation filled an entire room could now fit in the palm of the hand.
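The four integration scales named above can be summarized in a small classifier. This is a minimal sketch using only the transistor-count ranges given in the text; the boundaries between scales are approximate, and the function name is illustrative.

```python
def integration_scale(transistors):
    """Classify a chip by transistors per chip,
    using the approximate ranges given in the text."""
    if transistors <= 10:
        return "SSI"   # Small Scale Integration: 1 to 10 transistors
    elif transistors <= 100:
        return "MSI"   # Medium Scale Integration: 10 to 100
    elif transistors <= 1000:
        return "LSI"   # Large Scale Integration: 100 to 1000
    else:
        return "VLSI"  # Very Large Scale Integration: thousands and beyond

print(integration_scale(8))      # SSI
print(integration_scale(50000))  # VLSI
```

By this classification, the early chips of the third generation were SSI and MSI devices, while the microprocessors of the fourth generation fall into the LSI and VLSI ranges.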
The Intel 4004 chip, developed in 1971 by Ted Hoff, an engineer at Intel, located all the components of the computer – from the central processing unit and memory to input/output controls – on a single chip. In 1981, IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.
Microprocessors also moved out of the realm of desktops (computers that can be installed on desks) and into many areas of life as more and more everyday products began to use microprocessors. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.
Fourth generation computers also saw the development of handheld devices. Today there are various microprocessors with amazing performance, which suggests that there is no end in sight for computer generations. Some of these microprocessors may measure no more than 3×3 inches yet contain thousands of components, thanks to Very Large Scale Integration technology.
The present and future of computer development lie in the fifth generation computers, which are based on artificial intelligence. Though still in its infancy, some successes such as voice recognition and imitation of human reasoning are being used today. Parallel processing (a computer harnessing the power of many processors to work as one) and superconductor technology (the development of materials that offer zero resistance to the flow of current) will dramatically change the fourth generation computers in the years to come. The capability to learn, self-organization, and the ability to respond to natural languages (e.g. Hausa, Igbo, Yoruba, Urhobo) are some of the goals of the fifth generation computers. It should be noted that the driving forces that necessitated the continual development of computers are: (1) speed, (2) size, (3) cost and (4) efficiency.