HISTORY OF COMPUTERS








In old times, man used to count his animals with the help of things around him, such as stones, bones, and fingers. Slowly he learnt to do calculations like addition, subtraction, multiplication, division, and more complicated operations. As the complexity of calculations increased, he needed devices to help him, and several calculating machines were developed to fulfil this need.

SOME MAJOR MACHINES THAT WERE USED IN CALCULATIONS

ABACUS: The abacus was the first calculating device in the world.

NAPIER'S BONES: Napier's bones is a manually operated calculating device created by John Napier of Merchiston for the calculation of products and quotients of numbers. The method was based on Arab mathematics and the lattice multiplication used by Matrakci Nasuh in the Umdet-ul Hisab, and on Fibonacci's work in his Liber Abaci. The technique was also called rabdology. The strips were made from animal bone, which is why they were called Napier's bones. There are 10 types of strips, on which the multiples of 0 to 9 are printed in such a way that the tens digit of each product lines up with the units digit of the product on the adjacent strip; multiplication then becomes easy, because you only add these two digits along each diagonal, as the sketch below illustrates.
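To make the trick concrete, here is a minimal Python sketch (not part of the original article; the function names are illustrative) that simulates Napier's-bones-style strips. Each strip holds the multiples 0 to 9 of one digit, split into a tens/units pair, and a multi-digit number is multiplied by a single digit by adding each strip's units digit to the tens digit of the strip on its right, mimicking the diagonal addition done on the physical bones.

    # Simulate Napier's-bones-style strips: each strip stores the multiples
    # 0-9 of one digit as (tens, units) pairs, like the diagonal split on a bone.

    def make_strip(digit):
        """Return the strip for one digit: its multiples 0..9 as (tens, units) pairs."""
        return [divmod(digit * row, 10) for row in range(10)]

    def bones_multiply(number, multiplier_digit):
        """Multiply a multi-digit number by a single digit, Napier's-bones style."""
        strips = [make_strip(int(d)) for d in str(number)]
        row = [strip[multiplier_digit] for strip in strips]  # read one row across the strips

        digits = []
        carry = 0
        prev_tens = 0
        # Work right to left: add each strip's units digit to the tens digit of
        # the strip on its right -- the diagonal addition done on the real bones.
        for tens, units in reversed(row):
            total = units + prev_tens + carry
            digits.append(total % 10)
            carry = total // 10
            prev_tens = tens
        lead = prev_tens + carry          # leftmost tens digit plus any final carry
        if lead:
            digits.append(lead)
        return int("".join(str(d) for d in reversed(digits)))

    print(bones_multiply(425, 6))   # 2550
    print(425 * 6)                  # 2550, same result by ordinary multiplication

Multiplying by a multi-digit number works the same way: repeat the single-digit step for each digit of the multiplier and add the shifted partial products, just as in lattice multiplication.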

HISTORY OF COMPUTERS


The computer as we know it today had its beginning with a 19th-century English mathematics professor named Charles Babbage.
He designed the Analytical Engine, and it is on this design that the basic framework of today's computers is based.

Generally speaking, computers can be classified into five generations. Each generation lasted for a certain period of time, and each gave us either a new and improved computer or an improvement to the existing computer.

FIRST GENERATION: 1937 – 1946 –

In 1937 the first electronic digital computer was built by Dr. John V. Atanasoff and Clifford Berry. It was called the Atanasoff-Berry Computer (ABC). In 1943 an electronic computer named the Colossus was built for the military. Other developments continued until, in 1946, the first general-purpose digital computer, the Electronic Numerical Integrator and Computer (ENIAC), was built. It is said that this computer weighed 30 tons and had 18,000 vacuum tubes, which were used for processing. When this computer was turned on for the first time, lights dimmed in sections of Philadelphia. Computers of this generation could only perform a single task, and they had no operating system.

SECOND GENERATION: 1947 – 1962 –

This generation of computers used transistors instead of vacuum tubes, which made them more reliable. In 1951 the first computer for commercial use was introduced to the public: the Universal Automatic Computer (UNIVAC I). In 1953 the International Business Machines (IBM) 650 and 700 series computers made their mark in the computer world. During this generation over 100 computer programming languages were developed, and computers had memory and operating systems. Storage media such as tape and disk were in use, as were printers for output.

THIRD GENERATION: 1963 – PRESENT –

The invention of the integrated circuit brought us the third generation of computers. With this invention, computers became smaller, more powerful, and more reliable, and they were able to run many different programs at the same time. In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981 IBM introduced the personal computer (PC) for home and office use. Three years later Apple gave us the Macintosh computer with its icon-driven interface, and the 1990s gave us the Windows operating system.

FOURTH GENERATION: MICROPROCESSORS (1971 – PRESENT)

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.


FIFTH GENERATION: ARTIFICIAL INTELLIGENCE (PRESENT AND BEYOND)

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality.

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.



