The history of computers dates back to around 3000 BC, when the abacus was invented in Babylon. Around 1800 BC the Babylonians devised algorithms for problems involving numbers, and around 500 BC the Egyptians created the bead-and-wire abacus. The use of computing trays began in Japan around 200 BC, and around 1000 AD Gerbert of Aurillac (later Pope Sylvester II) introduced a new form of abacus.
Computer-related facts
The computer in its earliest form began thousands of years ago with the abacus, a wooden rack holding two parallel wires strung with beads; by sliding the beads along the wires, simple arithmetic problems could be solved. Another early instrument was the astrolabe, which was used for navigation and astronomical calculations.
The first digital calculating machine was invented by Blaise Pascal in 1642 AD. Numbers were entered by turning dials, but the machine could only perform addition. Another machine, begun in 1671 AD and finally completed in 1694, was invented by Gottfried Wilhelm von Leibniz. Leibniz devised the stepped-drum gear mechanism for entering the addend digits, a mechanism that continued to be used long afterwards.
- The abacus was invented in Babylon around 3000 BC.
- The Babylonians invented algorithms for number problems in 1800 BC.
- In 500 BC, the Egyptians created the bead-and-wire abacus.
- The use of computing trays began in Japan in 200 BC.
- Around 1000 AD, a new form of abacus was introduced by Gerbert of Aurillac (later Pope Sylvester II).
- In 1617 AD, the Scottish mathematician John Napier described a system for performing division by subtraction and multiplication by addition, the principle behind logarithms (see the sketch after this list).
- The slide rule was developed by William Oughtred in 1622 AD.
- In 1624 AD, the first four-function calculating clock was invented by Wilhelm Schickard of the University of Tübingen.
- The first digital calculating machine was made by Blaise Pascal of Paris in 1642 AD.
- In 1752, Benjamin Franklin demonstrated with his kite experiment that lightning is a form of electricity.
- In 1876, Alexander Graham Bell invented the telephone.
- In 1886, a commercially successful mechanical calculating machine was developed by William S. Burroughs.
- In 1889, Herman Hollerith was issued a patent for his tabulating machine.
- In 1896, Hollerith founded the Tabulating Machine Company, through which he manufactured sorting machines.
- In 1911, the Tabulating Machine Company merged with the Computing Scale Company and the International Time Recording Company to form the Computing-Tabulating-Recording Company, which was later renamed IBM.
- In 1921, Karel Čapek's play R.U.R. (Rossum's Universal Robots) introduced the word "robot", derived from a Czech word for forced labour, to describe mechanical workers.
- In 1925, the Differential Analyzer, a large-scale analog calculator, was created by Vannevar Bush at MIT.
- In 1927, the first public radiotelephone service was opened between London and New York.
- In the 1930s, Konrad Zuse of Germany built the Z1, his first mechanical calculator (completed in 1938).
- In 1936, the English mathematician Alan M. Turing described a theoretical machine capable of computing any computable function.
- In 1937, George Stibitz built the first binary calculator at Bell Telephone Laboratories.
- In 1938, the Hewlett-Packard Company was founded to make electronic equipment.
- In 1940, color broadcasting on television began.
- Colossus Mark-2 was manufactured in England in 1944 AD.
- The Association for Computing Machinery was formed in 1947 AD.
- In 1948, I.B.M. released the 604 electronic calculator.
- In 1951, the first Joint Computer Conference was organized.
- In 1953, the first high-speed printer was developed by Remington Rand for Univac.
- In 1958, N.E.C. developed Japan's first electronic computers, the NEC-1101 and NEC-1102.
- In 1970, the PDP-11/20, a 16-bit minicomputer, was launched by Digital Equipment Corporation.
- An 8-bit microprocessor, the Intel 8008, was introduced by Intel in 1972.
- In 1976, Perkin-Elmer and Gould SEL launched superminicomputers on the market.
- In 1977, following the establishment of Apple Computer, the Apple II personal computer was introduced.
- In 1980, the total number of computers in the United States reached more than 1 million.
- In 1983, the total number of computers in the United States crossed 10 million.
- In 1992, the Windows for Workgroups operating system was launched by Microsoft.
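The idea behind Napier's "multiplication by addition" can be seen with logarithms: since log(ab) = log(a) + log(b), a product can be found by adding two logarithms and converting back. Below is a minimal Python sketch of that principle; the function name and the sample numbers are illustrative only and are not taken from Napier's original tables.

```python
import math

def multiply_by_adding(a, b):
    """Multiply two positive numbers using an addition of logarithms,
    illustrating the principle behind Napier's logarithms."""
    # log(a * b) = log(a) + log(b), so adding the logs and exponentiating
    # recovers the product (up to floating-point rounding).
    return math.exp(math.log(a) + math.log(b))

print(multiply_by_adding(36, 42))  # approximately 1512.0
print(36 * 42)                     # 1512, for comparison
```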

Stages of computer development
1. First generation (1940–1956): Vacuum tubes: First-generation computers used vacuum tubes for circuitry and magnetic drums for memory, which made them very large, often taking up an entire room. They were expensive to operate, consumed a great deal of electricity, generated a lot of heat, and sometimes broke down as a result. First-generation computers relied mainly on machine language for operation and could solve only one problem at a time. Input was based on punched cards and paper tape, while output appeared only on printouts.
UNIVAC (Universal Automatic Computer) and ENIAC (Electronic Numerical Integrator and Computer) are examples of first-generation computing devices. The UNIVAC was the first commercial computer, delivered to the United States Census Bureau in 1951.
Vacuum Tube Circuit
2. Second generation (1956–1963): Transistors: Transistors replaced vacuum tubes, and with this change second-generation computers came into existence. The transistor was invented in 1947, but its widespread use came only in the late 1950s. It was far superior to the vacuum tube and allowed computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Although transistors still generated a great deal of heat, which could damage the computer, they were a vast improvement over vacuum tubes. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic (assembly) languages, which allowed programmers to specify instructions in words. High-level programming languages such as COBOL and FORTRAN were also developed at this time. These were also the first computers to store their instructions in memory, which moved from magnetic-drum to magnetic-core technology. The first computers of this generation were developed for the nuclear energy industry.
Transistor based circuits
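To give a feel for the difference between machine language and symbolic (assembly) language described above, here is a small Python sketch of a toy assembler. The mnemonics and numeric opcodes are entirely hypothetical and are not taken from any real second-generation machine; the point is only that words such as LOAD or ADD are translated into the numeric codes the hardware actually executes.

```python
# Toy illustration of symbolic (assembly) language vs. machine language.
# The mnemonics and opcode numbers below are hypothetical, not from any real machine.

OPCODES = {
    "LOAD": 0x01,   # load a value into the accumulator
    "ADD": 0x02,    # add a value to the accumulator
    "STORE": 0x03,  # store the accumulator into a memory address
    "HALT": 0x00,   # stop the program
}

def assemble(program):
    """Translate symbolic instructions into numeric machine words."""
    machine_code = []
    for line in program:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((OPCODES[mnemonic], operand))
    return machine_code

# A programmer writes readable words ...
source = ["LOAD 7", "ADD 5", "STORE 100", "HALT"]

# ... and the assembler turns them into the numbers the machine understands.
for opcode, operand in assemble(source):
    print(f"{opcode:02X} {operand:02X}")
```

In a real second-generation system this translation was performed by an assembler program supplied with the machine, and the numeric codes and word layouts differed from one computer to another.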
3. Third generation (1964–1971): Integrated circuits: The defining feature of third-generation computers was the use of integrated circuits. Transistors were miniaturized and placed on silicon chips, called semiconductors, which dramatically increased the speed and capacity of computers.
In place of punched cards and printouts, third-generation computers introduced users to keyboards and monitors, as well as to an operating system, which allowed many different applications to run at the same time under a central program. For the first time computers became accessible to a mass audience, because they were smaller and cheaper than their predecessors.
Integrated Circuits
4. Fourth generation (1971 to present): Microprocessors: Fourth-generation computers came into existence with the microprocessor, in which thousands of integrated circuits were built onto a single silicon chip. Where first-generation computers filled entire rooms, a computer could now fit in the palm of a hand. The Intel 4004 chip, developed in 1971, placed all the essential components of a computer, from the central processing unit and memory to input and output controls, on a single chip.
In 1981, IBM introduced its first personal computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors have since spread beyond desktop computers into many areas of life and are used in everyday products.
These small computers are very powerful. They could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of the mouse, the graphical user interface (GUI), and hand-held devices.
5. Fifth generation (present and beyond): Artificial intelligence: Fifth-generation computing, based on artificial intelligence, is still in development, although some tools such as voice recognition are already in use today. Superconductors and parallel processing are helping to make artificial intelligence a reality, and quantum computation, molecular technology, and nanotechnology are expected to change the face of computing in the coming years. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.
Types of computers
- Personal computer (PC)
- Mainframe
- Supercomputer
- Laptop
- Microcomputer
- Personal digital assistant (PDA) or palmtop
- Analog
- Digital
- Hybrid
When and by whom was India's first computer made? Know every detail here
Everyone knows about the world's first computer, but when did the first computer come to India, and who made it? The first computer developed in India was Siddhartha, built by the Electronics Corporation of India (ECIL). It was first installed on 16 August 1986 at the Bangalore Head Post Office.
India's first supercomputer
Computers had arrived in India, but taking the country's technology to a higher level was not possible without supercomputers. So what is a supercomputer? India's first supercomputer was the Param 8000, which arrived in 1991; when it was fully built, it was said to be the second most powerful computer in the world. The Param 8000 was made by C-DAC (Centre for Development of Advanced Computing), whose director was Dr. Vijay Pandurang Bhatkar.