The computer is such an important part of modern life that it is hard to imagine the world without it. Long before the computer, the first tools people used for counting were sticks, stones, and bones. The process of invention that led to the computer began around 3,000 years ago.
The computer started out as the abacus, a wooden rack holding parallel wires strung with beads. By sliding the beads along the wires, anyone can work out simple arithmetic problems.
The Pascaline, often described as the first mechanical calculator, was invented in 1642 by Blaise Pascal. Numbers were entered by turning dials, and the machine could only add. Pascal built it to help his father, a tax collector.
The basic principle of his calculator is still used today in water meters and modern-day odometers. Although it was a substantial improvement over manual calculation, only Pascal himself could repair the device, and it cost more than the clerks it replaced. Then, in 1671, Gottfried Wilhelm von Leibniz designed another calculating machine, which was finally built in 1694. Unlike Pascal's machine, Leibniz's could add, subtract, multiply, and divide. Its speed at multiplication and division was acceptable, but like the Pascaline, it required the operator to understand how to turn the wheels and how to carry out calculations with the machine. Before I continue with the history of computers: if you are new to Inventions Flex, please subscribe below and turn on notifications so that you will be the first to know when we post new videos on ancient and modern tech inventions.
Having said that, let's dive back in. In the 1800s, Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the father of the computer, he conceptualized and invented the first mechanical computer in the early 19th century.
In the 1820s, after convincing the British government to finance his project, he worked for years on his difference engine, a device intended for producing mathematical tables. He built prototypes of portions of the difference engine, but eventually gave up. He then turned to the design of an analytical engine, which was also left unfinished.
However, his proposals for mechanical computers anticipated the modern electronic computer by almost a century, and Charles Babbage has earned his place in history as the father of computing. Another step towards automated computing was the development of punched cards, which were first successfully used with computing machines in 1890 by Herman Hollerith and James Powers of the U.S. Census Bureau.
They developed devices that could read the information punched into the cards automatically, without human help. Because of this, reading errors were reduced dramatically, workflow increased, and, most importantly, stacks of punched cards could be used as easily accessible memory of almost unlimited size. Furthermore, different problems could be stored on different stacks of cards and accessed when needed. The history of computer development is often described in terms of the different generations of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient, and more reliable devices.
Let's look at the different generations of computing devices. First generation computers, 1940 to 1956: the first computers used vacuum tubes for circuitry and magnetic drums for memory. They were often huge, occupying entire rooms, were very expensive to operate, used a great deal of electricity, and generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could only solve one problem at a time.
Input was based on punched cards and paper tape, and output was displayed on printouts. Examples of first-generation computers are UNIVAC, the Universal Automatic Computer, and ENIAC, the Electronic Numerical Integrator and Computer. Second generation computers, 1956 to 1963: transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy efficient, and more reliable than their first-generation predecessors.
Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words.
High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology. The first computers of this generation were developed for the atomic energy industry.
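To make that shift concrete, here is a minimal sketch in Python of the idea described above: the same addition expressed as raw numeric machine code, as symbolic assembly mnemonics, and as a high-level expression. The three-instruction machine, its opcodes, and its mnemonics are invented purely for illustration; they do not correspond to any historical computer.

```python
# Toy illustration of the generational shift in programming: the same
# addition as raw machine code, symbolic assembly, and a high-level expression.
# The instruction set here is hypothetical, not any real machine's.

# Hypothetical opcodes: 1 = LOAD value, 2 = ADD value, 3 = HALT
machine_code = [1, 40, 2, 2, 3]           # keyed in as bare numbers

assembly = [                              # the same program written "in words"
    ("LOAD", 40),
    ("ADD", 2),
    ("HALT", None),
]

MNEMONIC_TO_OPCODE = {"LOAD": 1, "ADD": 2, "HALT": 3}

def assemble(program):
    """Translate symbolic mnemonics into numeric machine code."""
    code = []
    for mnemonic, operand in program:
        code.append(MNEMONIC_TO_OPCODE[mnemonic])
        if operand is not None:
            code.append(operand)
    return code

def run(code):
    """Interpret the toy machine code and return the accumulator."""
    accumulator = 0
    i = 0
    while i < len(code):
        opcode = code[i]
        if opcode == 1:        # LOAD: set accumulator to the operand
            accumulator = code[i + 1]
            i += 2
        elif opcode == 2:      # ADD: add the operand to the accumulator
            accumulator += code[i + 1]
            i += 2
        elif opcode == 3:      # HALT: stop execution
            break
    return accumulator

print(run(machine_code))        # 42, from raw numeric machine code
print(run(assemble(assembly)))  # 42, from symbolic assembly
print(40 + 2)                   # 42, as a high-level language expresses it
```

Running the sketch prints 42 three times; the point is simply that assembly mnemonics and high-level languages say in words what earlier operators had to key in as bare numbers.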
Third generation computers, 1964 to 1971: the development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system,
which allowed the device to run many different applications at one time, with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors. Fourth generation computers, from 1971: the microprocessor brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of a hand.
The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip. In 1981, IBM introduced its first computer for the home user, and in 1984, Apple introduced the Macintosh.
Microprocessors also moved into many areas of life as more and more everyday products began to use microprocessors. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse, and hand-held devices.
Fifth generation computers: fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and nanotechnology will radically change the face of computers in years to come.
The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization. The computer will continue to evolve for generations to come because it has become an essential tool for everyday human activities such as work, gaming, and general entertainment. Now that you know the history of computers, do like, share, and subscribe for more historic inventions. See you in the next video.