Transcript for:
The Milestones of Computing Evolution

Hi, thanks for tuning into Singularity Prosperity. This video is the first in a multi-part series discussing computing. In this video, we'll be discussing the evolution of computing, more specifically, the evolution of the technologies that have brought about the modern computing era. The purpose of this video is to appreciate how fast technology is evolving and the people who have brought us to this point. Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts. The computer is no different. The bits and pieces of the computer, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.

We start as early as 3000 BC with the Chinese abacus. How is this related to computing, you ask? The abacus was one of the first machines humans ever created for counting and calculating. Fast forward to 1642, and the abacus evolves into the first mechanical adding machine, built by mathematician and scientist Blaise Pascal. This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs due to progress.

Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz, a pioneer in many fields, most notably known for his contributions to mathematics, and considered by many the first computer scientist. Inspired by Pascal, he created his own calculating machine, able to perform all four arithmetic operations. He was also the first to lay down the concepts of binary arithmetic, the way all technology communicates today, and he even envisioned a machine that used binary arithmetic. From birth, we are taught how to do arithmetic in base 10, and for most people that's all they are concerned with, the numbers 0 to 9. However, there are an infinite number of ways to represent information: octal is base 8, hexadecimal is base 16 and is often used to represent colours, base 256 is used for encoding, and the list goes on. Binary is base 2, represented by the numbers 0 and 1; a short sketch after this section shows the same value written in several of these bases. We'll explore later in this video why binary is essential for modern computing.

Back on topic, progressing to the 1800s, we meet Charles Babbage. Babbage is known as the father of the computer for the design of his mechanical calculating engines. In 1820, Babbage noticed that many computations consisted of operations that were regularly repeated and theorized that these operations could be done automatically. This led to his first design, the Difference Engine. It would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table. In 1830, Babbage stopped work on his Difference Engine to pursue his second idea, the Analytical Engine. Elaborating on the Difference Engine, this machine would be able to execute operations in an order decided by conditional control rather than a fixed sequence, store values in memory, and read instructions from punch cards, essentially making it a programmable, mechanical computer. Unfortunately, due to lack of funding, his designs never became reality, but if they had, they would have sped up the invention of the computer by nearly 100 years.
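Here is the promised sketch of number bases, using Python purely as an illustration; the value 202 is an arbitrary example, not a number from the video:

```python
# One value, four common bases: the underlying quantity never changes,
# only the notation used to write it down.
value = 202

print(bin(value))   # binary      (base 2)  -> 0b11001010
print(oct(value))   # octal       (base 8)  -> 0o312
print(value)        # decimal     (base 10) -> 202
print(hex(value))   # hexadecimal (base 16) -> 0xca

# Hexadecimal colours work the same way: 0xFFFFFF is simply
# 16777215 written in base 16 (full red, green, and blue).
print(int("FFFFFF", 16))  # -> 16777215
```

The same trick works for any base; binary just happens to be the one that maps cleanly onto two-state hardware like relays, vacuum tubes, and transistors, which is why it shows up again and again in this video.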
Also worth mentioning is Ada Lovelace, who worked very closely with Babbage. She is considered the world's first programmer and came up with an algorithm to calculate Bernoulli numbers that was designed to work with Babbage's machine. She also outlined many fundamentals of programming, such as data analysis, looping, and memory addressing.

Ten years prior to the turn of the century, with inspiration from Babbage, American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as the census tabulator. This machine would read US census data from punch cards, up to 65 at a time, and tally up the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM. To briefly explain how punch cards work: when a card is fed into the machine, the machine attempts to make electrical connections through it, and the positions of the holes determine which connections are completed and therefore what input is read. To put data onto a punch card, you could use a key punch machine, aka the first iteration of a keyboard.

The 1800s were a period when the theory of computing began to evolve and machines started to be used for calculations, but the 1900s are where we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950. In 1936, Alan Turing proposed the concept of a universal machine, later dubbed the Turing machine, capable of computing anything that is computable. Up to this point, machines were only able to do the specific tasks their hardware was designed for. The concept of the modern computer is largely based on Turing's ideas.

Also starting in 1936, German engineer Konrad Zuse invented the world's first programmable computer. This device read instructions from punched tape and was the first computer to use Boolean logic and binary to make decisions through the use of relays. For reference, Boolean logic is simply logic that results in either a true or false output, or, in binary terms, a 1 or a 0; there is a short truth-table sketch after this section, and we'll be diving into Boolean logic deeper later in this video. Zuse would later use punch cards to encode information in binary, essentially making them the first data storage and memory devices. In 1942, with the Z4, Zuse also released the world's first commercial computer. For these reasons, many consider Zuse the inventor of the modern-day computer.

In 1937, Howard Aiken, with his colleagues at Harvard and in collaboration with IBM, began work on the Harvard Mark I, a programmable calculator inspired by Babbage's Analytical Engine. This machine was composed of nearly 1 million parts, had over 500 miles of wiring, and weighed nearly 5 tons. The Mark I had 60 sets of 24 switches for manual data entry and could store 72 numbers, each 23 decimal digits long. It could do 3 additions or subtractions in a second; a multiplication took 6 seconds, a division took 15.3 seconds, and a logarithm or trig function took about 1 minute. As a funny side note, one of the primary programmers of the Mark I, Grace Hopper, discovered the first computer bug: a dead moth blocking one of the reading holes of the machine. Hopper is also credited with coining the word debugging.
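Here is the promised truth-table sketch of Boolean logic, again in Python as a plain illustration: every operation reduces to a true or false output, which maps directly onto the 1s and 0s that relays, and later vacuum tubes and transistors, can physically represent.

```python
# A truth table for the three basic Boolean operations.
# True/False map directly onto binary 1/0, which is what lets relays
# (and later vacuum tubes and transistors) act as logic elements.
for a in (False, True):
    for b in (False, True):
        print(
            f"a={int(a)} b={int(b)} | "
            f"AND={int(a and b)} OR={int(a or b)} NOT a={int(not a)}"
        )
```

Every decision a computer makes, then and now, is built out of combinations of exactly these kinds of operations.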
The vacuum tube era marks the beginning of modern computing: vacuum tubes were the first fully digital technology and, unlike the relays used in previous computers, were less power-hungry, faster, and more reliable. Begun in 1937 and completed in 1942, the first digital computer was built by John Atanasoff and his graduate student Clifford Berry and dubbed the ABC. Unlike previously built computers, like those built by Zuse, the ABC was purely digital: it used vacuum tubes along with binary math and Boolean logic to solve up to 29 equations at a time.

In 1943, the Colossus was built, in collaboration with Alan Turing, to assist in breaking German codes (not to be confused with Turing's Bombe, which actually broke Enigma). This computer was fully digital as well, but unlike the ABC it was fully programmable, making it the first fully programmable digital computer.

In 1946, construction of the Electronic Numerical Integrator and Computer, aka the ENIAC, was completed. Composed of nearly 18,000 vacuum tubes and large enough to fill an entire room, the ENIAC is considered the first successful high-speed electronic digital computer. It was somewhat programmable, but like Aiken's Mark I it was a pain to rewire every time the instruction set had to be changed. The ENIAC essentially took the concepts from Atanasoff's ABC and elaborated on them on a much larger scale.

Meanwhile, while the ENIAC was under construction in 1945, mathematician John von Neumann contributed a new understanding of how computers should be organized and built, further elaborating on Turing's theories and bringing clarity to the ideas of computer memory and addressing. He elaborated on conditional branching and subroutines, something Babbage had envisioned for his Analytical Engine nearly 100 years earlier, and on the idea that the instructions, or program, running on a computer could be encoded in binary and stored and modified in the same way as data (a toy sketch of this stored-program idea follows after this section). Von Neumann assisted in the design of the ENIAC's successor, the Electronic Discrete Variable Automatic Computer, aka the EDVAC, which was completed in 1950 and was the first stored-program computer, able to execute over 1,000 instructions per second. He is also credited with being the father of computer virology with his design of a self-reproducing computer program. "It contains essentially those things which the modern computer has in it, although in somewhat primitive form. This machine has the stored-program concept as its major feature, and that, in fact, is the thing which makes the modern computer revolution possible."

At this point you can see that computing had officially evolved into its own field. From mechanical devices, to electromechanical relays that switched in milliseconds, to digital vacuum tubes that switched in microseconds. From binary as a way to encode information on punch cards, to binary combined with Boolean logic and represented by physical technologies like relays and vacuum tubes, to binary finally being used to store instructions and programs. From the abacus as a way to count, to Pascal's mechanical calculator, the theories of Leibniz, Alan Turing, and John von Neumann, the vision of Babbage and the intellect of Lovelace, George Boole's contribution of Boolean logic, the progression from a programmable calculator to a stored-program, fully digital computer, and countless other inventions, individuals, and groups, each step a further accumulation of knowledge. While the title of inventor of the computer may be given to an individual or group, it was really a joint contribution over 5,000 years, and more so between 1800 and 1950.
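Here is the toy stored-program sketch mentioned above, in Python. The instruction set is invented purely for illustration and does not correspond to the EDVAC or any real machine, but it shows the key idea: instructions and data live in the same memory and are fetched and executed one at a time.

```python
# A toy stored-program machine: instructions and data share one memory,
# so a program could, in principle, read or rewrite its own instructions.
# The instruction set here is invented purely for illustration.
memory = [
    ("LOAD", 6),     # 0: acc = memory[6]
    ("ADD", 7),      # 1: acc = acc + memory[7]
    ("STORE", 8),    # 2: memory[8] = acc
    ("PRINT", 8),    # 3: print memory[8]
    ("HALT", None),  # 4: stop
    None,            # 5: unused
    40,              # 6: data
    2,               # 7: data
    0,               # 8: result ends up here
]

acc = 0  # accumulator register
pc = 0   # program counter
while True:
    op, addr = memory[pc]  # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "PRINT":
        print(memory[addr])  # prints 42
    elif op == "HALT":
        break
```

Because the program itself sits in addressable memory, branching on a condition or jumping to a subroutine is just a matter of changing the program counter, which is exactly the flexibility von Neumann was describing.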
Vacuum tubes were a huge improvement over relays, but they still didn't make economic sense on a large scale. For example, of the ENIAC's 18,000 tubes, roughly 50 would burn out per day, and a round-the-clock team of technicians was needed to replace them. Vacuum tubes were also the reason why computers took up the space of entire rooms, weighed multiple tons, and consumed enough energy to power a small town. In 1947, the first transistor was invented at Bell Labs, and by 1954 the first transistorized digital computer, aka the TRADIC, had been built. It was composed of 800 transistors, took up 0.085 cubic meters compared to the ENIAC's 28, drew only 100 watts of power, and could perform 1 million operations per second.

Also during this era, we begin to see major introductions on both the hardware and software sides of computing. On the hardware side, the first memory device, the random-access magnetic core store, was introduced in 1951 by Jay Forrester; in other words, the beginnings of what is known as RAM today. The first hard drive was introduced by IBM in 1957. It weighed 1 ton and could store 5 megabytes, costing approximately $27,000 per month in today's money.

The software side is where a lot of major innovations and breakthroughs began to come, because computer hardware and architecture were becoming more standardized instead of everyone working on different variations of a computing machine. Assembly was the first programming language to be introduced, in 1949, but it really started taking off in this era of computing. Assembly was a way to communicate with the machine in pseudo-English instead of machine language. The first truly widely used programming language was Fortran, invented by John Backus at IBM in 1954. Assembly is a low-level language, and Fortran is a high-level language. In a low-level language, while you aren't writing instructions in machine code directly, a very deep understanding of the computer's architecture and instructions is still required to write a working program, which means only a limited number of people have the skills, and the process is very error-prone (a short sketch contrasting the two styles appears after this section). Also, in the early to mid-50s, translating code into machine code was still an expensive and time-consuming process. This all changed with Grace Hopper and her development of the first compiler. Hopper, if you remember from earlier, also found the first computer bug. The compiler made programming computers more affordable and nearly instantaneous, instead of the time-consuming process of writing code in assembly and then manually converting it into machine code. As a side note, Hopper also assisted with the invention of another early programming language, COBOL.

This era marks the beginning of the modern computing era, and it is where the exponential trend in computing performance really began. While transistors were a major improvement over vacuum tubes, they still had to be individually soldered together. As computers became more complex, the connections between transistors became more numerous and complicated, increasing the likelihood of faulty wiring. In 1958, this all changed with Jack Kilby of Texas Instruments and his invention of the integrated circuit, a way to pack many transistors onto a single chip instead of wiring them individually. Packing the transistors together also significantly reduced the power consumption and heat output of computers once again, and made them significantly more economically feasible to design and buy.
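Here is the promised sketch contrasting low-level and high-level styles. Both snippets are Python rather than real assembly or Fortran, so treat it as an analogy: the first version states what we want, while the second spells out every load, add, and store the way a low-level programmer would have to.

```python
# High-level style: say *what* you want; the language (or a compiler)
# works out the individual machine steps for you.
prices = [19.99, 5.49, 3.25]
total = sum(prices)
print(total)

# Low-level style (still Python, just mimicking the shape of assembly):
# every load, add, and store is written out by hand, which is where the
# deep architectural knowledge (and most of the bugs) used to come in.
accumulator = 0.0
index = 0
while index < len(prices):
    value = prices[index]   # "load" the next operand from memory
    accumulator += value    # "add" it into the accumulator
    index += 1              # advance to the next address
print(accumulator)          # output the accumulated result
```

A compiler like the one Hopper pioneered is what lets programmers work in the first style while the machine still ultimately runs something shaped like the second.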
Integrated circuits sparked a hardware revolution and, beyond computers, assisted in the development of various other electronic devices thanks to miniaturization, such as the mouse, invented by Douglas Engelbart in 1964. As a side note, he also demonstrated the first graphical user interface. Computer speed, performance, memory, and storage also began to increase iteratively as ICs could pack more transistors into smaller surface areas. This is demonstrated by the invention of the floppy disk by IBM in 1971 and, in the same year, DRAM by Intel, to list a few. Along with hardware, further advances in software were made as well, with an explosion of programming languages and the introduction of some of the most common languages today: BASIC in 1964 and C in 1971.

As you can see throughout this video, computing since the 1900s has evolved at an increasingly fast rate. This led Gordon Moore, one of the founders of Intel, to make one of the greatest predictions in human history in 1965: that computing power would double roughly every two years at low cost, and that computers would eventually be so small that they could be embedded into homes, cars, and what he referred to as personal portable communications equipment, aka mobile phones. We now refer to this as Moore's Law. Here is Moore himself, to further illustrate how fast computing was evolving and what he based his predictions on: "One of my colleagues called this Moore's Law. Rather than just being something that chronicles the progress of the industry, it kind of became something that drove the progress of the industry. A tremendous amount of engineering and commitment has been required to make that happen, but much to my surprise, the industry has been able to keep up with the projection."

At this point the video has come to a conclusion. I'd like to thank you for taking the time to watch it. If you enjoyed it, please leave a thumbs up, and if you want me to elaborate on any of the topics discussed or have any topic suggestions, please leave them in the comments below. Consider subscribing to my channel for more content, follow my Medium publication for accompanying blogs, and like my Facebook page for more bite-sized chunks of content. This has been Encore, you've been watching Singularity Prosperity, and I'll see you again soon.
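To close, here is a short back-of-the-envelope Python calculation of what doubling every two years actually implies. The starting count of 64 is just a round illustrative number, not a figure from the video.

```python
# Doubling every two years means n years give n/2 doublings,
# i.e. growth by a factor of 2 ** (n / 2).
start_year, start_count = 1965, 64  # illustrative starting point
for year in range(1965, 2026, 10):
    doublings = (year - start_year) / 2
    print(year, int(start_count * 2 ** doublings))
```

Run it and the count climbs from 64 to tens of billions within sixty years, which is roughly the kind of exponential growth the rest of this series builds on.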