Let's start with a little history and math. Now, don't worry, I'm not going to make you do any of the math; I just want you to be familiar with the terms and the concepts, and even if you don't completely follow along now, I suspect that when we get to more complicated things later they will make some more sense.

Up until the 1950s, the word "computer" meant somebody who did calculations. There are all kinds of claims as to what the first computer was, going back to the Chinese abacus or strange machines from ancient Greece that look like something out of Tomb Raider. Many historians start with Basile Bouchon and Joseph Marie Jacquard. No, not Picard, Jacquard. In the early 1700s the French fashion industry was taking off, and the rich desperately wanted bright fabrics with exotic patterns on them, like those imported from China and India. The trouble was, those exotic fabrics were extremely labor-intensive to make and had to be imported at great risk and cost. So Bouchon and Jacquard figured out an ingenious way to use punched paper cards to control their looms and automate the pattern making. They worked similar to the way a player piano scroll does, by allowing pins to drop through holes at the right moments to make a melody, or in the case of the looms, to drop in colored thread to make the pattern. This made a highly labor-intensive job a lot faster, making Jacquard rich while lowering the price of exotic fabrics, and consequently launching the first automation-versus-labor disputes, which continue to this day. French workers revolted against their machines, and many threw their wooden clogs into the cogs; the French word for those wooden worker shoes is "sabot," leading to our word "sabotage." The English workers leading similar protests were known as Luddites, the word we still use today to refer to somebody who fears or resists technology.

Over in England, Charles Babbage and Ada Lovelace worked on their Difference Engine. Babbage was a Cambridge mathematician in the early 19th century who was interested in efficiency and the division of labour, and he realized that a lot of calculations were very repetitive, as any fifth grader with 40 long-division problems due the next day can tell you. In 1837 Babbage described the Analytical Engine, which used some heavy-duty hardware to calculate equations. It used punch cards programmed by Lovelace, making her, in a sense, the first computer programmer. She described the machine by saying, "We may say most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves."

The first computing machine put to actual use was invented by an American engineer and mathematician, Herman Hollerith. America had seen a massive population explosion in the late 1800s, so the 1890 census was going to be a huge undertaking. Hollerith electrified the punch-card technology, using pins that would pass through the holes in the cards to complete a circuit, or be blocked if there was no hole. With this efficient calculating and tabulating tool, the census results were completed in a few weeks rather than the years they might have taken to count by hand. Hollerith founded a company around his machine and called it the Tabulating Machine Company. His company is still in business today: after a merger it became the Computing-Tabulating-Recording Company, and CTR changed its name in 1924 to International Business Machines, IBM. And while IBM doesn't dominate the PC market like it did in the 1990s, it's still used widely in high-tech industries like energy exploration, engineering, and aerospace, and in places like, well, space stations.
It was the Second World War that gave a boost to computing, as it did to so many other industries. In the early days of the war, German U-boats and air power were devastating Allied ships and convoys. The Germans were coordinating their attacks with encoded communications, using a machine the Allies called Enigma, which used a series of gears and plugs that could be assembled in a different order each day, resulting in millions of possible code combinations. British and Polish mathematicians were enlisted to beat the code and were housed at the now-famous Bletchley Park. Most famous among them is Alan Turing. No, that's Benedict Cumberbatch; that's Alan Turing. Turing was the British mathematician and cryptographer most credited with breaking the Nazi naval codes, allowing convoys to avoid U-boats and the Luftwaffe. Turing and the other great minds at Bletchley saved thousands of lives and likely turned the tide of the war. Their achievements went uncelebrated, as Bletchley remained a state secret for decades, and Turing himself was persecuted for his homosexuality and likely committed suicide in 1954. But before that, Turing worked in communications, even created some of the first digital music, and established the concept of artificial intelligence. Most importantly, he invented the idea of a general-purpose processor that could be set to work on any kind of problem, not just the specific task the machine was built for. So today our PCs can run analytical programs, but we can also play games, surf the net, edit a video, or create cool stuff, all with the same hardware.

Back in America, mathematicians and engineers were busy too. People like Claude Shannon, Howard Aiken, John von Neumann, and Grace Hopper were making breakthroughs that would lead to massive wartime computers like the Army's ENIAC. Since the Middle Ages, people had been trying to work out the physics of ballistics to better blow stuff up from a distance. The trouble is, the math behind that can be very time-consuming and can take a skilled mathematician hours to work out just one solution. The Army wanted to automate that task to provide artillery tables to gunners in the field, so John Mauchly and J. Presper Eckert set to work at the University of Pennsylvania designing ENIAC, which was largely programmed and operated by women. In 1951 Eckert and Mauchly created the civilian counterpart, UNIVAC, the first commercially available computer. Their company, the Eckert-Mauchly Computer Corporation (EMCC), went through a lot of changes in ownership, but it's still around today as Unisys, one of the companies that makes the internet possible.

These computers were massive and required huge amounts of electricity and constant maintenance, largely because they used glass vacuum tubes. If you've ever seen the inside of an old radio, or if you're into steampunk (which is kind of silly, because vacuum tubes are electronic components and don't have anything to do with steam, but anyway), well, you've probably seen a vacuum tube. UNIVACs were so expensive and difficult to run that they could only be used by the military, large corporations, or universities. In 1949 Popular Mechanics magazine made the famous prediction that computers in the future might weigh no more than one and a half tons. But two years earlier, these guys working at Bell Laboratories invented this, which you're probably more used to seeing looking like this, which allowed things that started out looking like this to end up like this. William Shockley and his colleagues won the Nobel Prize, because the solid-state transistor, perhaps more than anything else, has led to the explosion of affordable and reliable electronics that we have today.
So far we've only been looking at machines that can crunch numbers. What about displaying information? "Mr. Scott's engineers are working on them now. Report to him when your indicators are registering properly." Even as late as the 1960s, computers were imagined as blinking lights and spinning tape spools. So when do we start getting graphics? Well, let's go back to the 19th century, and back to the French. In 1893 this guy, André Blondel, a French physicist, invented a way to graphically record oscillating electrical signals. His invention looked somewhat like a seismograph, with a pen moving across a rotating drum, drawing a sine wave. Four years later, German physicist Karl Ferdinand Braun invented the cathode ray tube; CRT screens were used in televisions and even computer monitors until LED and LCD screens began to replace them in the 1990s. The British company A.C. Cossor combined Blondel's invention with Braun's CRT tube to come up with the first oscilloscope in 1932, and old-school analog oscilloscopes are still around today, using CRT tubes that fire electrons against a phosphor-coated screen. Just a few years later, Philo T. Farnsworth, one of the last of the great solo inventors, demonstrated television at the 1939 World's Fair. The first broadcast had come three years earlier, from a transmitter on top of the Empire State Building, and was a fairly dull interview with the president of RCA. Farnsworth's device used the cathode ray tube to project a small black-and-white image. TV lagged a bit because of the Second World War, but after the war Americans had money and leisure time again, and television took off quickly. By 1960 there were 52 million sets, with about 90% of homes having at least one, and color followed shortly after in the early 1960s.

But computers were still using massive punch-card systems and ticker-tape-style printouts; the only way to get a visual display was through a grid of light bulbs, like an electric abacus. Meanwhile, scientists at MIT working on their massive Lincoln TX-2 computers began connecting oscilloscopes to their computers to get a simple VDU, or visual display unit. While working on his dissertation in 1963, Ivan Sutherland developed Sketchpad, the first computer program with a graphical human interface. In a very real sense it was the first CAD software, and the first time humans could interact with a computer without using lines of code. In other words, the computer supplied the compass and the straight edge; for a straight line you go backwards, you erase it. The astonishing thing is that the way Sutherland drew objects is almost exactly the way we do it today, only we're using a mouse or a drawing tablet to click and drag instead of touching a light pen to the scope screen. Incidentally, Sutherland also experimented with VR headsets decades before anybody else would.

But computers were still massive and astoundingly expensive, and only the largest industries could use them. Mathematicians, though, were busy developing the underlying algorithms we still use today. For example, French mathematician Pierre Bézier built on math established in the 1950s by Paul de Casteljau to come up with a way to draw complex curves mathematically. You'll be using Bézier curves very soon; you may already have, if you've worked with software that uses control points to change the shape of a line.
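Just to make "control points" concrete, here's a minimal sketch in Python of de Casteljau's recipe for evaluating a cubic Bézier curve. The four control points are made up for illustration; this isn't any particular program's code, just the general idea behind those draggable handles.

```python
# A minimal sketch of how a cubic Bezier curve is evaluated from its
# control points using de Casteljau's repeated-interpolation scheme.

def lerp(a, b, t):
    """Linear interpolation between 2D points a and b."""
    return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

def de_casteljau(points, t):
    """Evaluate the curve at parameter t (0..1) by repeatedly
    interpolating between neighboring control points."""
    while len(points) > 1:
        points = [lerp(p, q, t) for p, q in zip(points, points[1:])]
    return points[0]

# The curve starts at the first point, ends at the last, and the two
# middle points pull its shape around, like handles in drawing software.
controls = [(0, 0), (1, 2), (3, 2), (4, 0)]
curve = [de_casteljau(controls, i / 20) for i in range(21)]
print(curve[0], curve[10], curve[20])  # start, middle, end of the curve
```

The same repeated-interpolation trick works for curves with any number of control points, which is why it shows up, in one form or another, inside most vector drawing tools.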
Both Bézier and de Casteljau worked for French automakers, Renault and Citroën respectively. Car bodies involve all kinds of complicated curves, and manufacturers used to sculpt full-scale clay models of cars to design their bodies. In fact they still do today, but it's very likely that comes after a lot of development with CAD software, and they're probably employing computer-driven milling tools to help them as well.

"Ladies and gentlemen, this is your captain again. I've just been talking with control at London Airport." When Bézier was developing his curves, most people hadn't seen a computer, let alone imagined they would ever work with one. The one place they might have encountered one was at an airline ticket counter. In 1953 two Smiths happened to sit next to each other on a plane: C.R. Smith, the president of American Airlines, and R. Blair Smith, a senior salesman at IBM. They got to talking about the complexity of airline ticketing. Even in the 1950s, thousands of passengers were traveling on hundreds of flights each day around the world, and keeping track of them all was a job for a computer, so IBM introduced SABRE, the Semi-Automated Business Research Environment. Originally SABRE consoles were attached to electric typewriters and had printout displays, but they were soon replaced with early CRT monitors, and SABRE is still in use today.

But it was students playing around with oscilloscope screens and their universities' computers who would start the ball rolling to get computers into nearly every home in America. In 1958 William Higinbotham realized that the Donner Model 30 computer he was using at Brookhaven National Laboratory could use its simple oscilloscope display to create a tennis simulator he called Tennis for Two. The program was shown at several technology exhibitions, but the computer that ran it was proprietary and inaccessible to most; still, Tennis for Two is considered the first interactive computer game. It was students and faculty at MIT in the early 1960s who came up with Spacewar! while programming on their lab computers after hours. Steve Russell, Martin Graetz, and Wayne Wiitanen, among others, created a game that could be played by two players on the radar-style screen display of their PDP-1 computer. Spacewar! involved two ships, the wedge and the needle, moving across the screen and firing literal photon torpedoes at each other, and while Spacewar! was passed around computer science departments at other universities, it remained out of the public eye.

In 1972 Nolan Bushnell and Ted Dabney founded a company in Sunnyvale, California to try to market a coin-operated version of Spacewar! called Computer Space, sold in a sparkly fiberglass housing. Although it became the first computer arcade game, it wasn't much of a success, so Bushnell and Dabney set out to make a driving game and hired a new computer scientist, Allan Alcorn, to develop it. He started by making a ping-pong game on a simple black-and-white TV monitor to work on his skills, but the team decided the test game was so fun to play they should just release that, and in 1972 Pong introduced the world to computers. Oh, and the name of that company? Atari. Pong was a huge success and began the video arcade craze of the 1970s and '80s. In addition to Atari, a host of companies began to produce computer arcade games, including Bally, Capcom, Sega, and Nintendo, but beyond Pong and Asteroids, most of those early games were short-lived novelties until the Japanese company Namco released a simple little maze game called Pac-Man. The game was originally called Puck-Man, but American arcade owners lobbied for the name change.
They probably rightly feared that the game's cabinets would be vandalized in the US. By the mid-'80s there was hardly a kid in America who hadn't played Pac-Man.

Arcade popularity had reached such a peak that Atari took advantage of the new, cheap silicon microprocessors to release the Atari 2600 home gaming system in 1977. While it wasn't the first home game console, it sold by the millions. Magnavox, Intellivision, ColecoVision, the NES, and others released variations, many with better graphics and processors, but the affordability and the wide range of game cartridges for the Atari kept it on top until Christmas of 1982. "Only from Atari, made especially for systems from Atari: the video game that lets you help E.T. get home." In their rush to release a game tied to Spielberg's massive box-office hit E.T. in time for Christmas, Atari pushed out a game along with a hugely expensive marketing campaign. Unfortunately, the game was a disaster. It cost the company gigantic losses, and by 1984 it was bankrupt, but not before hundreds of thousands of people had a computer in their living rooms.

By the '70s, actual home computers were available to the general public, but they were still very expensive. This 1978 ad for an IBM 5110 features its price of only eighteen thousand dollars; that's roughly seventy thousand dollars today. RadioShack broke into the home computer market and launched its TRS-80 in 1977, the same year as the Atari 2600. The TRS-80 had a full keyboard and stored programs on audio cassette tapes, as did the popular Commodore 64s that appeared in 1982. They were still expensive, but the "Trash-80s," as they came to be unflatteringly known, were in the sub-thousand-dollar range, still a big price tag for Americans dealing with a recession and gas shortages. And the little computers were difficult to operate. Most ran on DOS, which is still deep inside the Windows operating system, but it meant the operator had to type specific commands at a prompt to get them to do anything, and most Americans agreed with Ken Olsen, the founder of Digital Equipment Corporation, when he said in 1977, "There is no reason anyone would want a computer in their home." That was before this happened: the first Super Bowl ad that made more news than the game itself. It was 1984, the Cold War, the LA Olympics, and the year of Orwell's famous novel. "On January 24th, Apple Computer will introduce Macintosh, and you'll see why 1984 won't be like 1984."

In 1976 a guy named Ron Wayne, another called Steve Wozniak, and another one (oh yes, Steve Jobs) founded a company in a Cupertino, California garage. The first computer they produced wasn't very impressive, and Wayne left the company shortly after, but Jobs and Wozniak went on to make the Apple II and released it in that magical year of 1977. While it sold for the fairly steep price of $1,300, more than $5,000 in today's dollars, it was a huge success. Apple hit on the clever idea of making Apple IIs available to schools, and for many kids in the '80s, myself included, their first interaction with a real computer was on a school lab's Apple II. Right after releasing the Apple II, Jobs and Wozniak began working on their next computer, which they would name the Lisa. But inside, the Lisa was a bit of a mess. It was way overpriced, up in the IBM range at just under $10,000, and some cost-cutting decisions resulted in slow performance. But it did sport one other key little feature: a mouse. The Lisa was hardly the first computer with a mouse.
Douglas Engelbart is credited with inventing the mouse at the Stanford Research Institute in 1963, and the Xerox Star 8010 came with a mouse in 1981, but the Lisa and Apple's next project put the mouse in the hands of American computer users. "Computer? Hello, computer." "Just use the keyboard." "A keyboard. How quaint." That dramatic 1984 Super Bowl ad was promoting the Apple Macintosh, which debuted with the diabolically low price of 666 dollars. The Mac took advantage of the point-and-click design of the mouse and an appealing icon-based graphical user interface instead of the off-putting DOS prompt line.

The Macintosh shipped with several pre-installed programs, two of which are the great-grandparents of the software we use today. One was called MacPaint, and it used the mouse to let the user essentially draw on the screen. No code or commands required: just click on the tool you want, then click and drag the mouse to make a shape. MacPaint used a pixel-based system; basically you were just turning little bricks of light on and off on the screen, like a high-resolution version of a Lite-Brite. MacDraw, on the other hand (there was also MacWrite and a few others), used math-based, or vector-based, graphics, not unlike Sutherland's old Sketchpad. You were still clicking and dragging, but instead of turning pixels on (or in this case off) by dragging the mouse tip over them, you were setting a point, moving the cursor, and setting a second point to draw a line or a geometric shape, exactly as we still do today. The advantage of vector drawing is that the file can be smaller, since the equation describing a circle, for example, is a lot smaller than a pixel map giving the coordinates of every pixel that needs to be on to show the outline of that circle. And a vector circle can be scaled to any size and still be a circle; zoom in too close or enlarge a pixel map too much and you get the LEGO or Minecraft look of a pixelated image (there's a small code sketch of this pixel-versus-vector difference coming up in a moment). The great-grandchildren of those two applications are still in wide use today. Photoshop and other digital photography or painting software use pixel maps, albeit maps that now have millions of colors and extremely high resolutions, while design software like Illustrator, AutoCAD, or Vectorworks (as the name suggests) use vector, or mathematically generated, images that allow us to make very precise and detailed drawings and models regardless of scale. So we can model an enormous building on a small screen and print out highly accurate drawings in any scale we choose, or we can use vector line art to drive a laser cutter or a CNC router to follow a precise pattern, and we can use modeling tools to create virtual 3D objects that can be turned into actual 3D-printed objects. And we will do all of that in this class. Baby steps, I promise.

Apple went on to make a wide variety of Macintoshes, introducing color displays with the 1987 Mac II, early laptops with the PowerBook in '91, and computers designed to easily surf that newfangled Internet with the iMac in '98. But despite all that, Macs were a niche market compared to the inexpensive PCs of the day, found largely in the arts: desktop publishing, recording, and advertising, while Windows-based systems dominated business and engineering. After a disastrous failure with the Newton, a clumsy and expensive PDA, in 1993, many people imagined Apple would soon follow Atari. But in 2001 Apple released a little box that let you keep your music library in your pocket. The iPod saved Apple and allowed for the 2007 launch of the iPhone. "Let's go ahead and turn it on. This is the size of it. It fits beautifully in the palm of your hand." By 2010 Macs were trendy and cool.
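Here's that promised sketch: a toy comparison, in Python, of a pixel-map circle versus a vector description of a circle. The data structures are made up for illustration, but the contrast is the point: scaling the vector version is just changing one number, while the raster version only ever gets blockier.

```python
import math

# Raster version: a grid of on/off "pixels" approximating a circle outline.
# Zooming in just makes the individual bricks of light bigger and blockier.
def raster_circle(radius, size):
    grid = [[' '] * size for _ in range(size)]
    cx = cy = size // 2
    for deg in range(360):
        x = cx + int(round(radius * math.cos(math.radians(deg))))
        y = cy + int(round(radius * math.sin(math.radians(deg))))
        grid[y][x] = '#'
    return "\n".join("".join(row) for row in grid)

# Vector version: just the recipe for the circle (a hypothetical record,
# not any real file format). It stays a perfect circle at any scale.
vector_circle = {"shape": "circle", "center": (0, 0), "radius": 1.0}

def scaled(circle, factor):
    return {**circle, "radius": circle["radius"] * factor}

print(raster_circle(8, 20))       # a chunky 20x20 approximation
print(scaled(vector_circle, 100)) # the same shape, 100x bigger, still exact
```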
In 2019 Apple surpassed Microsoft to become the world's most valuable company, worth a trillion dollars. Today it's less and less important which platform you use, as nearly all software has versions for both Apple's macOS and Windows, and the differences are largely cosmetic. Game consoles carried on, with Segas and Nintendos, the Genesis and the PlayStation, all filling the void left by Atari. Monitors switched from hot, heavy, expensive CRT screens to liquid crystal displays, allowing for razor-thin monitors and making laptops, once the tool of business executives sitting in first class, as powerful as home computers and most people's computer of choice.

CAD software became available outside of professional drafting studios. In 1982 Autodesk launched the first version of AutoCAD, a drafting program tracing its roots right back to Sketchpad. That same year, John Warnock left Xerox PARC and founded Adobe to create font and desktop publishing software. Adobe bought Photoshop, developed by John and Thomas Knoll, in 1988, and Aldus, the maker of PageMaker, in 1994, and since then Adobe and Autodesk have been buying out other companies and adding their software to their lines. Not unlike Nemetschek: in 1963 German entrepreneur Georg Nemetschek founded an engineering company, and in that magic year of 1977 began marketing his in-house CAD system, Statik, for use in civil engineering. In '94 Nemetschek acquired Diehl Graphsoft, a small company that had been making a Mac-based CAD program called MiniCAD, which was basically a souped-up version of the old MacDraw. The European Graphisoft makes ArchiCAD, still in use today, and the American company rebranded in 2000 as Vectorworks. For many years the Vectorworks file extension was .mcd, for MiniCAD document, before being changed to .vwx.

But we've left graphics behind. To pick up where we left off, we need to go back to the 1970s. Those early graphics used vector scanning, where the beam inside the CRT actually traced the positions of the shapes; this is the way the old radar-style screens, like those used with Sketchpad and Spacewar!, worked. But television screens used raster displays, where the electron beam sweeps in horizontal rows all the way across the screen and only turns on where the shape is, and this allowed for better and more complicated images.

Until recently, memory had always been the problem. The Mac Plus back in the '80s had no hard drive and shipped with one megabyte of RAM. Not one gigabyte: one megabyte. The average computer today has about 4 gigabytes of RAM, which means it has about four thousand times more RAM than the basic Mac Plus. And with no hard drive, everything, including the operating system, had to fit on a three-and-a-half-inch floppy disk, which could typically store about one and a half megabytes. For a couple hundred dollars you could add a 20-megabyte hard drive about the size of a phone book, and that's what's in that box under the computer. Today it's not unusual to have a terabyte hard drive in a mid- to high-end computer; that's more than a thousand gigabytes. So in order to have that much memory on a Macintosh in 1985, you'd need 50,000 of those phone-book-sized hard drives to get as much storage as what's in this guy's hand. Actually, no, you'd need 100,000, because that's a photo of a 2-terabyte portable drive. But check this out: they're expensive, but you can now buy one-terabyte microSD cards smaller than the tip of your finger. So your laptop, heck, your phone, has more memory and computing power than this million-dollar UNIVAC from 1960.
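If you want to check those numbers yourself, here's the back-of-the-envelope arithmetic, assuming decimal units (1 GB = 1000 MB, the way drive makers count):

```python
# Quick sanity check of the storage and RAM comparisons above.
mac_plus_ram_mb = 1
typical_ram_gb = 4
print(typical_ram_gb * 1000 // mac_plus_ram_mb)   # ~4,000x more RAM than a Mac Plus

old_drive_mb = 20          # the phone-book-sized hard drive
terabyte_mb = 1_000_000    # 1 TB expressed in megabytes
print(terabyte_mb // old_drive_mb)                # 50,000 old drives per terabyte
print(2 * terabyte_mb // old_drive_mb)            # 100,000 for a 2 TB portable drive
```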
With very limited memory, the first computer systems had to rely on something called a character generator, which saved the images of letters, numbers, and basic shapes for quick recall. That's why those early green or orange displays had one basic font: it was kind of baked into the hardware. By the late '80s computers started to have dedicated memory, or VRAM, just to run graphics, and now many computers have a secondary processor called a GPU, a graphics processing unit, separate from the CPU (the main processor of the computer), just to handle graphics. If you've played a triple-A video game recently you probably recognize the Nvidia logo; they're one of a few companies specializing in this type of processor.

As computer graphics improved, Hollywood soon became interested. In the 1970s Lillian Schwartz was an artist-in-residence at Bell Labs and used their computers to create art animations like this example from 1976 called Olympiad. There had been simple computer-generated images in the title sequence of Hitchcock's 1958 Vertigo and the pixelated robot's-eye view in 1973's Westworld, but up until the 1980s most effects were created using tried-and-true techniques like matte painting and stop-motion animation. But let's go back to that magic year again, 1977, and the premiere of Star Wars. "A small one-man fighter should be able to penetrate the outer defense." Larry Cuba's Sketchpad-style wireframe graphics featured in the Death Star attack briefing scene and in several other places in the film. The massive success of Star Wars allowed Lucas to pour resources into his fledgling effects company, Industrial Light and Magic, and he went looking for new talent.

There were two schools in the early '70s exploring the potential of computer graphics: the New York Institute of Technology and the University of Utah. Ivan Sutherland, the Sketchpad guy, ended up on the faculty at Utah, where one of his students, Ed Catmull, meticulously measured and modeled his own hand to create this animation. During this process he created algorithms for anti-aliasing, subdivision surfaces, texture mapping, smooth shading, and many other foundational concepts that are still part of digital art today. He was hired to run the computer graphics lab at NYIT, where he met Alvy Ray Smith, a pioneer in early digital art. It wasn't long before the pair attracted the attention of a strange man from California. Lucas offered jobs to Catmull and Smith, and they joined the team at ILM. To their surprise, Lucas didn't want them to create effects; he wanted them to build suites of digital editors and CGI generators. In 1982 the team created the Genesis effect sequence for Star Trek II: The Wrath of Khan.

Meanwhile, a guy named John Lasseter had attended the animation school at CalArts and landed his dream job with Disney, where he tried to convince the executives to invest in computer animation. After working on several hand-animated movies, he was given a chance to show his stuff with a clip for a proposed version of Where the Wild Things Are using a traditional-and-CG hybrid process. The Disney execs were unimpressed, and Lasseter was shown the door. But he found his way to ILM, where together with Catmull and Smith he created an animated short called The Adventures of André and Wally B., and went on to create the first fully CGI character in a feature film: the stained-glass knight from 1985's Young Sherlock Holmes. In 1986 the team spun off to create their own company, named Pixar. Here it gets complicated, with Lucas selling off Pixar to Steve Jobs, who had been fired by Apple,
and then Disney getting involved, and eventually the Pixar team made the first fully computer-animated feature film, Toy Story, in 1995, and of course it was a massive success. "We're down there, just down there, and helpless. We've no time to lose."

There had been some notable CGI appearances in films in the '80s and '90s, like 1989's The Abyss with its water tentacle, and all the morphing effects in Terminator 2 in '91, but with these few exceptions, most computer-generated visual effects were brief, albeit impressive, moments until 1993 and the film that would change everything. The dinosaurs in Jurassic Park were planned to be traditional stop-motion animations, and Spielberg had hired Phil Tippett, who had worked extensively on the Star Wars films, to create them. But ILM animators Mark Dippé and Steve Williams were convinced computers were ready for a major role in the film, and worked for hours on a T. rex demo that they just happened to leave playing on a monitor when they knew that Kathleen Kennedy, one of the film's producers, would be visiting the studio. The demo was shown to Spielberg, and he made the decision to switch to digital. The final movie is a brilliant combination of digital effects, puppetry, and animatronics; computer-generated dinosaurs are only on screen for about 15 minutes of the film, but more than a quarter of a century later Jurassic Park holds up, and it forever changed cinema.

Today CGI is common even in television shows and commercials, and small studios and even hobbyists can create effects on home computers that were unimaginable in the 20th century. Most fantasy, sci-fi, or historical movies build only the scenery that is directly interacted with by the actors, and the rest of the world is computer-generated. Modeling power now allows for lifelike CGI characters. Actor and puppeteer Andy Serkis is now famous for his motion-capture performances, like his landmark Gollum in The Lord of the Rings. Motion capture now appears even in television shows, and major characters are frequently entirely CGI. The 2004 animated movie The Polar Express was criticized at the time for falling into the "uncanny valley," the point at which human-like representations are close but not quite realistic enough, and are therefore unsettling. We identify and sympathize with inanimate objects that take on human-like features, but there's a point where our brains just register that something is wrong. That's why, despite the incredible computing power at their fingertips, Disney and Pixar characters retain the exaggerated features of their hand-drawn predecessors. But we may be getting eerily close to the other side of that uncanny valley. "Hello, I'm Siren, and I'm a digital human. I was created by an international team of artists and engineers who wanted to challenge our ideas of what a synthetic human could be." "Life's but a walking shadow, a poor player that struts and frets his hour upon the stage and then is heard no more. It is a tale told by an idiot, full of sound and fury, signifying nothing." Both the alien and Andy Serkis's face are CGI in this example.

The trouble with 3D graphics is that it's all an illusion. Unless we export an object out to a tool like a 3D printer, the 3D shape only exists inside the computer's algorithms. The screen or paper we view a 3D object on is flat, so computer scientists needed a way to suggest 3D shapes on a flat plane. If we make a cube out of sticks and then shine a light behind it to cast its shadow onto a flat surface, we can kind of get an idea of how 3D shapes are projected onto a flat surface.
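Roughly speaking, that projection step looks something like the sketch below: take the corners of a cube in 3D and divide their x and y coordinates by how far away they are. This is just a minimal pinhole-camera illustration with made-up numbers, not the code any particular renderer uses.

```python
# A minimal sketch of 3D projection: project the eight corners of a cube
# onto a flat screen plane by dividing x and y by the distance from the
# viewer. FOCAL is a made-up distance from the eye to the screen plane.

FOCAL = 2.0

def project(point):
    x, y, z = point
    # Points farther away (bigger z) land closer to the center: perspective.
    return (FOCAL * x / z, FOCAL * y / z)

# A cube floating in front of the viewer, centered on the z axis.
cube = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (4, 6)]

for corner in cube:
    print(corner, "->", project(corner))
```

Run it and you'll see the far face (z = 6) comes out smaller than the near face (z = 4), which is all perspective really is.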
The computer uses a process called 3D projection, and it does that for all kinds of things: for lights, for deciding whether something is occluded (behind or in front of something else), and, when we start talking about textures, for how they wrap around a surface. 3D projection is an important part of all of this. So basically this little white plane right here is sort of like our screen, and if you imagine we were to orbit around the scene here, you can see kind of what's going on. What I have here are three different cubes, and these little lines here are rays that the computer is tracing to show how they would appear on that flat plane. So we can see that this green cube is in front of this blue cube, and so that blue cube is occluded: it's drawn behind this object. We don't need to get into exactly what's going on when the computer does this; it's basically a lot of keeping track of where things are in three-dimensional space to decide which lines to draw and which lines to hide.

Believe it or not, the engineers working with Sketchpad way back in 1963 had already begun experimenting with three-dimensional models, and we've already seen Ed Catmull's 3D hand. One of his colleagues at the University of Utah was Martin Newell. While Ed was working on his hand, Martin was exploring hard-surface modeling and was looking for a simple shape that would at the same time provide challenges for rendering. His wife Sandra suggested a teapot. Newell first graphed out the teapot on paper and began using Bézier curves to make a 3D model; there were no modeling programs back then, so Newell had to manually type in all the data to plot out the teapot. More classmates got involved: Bui Tuong Phong and Jim Blinn developed methods to shade the polygons on models and add things like soft shading, reflections, and cast shadows. With the addition of lighting and texture maps, the Utah teapot became the first realistically rendered 3D object, and Phong's and Blinn's names are still used today when referring to their shading algorithms. Newell's teapot has become a famous benchmark object for testing computer graphics: it shows up in icons in software like Vectorworks, has made Easter-egg appearances in The Simpsons and in screen savers, and even made it into Toy Story.

Let's use our little ecosphere here to define what some of these terms mean. Right now we're in the earliest form of rendering; this is called wireframe, for obvious reasons: we're just seeing the edges of the object displayed. We can turn off some of this information by making this a hidden-line rendering, and you can see why it's called that: all of the parts of the object that are on the other side of the sphere are now hidden from our view, so we're only looking at the wireframe mesh that's facing the camera. And we can improve that even further by switching over to this view, which is a shaded-polygon view. We're still looking at the faces, but now they are lit differently depending on where the source of the light is. You can see the light here in the scene is up in the upper right-hand corner, so we've got a dark side of the sphere and a bright side of the sphere. The computer knows how to light and shade each one of these polygons because of these little cyan hairs that have just appeared on the faces. These are called normals: each face has a little line extending off of it at 90 degrees, perfectly perpendicular to the surface, and the computer compares the angle the light is coming from with the angle of these normals.
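That comparison between the light direction and the normal boils down to a dot product. Here's a tiny sketch of the idea, basic Lambert-style shading with made-up vectors; real renderers layer a lot more on top of this.

```python
import math

# A minimal sketch of the normals idea: the brightness of a face comes
# from the angle between its normal and the direction toward the light.

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def brightness(normal, to_light):
    n, l = normalize(normal), normalize(to_light)
    dot = sum(a * b for a, b in zip(n, l))
    return max(0.0, dot)   # faces turned away from the light get no light

light_dir = (1, 1, 0.5)                  # light up and to the right, as in the demo
print(brightness((0, 0, 1), light_dir))  # face toward the camera: partly lit
print(brightness((1, 0, 0), light_dir))  # face angled toward the light: brighter
print(brightness((-1, 0, 0), light_dir)) # face on the dark side: 0.0
```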
That comparison decides how much light or shade to add to each surface. We can then increase the number of faces on our object to get it to be more round, and the more times we increase the faces, the smoother it gets. We can even take a fairly simple shape and tell it to shade smooth, and that smooth shading, even though it does look a little awkward around the outside edges here, has softened up all the faces to create an effect that looks much more spherical than it actually is. And finally, we can add textures and materials to our scenes: we can add photographs of actual real-world materials, or create them inside the computer to make things like the hexagonal tiles here on the floor. This metallic texture is very easily made and is actually reflecting surfaces inside the scene and responding to the lights in the scene. So with all these tools it's fairly easy today to create photorealistic images.

The earliest models were wireframes, where only the edges of objects are shown. We saw this in Larry Cuba's Star Wars trench, but it also appeared in the earliest versions of 3D video games, like 1980's Battlezone. But in Battlezone the shapes are transparent, and you can see right through solid objects. The next hurdle was hidden-line removal: calculating which parts of a shape would be occluded, or hidden from the viewer. In 1984 two Cambridge University students, Ian Bell and David Braben, produced Elite, considered the first indie game. You played as a spaceship captain setting out into a vast galaxy of planets to explore; in Elite you had flight-simulator-like 3D movement, and the objects you encounter are simple geometric shapes, but unlike earlier attempts, the hidden lines were removed as objects rotated in 3D space. The pair even developed an ingenious 3D radar screen so you could see if other ships were above or below you as well. Microsoft began to experiment with filled and shaded polygons in programs like Flight Simulator; these graphics may look absurdly simple to us today, but at the time they were another breakthrough. Origin Systems had been releasing Dungeons & Dragons-like Ultima games since the early '80s, but the early games were hardly more than text-based adventures with simple sprites suggesting the environment or the monsters. That changed in 1992 with Ultima Underworld: The Stygian Abyss, considered the first first-person 3D game with the player moving around freely inside a 3D maze, and from here on the paint would start being worn off the WASD keys on keyboards all around the world.

The following year saw another landmark with 1993's Myst. The graphics in Myst could take a huge leap thanks to pre-rendered scenes; it was also the first breakthrough game available for the Mac. The player isn't free to move in any direction; instead they move step by step through a series of still images or small QuickTime videos. The graphics could look so good because those pre-rendered scenes could be stored on a new format. In 1982 the Japanese company Denon developed a way to store computer data on compact discs; Sony and Philips followed, and by the early '90s CD-ROM discs became the first portable mass-storage device of their kind, holding roughly the equivalent of 450 floppy disks. Sales of CD-ROM drives skyrocketed in the mid-'90s, and many consumers bought their first drive just so they could play Myst. Developments came quickly in the early '90s with Wolfenstein, Doom, Descent, and Quake. These games took advantage of ray casting, where the computer only renders what the player can see.
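The ray-casting idea is easier to see in a toy example. Here's a minimal, hypothetical 2D ray caster in Python with a made-up map; it shows the general technique (march a ray per screen column until it hits a wall, then size the wall slice by distance), not the actual Wolfenstein or Doom code.

```python
import math

# A made-up top-down map: '#' is a wall, ' ' is open floor.
MAP = ["#########",
       "#       #",
       "#  ##   #",
       "#       #",
       "#########"]

def cast_ray(px, py, angle, step=0.02, max_dist=20.0):
    """March from (px, py) along `angle` until a wall cell is hit,
    and return the distance traveled."""
    dist = 0.0
    while dist < max_dist:
        x = px + math.cos(angle) * dist
        y = py + math.sin(angle) * dist
        if MAP[int(y)][int(x)] == '#':
            return dist
        dist += step
    return max_dist

# One ray per screen column across a 60-degree field of view.
player_x, player_y, facing = 1.5, 1.5, 0.0
columns = 40
fov = math.radians(60)
for col in range(columns):
    angle = facing - fov / 2 + fov * col / (columns - 1)
    d = cast_ray(player_x, player_y, angle)
    wall_height = int(10 / max(d, 0.1))   # nearer walls draw as taller slices
    print("#" * min(wall_height, 10))     # crude one-column "render"
```

Because only one ray per column ever gets traced, the computer never wastes time on anything the player can't see, which is what made those games possible on early-'90s hardware.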
Level designers used simple modular systems, like the original Tomb Raider, where Lara moves through a cubic world, but simple texture mapping allowed for a huge range of atmospheric locations. By the end of the decade, gamers had upgraded their systems with powerful Pentium chips, graphics cards, and CD-ROM drives, and true 3D games, especially first-person shooters, became commonplace. Games began to incorporate smooth shading to help soften the blocky look of earlier games, and self-cast shadows, where objects cast shadows on themselves, helped lead to greater realism. In 1998 Epic Games released Unreal, a first-person shooter designed by Tim Sweeney to take advantage of players' beefed-up computers. It included collision detection, so players could more realistically interact with objects in the game, and improvements in light maps allowed for spooky atmospheric lighting and even interactive lighting. Unreal was a huge success, but more importantly the Unreal Engine served as a platform for many games that came after it, and in 2009 the third generation of the Unreal Engine was released to the public in the form of the Unreal Development Kit. Hundreds of games have been built on the Unreal Engine, like BioShock, Arkham Asylum, Borderlands, Dishonored, Gears of War, Mass Effect, The Outer Worlds, Fortnite, and literally hundreds of others. Today, along with publicly available engines like Unity, GameMaker, Godot, and CryEngine, independent game developers, architects, and film studios are taking advantage of freely available game engines. We never have to stop and wait for renders; everything can be adjusted right then and there. Here you can see me tweaking the fog and some of Unreal Engine's post-processing effects, like chromatic aberration, vignetting, and film grain.

Unreal, along with many corporate colleagues, is getting amazingly close to holodecks. The camera here is on a robotic dolly being driven by an operator moving a virtual camera around in the scene; LED lights respond to changes in the lighting environment, and the digital world is projected on the screens surrounding the stage, responding in real time to changes made in the Unreal scene. So now games and cinematic digital effects have merged again. As computers get faster and memory gets bigger, the details of digital environments get more realistic: reflections; atmospheric elements like fog or rain; dynamic lighting; subsurface scattering, which is the effect of light passing through a semi-transparent object like wax or human skin; caustics, which is how light is warped when it passes through a translucent material; real-time physics; fluid-dynamics simulations. And all these effects can be created in free, open-source tools like Blender; in fact, all the examples I've just shown are from Blender.

Architecture and entertainment design have kept up and incorporated many of the technologies coming out of Hollywood and game design. BIM, or Building Information Modeling, technology allows architects and designers to keep track of the hundreds of parts of a project, so as the building is being designed, the software is keeping track of the number of windows, doors, sinks, toilets, whatever is going into that building. Models can show where the sun's shadow will fall throughout the day or the year, thermal maps can be created to help make more energy-efficient buildings and even improve the air flow of heating and cooling systems, and even acoustical engineering can be simulated before ground is ever broken on a project. Clients can roam around a virtual building with cinematic realism using game-like controls.
Scenic designers can show directors photorealistic representations of a set long before drawings are passed on to a shop. Lighting designers take advantage of the same database tools the architects are using to keep track of hundreds of lighting instruments: the type, position, focus, gel color, wattage, and circuiting can all be noted automatically in the background as the designer develops the plot, and the paperwork is generated with a few clicks. Gobos, projections, and LED screens can be previewed digitally on versions of the scenery, and just as in architecture, theatrical designs can be pre-visualized long before any physical work begins. Free tools like Ultimaker Cura make creating G-code, or other instructions read by 3D printers, completely automated, so people with no coding experience can model in 3D and print actual physical objects, and software like Carbide Create (another free one) can program tool paths for CNC routers and other robotic fabrication machines.

The most amazing thing about all this history and development is that these cutting-edge tools are available to anybody who wants to learn them. You don't have to be working on a PhD in mathematics to get your hands on this stuff anymore. Almost every publicly available software package has a free student license or lets you use its tools for little or no cost until you start making money with them; many are outright free. There has never been a better or easier time to learn how to harness these amazing tools, built on the shoulders of giants, to create phenomenal things that were impossible to imagine only a decade ago.