Jensen Huang sat down with Mark Zuckerberg at SIGGRAPH 2024 and discussed a range of artificial intelligence topics. I made a supercut of it, so let's check it out.

Jensen Huang: I guess my first question for you is: how do you see the advances of generative AI at Meta today, and how do you apply it to either enhance your operations or introduce new capabilities that you're offering?

Mark Zuckerberg: I think we're going to quickly move into this zone where not only is the majority of the content that you see today on Instagram recommended to you from stuff that's out there in the world that matches your interests, whether or not you follow the people, but in the future a lot of the stuff is going to be created with these tools too. Some of that is going to be creators using the tools to create new content, and some of it, I think, is eventually going to be content that's either created on the fly for you or pulled together and synthesized from different things that are out there. So that's just one example of how the core part of what we're doing is going to evolve, and it's been evolving for 20 years already.

Jensen Huang: I think very few people realize that one of the largest computing systems the world has ever conceived of is a recommender system.

Mark Zuckerberg: Yeah, it's this whole different path. It's not quite the generative AI hotness that people talk about, but it's all the transformer architectures, and it's a similar thing of building up more and more general models and embedding unstructured data into features. One of the big things that drives quality improvements is that it used to be that you'd have a different model for each type of content. A recent example: we had one model for ranking and recommending Reels and another model for ranking and recommending longer-form videos, and it took some product work to make it so that the system could display anything in line. But the more you create general recommendation models that can span everything, the better it gets. Part of it is just economics and liquidity of content: the broader the pool you can pull from, the fewer weird inefficiencies you have from pulling from different pools. As the models get bigger and more general, that gets better and better. I kind of dream of one day all of Facebook or Instagram being like a single AI model that has unified all of these different content types and systems together, and that actually has objectives over different time frames, because some of it is just showing you the interesting content that you want to see today, but some of it is helping you build out your network over the long term: people you may know, or accounts you might want to follow.

Jensen Huang: And these multimodal models tend to be much better at recognizing patterns, weak signals, and such.

Mark Zuckerberg: A lot of the generative AI stuff is, on the one hand, going to be this big upgrade for all of the workflows and products that we've had for a long time. But on the other hand, there are going to be all these completely new things that can now get created.
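To make the unified-model idea Zuckerberg describes a bit more concrete, here is a minimal, purely illustrative PyTorch sketch of a ranking model with a shared trunk over user and item embeddings and separate heads for objectives on different time horizons. This is not Meta's system; every layer size, feature, and objective name here is an assumption for illustration.

```python
# Illustrative sketch only: one general ranking model that embeds heterogeneous
# content into a shared feature space and scores it against several objectives
# (short-term engagement vs. long-term network building).
import torch
import torch.nn as nn

class UnifiedRanker(nn.Module):
    def __init__(self, user_dim=128, item_dim=128, hidden=256):
        super().__init__()
        # Shared trunk: maps (user, item) features into one representation,
        # regardless of whether the item is a reel, a long-form video, or a post.
        self.trunk = nn.Sequential(
            nn.Linear(user_dim + item_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
        )
        # Separate heads for objectives on different time frames.
        self.engagement_head = nn.Linear(hidden, 1)  # "what do you want to see today"
        self.network_head = nn.Linear(hidden, 1)     # "accounts you might want to follow"

    def forward(self, user_emb, item_emb):
        h = self.trunk(torch.cat([user_emb, item_emb], dim=-1))
        return self.engagement_head(h), self.network_head(h)

# Score a batch of candidate items drawn from a single shared content pool.
model = UnifiedRanker()
user = torch.randn(32, 128)
items = torch.randn(32, 128)
short_term, long_term = model(user, items)
blended = 0.8 * short_term + 0.2 * long_term  # illustrative blending of objectives
```

The point of the sketch is just the shape of the idea: one shared model over a single candidate pool, with multiple heads, rather than a separate ranker per content type.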
Mark Zuckerberg: So, Meta AI, the idea of having an AI assistant that can help you with different tasks: in our world it's going to be very creatively oriented, like you're saying, but these models are very general, so you don't need to constrain it just to that. It'll be able to answer any question. Over time, I think as we move from the Llama 3 class of models to Llama 4 and beyond, it's going to feel less like a chatbot, where you give it a prompt and it responds, then you give it another prompt and it responds, just back and forth. I think it's pretty quickly going to evolve to where you give it an intent and it can actually go away and work over multiple time frames. It probably should acknowledge that you gave it an intent up front, but some of this stuff will end up spinning up compute jobs that take weeks or months and then just come back to you, like something happened in the world. I think that's going to be really powerful.

Jensen Huang: Today's AI, as you know, is kind of turn-based: you say something, it says something back to you. But obviously, when we're given a mission or a problem, we'll contemplate multiple options, or maybe we come up with a tree of options, a decision tree, and we walk down the decision tree, simulating in our minds the different outcomes of each decision we could potentially make. So we're doing planning, and in the future AIs will do the same.
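As a rough illustration of the intent-driven, plan-then-act behavior both speakers describe, here is a toy Python sketch of an agent loop that expands an intent into a few candidate plans, "simulates" each branch, and walks down the best one. The llm() function is a stubbed stand-in for any chat model call (for example, a hosted Llama endpoint), and the scoring is a placeholder; none of this reflects Meta's actual implementation.

```python
# Toy sketch of "give it an intent, it plans over a tree of options, then acts".
from dataclasses import dataclass

def llm(prompt: str) -> str:
    """Placeholder for a real model call; returns canned text so the sketch runs."""
    return "option A: ...\noption B: ...\noption C: ..."

@dataclass
class Plan:
    description: str
    score: float

def plan_for_intent(intent: str, num_options: int = 3) -> Plan:
    # 1. Expand the intent into a few candidate plans (the "tree of options").
    raw = llm(f"Propose {num_options} distinct plans to accomplish: {intent}")
    options = [line for line in raw.splitlines() if line.strip()][:num_options]

    # 2. "Simulate" each branch by asking the model to predict its likely outcome,
    #    then score it. The score here is a stub; a real system would judge quality.
    candidates = []
    for opt in options:
        critique = llm(f"Predict the outcome if we follow this plan: {opt}")
        candidates.append(Plan(opt, float(len(critique))))

    # 3. Walk down the best branch. A real agent would then execute it over hours
    #    or weeks and report back, as described in the conversation.
    return max(candidates, key=lambda p: p.score)

best = plan_for_intent("Research and summarize what our customers asked about last month")
print(best.description)
```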
Jensen Huang: One of the things that I was super excited about when you talked about your vision of Creator AI: I just think that's a home run idea, frankly. Tell everybody about Creator AI and AI Studio that's going to enable you to do that.

Mark Zuckerberg: Yeah, this is something we've talked about a bit, but we're rolling it out a lot wider today. A lot of our vision is that there isn't just going to be one AI model. Some of the other companies in the industry are building one central agent, and yes, we'll have the Meta AI assistant that you can use, but a lot of our vision is that we want to empower all the people who use our products to create agents for themselves, whether that's the many millions of creators on the platform or hundreds of millions of small businesses. We eventually want to be able to pull in all your content and very quickly stand up a business agent that can interact with your customers and do sales and customer support and all that. The one we're just starting to roll out more now is what we call AI Studio. It's a set of tools that eventually is going to make it so that every creator can build sort of an AI version of themselves, as an agent or assistant that their community can interact with. It's going to be very clear that it's not the creator themselves you're engaging with, but I think it'll be another interesting way, just like how creators put out content on these social systems, to have agents that do that.

I think people are going to create their own agents for all kinds of uses. Some will be customized utility things they want to fine-tune and train an agent for. Some will be entertainment; some of the things people create are just funny, or silly in different ways, or have a funny attitude about things that we probably wouldn't build into Meta AI as an assistant, but that people are pretty interested to see and interact with. And then one of the interesting use cases we're seeing is people using these agents for support. This was a little surprising to me: one of the top use cases for Meta AI already is people using it to role-play difficult social situations they're going to be in. Whether it's a professional situation, like "I want to ask my manager for a promotion or a raise," or "I'm having this fight with my friend," or "I'm having this difficult situation with my girlfriend, how could this conversation go?", it's a completely judgment-free zone where you can role-play the conversation, see how it might go, and get feedback on it. But a lot of people don't just want to interact with the same kind of agent, whether it's Meta AI or ChatGPT or whatever everyone else is using; they want to create their own things. So that's roughly where we're going with AI Studio, but it's all part of this bigger view we have that there shouldn't just be one big AI that people interact with. We just think the world will be better and more interesting if there's a diversity of these different things.

Jensen Huang: I just think it's so cool that if you're an artist and you have a style, you could take your style, all of your body of work, and fine-tune one of your models, and now this becomes an AI model that you can prompt. You could ask me to create something along the lines of the art style that I have, and you might even give me a piece of art, a drawing, a sketch, as inspiration, and I can generate something for you. Every single restaurant, every single website will probably have these AIs in the future.

Mark Zuckerberg: Yeah, I kind of think that in the future, just like every business has an email address and a website and a social media account or several, every business is going to have an AI agent that interfaces with their customers.

Jensen Huang: Can I use AI Studio to fine-tune it with my images, my collection of images?

Mark Zuckerberg: Yeah, we're going to get there.

Jensen Huang: Okay, and then can I give it all the things that I've written, so it uses them as my RAG?

Mark Zuckerberg: Yeah, basically.

Jensen Huang: Okay, and then every time I come back to it, it loads up its memory again, so it remembers where it left off last time, and we carry on our conversation as if nothing ever happened.

Mark Zuckerberg: Yep.
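The fine-tune-plus-RAG-plus-memory workflow Jensen sketches here can be pictured with a bare-bones example: retrieve passages from a creator's own writing for each question, and persist conversation history to disk so the next session resumes where the last one ended. This is a minimal sketch under assumed file names, with naive keyword-overlap retrieval standing in for a real embedding index and a placeholder where the model call would go.

```python
# Bare-bones RAG-with-memory sketch; file names and retrieval method are illustrative.
import json
from pathlib import Path

MEMORY_FILE = Path("conversation_memory.json")

def load_corpus(paths):
    """Split the creator's documents into small passages for retrieval."""
    passages = []
    for p in paths:
        text = Path(p).read_text()
        passages.extend(chunk for chunk in text.split("\n\n") if chunk.strip())
    return passages

def retrieve(query, passages, k=3):
    """Rank passages by word overlap with the query (stand-in for vector search)."""
    q = set(query.lower().split())
    ranked = sorted(passages, key=lambda p: len(q & set(p.lower().split())), reverse=True)
    return ranked[:k]

def load_memory():
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def save_memory(history):
    MEMORY_FILE.write_text(json.dumps(history, indent=2))

def answer(query, corpus_paths):
    passages = load_corpus(corpus_paths)
    history = load_memory()                      # "it loads up its memory again"
    context = retrieve(query, passages)
    prompt = {"history": history[-10:], "context": context, "question": query}
    reply = f"(model response grounded in {len(context)} retrieved passages)"  # placeholder
    history.append({"user": query, "assistant": reply})
    save_memory(history)                         # remembered the next time you come back
    return reply

# Example (assumes the file exists):
# answer("What did I say about color palettes?", ["my_essays.txt"])
```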
Jensen Huang: Tell me about your open-source philosophy. Where did it come from? You open-sourced PyTorch, and it's now the framework by which AI is done, and now you've open-sourced Llama 3.1 and there's a whole ecosystem built around it. I think it's terrific, but where did that all come from?

Mark Zuckerberg: There's a bunch of history on a lot of this. We've done a lot of open-source work over time. Part of it, just bluntly, is that we got started after some of the other tech companies in building out things like the distributed computing infrastructure and the data centers, and because of that, by the time we built that stuff it wasn't a competitive advantage. So we figured we might as well make it open, and then we'd benefit from the ecosystem around it. We had a bunch of projects like that. The biggest one was probably Open Compute, where we took our server designs, the network designs, and eventually the data center designs and published all of that. By having that become somewhat of an industry standard, all the supply chains basically got organized around it, which had the benefit of saving money for everyone. By making it public and open, we basically saved billions of dollars.

Jensen Huang: Open Compute was also what made it possible for NVIDIA HGX, which we designed for one data center, to all of a sudden work in every data center.

Mark Zuckerberg: Yeah, awesome. So by the time Llama came around, we were positively predisposed toward doing this for AI models specifically. There are a few ways I look at it. One is that it's been really fun building stuff over the last 20 years at the company, and one of the most difficult things has been having to navigate the fact that we ship our apps through our competitors' mobile platforms. On the one hand, the mobile platforms have been this huge boon to the industry, which has been awesome. On the other hand, having to deliver your products through your competitors is challenging. When you look at these generations of computing, there's a big recency bias where everyone just looks at mobile and thinks the closed ecosystem won, because Apple basically won and set the terms of that. And yeah, I know there are technically more Android phones out there, but Apple basically has the whole market. I think in general, for the computing platforms that the whole industry builds on, there's a lot of value if the software especially is open. That's really shaped my philosophy on this, both for AI with Llama and for the work we're doing in AR and VR, where we're making Horizon OS, the operating system we're building for mixed reality, an open operating system in the sense of what Android or Windows was, and making it so that we can work with lots of different hardware companies to make all different kinds of devices. We basically just want to return the ecosystem to that level, where the open one wins, and I'm pretty optimistic that in the next generation the open ones are going to win.

For us specifically, I just want to make sure we have access to the fundamental technology. This is sort of selfish, but after building this company for a while, one of my goals for the next 10 or 15 years is to make sure that we can build the fundamental technology that we're going to be building social experiences on, because there have just been too many things that I've tried to build and then been told "nah, you can't really build that" by the platform provider. At some level I'm just like, nah, for the next generation we're going to go build all the way down and make sure that we...

Jensen Huang: There goes our broadcast opportunity.

Mark Zuckerberg: Yeah, sorry. Get me talking about closed platforms and I get angry. I do think there's this alignment where we're building it because we want the thing to exist and we don't want to get cut off from some closed model. But this isn't just a piece of software you can build; you need an ecosystem around it. It almost wouldn't even work that well if we didn't open source it. We're not doing this because we're altruistic people, even though I think it's going to be helpful for the ecosystem; we're doing it because we think it's going to make the thing we're building the best, by having a robust ecosystem around it.

Jensen Huang: Look how many people contributed to the PyTorch ecosystem.

Mark Zuckerberg: Yeah, totally.

Jensen Huang: And I recognized an important thing: I think Llama is genuinely important. We built this concept we call an AI Foundry around it, so that we can help everybody build. A lot of people have a desire to build AI, and it's very important for them to own the AI, because once they put that into their data flywheel, that's how their company's institutional knowledge gets encoded and embedded into an AI. They can't afford to have that AI flywheel, that data flywheel, that experience flywheel somewhere else, and open source allows them to keep it. But they don't really know how to turn the whole thing into an AI, so we created this thing called the AI Foundry. We provide the tooling, we provide the expertise and the Llama technology, and we have the ability to help them turn the whole thing into an AI service. When we're done with that, they take it, they own it. The output is what we call a NIM, an NVIDIA Inference Microservice. They just download it, take it, and run it anywhere they like, including on-prem. And we have a whole ecosystem of partners, from OEMs that can run the NIMs to GSIs like Accenture that we've trained and worked with to create Llama-based NIMs and pipelines. Now we're off helping enterprises all over the world do this. It's really quite an exciting thing, and it was all triggered by the Llama open sourcing.
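Since NIM containers expose an OpenAI-compatible HTTP API, the "download it and run it anywhere" step Jensen describes typically ends with a client call along the lines of the hedged sketch below. The base URL, port, and model identifier are assumptions that depend on how the specific microservice was deployed; this is not an official NVIDIA example.

```python
# Hedged sketch of querying a locally hosted Llama NIM through its
# OpenAI-compatible endpoint; endpoint and model name are assumed.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local NIM endpoint
    api_key="not-needed-for-local",        # placeholder; local deployments may not check it
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",    # assumed model identifier for the deployed NIM
    messages=[
        {"role": "system", "content": "You are a customer-support agent for Acme Co."},
        {"role": "user", "content": "Where is my order #1234?"},
    ],
)
print(response.choices[0].message.content)
```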
Jensen Huang: One of the things that I really love about the work you guys do is computer vision. One of the models that we use a lot internally is the Segment Anything model.

Mark Zuckerberg: The Segment Anything model you're talking about, we're actually presenting, I think, the next version of it here at SIGGRAPH: Segment Anything 2. It's faster, and it works in video now as well. A lot of fun effects will be able to be made with this, and because it'll be open, a lot of more serious applications across the industry too. Scientists use this stuff to study things like coral reefs, natural habitats, and the evolution of landscapes. Being able to do this in video, having it be zero-shot, and being able to interact with it and tell it what you want to track, it's pretty cool research.
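For readers who haven't used it, here is a short sketch of the promptable, zero-shot idea using the published segment-anything Python package (the original image model); Segment Anything 2 extends the same point-and-box prompting to video tracking. The checkpoint path and input image below are assumptions about files you have locally.

```python
# Point-prompted segmentation with the original Segment Anything model.
import numpy as np
import cv2
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h.pth")  # assumed local checkpoint
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("reef.jpg"), cv2.COLOR_BGR2RGB)  # illustrative image
predictor.set_image(image)

# One foreground click is the "prompt": tell the model what you want to segment.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[450, 300]]),
    point_labels=np.array([1]),          # 1 = foreground point
    multimask_output=True,
)
best_mask = masks[np.argmax(scores)]
```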
Jensen Huang: Now, what else are you guys going to work on beyond Ray-Ban? Talk to me about it.

Mark Zuckerberg: So there are the smart glasses. When we think about the next computing platform, we break it down into mixed reality, the headsets, and the smart glasses. The smart glasses, I think, are easier for people to wrap their heads around wearing, because pretty much everyone who's wearing a pair of glasses today will end up having them upgraded to smart glasses, and that's more than a billion people in the world, so that's going to be a pretty big thing. The VR and MR headsets, some people find interesting for gaming or other uses, and some don't yet. My view is that both are going to be in the world. I think the smart glasses are going to be the mobile-phone, always-on version of the next computing platform, and the mixed reality headsets are going to be more like your workstation or your game console, for when you're sitting down for a more immersive session and you want access to more compute. The glasses are just a very small form factor, and there are going to be a lot of constraints on that, just like you can't do the same level of computing on a phone.

Jensen Huang: It came at exactly the time when all of these breakthroughs in generative AI happened.

Mark Zuckerberg: Yeah. For smart glasses, we've been going at the problem from two different directions. On the one hand, we've been building what we think is the technology you need for the ideal holographic AR glasses: all the custom silicon work, all the custom display stack work, all the stuff you would need to make that work. And they're glasses; it's not a headset, it's not a VR or MR headset, they look like glasses. But they're still quite a bit off from the glasses you're wearing now. Those are very thin, and even with the Ray-Bans that we make, you couldn't quite fit all the tech you need into that form factor yet for full holographic AR, though we're getting close. The other angle we've come at this from is: let's start with good-looking glasses. We have camera sensors, so you can take photos and videos, you can live stream to Instagram, you can take video calls on WhatsApp and stream what you're seeing to the other person. It has a microphone and speakers; the speaker is actually really good, it's open-ear, so a lot of people find it more comfortable than earbuds. You can listen to music, and it's this private experience that's pretty neat; people love that. You can take phone calls on it. And then it turned out that that sensor package was exactly what you needed to be able to talk to AI too. It's great that we have this now, but in the future we're not that many years away from being able to have a virtual meeting where, say, I'm not there physically, it's just my hologram, and it feels like we're there, physically present, and we can work on something and collaborate on something together. I think this is going to be especially important with these kinds of applications.

Jensen Huang: I could live with a device that I'm not wearing all the time.

Mark Zuckerberg: Oh yeah, but I think we're going to get to the point where it actually is something you wear all the time. Within glasses there are thinner frames and thicker frames and all these styles. I think we're a while away from having full holographic glasses in the form factor of the glasses you're wearing, but having it in a pair of stylish, somewhat chunkier-framed glasses is not that far off.
Jensen Huang: These sunglasses are face-sized these days. I could see that.

Mark Zuckerberg: Yeah, and you know what, that's a very helpful style trend. I'm trying to make my way into becoming a style influencer, so I can influence this before the glasses come to the market.

Jensen Huang: Well, I can see you attempting it. How's your style influencing working out for you?

Mark Zuckerberg: You know, it's early. But I don't know, I feel like if a big part of the future of the business is going to be building stylish glasses that people wear, this is something I should probably start paying a little more attention to.

Jensen Huang: Ladies and gentlemen, Mark Zuckerberg. Thank you.