Listen to this interview here, and I think you're going to get very, very bullish on this one. By the way, I did a full-on AMD video on the main channel, Financial Education, three days ago: AMD will never be the same. Talked about a lot of very important things in that video, so if you haven't gotten to check that out yet on the main channel, definitely check out that AMD video.

AMD's CEO and chair. "Thanks for having me here." "Thanks for being here. So talk to me about these systems. You've got a chip sitting there next to you that's, I guess, fresh out of the oven. What is it that you're announcing here? What level of competition does that raise you to with the others out there?"

Yeah, absolutely. So first of all, it's been a big day for AMD. Really exciting. I love talking about our new products, you know that. We just announced our MI355 chip. This is our newest AI chip, and it is fantastic; it's the best inference accelerator out there in the market. And if you look at what's happening in AI today, we're just seeing so many more use cases, John. People are using AI everywhere. So one of the most important things for our customers is to get more efficiency from their chips. When you think about tokens per dollar, that's kind of the metric that is used, and this chip will give you 40% more tokens per dollar, which gives you the opportunity to offer AI at a much more affordable price across the world. So yes, very exciting. We also previewed our 2026 lineup. We're on an annual cadence now of new accelerators, and we're not just doing chips, we're doing big rack-scale systems. All of that was part of our announcements today.
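Quick aside on that tokens-per-dollar metric Su mentioned: here's a rough, purely illustrative sketch of the math. The throughput and price numbers below are made-up assumptions; the only figure that comes from the interview is the 40% uplift.

```python
# Purely illustrative: what "40% more tokens per dollar" means for serving cost.
# The baseline numbers here are made-up assumptions, not AMD or market figures.

baseline_tokens_per_sec = 10_000      # assumed throughput of the older accelerator
gpu_hour_cost_usd = 2.00              # assumed hourly cost to run one accelerator

baseline_tokens_per_dollar = baseline_tokens_per_sec * 3600 / gpu_hour_cost_usd
new_tokens_per_dollar = baseline_tokens_per_dollar * 1.40   # the claimed 40% uplift

# Cost to serve one million tokens under each scenario
baseline_cost_per_m = 1_000_000 / baseline_tokens_per_dollar
new_cost_per_m = 1_000_000 / new_tokens_per_dollar

print(f"baseline: ${baseline_cost_per_m:.4f} per 1M tokens")
print(f"with 40% more tokens/$: ${new_cost_per_m:.4f} per 1M tokens")
print(f"cost reduction: {(1 - new_cost_per_m / baseline_cost_per_m) * 100:.1f}%")
```

The takeaway: a 40% tokens-per-dollar gain works out to roughly a 29% lower cost per token served, which is what Su means by being able to offer AI at a more affordable price.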
Investors are puzzling over how the innovations will translate into share gains versus the likes of Nvidia. So why is it beyond chips now? Why is it about racks? That was a big part of the ZT Systems acquisition. Put that into perspective with the go-to-market and the design component to satisfy the customer need.

Yeah, absolutely. Look, we're really excited about the momentum that we're seeing in the AI market these days. If you look today, seven out of the top 10 model builders and AI companies are using AMD products. That includes Meta, Oracle, OpenAI (Sam Altman was here with us today, Meta and Oracle were here), xAI, Tesla, to name a few. And when you think about what they need, what they really want is the latest-generation hardware that gets into the data center and then gets into serving production workloads as fast as possible. To do that, of course we require chips, but we're also putting a significant amount of investment into the software layer to make it just super easy to use, and we're also building rack-scale solutions. We just completed our ZT acquisition, we've added a thousand design engineers, and the whole purpose is so that we can help our customers get to market with our AI solutions as fast as physically possible.

You've got competition with Nvidia at the high end. Certainly, then, the hyperscalers, the big cloud providers who are also now big AI providers: AWS, Azure (she's like, if you mention Nvidia one more freaking time, one more time), Google. They've got their homegrown AI chips that they're trying to use as much as possible to keep their costs lower and their margins higher. How do you play to both ends of that and win? How does openness fit into your strategy?

Yeah, well, you know, John, when you think about it, I view it as: we're still in the very early innings of AI. And frankly, as much as we've seen a ramp-up in AI, we have said that the market is going to grow over 60% annually for the next number of years, to over $500 billion in 2028. That's a huge market, and in that market what people want is the most flexible and the most programmable architectures out there, and that's really what we're building.

And listen, let me put this very simply: what she just mentioned right there, that alone makes AMD a must-buy. Like, you've got to have AMD in your portfolio. Come on, man. That in itself, we don't even have to go into the rest of it. That alone, you've got to have some AMD in the portfolio, man. It's just a question of how big of a position.

The key point in this, again, is we believe in deep co-development with our partners. So hardware, software, and systems coming together, making sure that we have the right compute for the right application. And that's what we've shown. If you look at just how much improvement, I can give you some stats, right? If you look at the solution we're just talking about, the MI355 (hold it up, hold it up high; yes, I want to make sure everyone gets to see my new baby here), we're very happy that it is now out in production. But this is like three times the performance of what you saw just last year. So that's just how much value you're getting. And we previewed our next generation, our MI400 series, and there we're going to see like 10x types of improvements. So that's how much we can get from the hardware and working closely with our partners. And you asked about open: open is the foundation of what we do at AMD. You know, I'm a real...

Okay, before we go into open, because these are all important points she's making here, but she specifically mentions performance, right? This is a 3x improvement versus what you saw last year, and then with MI400 we're talking like a 10x improvement, right? The important thing, and this is why these companies (and when I say these companies, I'm talking about all the big companies that want to compete in anything AI-related) are basically forced to continue to spend aggressively on these chips, not only this year but next year and in years to come, because if you don't, you're going to fall behind. And you may say, well, why not just wait? Why buy a chip this year when next year they're going to have the MI400s out and those are going to be 10x more powerful? Well, many reasons. One reason is: how many MI400s are you going to be able to get your hands on next year? You realize everybody's going to want that chip, right? So if anything, AMD will be supply-constrained on the MI400, because everybody's going to want that chip. Secondly, you don't have time to wait. You need the most powerful chip now, not next year, not the year after. And you know what? Next year you need the most powerful chip that's available at that time, and then the year after, right? And obviously you've got to weigh in cost as well, like what's going to help you operate as efficiently as possible and keep yourselves in budget. But there are many reasons why you need the most powerful chips AMD has today,
and you also need the most powerful they have next year, the most powerful they have in 2027, and the most powerful they have in 2028. You can't fall behind. You fall behind in this game, you lose. You lose. And do you want to lose? No. If you lose in this game, you're going to miss out on potentially tens of billions, if not hundreds of billions, if not trillion-dollar-type opportunities.

You asked about open. Open is the foundation of what we do at AMD. You know, I'm a real believer in the fact that an open ecosystem is how you get the best innovation into the market. So we have thousands of developers here today. They're playing with our hardware, they're learning our software, and we want all of them developing on our stuff, so that we get the benefit of all of that innovation and they get the benefit of bringing their best ideas in.

And that's important, because when we're talking about the software side, AMD wants everybody to use their software, right? Whereas Nvidia is going a little bit of a different route; they're going a little more walled garden. Some people would say Nvidia's approach is much more similar to an iOS-type approach, right? And we'll see what wins. The thing you've got to understand is both can win. A lot of people think this is a zero-sum game: either AMD wins or Nvidia wins and that's it. No, it doesn't work like that. Guess what happened in mobile phones? iOS won, but you know who else won? Android won. They both won in the end, right? Android was insanely successful, and they had (and still have) way more market share than iOS. But is Apple hurting for money? No, of course not; they're extremely successful as well. So both won. Did Android end up with massively more market share? Sure. But Apple actually became an even more valuable company. So the important thing to understand is that there can be multiple winners; it doesn't have to be just one company that wins, right? Both strategies could end up winning. And who knows, maybe AMD becomes way more popular over time. It might. That doesn't necessarily mean they'll have a bigger market capitalization than Nvidia. Maybe Nvidia ends up long term being a $5 or $10 trillion market cap; let's say Nvidia becomes a $10 trillion market cap, right? Maybe AMD becomes a $2 trillion or $5 trillion market cap. It's like, wow, they're still so far from Nvidia, but would that have been a win? Of course, that would have been a huge win. Monumental, right? So that's an important thing to understand about these markets. And once again, if you want a video where I spoke about AMD pretty extensively, an AMD-only focused video, check out the main channel, Financial Education, and check that baby out.

Okay, hey, it's Jeremy. I hope you really enjoyed that clip here today. What are you looking at in front of you right there? That's the six-figure hall of fame. Do you want to see the next one? That's the seven-figure hall of fame. Now, that is my private group. Those are members of my private group that have scaled to six figures, multi-six figures, seven figures, and multi-seven figures in their stock market portfolios. They say you are the average of the five people you associate with. If you're somebody that is having trouble finding other long-term investors that are focused on building their wealth over the coming years, finding like-minded folks, look no further than my private group. We've got a tremendous number of people, and we're consistently talking with each other, learning from each other, researching companies together, and sharing news about our different companies, right?
We do all that inside my private group. If you would like to apply to join us in there, go down to the description area; there'll be an application down there. You can click on that, apply to join us, take your game up to a much higher level, and associate with the type of folks that you actually want to associate with.

All righty, so let's cover the big stuff that happened here yesterday, because there's some huge stuff. Listen, right off the bat, Lisa Su comes out flexing. She says, "I've been incredibly proud to say that billions of people use AMD technology every day. Whether you're talking about services like Microsoft Office 365 or Facebook or Zoom or Netflix or Uber or Salesforce or SAP, they're running on AMD infrastructure. In AI, the biggest cloud and AI companies are using Instinct to power their latest models and new production workloads. And there's a ton of new innovation coming for new AI startups." So she's flexing right out of the gate. And I think that's important, because you've got somebody like Stacy Rasgon, who's supposed to be this chip analyst who knows everything about chips, right? He goes on CNBC, this was a few weeks ago, maybe a month ago, and I reacted to the video on the reaction channel. So disrespectful, so disrespectful to AMD, in my personal opinion. He said something along the lines of "I'm not even sure we need them." Listen, if you don't need them, why is Microsoft using them, and Facebook, and Zoom, and Netflix, and Uber, and Salesforce, and SAP, and all these big companies? At the end of the day, yes, they're very, very necessary in these markets. Okay?

So, as with many companies now, including partner Dell and rival Nvidia, AMD is now talking about AI agents. This is a huge new opportunity, and Salesforce looks like one of the best companies to benefit from AI agents. There are going to be many companies that benefit from AI agents and how those help businesses, but they benefit companies like AMD and Nvidia in a huge, huge way. Su says that agentic AI represents a new class of user. This is important: "What we're actually seeing is we're adding the equivalent of billions of new virtual users to the global compute infrastructure." So, I use the internet. You, watching this video right now, guess what, you use the internet too. You know how I know? Because you're watching this video right now. Okay, we all use the internet; we're users of the internet. Well, if you've got these AI agents, they're going to be using the internet too. You know why? Because they've got to get answers for all this stuff, they've got to do this and that, they've got to figure all this stuff out, right? If a business is using a bunch of these AI agents to solve customer problems and things like that, guess what, those agents are going to be using the compute infrastructure too. All of these agents are here to help us, and they require lots of GPUs and lots of CPUs working together in an open ecosystem. So this is important, and the more AI agents that are out there, the more demand for compute there is, which benefits a company like AMD handsomely, right?
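Just to put some rough numbers on that "new class of user" idea, here's a purely illustrative back-of-the-envelope. Every single input below is a made-up assumption, not something AMD or Su quoted; the only point is the shape of the math.

```python
# Purely illustrative: rough token demand from AI agents vs. human chat users.
# All inputs below are made-up assumptions for the sake of the example.

human_users = 1_000_000_000          # assumed number of human chat users
human_queries_per_day = 10           # assumed queries per human per day
tokens_per_query = 1_000             # assumed tokens generated per query

agents = 1_000_000_000               # assumed number of deployed AI agents
agent_tasks_per_day = 200            # agents run tasks continuously, not occasionally
tokens_per_task = 5_000              # multi-step tasks burn far more tokens per job

human_tokens = human_users * human_queries_per_day * tokens_per_query
agent_tokens = agents * agent_tasks_per_day * tokens_per_task

print(f"human demand: {human_tokens:.2e} tokens/day")
print(f"agent demand: {agent_tokens:.2e} tokens/day")
print(f"agents add roughly {agent_tokens / human_tokens:.0f}x the human workload")
```

The exact inputs don't matter; the point is that agents which run all day and chain many model calls per task can multiply token demand far beyond what human users generate on their own, and that demand lands on GPUs and CPUs.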
AI isn't just a cloud or data center question now; it's also an endpoint question, in particular the PC: we expect to see AI deployed in every single device. That's big in itself, because you could be talking about separate chips in every single device that are just there to help with AI-related problems, no different than how an iPhone is made up of a bunch of different chips. Some of those are there to help with communications, RF-type stuff; some might be for audio translation; some of it is obviously to run the device, and all those sorts of things, right?

Su hints at another theme that's becoming popular among many new enterprise hardware providers: open source and openness. AMD is the only company committed to openness across hardware, software, and solutions, Su claims, and "the history of our industry shows us that, time and time again, innovation truly takes off when things are open." Now, that's a little bit of a shot at Nvidia, because Nvidia looks like they're trying to build a walled garden, right? Similar to Apple. AMD is taking a different approach; they're taking the approach of "we want this to be an open ecosystem, everybody can use it, and that's the way it's going to be." So they're going about it a little more like Android versus iOS, right? They're taking more of an Android route. And you could look at a ton of different verticals over time, and the ones that went more open ended up becoming the bigger thing versus something that's closed, right? She gave examples of that as well.

Then Su was joined by Sun from xAI (the CEO, I believe), and they're behind Grok. So if you use X (by the way, if you ever want to follow me on X, I always have my X page linked in the description area of all these videos), you might use Grok sometimes, okay? xAI's Sun focuses on how his small team is bolstered by AMD hardware like the MI300X, and of course he talks about openness.

Now, this is when the presentation starts to get really, really important, because they shift to talking about MI350. You've got to understand, MI350 is the real big game-changing product for AMD to really enter this AI fight. They're already in it, right, but in terms of "we're coming, now we've got the weapons," MI350 marks a huge, huge bump up in inference capability specifically versus the basic 300 series. The 350 is a whole new level up, and then MI400 is a whole other level after that. Okay. "With the MI350 series we're delivering the largest generational performance leap in the history of Instinct, and we're already in deep development of MI400 for 2026." Obviously that's going to be a 2026 story; my guess is it takes off at some point mid-2026, or maybe the back half of 2026. "Today I'm super excited to launch the MI350 series, our most advanced AI platform ever, that delivers leadership performance across the most demanding models," she says, adding that while AMD will talk about the MI355 and the MI350, they're actually the same silicon, but the MI355 supports higher thermals and power envelopes "so that we can deliver more real-world performance." So don't think that if you get an MI350 it's way worse than an MI355 or something like that; there are just specific things you would need a 355 for over a 350. Okay.

Now, Su claims the 350 series will deliver a massive 4x generational leap in AI compute (by the way, from what I heard, MI400 should be a 10x). But the 350 is huge, and this is why I tell you guys this is the big step for AMD to really start moving the needle on the revenue numbers, because companies are going to be ordering these hand over fist when you're talking about delivering a 4x generational leap in AI compute to accelerate both training and inference. Some of the specs they're talking about: 288 GB of memory, running models of up to 520 billion parameters on a single GPU.
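Quick sanity check on why 288 GB on a single GPU matters, and why the FP4 and FP6 support you'll hear about in the Meta segment is a big deal: it's basically bytes-per-parameter math. This is a rough sketch with simplified assumptions; it counts weights only and ignores KV cache, activations, and other runtime overhead.

```python
# Rough memory math: how many parameters fit in 288 GB of HBM at different precisions.
# Simplified: weights only, ignoring KV cache, activations, and framework overhead.

HBM_GB = 288
BYTES_PER_PARAM = {
    "FP16/BF16": 2.0,   # 16 bits per weight
    "FP8":       1.0,   # 8 bits per weight
    "FP4":       0.5,   # 4 bits per weight (one reason FP4/FP6 support matters)
}

for fmt, bytes_per_param in BYTES_PER_PARAM.items():
    max_params_b = HBM_GB * 1e9 / bytes_per_param / 1e9   # in billions of parameters
    print(f"{fmt:10s}: ~{max_params_b:,.0f}B parameters fit in {HBM_GB} GB")
```

At FP4 you get roughly 576 billion weights' worth of capacity in 288 GB, which is why a model in the several-hundred-billion-parameter class can sit on one accelerator once you leave headroom for the KV cache and runtime.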
Obviously the chip's very impressive, and MI400 will be a whole other level up from there, right?

Then they bring out Meta's VP of engineering, obviously behind the Llama models: "We're seeing critical advancements that started with inferencing and are now extending to all of our AI offerings. I think AMD and Meta have always been strongly aligned." And obviously Meta and AMD have a very strong relationship. If you think about who you really want to have a strong relationship with in the AI space, gosh, Meta is right at the top of the list, because Meta is integrating AI into basically every single product they have. I can't think of anything that Meta has got going on (and obviously that's my biggest position in the public account), I can't think of anything they're doing, that's not incorporating AI. They're incorporating AI in several different ways across Facebook, Instagram, WhatsApp, everything. And then they've got the Llama models, which is a massive opportunity over the next 10-plus years. So it's good to be in with those guys. Now, Song then explains that MI350 will become a key part of the company's AI infrastructure; it's already using MI300X accelerators. "We're also quite excited about the capabilities of MI350X," he says. "It brings significantly more compute power and memory to support FP4 and FP6, all while maintaining the same form factor as MI300, so we can deploy quickly." Then they go ahead and show off the MI350 series solution partners, and it's all the companies you'd want to have up there: Oracle, Dell, Supermicro, Cisco, HPE, all those types of companies.

Now, this gets really interesting, because we start moving on to Microsoft and AMD. Boppana is a little more on the software side of AMD than the hardware side, and they have Eric Boyd come on, who's a CVP of AI platforms at Microsoft. The two discuss Microsoft and AMD's long-standing partnership, with Boyd saying Microsoft has been using several generations of Instinct: "It's been a key part of our inferencing platform, and we've integrated ROCm into our inferencing stack, making it really easy for us to take and deploy models on the platform."

Now, additionally, here's something I thought was actually really, really big: they talk about what they describe as a new opportunity for driving down inference costs. This is huge, because you've got to understand, more and more of these chips are going to be needed, right? At the end of the day, people are going to use ChatGPT more often, people are going to use all these different AI products more often, Gemini, go down the whole list, a million of them, plus AI agents. You need massively more chips than you already have. Additionally, you need these chips to be even more powerful, which means you have to pay even more money. This is where AMD comes in big, in my opinion. And if you want to talk about differentiating AMD versus Nvidia, AMD is really looking to help bring down the costs for these companies, because you can see the way the numbers are going for Meta, Microsoft, Google, Amazon, run down the whole list of them: it's going to get to a point where they'll end up spending more than they're even making in profits.
That's why you have to bring down the cost. And so I'm sure Microsoft is looking at how much they're spending on all these chips, how much they're likely going to spend over the next few years; I'm sure they're looking at what OpenAI is spending (and obviously Microsoft is a massive shareholder in OpenAI). And you start to have these conversations about "we've got to bring down cost" in regards to this whole situation, because the money's going to flow, but the money doesn't flow right away in terms of you starting to make money. And so this is where AMD starts to really matter in quite a significant way.

Okay: "In any LLM serving application there are two phases, the prefill phase and the decode phase. These two phases of the model are typically handled by the same GPU. But if you use the same GPU, it often becomes a bottleneck for large models, or when demand spikes happen you can get limited in performance or flexibility. Disaggregation, or disaggregating the prefill and the decode phases, can significantly improve throughput, reduce cost, and boost responsiveness," he says. And I think that matters pretty significantly.
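If you want to picture what that prefill/decode disaggregation actually looks like, here's a minimal toy sketch of the idea. This is not AMD's or Microsoft's actual implementation, just an illustration of splitting the two phases onto separate worker pools so each can be sized and scheduled independently; real serving engines also stream the KV cache between pools over fast interconnect.

```python
# Toy sketch of disaggregated LLM serving: separate prefill and decode pools.
# Purely illustrative; real systems are far more involved.
from dataclasses import dataclass, field

@dataclass
class Request:
    prompt: str
    max_new_tokens: int
    kv_cache: list = field(default_factory=list)   # filled by the prefill pool
    output: list = field(default_factory=list)

def prefill_worker(req: Request) -> Request:
    """Compute-heavy phase: process the full prompt once and build the KV cache."""
    req.kv_cache = [f"kv({tok})" for tok in req.prompt.split()]
    return req

def decode_worker(req: Request) -> Request:
    """Memory-bound phase: generate tokens one at a time using the KV cache."""
    for i in range(req.max_new_tokens):
        req.output.append(f"tok{i}")          # stand-in for one decode step
        req.kv_cache.append(f"kv(tok{i})")    # cache grows as tokens are produced
    return req

# Because the phases are split, the prefill pool can be sized for prompt bursts
# while the decode pool is sized for steady token generation.
requests = [Request("explain disaggregated serving", max_new_tokens=4)]
prefilled = [prefill_worker(r) for r in requests]   # prefill pool
finished = [decode_worker(r) for r in prefilled]    # decode pool
print(finished[0].output)
```

The payoff Boyd is describing is that a burst of long prompts no longer stalls ongoing token generation (and vice versa), which is where the throughput, cost, and responsiveness gains come from.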
They also talk about the developer cloud, which is a massive opportunity for this company. With the developer cloud, anyone with a GitHub ID or an email address can get access to an Instinct GPU with just a few clicks, he says, directing developers who want to give it a try to go to dev.amd.com. That's a massive opportunity for this company as well.

Then they bring out the big dog. I take you back to the whole Stacy Rasgon comment, the "I don't know if we really need AMD" thing. Sam Altman is the man right now; he's the one everybody wants to interview. Why? Because he's leading what is arguably viewed as the most important AI company in the world. Sam Altman can be just about anywhere he wants to be right now. The fact that he shows up to AMD's event, I think, speaks volumes about how important AMD is to a company like OpenAI, because that man could be anywhere, getting interviewed by anybody, or working on whatever. The fact that he goes to that AMD event says more than anything he says there, in itself. Okay.

No surprise guest: someone who's really an icon of AI, Sam Altman, comes out. Altman talks about his company and what he's seen in the market: "I think the models have gotten good enough that people have been able to build really great products: text, image, voice, and all kinds of reasoning capabilities. We've seen extremely quick adoption in the enterprise. Now, coding has been one area people talk about a lot, but I think what we're hearing again and again and again, in all these different ways, is that these tools have gone from things that were fun and curious to truly useful," he says. And man, I just think about it: could OpenAI be the first company to come public with a trillion-dollar valuation? That's what I'm thinking right now. Asked by Su how compute demands are changing, Altman says, "One of the biggest differences has been we moved to these reasoning models. So we have these very long rollouts, where a model will often think and come back with a better answer. This really puts pressure on model efficiency and long context. For all of this we need tons of compute, tons of memory, and tons of CPUs as well." People think CPUs are irrelevant; Sam Altman says no, we need tons of CPUs as well, not just GPUs. That's important. Remember, what we're discussing here today has a lot to do with AI and GPUs, but let's not forget AMD's got a whole bunch of businesses that aren't exactly in the realm we're really focused on for this video here today. That's important, okay?

"And our infrastructure ramp over the last year, and what we're looking at for next year, has been just crazy, crazy things to watch." What does he mean by crazy? Is he talking about crazy spend next year? Turning to the hardware used to support OpenAI's platforms like ChatGPT, Altman says the company is already running some work on MI300X, but "we are extremely excited" about, it says here MI450, but from what I remember he was talking about MI350. Remember, MI350 is what's launching right now; MI450, I mean, shoot, that's not even going to be until 6 or 12 months after MI400 comes out. So I think what it should say here is MI350. Keep that in mind, okay? But he's extremely excited about that, and "the memory architecture is great for inference." Big. Remember what Lisa Su has said in the past: she believes AMD is extremely well positioned for inference, and she believes inference is going to be exponentially bigger; and I believe, from Jensen's comments as well, inference is going to be exponentially bigger than training. So when Sam Altman says we're excited for the new MI350 chip that's about to hit, and this architecture is great for inference, I don't know what else to tell you. If Wall Street's having any trouble understanding this, just understand: the most important man in AI says he's very excited for the next chips and that they're great for inference, which is exactly the market you need to be in, and he showed up and proudly spoke at an event for AMD. If you are confused about what AMD's opportunity is here, I don't have anything left to tell you. If that's not enough proof, I don't know what to tell you. The most important man in the world in AI went to this event, spoke in front of everybody, and told them, "Listen, man, the chips are great for inference." What more could you ask for?

Su asked Altman what he sees for the future of AI: how will workloads evolve, and what happens with quote-unquote AGI, artificial general intelligence? He says, "I think we're going to maintain the same rate of progress of models for the second half of this decade as we did the first. I wasn't so sure about that a couple of years ago; it felt like there was a new research thing to figure out, but now it looks like we'll be able to deliver on that. So if you think forward to 2030, these systems will be capable of remarkable new stuff: scientific discovery, running extremely complex functions throughout society, and things that we just couldn't even imagine." He continues: "It's really going to take, these are huge systems now, very complex engineering projects, very complex research, to keep on this path of scaling. We've got to work together across research, engineering, hardware, and how we're going to deliver these systems and products. This has gotten quite complex, but if we can deliver on that, it drives collaboration across the whole industry and we'll keep this curve moving forward." Right. So I think it's important to take a step back here, because right now we're all talking about MI350, and that's about to ramp. That's so exciting; that's going to mean huge things for AMD's revenue, profitability, all that stuff, right?
And then we're going to be talking about MI400 next year, and that's going to be a whole new level higher for AMD's revenue, profits, relevance, everything across the board, right? Oh my gosh. And then we'll be talking about MI450 eventually, right? The thing you've got to understand is that chips just keep evolving. They keep getting more complex, they keep getting better and more powerful. Okay? But listen, don't get too caught up in the short-term stuff. Altman's talking about 2030; he's talking about the 2030s, right? This is not just a one- or two-year game. This is a next-10-to-20-year game, and a company like AMD is positioning itself very well for the next 10 to 20 years in these markets. Very, very well.