Transcript for:
AI Censorship and Economic Impact Overview

I think the AI censorship wars are going to be a thousand times more intense and a thousand times more important. My guest today is someone who doesn't just keep up with innovation, he creates it. The incredible Marc Andreessen.

Trust me, when someone like Marc, who spent his entire career betting on the future, says this is the next major disruption, you need to listen. From a political standpoint, we should hope that we have rapid technology progress, because if we have rapid technology progress, we'll have rapid economic growth. Do people care? And are people going to be willing to stand up for this? And I think that's what's required.

It's going to displace a lot of jobs. Some of those people will redistribute themselves by acquiring new skills. Other people will not. This isn't something to think about tomorrow. You've got to be prepared today.

So let's dive right in. I bring you Marc Andreessen. Marc Andreessen, welcome to the podcast.

Awesome. Thank you for having me. My pleasure. Now, you've had an insane amount of success betting on where industries are going. So let me ask you, what is the most radical disruption that you see coming in the near future with AI?

You know, I'd just say we're convinced AI is one of those moments of fundamental change. And, you know, in the tech industry, these come along every couple of decades, but they're not frequent. And this one is up there with the microprocessor and the computer and the Internet, for sure, and maybe bigger.

And so for us in the tech industry, this is, I think, a very, very profound, powerful moment. And of course, you're already seeing a lot of the effects playing out. But this technology is going to change a lot of things.

And it's going to be, I think, very, very exciting. And so for people that don't know, you have a fundamentally optimistic view of AI, and of technology in general. From an investment strategy, do you guys have a thesis on which industry you think is going to be most advantaged by AI that you're trying to get into? Yeah, well, there are many.

So we're involved in many. I would say there are obvious slam-dunk ones, and health care is one of them. I actually just happened to have lunch with Demis Hassabis, who just won the Nobel Prize in Chemistry for his work on protein folding. Not a bad lunch date. Yeah, exactly.

And he was knighted this year also, so he's also Sir Demis. Good year. But he and his colleagues basically have this transformative approach that they believe is going to lead to dramatic breakthroughs in the development of medicine in the years ahead, powered by AI. So healthcare is an obvious one. And entertainment is one where I think what happens from here is going to be extremely exciting. And again, that's already starting to play out. You're already seeing just incredible creativity being applied to it.

And so maybe you could bookend it with those two, because one's the most serious and one's the most fun. But look, there's lots and lots of other stuff. Probably the single biggest question I'm asking right now is robotics.

You know, there's been the promise of robotics kind of saturating our society: everybody having robots in the home, robots to do all the manual labor, wash the dishes, pack the suitcase, clean the toilet, conceivably everything, to free people from manual labor. And that's been a promise going back in science fiction for, like, 120 years. And until recently, we were no closer than we were maybe back then. But you're starting to see very dramatic, I think, breakthroughs.

And I think, you know, just like autonomous drones are now a standard thing, self-flying, self-piloting drones, you now have self-driving cars that are now a thing and work really well. And next it may be humanoid robots and all kinds of other forms of robots. We have two Chinese robot dogs at home.

What? Yeah, yeah, yeah, yeah. You actually have them at your house? Yeah, yeah, yeah. So everybody's probably seen all the demos.

Remember, there's this company, Boston Dynamics. They always have these great demos. You see these videos of these robot dogs running around.

But they cost like $50,000, $100,000. And that company never really brought them to market. So they never really existed outside of demos. But there are now Chinese companies that have these things down to $1,500. Yeah.

And they're great. They run around. They run actually quite quickly. They can outrun you.

They do flips. They stand on their hind legs. They climb stairs. There's a version of it that has wheels that can go like 30 miles an hour.

Then that one can also climb stairs. It locks the wheels and it's perfectly fine climbing stairs. So, you know, those are really starting to work. And then, you know, humanoids are coming fast.

And, you know, Elon just had his demo day for the Tesla Optimus robots. That was unreal. Yeah, and so those are starting to work. You know, it's not quite there yet.

Like those were still teleoperated. There are still people in the background with VR headsets that are kind of steering those and guiding those and helping those. But that's also how you train these robots: they kind of watch what people do, and then you train on that. So I think we might actually be reasonably close on robotics, which would have a very big impact.

And so, yeah, maybe you could call out those three categories as obvious ones to focus on. What kind of timeline do you have for robotics? When are we going to start having that first round of people buying them and having them in their home? I know Elon's pegged it at 20 to 30 grand.

When? Yeah, so self-piloting drones were a very big breakthrough. And the dominant ones in the global market come from this company DJI, which is this big company in China.

But those now work really well. And then there are American companies that have, I think, even better technology.

that aren't quite the same size yet but are really good. And that's a big deal. We have drones now that can fly between tree branches, they can fly indoors, they can fly completely autonomously, by the way, through underground tunnels. And so those work really well. And then, like I said, self-driving cars: the Waymo cars now are great, and people who use them have fantastic experiences, and the Tesla self-driving capability is getting really good. And so I go through those to say, those are both robots, a flying robot and a driving robot. And so a walking robot, all of a sudden, is not so crazy. Exact timing, I don't know. You know, swag, five years, but could be two, could be eight. I don't know, optimistically, three or four.

You know, there are many, many possible form factors for these things, right? Designs. The theory of humanoid robots, which I believe in, is that the great thing about humanoid robots is there's just so much of the physical world that assumes that there's a person present, right? So, a person standing in an assembly line, a person driving a car, a person driving a tractor, a person picking vegetables in a field.

There's just all these systems that we have that just assume there's a person. And so if you build a robot in the shape of a person, in theory, it can just kind of fill in and do all that work. And so that should be a very big market. And obviously people should be very comfortable with that.

They'll dovetail really well into kind of normal society. But I also think you'll be able to package these things up however you want, and so there will be lots of other forms. There already are obviously lots of robots in the world, but there will be more and more of different kinds.

And what are the hard parts? What are the hurdles they still have to overcome that are going to make it three, four, possibly eight years from now? Yeah, so there are basically, I would say, three big categories.

So there's the physical controls, the actual physical body and its ability to control itself. And that's where, if you look at Elon's demonstration the other night, you can kind of see how fast that stuff's moving. Because if you watch the progression across the other companies doing it, they're getting much better.

And so that's just moving right along. Then there's battery power, which is probably still a fundamental limit, because it's a question of how long you can actually power one of these things before it has to recharge or do a battery swap.

And that's still a bit of an issue, and it's hard to make progress on batteries, but a lot of people are working on it. And then software is the big challenge, I think, and where we would get more involved. And so think about it: these robots have sensors.

They've got visual sensors. They've actually got, like the robot dogs have what's called LiDAR, which is sort of the light version of radar, which is the same thing that's in the Waymo cars. And so, you know, they've got sensors.

They've got sound. They can gather input from all around them. Actually, they can gather input from their environment better than a human can, because they can see 360 degrees and they can do depth sensing and so forth in ways that we can't. So they get all the raw data, but then you have to actually process that data.

You have to form it into a model of the world. You have to then, the robot has to have a plan for what it does, right? And then it has to understand the consequences of the plan, right?

And so, you know, I'm setting the coffee down on the table. You know, I can't set it down on somebody's hand. So I have to set it down near the hand, but not on the hand.

I have to keep it level because if I tip it, you know, I'm going to scald somebody, right? And then, by the way, while I, the robot, am setting the coffee down, the person has moved.

Right. And so I have to adapt to it, right? Or, you know, same thing walking through a crowd. You can't have robots running into people.

And so you have to have this kind of approach; that's how they're approaching that problem. So if I think about when I saw the robots interacting with the people at the party, is there an underlying goal for the robot to be likable? Is it like, hey, get to know people, try to charm them? What is the plan that they're giving to the robot that it's moving towards?

Yeah, so, in general, if you're a company, you want these things to be basically completely benign, right? Because it actually lines up nicely with the profit incentive. You want friendly, approachable products that make people happy, products that make people comfortable, products that aren't threatening or intimidating and aren't hurting people. And so you put a really, really big focus on fitting into the environment. You put a really big focus on avoiding anything that would ever harm a human being.

You put a very big focus on the idea that the robot should happily step into traffic or whatever if it's going to save somebody's life. And so you want that. And then, yeah, I think generally you want it to be sort of approachable; safe and harmless are terms that get used a lot.

Friendly. Now, look, this is the other thing: there used to be this really hard challenge, which is how are you going to control these things? How are you going to talk to them?

If you watch Star Wars, they communicate in beeps and boops.

If you watch Star Trek and you're watching Commander Data, he's talking in English. Up until two years ago, we thought it would have to be beeps and boops. But now we have large language models and we have these voice AI interfaces. OpenAI just released their advanced voice mode.

It's like talking to the computer on the Starship Enterprise. It's just like talking to a person. All of a sudden, you can give these robots voices. They can talk. They can listen.

They can explain quantum physics to you, they can sing you a lullaby, they can forecast the presidential election. They can now do whatever you want. And so that's the other part of it: you're going to really be able to talk and interact with them. The first one I saw, the Boston Dynamics guys did this hysterical demo where they wired up one of these early language models, a couple years ago, to their robot dog, and they gave it a super plummy English butler voice. So it's this mechanical robot dog stomping around, but it's talking to you like you're Bruce Wayne and it's Alfred or something. You know: it's a robot dog, what do you see?

And it answers in this very plummy accent: oh, you know, I see a lovely pile of rocks. And so, yeah, by the way, there's going to be enormous creativity.

There's this startup we're not involved in, but I like the guys a lot called Curio in Redwood City that basically has a plushie. So they have a stuffed animal. And it's basically designed for little kids.

And it's a voice UI. And it's back-ended by a large language model. And, you know, it doesn't move. It's just a plushie with a voice box. But it will happily sit and tell kids jokes and teach them all about, you know, whatever they want to learn about and talk to them about whatever's on their mind.

And they have it really elegantly wired up where the parent can control what the toy is willing to talk about. So as a parent, you can define the topics that are go zones versus no-go zones. You could say, let it talk to the kid about science, but not politics, for example. And then as a parent, you get a real-time transcript of the interaction. So your kid's up in the bedroom talking to the thing, and you actually get to see the conversation.
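That parental-control setup, allowed topics versus no-go topics plus a live transcript, can be sketched in a few lines. This is a hypothetical illustration of the pattern, not Curio's actual product or API:

```python
# Hypothetical sketch of topic gating plus transcript logging for a
# kids' voice toy. Not based on Curio's real implementation.

ALLOWED_TOPICS = {"science", "animals", "jokes"}   # parent-defined go zones
BLOCKED_TOPICS = {"politics"}                      # parent-defined no-go zones

transcript = []  # parent-visible log of the whole conversation

def classify_topic(utterance: str) -> str:
    """Stand-in for a real topic classifier; naive keyword match for the demo."""
    lowered = utterance.lower()
    for topic in ALLOWED_TOPICS | BLOCKED_TOPICS:
        if topic.rstrip("s") in lowered:
            return topic
    return "smalltalk"

def respond(child_utterance: str) -> str:
    """Gate the topic, log both sides, and produce a reply."""
    transcript.append(("child", child_utterance))
    topic = classify_topic(child_utterance)
    if topic in BLOCKED_TOPICS:
        reply = "Let's talk about something else! Want to hear a joke?"
    else:
        reply = f"Great question about {topic}!"  # an LLM call would go here
    transcript.append(("toy", reply))
    return reply

print(respond("Why do animals sleep so much?"))
print(respond("Who should win the politics debate?"))
```

The real system would put an LLM behind `respond` and stream `transcript` to the parent's app; the gating logic itself stays this simple.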

Right. And so it's funny: when you watch this with little kids, they just think this is the most natural, normal thing in the world. Right. I've talked about this in the past.

I have a nine-year-old, and I brought ChatGPT home when it first shipped, two years ago; I guess he was seven. And he has a laptop that he does some of his school stuff on.

And so I set up ChatGPT on his laptop and I sat him down. I was so proud of myself. Because I'm like, I don't know, it's like I'm coming down from the mountain to deliver the gift of fire to my child.

Like I'm giving him like the super technology that's going to be with him his whole life. That's going to answer any question and help him with all this work. And it was like the most amazing gift of technology I could give him.

And I showed him ChatGPT and I said, you know, you type in any question you want, and then it answers the question. And he looked at me and he said, you know... so? Right? And I was like, what do you mean?

So, like, this is like the breakthrough. This is like, this is the thing. This is like the thing for 80 years we've all been working on and it finally works.

And he's like, what else would you use a computer for? So funny. Like, obviously it answers your questions.

Right? And so, like, I think kids are going to, kids are, I mean, it's already happening. Kids are going to pick this up, like, incredibly fast.

It's going to be, you know, super normal. Anyway, long answer to your question. But we have a chance to design technology to be as friendly and helpful and accommodating and supportive as we can possibly imagine. And I think the commercial products will all get built that way, for sure. Yeah, to me, that's where the biggest disruption is going to be.

When I think about AI, I think I'm as optimistic as you in terms of the things that it will do for us. The intellect: you're going to be able to throw God knows how many new PhD-level minds, and maybe one day even more, at all these incredible problems. All right, that's going to be utterly fantastic. But then I think about: your dog becomes a robot dog, becomes furry and fluffy and wonderful, but it also talks to your kids and helps raise them.

And you have this lens into it. And then all of a sudden it's, well, it's not just the dog. It's, I've got an AI girlfriend. She's not really a girlfriend, not like that.

Well, but then, you know, I've been talking to her for three years, and now the robot body comes online, and I want to put that AI into the robot body. And all of a sudden, I think that there's going to be a pretty fascinating, to try to keep it positive here, a fascinating schism that will happen in society.

So five years ago I wrote a comic book about this, about what I think is going to happen. And I think there's going to be a bifurcation in society.

And I really think this is actually going to happen. How big and how dramatic that remains to be seen. But I think you're going to get a subset of society that says, nope, not doing this.

It's like the opening line in Dune: thou shalt not make a mind, an artificial mind mirroring human intelligence, or whatever the exact line is. I think people will eschew AI, they will eschew Neuralink and things like that, and they'll be sort of this new puritanical vein of humanity. And then you're going to get other people like me that embrace the technology. I may not be an early adopter of Neuralink, but if it truly gets safe and it allows me to upgrade my abilities, man, I will do that in a heartbeat. And so then it becomes a question of how much friction will there be between those two sides.

But those seem inevitable. Do you think I'm crazy about that? Or do you see that same inevitability?

And if so, how does it play out? I mean, I think it's certainly a plausible scenario. I think it's certainly logical. It certainly can play out that way.

I guess my model of human behavior is different. So I'm skeptical, skeptical that that is what will happen. And I would just start by saying that there is a schism like that in our society today: the Amish.

Yeah. And I actually grew up, you know, there were Amish near where I grew up. And so the good news with the Amish is they have a defined quality of life. Their whole value system sort of involves rejecting technology, by the way, for some very deeply thought-through reasons. And they're, by all accounts, in many cases, very happy.

And by the way, they're also very fertile; they're having lots of kids. And so there's actually quite a bit to admire about what they do. Having said that, I would just say two things. One is they're a very, very small percent of the population.

And so there's not a lot of people who volunteer to become Amish. And then the other thing that happens, if you track them in detail, what actually happens is they don't reject technology. They just adopt it on a lag. Right.

And basically the lag is about 30 years. And there's been a bunch of articles about this over the last decade; for example, they're now adopting PCs, personal computers. Yeah, yeah, yeah.

Well, because it's so... I thought they were still without electricity. No, no, no, no.

They've got electricity. I mean, they try to control it, but they definitely, and this is a great example, they definitely have it, right? And then they now have landline telephones.

So there's just a point where things get to be practical. So the PC thing, per the articles that I've read, basically what it is, is the personal computer: they run these small businesses. They'll do handcrafted furniture, for example, these amazing pieces. Well, it's just a lot easier to run a furniture store if you've got a personal computer to do the ledger and the inventory on, right? And at a certain point, they figure out a theory under which that's okay.

They still don't connect it to the internet, you know, but they have their personal computer, by the way. And then you just kind of say, inevitably, the next step is they're going to want to sell their furniture online. And so it's just a matter of time until they figure out a way to bring in an internet connection, right?

And so one of the really, really fascinating things about AI is it went from being something that was sort of speculative and weird three years ago to something that is now actually quite common, already in use. And this is quite a profound and powerful thing that I think we'll probably talk a lot about today. Number one, AI is already in wide use. The number of users on systems like ChatGPT and Midjourney and whatever are already in the hundreds of millions and growing very fast. And lots and lots of people are using these things, and they use them in their everyday life.

They use them for work. They may or may not admit to their boss they're using them for work, but they're definitely using them for work. Students are using them in school. If you've got teenage kids, any classroom in America now is grappling with this question of, is the kid bringing in an essay that ChatGPT wrote?

You know, but they're helping with homework and they're doing all kinds of stuff. And the usage numbers on these services kind of reflect, you know, already broad-based adoption. And then there's a really powerful thing underneath that that's really important, which is the most powerful AI systems in the world are the ones that you get on the internet for free or maximum 20 bucks a month.

And very specifically, I have the capability, if I want to, to go spend a million dollars to just have the best AI. I could go spend a million dollars a year. If I go spend a million dollars a year today, I do not get a better AI than you get when you sign up for ChatGPT. It's literally not available. I can't do it. The best AI in the world is the thing that's on ChatGPT, or, by the way, Google Gemini, or Microsoft Bing, or Anthropic's Claude, or the xAI one, or Mistral, which is one of our companies, or Llama from Meta. There are like seven of these now that are available either for free or at most for 20 bucks a month. And they're the best in the world. And so it's actually quite striking, which is shocking to a lot of people who have the mental model of, oh, well, the best technology must be basically...

Hoarded by a few people who are then going to lord it over the rest of us and make all the money on it, right? That's kind of always the fear on these things. The reality is this technology is democratizing faster than the computer did, faster than the internet did. It's available to everybody right out of the chute.

By the way, it's getting built in everywhere; Apple's building it into the iPhone. Apple Intelligence is going to be a standard feature of the iPhone. And so this technology basically has gone from not present in our society to almost universal in one step. And it may be that people choose to voluntarily give it up.

But in my life, I have not yet seen people who sort of voluntarily renounce something that they get used to. So, yeah, it would be a first if it happened. All right.

I hear that. And you're the right person for me to have this conversation with. I love when dogs bark the loudest because they're on a leash.

So you're going to be my leash. I'm going to paint a scenario knowing that you're going to pull me back from the brink, because I'm fundamentally a techno-optimist and I'm definitely somebody that will embrace this technology as fast as humanly possible. We're deploying it here in my company as rapidly as we can. I will literally, if it's proven safe, get Neuralink, the whole nine.

If you've ever wondered what separates a good digital product from one that people absolutely love, let me tell you right now: it's not just the code. It's innovation and design.

Enter Alty, a powerhouse that's been dominating the tech scene for 14 years. Alty isn't just another IT company.

They're award-winning experts in mobile, web, and desktop app development for fintech, banking, retail, and tourism. Partnering with Visa and Mastercard, and delivering innovative products for CBH, Raiffeisen, RentBerry, GTBank, and many other global companies from diverse sectors, makes Alty a leader in digital transformation. From UI/UX design to complex solution architecture, Alty crafts experiences that keep users coming back. If you're looking to create products that stand out in a crowded market, Alty is your secret weapon.

So if you're ready to level up your digital game, check out Alty right now. Just scan the QR code or click the link below. Don't let your competition get there first. Get after it right now with Alty.

AI has already changed the world. You should be using it everywhere in your business that you can, including within your social media strategy. And if you're not, you're falling behind. Opus Clip is how you're going to catch up fast.

Opus Clip is an AI tool designed to streamline your video content strategy. It turns long-form videos into short, social-media-ready clips automatically. At Impact Theory, we've integrated it into our workflow and it's making a difference. Our social media team uses it to create digestible clips from our shows, saving time, helping us maintain a consistent social presence, and just finding the best parts of an interview.

But you don't have to take my word for it. For the next few weeks, you can try Clip Anything, Opus Clip's flagship product, for free. Efficient video marketing can make or break your business. If you're curious about how AI can enhance your content strategy, click the link in the show notes to start your free trial of Clip Anything so you can see it directly for yourself. So here's what I think plays out.

This is as close as I can get to the realistic mess that I think we'll go through. The long arc of history bends towards justice, but history does not care about any single generation. And I think the thing we will all have to get very politically comfortable with is the fact that, yes, AI is going to displace jobs wildly as we move towards something absolutely wonderful and spectacular. It's going to displace a lot of jobs.

Some of those people will redistribute themselves by acquiring new skills. Other people will not. And it won't be a great time for them.

And their families will rally around them as the material wealth is unlocked, as spending power becomes more abundant, all of that. The younger people that are more intellectually nimble will take advantage of that to care for people. But there's going to be this conflict on the left and the right as to, hey, shouldn't we just give these people UBI or whatever, to take care of the people that are going to struggle? Because they are going to struggle.

And if people don't have a mental defense, if they don't have a narrative that they can understand about how we weather that storm, I think they'll make very bizarre economic choices. As you were talking, you mentioned deflation. And people ought to wonder: given all the technological advances we've had over the last 300 years, how is inflation still going up? This seems crazy.

And the reason that inflation goes up, despite the massive deflation that technology brings, is that the government gobbles it up by printing money. And oh boy, do I have a personal bone to pick. I have no idea what your take is on the economy and how it intersects.

So I'll plant my flag and let you react. I think you need only look at the M2 money supply chart to see it. I mean, it's just absolutely outrageous how much more money has been poured into the system completely artificially, just generated out of thin air. And that is the inflation. When we say inflation, that's what we're talking about: the inflation of the money supply.

In doing that, the government doesn't have to get your vote on something. I refer to it, and I do not want to put words in your mouth, but I refer to it as the government stealing from you.

And then they force you to play the stock market, as one stand-in for investments, in order to beat inflation caused by them printing money and stealing from you. And I think that's deranging. And I think that the government has a moral obligation to give people a non-inflatable currency in which people can at least park their wealth, so that the average person who does not want to play the stock market can just save. Like, a guy that is a janitor, and he's just trying to get by and take care of his family, should be able to sock away money and not have its value eroded over time through very conscious and poor, in my opinion, policies. Curious to get your take on that.
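The erosion the host is describing compounds the same way growth does. A quick back-of-the-envelope, using round illustrative inflation rates rather than any historical figures:

```python
# How inflation erodes the purchasing power of idle savings.
# Rates are illustrative round numbers, not historical claims.

def purchasing_power(principal: float, inflation_rate: float, years: int) -> float:
    """Real value of cash held for `years` under a constant annual inflation rate."""
    return principal / (1 + inflation_rate) ** years

savings = 10_000.0
for rate in (0.02, 0.05, 0.08):
    real = purchasing_power(savings, rate, 20)
    print(f"{rate:.0%} inflation for 20 years: ${savings:,.0f} buys about ${real:,.0f} worth")
```

At 5% a year, for example, cash loses well over half its purchasing power in twenty years, which is the janitor-savings scenario in numbers.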

If I tell you that in any given time you could have more or less technology change, that change would show up in economic statistics the way economists measure it, as what they call productivity growth, which is a thing they measure. It's an actual number.

And so if a society has 1% productivity growth, that's super low. If they have 4% productivity growth per year, that's super high. Let's call that the range.

And if you could ever get to 8% or 10% productivity growth, you'd have cornucopia, technological utopia; it'd be amazing. Everything would get super cheap and abundant super fast. But modern societies run somewhere between 1% and 4%. Would you say that we live in a time today in which productivity growth, and therefore technological change, is running high or low?

I think we are about to unleash a ton of that productivity. But right now, I think that the government is siphoning off so much of that productivity that you get this schism between the young and the old. So the old, I think, are doing very well and the young are getting absolutely clobbered.

And so they don't feel it. But if AI does what we think it's going to do, then yes, I think we will finally be able to unlock a lot of that. But just take the distributional part of it out, because we'll come back to that; just talk about the rate of technology change.

Do we live right now in a time of great technology change or low technology change? The only great technology change is in AI, so low. And then you'll probably get the next answer right, which is, did we have faster technology change between 1930 and 1970 than we do today or slower? Much faster.

Much faster. Yeah. So those are the correct answers. And so the metric on what's happened, and this is actually quite important, is that productivity growth and therefore technological change in the economy was much faster in the decades that preceded the 1970s.

Actually, by the way, the turning point was the year I was born. It was 1971. WTF happened? It was you, Marc. Yeah, so there's a website called WTFhappenedin1971.com, and it's literally hundreds of charts of basically this discontinuous change on all kinds of economic and social markers that kind of kicked in the year I was born. I do believe it is entirely my fault.

I will confess to that. But yeah, one of the things that happened was right around that time, productivity growth downshifted. It was running at like 2%, 3%, 4%, and then it's sort of been 1% to 2% ever since.

And it ebbs and flows a little bit with the economic cycle, but it's been quite low for the last 60 years. It dovetails with the political thing you were saying. There are a lot of questions as to why it's been so low. Economists actually talk about something called the productivity paradox, because it was really weird: the computer emerged in the 1970s. And so all the economists in the 1970s said the computer is going to lead to cornucopia.

It's going to lead to enormous productivity growth. Of course it is. You've got Moore's Law.

You know, it's all this software, and all this inventory, just-in-time manufacturing, and, by the way, robots, right? And so for sure you're going to have a massive takeoff in productivity growth. And actually what happened was productivity growth downshifted. And so all of our expectations for how society works are actually geared towards low productivity growth and low economic growth, from a historical standpoint. The importance of that is really key to the next thing that you said, which is the psychological effect of being in a low-growth environment is zero-sum politics. Right. Logically. Right. Because if we're in a high-growth environment, if technology productivity growth is running at 4 percent or, God willing, someday more.

And if economic growth is running at 4 percent or more, the economy will be doing so well, it will be spewing money in all directions. Everything will be going crazy. Every business will be flush.

Every consumer will feel fantastic. Jobs are being created all over the place. Everybody's kids for sure are going to live better lives than their parents did. It's going to be great.

By the way, the 1990s were that, right? There was this kind of five-year stretch in the 1990s where economic growth really took off. And you probably remember, it was fantastic, right?

Everybody felt awesome. Right. And so this is one of the kind of weird things, this is why I think a lot of the fears around the impact of technology are really misguided when it comes to all these economic and political topics, which is: from a political standpoint, we should hope that we have rapid technology progress, because if we have rapid technology progress, we'll have rapid economic growth. If we have rapid economic growth, we'll have positive-sum politics, right?

In a high-growth environment, for me to be better off, I can just go be better off. I can go exercise my skills and talents and get new jobs and switch jobs and switch careers and do all kinds of things. And I have a path and a future for myself and my children that does not require taking away from other people.

In a low growth environment, all of the economics and all the politics go zero sum because the only way for me to do better is I have to take away from you. Right. Or to your point, the government exactly. I completely agree with you. Or what happens is the government just inflates and they inflate because they want to basically buy votes.

They want to basically spend on programs and they want to buy votes. And so this is sort of what I would say, which is like, if you want zero-sum smash-mouth destructive politics with the government playing a bigger and bigger role, you want a slow pace of technological development. If you want positive-sum politics where people are thrilled and excited about the future and about their own opportunity and they don't have to feel like they have to take away from somebody else and they don't need handouts from the government because they're doing so well, you want rapid productivity growth. Right. And so that...

So what I'm saying is, it's the opposite of the fear that everybody thinks they have. I have many other thoughts on your question, but yeah, let me pause there and see which part you wanted to get to. Oh, inflation.

Yes. Yeah. So inflation.

Yeah. So look, I would just say two things on inflation. It's actually pretty interesting. So there's an overall concept of inflation, which, as you said, is growth of the money supply. But the way that plays out in the economy.

And they actually analyze it this way. It's basically the way they think about it. It's the basket of overall prices of everything in the economy.

And the government agency that calculates the rate of inflation uses a basket of equivalent products over time to try to get a sense of what's actually happening with prices. And so there's both the money supply aspect of inflation and the government printing press and all that, and that's totally true. But what's actually happened inside that, because of differences in technology and regulation, is a really, I think, historically unprecedented difference in how different industries are actually inflating or deflating.
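That fixed-basket calculation can be sketched in a few lines; the goods, prices, and quantities below are invented purely for illustration, not real CPI data:

```python
# Simplified fixed-basket price index (CPI-style), with made-up numbers.
# The index is the cost of the same basket today relative to a base year.
quantities  = {"rent": 1, "food": 4, "tuition": 1, "electronics": 1}
base_prices = {"rent": 1000, "food": 300, "tuition": 500, "electronics": 400}
curr_prices = {"rent": 1400, "food": 330, "tuition": 800, "electronics": 200}

def basket_cost(prices: dict, qty: dict) -> float:
    """Total cost of buying the whole basket at the given prices."""
    return sum(prices[good] * qty[good] for good in qty)

index = 100 * basket_cost(curr_prices, quantities) / basket_cost(base_prices, quantities)
print(f"price index: {index:.1f}")  # base year = 100; here 120.0
```

Notice the headline index hides the divergence underneath: electronics fell by half while rent and tuition surged, which is exactly the point being made.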

And there's a chart that we can maybe post for your listeners that basically shows three really big important sectors of the economy, which are healthcare, education, and housing, where the prices are skyrocketing. And by the way, everybody feels this, right? This is just like, okay, you want to go buy a starter home, or you want to get good healthcare, or you want to get your kid into good school.

The prices are going crazy. I mean, you see this in housing prices, of course. Another version of this is higher education. A four-year college degree at a private university now costs $400,000 and is on its way to a million dollars, right?

Ooh, that's crazy. Completely crazy. Completely crazy. So the price of higher ed is just skyrocketing.

The price of higher education, bachelor's degrees, master's degrees, is rising far faster than the rate of inflation. And the same thing is true in healthcare: costs are rising faster than the rate of inflation, and housing prices are rising faster than the rate of inflation. But then you have all these other sectors: video games, entertainment, consumer electronics, by the way, food, cars, which is good, retail, consumer products generally.

Those prices are crashing. And so consider the things that you can buy today, versus 20, 30, 40 years ago, for the same dollar in those categories. Just to take an obvious example: music. To buy music 30 years ago, you had to go spend $15 to buy a CD and get 10 songs, out of which you maybe wanted two.

Today, $10 buys you Spotify for a month, and you have, you know, 10 million songs on demand, and you can listen 24-7, and it's fantastic, right? And so the price of music has crashed, right? And so the price of housing, education, and healthcare has skyrocketed, while the prices of everything else are crashing. What explains that?

Well, in the sectors where prices are crashing: number one, they have rapid technological change, which is driving down prices through productivity growth, and number two, they're not regulated, right? Nobody in the government is price-fixing music, right? Whereas housing, education, and healthcare are incredibly highly regulated and centrally controlled by the government, right?

And they have fixed supply dictated by the government, and they have very slow rate of technological adoption, right? It's almost impossible to get new technology into the healthcare system, into the education system, or into housing. Robots are not building houses. It's not happening. It's just not happening.

And so what we actually have in the economy is a divergence. I call these the slow sectors versus the fast sectors: the sectors where prices are skyrocketing because of slow technology change and too much government regulation, and the sectors where prices are crashing because of rapid technological advances and a lack of government regulation.

And when you chart these out, you can just extrapolate the lines. And so if the current trends continue, within a decade a four-year college degree is going to cost a million dollars, and a flat-screen TV that covers your entire wall is going to cost $100.
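Those extrapolations are just compound growth. The annual rates below are back-of-envelope assumptions I chose so the arithmetic lands near the figures mentioned; they are not measured data:

```python
# Back-of-envelope projection of the two trend lines (assumed rates).
def project(price: float, annual_rate: float, years: int) -> float:
    """Compound a price forward by `annual_rate` for `years` years."""
    return price * (1 + annual_rate) ** years

college = project(400_000, 0.096, 10)  # ~9.6%/yr turns $400k into ~$1M
tv = project(1_500, -0.24, 10)         # steep consumer-electronics deflation
print(f"four-year degree in 10 years: ${college:,.0f}")
print(f"wall-sized TV in 10 years:    ${tv:,.0f}")
```

Under these assumed rates the degree lands around a million dollars and the TV around a hundred, matching the shape of the claim.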

And at some point, you might want to ask the question: isn't that backwards? Isn't it the opposite of what we all want? This is where I get very emotional about this. It's like, OK, define the American dream.

The American dream. By the way, for that, you could probably substitute the dream in many other countries, but let's just say the American dream. The American dream, I want to buy a house for my family.

I want to be able to send my kids to a great school, and then I want my family to be able to get great health care. Those are the three higher-order bits, and those are the things where our system is wired right now to drive the prices to the moon. And then, good news, iPhones and cars and digital music are plentiful, but they're not healthcare, education, and housing.

And this is the other thing that's driving inflation, right? Because then what happens is the fast sectors of the economy, where prices are crashing, are shrinking as a percentage of the economy, right? Because prices are falling so fast.

And then because prices are growing so fast for healthcare, education, and housing, they're becoming larger and larger parts of the economy. And so the economy writ large and people's pocketbooks and how you spend your money, it's being eaten by these sectors that have slow technology growth. And therefore, rapidly rising prices.
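The budget-share dynamic described here can be sketched with invented numbers: hold real quantities constant and let the two price trends run, and the slow sector eats the budget on its own.

```python
# Sketch with made-up numbers: two sectors, constant real quantities,
# opposite price trends. The slow sector's share of total spending grows.
slow, fast = 100.0, 100.0  # spending starts 50/50
for year in range(30):
    slow *= 1.06  # healthcare/education/housing-like: +6%/yr
    fast *= 0.97  # electronics/media-like: -3%/yr

share = slow / (slow + fast)
print(f"slow-sector share of spending after 30 years: {share:.0%}")
```

From an even split, the slow sector ends up around nine-tenths of spending after 30 years, which is the "being eaten by these sectors" effect.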

By the way, once again, if you want to fix this problem, what's the way to fix this problem? You inject a lot more technology into those three sectors. You would want completely automated AI-driven healthcare. You would want AI education.

Every kid having an AI tutor, an AI teacher. And you would want robots building houses. If you wedged full modern technology into those three sectors, you could crash prices, which would also crash inflation. And it would cause everybody to be far better off. And so once again, it's this thing where you think you don't want the technology to change.

You actually very, very, very much want the technology to change. And if we don't get the technology to change, our politics for the next 30 years are going to be so crazily vicious. Because we're all going to be fighting over this shrinking pie.

And we're just going to hate how we have to live. So, yeah, let me pause there. Do you think the benefits of AI will be so overwhelming that there's just no way for politicians to hide the ball? Or will...

there be enough narrative and story and being able to leverage the resentment that exists right now to continue to forestall that, continue to grow government, keep it strong, keep it big. Yeah. So let me give you a micro answer and a macro answer.

So the micro answer. So, did you see the dock workers' strike that just happened? Yeah. So the dock workers just went on strike and they demanded this huge raise.

They demanded a huge raise. They demanded no more technology at the docks. They have this actually this dichotomy of an argument.

They say: our jobs are so backbreaking and arduous and physically harmful to our workers that we need to be appreciated a lot more. And: we want you to completely ban the introduction of automation that would automate those jobs so that our workers don't have to do them. Right. And they kind of make both sides of this argument at the same time, even though they're completely contradictory.

But that's not their responsibility to resolve it. But the dock workers go on strike. They were literally asking for no more new technology at the docks.

preserve the jobs. Through that, and I had just never looked at that industry before, I discovered that there are 25,000 dock workers in the US. Except that's not right.

There are actually 50,000 dock workers in the US. 25,000 of them actually work on the docks. And then there are 25,000 dock workers who don't work, who just sit at home and collect paychecks, because of prior union agreements banning automation. What?

Yes. Whoa. Yes.

Because in previous bargaining rounds, they cut deals where, if there were introduction of, for example, machine cranes to unload containers from ships, those jobs would not go away. And so those jobs have not gone away.

That's crazy. That is malpractice. Well, so this is the thing.

So this is the thing. Okay, so this is the classic thing on all these things. Is that good or bad?

Well, it depends who you are. There's a political science, there's this concept of concentrated benefits and diffuse harms. And so for those 50,000 dock workers, this is great.

For the rest of us, it just makes everything we buy more expensive. Right. Because it makes working the docks more expensive. Right.

Because it puts all this deadweight loss on shipping, which is a big part of the cost: all the food we buy is more expensive as a consequence of these kinds of arrangements. But, you know, you and I pay another five cents every time we go to the supermarket as a consequence of this, versus the 50,000 people who are organized in a union. Right.

And are able to negotiate on their behalf. Right. So, right: concentrated benefits to the dock workers, diffuse harms to the rest of the economy.
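The asymmetry is easy to see with rough arithmetic. All the numbers below are invented for illustration; they are not from the actual strike negotiations:

```python
# Illustrative arithmetic (all numbers invented): why concentrated
# benefits beat diffuse harms politically.
workers = 50_000          # people receiving the concentrated benefit
gain_per_worker = 50_000  # assumed extra dollars per worker per year
consumers = 330_000_000   # people bearing the diffuse cost

total_cost = workers * gain_per_worker      # $2.5 billion per year
cost_per_consumer = total_cost / consumers  # under $8 per person per year
print(f"each worker gains ${gain_per_worker:,}/yr; "
      f"each consumer pays about ${cost_per_consumer:.2f}/yr")
```

Each worker has billions of reasons to organize; each consumer has a few dollars of reason to object, so nobody does.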

And every time you get a special interest group in the economy pleading for this kind of employment protection, that's what's happening, right? They're basically trying to create an employment cartel that benefits the people in the cartel at the expense of everybody else. So here's the macro version of that: 30% of the jobs in the United States today require some form of occupational licensing. You can't just get the job.

You have to have some form of certification that you're qualified for the job. This has been pushed to extraordinary lengths. In California, you need, I think it's now something like 900-plus hours of professional training to be a hairdresser.

Right? Yes. Correct.

What? Yes. You cannot just like start cutting people's hair for money. No, no, no, no, no, no, no. That's illegal.

You need to have a, whatever, cosmetology certificate. And to get the cosmetology certificate, you have to go to hairdressing school.

To do that, by the way, you have to get admitted to the hairdressing school. It has to be a certified hairdressing school. By the way, guess who controls how many hairdressing schools there can be?

It's the current ones. Oh, this is my favorite part. Let me give you my favorite example of this. So, the university system.

So federal student loans: there are federal student loans for you to go to college. Normal people basically can't afford to go to college without federal student loans. So you can't be a college or university in the U.S. without having access to the federal student loan program.

It's not possible. But to be a college or university that is able to give out federal student loans, you have to be accredited. Guess who accredits colleges and universities?

The existing colleges and universities. Yeah, saw that one coming. Guess how many new colleges and universities they're accrediting?

Yeah. Zero. Right. And so.

30% of jobs in the country right now require some form of license or accreditation. By the way, this is all doctors. And by the way, I think that's good.

You probably want doctors to be accredited. But it's also nurses, nurse practitioners. And then it's not just lawyers. It's also paralegals.

And then it's all general contractors. And on and on and on, depending on which state you're in, including hairdressers and many other jobs where you would not think this is required. By the way, another version of this is teachers. You know, to be a teacher in a lot of places in the U.S. now, you need an education degree, right? Is there any evidence that teachers with an education degree are better teachers than teachers without an education degree? I don't think so. By the way, the education schools are completely bananas crazy. You know, they're the most crazy of the academic departments at these crazy universities, right? But again, it's a cartel structure. Of course, K-12 education is not just a cartel, it's a government monopoly, right? So you have to get actually hired into the, well, actually, this is the other great part. Higher ed is like this. K through 12 is like this.

And there are other branches of the federal workforce and state workforce that are like this. And actually the police, the police are like this. You have quite a few people in the economy today who are government employees.

They have civil service protections because they're government employees, which means in practice they can't be fired. But they're also members of what are called public sector unions. Right.

So they both have to get hired by the government, with whatever criteria they set, and have to get admitted into the public sector union, and they have the employment protections of both, right, of both the civil service and the public sector unions. Good lord. And this is why, by the way, bad teachers can't get fired, right, because you hit all these things.

That winds me up. So the point of that is AI cannot change that quickly in this system. AI cannot become a lawyer. It's not legally allowed to. It can't become a doctor.

It can't replace the dock worker. It can't cut your hair. It can't build your house.

It's not legally allowed to, right? And so, and it goes actually to the Gulliver thing, a very large percentage of the economy as we experience it literally cannot be automated. It's illegal to do so. And so I actually... That's so ridiculous. Yeah, and so I actually think what's going to happen is the economic impact of AI is actually going to be very muted compared to what people are hoping or fearing, because it's literally not legal to do that.

It's crazy. So if everything that you just walked us through is true in terms of when you have high growth, everybody's feeling good. More technology equals more growth.

AI is poised to bring that growth, but you have this trepidation. And so people, well, it's not just that, but you have trepidation around it. And the government tends towards justifying its existence: create a new regulatory body, slow things down, everything just grinds to a halt. For people that...

Don't know the story of Gulliver's Travels. You have this guy that encounters these tiny Lilliputians. And despite him being, you know, whatever, a thousand times bigger than they are, they just end up tying him down with all these tiny little strings. And it's an analogy that Elon certainly has used a lot.

What do you think about his idea of going in and creating an efficiency program inside the government to try to free up some of these strings so that the economy can get going again? Yeah, that's right. So I'll give you a couple of books if people want to read about this.

So one is, the Supreme Court Justice Neil Gorsuch just wrote a book. I forget the exact name; it's something like Overruled or Overlawed. But he basically lays out the data on the number of laws in the country.

And by the way, this is another one of these WTF happened in 1971 things, which is starting in the 1970s. The number of laws and regulations in the U.S. just took off like a rocket. Basically what happens is the lawyers took over everything. By the way, a big part of that is in politics, basically almost everybody now who's in elected office is a lawyer. So basically the lawyers just kind of swept in and took control of everything.

Also, Senator Mike Lee has also done a lot of work on this. You can just count the number of laws and then you can also count the number of regulations, which if anything is even worse because they're not even laws. They're just like a bureaucrat who's decided something. The number of regulations has just skyrocketed. So he goes through it in the book and then there's another book.

called Three Felonies a Day. And it goes through in detail that technically odds are you, I, and every other American citizen are committing at least three felonies every day. There are so many-And we just don't know it?

We don't know it. We don't know it. And the reason is because there are so many felonies on the books, and the felony laws are so sweeping in what they cover.

Now, you know, most of those never get detected or prosecuted, but if prosecutors want to come at you, they can figure out ways to do it. This is what people with lots of experience in the legal system always tell you. Like, if the feds want to get you, they're going to figure out a way to do it, because you're almost certainly tripping something.

And so, yeah, so I completely agree with Elon on the nature of the problem. Like, it's just, yeah. And, again, this is sort of this weird, it's this, like, concentrated benefit, diffuse harm thing, which is, like, each law or regulation in isolation seems like a good idea. And each law or regulation has somebody advocating for it because they're going to benefit from it.

And typically there's some level of self-interest, you know, somebody is trying to get something for themselves, and then they sort of have a cover story of, like, consumer benefit or something. And then they get these things passed, right? They operate in Washington, they're in the state house, and they get these things passed. And, you know, each one of them on its own is not a big deal, but you run that process at scale over 60 years.

And that's when you end up with the Gulliver scenario, which is you're just drowning in laws and regulations. And again, to tie back to what I said before: that's why the prices of healthcare, education, and housing have skyrocketed, because that's where the laws and regulations in the economy are concentrated. All right.

So let's talk about then the next four years. So if Elon were to find himself in that position, do you think that we could meaningfully strip away red tape to the point that that scenario you painted where those three things we care about so much, where the prices begin to crash, or is that just unrealistic? Full stop? Is it unrealistic in four years? How much can we do?

So it could be done, for sure. There is actually a case of it happening in the world right now which, sitting here today, looks very good, which is Argentina. And so Javier Milei, who's the new president of Argentina, has passed, I don't know the exact details, but I think his first big reform package, which was a real fight for him to pass, fundamentally took regulations out of, I think, 800 different sectors of the Argentinian economy in one package. And I think they have a follow-up package they're working on that's like another 2,000 or something. So he's trying to do exactly what you just described. He's just basically, you know, a staunch libertarian, anti-socialist, anti-communist. He has this great line which he used the other day, which I love so much. So Margaret Thatcher had the famous line about socialism, which is, she said, you know, the thing about spending other people's money is eventually you run out. Milei has a better version, which is, he says, anybody can be a prostitute with other people's asses.

That guy is a gangster. He's hilarious. Which is freaking amazing.

Anyway, so, um, yeah, no, so he's trying to strip as much regulation out as possible. And the, and the thesis of it is precisely this. It's like, okay, you strip out regulation, you remove government control, you liberate the people, you liberate the people to be able to exchange, you know, go into voluntary trade and exchange to be able to actually conduct business with each other without the government interfering with it all the time.

And then as a consequence, you get far higher rates of economic growth, far higher rates of prosperity. And so this is a big experiment. And of course, Argentina has been a case study for 100 years of doing this the wrong way. And he's now administering a form of shock therapy to basically see if he can do it the right way. And by the way, sitting here today, in very short order: Argentina has had a persistent inflation problem for a very long time.

He's completely nuked inflation. And economic growth has kicked in and job growth has kicked in. And now, he has enemies, he is fighting like crazy, both in the political system and against riots in the streets, from people who are trying to stop this.

That goes to our situation, which is, yes, the theory is totally sound. Everything that Elon is describing should absolutely happen. This should absolutely be done.

By the way, I think basically everybody knows this should be done. Again, concentrated benefits, diffuse harms: even people who benefit from some aspect of this are suffering from it in every other area of their lives. Right. And so this is what Milei always points out: the system in aggregate is making everybody poor.

Like it is leading to all these like bad, as you said, it's leading, for example, to intergenerational conflict that's just like unnecessary and very destructive. And so it's just like, let's just stop this form of self-harm. But to do that, the reason I say this, every single regulation has somebody behind it who doesn't want it to go away.

Right, because it benefits somebody, right? It benefits, you know, the dock workers who are sitting at home, right? It benefits somebody. It's all the little cartels, monopolies, and oligopolies and little conspiracies in the economy. You know, they are in business because they're protected by the government, and when you strip these regulations away, you expose them to competition, and they really don't like that. And so there will be a backlash from the system, from all of the, you know, special interest groups, who in aggregate will rebel in great numbers. And then, you know, look, the key fight ultimately is the civil service itself, you know, the actual government employees. Right.

And so, you know, for example, you know, how about a reform where like there's actual performance metrics for government employees and low performers get fired? Brother, please, if you want to get me in a cult, start a cult about that. I'm here for that.

I'll do what we need. I'll wear whatever crazy outfit. I am here for that one.

Yeah. Yeah, exactly. Let me ask, going back to Milei.

Are the layoffs causing any sort of economic downturn? Because one criticism I've heard of Elon is, hey, if you come in and you do this and you slash it, not only is it cruel, but you're going to tank the economy. You're going to have so many people without a job. Yeah, yeah. So this happens.

By the way, this happened actually in the late 70s, early 80s. There was actually a version of this, which is inflation in the US actually got completely out of control. And everything was kind of going sideways. But inflation went crazy. I think inflation spiked at like 15%.

15%. Which was super destructive, right? Like really ruinously bad. Like it destroys everything.

It destroys savings. It destroys ability for businesses to plan. It just, it basically damages everything. And the way you crack the back of inflation is you raise interest rates. And you deliberately cool the economy in order to bring down the demand for money.

And then inflation falls. And so Paul Volcker, who was the chairman of the Federal Reserve, is a famous guy. He's like the six-foot-eight giant guy with the cigar.

And he was the head of the Federal Reserve, and he lived in the undergraduate dorms at, I think, Georgetown, and took the taxi to work. So he was in contact with regular people every day, even though he was the head of the Federal Reserve in his three-piece suit. And whenever he testified to Congress, if you see the old photos, there are just these dank clouds of cigar smoke around him all the time. So, one of these old-school figures. And he raised interest rates in 1981, I think to 20%. Whoa. Which basically crushed the economy. It basically crushed demand in the economy. It meant that nobody could borrow money, nobody could buy a house, nobody could start a business.

It was very devastating in that moment. But he wrote a book about this, and he said that when he would walk down the street and people would recognize him, this is in D.C., he'd be walking down the street or he'd be in the cab, nobody was ever mad at him. Because what they said was, inflation is so bad.

We know that inflation is bad. We know that you have to do what you're doing with interest rates to fix it. We know if you do it, you're going to fix the inflation problem and things are going to go back to being good again. And so we support you; stick with it. And so he had the people on his side, and Milei has the same thing in Argentina right now.

He has a very high level of support from the population because they've seen the other experiment for too long. They've been through a society with too much regulation, too much corruption, and too much inflation for a long time. And the people are behind him. You see it in the polls and you see it in the voting.

They're just like, all right, we're going to try it. And so what you need is a politics of plan B. You need a majority of the population to basically say, you know, look, whatever the pros and cons of the old system were, they're not working. And we need fundamental change.

And then obviously you need leadership that's going to be willing to implement that. But if the people are behind it, you know, then you can actually do that. And so the fact that it worked under Volcker and the fact that it's working under Milei is very promising.

Like those are two great examples of how it can work. You know, we don't yet have that. but we could. Very, very interesting. When I start thinking about how we build back, we get the economy going, we take off the Gulliver strings.

One of the things I think we need to see is a return to prizing freedom of speech. Because if we can't debate these ideas, if people can't get in there and mix it up and say, I think this is the way. No, that's terrible. We should be doing it this way. With nothing being verboten, actually being able to discuss these ideas, that feels like a critical need.

What's your take, especially coming off the heels of talking so much about AI? What's your take on censorship? Where are we culturally? And what's AI's role going to be in either breaking us free from censorship or using that to really tighten down? Yep.

Yep. So I should start with: I am classic Gen X. I am 100% pro free speech. By the way, you may know this, the First Amendment guarantees that the government, at least in theory, is not supposed to censor us, although that's been happening a bit lately. Just a smidge. Just a smidge. But there's also the case law around the First Amendment that actually defines illegal speech. And there are a bunch of forms of illegal speech, things like child porn, incitement to violence, terrorist recruitment.

And so there's actually carve-outs for that stuff. And so my philosophy is basically U.S. law is actually very good on this. And U.S. law isn't just U.S. law. This has been litigated culturally in the U.S. as well as legally for 250 years.

Going back to the Bill of Rights, we and our predecessors in the U.S. went through a long process to get to where the First Amendment is. I think it therefore represents more than just a law. I think it's also a statement of culture and a statement of values. So true.

Right? And I've always been an advocate that the code for Internet freedom of speech should basically be that. It should be the First Amendment with only limited carve-outs for things that are truly dangerous, truly destructive. I don't want terrorist recruitment any more than anybody else.

But, like, you know, should people be able to talk about their politics online without getting censored? 100%.

Right. Full range of expression. 100%. Of course.

Like it's the American way. Of course. And so I'm 100% on that. You know probably as much as I do about the last decade, which I've seen up close, which is, generally, things went very bad. The internet companies ran into a variety of externally imposed and self-inflicted situations where there ended up being a pervasive censorship machine

for a long time. The most dramatic change of that is Twitter before and after Elon buying it. And by the way, we're a proud member of the syndicate that bought it with Elon. And so I'm completely thrilled by what he's done there.

Thank you for your service, by the way. To me, it's just so much better. I just can't believe that that was controversial.

It's crazy. Yep. And as you know, it was a big change.

Like it was an absolutely dramatic change. We're also, by the way, the main outside investor in Substack, which I think has also done a spectacular job navigating through this and has basically come out the other side. And they're a small company, so when the pressure gets brought to bear on a small company, it can really have an impact. But the team there has, I think, done a fantastic job navigating to a real freedom of speech position. And as a consequence, Substack now has the full range of views on all kinds of topics in a really good way.

So the good news is we have two case studies where this has gone really well. The other ones are more difficult. I think we're going to have to wait and see. Here's what I would say is I think the internet social media censorship wars were the preamble to the AI censorship wars. I think the AI censorship wars are going to be a thousand times more intense and a thousand times more important.

Yes, 100%. And the reason for that is, you know, internet social media is important because it's what we all say to each other. But AI is going to be, I think, the software layer that controls everything.

It's going to be the software layer that basically tells us everything. It's going to be the software layer that teaches our kids. It's going to be the software layer that we talk to every day. And, you know, as I think you know, there's already AI censorship.

Like, you know, a lot of these LLMs are very slanted. And it's very easy to see, by the way, because you can go on them today and you just ask them two questions about two opposing political candidates, and they give you completely different responses.

One candidate, they're like, I'd be happy to tell you all about his positions. And the other candidate, they're like, oh, he's a hate figure. I won't talk about him. And it's like, wait a minute.

Right? Like half the country is voting for one, half the country is voting for the other. Yeah.

Who are you as an AI company to basically censor like that? And so, look, the AI censorship conflict is already underway. The information war around AI is already underway. By the way, the same people who were pushing so hard for social media censorship have now shifted their focus to AI censorship.

By the way, a lot of the actual censors themselves who used to work at companies like Twitter now work for the AI companies. And so there's been like a direct, you know, just lessons learned and now applying it at a larger scale. And so I think that, yeah, no, look, I think it's going to be a giant fight.

I think it's just starting. And I think it's maybe the most important political fight of the next 30 years.

Tell me why. In business, flying blind is a recipe for absolute catastrophic disaster. Yet most companies are doing exactly that, making decisions based on intuition or outdated information.

If that sounds familiar, let me introduce you to NetSuite by Oracle. It's like giving your company x-ray vision, letting you see every aspect of your business in real time. It puts your accounting, finances, inventory, and HR into one seamless, beautiful system.

No more juggling multiple tools or guessing at the big picture. This means you make decisions based on facts, not hunches. Success in business isn't about predicting the future. It's about being ready for whatever comes.

NetSuite gives you that readiness. Download the CFO's Guide to AI and Machine Learning at netsuite.com slash theory. The guide is free to you at netsuite.com slash theory.

Again, that's netsuite.com slash theory. Well, because everything is downstream. Everything is downstream from being able to discuss and argue and be able to communicate. And so if you cannot have open discussions about important topics, you can't get good answers. Let me give you an angle on this.

I am pretty sure we will agree about this. The thing about AI censorship that scares me isn't just that person is a bad person and so I'm not going to tell you about them. It is that you can control the entire world through framing.

Just how you frame something, and everything has a frame. And when you have humans with a desire to convert or indoctrinate rather than seek truth, then the only thing I can guarantee is, OK, the AI is responding to me from within a frame. They are using that to

nudge my thinking in a direction and it becomes a form of mind control. And if you've ever seen, dear listener, if you've ever seen an incredible debater, I promise you what you love about them is they can reject the frame and then put their own frame on it. And now they're arguing from a position of power. Most people can't do it. Most people don't even realize somebody just put them in a frame and they don't realize how constraining that frame is.

And that's what really freaks me out, because everything else felt more like it was out in the open. Like even when it was still Twitter and Twitter was being censored like crazy, everybody was like, bro, this is so obvious. Like, look, you post about this, poof, gone.

I post about this, it's gonna explode. So when the Twitter files came out, I don't think anybody was like, wait, what? Everyone was like, yeah, that's exactly how it felt.

This will be a game of frames, and it really does come down to: it's hard for humans to determine what is true. We were talking earlier about why technology stalled out. The reason technology stalled out, in my humble opinion, is physics broke somewhere around, call it, 50, 60 years ago. It just got hung up and we haven't been decoding the real world.

That's truth. Now, once you're able to make contact with that ground-level truth, new things are open to you. And so that's my big concern with AI: that we will not be getting informed by what is making contact with ground truth. We're going to be having the frame set, and we're going to be taught, as kids, as adults, as everybody, based on the frame that matches somebody's ideology.

And that scares the life out of me. It should. I agree with that.

Of all the radical things that Elon is doing, maybe the most radical is that he's declared that his goal, and as I've said, we're investors in it with him, his goal for xAI is what he calls maximally truth-seeking. And if you've listened to him on this, what you know is he actually means two different things by that. I mean, they're the same thing ultimately, but two different angles. One is maximum truth-seeking in terms of actually understanding the universe. And so to your point, actually learning more about physics.

But he also means maximally truth-seeking in terms of social and political affairs. And so being able to actually speak openly about having the AI actually be fair and truth-seeking when it comes to politics. And of course, that's, you know, that's...

That is possibly the most radical thing anybody could do is build a legitimately truth-seeking AI. And at least he's declared the determination to do that. So yeah, there's a version of the world where he succeeds and that becomes the new benchmark. And by the way, open source AI plays a big role here because people can field open source AI to do this without permission. And so there's a version of the world where AI becomes an ally in trying to understand ground truth and trying to enable all the actual...

discussions and debates that need to happen. And then there's a version of the world in which, yeah, it's Orwellian thought control. My line on it is: the novel 1984 was not written to be an instruction manual, right?

That was not the goal, right? It was supposed to be a dystopian future that we were trying to avoid. And so the idea that the machines are telling us what to think and that they're slanted and biased by the people who build them, yeah, I find to be completely unacceptable. But there is a, I mean, look, we have that today. Most of the AIs in the world today are like that.

And there is a very big danger. And by the way, again, the people who are most upset about internet freedom of speech justifiably aim a lot of criticism at the companies. And I think that is valid in many cases. But I would also tell you these companies are under intense pressure. And there are tons of activists that are very powerful basically bearing down on these companies all the time.

But then also the government directly. And one of the things that has really kicked in in the last, you know, 10 years is governments both here and in Europe and other places, you know, basically seeking to censor and control, even in ways that I think are just like obviously illegal by their own laws. And, you know, that pressure remains very strong. And I think, if anything, that pressure probably is going to intensify. And so this for me is in the category of, yes, these are the right concerns.

And then ultimately, this is a lowercase d democratic question, which is, do people care? And are people going to be willing to stand up for this? And I think that's what's required. Why do so many people in society want censorship right now?

Well, they want censorship if it's on their side, right? So my version of this is: I told you I grew up, I wasn't really part of it, but I grew up in the middle of the sort of great evangelical awakening in the 70s and 80s.

And at that time, the sort of Christian conservatives in the U.S. were the forces for censorship, right? And so the classic thing was it would be like religious groups that would try to censor movies or books. And then it was the coastal liberals who would be arguing in favor of free speech, right?

And so it would be famously like the press, the Pentagon Papers; they had all these stories about how great free speech was, and libraries were sacrosanct havens of free speech, and they weren't going to censor things. So the censorship pressure was coming from the right in that era. And my analysis of that is that's because at that time, the right was culturally ascendant.

American society was much more overtly religious at that time, and the Christian conservatives were very, very powerful from a cultural standpoint. They got to write the textbooks and all these things. And so because they were winning culturally, they wanted to lock down speech so that they would continue to win. And the left was the counterculture, right? Classically, the left, the hippies, you know, the 60s, 70s, 80s, the left was the counterculture, right?

And the press and so forth was the counterculture. And they wanted to challenge the dominant frame, right? And they wanted to disrupt the system, right?

And so they were pro-free speech. And then, you know, 30 years later, it's inverted, where, you know, the left owns the universities. They own the book publishers. They own the media.

They own the press. They own the newspapers. They own most of the TV stations.

You know, they own the internet companies. They own these commanding heights of society and culture.

And so now that they've won and now that they're in charge, they want to lock down discourse. And then it's inverted: the right has now become the counterculture. And so the censorship pressure comes from the left, and the right wants to open things back up.

If the right becomes culturally ascendant again, I would expect that polarity to shift. Once again, it'll flip. Whoever's in charge will not want free speech, and whoever's the rebel will want free speech.

The principled position is I want free speech regardless. It's just very few people sign up for the principle because most people are part of the tribe. But yeah, as I say, I'm an old-fashioned Gen X libertarian. I actually believe in the principle.

Yeah, no, me too. For me, free speech is important because part of thinking is speaking out loud, having your ideas challenged. Also, facts have a half-life. All the things that we believe, man, a lot of them, call it 30, 40 years down the road.

We don't believe them anymore. We've realized we had an approximation of the truth, but not the real truth. The example I always use on people is Newtonian physics versus relativity.

It's like, hey, when we had Newtonian physics, we thought everything worked. We thought we understood it. And then we get to relativity and, oops, actually you couldn't have had GPS with Newtonian physics. So this was an update that was absolutely necessary.

And as we mentioned earlier, we still aren't at ground truth. So we know that we're going to be revising that even further. And if you really internalize every time we get closer to ground truth, it unlocks things for us.

Then it's like, OK, I just want my ideas to be challenged. And so, because I teach young entrepreneurs a lot, I'm like, look, you've got to recognize that skills have utility.

And so the reason you want your idea challenged is you can actually develop a better skill once you realize, oh, I was wrong about X, Y, Z thing. I can now be right, and that actually has utility in the real world; it lets me do something I couldn't do previously. And so when you lock that down, now all of a sudden people get stuck. You get stuck because you're not able to have the best arguments thrown at your own idea.

That one is pretty traumatic to me. Now, speaking from a position of utility, Elon is somebody that has really demonstrated an obscene ability to get things done. You have bet on a lot of entrepreneurs in your career. You've obviously been very good at picking the best of the best. What is it that Elon does either in worldview or action that makes him so effective?

Yeah, in my mind, this is the single biggest question. I'm really glad you asked it because it's the single biggest question in the world right now. It's the single biggest question in my world, which is like, okay, how is it that he does what he does?

And I would say there are people who have worked with him a lot longer who probably understand this better. But I've had an up-close look at it for the last several years now, and I've come to really respect it and, I think, understand at least parts of it. Look, a lot of it comes down to that famous text exchange with Parag, who was running Twitter at the time when Elon first tangled with it. Parag's actually a friend of mine, a wonderful guy, and he had literally just become CEO like a month earlier or something, so he was just putting his plans in place when the hurricane hit. But there was an exchange where Parag is talking about whatever, and it's the famous text exchange where Elon's like, all right, fuck it, I'm not having this conversation anymore.

He said, you know, what have you gotten done this week? Right. And what I realized when I read that was like that, that is the Elon method.

Like the Elon method, boiled all the way down, is: what have you gotten done this week? Right. And that's very important, because anybody who has ever been in a large company trying to do anything big knows the big things happen over the course of months, years, decades.

Things don't happen in weeks. Like, companies have five-year plans, right? Cars take like seven years to design. Rockets take like a decade. Fighter jets take like 25 years. Big software systems take five, ten years. For any large-scale effort anywhere in the economy, we've just all gotten used to this idea that things take years and years and years, and then you've got processes and procedures and plans and documentation and rules and structure and strategies and frameworks and PowerPoint presentations coming out of your ears. Amazon's big breakthrough was to go from having PowerPoint presentations to having 15-page written documents that everybody reads at the start of a meeting, which actually is an improvement on a PowerPoint presentation, but there was that. Elon's like, no, I'm not doing any of that.

I'm not doing any of that. We're not doing any of that. Basically, we're going to staff these companies almost entirely with engineers.

I, myself, Elon, am an engineer. I am going to understand every aspect of every technical system that we're working on. I am going to be able to be in all the meetings on everything from rocket design to database design at Twitter and everything else.

I'm going to only talk to the engineers; if I can possibly avoid it, I'm never going to talk to anybody who's not an engineer. I'm going to go talk to the person who's directly relevant to the project.

I'm not going through layers. I'm going all the way down to the company to just talk to the person who's in charge of this thing. And then basically what he does is he goes to each of his companies each week.

He identifies whatever is the bottleneck at that company this week, and then he works with the engineers and he fixes it that week. And so what happens is his companies move so much faster than everybody else's. Like, it's just like, it's like tortoise and rabbit. Like, they just move so much faster.

They're so much leaner. They don't have all these layers. They don't have all these, like, systems and controls and processes and all this stuff.

But what they have is, like, many of the best engineers in the world, who just absolutely love working with a CEO who understands the substance of what the product is and then is willing to actually work with them hands-on. I mean, I've been in meetings with him at X where he's in there with 24-year-old engineers, and they'll just walk through fire for him, right? Because he's like their idol, and he's able to have a pure conversation with them, and he cares about the work that they're doing.

And if they succeed at it, he is going to love them for it. And if they fail at it, he's going to be very disappointed in them. And it's just a completely different relationship than the CEO of one of these big tech companies has. It's just completely different. I went to see him one night after he took over X, and I was sitting in the conference room. Okay, so it's like 10 o'clock at night; it's a classic illustration. He's like, yeah, meet me at Twitter at 10 o'clock. And I'm like, fine. So I drive up and I go in, and I go to the conference room, and it's Elon on his iPhone doing email, and there's a dog on the floor. And I'm like, oh, you know, is that your dog?

And he looks at me completely deadpan. He's like, I've never seen that dog before in my life. I'm like, what is it?

Just, like, the company dog? And he bursts out laughing, because of course it's his dog. And then he's like, all right, I want to talk, but I need 15 minutes. He's like, by the way, you can sit and hang out if you want; I just have to take a call. And he gets on Zoom with the rocket engineers for the Falcon rocket, the next-generation rocket.

in Texas, and it's whatever, I don't know, midnight their time. And it's just him on his iPhone on a Zoom call designing the next rocket, which is probably the rocket that we just saw work, right? And he's completely conversant in that.

And he and the engineers fix whatever the problem is that week with the rocket. And he's like, all right, now we're going to go fix the database here at Twitter. And so it's just rinse and repeat, rinse and repeat, every single week. I once offered him a place to stay, thinking he might want it.

I was like, I know you're under a lot of pressure. Go to this place for a week if you want. Because he famously doesn't own any houses; he sold all his houses.

So he stays at friends' houses. So I was like, if you need a vacation, you can go use my house for a week.

I got back, five minutes later, one line: I don't take vacations. I'm going to frame and bronze that email, right? And so this is what he does. And he just does this at an incredibly high rate of speed.

He doesn't tolerate anything that stands in the way of it. And by the way, this is the same thing that drives everybody crazy, right? This is his whole thing with the big fight with regulators over Starship launches. A normal rocket company would take whatever, a decade or 20 years, to design a new rocket. He's going to put out the prototype as fast as he can.

He's going to launch it and see what happens. It's going to explode in midair. My nine-year-old and I love watching the SpaceX rocket explosion compilation videos on YouTube. They're hysterical, because they just show these larger and larger and larger rockets launching and exploding in midair. And all the way as SpaceX was on its way up, his competitors were like, he's crazy.

He can't make rockets work. See, they're all exploding. And what he was doing was he was iterating on the rocket design so much faster than they were.

And so he would run through five rocket generations, of which four would fail, but he would learn so much that the fifth one would work. And he would go through those five generations faster than his rocket competitors could do one generation. And he's just like, fuck it, I don't care.

Of course some rockets are going to explode. Nobody's going to get hurt. It's totally fine. But a big company can't tolerate that because it's like headline news and everybody's going to get mad.

And so anyway, it's just completely base-level reality. He calls it first principles. You get straight to base-level reality.

You get straight to substance. You spend no time on anything other than substance. And so anyway, if you're an engineer like me, I'm an engineer by training, and you see this, you're like, oh my God, this is obviously the way that everything should be run.

But if you see it from the outside, it just looks so wild compared to all of these other large systems and rules that we've all gotten used to. And therein lies the conflict. So there are a lot of engineers in the world, and none of them are having the kind of success that Elon is having. How much

credit do you give to the bundle of traits that he must have? You've already talked about several of them, just getting to first-principles thinking, moving very quickly. But there's also something that seems, I don't know, I've never met him, but from things I have read, one of the early biographies, there's just a level of: this is not emotional for me at all.

This was in one of the original biographies on him. An assistant asked for higher pay or something. He's like, take a vacation for four weeks.

I'm going to do your job and see how hard it is. If it's hard, cool. I'll give you a raise. And if it's not, you're gone. And she'd been with him for like 15 years or something crazy.

And she comes back and he's like, yeah, it wasn't that hard. Bye. And people were gobsmacked by that.

And I was like, yeah, I get it. I get it. How much is there something to that? Like, you know, if that were your friend and your friend treated you like that, it would not feel good. But what proportion of his success is his ability to just completely divorce emotion and say, this is either right or wrong for the project?

Yeah. Look, I think there's a lot to that. You know, by the way, I think Steve Jobs had a lot of that.

There's a lot of ways to look at it, and people can have lots of views on this, of course. But I would say the dichotomy is substance versus style, or substance versus social, or substance versus protocol. It's so easy to slide into a way of thinking and being in which you are thinking abstractly about things. You are following rules that were established years ago.

Most big companies are. So companies start as startups, and then generally what happens is they either fail or they succeed. If they fail, they go away. If they succeed, they succeed by going through basically scandal after scandal, crisis after crisis after crisis.

I always describe it as like a process of like falling upstairs. You're just like constantly falling and smashing your face into the stairs, but like you're gaining altitude as you go. And it's just like these companies are just constant internal crisis. And the sort of normal response, and by the way, it's the thing that everybody in business is trained to do. It's what they train you to do at Harvard Business School and Stanford Business School and all the books and all the stuff, all the CEO coaches.

It's like, oh, you go through a crisis, you fix the crisis, and then you put in place a set of rules to make sure that crisis never happens again, right? It's like the legal thing we were talking about.

Okay, that by itself would be fine, but you do that 20 times over 20 years and you have buried the company in bureaucracy, right? At that point, it's a company that primarily exists to follow rules. By the way, rules that were in many cases defined by people who aren't even at the company anymore. And so nobody at the company today actually even understands why they're there.

Toby Lutke has a version of this, the guy who runs Shopify, who's an amazing CEO. He has a version of this which is, like, every year or every six months, he requires all standing meetings to be canceled. Taken off people's calendars. And so all management reviews, one-on-ones, planning meetings, everything just gets taken off. And then he says, we only put the meetings back on where people are howling in pain because we don't have them, right? And his point is you have to do that over and over and over again, because if you don't, everybody's calendar just accretes meetings, and then everybody's sitting in meetings all day long and nobody's doing anything, right? And of course, anybody listening to this who works at a big company knows exactly what I'm talking about, because that's the day-to-day life, which is, oh my God.

I worked on the other side; I've seen the other side of this. My first real professional experience was as an intern at IBM in 1989 and 1990, when they were on top of the world. As late as 1985, IBM was 80% of the market capitalization of the entire tech industry.

They were a giant. They were like FAANG combined into one company. They were totally dominant. I was there in 1989, '90, right before they basically fell off a cliff and caved in. And so it had been 70 years of success.

They had never had a layoff. By the way, lifetime employment. There were entire buildings full of people there who did not have actual jobs because you couldn't actually fire people. Oh, my God.

Oh, let me tell you the story. So I got to IBM and my manager's kind of showing me around. And I'm in this giant division in Austin building what were, at the time, called workstations, these supercomputers, basically.

And he's like, look, here's how it works. He's like, we're development, and we have the development building, and we have like 6,000 people doing development of the product. And then they have what they call marketing, but everybody else calls sales, which is the people who go sell the product. And then he's like, and that building over there is the planning department.

And I was like, oh, I get it. You know, in development, we come up with ideas, and then we work with the planning department to make the plans to do it. And he's like, no, we never talk to them. We will never visit that building, because that's the department we assign people to when we can't fire them. Right.

And so, right. By the way, this is how, of course, public school systems work. The New York public school system famously has what I think they call the rubber room, which is the place they send the teachers who are so terrible they can't put them in a classroom, but they can't fire them. And so they just have them sit and do crossword puzzles all day.

It's the longshoremen who are sitting at home. So anyway, big companies develop their own version of this, and it just accretes. By the time I got to IBM, two things. Number one, there was an app that showed the number of reporting layers, the number of manager layers, up to the CEO.

So if I stayed at IBM and wanted to become the CEO, how many layers would I have to climb? And I was 12 layers below the CEO, right? Which meant that my boss's boss's boss's boss's boss, who was like the big cheese, was still six layers down. Wow.

So there was that. But the other part of it was they had a formal process of decision making they called concurrence. And concurrence was if you're going to make a decision at IBM in those days, you had to make a formal list of every person in the company who was going to be affected by the decision, like every manager, every function.

And for any sort of product-related decision, that was like 35 names on the checklist. And it was like the sales heads of all the different regions and all this stuff. And to be able to get a yes on the decision, you had to get concurrence from every single person on that list. Any one person on that list could say no, and the term for that was a "deconcur."

"I deconcur" was the internal term, and a deconcur meant veto. And so you needed 35 people to agree, and any one person could veto a decision. Right. And so decision-making just simply stopped.

And this is why the company fell apart: they couldn't adapt because they couldn't make decisions, right? They literally couldn't act. And they had 440,000 employees, right? Oh my God.

So on a time-adjusted basis, for the growth of the market, it was like the equivalent today of a million or 1.2 million employees, something like that. So it's like a nation-state. This is the other thing about IBM in those days. In those days, you could work there for years and never meet anybody, either at work or in your social life, who didn't work for IBM.

Because it was so big. And so all of your friends, everybody, worked at the same company. The thing I always look at when I visit big companies is the signs, the signs in the parking area and the signs in the buildings.

Because everybody who works at the company knows where everything is, and so they don't rely on signs. And so when you go to a big company and there's no signs for what the buildings do, it's a sure sign that they're losing touch with the market because it means they don't get visitors. Interesting.

Because they're completely insular. So anyway, so this is the natural trajectory for all these companies just to end up in this state. And that's the polar opposite of the Elon method. That's the barbell. Now, to your point, your example on the assistant.

The big question is, why aren't there more Elons, and how do you make more Elons? The second question is, can you have a partial Elon? And so one of the ways I describe this is, is there a unit of measure which is milli-Elons, right?

You know, like millimeters, right? So could you have like 900 milli-Elons? Like, could you have 90% of Elon?

But maybe not 100%? Or could you have the 50% version or the 10% version, or maybe just one milli-Elon, right? Maybe somebody who's just a little bit more like that. That's your question: do you need the whole package, or can people learn these techniques and be this way even if they're not Elon, even if they don't have his natural capacities, and even if they're not willing to go all the way to where he goes?

Can they go partway there? And I actually think that's an open question today. And I would say there are shockingly few CEOs I know who are even asking that question or trying to figure it out.

Now, in theory, if you've got one, you should be able to have a thousand. I mean, there's a lot of smart people in the world. And so here's the other question: what would it do for our civilization if we had a thousand of them? Yeah. I mean, at the rate that he's producing now, a lot, a lot.

And what would, what would happen in our civilization if every single industry had an Elon? Right. And so like, you know. It's legitimately insane.

Yeah. So that possibility exists. You can see it, right?

So I find that very exciting, very optimistic. I don't know if it'll go, but I think that's one of the really big questions in front of us right now. Yeah, no doubt.

It'll be interesting to see if anybody can pull those principles out in a way that's metabolizable by other entrepreneurs. The economy: did we just dodge a recession? Or does debt make the recession inevitable, and we just kicked the can a little bit down the road? What's your health check on the economy right now?

Yeah, so, okay, let me give you a couple of things on this. So number one, I differentiate between the United States and America. I think they're two different concepts.

Say more. I think the United States is the system. It's the formal governance system.

So it's the government and all the stuff we've been talking about. It's all the rules. and all the processes and all the procedures.

And we all complain about, you know, we all have our various complaints about it. And, you know, whoever we are in the political spectrum, we've got all kinds of complaints about the government. But then there's America.

And for me, America is the people, right? And, you know, the people are kind of part and parcel of the country, but they are different.

They're not the same thing. And, you know, we happen to be a very large country with a very large number of very smart, talented, driven, capable people. And then I'd also say my mental model of America is that we're just a giant sprawling mess. Like, we're just chaos.

And we have been, you know, for our entire 250-year existence. We're the place people come when they're just too ornery to stay where they were, you know, they just can't tolerate it. And so we get the most disagreeable people from all over the world who come here. Because they get to basically be wild.

They get to do things that they wouldn't normally get to do. And of course I benefit from that, because we get so many of the good founders from all over the world who come here to do it, because they don't think they can do it in the countries where they grew up.

And so America is a country of tremendously talented, driven, capable, ambitious people from all over the world who have aggregated here, and their descendants over many generations. And, you know, we've selected ourselves into the best possible hand in terms of the quality of our people. Like, it's just extraordinary what this country is capable of.

And then most of what the country does is not done by the United States. It's not done by the government. Most of it's done by the people.

Most of it's done by America. It's the old line, "the business of America is business," which goes back to the 1920s. It's just that most of what most people do every day is go to work and try to do things.

They try to do things. They try to contribute. They try to take care of their family.

They try to build their companies. They try to do a good job. They try to build good products. They try to take care of customers. And so most of what people do every day is actually really productive and really helpful.

Ranked by that, we're just the best country. We have the best combination: we have the rule of law of an advanced society, but we have fewer rules than the European countries, for example. And then we have all the energy of a new country, right? Because of immigration and because of all the talented people that we have. And so we're kind of at the sweet spot of the combination: big country, small country, old country, new country.

Like, we're kind of in that sweet spot. And so I go through all that just to say: America wants to grow, right? America, the country, the people, we want to grow. We want to succeed. We want to build great things.

We want to build businesses. We want to have economic growth. We want to just, like, shock the world with all these amazing inventions.

We want to do all these things. We are held back in all kinds of ways by the United States, but America wants to do that. And so basically, if the government isn't too much at our throats, the economy will naturally just grow forever. It'll just grow in perpetuity, and America will remain the best bet.

Globally, it will remain the best market to invest in. It'll produce the largest number of high-quality new companies and so forth. And so the American economy wants to grow. And that's what's happened, which is we came out of COVID, and if you just plot a chart of American economic growth versus Europe and other countries, there we are, off to the races.

And, you know, Germany's starting to shrink, and the UK and a bunch of other countries have severe problems; they're not able to reignite growth. The new UK Labour government just had a growth conference this week, because it's now hit such a crisis point in the UK that they don't know how to get economic growth. And so, yeah, our economy wants to grow; it wants to do fine. Yeah, we probably did dodge a recession, and that's just because the productive energies of the American people kicked in. You know, it's all completely unpredictable from here, but fundamentally, I feel really good about America. I feel really good about the people, and I feel really good about the engine that we have. I believe that, I forget who said it, I actually think you know, because I've heard you talk about this, but inside of all of us is a God-shaped hole.

And that hole right now, I think, is driving a resurgence of people really trying to re-embrace religion, from an interesting angle that's probably outside of today's purview, what we're going to talk about, but they have a need to fill it. You're going to get the question of the soul. So what's going to happen is you're going to get somebody like me who doesn't have kids, and I'm going to raise an AI child that is embodied. Because why not?

I can rush through the terrible twos. I can pause when they're seven years old for a couple of years and just enjoy that. Whatever.

If I want to go to a movie with my wife, I can literally put them in the kitchen and shut them down. Like, it's just all of the upside and none of the downside. And then all of a sudden other people can be like, yeah, that's dope.

And people are either going to be in relationships with robots romantically, or they're going to be in a romantic relationship with a human but raise AI kids. And you will literally, at least in pockets, because there will be, like, the Amish or whatever, there will be the sort of super-producers who keep their fertility high because their cultural values say yes. There will be some that won't.

And so those cultures will hit an existential crisis based on that, which I think will cause the religious element to really push and say, you know, this is an abomination before God and we absolutely cannot do it. So that's where I feel like, huh, there's going to be this weird tension. And then if people are getting augmented with Neuralink, and obviously these are 20-year time horizons, maybe 30, maybe 50, this is going to play out for somebody in the not-too-distant future, in my estimation. And just to put one more thing in the mix.

You know very well that in backroom conversations in the government, people are asking questions, should we be prepared to do airstrikes on data centers because we are so worried about AI breaking free? So there's already this ambient anxiety about it. You've got me talking like a sci-fi writer, but it's a pretty plausible scenario.

How do we stop that from happening? Or what is the automatic kill switch in the human mind that will stop that from happening? So let's just start by saying there's a lot in there, and I would love to talk about every part of it. And by the way, we should go as deep as you want with me on the religion stuff and so forth, because I agree with a lot of the setup to the question. So let's see how to come at this. Well, look, to start with, I would say we have a crisis of meaning already, right? And so you talk about population, you know, fertility, right? And Elon's been talking about this a lot lately, but fertility rates are crashing all over the world, right? And it's actually really striking what's happening, right? Which is, it's happening across cultures, right?

And so normally when there's something happening in America or Europe or Japan, you generally analyze it: okay, what's happening in American culture that's causing this? Or what's happening in Japanese culture that causes this? But it's happening in all those cultures simultaneously. Population growth is crashing here, it's crashing in Europe, it's crashing in Korea, it's crashing in Japan, it's crashing in China. And by the way, China, Japan and Korea have very different cultures than we do.

And they have very different cultures between each other. Like, they're really different. The Japanese and Koreans are really different.

And yet it's happening in all these sort of advanced societies. And so I guess I would say that's a preexisting condition. We just have that.

And that's sort of a fundamental question. We have this question of meaning, right? The God-shaped hole, which is, you know, a process that kicked off probably about 150 years ago and has been playing out.

And, you know, people have been grappling with that for a long time. And, as you know, we've been through various phases of religious revivals, boom-bust cycles with religions, over the last hundred years. I was growing up in the Midwest in the seventies and eighties during one of the great awakenings.

So the, you know, sort of comeback of evangelical Christianity and the born-again phenomenon. So I remember it well; I've seen that happen. Yeah. So like, you know, I think that's all true.

That's all super important. And then look, tech obviously changes culture. By the way, culture changes tech.

It's a positive feedback loop. Different cultures react to tech in different ways. Let's see where to take it.

I think the counter-argument, maybe the leash to put on it, I guess maybe I should start with, if you don't mind me asking, do you have kids yet? I don't, no. Yeah. So there's a thing I find in my conversations with friends who don't have kids versus friends who do, and it's something I went through myself.

You know, I work in tech, and a lot of people don't have kids, or they wait a long time. And I have this conversation where the people with kids sound like pod people. You know, they sound like they got the brain fungus in The Last of Us or something, right? It's like, oh, you don't understand.

When you have a kid, everything changes. Right. And my friends are like, you know, like, what happened to you?

Like, what's wrong? You know, you sound like you're in a cult. And I'm like, no, no, you don't.

And that was literally me before I had my first kid. Right. It was like, oh, whatever. Like, I want to live my life. I don't know whether I want this additional responsibility.

But basically, I think this is true, it's almost a universal thing, if you talk to parents: when you have your first kid and you look in the kid's eyes for the first time, what you see, in the best-case scenario, is literally a blending of DNA.

And the person you love most in the world is combined with you. And then the baby shows up with these eyes. And the eyes look back at you.

And it's like looking at yourself. And it's like looking at the person you love the most in the world. And it's like looking at this new soul all at the same time. And it is.

It's like a psychological reset. And so that's just such a... It seems so universal that parents understand that and non-parents don't. Right.

In fact, I have friends who are like, I don't know that I want to have kids because it sounds like it changes your psychology so much. Like, I'm worried it's going to ruin everything I like about my life today. And I'm like, no, no, it makes everything better. And they're like, but you have to spend all your time with the kid. And I'm like, yes, but it's the thing I want to do most in the world.

My friends are like, well, that's not what I want, because I want to work all the time. And I'm like, you're missing out. And they're like, you know, you're brainwashed. Right.

So that's a thing. I mean, look, I fully believe people are going to have AI pets, AI friends, all kinds of relationships with AIs; they'll have some form of proxy children. I totally buy that.

By the way, that will probably be based on their information. One of the things I think, for example: your AI kid is probably going to be a version of you, basically trained on your own training data, right? That's terrifying, but yeah. Well, so the concept that's actually starting to take off in the tech world right now is what's called the digital twin. So it's not the digital kid, it's the digital twin.

But the idea is, You know, look, like, for example, I might, I haven't done this yet, but I might do this, which is like, I'm not available 24 seven, but if I feed a language model, like everything I've ever written and everything I've ever said, then maybe if like somebody we work with has wants to ask me a question and it's the middle of the night, they can ask my digital twin and they'll get back a representative answer to what I would say. Right. And so like that, that that's starting to happen. So yeah, like, I think a lot of that stuff's going to happen, but the primal relationship that you have with another human being, and that could be another human being you're related to, or by the way, just another human being that you're not related to like that. There's a level.

I mean, we are very, very deeply wired to have those relationships be the center of our universe. And again, like I said, there's a big issue here, which is people aren't having kids. And so that's not getting transmitted. And there are very big questions that kind of flow out of that. But it's just different. It's just flat-out different.

When you have your first kid, and certainly you should have like a dozen kids, they'd be great, I'm pretty sure if we tape a show, like, two years after that, you're going to be like, oh yeah, I don't know what I was thinking. This is just so different. So do you think that's the kill switch?

Well, let me broaden out the answer, which is fundamentally technology, AI, all this, like it has implications on lots of things for sure. But one of the things that it does is it makes us richer. Like it makes our society richer. It makes our material comfort a lot better.

It makes it a lot easier, by the way, to provide for kids and family, be able to have a higher level of material welfare. There's this line of critique of new technology, which is like, well, material welfare is not sufficient because it still leaves this God-shaped hole. But the way I think about it is, at higher levels of material comfort, we have a better shot at figuring out the answer to the God-shaped hole.

Like, if you're going to be confronted with existential questions about religion and philosophy and how to live your life, would you rather do that with material deprivation or with material plenty? And it's really easy for people to say what they would prefer, but it's like, would you rather be a monk with a straw mat on the floor, eating bread and water, trying to figure out the meaning of life, or would you rather be you, with a nice fluffy bed and air conditioning and artisanal cheese from Whole Foods?

I love that that's the one you pick. You'd much rather be you. Of course. I'm going to have a much better chance at figuring out the important questions in life if I'm not worried about where my next meal comes from, if I'm not worried whether the power is going to go out, if I'm not worried that I'm going to freeze to death overnight, if I'm not worried that my kid won't have access to a neonatal incubator, if I don't have to worry about where my income is coming from. Of course, with material plenty, I'm going to have a lot more capacity to answer the deep questions.

And so I think that's going to be the unanticipated payoff: as technology and AI make the world materially better off, I believe it increases our ability to address these big questions, not decreases it. Yeah, so I'll agree with you there. But there's one distinction I'm going to make, which is: the reason that religion is so impactful is that it addresses every person on the intellectual spectrum. So when I went through a phase where I was trying to explain to people, hey, think like this, act like this, it'll make your life better.

These ideas just radically changed me. And I found that largely, because as people age, they're just not able to be as intellectually nimble, but you also run into the reality that some people do not have the intellectual horsepower. Whenever I talk about this, I want to remind people it's entirely possible I fall below the line.

I'm perfectly willing to accept that. But you have to understand that there are dumb people that cannot process some of these ideas. And so religion becomes this catch all for, hey, this is how you live a good life. And it will speak to highly intelligent people and it will speak to people who are just going to follow the 10 commandments.

I mean, the Ten Commandments are basically the Bible's TL;DR, right? So it's like, hey, don't worry about reading all that. Just hear the ten things. Go do these ten things and you're going to be fine.

Done in a story format. And so it really speaks to people. So I don't think this sort of intellectual approach to, hey, this is why AI is going to be great for you.

And in the future, it's going to solve all these problems. But what's going to happen as a punctuated moment? I think on a long enough timeline, this is all great and it's wonderful and it brings about an age of abundance. But I'm talking about the punctuated moment where people start losing their jobs and they don't want to make the transition. People get a sort of warmth and comfort from religion.

They're being drawn back into it. I don't know if the data will support this exact statement, but it feels accurate that people are coming back into religion in really large numbers, higher than we've seen in a while. I'm not saying the highest ever in human history, but, you know, locally, time-wise. And so we've got this massive influx into religion right now.

You've got this massive thing that's going to disrupt all the things that religion talks about: taking care of people, the soul, a connection to God, the afterlife, all these things that AI and robotics are going to challenge. And now I think you have this collision with people who aren't able to navigate the nuance intellectually. It becomes problematic.

And I think that is going to have to be addressed. Now, let's take the super boring version of this, where it just plays out as regulatory capture.

And the government is just like, nah, my constituents don't want it. It gets mired. It gets super bogged down.

And now everything gets caught up in red tape. And the thing that I can already feel happening now, where there's just so much regulation that it's hard to move forward at the rate we could, say, back when I was a kid, that gets exacerbated. That's my sort of mundane vision of how this plays out.

But I don't see a world in which it just all happens in a sunny, rosy way. Do you? It's complicated.

So look, I'm a techno-optimist, not a techno-utopian. And so I start by saying a couple of things, which is, I don't think technology answers all these questions. And so I don't think technology, or for that matter economic growth, gives answers to most people for meaning.

Right. And so I don't think any of this is a substitute for religion. And so, from that standpoint, I maybe have a little bit of humility just on the scope of the importance of what we do out here.

So like I said, I think, even in a world of technological abundance and economic abundance and material welfare, the big questions of meaning are still open questions. And so I will hesitate to make sweeping claims on that. Yeah, I guess, maybe the other way to come at this, maybe the other way to think about this, is to talk more about the religion side.

So my take on religion: I completely buy religious revivals. And I think we're actually in quite a religious time right now, which we should talk about. Because, for example, politics has become a branch of religion. You know, we've invented a whole series of secular religions in the last 150 years.

And we continue to do that. And so the form and shape of religions keeps playing out, even if they don't have, you know, supposedly supernatural elements to them. And, you know, I'm completely open to the idea; like I said, I lived through a fundamentalist religious revival.

I'm completely open to more of those. Those clearly are happening at various places in the world. So I will certainly grant all that. That said, we moderns and postmoderns don't relate to religion the way that people did before our times.

So, like, the further you go back in history, and for sure this was true from 150 years ago back, the relationship that people had with religion was different than it is today. I won't go all the way down the rabbit hole on this, but basically, for most of recorded human history, religion was not an a la carte thing. It was a very deep part of who you were as a person. And specifically, they had the concept of peoplehood. There was a people, and the people would have shared genetics.

They'd all be related to each other. The people would have shared culture. The people would have a shared place, right? You know, their own land. And then they would have religion. And those concepts were all conjoined.

There's this great book called The Ancient City that goes through basically the prehistory of Western civilization. It goes through what are called the old Indo-European religions and cultures that ultimately resulted in the Greeks and the Romans and then Christianity. So it goes all the way back to the beginning of how Western societies formed.

And it was basically a three-part structure: it was family, it was tribe, and then it was city. And these concepts of shared kinship and genetics, shared culture, shared religion, and shared geography were all conjoined. And if you told somebody in that era that, you know, you could switch religions, they would have considered you completely insane. Because being of that religion, with those gods, was precisely tied to those other factors of culture, genetics, and place.

Of course, in our society, we have completely disconnected those things. You know, if I go out in public today and I'm like, no, I'm part of a peoplehood where I have shared genetics, culture, religion, and place, and I'm going to have, you know, an ethnostate for German and Dutch people in the Midwest...

Like, you know, obviously I get instantly tagged as a white supremacist and I get shunned and ostracized from society. By the way, I'm not proposing that. I don't want that, just for the record.

Right. And so we live in a different time. We have abstracted religion away from those other things. And kind of to your point, actually, as a consequence of that, we can now choose our own religion. Right.

And as a modern Westerner, you or I are completely free tomorrow to become Catholic or Baptist or Jewish or Muslim or whatever we want. Or, by the way, to make up our own religions and, by the way, proselytize and go try to get followers. And, you know, we call those cults, and people do that all the time. And I would argue we live in a world of cults, and we've got all these new cults out here in California.

And, you know, some of them are, by the way, super involved in AI. So it's a thing. But religion has become a la carte. It's like the old choose-your-own-adventure books you might have had when you were a kid.

You can basically design the religion that you want. And so on the one hand, you would say, oh, well, then this is going to be a time of tremendous invention of religious concepts and religious behaviors. And by the way, I believe that's true.

I do think that's happening. On the other hand, is religion going to control our lives in the way that it did back when that concept was conjoined with genetics, culture, and place? We just don't take religion that seriously anymore. We could choose to take it seriously again if we want to, but just observationally, we don't.

That's interesting. And when it becomes inconvenient, we change, right? I'll be, I'm going to run something by you.

Tell me how this lands. I know you have a broad historical context. So, also being a student of history, I hesitate to say this, but I have a hypothesis that the religious impulse plays out at the same volume, no matter what.

It just becomes a question of what the religious impulse is aimed at. So, for instance, as a game developer, I am constantly awestruck by how toxic the communities can become. And so I sat down one day and I was like, what on earth is going on here? And I realized: this is the religious impulse being met by a video game. So you are communing with the other players.

You are committing a ton of your time to this. You are giving yourself over to this game. You care about the lore. You care about the time that you've invested into it. I mean, this is a level of belonging to a game and a game community that you would only have gotten historically as a part of either a town, a family, or a religion.

And so it meets that criteria. And so when you have this sense of tremendous belonging, and you, as the game developer, go in and mess with their thing... The easiest way to explain it is: imagine I could go in and mess with the rules of football without consulting anybody. And tomorrow you roll up and it's just different. And now the player that you loved is no longer a good player, and you don't really like it anymore.

It doesn't speak to your skill set. People would be outraged. Like my dad was into this team.

My dad was into this game and I was raised on it. And now I'm here and you've changed it, and you're trash. And that's basically what happens.

Now, if I'm right that this is riding on the neurological architecture that makes religion so powerful, then that volume is still dialed to 11. Now, hopefully nobody's going to kill in the name of their favorite video game. But I think that's a narrative question and not an architectural question.

So if I were to get people to believe that investing in this video game cult somehow meant something about you and society, and we were all fighting for, you know, insert politics here, you get how suddenly, with the right narrative, whoa. People will go. And that's another area: I think politics right now is triggering the religious impulse.

So I don't think the volume is dialed down. Even if we quote-unquote don't take religion as seriously, I think the outcome is going to be the same, because this is the architecture of the human mind. Yeah, so I 100% agree with everything you said.

I just interpret the consequences of it differently. Imagine telling an Athenian Greek, or a Roman, or a Christian in 300 AD, or a Christian in 1800 AD for that matter, that your religion is now a video game. They would have thought you had completely lost your mind. Right? Like, wait a minute, you've now taken that entire religious impulse, which is every bit as strong as it was.

And you've now applied it to a video game. You have completely disconnected the importance of religion from actual physical reality. It's no longer relevant to any traditional concept of community, city, environment, or family. And, by the way, does it guide your decisions about things like reproduction and children?

Are you indoctrinating your kids? Maybe you are. Maybe you're indoctrinating your kids in World of Warcraft. But indoctrinating your kids in World of Warcraft is not the same as indoctrinating your kids in Catholicism.

It may be equally intense, but it's not as comprehensive an impact on the worldview of how people live their lives. So I agree with you, but I think that leads to tremendous amounts of displacement. But let me also say I really agree with your last point, the politics point, which I think is extremely important, especially sitting here today, three weeks before a very big election. Something I often point to when I talk to people about this: if you look at the charts of the big general-population surveys of "would you be comfortable with your kid marrying somebody of a different X?"

There's the famous chart for somebody of a different race: whatever, 60, 80 years ago, that was like 90% uncomfortable. Today it's like 10% and falling. And then another one would be somebody of a different religion.

And 80 years ago, when they polled people on this, Catholics, Jews, Protestants all said no way, you're not marrying outside the faith. And today, at least in the US, very few people care. And so that chart is way down.

The chart of "do you care if your kid marries somebody of the other political party?" That chart is up and to the right. And so to me, that maps exactly to what you said, which is, yeah, politics has become our religion.

There was actually a very, very important thinker and writer in the 20th century, Eric Voegelin. He's the best writer I've found on this topic. He started his work in the 30s and 40s.

And he was trying to explain, at the time, the rise of both communism and fascism. He's like, wow, these people are crazy, these people are really extreme. And then he's like, all right, what is leading

the Bolsheviks on the one hand and the Nazis on the other to be so fevered and enthusiastic about these incredibly high-impact social movements with all their consequences? And so he developed a theory very consistent with what you said, which he called, I think, political religions. He did the mapping and basically said these are in fact direct stand-ins for religion.

Both communism and Nazism were legendarily very hostile to Christianity, precisely for that reason, because Christianity was the threat. They were quite literally trying to displace the dominant religion in Europe at that time. And so, again, you're exactly right. I think the impulse is with us.

I think many people, both Republicans and Democrats in the U.S. today, exhibit that exact same kind of religious behavior around their politics. On the one hand, it can sound patronizing to say that, because people think that their politics are all carefully thought through.

They don't think they're doing it. But politics are important to people in the same way that religion is and was important to people. And they're certainly acting like it, and they certainly point in their politics to how political choices are going to affect how people live, which is very consistent with the view of a religion. Yeah, and so I think they're displacing that religious energy into politics. I think if they displaced that religious energy into video game cults, that's probably an improvement.

Maybe. Maybe. It's certainly more benign, I think, for the reasons you said earlier. So what does the religious impulse done well look like? There's obviously the option of just funneling it into a traditional religion that's lasted for thousands of years; that's probably going to be fine.

But given that a lot of people are not doing that, how can you do it well? Yeah, so the anthropological view of religion, I think, is that it's about group formation and cohesion. Right. The book The Ancient City talks about this; this is the role that religion played.

So the original form of this, in prehistory, was: we've got the family, which is basically the extended family up through cousins. And by the way, cousin marriage: you marry your cousins, and so you try to keep the family in the family.

And then the family has its gods. And then over time, the families aggregated, the clans aggregated up into tribes, which consisted of multiple families. And then the tribe would have its gods. And then the tribes would aggregate up into cities.

And the cities would have their gods. Right. And so as a member of a city, you had three tiers of gods that you were basically required to worship and honor. And you literally had the hearth: the fire, the permanent fire.

You had to keep the fire lit, and you had to do sacrifices to the gods and so forth. And the original morality of it was: if you meet somebody from another family, tribe, or city, they worship different gods. Right? They have their own gods. And so your gods are inherently at war with their gods, and your moral obligation is to kill them on sight.

That's aggressive. Right? It was literally kill them on sight. So had you told people from that era, from those many centuries, no, you're supposed to be tolerant of people from other religions, they would have said, are you out of your mind?

They're a threat. If we don't kill them, they're going to kill us. We kill them on sight.

And so the concept of human rights is a 180-degree inversion from the original form of society. By the way, a big improvement, I think, but a very, very big inversion.

So why don't I go through that. At the most fundamental level, what's religion for? It's for group cohesion.

Why did it work that way? Because that's what maximally bonded the family, the tribe, and the city together at a time when physical survival was very much up for grabs. Right. Is the family, the tribe, the city going to make it through the year? TBD. Is there going to be a famine, a flood, a mudslide, a volcano eruption?

Is another tribe going to come over and kill you? Are you going to run out of food? Like, those are all very important questions.

The entire tribe or city had to really pull together for physical survival. And so religion was the bonding element that pulled the group together. And I would argue, fast-forward to today, that's exactly the behavior you see in video games, right?

Which is, a member of a video game cult is not acting as an individual. Inevitably, they're acting as a member of a group.

Right. And it's group cohesion. And then I also apply the Jonathan Haidt theory here, coming from psychology. He has this great line in his book The Righteous Mind, where he uses the word morality.

But you can equivalently, I think, use the word religion. He said morality binds and blinds. A shared morality or a shared religion binds people together into a group; it identifies us versus them, friend versus foe, the way it did in prehistory.

And then he said, and this is really important, the other part is that it blinds. It sets up a knowledge framework, a perception framework, by which you emphasize confirming information that's good for your group and dismiss disconfirming information that's bad for your group.

And you literally become blind. Right. And you see this today with Republicans and Democrats, where generally, the more passionate the Republican or Democrat, the less able they are to articulate the other side's point of view correctly. Right?

The less able they are to steelman the other side's view, which means, in psychological terms, they're literally giving up what's called theory of mind. They're giving up the ability to understand what it's like in somebody else's shoes, because it's more important to be a member of the group than it is to be able to understand the other. Anyway, this is all very much in support of what you're saying. These are very fundamental, primal behaviors.

I think they are as important in our society today as ever, which you see in the politics. And I think they're going to be equally important hundreds of years from now.

Hopefully this impulse gets channeled in productive directions. Yeah. Yeah. We'll see. So Kai-Fu Lee has talked about how we could experience up to 50% job displacement.

It's not like there won't be new jobs, but you're going to have a very substantial percentage of people that are either temperamentally or age-wise unwilling to make a change. Societally, how do we handle that? Yeah.

So I don't think that's true at all. In economics, that's the classic thing called the lump of labor fallacy.

And by the way, Kai-Fu is a very bright guy, so he may well be right on this, but what any economist will tell you is that this is a fallacy, and it's actually the fallacy at the heart of Marxism, at the heart of socialism. And it's a very intuitive fallacy, one that people fall into very easily.

It's called the lump of labor fallacy, and there's a great Wikipedia page on this that people can read. The fallacy basically goes: there's a certain amount of labor being done in the world today. That labor is either going to be done by people or by machines. If it's done by people, then they're going to make money doing it and be able to provide for themselves; if it's done by machines, then the people are going to become unemployed and they're going to be screwed. What's interesting about this fallacy is that it has been in place in political thought, in Marxist and socialist economic thought, for like 300 years. The Marxists really packaged it up and turned it into a religion, actually. This was the immediate concern, the panic, at the very beginning of the Industrial Revolution: that machines were going to substitute for human labor and immiserate everybody. And it's embedded in a lot of the myths and legends that we have in our cultural DNA.

There's a famous ballad, I don't know if you've heard it, about the myth of this figure John Henry. Kids are often taught this song: John Henry, the steel-driving man.

This would be when the railroads were getting built, right? He's the guy using a hammer to drive spikes into the rail bed to lay railroad tracks, which used to be something people did by hand.

John Henry is the famous guy who can drive in the most spikes. And then one day the foreman shows up with a machine that drives in the spikes, and they have a contest where John Henry competes with the machine.

Sure. Who can drive in the most spikes? It turns out John Henry wins the contest and then drops dead from a heart attack. It's symbolic, you know, the last gasp of human effort before the machines take over. And that dates back to, I don't know, like 1870, right?

So 150 years ago, people had this fear. And what we've had since is 300 years of modern technology, industrialization, automation, computerization, literally three centuries now. And sitting here today, there are more jobs in the world than ever, and at higher wages. So in practice, we now have three centuries of evidence that it's a fallacy. That's actually not what happens.

What actually happens is the opposite: technology creates far more jobs than it destroys, and creates jobs that are better, at higher levels of income. Do people adopt those jobs? Are there going to be people that just get left behind? There will be some. And look, there is some response.

And I should also back up for a second and say: in conversations about this topic, in my experience, it's very easy to come across as judgmental and patronizing. One of the things I will claim, and what we're about to talk about, is that there are some jobs that are better than other jobs.

Some jobs are just better jobs. They're physically less taxing, they pay better, whatever. But there may be a bar to getting those jobs, or people may not want to do them. And people can get very resentful at the idea that they have to give up what they have for the prospect of something that might be better but that maybe they don't want.

And who are these experts on TV or on the internet to tell them that they should think in these terms? So I should start by saying, look, people are going to have a lot of reactions. They always have; a lot of our politics over those same 300 years has been about this process of industrial change.

And then, therefore, job change. It's like the rise of unions; there are all these things that happen in our politics as a consequence of these fights. So I should just say: you need to be able to talk clinically about this, because you do need to be able to talk about the big issues. I do recognize that it's very easy to come across as patronizing, and I also recognize that people are going to have different points of view on this. Some people are going to struggle, some for sure. Look, when the car came along, blacksmiths were not happy, right? Because all of a sudden you don't need as many horses. Now, many blacksmiths became car mechanics, but many blacksmiths maybe didn't want to become car mechanics and got very upset and resentful about that. So, yes, all of the above is going to happen.

Having said that, the basic mechanism of introducing new technology into an economy is not job destruction. The basic mechanism is job creation, net job creation, overwhelming the job destruction. And the reason for that has to do with this concept of productivity growth.

And the concept of productivity growth is very important. Productivity growth is the economic measure of the impact of technology in an economy, and basically it means the ability to generate more output with less input.

And so, to use the John Henry example: can I drive more spikes into the rail bed to build railroad tracks faster at the same cost? Can I build more cars at lower prices? Can I make more video games, more video game levels, at lower prices?

In any industry, there's always this question of how much am I producing today, and can I produce more output at lower cost? It's what every business logically wants to do, right? They want to expand output and reduce costs. And so productivity growth is the metric by which economists track the impact of technology on the economy.

And this is very important. The faster the rate of productivity growth, the faster the rate of economic growth. The faster the rate of productivity growth, the more prices of current goods and services in the economy fall.

Right. Because if you're able to produce more with less, then prices come down. Right.

And so just take food as an example: food today is far cheaper than it was 200 years ago because of all the automation. Right. To buy an avocado 200 years ago would have cost the modern-day equivalent of, you know, $100. And now it's a dollar. Right.

And so productivity growth leads to declines in prices, and declines in prices lead to increases in spending power. Because if, as a consumer, I pay less for the things I'm already buying because of productivity growth, then spending power is being unlocked. Without even getting a raise, I have new spending power.

And that new spending power leads to the creation of new products, services, industries, and jobs, to fulfill what all of a sudden I can spend on. What I'm describing is the basic mechanism of technological adaptation in an economy, and the basic mechanism of economic growth. And theories like the one you mentioned, theories by which the introduction of technology has an immiserating effect as opposed to a cornucopian effect, historically have not played out, because that's not actually how this works, which is why the socialists are perpetually disappointed.

Every socialist is super pissed all the time because capitalism works so well. It's really annoying, right, that we live in a time of material plenty after all of this runaway capitalism. It's Boris Yeltsin in the American supermarket in 1989, completely shocked at how much food there is. It's like, you know, they lied to us, right?

The communists lied to us, right, about how to do this. Anyway, we can go into any aspect of this you want in detail. But basically, I'm completely convinced that's exactly what's going to happen here.

If AI works the way that we're imagining what's going to happen is productivity growth is going to take off. Prices of current goods and services are going to fall. Volume is going to expand.

More people in the world are going to be able to buy all the things that they want to buy. But also, it's going to unlock a lot of new spending power. That spending power is then going to create demand for new industries, right? It's going to unlock demand that we're going to be able to satisfy by producing and buying many new things.
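The mechanism Marc walks through (productivity up, prices down, spending power freed, new demand created) can be sketched as a toy calculation. This is a minimal illustration of the argument, not an empirical model; the budget and productivity numbers are made up for the example:

```python
# Toy model of the mechanism described above: productivity growth lowers
# prices, which frees consumer spending power for new goods and services.
# All numbers are hypothetical illustrations, not empirical estimates.

def freed_spending_power(budget: float, productivity_gain: float) -> float:
    """If productivity rises by productivity_gain (e.g. 0.25 = 25%), the
    same basket of goods can be produced with proportionally less input,
    so its price falls to 1 / (1 + gain) of the old level. The difference
    is spending power freed up for new products, services, and jobs."""
    new_cost = budget / (1.0 + productivity_gain)
    return budget - new_cost

if __name__ == "__main__":
    budget = 1000.0   # hypothetical household spending on existing goods
    gain = 0.25       # hypothetical 25% productivity growth
    freed = freed_spending_power(budget, gain)
    print(f"Old basket now costs {budget - freed:.2f}; "
          f"{freed:.2f} is freed for new demand")
```

With a 25% productivity gain, a 1000-unit budget frees 200 units of spending power without any raise, which, per the argument above, becomes demand for new industries and jobs.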

And our future digital children, AI children, 100 years from now will be sitting here having a podcast saying, can you believe our human parents had this fallacy where they didn't think this was going to turn out this way? Because it always did, and it did again. And so anyway, that's why I'm so optimistic about this. I love it.

For people that don't know you, he wrote a document basically saying technology is going to save us all. He went through in detail on a lot of these points. Very counterintuitive coming out of the Bay Area for sure. One thing that I... One moment.

Yeah, please. I wouldn't say it's going to save us all. I would say I'm an optimist, not a utopian. And this is very important, it goes back to where we started: everything I just described does not answer all of life's deep questions.

Right? It's not enough to just have material welfare. I'm a hundred percent on that. But having material welfare is better than not having it, right?

And it's the best starting point for answering the big questions. And so I just wanted to qualify that: I am not myself proposing a new religion.

Marc, this has been incredible. Where can people follow along with you? Oh, good. So I am on Twitter, now called X. I'm on there as pmarca, P-M-A-R-C-A. That is probably one of my main presences.

And then I have a Substack, which is linked from the Twitter account. And my partner Ben and I have a YouTube show that we do intermittently, but we get good feedback on it. So maybe we can link to that.

Awesome, guys. I can definitely vouch for his content. It is amazing. I hope you'll check it out.

Speaking of things I hope you will do: if you have not already, be sure to subscribe. And until next time, my friends, be legendary. Take care. Peace.

If you liked this conversation, check out this episode to learn more. Today, we're going deep into a conversation that has me incredibly fired up. We're talking about our future, your future, my future, the future of humanity itself. And we're doing it with one of the most visionary minds in artificial intelligence, Emad Mostaque. And we were like.