Marc Andreessen: How Risk Taking, Innovation & Artificial Intelligence Transform Human Experience

Time: 0

Andrew Huberman: [MUSIC PLAYING] Welcome to the Huberman Lab

Time: 1.46

podcast, where we discuss science and science-based tools for everyday life.

Time: 9.26

I'm Andrew Huberman, and I'm a professor of neurobiology and ophthalmology

Time: 13.34

at Stanford School of Medicine.

Time: 15.38

Today, my guest is Marc Andreessen.

Time: 17.739

Marc Andreessen is a software engineer and an investor in technology companies.

Time: 22.27

He co-founded and developed Mosaic, which was one of the

Time: 25.21

first widely used web browsers.

Time: 27.309

He also co-founded and developed Netscape, which was one of the

Time: 30.86

earliest widely used web browsers.

Time: 33.349

And he co-founded and is a general partner at Andreessen Horowitz,

Time: 37.31

one of the most successful Silicon Valley venture capital firms.

Time: 40.93

All of that is to say that Marc Andreessen is one of the most successful

Time: 44.66

innovators and investors ever.

Time: 47.429

I was extremely excited to record this episode with Marc for several reasons.

Time: 50.96

First of all, he himself is an incredible innovator.

Time: 54.01

Second of all, he has an uncanny ability to spot the innovators of the future.

Time: 58.67

And third, Marc has shown over and over again the ability to understand

Time: 62.4

how technologies not yet even developed are going to impact the

Time: 65.92

way that humans interact at large.

Time: 68.27

Our conversation starts off by discussing what makes for an exceptional innovator,

Time: 72.9

as well as what sorts of environmental conditions make for exceptional

Time: 76.37

innovation and creativity more generally.

Time: 78.84

In that context, we talk about risk taking, not just in terms of risk taking

Time: 82.84

in one's profession, but about how some people, not all, but how some people who

Time: 87.26

are risk takers and innovators in the context of their work also seem to take

Time: 91.55

a lot of risks in their personal life and some of the consequences that can bring.

Time: 96.24

Then we discuss some of the most transformative technologies that are

Time: 99.33

now emerging, such as novel approaches to developing clean energy, as well

Time: 104.05

as AI or artificial intelligence.

Time: 106.619

With respect to AI, Marc shares his views as to why AI is likely to

Time: 111.06

greatly improve human experience, and we discuss the multiple roles

Time: 114.83

that AI is very likely to have in all of our lives in the near future.

Time: 119.059

Marc explains how not too long from now, all of us are very likely to have AI

Time: 123.339

assistants, for instance, assistants that give us highly informed health advice,

Time: 128.21

highly informed psychological advice.

Time: 130.25

Indeed, it is very likely that all of us will soon have AI assistants that govern

Time: 134.75

most, if not all, of our daily decisions.

Time: 137.389

And Marc explains how, if done correctly, this can be a tremendously

Time: 141.49

positive addition to our life.

Time: 143.44

In doing so, Marc provides a stark counterargument to those who argue that AI

Time: 148.57

is going to diminish human experience.

Time: 150.77

So if you're hearing about and or concerned about the ways that AI is

Time: 154.43

likely to destroy us today, you are going to hear about the many different

Time: 158.25

ways that AI technologies now in development are likely to enhance

Time: 162.34

our human experience at every level.

Time: 164.66

What you'll soon find is that while today's discussion does center around

Time: 167.56

technology and technology development, it is really a discussion about

Time: 171.51

human beings and human psychology.

Time: 173.65

So whether you have an interest in technology development and or AI,

Time: 177.49

I'm certain that you'll find today's discussion to be an important and

Time: 181.03

highly lucid view into what will soon be the future that we all live in.

Time: 185.35

Before we begin, I'd like to emphasize that this podcast is separate from my

Time: 188.61

teaching and research roles at Stanford.

Time: 190.469

It is, however, part of my desire and effort to bring zero-cost-to-consumer

Time: 193.39

information about science and science-related

Time: 195.7

tools to the general public.

Time: 197.549

In keeping with that theme, I'd like to thank the sponsors of today's podcast.

Time: 201.39

Our first sponsor is LMNT.

Time: 203.15

LMNT is an electrolyte drink that has everything you need and nothing you don't.

Time: 206.88

That means plenty of the electrolytes, sodium, magnesium, and potassium in

Time: 210.59

the correct ratios, but no sugar.

Time: 213

The electrolytes and hydration are absolutely key for mental health,

Time: 216.61

physical health, and performance.

Time: 218.44

Even a slight degree of dehydration can impair our ability to think, our energy

Time: 222.3

levels and our physical performance.

Time: 224.62

LMNT makes it very easy to achieve proper hydration, and it does so by

Time: 228.64

including the three electrolytes in the exact ratios they need to be present.

Time: 232.719

I drink LMNT first thing in the morning when I wake up.

Time: 234.89

I usually mix it with about 16 to 32 oz of water.

Time: 238.05

If I'm exercising, I'll drink one while I'm exercising, and I tend to

Time: 241.63

drink one after exercising as well.

Time: 244.219

Now, many people are scared off by the idea of ingesting sodium because obviously

Time: 248.99

we don't want to consume sodium in excess.

Time: 251.09

However, for people that have normal blood pressure, and especially for

Time: 254.13

people that are consuming very clean diets, that is consuming not so many

Time: 258.61

processed foods or highly processed foods, oftentimes we are not getting

Time: 262.62

enough sodium, magnesium and potassium, and we can suffer as a consequence.

Time: 266.41

And with LMNT, simply by mixing it in water, it tastes delicious.

Time: 269.11

It's very easy to get that proper hydration.

Time: 271.37

If you'd like to try LMNT, you can go to drinklmnt, that's L-M-N-T,

Time: 275.46

.com/huberman to claim a free LMNT sample pack with your purchase.

Time: 278.99

Again, that's drinklmnt.com/huberman.

Time: 282.92

Today's episode is also brought to us by Eight Sleep.

Time: 285.509

Eight Sleep makes smart mattress covers with cooling, heating

Time: 288.19

and sleep tracking capacity.

Time: 289.84

I've spoken many times before on this podcast about the fact that sleep, that

Time: 293.72

is getting a great night's sleep, is the foundation of all mental health,

Time: 297.67

physical health and performance.

Time: 298.86

When we're sleeping well, everything goes far better.

Time: 301.219

And when we are not sleeping well or enough, everything gets far

Time: 304.59

worse at the level of mental health, physical health and performance.

Time: 307.45

Now, one of the key things to getting a great night's sleep and waking up feeling

Time: 310.2

refreshed is that you have to control the temperature of your sleeping environment.

Time: 313.48

And that's because in order to fall and stay deeply asleep, you

Time: 316.74

need your core body temperature to drop by about one to three degrees.

Time: 319.95

And in order to wake up feeling refreshed and energized, you want

Time: 323

your core body temperature to increase by about one to three degrees.

Time: 326.48

With Eight Sleep, it's very easy to induce that drop in core body

Time: 330.58

temperature by cooling your mattress early and throughout the night and

Time: 333.75

warming your mattress toward morning.

Time: 335.41

I started sleeping on an Eight Sleep mattress cover a few years ago, and

Time: 338.52

it has completely transformed the quality of the sleep that I get.

Time: 341.559

So much so that I actually loathe traveling because I don't have my Eight

Time: 345.3

Sleep mattress cover when I travel.

Time: 346.91

If you'd like to try Eight Sleep, you can go to eightsleep.com/huberman and you'll

Time: 351.1

save up to $150 off their Pod 3 Cover.

Time: 354.37

Eight Sleep currently ships in the USA, Canada, UK, select

Time: 357.14

countries in the EU and Australia.

Time: 359.19

Again, that's eightsleep.com/huberman.

Time: 362.28

And now for my discussion with Marc Andreessen.

Time: 365.66

Marc, welcome.

Time: 366.42

Marc Andreessen: Hey, thank you.

Time: 367.85

Andrew Huberman: Delighted to have you here and have so many

Time: 369.62

questions for you about innovation, AI, your view of the landscape

Time: 374.23

of tech, and humanity in general.

Time: 378.199

I want to start off by talking about innovation from three

Time: 381.45

different perspectives.

Time: 383.32

There's the inner game, so to speak, or the psychology of the innovator,

Time: 387.849

or innovators, things like their propensity for engaging in conflict

Time: 393.25

or not, their propensity for having a dream or a vision, and in particular,

Time: 398.7

their innovation as it relates to some psychological trait or expression.

Time: 405.15

So we'll get to that in a moment.

Time: 406.639

The second component that I'm curious about is the outer landscape

Time: 410.42

around innovators, who they place themselves with, the sorts of

Time: 414.31

choices that they make and also the sorts of personal relationships

Time: 418.05

that they might have or not have.

Time: 419.72

And then the last component is this notion of the larger landscape that

Time: 424.179

they happen to find themselves in.

Time: 425.4

What time in history?

Time: 427.71

What's the geography?

Time: 429.049

Bay Area, New York, Dubai, etc.

Time: 432.449

So to start off, is there a common trait of innovators that you think

Time: 439.09

is absolutely essential as a seed to creating things that are really impactful?

Time: 446.17

Marc Andreessen: Yeah.

Time: 446.599

So I'm not a psychologist, but I've picked up some of the

Time: 449.22

concepts and some of the terms.

Time: 451.15

And so it was a great moment of delight in my life when I learned about the Big

Time: 454.65

Five personality traits, because I was like, aha, there's a way to actually

Time: 457.91

describe the answer to this question in at least reasonably scientific terms.

Time: 462.099

And so I think what you're looking for, when you're talking about real

Time: 464.799

innovators, like people who actually do really creative breakthrough work, I think

Time: 467.44

you're talking about a couple of things.

Time: 468.38

So one is very high in what's called trait openness, which is one of the

Time: 472.57

Big Five, which is basically just like, flat out open to new ideas.

Time: 477.28

And of course, the nature of trait openness is trait openness means

Time: 480.409

you're not just open to new ideas in one category, you're open to

Time: 482.49

many different kinds of new ideas.

Time: 483.78

And so we might talk about the fact that a lot of innovators also

Time: 486.76

are very creative people in other aspects of their lives, even outside

Time: 490.36

of their specific creative domain.

Time: 493.05

So that's important.

Time: 493.73

But of course, just being open is not sufficient, because if you're just

Time: 496.16

open, you could just be curious and explore and spend your entire life

Time: 498.799

reading and talking to people and never actually create something.

Time: 501.759

So you also need a couple of other things.

Time: 503.78

You need a high level of conscientiousness, which is

Time: 505.83

another one of the Big Five.

Time: 506.85

You need somebody who's really willing to apply themselves, and in our world,

Time: 510.28

typically over a period of many years to be able to accomplish something great.

Time: 514.2

They typically work very hard.

Time: 516.549

That often gets obscured because the stories that end up getting told

Time: 519.36

about these people are, it's just like this kid, and he just had this idea,

Time: 522.3

and it was like a stroke of genius.

Time: 523.35

And it was like a moment in time and was just like, oh, he was so lucky.

Time: 526.47

And it's like, no, for most of these people, it's years and

Time: 528.98

years and years of applied effort.

Time: 531.19

And so you need somebody with an extreme, basically, willingness to defer

Time: 534.69

gratification and really apply themselves to a specific thing for a long time.

Time: 538.839

And of course, this is why there aren't very many of these people, there aren't

Time: 542.23

many people who are high in openness and high in conscientiousness because to a

Time: 545.735

certain extent, they're opposed traits.

Time: 547.8

And so you need somebody who has both of those.

Time: 550.23

Third is you need somebody high in disagreeableness, which

Time: 553.11

is the third of the Big Five.

Time: 555.12

So you need somebody who's just basically ornery, because if they're not ornery,

Time: 559.69

then they'll be talked out of their ideas by people who will be like, oh,

Time: 562.34

well, because the reaction most people have to new ideas is, oh, that's dumb.

Time: 566.01

And so somebody who's too agreeable will be easily dissuaded from

Time: 569.25

pursuing the idea, from pulling the thread any further.

Time: 571.88

So you need somebody highly disagreeable.

Time: 573.139

Again, the nature of disagreeableness is they tend to

Time: 575.56

be disagreeable about everything.

Time: 577.94

So they tend to be these very sort of iconoclastic kind of renegade characters.

Time: 582.59

And then there's just a table stakes component, which is they

Time: 584.79

just also need to be high IQ.

Time: 586.379

They just need to be really smart because it's hard to innovate in

Time: 589.59

any category if you can't synthesize large amounts of information quickly.

Time: 593.559

And so those are four basically high spikes, very rare traits that

Time: 598.66

basically have to come together.

Time: 601.72

You could probably also say they probably at some point need to be relatively low

Time: 604.92

on neuroticism, which is another of the Big Five, because if they're too neurotic,

Time: 608.139

they probably can't handle the stress.

Time: 609.44

Right. So it's kind of this dial in there.

Time: 611.68

And then, of course, if you're into the sort of science of the Big Five, basically

Time: 616.17

these are all people who are on the far outlying kind of point on the normal

Time: 620.29

distribution across all these traits.

Time: 621.809

And then that just gets you to, I think, the sort of hardest topic of

Time: 625.34

all around this whole concept, which is that there are very few of these people.
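
[Editor's note: a rough back-of-the-envelope sketch of the rarity Marc is describing. Purely as an illustration, assume each of the four traits he names is normally distributed, that "far outlying" means roughly the top 1% of people on each trait, and that the traits are independent. The cutoff, the independence assumption, and the snippet below are the editor's assumptions, not anything stated in the conversation; real Big Five traits are correlated, so this is only a ballpark.]

# Illustrative sketch only (editor's assumptions, not from the conversation).
tail_fraction_per_trait = 0.01   # top 1% on a single trait
num_traits = 4                   # openness, conscientiousness, disagreeableness, IQ

joint_fraction = tail_fraction_per_trait ** num_traits
print(f"Joint fraction under independence: {joint_fraction:.0e}")   # ~1e-08
print(f"Roughly 1 in {round(1 / joint_fraction):,} people")         # ~1 in 100,000,000

[Under those illustrative assumptions, the intersection works out to roughly one person in a hundred million, which is the quantitative flavor of "there are very few of these people."]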

Time: 629.52

Andrew Huberman: Do you think they're born with these traits?

Time: 631.88

Marc Andreessen: Yeah, they're born with the traits.

Time: 634.25

And then, of course, the traits are not destiny, genetics are not destiny, and so

Time: 637.14

the traits are not deterministic in the sense that just because they

Time: 640.36

have those personality traits doesn't mean they're going to deliver great

Time: 643.41

creativity, but they need to have those properties because otherwise they're just

Time: 647.4

not either going to be able to do the work or they're not going to enjoy it.

Time: 649.83

Right.

Time: 650.49

I mean, look, a lot of these people are highly capable, competent people.

Time: 653.94

It's very easy for them to get, like, high paying jobs in traditional

Time: 657.58

institutions and get lots of traditional awards and end up with big paychecks.

Time: 661.87

And there's a lot of people at big institutions that you and I know well,

Time: 666.319

and I deal with many of these where people get paid a lot of money and

Time: 668.98

they get a lot of respect and they go for 20 years and it's great and they

Time: 671.63

never create anything new, right?

Time: 673.92

There's a lot of administrators, a lot of them end up in administrative

Time: 678.11

jobs, and that's fine, that's good.

Time: 680.32

The world needs that also, right?

Time: 682.84

The innovators can't run everything because the rate

Time: 685.449

of change would be too high.

Time: 686.32

Society, I think, probably wouldn't be able to handle it.

Time: 687.93

So you need some people who are on the other side who are going to kind of keep

Time: 690.75

the lights on and keep things running.

Time: 692.179

But there is this decision that people have to make, which is okay if I

Time: 695.98

have the sort of latent capability to do this, is this actually what

Time: 699.34

I want to spend my life doing?

Time: 700.83

And do I want to go through the stress and the pain and the trauma

Time: 704.63

and anxiety and the risk of failure?

Time: 706.78

And so, do I really want to?

Time: 708.5

Once in a while you run into somebody who's just like,

Time: 710.49

can't do it any other way.

Time: 712.37

They just have to.

Time: 713.44

Andrew Huberman: Who's an example of that?

Time: 714.28

Marc Andreessen: I mean, Elon's the paramount example of our time, and

Time: 717.92

I bring him up in part because he's such an obvious example, but in part

Time: 720.82

because he's talked about this in interviews where he basically says,

Time: 724.87

he's like, I can't turn it off.

Time: 727.63

The ideas come, I have to pursue them, right?

Time: 730.13

It's why he's like running five companies at the same time

Time: 732.03

and, like working on a sixth.

Time: 734.51

It's just like he can't turn it off.

Time: 736.91

Look, there's a lot of other people who probably had the capability to do it,

Time: 740.16

who ended up talking themselves into or whatever events conspired to put them in

Time: 744.3

a position where they did something else.

Time: 746.73

Obviously, there are people who try to be creative, who

Time: 748.3

just don't have the capability.

Time: 749.96

And so, there's some venn diagram there of determinism through traits,

Time: 753.58

but also choices in life, and then also, of course, the situation in

Time: 757.71

which they're born, the context within which they grow up, culture, what their

Time: 761.81

parents expect of them, and so forth.

Time: 763.77

And so to kind of get all the way through this, you have to thread all

Time: 767.469

these needles kind of at the same time.

Time: 769.59

Andrew Huberman: Do you think there are folks out there that meet these criteria

Time: 773.25

who are disagreeable, but that can feign agreeableness, you know that can...?

Time: 779.09

[BOTH LAUGH] For those just listening, Marc just raised his right hand.

Time: 783.23

In other words, they can sort of, the phrase that comes to mind, maybe because

Time: 787.52

I can relate to it a little bit, is they sneak up through the system, meaning

Time: 791.719

they behave ethically as it relates to the requirements of the system.

Time: 794.78

They're not breaking laws or breaking rules, in fact, quite the opposite,

Time: 797.68

they're paying attention to the rules and following the rules until

Time: 800.45

they get to a place where being disagreeable feels less threatening

Time: 806.049

to their overall sense of security.

Time: 808.959

Marc Andreessen: Yeah, I mean, look, the really highly competent people

Time: 810.609

don't have to break laws, right?

Time: 813.26

There was this myth that grew up around the movie The Godfather, and

Time: 814.581

then there was this character, Meyer Lansky, who, like, ran basically

Time: 820.69

the Mafia 50, 60, 70 years ago.

Time: 822.51

And there was this great line of like, well, if Meyer Lansky had only applied

Time: 825.47

himself to running General Motors, he would have been the best CEO of all time.

Time: 828.44

It's like, no, not really, right?

Time: 830.34

The people who are great at running the big companies, they

Time: 832.93

don't have to be mob bosses.

Time: 834.02

They don't have to break laws.

Time: 837.3

They're smart and sophisticated enough to be able to work inside the system.

Time: 840.9

They don't need to take the easy out.

Time: 841.99

So, I don't think there's any implication that they have to break laws.

Time: 845.59

That said, they have to break norms, right?

Time: 847.81

And specifically, this is probably the thing that gets missed the most,

Time: 851.23

because the process of innovating, the process of creating something

Time: 855.06

new, once it works, the stories get retconned, as they say in comic books.

Time: 860.57

So the stories get adapted to where it's like it was inevitable all along.

Time: 863.85

Everybody always knew that this was a good idea.

Time: 866.01

The person has won all these awards, society embraced them.

Time: 869.03

And invariably, if you were with them when they were actually doing the work, or if

Time: 873.44

you actually get a couple of drinks into them and talk about it, it'd be like,

Time: 875.97

no, that's not how it happened at all.

Time: 877.59

They faced a wall of skepticism, just like a wall of basically

Time: 881.07

social, essentially denial.

Time: 882.75

No, this is not going to work.

Time: 883.95

No, I'm not going to join your lab.

Time: 885.36

No, I'm not going to come work for your company.

Time: 887.19

No, I'm not going to buy your product, right?

Time: 888.94

No, I'm not going to meet with you.

Time: 890.42

And so they get just like tremendous social resistance.

Time: 894.33

They're not getting positive feedback from their social network the way that

Time: 898.279

more agreeable people need to have, right?

Time: 900.559

And this is why agreeableness is a problem for innovation.

Time: 903.08

If you're agreeable, you're going to listen to the people around you.

Time: 905.49

They're going to tell you that new ideas are stupid, end of story.

Time: 908.9

You're not going to proceed.

Time: 910.429

And so I would put it more as, they need to be able to deal with

Time: 913.224

social discomfort to the level of ostracism,

Time: 917.59

or at some point they're going to get shaken out and they're just going to quit.

Time: 920.67

Andrew Huberman: Do you think that people that meet these criteria

Time: 922.96

do best by banding with others that meet these criteria early?

Time: 926.87

Or is it important that they form this deep sense of self, like the ability

Time: 932.15

to cry oneself to sleep at night or lie in the fetal position, worrying

Time: 936.719

that things aren't going to work out and then still get up the next

Time: 939.66

morning and get right back out there?

Time: 941.47

Marc Andreessen: Right.

Time: 941.94

So, Sean Parker has the best line, by the way, on this.

Time: 945.059

He says being an entrepreneur or being a creator is like getting punched

Time: 948.65

in the face over and over again.

Time: 950.009

He said, eventually you start to like the taste of your own blood.

Time: 953.17

And I love that line because it makes everybody massively uncomfortable,

Time: 955.95

but it gives you a sense of how basically painful the process is.

Time: 959.689

If you talk to any entrepreneur who's been through it about that, they're like,

Time: 963.01

oh, yeah, that's exactly what it's like.

Time: 964.62

So, there is a big individual component to it.

Time: 968.299

But look, it can be very lonely, and especially very hard, I think, to do

Time: 972.86

this if nobody around you is trying to do anything even remotely similar.

Time: 976.04

And if you're getting just universally negative responses,

Time: 978.23

like very few people, I think very few people have the ego strength to

Time: 981.49

be able to survive that for years.

Time: 983.41

So I do think there's a huge advantage, and this is why you do see clusters.

Time: 986.809

There's a huge advantage to clustering.

Time: 990.109

Throughout history, you've had this clustering effect.

Time: 992.08

You had the clustering of the great artists and sculptors, you had the

Time: 996.28

clustering of the philosophers of Greece.

Time: 997.8

You had the clustering of tech people in Silicon Valley.

Time: 999.97

You have the clustering of, you know, arts, movie, TV people

Time: 1002.74

in Los Angeles, and so forth.

Time: 1004.679

And so, you know, there's always a scene, right?

Time: 1007.48

There's always, like a nexus and a place where people come

Time: 1010.42

together for these kinds of things.

Time: 1013.32

So, generally speaking, if somebody wants to work in tech, innovate in tech,

Time: 1016.11

they're going to be much better off being around a lot of people who are trying to

Time: 1019.02

do that kind of thing than they are in a place where nobody else is doing it.

Time: 1022.87

Having said that, the clustering can have downsides, it can have side effects.

Time: 1026.469

And you put any group of people together, and you do start to get

Time: 1029.29

groupthink, even among people who are individually very disagreeable.

Time: 1032.98

And so these same clusters where you get these very idiosyncratic

Time: 1036.339

people, they do have fads and trends just like every place else.

Time: 1039.96

And so they get wrapped up in their own social dynamics.

Time: 1044.279

The good news is the social dynamic in those places is usually very forward

Time: 1047.099

looking, and so it's usually like, I don't know, it's like a herd of iconoclasts

Time: 1051.85

looking for the next big thing.

Time: 1053.849

So iconoclasts, looking for the next big thing.

Time: 1055.33

That's good.

Time: 1055.719

The herd part.

Time: 1056.79

That's what you've got to be careful of.

Time: 1058.589

So even when you're in one of these environments, you have to

Time: 1060.44

be careful that you're not getting sucked into the groupthink too much.

Time: 1063.01

Andrew Huberman: When you say groupthink, do you mean excessive friction?

Time: 1065.39

Do you mean pressure testing each other's ideas to the point where

Time: 1068.11

things just don't move forward?

Time: 1069.23

Or are you talking about groupthink, where people start to form a consensus?

Time: 1072.82

Or the self belief that, gosh, we are so strong because we are so different?

Time: 1079.609

Can we better define groupthink?

Time: 1081

Marc Andreessen: It's actually less either one of those things, although both of those happen.

Time: 1083.99

Those are good.

Time: 1085.04

Those are good.

Time: 1086

The part of groupthink I'm talking about is just like, we all basically zero in, we

Time: 1090.19

just end up zeroing in on the same ideas.

Time: 1091.889

Right.

Time: 1092.83

In Hollywood, there's this classic thing.

Time: 1095.219

There are years where all of a sudden there's, like, a lot of volcano movies.

Time: 1098.559

It's like, why are there all these volcano movies?

Time: 1100.26

And it's just like, there was just something in the gestalt, right?

Time: 1102.68

There was just something in the air.

Time: 1104.26

Look, Silicon Valley has this.

Time: 1105.93

There are moments in time where you'll have these.

Time: 1107.82

It's like the old thing.

Time: 1109.719

What's the difference between a fad and a trend?

Time: 1111.69

Fad is the trend that doesn't last.

Time: 1113.179

Right.

Time: 1113.799

And so Silicon Valley is subject to both fads and trends, just like any place.

Time: 1119.22

In other words, you take smart, disagreeable people, you cluster them

Time: 1121.56

together, they will act like a herd.

Time: 1123.97

They will end up thinking the same things unless they try very hard not to.

Time: 1128.09

Andrew Huberman: You've talked about these personality traits of great innovators

Time: 1131.29

before, and we're talking about them now.

Time: 1135.2

You invest in innovators, you try and identify them, and you are one.

Time: 1139.09

So you can recognize these traits here.

Time: 1141.67

I'm making the presumption that you have these traits.

Time: 1143.15

Indeed you do.

Time: 1144.78

We'll just get that out of the way.

Time: 1148.099

Have you observed people trying to feign these traits, and are there any

Time: 1153.5

specific questions or behaviors that are a giveaway that they're pretending to

Time: 1160.73

be the young Steve Jobs or that they're pretending to be the young Henry Ford?

Time: 1166.719

Pick your list of other names that qualify as authentic, legitimate innovators.

Time: 1172.38

We won't name names of people who have tried to disguise

Time: 1174.77

themselves as true innovators.

Time: 1176.03

But what are some of the litmus tests?

Time: 1180.349

And I realize here that we don't want you to give these away to the

Time: 1183.42

point where they lose their potency.

Time: 1185.38

But if you could share a few of those.

Time: 1187.76

Marc Andreessen: Good, we're actually a pretty open book on this.

Time: 1190.84

First of all, yes, so there are people who definitely try to come in and basically

Time: 1194

present as being something that they're not, and they've read all the books.

Time: 1196.79

They will have listened to this interview.

Time: 1199.01

They study everything and they construct a facade, and they come in

Time: 1201.92

and present as something they're not.

Time: 1202.979

I would say the amount of that varies exactly correlated with the NASDAQ.

Time: 1207.32

And so when stock prices are super low, you actually get the opposite.

Time: 1213.32

When stock prices are super low, people get too demoralized.

Time: 1215.46

And people who should be doing it basically give up because they

Time: 1217.409

just think that the industry is over, the trend is over, whatever.

Time: 1220.38

It's all hopeless.

Time: 1221.37

And so you get this flushing thing.

Time: 1222.41

So nobody ever shows up at a stock market low and says, like, I'm the

Time: 1226.31

next big thing, and doesn't really want to do it, because there are higher

Time: 1232.13

status things to do. The kinds of people who do the thing that you're talking about, they're

Time: 1235.23

fundamentally oriented for social status.

Time: 1236.79

They're trying to get the social status without actually the substance.

Time: 1241.31

And there are always other places to go to get social status.

Time: 1244

So after 2000, the joke was, when I got to Silicon Valley in

Time: 1249.64

'93, '94, the Valley was dead.

Time: 1251.61

We can talk about that.

Time: 1252.63

By '98, it was roaring, and you had a lot of these people showing up,

Time: 1255.93

you basically had a lot of people showing up with these kinds of stories.

Time: 1259.83

2000, the market crashed.

Time: 1260.69

By 2001, the joke was that there were these terms, B to C and B to B.

Time: 1266.21

And in 1998, B to C meant business to consumer and B to B

Time: 1270.28

meant business to business, which are two different kinds of business

Time: 1272.639

models for Internet companies.

Time: 1273.67

By 2001, B to B meant back to banking and B to C meant back to consulting,

Time: 1280.18

which is the high status people who, the people oriented to status, who showed up

Time: 1283.62

to be in tech were like, yeah, screw it.

Time: 1285.73

This is over.

Time: 1286.3

Stick a fork in it.

Time: 1287.1

I'm going to go back to Goldman Sachs or go back to McKinsey,

Time: 1290.36

where I can be high status.

Time: 1291.959

And so you get this flushing kind of effect that happens in a downturn.

Time: 1296.62

That said, in a big upswing, yeah, you get a lot of people showing up with a

Time: 1302.71

lot of kind of, let's say, public persona without the substance to back it up.

Time: 1306.95

So the way we test, I can actually say exactly how we test for this, because

Time: 1312.34

the test exactly addresses the issue in a way that is impossible to fake.

Time: 1316.11

And it's actually the same way homicide detectives try to find out if you've

Time: 1320.66

actually, like, if you're innocent or whether you've killed somebody.

Time: 1322.31

It's the same tactic, which is, you ask increasingly detailed questions, right?

Time: 1328.519

And so the way the homicide cop does this is, what were you doing last night?

Time: 1332.52

Oh, I was at a movie.

Time: 1333.79

Which movie?

Time: 1336.15

Which theater?

Time: 1337.7

Okay, which seat did you sit in?

Time: 1339.3

Okay, what was the end of the movie?

Time: 1342.01

And you ask increasingly detailed questions and people have trouble.

Time: 1345.86

At some point, people have trouble making things up and things just fuzz

Time: 1348.279

into just kind of obvious bullshit.

Time: 1349.6

And basically fake founders basically have the same problem.

Time: 1353.34

They're able to relay a conceptual theory of what they're doing that they've

Time: 1355.99

kind of engineered, but as they get into the details, it just fuzzes out.

Time: 1360

Whereas the true people that you want to back that can do it, basically what

Time: 1364.17

you find is they've spent five or ten or 20 years obsessing on the details

Time: 1367.789

of whatever it is they're about to do.

Time: 1369.07

And they're so deep in the details that they know so much

Time: 1371.39

more about it than you ever will.

Time: 1373.03

And in fact, the best possible reaction is when they get mad, which

Time: 1376.57

is also what the homicide cops say.

Time: 1378.529

What you actually want is you want the emotional response of like, I can't

Time: 1382.18

believe that you're asking me questions this detailed and specific and picky

Time: 1386.209

and they kind of figure out what you're doing and then they get upset.

Time: 1389.35

That's good, that's perfect, right?

Time: 1391.59

But then they have to have proven themselves in the sense of,

Time: 1395.16

they have to be able to answer the questions in great detail.

Time: 1398.4

Andrew Huberman: Do you think that people that are able to answer those questions

Time: 1400.57

in great detail have actually taken the time to systematically think through the

Time: 1404.77

if-ands of all the possible implications of what they're going to do and they

Time: 1409.7

have a specific vision in mind of how things need to turn out or will turn out?

Time: 1414.76

Or do you think that they have a vision and it's a no matter

Time: 1419.37

what, it will work out because the world will sort of bend around it?

Time: 1422.83

I mean, in other words, do you think that they place their vision in

Time: 1425.27

context or they simply have a vision and they have that tunnel vision of

Time: 1429.379

that thing and that's going to be it?

Time: 1431.76

Let's use you for an example with Netscape.

Time: 1435.29

That's how I first came to know your name.

Time: 1438.33

When you were conceiving Netscape, did you think, okay, there's this

Time: 1443.73

search engine and this browser and it's going to be this thing that

Time: 1447.84

looks this way and works this way and feels this way, did you think that?

Time: 1453.58

And also think about that there was going to be a gallery of other search

Time: 1457.56

engines and it would fit into that landscape of other search engines?

Time: 1460.56

Or were you just projecting your vision of this thing as this

Time: 1463.46

unique and special brainchild?

Time: 1467.41

Marc Andreessen: Let me give the general answer, and then we can

Time: 1469.06

talk about the specific example.

Time: 1470.219

So the general answer is this:

Time: 1471.76

entrepreneurship, creativity, innovation is what economists call

Time: 1474.84

decision making under uncertainty.

Time: 1477.439

Both parts of that are important. Decision making:

Time: 1478.87

Like, you're going to make a ton of decisions because you have to

Time: 1480.35

decide what to do, what not to do.

Time: 1481.6

And then uncertainty, which is like, the world's a complicated place.

Time: 1485.75

And in mathematical terms, the world is a complex adaptive

Time: 1488.04

system with feedback loops.

Time: 1489.27

And Isaac Asimov wrote in his novels, he wrote about this field

Time: 1496.42

called psychohistory, which is the idea that there's like a

Time: 1498.81

supercomputer that can predict the future of human affairs, right?

Time: 1501.3

And it's like, we don't have that.

Time: 1503.77

[LAUGHS] Not yet.

Time: 1504.62

Andrew Huberman: [LAUGHS] Not yet.

Time: 1505.11

We'll get to that later.

Time: 1506.759

Marc Andreessen: We certainly don't have that yet.

Time: 1508.86

And so you're just dealing, you know, military commanders call

Time: 1512.27

this the fog of war, right?

Time: 1513.77

You're just dealing with a situation where the number of

Time: 1515.58

variables are just off the charts.

Time: 1517.46

It's all these other people who are inherently unpredictable, making all

Time: 1520.84

these decisions in different directions.

Time: 1522.49

And then the whole system is combinatorial, which is these

Time: 1524.45

people are colliding with each other, influencing their decisions.

Time: 1526.91

And so, I mean, look, the most straightforward kind of way to

Time: 1530.24

think about this is, it's amazing.

Time: 1532.45

Like, anybody who believes in economic central planning, it always

Time: 1534.37

blows my mind because it's just like, try opening a restaurant.

Time: 1538.17

Try just opening a restaurant on the corner down here.

Time: 1540.849

And like 50/50 odds, the restaurant is going to work.

Time: 1543.45

And all you have to do to run a restaurant is have a thing and serve food.

Time: 1546.41

And it's like most restaurants fail, right?

Time: 1549.41

People who run restaurants are pretty smart.

Time: 1551.75

They usually think about these things very hard, and they all want to

Time: 1553.56

succeed, and it's hard to do that.

Time: 1555.439

And so to start a tech company or to start an artistic movement or to fight

Time: 1559.599

a war, you're just going into this, basically conceptual battleground or

Time: 1564.07

in military terms, real battleground, where there's just like incredible levels

Time: 1567.8

of complexity, branching future paths, and so there's nothing predictable.

Time: 1572.56

And so what we look for is basically the really good innovators.

Time: 1578.02

They've got a drive to basically be able to cope with that and deal with that.

Time: 1581.02

And they basically do that in two steps.

Time: 1582.709

So one is they try to pre-plan as much as they possibly can and we

Time: 1586.2

call that the process of navigating what we call the idea maze.

Time: 1589.19

And so the idea maze basically is, I've got this general idea, and it might be

Time: 1592.34

the Internet is going to work or search or whatever, and then it's like, okay, in

Time: 1595.38

their head, they have thought through of like, okay, if I do it this way, that way,

Time: 1598.86

this third way, here's what will happen.

Time: 1600.23

Then I have to do that, then I have to do this, then I have to

Time: 1602.31

bring in somebody to do that.

Time: 1603.33

Here's the technical challenge I'm going to hit.

Time: 1605.17

And they've got in their heads, as best anybody could, as

Time: 1609.02

complete a sort of map of possible futures as they could possibly have.

Time: 1612.16

And this is where I say, when you ask them increasingly detailed questions, that's

Time: 1615.089

what you're trying to kind of get them to kind of chart out, is, okay, how far ahead

Time: 1618.93

have you thought, and how much are you anticipating all of the different twists

Time: 1622.17

and turns that this is going to take?

Time: 1623.91

Okay, so then they start on day one, and then, of course, what

Time: 1625.97

happens is now they're in it, now they're in the fog of war, right?

Time: 1630.05

They're in future uncertainty.

Time: 1630.349

And now that idea maze is maybe not helpful practically, but now they're

Time: 1633.63

going to be basically constructing it on the fly, day by day, as they

Time: 1636.23

learn and discover new things and as the world changes around them.

Time: 1639.03

And of course, it's a feedback loop, because if their thing starts to

Time: 1641.82

work, it's going to change the world.

Time: 1642.91

And then the fact the world is changing is going to cause

Time: 1645.06

their plan to change as well.

Time: 1647.88

And so, yeah, the great ones, basically, the great ones

Time: 1652.099

course correct every single day.

Time: 1653.88

They take stock of what they've learned.

Time: 1656.36

They modify the plan.

Time: 1658.13

The great ones tend to think in terms of hypotheses, right?

Time: 1661.31

Like a scientific sort of mentality, which is they tend to think,

Time: 1663.29

okay, I'm going to try this.

Time: 1665.679

I'm going to go into the world, I'm going to announce that I'm doing this for sure.

Time: 1669.04

I'm going to say, this is my plan.

Time: 1670.369

I'm going to tell all my employees that, and I'm going to tell all my

Time: 1672.35

investors that, and I'm going to put a stake in there, and it's my plan,

Time: 1674.08

and then I'm going to try it, and even though I sound like I have complete

Time: 1677.31

certainty, I know that I need to test to find out whether it's going to work.

Time: 1680.41

And if it's not, then I have to go back to all those same people and

Time: 1683.05

have to say, well, actually, we're not going left, we're going right.

Time: 1685.78

And they have to run that loop thousands of times to get through the other side.

Time: 1690.26

And this led to the creation of this great term pivot, which has been very helpful

Time: 1694.1

in our industry because the word, when I was young, the word we used was fuck

Time: 1697.38

up, and pivot sounds so much better, so much more professional.

Time: 1702.26

But, yeah, you make mistakes.

Time: 1703.7

It's just too complicated to understand.

Time: 1706.45

You course correct, you adjust, you evolve.

Time: 1709.07

Often these things, at least in business, the businesses that end up working

Time: 1712.33

really well tend to be different than the original plan, but that's part of

Time: 1716.27

the process of a really smart founder basically working their way through

Time: 1719.66

reality as they're executing their plan.

Time: 1722.849

Andrew Huberman: The way you're describing this has parallels to

Time: 1724.88

a lot of models in biology and the practice of science, random walks,

Time: 1729.11

but ones that aren't truly random, pseudo-random walks in biology, etc.

Time: 1732.82

But one thing that is becoming clear from the way you're describing

Time: 1736.35

this is that I could imagine a great risk to early success.

Time: 1741.58

So, for instance, somebody develops a product, people are excited by it,

Time: 1745.4

they start to implement that product, but then the landscape changes, and

Time: 1748.88

they don't learn how to pivot, to use the less profane term for it.

Time: 1753.85

They don't learn how to do that.

Time: 1755.34

In other words, and I think of everything these days, or most everything, in

Time: 1759.48

terms of reward schedules and dopamine reward schedules, because that is

Time: 1763.32

the universal currency of reward.

Time: 1765.68

And so when you talk about the Sean Parker quote of learning to enjoy

Time: 1769.69

the taste of one's own blood, that is very different than learning to

Time: 1773.52

enjoy the taste of success, right?

Time: 1775.62

It's about internalizing success as a process of being self

Time: 1779.78

determined and less agreeable, etc.

Time: 1783.15

In other words, building up of those five traits becomes the source of dopamine,

Time: 1788.15

perhaps in a way that's highly adaptive.

Time: 1790.22

So on the outside, we just see the product, the end product, the iPhone,

Time: 1793.42

the MacBook, the Netscape, etc.

Time: 1795.83

But I have to presume, and I'm not a psychologist, but I have done

Time: 1800.25

neurophysiology and I've studied the dopamine system enough to know that

Time: 1804.51

what's being rewarded in the context of what you're describing sounds

Time: 1808.46

to be a reinforcement of those five traits, rather than, oh, it's going

Time: 1812.63

to be this particular product, or the company is going to look this way, or

Time: 1815.38

the logo is going to be this or that.

Time: 1816.969

That all seems peripheral to what's really going on, that great

Time: 1822.73

innovators are really in the process of establishing neural circuitry

Time: 1827.01

that is all about reinforcing the me and the process of being.

Time: 1833.46

Marc Andreessen: So this is like extrinsic versus intrinsic motivation.

Time: 1836.74

So, the Steve Jobs kind of Zen version of this, right?

Time: 1839.14

Or the sort of hippie version of this was the journey is the reward.

Time: 1843.5

He always told his employees that.

Time: 1844.46

It's like, look, everybody thinks in terms of these big public markers,

Time: 1847.22

like the stock price or the IPO or the product launch or whatever.

Time: 1849.719

He's like, no, it's actually the process itself is the point.

Time: 1853.91

Right, to your point, if you have that mentality, then that's an intrinsic

Time: 1857.66

motivation, not an extrinsic motivation.

Time: 1859.65

And so that's the kind of intrinsic motivation that can

Time: 1862.13

keep you going for a long time.

Time: 1863.62

Another way to think about it is competing against yourself, right?

Time: 1865.889

It's like, can I get better at doing this?

Time: 1868.809

And can I prove to myself that I can get better?

Time: 1871.35

There's also a big social component to this, and this is one of the

Time: 1873.97

reasons why Silicon Valley punches so far above its weight as a place.

Time: 1877.58

There's a psychological component which also goes to the comparison set.

Time: 1881.139

So a phenomenon that we've observed over time is the leading tech company

Time: 1885.09

in any city will aspire to be as large as the previous leading tech company in

Time: 1889.709

that city, but often not larger, right?

Time: 1892.87

Because they have a model of success.

Time: 1895.1

And as long as they beat that level of success, they've kind of checked

Time: 1897.78

the box like they've made it.

Time: 1900.71

But then, in contrast, you're in Silicon Valley, and you look around

Time: 1903.06

and it's just like Facebook and Cisco and Oracle and Hewlett Packard and--

Time: 1907.02

Andrew Huberman: --Gladiators--

Time: 1907.98

Marc Andreessen: --Yeah.

Time: 1908.27

And you're just, like, looking at these giants.

Time: 1912.09

Many of them are still around, Mark Zuckerberg is still going to work every day.

Time: 1916.61

And so these people are, like, the role models are, like, alive.

Time: 1920.3

They're, like, right there, and it's so clear how much better they are and how

Time: 1924.109

much bigger their accomplishments are.

Time: 1925.42

And so what we find is young founders in that environment

Time: 1928.78

have much greater aspirations.

Time: 1931.19

Because, again, at that point, maybe it's the social status, maybe there's

Time: 1935.03

an extrinsic component to that, or maybe it helps calibrate that internal

Time: 1939.13

system to basically say, actually, no, the opportunity here is not to build

Time: 1943.379

what you may call a local maximum form of success, but let's build to a

Time: 1946.36

global maximum form of success, which is something as big as we possibly can.

Time: 1950.47

Ultimately, the great ones are probably driven more internally than

Time: 1953.01

externally when it comes down to it.

Time: 1954.69

And that is where you get this phenomenon where you get people who are extremely

Time: 1958.809

successful and extremely wealthy who very easily could punch out and

Time: 1961.109

move to Fiji and just call it, and they're still working 16 hour days.

Time: 1965.969

Obviously something explains that that has nothing to do with external rewards,

Time: 1969.05

and I think it's an internal thing.

Time: 1972.67

Andrew Huberman: As many of you know, I've been taking AG1 daily

Time: 1974.97

since 2012, so I'm delighted that they're sponsoring the podcast.

Time: 1977.939

AG1 is a vitamin mineral probiotic drink that's designed to meet all of

Time: 1982.08

your foundational nutrition needs.

Time: 1984.07

Now, of course, I try to get enough servings of vitamins and minerals

Time: 1986.9

through whole food sources that include vegetables and fruits every day.

Time: 1990.25

But oftentimes I simply can't get enough servings.

Time: 1992.92

But with AG1, I'm sure to get enough vitamins and minerals

Time: 1995.9

and the probiotics that I need.

Time: 1997.629

And it also contains adaptogens to help buffer stress.

Time: 2000.84

Simply put, I always feel better when I take AG1.

Time: 2003.63

I have more focus and energy, and I sleep better.

Time: 2006.17

And it also happens to taste great.

Time: 2008.4

For all these reasons, whenever I'm asked if you could take just

Time: 2011.05

one supplement, what would it be?

Time: 2012.94

I answer AG1.

Time: 2014.639

If you'd like to try AG1, go to drinkag1.com/huberman

Time: 2019.2

to claim a special offer.

Time: 2020.8

They'll give you five free travel packs plus a year's supply of Vitamin D3K2.

Time: 2025.22

Again, that's drinkag1.com/huberman.

Time: 2029.82

I've heard you talk a lot about the inner landscape, the inner psychology

Time: 2035.07

of these folks, and I appreciate that.

Time: 2036.69

We're going even deeper into that today.

Time: 2038.57

And we will talk about the landscape around whether or not Silicon Valley

Time: 2041.9

or New York, whether or not there are specific cities that are ideal

Time: 2044.71

for certain types of pursuits.

Time: 2045.89

I think there was an article written by Paul Graham some years ago about how the

Time: 2049.809

conversations that you overhear in a city will tell you everything you need to know

Time: 2053.039

about whether or not you belong there in terms of your professional pursuits.

Time: 2058.21

Some of that's changed over time, and now we should probably add Austin to the

Time: 2061.09

mix because it was written some time ago.

Time: 2064.79

In any event, I want to return to that, but I want to focus on an aspect

Time: 2069.109

of this intrinsic versus extrinsic motivators in terms of something

Time: 2073.98

that's a bit more cryptic, which is one's personal relationships.

Time: 2078.98

If I think about the catalog of innovators in Silicon Valley, some of them, like

Time: 2083.71

Steve Jobs, had complicated personal lives, romantic personal lives early

Time: 2087.26

on, and it sounds like he worked it out.

Time: 2088.949

I don't know.

Time: 2090.159

I wasn't their couple's therapist.

Time: 2091.83

But when he died, he was in a marriage that for all the world

Time: 2096.52

seemed like a happy marriage.

Time: 2098.94

You also have examples of innovators who have had many partners, many

Time: 2103.099

children with other partners.

Time: 2104.219

Elon comes to mind.

Time: 2105.57

I don't think I'm disclosing anything that isn't already obvious.

Time: 2110.45

Those could have been happy relationships and just had many of them.

Time: 2113.5

But the reason I'm asking this is you can imagine that for the innovator,

Time: 2117.809

the person with these traits, who's trying to build up this thing, whatever

Time: 2122.91

it is, that having someone, or several people in some cases, who just truly

Time: 2130.67

believe in you when the rest of the world may not believe in you yet or

Time: 2134.64

at all, could be immensely powerful.

Time: 2136.79

And we have examples from cults that embody this.

Time: 2141.24

We have examples from politics.

Time: 2142.56

We have examples from tech innovation and science.

Time: 2146.42

And I've always been fascinated by this because I feel like it's the more

Time: 2149.38

cryptic and yet very potent form of allowing someone to build themselves up.

Time: 2154.96

It's a combination of inner psychology and extrinsic motivation.

Time: 2158.92

Because obviously, if that person were to die or leave them or cheat

Time: 2162.41

on them or pair up with some other innovator, which we've seen several

Time: 2166.86

times recently and in the past, it can be devastating to that person.

Time: 2170.26

But what are your thoughts on the role of personal, and in particular,

Time: 2174.179

romantic relationship as it relates to people having an idea and their

Time: 2179.47

feeling that they can really bring that idea to fruition in the world?

Time: 2182.98

Marc Andreessen: So it's a real mixed bag.

Time: 2184.51

You have lots of examples in all directions, and I

Time: 2186.83

think it's something like,

Time: 2187.99

Something like the following.

Time: 2189.25

So first, we talked about the personality traits of these people.

Time: 2192.12

They tend to be highly disagreeable.

Time: 2194.01

Andrew Huberman: Doesn't foster a good romantic relationship.

Time: 2195.619

Marc Andreessen: Highly disagreeable people can be

Time: 2197.002

difficult to be in a relationship with.

Time: 2197.824

[LAUGHS]

Time: 2197.903

Andrew Huberman: [LAUGHS] I may have heard of that once or twice before.

Time: 2199.57

A friend may have given me that example.

Time: 2201.48

Marc Andreessen: Yeah.

Time: 2201.9

Right.

Time: 2202.15

And maybe you just need to find the right person who complements that

Time: 2204.6

and is willing to, there's a lot of relationships where it's always this

Time: 2207.42

question about relationships, right?

Time: 2208.45

Which is, do you want to have the same personality growth profile, the

Time: 2211.32

same behavioral traits, basically, as your partner, or do you actually

Time: 2213.41

want to have, is it an opposite thing?

Time: 2217.23

I'm sure you've seen this.

Time: 2217.903

There are relationships where you'll have somebody who's highly disagreeable,

Time: 2220.12

who's paired with somebody who's highly agreeable, and it actually works out great

Time: 2223.15

because one person just gets to be on their soapbox all the time, and the other

Time: 2225.62

person is just like, okay, it's fine.

Time: 2228.72

Right? It's fine.

Time: 2229.15

It's good.

Time: 2230.57

You put two disagreeable people together, maybe sparks fly and they

Time: 2233.98

have great conversations all the time, and maybe they come to hate each other.

Time: 2238.05

Anyway, so these people, if you're going to be with one of these

Time: 2240.33

people, you're fishing out of the disagreeable end of the pond.

Time: 2242.65

And again, when I say disagreeable, I don't mean these are normal distributions.

Time: 2245.94

I don't mean, like 60% disagreeable or 80% disagreeable.

Time: 2248.819

The people we're talking about are 99.99% disagreeable.

Time: 2252

So these are not ordinary people.

Time: 2254.6

So part of it's that.

Time: 2256.12

And then, of course, they have the other personality traits.

Time: 2258.23

They're super conscientious.

Time: 2259.51

They're super driven.

Time: 2260.389

As a consequence, they tend to work really hard.

Time: 2261.969

They tend to not have a lot of time for family vacations or other things.

Time: 2266.049

Then they don't enjoy them if they're forced to go on them.

Time: 2267.799

And so, again, that kind of thing can fray at a relationship.

Time: 2270.48

So there's a fair amount in there that's loaded.

Time: 2273.87

Like, somebody who's going to partner with one of these people

Time: 2275.8

needs to be signed up for the ride.

Time: 2277.81

And that's a hard thing.

Time: 2278.93

That's a hard thing to do.

Time: 2280.73

Or you need a true partnership of two of these, which is also hard to do.

Time: 2284.27

So I think that's part of it.

Time: 2286.2

And then, look, I think a big part of it is people achieve a certain level of

Time: 2289.04

success, and either in their own minds or publicly, and then they start to be

Time: 2294.87

able to get away with things, right?

Time: 2296.5

And they start to be able to.

Time: 2297.38

It's like, well, okay, now we're rich and successful and famous, and now I

Time: 2300.66

deserve, and this is where you get into...

Time: 2302.77

I view this now in the realm of personal choice.

Time: 2305.37

You get into this thing where people start to think that they deserve things,

Time: 2307.84

and so they start to behave in very bad ways, and then they blow up their

Time: 2312.1

personal worlds as a consequence.

Time: 2313.66

And maybe they regret it later, and maybe they don't.

Time: 2316.08

Right?

Time: 2317.44

It's always a question.

Time: 2319.429

I think there's that.

Time: 2321.46

And then, I don't know, maybe the other part of it is that some people just

Time: 2325.5

need more emotional support than others.

Time: 2326.93

And I don't know that that's a big, I don't know that that tilts either way.

Time: 2330.429

I know some of these people who have great, loving relationships and seem

Time: 2333.29

to draw very much on having this kind of firm foundation to rely upon.

Time: 2337.179

And then I know other people who are just like, their personal lives

Time: 2339.18

are just a continuous train wreck.

Time: 2340.359

And it doesn't seem to matter, like, professionally, they just

Time: 2343.97

keep doing what they're doing.

Time: 2344.94

And maybe we could talk here about whatever is the personality

Time: 2349.26

trait for risk taking.

Time: 2350.84

Some people are so incredibly risk prone that they need to take risk in

Time: 2354.23

all aspects of their lives at all times.

Time: 2356.289

And if part of their life gets stable, they find a way to blow it up.

Time: 2360.26

And that's some of these people you could describe in those terms also.

Time: 2364.14

Andrew Huberman: Yeah, let's talk about that.

Time: 2365.81

Because I think risk taking and sensation seeking is something that

Time: 2370.27

fascinates me for my own reasons and in my observations of others.

Time: 2377.129

Does it dovetail with these five traits in a way that can really serve innovation,

Time: 2382.06

in ways that can benefit everybody?

Time: 2383.48

The reason I say to benefit everybody is because, in a sense,

Time: 2387.219

we're painting this picture of the innovator as this really cruel person.

Time: 2392.959

But oftentimes, what we're talking about are innovations that make the

Time: 2395.52

world far better for billions of people.

Time: 2399.019

Marc Andreessen: Yeah, that's right.

Time: 2399.576

And by the way, everything we're talking about also is not just in

Time: 2401.5

tech or science or in business.

Time: 2403.469

Everything we're also talking about is true for the arts.

Time: 2406.219

The history of artistic expression.

Time: 2407.83

You have people with all these same kinds of traits.

Time: 2409.88

Andrew Huberman: Well, I was thinking about Picasso and his regular turnover

Time: 2412.47

of lovers and partners, and he was very open about the fact that it was one of the

Time: 2416.05

sources of his productivity, creativity.

Time: 2420.26

He wasn't shy about that.

Time: 2421.66

I suppose if he were alive today, it might be a little bit different.

Time: 2424.87

He might be judged a little differently.

Time: 2426.49

Marc Andreessen: Or that was his story for behaving in a pattern

Time: 2429.42

that was very awful for the people around him, and he didn't care.

Time: 2432.15

Andrew Huberman: Right, maybe they left him?

Time: 2434.25

Marc Andreessen: Yeah. Who knows?

Time: 2434.63

Right?

Time: 2436.5

Puts and takes to all this, but no.

Time: 2438.849

Okay, so I have a theory.

Time: 2439.64

So here's a theory.

Time: 2440.46

This is one of these, I keep a list of things that will get me

Time: 2442.32

kicked out of a dinner party, topics at any given point in time.

Time: 2446.069

Andrew Huberman: Do you read it before you go in?

Time: 2447.029

Marc Andreessen: Yeah.

Time: 2451.13

On auto recall, so that I can get out of these things.

Time: 2452.41

Here's the thing that can get me kicked out of a dinner

Time: 2455.91

party, especially these days.

Time: 2457.76

So think of the kind of person where it's very clear that they're super high, to

Time: 2463.68

your point, this is somebody who's super high output in whatever domain they're in.

Time: 2466.19

They've done things that have fundamentally changed the world.

Time: 2468.6

They've brought new, whether it's businesses or technologies or works

Time: 2471.71

of art, entire schools of creative expression, in some cases to the world.

Time: 2476.58

And then at a certain point, they blow themselves to smithereens, right?

Time: 2480.099

And they do that either through a massive financial scandal.

Time: 2483.48

They do that through a massive personal breakdown.

Time: 2485.94

They do that through some sort of public expression that causes

Time: 2488.92

them a huge amount of problems.

Time: 2490.51

They say the wrong thing, maybe not once, but several hundred times,

Time: 2494.13

and blow themselves to smithereens.

Time: 2498.03

There's this moral arc that people kind of want to apply, which is

Time: 2500.68

like the Icarus flying too close to the sun and he had it coming and he

Time: 2505.21

needed to keep his ego under control.

Time: 2506.6

And you get kind of this judgment that applies.

Time: 2510.73

So I have a different theory on this.

Time: 2512.98

So the term I use to describe these people, and by the way, a lot of other

Time: 2516.32

people who don't actually blow themselves up but get close to it, which is a

Time: 2519.699

whole 'nother set of people, I call them martyrs to civilizational progress.

Time: 2526.29

Work backwards from civilizational progress.

Time: 2527.58

So look, the only way civilization gets moved forward is when people

Time: 2531.22

like this do something new.

Time: 2532.929

Because civilization as a whole does not do new things.

Time: 2536.299

Groups of people do not do new things.

Time: 2538.34

These things don't happen automatically.

Time: 2540.56

By default nothing changes.

Time: 2542.93

The only way civilizational change on any of these axes ever happens is because one

Time: 2547.41

of these people stands up and says, no, I'm going to do something different than

Time: 2550.54

what everybody else has ever done before.

Time: 2551.98

So, this is progress, like, this is actually how it happens.

Time: 2556.18

Sometimes they get lionized or rewarded.

Time: 2558.19

Sometimes they get crucified.

Time: 2559.9

Sometimes the crucifixion is literal.

Time: 2562

Sometimes it's just symbolic.

Time: 2563.54

But they are those kinds of people, and then they become martyrs when they go down in

Time: 2569.639

flames. And again, this is where it really screws with people's moral judgments

Time: 2573.609

because everybody wants to have the sort of super clear story of like, okay, he

Time: 2577.31

did a bad thing and he was punished.

Time: 2579.54

And I'm like, no, he was the kind of person who was going to do great things

Time: 2583.76

and also was going to take on a level of risk and take on a level of sort of

Time: 2587.54

extreme behavior such that he was going to expose himself to flying too close to

Time: 2591.23

the sun, wings melt, and crash to the ground.

Time: 2593.28

But it's a package deal.

Time: 2596.51

The reason you have the Picassos and the Beethovens and all these

Time: 2599.03

people is because they're willing to take these extreme levels of risk.

Time: 2602.51

They are that creative and original, not just in their art or their business,

Time: 2605.799

but in everything else that they do that they will set themselves up

Time: 2608.52

to be able to fail psychologically.

Time: 2610.379

A psychologist or a psychiatrist would probably ask:

Time: 2612.95

To what extent do they actually have a death wish at some point?

Time: 2616.25

Do they want to punish themselves?

Time: 2617.24

Do they want to fail?

Time: 2617.83

That I don't know.

Time: 2619.33

But you see this.

Time: 2620.73

They deliberately move themselves too close to the sun, and you can see it

Time: 2624.054

when it's happening, because if they get too far away from the sun, they

Time: 2626.63

deliberately move back towards it.

Time: 2627.9

Right.

Time: 2629.18

They come right back, and they want the risk anyway.

Time: 2634.42

So martyrs to civilizational progress.

Time: 2636.49

This is how progress happens.

Time: 2638.429

When these people crash and burn, the natural inclination

Time: 2641.299

is to judge them morally.

Time: 2642.52

I tend to think we should basically say, look, and I don't even know if this

Time: 2645.9

means, like, giving them a moral pass or whatever, but it's like, look, this

Time: 2649.35

is how civilization progresses, and we need to at least understand that there's

Time: 2653.18

a self-sacrificial aspect to this that may be tragic, and often is tragic, but

Time: 2658.429

it is quite literally self-sacrificial.

Time: 2660.5

Andrew Huberman: Are there any examples of great innovators who were able to

Time: 2665.31

compartmentalize their risk taking to such a degree that they had what seemed

Time: 2671.77

to be a morally impeccable life in every domain except in their business pursuits?

Time: 2676.86

Marc Andreessen: Yeah, that's right.

Time: 2677.33

So some people are very highly controlled like that.

Time: 2680.69

Some people are able to compartmentalize very narrowly. I don't really want to set myself up

Time: 2684.47

as an example on a lot of this, but I will tell you, as an example, I will

Time: 2687.78

never use debt in business, number one.

Time: 2690.46

Number two, I have the most placid personal life you can imagine.

Time: 2692.84

Number three, I'm the last person in the world who is ever

Time: 2695.14

going to do an extreme sport.

Time: 2697.049

I mean, I'm not even going to go in the sauna or the ice bath.

Time: 2700.19

I'm not doing any of this.

Time: 2702.05

I'm not heli-skiing.

Time: 2703.12

Andrew Huberman: No obligation.

Time: 2704.84

Marc Andreessen: I'm not on the Titan.

Time: 2706.14

I'm not going down to see the Titanic.

Time: 2707.39

Goodness, you weren't doing any of this.

Time: 2709.12

I'm not doing any of this stuff.

Time: 2710

I have no interest.

Time: 2710.56

I don't play golf.

Time: 2711.17

I don't ski.

Time: 2711.71

I have no interest in any of this stuff, right?

Time: 2715

And I know people like this, right, who are very high achievers.

Time: 2717.029

It's just like, yeah, they're completely segmented.

Time: 2719.44

They're extreme risk takers.

Time: 2720.53

In business, they're completely buttoned down on the personal side, they're

Time: 2722.9

completely buttoned down financially.

Time: 2724.96

They're scrupulous with following every rule and law you can possibly imagine,

Time: 2729.27

but they're still fantastic innovators.

Time: 2730.83

And then I know many others who are just like their life is on fire all

Time: 2734.69

the time, in every possible way.

Time: 2736.76

And whenever it looks like the fire is turning into embers, they figure out

Time: 2739.23

a way to relight the fire, and they just really want to live on the edge.

Time: 2745.19

And so I think that's an independent variable.

Time: 2748.67

And again, I would apply the same thing.

Time: 2749.81

I think the same thing applies to the arts.

Time: 2752.67

Classical music as an example.

Time: 2754.02

I think Bach was, as an example, one of the best musicians of all

Time: 2757.42

time, had just a completely sedate personal life, never had any aberrant

Time: 2761.97

behavior at all in his personal life.

Time: 2763.37

Family man, tons of kids, apparently pillar of the community.

Time: 2767.139

Right.

Time: 2767.389

And so if Bach could be Bach and yet not burn his way through 300 mistresses

Time: 2771.86

or whatever, maybe you can, too.

Time: 2775.71

Andrew Huberman: So in thinking about these two different categories of

Time: 2778.2

innovators, those that take on tremendous risk in all domains of their life and

Time: 2781.9

those that take on tremendous risk in a very compartmentalized way, I don't

Time: 2785.45

know what the percentages are, but I have to wonder if in this modern age of

Time: 2790.39

the public being far less forgiving, what I'm referring to is cancel culture.

Time: 2796.17

Do you think that we are limiting the number of innovations in total

Time: 2801.759

by just simply frightening or eliminating an enormous category of

Time: 2806.639

innovators because they don't have the confidence or the means or the

Time: 2811.58

strategies in place to regulate?

Time: 2813.8

So they're just either bowing out or they're getting crossed off,

Time: 2816.84

they're getting canceled one by one.

Time: 2818.34

Marc Andreessen: So do you think the public is less tolerant than

Time: 2820.74

they used to be or more tolerant?

Time: 2823.88

Andrew Huberman: Well, the systems that, I'm not going to be careful here.

Time: 2828.41

I think the large institution systems are not tolerant of what the public

Time: 2836.28

tells them they shouldn't be tolerant of.

Time: 2840.259

And so if there's enough noise in the mob,

Time: 2843.75

I think institutions bow out.

Time: 2845.62

And here I'm referring not just to, they essentially say, okay,

Time: 2849.09

let the cancellation proceed.

Time: 2851.78

Maybe they're the gavel that comes down, but they're not the

Time: 2855.63

lever that got the thing going.

Time: 2857.24

And so I'm not just thinking about universities.

Time: 2858.99

I'm also thinking about advertisers.

Time: 2860.64

I'm thinking about the big movie houses that cancel a film that a

Time: 2864.54

given actor might be in because they had something in their personal life

Time: 2867.12

that's still getting worked out.

Time: 2868.17

I'm thinking about people who are in a legal process that's not

Time: 2871.93

yet resolved, but the public has decided they're a bad person, etc.

Time: 2876.139

Marc Andreessen: My question is, are we really talking about the public?

Time: 2878.84

I agree with your question, and I'm going to come back to it, but I'm

Time: 2882.93

going to examine one part of your question, which is, is this really

Time: 2884.78

the public we're talking about.

Time: 2886.42

And I would just say Exhibit A is who is the current frontrunner for

Time: 2889.22

the Republican nomination today?

Time: 2893.34

The public, at least on one side of the political aisle, seems very on board.

Time: 2899.34

Number two, like, look, there's a certain musician who flew too close to

Time: 2904.42

the sun, blew himself to smithereens.

Time: 2905.86

He's still hitting all time highs on music streams every month.

Time: 2910.78

The public seems fine.

Time: 2914.459

I would argue the public is actually more open to these things than

Time: 2916.97

it actually maybe ever has been.

Time: 2919.109

And we could talk about why that's the case.

Time: 2920.28

I think it's a differentiation, and this is what your question was

Time: 2922.95

aiming at, but it's a differentiation between the public and the elites.

Time: 2927.29

My view is everything that you just described is an elite phenomenon.

Time: 2930.259

And actually, the public is very much not on board with it.

Time: 2933.94

So what's actually happening is what's happened is the public

Time: 2936.72

and the elites have gapped out.

Time: 2938.76

The public is more forgiving of what previously might have been considered kind

Time: 2942.78

of aberrant and extreme behavior, right?

Time: 2946.03

F.

Time: 2946.14

Scott Fitzgerald, "there are no second acts in American lives"

Time: 2948.23

turns out was completely wrong.

Time: 2949.599

Turns out there are second acts, third acts, fourth acts.

Time: 2951.42

Apparently you can have an unlimited number of acts.

Time: 2952.84

The public is actually up for it.

Time: 2954.13

Yeah.

Time: 2954.34

Andrew Huberman: I mean, I think of somebody like Mike Tyson, right?

Time: 2956.93

I feel like his life exemplifies everything.

Time: 2961.23

That's amazing and great and also terrible about America.

Time: 2964.9

Marc Andreessen: If we took Mike Tyson to dinner tonight at any restaurant anywhere

Time: 2967.96

in the United States, what would happen?

Time: 2969.43

Andrew Huberman: He would be loved.

Time: 2970.109

Marc Andreessen: Oh, he would be like, the outpouring of enthusiasm and

Time: 2974.38

passion and love would be incredible.

Time: 2977.53

It would be unbelievable.

Time: 2978.72

This is a great example.

Time: 2980.75

And again, I'm not even going to draw more.

Time: 2982.19

I'm not even going to say I agree with that or disagree with that.

Time: 2984.19

I think we all intuitively know that the public is just like, 100%, absolutely.

Time: 2988.67

He's a legend. He's a living legend.

Time: 2989.34

He's like a cultural touchstone.

Time: 2992.38

Absolutely.

Time: 2992.81

And you see it when he shows up in movies, right?

Time: 2994.79

I don't remember the, I mean, the big breakthrough where I figured this out

Time: 2997.35

with respect to him because I don't really follow sports, but when he showed up in

Time: 3000.34

that, it was that first Hangover movie, and he shows up and I was in a theater

Time: 3004.44

and the audience just goes, bananas crazy.

Time: 3007.43

They're so excited to see him.

Time: 3008.93

Andrew Huberman: He evokes delight.

Time: 3010.12

I always say that Mike Tyson is the only person I'm aware of that can

Time: 3012.76

wear a shirt with his own name on it, and it somehow doesn't seem wrong.

Time: 3018.15

In fact, it just kind of makes you like him more.

Time: 3021.76

His ego feels very contoured in a way that he knows who he is and who he was, and yet

Time: 3028.63

there's a humbleness woven in, maybe as a consequence of all that he's been through.

Time: 3032.49

I don't know.

Time: 3033.92

But, yeah, people love Mike.

Time: 3036.05

Marc Andreessen: Public loves him now.

Time: 3037.37

Exactly.

Time: 3037.83

Now, if he shows up to lecture at Harvard, right, I think you're probably

Time: 3041.08

going to get a different reaction?

Time: 3041.989

[LAUGHS] Andrew Huberman: I don't know.

Time: 3043.08

I don't know!

Time: 3044.89

You know, the guy who wrote The Wire gave a talk at Harvard, and it sounded

Time: 3049.88

to me, based on his report of that, which is very interesting, in fact,

Time: 3054.52

that people adore people who are connected to everybody in that way.

Time: 3062.78

I feel like everybody loves Mike.

Time: 3064.08

From above his status, from the sides, from below his status, he occupies

Time: 3070.19

this halo of love and adoration.

Time: 3072.95

Marc Andreessen: Okay.

Time: 3073.299

Andrew Huberman: All right.

Time: 3074.639

Marc Andreessen: Yeah.

Time: 3074.839

Look, the other side of this is the elites, and you kind of alluded

Time: 3077.7

to this, of the institution.

Time: 3078.84

So basically, it's like the people who are at least nominally in charge or

Time: 3081.75

feel like they should be in charge.

Time: 3082.97

Andrew Huberman: I want to make sure we define elite.

Time: 3084.64

So you're not necessarily talking about people who are wealthy.

Time: 3086.43

You're talking about people who have authority within institutions.

Time: 3089.33

Marc Andreessen: So the ultimate definition of an elite is

Time: 3091.13

who can get who fired, right.

Time: 3094.809

That's the ultimate test.

Time: 3095.86

Who can get who fired, boycotted, blacklisted, ostracized.

Time: 3098.77

Prosecuted, jailed, like, when push comes to shove.

Time: 3103.7

I think that's always the question, who can destroy whose career?

Time: 3106.27

And of course, you'll notice that that is heavily asymmetric

Time: 3109.29

when these fights play out.

Time: 3110.13

Like, it's very clear which side can get the other side fired and which side can't.

Time: 3114.27

And so, yeah, so, look, I think we live in a period of time where

Time: 3116.43

the elites have gotten to be extreme in a number of dimensions.

Time: 3119.16

I think it's characterized by, for sure, extreme groupthink, extreme

Time: 3124.389

sanctimony, extreme moral, I would say dudgeon, this weird sort of modern

Time: 3130.949

puritanism, and then an extreme sort of morality of punishment and terror

Time: 3135.19

against their perceived enemies.

Time: 3137.35

But I want to go through that because I actually think that's

Time: 3139.94

a very different phenomenon.

Time: 3141.069

I think what's happening at the elites is very different than what's

Time: 3142.78

happening in the population at large.

Time: 3144.91

And then, of course, I think there's a feedback loop in there, which

Time: 3147.53

is, I think the population at large is not on board with that program.

Time: 3151.19

Right.

Time: 3151.5

I think the elites are aware that the population is not

Time: 3153.549

on board with that program.

Time: 3154.36

I think they judge the population negatively as a consequence, that causes

Time: 3157.65

the elites to harden their own positions.

Time: 3159.34

That causes them to be even more alienating to the population.

Time: 3162.15

And so they're in sort of an oppositional negative feedback loop.

Time: 3165.799

But again, it's a sort of question, okay, who can get who fired?

Time: 3170.5

And so elites are really good at getting normal people fired.

Time: 3175.32

Ostracized, banned, hit pieces in the press, like, whatever.

Time: 3179.49

For normal people to get elites fired, they have to really band together, right.

Time: 3183.259

And really mount a serious challenge, which mostly doesn't happen, but might

Time: 3186.86

be starting to happen in some cases.

Time: 3188.28

Andrew Huberman: Do you think this power of the elites stemmed

Time: 3194.18

from social media sort of going against its original purpose?

Time: 3198.01

I mean, when you think social media, you think you're giving each

Time: 3200.009

and every person their own little reality TV show, their own voice.

Time: 3204.32

And yet we've seen a dramatic uptick in the number of cancellations and

Time: 3209.75

firings related to immoral behavior based on things that were either

Time: 3214.51

done or amplified on social media.

Time: 3217.139

It's almost as if the public is holding the wrong end of the knife.

Time: 3221.15

Marc Andreessen: Yeah, so the way I describe it, I use these two terms,

Time: 3225.8

and they're somewhat interchangeable, but elites and institutions.

Time: 3228.3

And then they're somewhat interchangeable because who runs the institutions?

Time: 3230.67

The elites, right?

Time: 3231.379

And so it's sort of a self reinforcing thing.

Time: 3235.479

And institutions of all kinds.

Time: 3236.53

Institutions, everything from the government, bureaucracies, companies,

Time: 3239.57

nonprofits, foundations, NGOs, tech companies, on and on and on.

Time: 3243.36

Like people who are in charge of big complexes and that carry a lot of,

Time: 3248.45

basically, power and influence and capability and money as a consequence

Time: 3251.89

of their positional authority.

Time: 3253.759

So the head of a giant foundation may never have done anything in their

Time: 3257.44

life that would cause somebody to have a high opinion of them as a person.

Time: 3259.74

But they're in charge of this gigantic multi billion dollar

Time: 3262.16

complex and have all this power.

Time: 3263.61

And so that's just to define terms, elites and institutions.

Time: 3268.309

So, it's actually interesting.

Time: 3269.57

Gallup has been doing polls on the question of trust

Time: 3274.31

in institutions, which is sort of therefore a proxy for trust in elites,

Time: 3278.23

basically since the early 1970s.

Time: 3281.49

And they do this across all the categories of big institutions, basically everyone.

Time: 3285.349

I just talked about a bunch of others.

Time: 3286.59

Big business, small business, banks, newspapers, broadcast

Time: 3289.48

television, the military, police.

Time: 3292.23

So they've got like 30 categories or something.

Time: 3293.979

And basically what you see is almost all the categories basically started in

Time: 3297.65

the early 70s at like 60 or 70% trust.

Time: 3300.2

And now almost across the board, they've just had a complete,

Time: 3304.5

basically linear slide down for 50 years, basically my whole life.

Time: 3309.15

And they're now bottoming out.

Time: 3311.869

Congress and journalists bottom out at like 10%.

Time: 3316.58

The two groups everybody hates are Congress and journalists.

Time: 3319.44

And then it's like a lot of other big institutions are

Time: 3321.9

like, in their 20s, 30s, 40s.

Time: 3324.25

Actually, big business actually scores fairly high.

Time: 3326.03

Tech actually scores quite high.

Time: 3327.63

The military scores quite high.

Time: 3329.08

But basically everything else has really caved in.

Time: 3332.35

This is sort of my fundamental challenge to everybody who basically says, and

Time: 3335.63

you didn't do this, but you'll hear the simple form of this, which is social

Time: 3338.91

media caused the current trouble.

Time: 3341.54

And let's call this an example, collapse in faith in institutions and elites.

Time: 3345.7

Let's call that part of the current trouble.

Time: 3347.969

Everybody's like, well, social media caused that.

Time: 3349.136

I was like, well, no, social media, social media is new, right?

Time: 3352.37

In the last...

Time: 3353.13

social media is effectively new, practically speaking, since 2010,

Time: 3356.39

2012 is when it really took off.

Time: 3358.189

And so, if the trend started in the early 1970s and has been continuous, then

Time: 3362.06

we're dealing with something broader.

Time: 3364.029

Martin Gurri wrote, I think, the best book on this, called The Revolt of the Public,

Time: 3368.55

where he goes through this in detail.

Time: 3371.19

He does say that social media had a lot to do with what's

Time: 3373.47

happened in the last decade.

Time: 3374.799

But he says, yeah, if you go back, you look further, it was

Time: 3376.78

basically two things coinciding.

Time: 3379.21

One was just a general change in the media environment.

Time: 3381.469

And in particular, the 1970s, and especially the 1980s,

Time: 3385.52

is when you started to get specifically talk radio, which was a new outlet.

Time: 3389.27

And then you also got cable television.

Time: 3392.87

And then you also, by the way, it's actually interesting in that you had

Time: 3395.329

paperback books, which was another one of these, which was an outlet.

Time: 3397.639

So you had like a fracturing in the media landscape that started in the

Time: 3400.889

50s, and then, of course, the Internet blew it wide open.

Time: 3405.16

Having said that, if the elites and the institutions were fantastic,

Time: 3407.93

you would know it more than ever.

Time: 3410.15

Information is more accessible.

Time: 3411.27

And so the other thing that he says, and I agree with, is the public is

Time: 3414.54

not being tricked into thinking the elites and institutions are bad.

Time: 3417.26

They're learning that they're bad, and therefore, the mystery of the Gallup

Time: 3421.79

poll is why those numbers aren't all just zero, which is arguably, in a

Time: 3426.42

lot of cases, where they should be.

Time: 3428.059

Andrew Huberman: I think one reason that--

Time: 3429.49

Marc Andreessen: --By the way, he thinks this is bad.

Time: 3430.79

So he and I have a different view.

Time: 3432.139

So here's where he and I disagree.

Time: 3433.4

He thinks this is bad.

Time: 3434.309

So he basically says, you can't replace elites with nothing.

Time: 3438.059

You can't replace institutions with nothing, because what you're just left

Time: 3441.65

with is just going to be wreckage.

Time: 3442.72

You're going to be left with a completely, basically atomized, out of control

Time: 3445.97

society that has no ability to marshal any sort of activity in any direction.

Time: 3449.69

It's just going to be a dog eat dog awful world.

Time: 3452.96

I have a very different view on that which we can talk about.

Time: 3455.67

Andrew Huberman: Yeah, I'd love to hear your views on that.

Time: 3458.36

I'd like to take a quick break and acknowledge our sponsor, InsideTracker.

Time: 3462.17

InsideTracker is a personalized nutrition platform that analyzes

Time: 3465.14

data from your blood and DNA to help you better understand your body and

Time: 3468.7

help you meet your health goals.

Time: 3470.44

I'm a big believer in getting regular blood work done for the simple reason

Time: 3474

that many of the factors that impact your immediate and long term health can only

Time: 3477.9

be analyzed from a quality blood test.

Time: 3479.89

However, with a lot of blood tests out there, you get information back

Time: 3483.03

about blood lipids, about hormones and so on, but you don't know

Time: 3485.38

what to do with that information.

Time: 3486.969

With InsideTracker, they have a personalized platform that makes it very

Time: 3489.99

easy to understand your data, that is, to understand what those lipids, what

Time: 3494.11

those hormone levels, etc., mean, and offers behavioral, supplement, nutrition, and

Time: 3498.34

other protocols to adjust those numbers to bring them into the ranges that are ideal

Time: 3502.48

for your immediate and long term health.

Time: 3504.06

InsideTracker's Ultimate Plan now includes measures of both ApoB and insulin,

Time: 3508.69

which are key indicators of cardiovascular health and energy regulation.

Time: 3512.92

If you'd like to try InsideTracker, you can visit insidetracker.com/huberman to

Time: 3517.25

get 20% off any of InsideTracker's plans.

Time: 3519.94

Again, that's insidetracker.com/huberman to get 20% off.

Time: 3524.799

The quick question I was going to ask before we go there is, I think that

Time: 3528.55

one reason that I and many other people sort of reflexively assume that social

Time: 3533.35

media caused the demise of our faith in institutions is, well, first of all, I

Time: 3539.46

wasn't aware of this lack of correlation between the decline in faith in

Time: 3545.47

institutions and the rise of social media.

Time: 3548.25

But secondarily, we've seen some movements that have essentially

Time: 3554.38

rooted themselves in tweets, in comments, in posts that get amplified,

Time: 3560.78

and those tweets and comments and posts come from everyday people.

Time: 3564.43

In fact, I can't name one person who initiated a given cancellation or

Time: 3570.27

movement because it was the sort of dogpiling or mob adding-on to some

Time: 3574.16

person that was essentially anonymous.

Time: 3575.37

So I think that for many of us, we have, to use neuroscience language,

Time: 3578.43

sort of a bottom-up perspective: oh, someone sees something in their

Time: 3583.52

daily life or experiences something in their daily life, and they tweet about

Time: 3586.87

it or they comment about it or they post about it, and then enough people

Time: 3592.23

dogpile on the accused that it picks up force, and then the elites feel

Time: 3598

compelled, obligated to cancel somebody.

Time: 3603.22

That tends to be the narrative.

Time: 3604.68

And so I think the logical conclusion is, oh, social media

Time: 3607.48

allows for this to happen.

Time: 3608.82

Whereas normally someone would just be standing on the corner shouting

Time: 3611.27

or calling lawyers that don't have faith in them, and you've got the Erin

Time: 3614.95

Brockovich model that turns into a movie.

Time: 3618.99

But that's a rare case of this lone woman who's got this idea in mind about how a

Time: 3623.24

big institution is doing wrong or somebody is doing wrong in the world and then can

Time: 3627.91

leverage the big institution, excuse me.

Time: 3630.219

But the way that you describe it is that the elites are leading this shift.

Time: 3636.43

So what is the role of the public in it?

Time: 3640.21

Just to give it a concrete example, if, for instance, no one tweeted or

Time: 3647.27

commented on MeToo, or no one tweeted or commented about some ill behavior

Time: 3653.43

of some, I don't know, university faculty member or business person,

Time: 3658.94

would the elite have come down on them?

Time: 3660.66

Marc Andreessen: Anyway, what's happening,

Time: 3664.27

based on what I've seen over the years, is that there is so much astroturfing right now.

Time: 3670.69

There are entire categories of people who are paid to do this.

Time: 3674.02

Some of them we call journalists, some of them we call activists,

Time: 3676.41

some of them we call NGOs or nonprofits.

Time: 3678.85

Some of them we call university professors, some of them we

Time: 3680.929

call grad students, whatever, they're paid to do this.

Time: 3684.21

I don't know if you've ever looked into the misinformation industrial complex?

Time: 3687.62

There's this whole universe of basically these funded groups

Time: 3689.969

that basically do misinformation.

Time: 3693.25

And they're constantly mounting these kinds of attacks.

Time: 3696.11

They're constantly trying to gin up this kind of basically panic

Time: 3698.679

to cause somebody to get fired.

Time: 3699.649

Andrew Huberman: So it's not a grassroots--

Time: 3701.55

Marc Andreessen: --No. It's the opposite of grassroots.

Time: 3703.009

No.

Time: 3703.719

You're almost always going to be able to trace these things back.

Time: 3705.559

It was a journalist, it was an activist, it was a public figure of some kind.

Time: 3711.94

These are entrepreneurs in a sort of a weird way.

Time: 3715.92

Basically their job, mission, calling, is all wrapped up together

Time: 3720.77

like they're true believers, but they're also getting paid to do it.

Time: 3723.299

And there's a giant funding, I mean, there's a very large funding

Time: 3725.65

complex for this coming from certain high profile people who put

Time: 3729.58

huge amounts of money into this.

Time: 3730.46

Andrew Huberman: Is this well known?

Time: 3731.52

Marc Andreessen: Yes.

Time: 3731.89

Well, it is in my world.

Time: 3733.35

So this is what the social media companies have been on the receiving

Time: 3736.04

end of for the last decade.

Time: 3739.57

It's basically a political media activism complex with very deep pockets behind it.

Time: 3743.91

And you've got people who basically, literally have people who sit all day

Time: 3746.86

and watch the TV network on the other side or watch the Twitter feeds on the

Time: 3749.7

other side, and they basically wait.

Time: 3751.69

It's like every politician, this has been the case for a long time now.

Time: 3754.13

Every politician who goes out and gives stump speeches, you'll see there's always

Time: 3756.67

somebody in the crowd with a camcorder or now with a phone recording them.

Time: 3759.549

And that's somebody the other campaign has paid to

Time: 3762.83

just be there and record every single thing the politician says.

Time: 3766.04

So that when a Mitt Romney says, whatever, the 47% thing, they've

Time: 3768.74

got it on tape, and then they clip it, and they try to make it viral.

Time: 3771.75

And again, look, these people believe what they're doing.

Time: 3775.78

I'm not saying it's even dishonest.

Time: 3776.96

Like, these people believe what they're doing.

Time: 3778.059

They think they're fighting a holy war.

Time: 3779.19

They think they're protecting democracy.

Time: 3780.49

They think they're protecting civilization.

Time: 3781.91

They think they're protecting whatever it is they're protecting.

Time: 3784.7

And then they know how to use the tools, and so they know how

Time: 3788.45

to try to gin up the outrage.

Time: 3790.24

And then, by the way, sometimes it works in social cascades.

Time: 3792.969

Sometimes it works, sometimes it doesn't.

Time: 3794.889

Sometimes they cascade, sometimes they don't.

Time: 3796.74

But if you follow these people on Twitter, this is what they do every day.

Time: 3800.28

They're constantly trying to, like, light this fire.

Time: 3803.43

Andrew Huberman: I assume that it was really bottom up, but it sounds like

Time: 3806.04

it's sort of middle level, and that it captures the elites, and then the

Time: 3810.21

thing takes on a life of its own.

Time: 3811.599

Marc Andreessen: By the way, it also intersects with the trust and safety

Time: 3813.36

groups at the social media firms who are responsible for figuring out who gets

Time: 3816.81

promoted and who gets banned across this.

Time: 3819.04

And you'll notice one large social media company has recently changed

Time: 3821.69

hands and has implemented a different set of trust and safety policies.

Time: 3826.59

And all of a sudden, a different kind of boycott movement has

Time: 3828.54

started to work that wasn't working before.

Time: 3830.83

And another kind of boycott movement is not working as well anymore.

Time: 3834.16

And so, for sure, there's an intermediation happening.

Time: 3837.18

Look, the stuff that's happening in the world today is being intermediated

Time: 3839.77

through social media, because social media is the defining media of our time.

Time: 3843.349

But there are people who know how to do this and do this for a living.

Time: 3846.98

No, I view very much the cancellation wave, like, this whole thing, it's an

Time: 3854.3

elite phenomenon, and when it appears to be a grassroots thing, it's either

Time: 3859.05

grassroots among the elites, which is possible because there's a fairly

Time: 3862.62

large number of people who are signed up for that particular crusade, but

Time: 3866.58

there's also a lot of astroturfing that's taking place inside that.

Time: 3869.199

The question is, okay, at what point does the population at

Time: 3871.29

large get pulled into this?

Time: 3872.96

And maybe there are movements, certain points in time where they

Time: 3875.98

do get pulled in, and then maybe later they get disillusioned.

Time: 3878.12

And so then there's some question there.

Time: 3879.44

And then there's another question of like, well, if the population at

Time: 3881.92

large is going to decide what these movements are, are they going to be the

Time: 3884.06

same movements that the elites want?

Time: 3886.31

And how are the elites going to react when the population

Time: 3888.76

actually fully expresses itself?

Time: 3891.77

Like I said, there's a feedback loop between these where the more extreme

Time: 3894.42

the elites get, they tend to push the population to more extreme views

Time: 3896.815

on the other side and vice versa.

Time: 3898.36

So it ping pongs back and forth.

Time: 3899.99

And so, yeah, this is our world.

Time: 3901.99

Andrew Huberman: Yeah, this explains a lot.

Time: 3905.199

Marc Andreessen: I want to mention that Shellenberger, Matt Taibbi, a bunch

Time: 3908.97

of these guys have done a lot of work.

Time: 3911.57

If you just look into what's called the misinformation industrial complex,

Time: 3914.95

you'll find a network of money and power that is really quite amazing.

Time: 3918.75

Andrew Huberman: I've seen more and more Shellenberger showing up.

Time: 3922.19

Marc Andreessen: Right.

Time: 3922.45

And he's just, look, he's just on this stuff.

Time: 3924.139

He and these guys are literally just tracking the money.

Time: 3927.92

It's very clear how the money flows, including a remarkable amount of money

Time: 3931.26

out of the government, which is, of course, in theory, very concerning.

Time: 3935.94

Andrew Huberman: Very interesting.

Time: 3936.63

Marc Andreessen: The government should not be funding programs that take

Time: 3939.08

away people's constitutional rights.

Time: 3940.369

And yet somehow that is what's been happening.

Time: 3944.17

Andrew Huberman: Very interesting.

Time: 3946.13

I want to make sure that I hear your ideas about why the decline

Time: 3950.73

in confidence in institutions is not necessarily problematic.

Time: 3956.2

Is this going to be a total destruction, burning down of the

Time: 3959.84

forest that will lead to new life?

Time: 3961.63

Is that your view?

Time: 3963.11

Marc Andreessen: Well, so this is the thing.

Time: 3964.27

And look, there's a question if you're, there's a couple of questions in here,

Time: 3966.6

which is like, how bad is it really?

Time: 3969.984

How bad are they?

Time: 3970.96

Right. And I think they're pretty bad.

Time: 3973.53

A lot of them are actually pretty bad.

Time: 3976.49

So that's one big question.

Time: 3977.469

And then, yeah, look, the other question is like, okay, if the institution has gone

Time: 3980.72

bad or a group of elites have gone bad, it's this wonderful word, reform, right?

Time: 3985.33

Can they be reformed?

Time: 3986.13

And everybody always wants to reform everything, and yet somehow nothing

Time: 3988.63

ever quite gets reformed.

Time: 3990.96

And so people have been trying to reform housing policy in the Bay Area for

Time: 3993.56

decades, and we're not building.

Time: 3995.529

We're building fewer houses than ever before.

Time: 3996.95

So somehow reform movements seem to lead to just more bad stuff.

Time: 4000.98

But anyway, yeah.

Time: 4001.58

So if you have an existing institution, can it be reformed?

Time: 4003.66

Can it be fixed from the inside?

Time: 4005.34

What's happened in universities?

Time: 4007.01

There are professors at Stanford as an example, who very much

Time: 4009.06

think that they can fix Stanford.

Time: 4011.509

Like, I don't know what you think.

Time: 4012.77

It doesn't seem like it's going in productive directions right now.

Time: 4016.2

Andrew Huberman: Well, I mean, there are many things about Stanford

Time: 4017.9

that function extremely well.

Time: 4019.64

It's a big institution.

Time: 4020.92

It's certainly got its issues like any other place.

Time: 4023.679

They're also my employer, Marc's giving me some interesting looks.

Time: 4026.28

He wants me to get a little more vocal.

Time: 4027.679

Marc Andreessen: I didn't mean to put you on the spot.

Time: 4031.58

Yeah.

Time: 4031.82

Andrew Huberman: I mean, one of the things about being a researcher

Time: 4033.5

at a big institution like Stanford is, well, first of all, it meets

Time: 4036.54

the criteria that you described.

Time: 4038.24

You know, you look to the left, you look to the right or anywhere above or

Time: 4041.11

below you, and you have excellence.

Time: 4043.679

Right?

Time: 4044.009

I mean, I've got a Nobel Prize winner below me whose daddy also won

Time: 4046.62

a Nobel Prize, and his scientific offspring is likely to win.

Time: 4050.29

I mean, it inspires you to do bigger things than one

Time: 4054.65

ordinarily would, no matter what.

Time: 4056.29

So there's that, and that's great.

Time: 4057.87

And that persists.

Time: 4059.48

There's all the bureaucratic red tape; trying to get things done and figuring out how

Time: 4063.78

to implement decisions is very hard, and there are a lot of reasons for that.

Time: 4067.619

And then, of course, there are the things that many people are aware of.

Time: 4070.96

There are public accusations about people in positions of great leadership,

Time: 4075.51

and that's getting played out.

Time: 4076.419

And the whole thing becomes kind of overwhelming and a little bit

Time: 4080.1

opaque when you're just trying to run your lab or live your life.

Time: 4083.43

And so I think one of the reasons for this lack of reform that you're

Time: 4086.25

referring to is because there's no position of reformer, right?

Time: 4091.9

So deans are dealing with a lot of issues.

Time: 4094.91

Provosts are dealing with a lot of issues.

Time: 4096.349

Presidents are dealing with a lot of issues, and then some in some cases.

Time: 4100.2

And so we don't have a dedicated role of reformer, someone to go in and

Time: 4105.25

say, listen, there's just a lot of fat on this and we need to trim it

Time: 4108.97

or we need to create this or do that.

Time: 4110.779

There just isn't a system to do that.

Time: 4113.279

And that's, I think in part, because universities are built on old systems,

Time: 4117.37

and it's like the New York subway.

Time: 4120.02

It's amazing it still works as well as it does, and yet it's

Time: 4123.29

got a ton of problems also.

Time: 4126.22

Marc Andreessen: So, we could debate the university specifically,

Time: 4128.97

but the point is, look: number one, you have to figure out

Time: 4131.811

whether you think institutions are going bad.

Time: 4132.99

The population largely does think that.

Time: 4134.83

And at the very least, the people

Time: 4138.379

who run institutions ought to really think hard about what that means.

Time: 4140.779

Andrew Huberman: But people still strive to go to these places.

Time: 4143.85

And I still hear from people who, for instance, did not go to

Time: 4147.02

college, are talking about how a university degree is useless.

Time: 4150.04

They'll tell you how proud they are that their son or daughter is going

Time: 4152.45

to Stanford or is going to UCLA or is going to Urbana Champaign.

Time: 4156.31

I mean, it's almost like, to me, that's always the most shocking contradiction,

Time: 4161.14

is like, these institutions don't matter.

Time: 4163.31

But then when people want to hold up a card that says why their

Time: 4166.149

kid is great, it's not about how many pushups they can do or that

Time: 4169.529

they started their own business.

Time: 4170.63

Most of the time it's they're going to this university.

Time: 4173.759

And I think, well, what's going on here?

Time: 4175.619

Marc Andreessen: So do you think the median voter in the United States

Time: 4177.3

can have their kid go to Stanford?

Time: 4178.51

Andrew Huberman: No.

Time: 4180.73

Marc Andreessen: Do you think the median voter in the United States

Time: 4181.95

could have their kid admitted to Stanford, even with a perfect SAT?

Time: 4185.8

Andrew Huberman: No, no.

Time: 4187.879

In this day and age, the competition is so fierce that it requires more.

Time: 4191.259

Marc Andreessen: Yeah.

Time: 4191.529

So first of all, again, we're dealing here.

Time: 4194.129

Yes.

Time: 4194.4

We're dealing with a small number of very elite institutions.

Time: 4197.26

People may admire them or not.

Time: 4199.39

Most people have no connectivity to them whatsoever.

Time: 4201.93

In the statistics, in the polling, universities are not doing well.

Time: 4206.57

The population at large, yeah, they may have fantasies about their

Time: 4208.861

kid going to Stanford, but the reality of it is they have a very

Time: 4211.839

collapsing view of these institutions.

Time: 4213.25

So anyway, this actually goes straight to the question of alternatives then, right?

Time: 4217.63

Which is like, okay, if you believe that there's collapsing faith in the

Time: 4220.24

institutions, if you believe that it is merited, at least in some ways, if

Time: 4223.67

you believe that reform is effectively impossible, then you are faced...

Time: 4227.94

We could debate each of those, but the population at large

Time: 4230.25

seems to believe a lot of that.

Time: 4232.45

Then there's a question of like, okay, can it be replaced?

Time: 4235.09

And if so, are you better off replacing these things basically,

Time: 4238.34

while the old things still exist?

Time: 4239.5

Or do you actually need to basically clear the field to be

Time: 4241.99

able to have the new thing exist?

Time: 4244.059

The universities are a great case study of this because of

Time: 4246.69

how student loans work, right?

Time: 4248.059

And the way student loans work is, to be an actually competitive university

Time: 4253.41

and compete, you need to have access to federal student lending.

Time: 4255.53

Because if you don't, everybody has to pay out of pocket.

Time: 4257.47

And it's completely out of reach for anybody other than a certain class of

Time: 4260.61

either extremely rich or foreign students.

Time: 4262.38

So you need access to a federal student loan facility.

Time: 4264.789

To get access to a federal student loan facility, you need

Time: 4266.67

to be an accredited university.

Time: 4269.7

Guess who runs the accreditation council?

Time: 4271.25

Andrew Huberman: I don't know.

Time: 4272.4

Marc Andreessen: The existing universities, right?

Time: 4274.33

So it's a self laundering machine.

Time: 4277.41

Like they decide who the new universities are.

Time: 4279.01

Guess how many new universities get accredited each year, to be able...

Time: 4283.19

Andrew Huberman: Zero.

Time: 4283.55

Marc Andreessen: Zero, right?

Time: 4285.11

And so as long as that system is in place, and as long as they have the government

Time: 4289.33

wired the way that they do, and as long as they control who gets access to

Time: 4292.9

federal student loan funding, of course there's not going to be any competition.

Time: 4296.65

Of course there can't be a new institution that's going to be able to get to scale.

Time: 4299.79

It's just not possible.

Time: 4300.529

And so if you actually wanted to create a new system that was better

Time: 4304.32

in, you know, I would argue dozens or hundreds of ways, it could obviously be

Time: 4307.53

better if you were starting it today.

Time: 4309.42

It probably can't be done as long as the existing institutions are actually intact.

Time: 4313.34

And this is my counter to Martin, which is like, yeah, look, if we're going

Time: 4317.42

to tear down the old, there may be a period of disruption before we get to

Time: 4320.06

the new, but we're never going to get to the new if we don't tear down the old.

Time: 4323.03

Andrew Huberman: When you say counter to Martin, you're talking about

Time: 4324.66

the author of The Revolt of the Public?

Time: 4326.159

Marc Andreessen: Yeah, Martin Gurri.

Time: 4326.496

What Martin Gurri says is like, look, he said basically as follows, the elites

Time: 4332.17

deserve contempt, but the only thing worse than these elites that deserve

Time: 4337.02

contempt would be no elites at all.

Time: 4340.12

And he basically says on the other side of the destruction of the elites

Time: 4344.34

and the institutions is nihilism.

Time: 4346.25

You're basically left with nothing.

Time: 4347.33

And by the way, there is a nihilistic streak.

Time: 4348.98

I mean, there's a nihilistic streak in the culture and the politics today.

Time: 4351.49

There are people who basically would just say, yeah, just tear

Time: 4353.66

the whole system down without any particular plan for what follows.

Time: 4357.76

And so I think he makes a good point and that you want to be careful that you

Time: 4361.65

actually have a plan on the other side that you think is actually achievable.

Time: 4364.7

But again, the counterargument to that is if you're not willing

Time: 4368.059

to actually tear down the old, you're not going to get to the new.

Time: 4370.66

Now, what's interesting, of course, is this is what happens

Time: 4373.23

every day in business, right?

Time: 4374.71

So the entire way, how do you know that the capitalist system works?

Time: 4379.13

The way that you know is that the old companies, when they're no longer like

Time: 4382.62

the best at what they do, they get torn down and then they ultimately die and

Time: 4386.219

they get replaced by better companies.

Time: 4387.54

Andrew Huberman: Yeah, I haven't seen a Sears in a while.

Time: 4389.24

Marc Andreessen: Exactly.

Time: 4390.65

And what's so interesting is, we know in capitalism, in a

Time: 4393.91

market economy, we know that's the sign of health, that's the sign of

Time: 4398.13

how the system is working properly.

Time: 4399.519

And in fact, we get actually judged by antitrust authorities

Time: 4402.59

in the government on that basis.

Time: 4404.559

It's like the best defense against antitrust charges is no, people

Time: 4407.41

are coming to kill us and they're doing a really good job of it.

Time: 4409.93

That's how we know we're doing our job.

Time: 4411.15

And in fact, in business, it is specifically

Time: 4413.95

illegal for companies in the same industry to get together and plot

Time: 4416.89

and conspire and plan and have things like these accreditation bureaus.

Time: 4420.929

If I created the equivalent in my companies of the kind of accreditation

Time: 4423.969

bureau that the universities have, I'd get sent straight to federal prison

Time: 4426.609

for an antitrust violation under the Sherman Act.

Time: 4427.88

Straight to prison.

Time: 4428.42

People have been sent to prison for that.

Time: 4430.42

So in the business world, we know that you want everything

Time: 4433.43

subject to market competition.

Time: 4434.66

We know that you want creative destruction.

Time: 4436.51

We know that you want replacement of the old with superior new.

Time: 4439.889

It's just once we get outside of business, we're like, oh, we don't want any of that.

Time: 4442.77

We want basically stagnation and logrolling and basically incestuous

Time: 4448.26

institutional entanglements and conflicts of interest as

Time: 4451.44

far as the eye can see, and then we're surprised by the results.

Time: 4454.84

Andrew Huberman: So let's play it out as a bit of a thought experiment.

Time: 4457.63

So let's say that one small banding together of people who want to start a new

Time: 4464.18

university where there is free exchange of open ideas, where unless somebody has

Time: 4470.68

egregious behavior, violent behavior, truly sexually inappropriate behavior

Time: 4475.059

against somebody, that is, committing a crime, they're allowed to be there.

Time: 4478.599

They're allowed to be a student or a faculty member or administrator.

Time: 4482.94

And let's just say this accreditation bureau allowed student loans for

Time: 4487.34

this one particular university.

Time: 4488.53

Or let's say that there was an independent source of funding for that university

Time: 4491.44

such that students could just apply there.

Time: 4493.16

They didn't need to be part of this elite, accredited group, which sounds

Time: 4497.96

very mafia-like, frankly, not necessarily violent, but certainly coercive in

Time: 4504.72

the way that it walls people out.

Time: 4507.2

Let's say that then there were 20 or 30 of those or 40 of those.

Time: 4512.799

Do you think that over time, that model would overtake the existing model?

Time: 4517.84

Marc Andreessen: Isn't it interesting that those don't exist?

Time: 4521.73

Remember Sherlock Holmes, The Dog that Didn't Bark?

Time: 4524.8

Andrew Huberman: It is interesting that they don't exist.

Time: 4526.53

Marc Andreessen: Right. So there's two possibilities.

Time: 4528.05

One is like, nobody wants that, which I don't believe.

Time: 4531.24

And then the other is like, the system is wired in a way that

Time: 4533.139

will just simply not allow it.

Time: 4534.879

And you did a hypothetical in which the system would allow it.

Time: 4537.53

And my response to that is, no, of course the system won't allow that.

Time: 4539.89

Andrew Huberman: Or the people that band together have enough money or get enough

Time: 4543.17

resources to say, look, we can afford to give loans to 10,000 students per year.

Time: 4548.519

10,000 isn't a trivial number when thinking about the size of a university.

Time: 4551.88

And most of them hopefully will graduate in four years and there'll be a turnover.

Time: 4558.78

Do you think that the great future innovators would tend to orient toward

Time: 4563.849

that model more than they currently do toward the traditional model?

Time: 4568.559

What I'm trying to get back to here is how do you think that the current model

Time: 4571.73

thwarts innovation, as well as maybe some ways that it still supports innovation?

Time: 4577.53

Certainly cancellation and the risk of cancellation from the way that we framed

Time: 4581.639

it earlier, is going to discourage the category of risk takers

Time: 4587.28

that take risk in every domain that really like to fly close to the sun

Time: 4591.83

and sometimes into the sun or are--

Time: 4593.58

Marc Andreessen: --Doing research that is just not politically palatable.

Time: 4599.24

Andrew Huberman: Right, that we can't even talk about on this podcast, probably

Time: 4602.89

without causing a distraction of what we're actually trying to talk about.

Time: 4606.03

Marc Andreessen: That gives up the whole game right there.

Time: 4607.17

Exactly.

Time: 4610.56

Andrew Huberman: I keep a file, and it's a written file because I'm afraid

Time: 4613.77

to put it into electronic form of all the things that I'm afraid to talk

Time: 4617.2

about publicly because I come from a lineage of advisors where all three died

Time: 4621.26

young, and I figure, if nothing else, I'll die, and then I'll make it into

Time: 4624.58

the world and, let's say, 5, 10 years, 20 years, and if not, I know with certainty

Time: 4630.49

I'm going to die at some point, and then we'll see where all those issues stand.

Time: 4633.74

In any event--

Time: 4634.809

Marc Andreessen: --is that list getting longer over time or shorter?

Time: 4636.54

Andrew Huberman: Oh, it's definitely getting longer.

Time: 4637.969

Marc Andreessen: Isn't that interesting?

Time: 4638.62

Andrew Huberman: Yeah, it's getting much longer.

Time: 4640.38

I mean, there are just so many issues that I would love to explore on this

Time: 4644.859

podcast with experts and that I can't explore, just even if I had a panel

Time: 4652.28

of them, because of the way that things get soundbited and segmented

Time: 4655.969

out and taken out of context, it's like the whole conversation is lost.

Time: 4659.94

And so, unfortunately, there are an immense number of equally interesting

Time: 4663.48

conversations that I'm excited to have, but it is a little disturbing.

Time: 4668.52

Marc Andreessen: Do you remember Lysenkoism?

Time: 4671.49

Andrew Huberman: No.

Time: 4672.769

Marc Andreessen: Famous in the history of the Soviet Union.

Time: 4674.59

This is the famous thing.

Time: 4675.37

So there was a geneticist named Lysenko.

Time: 4678.019

Andrew Huberman: That's why it sounds familiar, but I'm not calling to--

Time: 4680.389

Marc Andreessen: --Well, he was the guy who did communist genetics, the field

Time: 4685.51

of genetics, the Soviets did not approve of the field of genetics because, of

Time: 4688.66

course, they believed in the creation of the new man and total equality,

Time: 4691.49

and genetics did not support that.

Time: 4693.349

And so if you were doing traditional genetics, you were going to, you know, at

Time: 4697.109

the very least be fired, if not killed.

Time: 4699.58

And so this guy Lysenko stood up and said, oh, I've got Marxist genetics, right?

Time: 4702.61

I've got, like a whole new field of genetics that basically

Time: 4704.52

is politically compliant.

Time: 4705.99

And then they actually implemented that in the agriculture

Time: 4707.99

system of the Soviet Union.

Time: 4709.25

And it's the origin of one of the big reasons that the Soviet Union

Time: 4711.83

actually fell, which was they ultimately couldn't feed themselves.

Time: 4714.709

Andrew Huberman: So create a new notion of biology as it relates to genetics.

Time: 4717.92

Marc Andreessen: Politically correct biology, right?

Time: 4720.06

They not only created it, they taught it, they mandated it, they required it, and

Time: 4723.45

then they implemented it in agriculture.

Time: 4725.58

Andrew Huberman: Interesting.

Time: 4727.36

Marc Andreessen: I never understood.

Time: 4728.259

There was a bunch of things in history I never understood until the

Time: 4730.32

last decade, and that's one of them.

Time: 4731.88

Andrew Huberman: Well, I censor myself at the level of deleting certain things,

Time: 4734.9

but I don't contort what I do talk about.

Time: 4737.59

So I tend to like to play on lush, open fields.

Time: 4742

Just makes my life a lot easier.

Time: 4743.272

Marc Andreessen: But this goes to the rot.

Time: 4744.2

This goes to the rot, and I'll come back to your question, but this goes

Time: 4746.55

to the rot in the existing system, which is, by the way, I'm no different.

Time: 4749.39

I'm just like you.

Time: 4749.929

Like, I'm trying not to light myself on fire either.

Time: 4752.279

But the rot in the existing system, and by system, I mean the institutions

Time: 4755.132

and the elites, the rot is the set of things that are no longer allowed.

Time: 4759.12

I mean, that list is obviously expanding over time, and that's real, historically

Time: 4765.57

speaking, that doesn't end in good places.

Time: 4767.879

Andrew Huberman: Is this group of a particular generation that

Time: 4770.21

we can look forward to the time when they eventually die off?

Time: 4773.16

Marc Andreessen: It's a third of the Boomers plus the Millennials.

Time: 4775.01

Andrew Huberman: So, got a while.

Time: 4776.71

Marc Andreessen: Good news, bad news.

Time: 4778.17

Gen X is weird, right?

Time: 4778.919

I'm Gen X.

Time: 4779.84

Gen X is weird because we kind of slipped in the middle.

Time: 4781.84

We were kind of the, I don't know how to describe it.

Time: 4782.719

We were the kind of non-political generation kind of sandwiched between

Time: 4788.23

the Boomers and the Millennials.

Time: 4789.72

Gen Z is a very, I think, open question right now which way they go.

Time: 4793.34

I could imagine them being actually much more intense than the

Time: 4796.97

Millennials on all these issues.

Time: 4798.3

I could also imagine them reacting to the Millennials

Time: 4800.03

and being far more open minded.

Time: 4801.969

Andrew Huberman: We don't know which way it's going to go.

Time: 4803.03

Marc Andreessen: Yeah, it's going to go.

Time: 4803.799

It might be different groups of them.

Time: 4805.4

Andrew Huberman: I'm Gen X also, I'm 47, you're...?

Time: 4808.9

Marc Andreessen: 52.

Time: 4809.33

Andrew Huberman: So I grew up with some John Hughes films and so where the

Time: 4812.14

jocks and the hippies and the punks were all divided and they were

Time: 4816.04

all segmented, but then it all sort of mishmashed together a few years later.

Time: 4821.82

And I think that had a lot to do with, like you said, the sort of

Time: 4824.74

apolitical aspect of our generation.

Time: 4828.58

Marc Andreessen: The Gen X just knew the Boomers were nuts, right?

Time: 4830.31

Like, one of the great sitcoms of the era was Family Ties, right?

Time: 4837.44

With the character Michael P.

Time: 4838.32

Keaton.

Time: 4839.16

And he was just like, this guy is just like, yeah, my Boomer

Time: 4841.45

hippie parents are crazy.

Time: 4842.73

I'm just going to go into business and actually do something productive.

Time: 4845.25

There was something iconic about that character in our culture.

Time: 4847.88

And people like me were like, yeah, obviously you go into business, you

Time: 4850.67

don't go into political activism.

Time: 4851.88

And then it's just like, man, that came whipping back around

Time: 4854.58

with the next generation.

Time: 4856.15

So just to touch real quick on the university thing.

Time: 4857.73

So, look, there are people trying to do, and I'm actually going to do a

Time: 4859.92

thing this afternoon with the University of Austin, which is one of these.

Time: 4863.5

And so there are people trying to do new universities.

Time: 4866.93

Like, I would say it's certainly possible.

Time: 4868.2

I hope they succeed.

Time: 4868.929

I'm pulling for them.

Time: 4869.66

I think it'd be great.

Time: 4870.45

I think it'd be great if there were a lot more of them.

Time: 4872.15

Andrew Huberman: Who founded this university?

Time: 4873.559

Marc Andreessen: This is a whole group of people.

Time: 4874.929

I don't want to freelance on that because I don't know originally who the idea was--

Time: 4878.54

Andrew Huberman: --University of Austin, not UT Austin.

Time: 4880.59

Marc Andreessen: Yeah. So this is not UT Austin.

Time: 4882.139

It's called the University of Austin.

Time: 4883.52

Or they call it.

Time: 4884

I think it's UATX?

Time: 4888.119

And it's a lot of very sharp people associated with it.

Time: 4893.69

They're going to try, very much exactly like what you described.

Time: 4895.678

They're going to try to do a new one.

Time: 4897.24

I would just tell you the wall of opposition that they're

Time: 4899.71

up against is profound.

Time: 4901.15

And part of it is economic, which is can they ever get access

Time: 4904.36

to federal student lending?

Time: 4905.34

And I hope that they can, but it seems nearly inconceivable the

Time: 4909.08

way the system is rigged today.

Time: 4911.37

And then the other is just like they already have come under, I mean,

Time: 4917.12

anybody who publicly associates with them who is in traditional academia

Time: 4919.76

immediately gets lit on fire, and there's, you know, cancellation campaigns.

Time: 4922.7

So they're up against a wall of social ostracism.

Time: 4924.889

Andrew Huberman: Wow.

Time: 4925.57

Marc Andreessen: They're up against a wall of press attacks.

Time: 4927.92

They're up against a wall of people just like doing the thing, pouncing on,

Time: 4932.119

anytime anybody says anything, they're going to try to burn the place down.

Time: 4934.71

Andrew Huberman: This reminds me of Jerry Springer episodes and Geraldo

Time: 4938.89

Rivera episodes where it's like if a teen listened to Danzig or Marilyn

Time: 4947.34

Manson type music or Metallica, that they were considered a devil worshiper.

Time: 4952.37

Now we just laugh, right?

Time: 4953.83

We're like, that's crazy, right?

Time: 4955.21

People listen to music with all sorts of lyrics and ideas and looks.

Time: 4959.17

That's crazy.

Time: 4960.8

But there were people legitimately sent to prison.

Time: 4964.78

I think it was the West Memphis Three, right?

Time: 4966.37

These kids out in West Memphis that looked different, acted different,

Time: 4970.15

were accused of murders that eventually was made clear they clearly didn't

Time: 4974.39

commit, but they were in prison because of the music they listened to.

Time: 4978.07

I mean, this sounds very similar to that.

Time: 4979.899

And I remember seeing bumper stickers: Free the West Memphis Three!

Time: 4982.53

And I thought this was some crazy thing.

Time: 4984.69

And you look into it and this isn't, it's a little bit niche,

Time: 4987.43

but these are real lives.

Time: 4989.89

And there was an active witch hunt for people that looked

Time: 4994.38

different and acted different.

Time: 4995.61

And yet now we're sort of in this inverted world where on the one hand we're all

Time: 5001.33

told that we can express ourselves however we want, but on the other

Time: 5004.09

hand, you can't get a bunch of people together to take classes where they learn

Time: 5007.34

biology and sociology and econ in Texas.

Time: 5012.9

Wild.

Time: 5013.63

Marc Andreessen: Yes.

Time: 5014.12

Well, so the simple explanation is this is Puritanism, right?

Time: 5017.62

So this is the original American Puritanism that just works

Time: 5021.459

itself out through the system in different ways at different times.

Time: 5024.22

There's a religious phenomenon in America called the Great Awakenings.

Time: 5028.27

There will be these periods in American history where there's

Time: 5030.4

basically religiosity fades and then there will be this snapback

Time: 5032.67

effect where you'll have basically this frenzy basically, of religion.

Time: 5036.639

In the old days, it would have been tent revivals and people speaking

Time: 5039.76

in tongues and all this stuff.

Time: 5041.81

And then in the modern world, it's of the form that we're living through right now.

Time: 5045.93

And so, yeah, it's just basically these waves of sort of American religious fervor, and

Time: 5050.539

remember, religion in our time, religious impulses in our time don't get expressed

Time: 5054.38

because we live in more advanced times.

Time: 5056.389

We live in scientifically informed times.

Time: 5057.769

And so religious impulses in our time don't show up as overtly religious.

Time: 5061.76

They show up in a secularized form, which, of course, conveniently, is

Time: 5065.81

therefore not subject to the First Amendment separation of church and state.

Time: 5068.7

As long as the church is secular, there's no problem.

Time: 5072.03

But we're acting out these kind of religious scripts over and over

Time: 5074.44

again, and we're in the middle of another religious frenzy.

Time: 5077.48

Andrew Huberman: There's a phrase that I hear a lot, and I don't

Time: 5081.81

necessarily believe it, but I want your thoughts on it, which is,

Time: 5084.3

"the pendulum always swings back."

Time: 5086.5

Marc Andreessen: Yeah, not quite.

Time: 5087.74

[LAUGHS]

Time: 5087.8

Andrew Huberman: So that's how I feel, too, because--

Time: 5090.719

Marc Andreessen: --Boy, that would be great.

Time: 5091.549

Andrew Huberman: Take any number of things that we've talked about, and,

Time: 5096.05

gosh, it's so crazy the way things have gone with institutions, or it's

Time: 5100.03

so crazy the way things have gone with social media, or it's so crazy, fill

Time: 5103.33

in the blank and people will say, well, the pendulum always swings back like

Time: 5109.099

it's the stock market or something.

Time: 5111.369

After every crash, there'll be an eventual boom and vice versa.

Time: 5115.51

Marc Andreessen: By the way, that's not true either.

Time: 5117.61

Most stock markets we have are, of course, survivorship.

Time: 5120.8

It's all survivorship.

Time: 5121.15

Everything is survivor.

Time: 5121.474

Everything you just said is obviously survivorship bias.

Time: 5122.91

Right.

Time: 5123.09

So if you look globally, most stock markets, over time crash

Time: 5126.61

and burn and never recover.

Time: 5128.84

The American stock market has always recovered.

Time: 5130.71

Andrew Huberman: I was referring to the American stock market.

Time: 5132.63

Marc Andreessen: Globally, but the reason everybody refers to the

Time: 5134.57

American stock market is because it's the one that doesn't do

Time: 5136.22

that, the other 200 or whatever, crash and burn and never recover.

Time: 5141.03

Let's go check in on the Argentina stock market right now.

Time: 5143.61

I don't think it's coming back anytime soon.

Time: 5145.52

Andrew Huberman: My father is Argentine and immigrated to the US in the 1960s,

Time: 5149.46

so he would definitely agree with you.

Time: 5152.74

Marc Andreessen: Yeah.

Time: 5153.63

When their stocks crash, they don't come back.

Time: 5155.93

And then Lysenkoism, like, the Soviet Union never recovered from

Time: 5158.38

Lysenkoism, it never came back.

Time: 5160.09

It led to the end of the country, you know, literally.

Time: 5162.25

The things that took down the Soviet Union were oil and wheat.

Time: 5164.399

And the wheat thing, you can trace the crisis back to Lysenkoism.

Time: 5169.6

No, look, pendulum swings back is true only in the cases where the

Time: 5172.86

pendulum swings back, everybody just conveniently forgets all the other

Time: 5176.94

circumstances where that doesn't happen.

Time: 5178.41

One of the things people, you see this in business also, people have a really

Time: 5182.33

hard time confronting really bad news.

Time: 5185.43

I don't know if you've noticed that.

Time: 5187.91

I think every doctor who's listening right now is like, yeah, no shit.

Time: 5190.12

But have you seen in business, there are situations, that Star

Time: 5195.719

Trek, remember Star Trek? The Kobayashi Maru simulator, right?

Time: 5198.8

So the big lesson to become a Star Trek captain is you had to go through the

Time: 5201

simulation called the Kobayashi Maru, and the point was, there's no way to win.

Time: 5203.7

It's a no win scenario.

Time: 5205.88

And then it turned out like, Captain Kirk was the only

Time: 5208.02

person to ever win the scenario.

Time: 5209.16

And the way that he did it was he went in ahead of time and hacked the simulator.

Time: 5213.01

It was the only way to actually get through.

Time: 5214.389

And then there was a debate whether to fire him or make him a captain.

Time: 5216.63

So they made him a captain.

Time: 5219.459

You know, the problem is, in real life, you do get the

Time: 5222.2

Kobayashi Maru on a regular basis.

Time: 5223.61

Like, there are actual no win situations that you can't work your way out of.

Time: 5227.049

And as a leader, you can't ever cop to that, right?

Time: 5229.13

Because you have to carry things forward, and you have to look for

Time: 5231.09

every possible choice you can.

Time: 5232.5

But every once in a while, you do run into a situation where

Time: 5234.63

it's really not recoverable.

Time: 5235.619

And at least I've found people just cannot cope with that.

Time: 5239.65

What happens is they basically, then they basically just exclude it from

Time: 5242.53

their memory that it ever happened.

Time: 5245.39

Andrew Huberman: I'm glad you brought up simulators, because I want to make sure

Time: 5247.6

that we talk about the new and emerging landscape of AI artificial intelligence.

Time: 5254.83

And I could try and smooth our conversation of a moment ago with this

Time: 5261.85

one by creating some clever segue, but I'm not going to, except I'm going to ask, is

Time: 5267.98

there a possibility that AI is going to remedy some of what we're talking about?

Time: 5273.289

Let's make sure that we earmark that for discussion a little bit later.

Time: 5276.12

But first off, because some of the listeners of this podcast

Time: 5279.69

might not be as familiar with AI as perhaps they should be.

Time: 5283.06

We've all heard about artificial intelligence.

Time: 5285.36

People hear about machine learning, etc.

Time: 5287.44

But it'd be great if you could define for us what AI is.

Time: 5292.059

People almost immediately hear AI and think, okay, robots taking over.

Time: 5297.41

I'm going to wake up, and I'm going to be strapped to the bed and my organs

Time: 5300.71

are going to be pulled out of me.

Time: 5302.24

The robots are going to be in my bank account.

Time: 5304.12

They're going to kill all my children and dystopia for most.

Time: 5311.32

Clearly, that's not the way it's going to go if you believe that machines

Time: 5317.02

can augment human intelligence, and human intelligence is a good thing.

Time: 5321.099

So tell us what AI is and where you think it can take us, both good and bad.

Time: 5328.78

Marc Andreessen: So, there was a big debate when the computer was first

Time: 5332.57

invented, which is in the 1930s, 1940s, people like Alan Turing and

Time: 5336.78

John von Neumann and these people.

Time: 5338.36

And the big debate at the time was because they knew they wanted to build computers.

Time: 5342.98

They had the basic idea, and there had been, like, calculating machines before

Time: 5346.879

that, and there had been these looms that you basically programmed with punch cards.

Time: 5350.18

And so there was a prehistory to computers that had to do with building sort of

Time: 5353.6

increasingly complex calculating machines.

Time: 5355.61

So they were kind of on a track, but they knew they were going to

Time: 5357.429

be able to build, they called it a general purpose computer that could

Time: 5360.03

basically, you could program, in the way that you program computers today.

Time: 5363.3

But they had a big debate early on, which is, should the fundamental

Time: 5365.78

architecture of the computer be based on either A, like calculating machines,

Time: 5370.46

like cash registers and looms and other things like that, or should it

Time: 5374.55

be based on a model of the human brain?

Time: 5376.79

And they actually had this idea of computers modeled on the human

Time: 5379.48

brain back then, and this is this concept of so called neural networks.

Time: 5383.61

And it's actually fairly astonishing from a research standpoint.

Time: 5386.58

The original paper on neural networks actually was published in 1943.

Time: 5391.01

So they didn't have our level of neuroscience, but they actually knew

Time: 5393.349

about the neuron, and they actually had a theory of neurons interconnecting

Time: 5396.13

and synapses and information processing in the brain even back then.

Time: 5400.33

And a lot of people at the time basically said, you know what?

Time: 5403.04

We should basically have the computer from the start be modeled after the

Time: 5405.39

human brain, because if the computer could do everything that the human

Time: 5408.559

brain can do, that would be the best possible general purpose computer.

Time: 5411.17

And then you could have it do jobs, and you could have it create

Time: 5413.67

art, and you could have it do all kinds of things like humans can do.

Time: 5416.85

It turns out that didn't happen.

Time: 5419.35

In our world, what happened instead was the industry went in the other direction.

Time: 5422.85

It went basically in the model of the calculating machine or the cash register.

Time: 5425.83

And I think, practically speaking, that kind of had to be the case, because

Time: 5428.87

that was actually the technology that was practical at the time.

Time: 5433.08

But that's the path and so what we all have experiences with, up to and including

Time: 5437.4

the iPhone in our pocket, is computers built on that basically calculating

Time: 5440.75

machine model, not the human brain model.

Time: 5442.83

And so what that means is computers, as we have come to understand

Time: 5445.679

them, they're basically like mathematical savants at best.

Time: 5450.12

So they're really good at doing lots of mathematical calculations.

Time: 5454.26

They're really good at executing these extremely detailed computer programs.

Time: 5457.7

They're hyper literal.

Time: 5459.51

One of the things you learn early when you're a programmer is, as the

Time: 5462.72

human programmer, you have to get every single instruction you give

Time: 5464.79

the computer correct because it will do exactly what you tell it to do.

Time: 5468.28

And bugs in computer programs are always a mistake on the part of the programmer.

Time: 5472.059

Interesting.

Time: 5472.45

You never blame the computer.

Time: 5473.36

You always blame the programmer because that's the nature of the

Time: 5476.57

thing that you're dealing with.

Time: 5477.55

Andrew Huberman: One underscore off and the whole thing--

Time: 5479.65

Marc Andreessen: --Yeah, and it's the programmer's fault.

Time: 5480.267

And if you talk to any programmer, they'll agree with this.

Time: 5483.639

They'll be like, yeah, if there's a problem, it's my fault.

Time: 5485.469

I did it.

Time: 5486.23

I can't blame the computer.

Time: 5487.32

The computer has no judgment.

Time: 5488.52

It has no ability to interpret, synthesize, develop an independent

Time: 5492.95

understanding of anything.

Time: 5493.99

It's literally just doing what I tell it to do step by step.
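To make the hyper-literal point concrete, here is a minimal, hypothetical Python sketch (the function and variable names are invented for illustration): a single misplaced underscore in a name breaks the program, and the machine reports the literal failure rather than guessing the programmer's intent.

```python
# A hyper-literal machine: it executes exactly what is written, nothing more.

def total_cost(unit_price, quantity):
    # The programmer intended to use the parameter "unit_price" below,
    # but typed "unitprice" (one underscore off).
    return unitprice * quantity  # NameError: name 'unitprice' is not defined

try:
    total_cost(3.50, 4)
except NameError as err:
    # The computer does not infer the intent; it simply reports the literal failure.
    print("Bug, and it's the programmer's fault:", err)
```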

Time: 5497.39

So for 80 years we've had this, just this very kind of hyper

Time: 5500.309

literal kind of model computers.

Time: 5502.11

Technically, these are what are called von Neumann machines, named after

Time: 5505.54

the mathematician John von Neumann.

Time: 5507.4

They run in that way, and they've been very successful and very important,

Time: 5510.33

and our world has been shaped by them.

Time: 5512.16

But there was always this other idea out there, which is, okay, how about

Time: 5515.1

a completely different approach, which is based much more on how the

Time: 5517.79

human brain operates, or at least our kind of best understanding of

Time: 5521.43

how the human brain operates, right?

Time: 5522.829

Because those aren't the same thing.

Time: 5525.059

It basically says, okay, what if you could have a computer

Time: 5527.479

instead of being hyper literal?

Time: 5528.48

What if you could have it actually be conceptual and creative and

Time: 5533.12

able to synthesize information and able to draw judgments and able

Time: 5536.94

to behave in ways that are not deterministic but are rather creative?

Time: 5546.03

And the applications for this, of course, are endless.

Time: 5548.11

And so, for example, the self-driving car. There's no way that you can

Time: 5553.08

program a computer with rules to make it a self-driving car, you

Time: 5555.51

have to do what Tesla and Waymo and these other companies have done.

Time: 5557.809

Now you have to use, right, you have to use this other architecture,

Time: 5561.199

and you have to basically teach them how to recognize objects in

Time: 5563.84

images at high speeds, basically the same way the human brain does.

Time: 5566.86

And so those are so called neural networks running inside.
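A minimal sketch of the idea being described here, not how Tesla or Waymo actually build their systems: the toy features and labels below are invented for illustration, and the point is only that the decision rule is learned from labeled examples rather than hand-written as explicit rules.

```python
import numpy as np

# Toy, invented training data: each row is [apparent_size, speed_of_tumble]
# for an object seen in the road; label 1 = child, 0 = plastic bag.
X = np.array([[0.9, 0.2], [0.8, 0.3], [0.85, 0.1],    # child-like examples
              [0.2, 0.9], [0.3, 0.8], [0.25, 0.95]])  # bag-like examples
y = np.array([1, 1, 1, 0, 0, 0])

# A single artificial "neuron": weighted sum plus sigmoid, trained by gradient descent.
rng = np.random.default_rng(0)
w, b = rng.normal(size=2), 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    p = sigmoid(X @ w + b)           # current predictions
    grad_w = X.T @ (p - y) / len(y)  # gradient of the average cross-entropy loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

# No hand-written rule says "big and slow means child"; the weights learned it.
print(sigmoid(np.array([0.88, 0.15]) @ w + b))  # high probability: treat as a child
print(sigmoid(np.array([0.22, 0.90]) @ w + b))  # low probability: likely a bag
```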

Time: 5569.45

Andrew Huberman: So, essentially, let the machine operate based on priors.

Time: 5573.75

We almost clipped a boulder going up this particular drive, and so therefore,

Time: 5579.27

this shape that previously the machine didn't recognize as a boulder, it now

Time: 5583.17

introduces to its catalog of boulders.

Time: 5585.34

Is that a good example?

Time: 5587.38

Marc Andreessen: Let's even make it even starker for a self-driving car.

Time: 5590.2

There's something in the road.

Time: 5591.3

Is it a small child or a plastic shopping bag being blown by the wind?

Time: 5595.929

Very important difference.

Time: 5597.84

If it's a shopping bag, you definitely want to go straight through it, because

Time: 5600.7

if you deviate off course, you're going to make a fast, it's the same

Time: 5604.96

challenge we have when we're driving.

Time: 5606.01

You don't want to swerve to avoid a shopping bag because you might hit

Time: 5608.309

something that you didn't see on the side.

Time: 5609.51

But if it's a small child for sure you want to swerve, right?

Time: 5612.63

But in that moment, small children come in different shapes and descriptions and

Time: 5617.15

are wearing different kinds of clothes.

Time: 5617.93

Andrew Huberman: They might tumble onto the road the same way a bag would tumble.

Time: 5620.61

Marc Andreessen: Yeah, they might look like they're tumbling.

Time: 5621.81

And by the way, they might be wearing a Halloween mask.

Time: 5624.66

Right.

Time: 5625.38

They might not have a recognizable human face.

Time: 5628

It might be a kid with one leg.

Time: 5630.849

You definitely want to not hit those.

Time: 5635.1

This is what basically we figured out is you can't apply the rules based

Time: 5638.69

approach of a von Neumann machine to basically real life and expect the

Time: 5642.1

computer to be in any way understanding or resilient to change, to basically

Time: 5645.58

things happening in real life.

Time: 5646.84

And this is why there's always been such a stark divide between what the

Time: 5649.04

machine can do and what the human can do.

Time: 5652.159

And so, basically, what's happened is in the last decade, that second type

Time: 5655.22

of computer, the neural network based computer, has started to actually work.

Time: 5658.82

It started to work, actually, first, interestingly, in vision, recognizing

Time: 5661.92

objects in images, which is why the self-driving car is starting to work.

Time: 5664.48

Andrew Huberman: Face recognition.

Time: 5665.25

Marc Andreessen: Face recognition.

Time: 5665.85

Andrew Huberman: I mean, when I started off in visual neuroscience,

Time: 5668.04

which is really my original home in neuroscience, the idea that a computer

Time: 5673.54

or a camera could do face recognition better than a human was like a very

Time: 5678.3

low probability event based on the technology we had at the time, based

Time: 5682.67

on the understanding of the face recognition cells in the fusiform gyrus.

Time: 5685.77

Now, you would be smartest to put all your money on the machine.

Time: 5691.39

You want to find faces in airports, even with masks on and at profile

Time: 5695.97

versus straight on, machines can do it far better than almost all people.

Time: 5700.639

I mean, there are the super recognizers.

Time: 5702.429

But even they can't match the best machines.

Time: 5705.629

Now, ten years ago, what I just said was the exact reverse, right?

Time: 5709.023

Marc Andreessen: That's right, yeah.

Time: 5710.27

So faces, handwriting, and then voice, being able to

Time: 5715.23

understand voice just as a user.

Time: 5717.68

If you use Google Docs, it has a built-in voice transcription.

Time: 5719.99

They have sort of the best industry leading kind of voice transcription.

Time: 5722.59

If you use a voice transcription in Google Docs, it's breathtakingly good.

Time: 5725.43

You just speak into it and it just types what you're saying.

Time: 5728.29

Andrew Huberman: Well, that's good, because in my phone, every once in

Time: 5729.83

a while, I'll say I need to go pick up a few things and it'll say,

Time: 5732.17

I need to pick up a few thongs.

Time: 5734.14

And so Apple needs to get on board.

Time: 5737.52

Whatever the voice recognition is that Google's using--

Time: 5739.75

Marc Andreessen: --Maybe it knows you better than you think.

Time: 5743.11

Andrew Huberman: [LAUGHS] That was not the topic I was avoiding discussing.

Time: 5745.84

Marc Andreessen: No.

Time: 5746.17

So that's on the list, right?

Time: 5747.07

That's on your...

Time: 5750.139

Actually, there's a reason, actually, why Google's so good and Apple is

Time: 5752.84

not right now at that kind of thing.

Time: 5754.09

And it actually goes to actually an ideological thing, of all things.

Time: 5759.199

Apple does not permit pooling of data for any purpose, including

Time: 5763.65

training AI, whereas Google does.

Time: 5766.4

And Apple's just, like, staked their brand on privacy.

Time: 5768.81

And among that is sort of a pledge that they don't pool your data.

Time: 5772.08

And so all of Apple's AI is like, AI that has to happen locally on your phone.

Time: 5776.07

Whereas Google's AI can happen in the cloud.

Time: 5778.01

Right? It can happen across pool data.

Time: 5779.12

Now, by the way, some people think that that's bad because

Time: 5780.93

they think pooling data is bad.

Time: 5782.37

But that's an example of the shift that's happening in the industry right now,

Time: 5785.16

which is you have this separation between the people who are embracing the new

Time: 5788.58

way of training AIs and the people who basically, for whatever reason, are not.

Time: 5792.62

Andrew Huberman: Excuse me, you say that some people think it's

Time: 5794.67

bad because of privacy issues or they think it's bad because of the

Time: 5797.92

reduced functionality of that AI?

Time: 5800.15

Marc Andreessen: Oh, no. So you're definitely going to get...

Time: 5803.42

there's three reasons AIs have started to work.

Time: 5805.38

One of them is just simply larger data sets, larger amounts of data.

Time: 5809.79

Specifically, the reason why objects and images are now, the reason

Time: 5813.38

machines are now better than humans at recognizing objects, images or

Time: 5815.71

recognizing faces is because modern facial recognition AIs are trained across

Time: 5820.71

all photos on the Internet of people.

Time: 5822.8

Billions and billions and billions of photos, right?

Time: 5824.69

Unlimited number of photos of people on the Internet.

Time: 5826.939

Attempts to train facial recognition systems.

Time: 5829.02

Ten or 20 years ago, they'd be trained on thousands or tens of thousands of photos.

Time: 5833.01

Andrew Huberman: So the input data is simply much more vast.

Time: 5835.57

Marc Andreessen: Much larger.

Time: 5836.1

This is the reason to get to the conclusion on this.

Time: 5837.96

This is the reason why ChatGPT works so well.

Time: 5841.08

One of the reasons ChatGPT works so well is it's trained

Time: 5843.309

on the entire Internet of text.

Time: 5845.24

And the entire Internet of text was not something that was available for

Time: 5847.95

you to train an AI on until it came to actually exist itself, which is

Time: 5851.17

new in the last, basically decade.

Time: 5853.01

Andrew Huberman: So in the case of face recognition, I could see how

Time: 5855.65

having a much larger input data set would be beneficial if the goal is

Time: 5859.39

to recognize Marc Andreessen's face, because you are looking for signal to

Time: 5863.09

noise against everything else, right?

Time: 5865.37

But in the case of ChatGPT, when you're pooling all text on the internet and you

Time: 5870.7

ask ChatGPT to, say, construct a paragraph about Marc Andreessen's prediction of

Time: 5876.46

the future of human beings over the next ten years and the industries likely to be

Time: 5882.94

most successful, give ChatGPT that.

Time: 5886.08

If it's pooling across all text, how does it know what is

Time: 5890.68

authentically Marc Andreessen's text?

Time: 5892.91

Because in the case of face recognition, you've got a standard to work from a

Time: 5898.679

verified image versus everything else.

Time: 5902.309

In the case of text, you have to make sure that what you're starting with is

Time: 5906.32

verified text from your mouth, which makes sense if it's coming from video.

Time: 5910.96

But then if that video is deep faked, all of a sudden, what's true?

Time: 5916.259

Your valid Marc Andreessen is in question.

Time: 5921.17

And then everything ChatGPT is producing, that is then of question.

Time: 5925.83

Marc Andreessen: So I would say there's a before and after thing here.

Time: 5928.01

There's like a before ChatGPT and after GPT question, because the existence

Time: 5932.55

of GPT itself changes the answer.

Time: 5935.05

So before ChatGPT.

Time: 5936.67

So the version you're using today is trained on data up till September 2021.

Time: 5940.85

They're cut off with the training set.

Time: 5942.09

Up till September 2021, almost all text on the Internet was written by a human being.

Time: 5947.15

And then most of that was written by people under their own names.

Time: 5949.51

Some of it wasn't, but a lot of it was.

Time: 5951.66

And the way you know it's from me is because it was published in a magazine

Time: 5953.71

under my name, or it's a podcast transcript and it's under my name.

Time: 5956.879

And generally speaking, if you just did a search on what are things Marc

Time: 5959.72

Andreessen has written and said, 90% plus of that would be correct,

Time: 5963.8

and somebody might have written a fake parody article or something.

Time: 5967.39

Like that.

Time: 5967.81

But not that many people were spending that much time writing fake

Time: 5970.84

articles about things that I said.

Time: 5972.15

Andrew Huberman: Right now, so many people can pretend to be you.

Time: 5974

Marc Andreessen: Exactly right.

Time: 5974.71

And so, generally speaking, you can kind of get your arms around

Time: 5977.09

the idea that there's a corpus of material associated with me.

Time: 5979.31

Or by the way, same thing with you.

Time: 5980.28

There's a corpus of YouTube transcripts and other, your academic papers

Time: 5983.23

and talks you've given, and you can kind of get your hands around that.

Time: 5985.49

And that's how these systems are trained.

Time: 5986.86

They take all that data collectively, they put it in there.

Time: 5988.94

And that's why this works as well as it does.

Time: 5990.96

And that's why if you ask ChatGPT to speak or write like me or like you or like

Time: 5996.07

somebody else, it will actually generally do a really good job because it has all

Time: 5999.9

of our prior text in its training data.

Time: 6003.59

That said, from here on out, this gets harder.

Time: 6005.34

And of course, the reason this gets harder is because now we have AI that

Time: 6007.849

can create text and we have AI that can create text at industrial scale.

Time: 6012.85

Andrew Huberman: Is it watermarked as AI generated text?

Time: 6014.51

Marc Andreessen: No.

Time: 6014.62

Andrew Huberman: How hard would it be to do that?

Time: 6016.26

Marc Andreessen: I think it's impossible.

Time: 6017.75

I think it's impossible.

Time: 6018.63

There are people who are trying to do that.

Time: 6020.33

This is a hot topic in the classroom.

Time: 6021.71

I was just talking to a friend who's got like a 14 year old kid in a class, and

Time: 6024.45

there's like these recurring scandals.

Time: 6025.679

Every kid in the class is using ChatGPT to write their essays or to help them write

Time: 6029.73

their essays, and then the teacher is using one of, there's a tool that you can

Time: 6035.24

use that purports to be able to tell you whether something was written by ChatGPT.

Time: 6039.929

But it's like, only right like 60% of the time.

Time: 6042.53

And so there was this case where the student wrote an essay where their

Time: 6045.5

parent sat and watched them write the essay, and then they submitted it, and

Time: 6048.95

this tool got the conclusion incorrect.

Time: 6050.7

And then the student feels outraged because he got unfairly cheated.

Time: 6053.24

But the teacher is like, well, you're all using the tool.

Time: 6055.22

Then it turns out there's another tool that basically you feed in

Time: 6057.62

text, and they call it a summarizer.

Time: 6058.09

But what it really is, is a cheating mechanism to basically

Time: 6065.289

just shuffle the words around enough so that it sheds whatever

Time: 6068.17

characteristics were associated with AI.

Time: 6069.94

So, there's like an arms race going on in educational settings right

Time: 6072.86

now around this exact question.

Time: 6074.54

I don't think it's possible to do.

Time: 6076.42

There are people working on the watermarking.

Time: 6077.383

I don't think it's possible to do the watermarking.

Time: 6078.787

And I think it's just kind of obvious why it's not possible to do that, which is

Time: 6081.65

you can just read the output for yourself.

Time: 6084.379

It's really good.

Time: 6085.71

How are you actually going to tell the difference between that and

Time: 6088.43

something that a real person wrote?

Time: 6089.69

And then, by the way, you can also ask ChatGPT to write

Time: 6092.29

in different styles, right?

Time: 6093.559

So you can tell it, like, write in the style of a 15 year old.

Time: 6097.12

You can tell it to write in the style of a non native English speaker.

Time: 6100.369

Or if you're a non native English speaker, you can tell it to

Time: 6102.08

write in the style of an English speaker, native English speaker.

Time: 6105.03

And so the tool itself will help you evade.

Time: 6108.08

I think there's a lot of people who are going to want to

Time: 6111.72

distinguish, "real" versus fake.

Time: 6115.17

I think those days are over.

Time: 6116.09

Andrew Huberman: Genie's out of the bottle.

Time: 6116.45

Marc Andreessen: Genie is completely out of the bottle.

Time: 6116.722

And by the way, I actually think this is good.

Time: 6120.23

This doesn't map to my worldview of how we use this technology

Time: 6123.29

anyway, which we can come back to.

Time: 6126.48

So there's that, and then there's the problem, therefore of the

Time: 6130.059

so-called deep fake problem.

Time: 6131.13

So then there's the problem of, like, deliberate basically, manipulation.

Time: 6134.299

And that's like one of your many enemies, one of your increasingly

Time: 6139.57

long list of enemies like mine, who basically is like, wow, I know

Time: 6144.14

how I'm going to get him, right?

Time: 6145.28

I'm going to use it to create something that looks like a Huberman

Time: 6149.38

transcript and I'm going to have him say all these bad things.

Time: 6151.81

Andrew Huberman: Or a video.

Time: 6152.28

Marc Andreessen: Or a video, or a video.

Time: 6152.929

Andrew Huberman: I mean, Joe Rogan and I were deep faked in a video.

Time: 6156.219

I don't want to flag people to it, so I won't talk about what it was about, but

Time: 6161.44

where it, for all the world looked like a conversation that we were having and

Time: 6165.8

we never had that specific conversation.

Time: 6167.94

Marc Andreessen: Yeah, that's right.

Time: 6168.63

So that's going to happen for sure.

Time: 6170.34

So what there's going to need to be is basically

Time: 6172.199

registries where basically in your case, you will submit your legitimate

Time: 6178.61

content into a registry under your unique cryptographic key, right.

Time: 6182.49

And then basically there will be a way to check against that registry to

Time: 6185.1

see whether that was the real thing.

Time: 6186.179

And I think this needs to be done for sure.

Time: 6187.969

For public figures, it needs to be done for politicians,

Time: 6189.839

it needs to be done for music.

Time: 6191.68

Andrew Huberman: What about taking what's already out there and being able to

Time: 6194.28

authenticate it or not in the same way that many times per week, I get asked,

Time: 6198.83

is this your account about a direct message that somebody got on Instagram?

Time: 6202.559

And I always tell them, look, I only have the one account,

Time: 6206.589

this one verified account.

Time: 6208.219

Although now, the advent of pay-to-play verification makes it

Time: 6212.44

a little less potent as a security blanket for knowing if it's not

Time: 6216.52

this account, then it's not me.

Time: 6218.929

But in any case, these accounts pop up all the time pretending to be me.

Time: 6222.599

And I'm relatively low on the scale.

Time: 6227.209

Not low, but relatively low on the scale compared to, say, a Beyonce

Time: 6231.43

or something like that, who has hundreds of millions of followers.

Time: 6234.21

So is there a system in mind where people could go in and

Time: 6239.28

verify text, click yes or no.

Time: 6241.342

This is me.

Time: 6241.9

This is not me.

Time: 6242.46

And even there, there's the opportunity for people to fudge, to eliminate things

Time: 6246.6

about themselves that they don't want out there, by saying, no, that's not me.

Time: 6250.08

I didn't actually say that.

Time: 6251.19

Or create that.

Time: 6251.679

Marc Andreessen: Yeah, no, that's right.

Time: 6253.13

Technologically, it's actually pretty straightforward.

Time: 6254.78

So the way to implement this technologically is with a public key.

Time: 6257.2

It's called public key cryptography, which is the basis for how cryptography

Time: 6260.339

information is secured in the world today.

Time: 6262.23

And so basically, the implementation form of this would be, you would pick whatever

Time: 6266

is your most trusted channel, and let's say it's your YouTube channel as an

Time: 6268.42

example, where just everybody just knows that it's you on your YouTube channel

Time: 6271.32

because you've been doing it for ten years or whatever, and it's just obvious.

Time: 6274.059

And you would just publish in the about me page on YouTube, you

Time: 6276.6

would just publish your public cryptographic key that's unique to you.

Time: 6280.67

Right.

Time: 6280.94

And then anytime anybody wants to check to see whether any piece

Time: 6283.69

of content is actually you, they go to a registry in the cloud

Time: 6287.27

somewhere, and they basically submit.

Time: 6288.88

They basically say, okay, is this him?

Time: 6290.91

And then they can basically see whether somebody with your public

Time: 6294.24

key, you had actually certified that this was something that you made.
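What is being described here maps onto standard public-key signatures. A minimal sketch, assuming Python's cryptography package for Ed25519 keys and a plain dictionary standing in for the registry (both are illustrative choices, not anything specified in the conversation): the creator signs a hash of their content with the private key, publishes only the public key, and anyone can check a clip against the registry.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Creator side: keep the private key secret, publish the public key
# somewhere trusted (e.g. a channel's About page).
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Hypothetical registry: maps a content hash to the creator's signature over that hash.
registry = {}

def certify(content: bytes) -> None:
    digest = hashlib.sha256(content).digest()
    registry[digest] = private_key.sign(digest)

def is_authentic(content: bytes) -> bool:
    digest = hashlib.sha256(content).digest()
    signature = registry.get(digest)
    if signature is None:
        return False  # never certified by the creator
    try:
        public_key.verify(signature, digest)  # raises if the signature doesn't match
        return True
    except InvalidSignature:
        return False

certify(b"real podcast transcript")
print(is_authentic(b"real podcast transcript"))  # True
print(is_authentic(b"deep-faked transcript"))    # False
```

Whether the registry lives with a government, a company, or on a blockchain, as discussed next, changes who you have to trust, not the signature math itself.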

Time: 6298.359

Now, who runs that registry is an interesting question.

Time: 6301.09

If that registry is run by the government, we will call that the Ministry of Truth.

Time: 6304.65

I think that's probably a bad idea.

Time: 6305.9

If that registry is run by a company, we would call that basically the

Time: 6310.61

equivalent of, like, a credit bureau or something like that.

Time: 6312.9

Maybe that's how it happens.

Time: 6313.87

The problem with that is that company now becomes hacking target number one,

Time: 6316.98

right, of every bad person on Earth.

Time: 6318.83

Because if anybody breaks into that company, they can

Time: 6322.12

fake all kinds of things.

Time: 6323.01

Andrew Huberman: They own the truth.

Time: 6324.09

Marc Andreessen: Right. They own the truth.

Time: 6324.84

And by the way, insider threat, also, their employees can monkey with it.

Time: 6327.58

So you have to really trust that company.

Time: 6329.61

The third way to do it is with a blockchain.

Time: 6331.48

And so this, with the crypto blockchain technology, you could have

Time: 6334

a distributed system, basically, a distributed database in the cloud

Time: 6337.469

that is run through a blockchain.

Time: 6338.85

And then it implements this cryptography and this certification process.

Time: 6342.59

Andrew Huberman: What about quantum Internet?

Time: 6344.36

Is that another way to encrypt these things?

Time: 6345.88

I know most of our listeners are probably not familiar with quantum

Time: 6348.103

Internet, but put simply, it's a way to secure communications on the Internet.

Time: 6353.25

Let's just leave it at that.

Time: 6354.61

It's sophisticated, and we'll probably do a whole episode about this at some point.

Time: 6357.53

But maybe you have a succinct way of describing quantum Internet,

Time: 6360.07

that would be better.

Time: 6362.82

And if so, please offer it up.

Time: 6364.53

But is quantum Internet going to be one way to secure these

Time: 6367.72

kinds of data and resources?

Time: 6370.23

Marc Andreessen: Maybe in the future, years in the future?

Time: 6372.46

We don't yet have working quantum computers in practice, so it's

Time: 6375.04

not currently something you could do, but maybe in a decade or two?

Time: 6378.75

Andrew Huberman: Tell me.

Time: 6379.25

I'm going to take a stab at defining quantum Internet in one sentence.

Time: 6381.63

It's a way in which if anyone were to try and peer in on a conversation on the

Time: 6384.86

Internet, it essentially would be futile because of the way that quantum Internet

Time: 6391.86

changes the way that the communication is happening so fast and so many times in any

Time: 6396.02

one conversation, essentially changing the translation or the language so fast that

Time: 6400.16

there's just no way to keep up with it.

Time: 6401.48

Is that more or less accurate?

Time: 6402.759

Marc Andreessen: Yeah, conceivably not yet, but someday.

Time: 6405.809

Andrew Huberman: So, going back to AI, most people who

Time: 6408.755

hear about AI are afraid of AI.

Time: 6412.42

Marc Andreessen: Well?

Time: 6412.78

Andrew Huberman: I think most people who aren't informed--

Time: 6414.62

Marc Andreessen: --This goes back to our elites versus masses thing.

Time: 6417.13

Andrew Huberman: Oh, interesting.

Time: 6417.819

Well, I heard you say that, and this is from a really wonderful tweet thread that we

Time: 6424.889

will link in the show note captions that you put out not long ago and that I've

Time: 6429.48

read now several times, and that everyone really should take the time to read it.

Time: 6433.75

Probably takes about 20 minutes to read it carefully and to think about

Time: 6437.33

each piece, and I highly recommend it.

Time: 6439.759

But you said, and I'm quoting here, "Let's address the fifth, the

Time: 6447.01

one thing I actually agree with, which is AI will make it easier

Time: 6450.44

for bad people to do bad things."

Time: 6456.809

Marc Andreessen: First of all, there is a general freak out happening around AI.

Time: 6458.809

I think it's primarily, it's one of these, again, it's an elite driven freak out.

Time: 6461.519

I don't think the man in the street knows, cares, or feels one way or the other.

Time: 6464.29

It's just not a relevant concept, and it probably just sounds like science fiction.

Time: 6467.53

So I think there's an elite driven freak out that's happening right now.

Time: 6472.03

I think that elite driven freak out has many aspects to it that I think

Time: 6475.34

are incorrect, which is not surprising.

Time: 6478.01

I would think that, given that.

Time: 6478.69

I think the elites are incorrect about a lot of things, but I think

Time: 6480.96

they're very wrong about a number of things they're saying about AI.

Time: 6483.86

But that said, look, this is a very powerful new technology, right?

Time: 6486.98

This is like a new general purpose thinking technology.

Time: 6490.11

So what if machines could think?

Time: 6492.099

And what if you could use machines that think, and what if you

Time: 6495.11

could have them think for you?

Time: 6496.33

There's obviously a lot of good that could come from that.

Time: 6499.19

But also, people, look, criminals could use them to plan better crimes.

Time: 6504.29

Terrorists could use them to plan better terror attacks and so forth.

Time: 6506.49

And so these are going to be tools that bad people can use

Time: 6509.37

to do bad things, for sure.

Time: 6511.509

Andrew Huberman: I can think of some ways that AI could be

Time: 6513.589

leveraged to do fantastic things.

Time: 6515.18

Like in the realm of medicine, an AI pathologist perhaps, can scan 10,000

Time: 6524.78

slides of histology and find the one micro tumor, cellular aberration, that

Time: 6531.38

would turn into a full blown tumor, whereas the even mildly fatigued or

Time: 6536.48

well rested human pathologists, as great as they come, might miss that.

Time: 6541.92

And perhaps the best solution is for both of them to do it, and then

Time: 6545.58

for the human to verify what the AI has found and vice versa, right?

Time: 6548.973

Marc Andreessen: That's right.

Time: 6549.629

Andrew Huberman: And that's just one example.

Time: 6550.95

I mean, I can come up with thousands of examples where this would be wonderful.

Time: 6556.2

Marc Andreessen: I'll give you another one, by the way, medicine.

Time: 6557.28

So you're talking about an analytic result, which is good and important.

Time: 6560.139

The other is like, the machines are going to be much better at bedside manner.

Time: 6564.04

They're going to be much better at dealing with the patient.

Time: 6566.13

And we already know there's already been a study.

Time: 6567.57

There's already been a study on this.

Time: 6568.559

So there was already a study done on this where there was a study team that

Time: 6573.53

scraped thousands of medical questions off of an Internet forum, and then they

Time: 6576.55

had real doctors answer the questions, and then they had basically GPT4 answer

Time: 6580.09

the questions, and then they had another panel of doctors score the responses.

Time: 6584.509

So there were no patients experimented on here.

Time: 6586.41

This was a test contained within the medical world.

Time: 6591.35

The judges, the panel of doctors who are the judges, scored the

Time: 6594.06

answers in both factual accuracy and on bedside manner, on empathy.

Time: 6598.54

And the GPT4 was equal or better on most of the factual questions

Time: 6604.24

analytically, already, and it's not even a specifically trained medical AI, but

Time: 6609.24

it was overwhelmingly better on empathy.

Time: 6612.11

Andrew Huberman: Amazing,

Time: 6612.61

Marc Andreessen: Right?

Time: 6615.929

Do you treat patients directly in your work?

Time: 6618.63

You don't?

Time: 6619.269

Andrew Huberman: No, I don't.

Time: 6619.83

We run clinical trials.

Time: 6621.36

Marc Andreessen: Right.

Time: 6622.19

Andrew Huberman: But I don't do any direct clinical work.

Time: 6625.68

Marc Andreessen: I've no direct experience with this.

Time: 6626.71

But from the surgeons, if you talk to surgeons or you talk to people who

Time: 6630.25

train surgeons, what they'll tell you is surgeons need to have an emotional

Time: 6633.23

remove from their patients in order to do a good job with the surgery.

Time: 6635.96

The side effect of that, and by the way, look, it's a hell of a job to have to

Time: 6638.809

go in and tell somebody that they're going to die, or that they're

Time: 6641.539

never going to recover, they're never going to walk again or whatever it is.

Time: 6643.72

And so there's sort of something inherent in that job where they need

Time: 6646.99

to keep an emotional reserve from the patient to be able to do the job.

Time: 6650.6

And it's expected of them as professionals.

Time: 6653.17

The machine has no such limitation.

Time: 6655.469

The machine can be as sympathetic as you want it to be for as

Time: 6658.32

long as you want it to be.

Time: 6659.309

It can be infinitely sympathetic.

Time: 6660.469

It's happy to talk to you at four in the morning.

Time: 6662.01

It's happy to sympathize with you.

Time: 6663.49

And by the way, it's not just sympathizing with you in the way

Time: 6666.7

that, oh, it's just making up words to lie to you to make you feel good.

Time: 6670.389

It can also sympathize with you in terms of helping you through all

Time: 6672.96

the things that you can actually do to improve your situation.

Time: 6675.61

And so, boy, can you keep a patient actually on track with

Time: 6680.5

a physical therapy program.

Time: 6681.57

Can you keep a patient on track with a nutritional program?

Time: 6683.81

Can you keep a patient off of drugs or alcohol?

Time: 6686.37

And if they have a machine medical companion that's with them all the

Time: 6689.61

time that they're talking to all the time, that's infinitely patient,

Time: 6692.4

infinitely wise, infinitely loving, and it's just going to be there all the

Time: 6697.09

time and it's going to be encouraging and it's going to be, you know, you

Time: 6699.12

did such a great job yesterday, I know you can do this again today.

Time: 6702.05

Cognitive behavioral therapy is an obvious fit here.

Time: 6704.53

These things are going to be great at CBT and that's already starting.

Time: 6707.36

You can already use ChatGPT as a CBT therapist if you want.

Time: 6710.94

It's actually quite good at it.

Time: 6713.33

There's, there's a universe here that's, it goes to what you said,

Time: 6715.81

there's a universe here that's opening up, which is what I believe is it's

Time: 6718.99

partnership between man and machine.

Time: 6721.32

It's a symbiotic relationship, not an adversarial relationship.

Time: 6724.23

And so the doctor is going to pair with the AI to do all the things

Time: 6727.85

that you described, but the patient is also going to pair with the AI.

Time: 6731.129

And I think this partnership that's going to emerge is going to lead,

Time: 6735.53

among other things, to actually much better health outcomes.

Time: 6738.56

Andrew Huberman: I've relied for so much of my life on excellent mentors from a

Time: 6743.82

very young age, and still now, in order to make the best decisions possible with

Time: 6749.61

the information I had, and rarely were they available at four in the morning

Time: 6754.52

sometimes, but not on a frequent basis.

Time: 6756.82

And they fatigue like anybody else, and they have their own stuff like anybody

Time: 6761.82

else, baggage, events in their life, etc.

Time: 6765.509

What you're describing is a sort of AI coach or therapist of sorts, that

Time: 6770.45

hopefully would learn to identify our best self and encourage us to be our best self.

Time: 6776.34

And when I say best self, I don't mean that in any kind of pop psychology way.

Time: 6779.9

I could imagine AI very easily knowing how well I slept the night before and

Time: 6784.68

what types of good or bad decisions I tend to make at 2:00 in the afternoon when

Time: 6788.94

I've only had 5 hours of sleep, or maybe just less REM sleep the night before.

Time: 6793.05

It might encourage me to take a little more time to think about something.

Time: 6796.62

Might give me a little tap on the wrist through a device that no one else

Time: 6799.81

would detect to refrain from something.

Time: 6803.5

Marc Andreessen: Never going to judge you.

Time: 6804.9

It's never going to be resentful.

Time: 6806.12

It's never going to be upset that you didn't listen to it.

Time: 6808.93

It's never going to go on vacation.

Time: 6811

It's going to be there for you.

Time: 6811.99

I think this is the way people are going to live.

Time: 6814

It's going to start with kids, and then over time it's going to be adults.

Time: 6815.969

I think the way people are going to live is they're going to have

Time: 6818.429

a friend, therapist, companion, mentor, coach, teacher, assistant.

Time: 6823.25

Or, by the way, maybe multiple of those.

Time: 6826.56

It may be that we're actually talking about six, like, different personas

Time: 6828.52

interacting, which is a whole 'nother possibility, but they're going to have--

Time: 6831.95

Andrew Huberman: --A committee!

Time: 6832.31

Marc Andreessen: A committee, yeah, exactly.

Time: 6833.83

Actually different personas.

Time: 6834.509

And maybe, by the way, when there are difficult decisions to be made in your

Time: 6836.679

life, maybe what you want to hear is the argument among the different personas.

Time: 6840.73

And so you're just going to grow up, you're just going to have this in

Time: 6844.55

your life and you're going to always be able to talk to it and always

Time: 6847.42

be able to learn from it and always be able to help it.

Time: 6851.47

It's going to be a symbiotic relationship.

Time: 6854.02

I think it's going to be a much better way to live.

Time: 6855.15

I think people are going to get a lot out of it.
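
As a rough sketch of the "committee" of personas idea described above, the snippet below asks the same model for advice under several distinct persona prompts and prints the answers side by side. The persona wording, the model name, and the OpenAI client usage are illustrative assumptions, not a prescribed design.

```python
# Minimal sketch of a "committee of personas": the same model is queried under
# several system prompts so the user can hear the disagreement among them.
from openai import OpenAI

client = OpenAI()

PERSONAS = {
    "coach": "You are a blunt performance coach focused on action and accountability.",
    "therapist": "You are a gentle therapist focused on feelings and underlying needs.",
    "mentor": "You are a seasoned career mentor focused on long-term strategy.",
}

def committee(question: str) -> dict:
    """Ask each persona the same question and collect their answers."""
    answers = {}
    for name, system_prompt in PERSONAS.items():
        response = client.chat.completions.create(
            model="gpt-4o",  # illustrative model name
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": question},
            ],
        )
        answers[name] = response.choices[0].message.content
    return answers

if __name__ == "__main__":
    for persona, answer in committee("Should I take the risky new job offer?").items():
        print(f"--- {persona} ---\n{answer}\n")
```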

Time: 6856.74

Andrew Huberman: What modalities will it include?

Time: 6858.66

So I can imagine my phone has this engine in it, this AI

Time: 6863.71

companion, and I'm listening in headphones as I walk into work.

Time: 6867.63

And it's giving me some, not just encouragement, some warnings, some

Time: 6871.969

thoughts about things that I might ask Marc Andreessen today that I

Time: 6875.199

might not have thought of and so on.

Time: 6877.73

I could also imagine it having a more human form.

Time: 6880.73

I could imagine it being tactile, having some haptic, so tapping to

Time: 6884.69

remind me so that it's not going to enter our conversation in a way

Time: 6887.54

that interferes or distracts you.

Time: 6890.14

But I would be aware.

Time: 6890.86

Oh, right.

Time: 6892.379

Things of that sort.

Time: 6893.459

I mean, how many different modalities are we going to allow these AI

Time: 6897.91

coaches to approach us with?

Time: 6899.81

And is anyone actually thinking about the hardware piece right now?

Time: 6903.19

Because I'm hearing a lot about the software piece.

Time: 6905.38

What does the hardware piece look like?

Time: 6907.09

Marc Andreessen: Yeah, so this is where Silicon Valley is going to kick in.

Time: 6909.44

So the entrepreneurial community is going to try all of those, right?

Time: 6912.62

By the way, the big companies and startups are going to try all those.

Time: 6915

And so obviously there's big companies that are working on, and have

Time: 6918.34

talked about, a variety of these, including heads-up

Time: 6920.52

displays, AR, VR kinds of things.

Time: 6923.889

There's lots of people doing voice.

Time: 6925.339

Thing is, voice is a real possibility.

Time: 6927.55

It may just be an earpiece.

Time: 6930.01

There's a new startup that just unveiled a new thing where they actually project.

Time: 6935.22

So you'll have like a pendant you wear on like a necklace, and it actually

Time: 6938.059

projects, literally, it'll project images on your hand or on the table

Time: 6941.59

or on the wall in front of you.

Time: 6942.459

So maybe that's how it shows up.

Time: 6944.91

Yeah.

Time: 6945.099

There are people working on so-called haptic or touch based kinds of things.

Time: 6948.82

There are people working on actually picking up nerve

Time: 6951.11

signals, like out of your arm.

Time: 6957.67

There's some science for being able to do basically like subvocalization.

Time: 6961.36

So maybe you could pick up that way by bone conduction.

Time: 6967.45

These are all going to be tried.

Time: 6968.65

So that's one question is the physical form of it, and then the other

Time: 6971.63

question is the software version of it, which is like, okay, what's the

Time: 6974.14

level of abstraction that you want to deal with these things in? Right now,

Time: 6978.639

it's like a question-answer paradigm, a so-called chatbot: ask a question, get

Time: 6981.889

an answer, ask a question, get an answer.

Time: 6983.699

Well, you want that to go for sure to more of a fluid conversation.

Time: 6986.4

You want it to build up more knowledge of who you are, and

Time: 6988.42

you don't want to have to explain yourself a second time and so forth.

Time: 6990.83

And then you want to be able to tell it things like, well, remind me this,

Time: 6993.29

that, or be sure and tell me when X.

Time: 6996.16

But then maybe over time, more and more, you want it actually deciding

Time: 6999.23

when it's going to talk to you, right?

Time: 7000.8

And when it thinks it has something to say, it says it,

Time: 7002.9

and otherwise it stays silent.

Time: 7005.17

Andrew Huberman: Normally, at least in my head, unless I make a

Time: 7008.13

concerted effort to do otherwise, I don't think in complete sentences.

Time: 7011.74

So presumably these machines could learn my style of fragmented internal dialogue.

Time: 7022.039

And maybe I have an earpiece, and I'm walking in and I start hearing

Time: 7026.57

something, but it's some advice, etc, encouragement, discouragement.

Time: 7028.379

But at some point, those sounds that I hear in an earphone are very

Time: 7037.859

different than seeing something or hearing something in the room.

Time: 7040.15

We know this based on the neuroscience of musical perception

Time: 7043.92

and language perception.

Time: 7044.73

Hearing something in your head is very different.

Time: 7047.74

And I could imagine at some point that the AI will cross a precipice where if

Time: 7051.82

it has inline wiring to actually control neural activity in specific brain areas,

Time: 7056.59

and I don't mean very precisely, even just stimulating a little more prefrontal

Time: 7060.24

cortical activity, for instance, through the earpiece, a little ultrasound wave

Time: 7062.88

now can stimulate prefrontal cortex in a non-invasive way that's being

Time: 7066.389

used clinically and experimentally, that the AI could decide that I need

Time: 7072.19

to be a little bit more context aware.

Time: 7076.35

This is something that is very beneficial for those listening that are trying to

Time: 7079.72

figure out how to navigate through life.

Time: 7081.22

It's like, you know, the context you're in and know the catalog of behaviors

Time: 7084.6

and words that are appropriate for that situation and not, you know, this would

Time: 7090.04

go along with agreeableness, perhaps, but strategic agreeableness, right.

Time: 7093.79

Context is important.

Time: 7095.78

There's nothing diabolical about that.

Time: 7097.04

Context is important, but I could imagine the AI recognizing we're

Time: 7100.27

entering a particular environment.

Time: 7102.73

I'm now actually going to ramp up activity in prefrontal cortex a little bit in a

Time: 7106.03

certain way that allows you to be more situationally aware of yourself and

Time: 7110.62

others, which is great, unless I can't necessarily short circuit that influence,

Time: 7117.19

because at some point, the AI is actually then controlling my brain activity

Time: 7122.54

and my decision making and my speech.

Time: 7124.36

I think that's what people fear is that once we cross that precipice that we

Time: 7128.62

are giving up control to the artificial versions of our human intelligence.

Time: 7132.78

Marc Andreessen: And look, I think we have to decide, we collectively,

Time: 7135.28

and we as individuals, I think, have to decide exactly how to do that.

Time: 7137.7

And this is the big thing that I believe about AI.

Time: 7139.423

That's just a much more, I would say, practical view of the world than

Time: 7141.8

a lot of the panic that you hear.

Time: 7143.369

It's just like, these are machines.

Time: 7145.209

They're able to do things that increasingly are like the things that

Time: 7147.37

people can do in some circumstances.

Time: 7148.66

But these are machines.

Time: 7149.36

We built the machines, which means we decide how to use the machines.

Time: 7152.13

When we want the machines turned on, they're turned on; when we want them

Time: 7153.85

turned off, they're turned off.

Time: 7155.71

I think that's absolutely the kind of thing that the individual person

Time: 7158.33

should always be in charge of.

Time: 7159.96

Andrew Huberman: Everyone was afraid.

Time: 7160.88

And I have to imagine some people are still afraid of CRISPR, of gene editing.

Time: 7165.13

But gene editing stands to revolutionize our treatment of all sorts of disease,

Time: 7169.48

you know, inserting and deleting particular genes in adulthood,

Time: 7172.83

not having to recombine a new organism in the womb.

Time: 7175.2

It's an immensely powerful tool.

Time: 7178.82

And yet the Chinese scientist who did CRISPR on humans, this has been

Time: 7182.969

done, actually did his postdoc at Stanford with Steve Quake, then

Time: 7186.81

went to China, did CRISPR on babies.

Time: 7189.34

Mutated something.

Time: 7190.69

I believe it was one of the HIV receptors.

Time: 7193.83

I'm told it was with the intention of augmenting human memory.

Time: 7197.71

It had very little to do, in fact, with limiting susceptibility to

Time: 7201.285

HIV per se; it had more to do with the way that receptor is involved in human memory.

Time: 7206.7

The world demonized that person.

Time: 7210

We actually don't know what happened to them.

Time: 7211.41

Whether or not they have a laboratory now or they're sitting in jail, it's unclear.

Time: 7214.589

But in China and elsewhere, people are doing CRISPR on humans.

Time: 7218.61

We know this.

Time: 7220

It's not legal in the US and other countries, but it's happening.

Time: 7227.02

Do you think it's a mistake for us to fear these technologies so much that we back

Time: 7230.99

away from them and end up 10, 20 years behind other countries that could use it

Time: 7235.13

for both benevolent or malevolent reasons?

Time: 7238.61

Marc Andreessen: Yeah, the details matter.

Time: 7241.36

So it's technology by technology.

Time: 7242.97

But I would say there's two things you always have to think about in

Time: 7245.559

these questions, I think, in terms of counterfactuals and opportunity cost.

Time: 7248.3

CRISPR is an interesting one.

Time: 7251.45

CRISPR manipulates the human genome.

Time: 7253.279

Nature manipulates the human genome, like, in all kinds of ways.

Time: 7258.286

[LAUGHS] Andrew Huberman: Yeah.

Time: 7258.599

[LAUGHS]

Time: 7258.602

Marc Andreessen: When you pick a spouse and you--

Time: 7259.546

Andrew Huberman: --Have a child with that spouse--

Time: 7260.64

Marc Andreessen: --Oh, boy--

Time: 7261.38

Andrew Huberman: --You're doing genetic recombination.

Time: 7263

Marc Andreessen: Yes, you are.

Time: 7264.059

Quite possibly, if you're Genghis Khan, you're determining the future

Time: 7267

of humanity by those mutations.

Time: 7272.909

This is the old question of, basically, this is all state of

Time: 7277.93

nature, state of grace, basically.

Time: 7280

Is nature good?

Time: 7280.71

And then, therefore, artificial things are bad, which is kind of the default assumption.

Time: 7283.969

A lot of people have ethical views like that.

Time: 7285.9

I'm always of the view that nature is a bitch and wants us dead.

Time: 7291.08

Nature is out to get us, man.

Time: 7292.24

Nature wants to kill us, right?

Time: 7292.31

Like, nature wants to evolve all kinds of horrible viruses.

Time: 7296.8

Nature wants plagues.

Time: 7297.92

Nature wants to do weather.

Time: 7300.54

Nature wants to do all kinds of stuff.

Time: 7302.03

I mean, look, nature religion was the original religion, right?

Time: 7304.71

Like, that was the original thing people worshiped.

Time: 7306.6

And the reason was because nature was the thing that was out to get you, right, before

Time: 7310.06

you had scientific and technological methods to be able to deal with it.

Time: 7314.75

So, the idea of not doing these things, to me is just saying, oh,

Time: 7317.849

we're just going to turn over the future of everything to nature.

Time: 7320.12

And I think that there's no reason to believe that that leads in a

Time: 7323.02

particularly good direction or that that's not a value neutral decision.

Time: 7328.4

And then the related thing that comes from that is always this question around

Time: 7331.1

what's called the precautionary principle, which shows up in all these conversations

Time: 7334.57

on things like CRISPR, which basically is this principle that basically says, the

Time: 7339.1

inventors of a new technology should be required to prove that it will not have

Time: 7341.969

negative effects before they roll it out.

Time: 7344.69

This, of course, is a very new idea.

Time: 7346.809

This is actually a new idea from the 1970s.

Time: 7348.549

It was actually invented

Time: 7350.6

by the German Greens.

Time: 7351.7

Before that, people didn't think in those terms.

Time: 7353.75

People just invented things and rolled them out.

Time: 7356.65

And we got all of modern civilization by people inventing

Time: 7359.429

things and rolling them out.

Time: 7361.789

The German Greens came up with the precautionary principle

Time: 7363.59

for one specific purpose.

Time: 7364.77

I'll bet you can guess what it is.

Time: 7367.62

It was to prevent...?

Time: 7369.49

Andrew Huberman: Famine?

Time: 7369.96

Marc Andreessen: Nuclear power.

Time: 7371.42

It was to shut down attempts to do civilian nuclear power.

Time: 7374.58

And if you fast forward 50 years later, you're like, wow, that was a big mistake.

Time: 7379.159

So what they said at the time was, you have to prove that nuclear

Time: 7381.74

reactors are not going to melt down and cause all kinds of problems.

Time: 7384.23

And, of course, as an engineer, can you prove that will never happen?

Time: 7387.03

You can't.

Time: 7387.69

You can't rule out things that might happen in the future.

Time: 7390.91

And so that philosophy was used to stop nuclear power by the way, not

Time: 7395.27

just in Europe, but also in the US and around much of the rest of the world.

Time: 7398.59

If you're somebody who's concerned about carbon emissions, of course, this

Time: 7401.1

is the worst thing that happened in the last 50 years in terms of energy.

Time: 7404.18

We actually have the silver bullet answer to unlimited energy with zero

Time: 7407.33

carbon emissions, nuclear power.

Time: 7408.98

We choose not to do it.

Time: 7410.64

Not only do we choose not to do it, we're actually shutting down the

Time: 7412.92

plants that we have now in California.

Time: 7415.459

We just shut down the big plant.

Time: 7417.51

Germany just shut down their plants.

Time: 7419.11

Germany is in the middle of an energy war with Russia that, we are informed,

Time: 7422.55

is existential for the future of Europe.

Time: 7424.24

Andrew Huberman: But unless the risk of nuclear power plant meltdown has

Time: 7428.039

increased, and I have to imagine it's gone the other way, what is

Time: 7432.139

the rationale behind shutting down these plants and not expanding?

Time: 7434.91

Marc Andreessen: Because nuclear is bad.

Time: 7435.96

Right. Nuclear is icky.

Time: 7437.9

Nuclear has been tagged.

Time: 7438.93

Andrew Huberman: It just sounds bad.

Time: 7440.34

Nuclear.

Time: 7440.84

Marc Andreessen: Yeah.

Time: 7442.57

Andrew Huberman: Go nuclear.

Time: 7442.98

Marc Andreessen: Well, so what happened?

Time: 7443.647

Andrew Huberman: We didn't shut down post offices, and you hear "go postal."

Time: 7446.07

Marc Andreessen: So what happened was, so nuclear technology arrived

Time: 7448.849

on planet Earth as a weapon, right?

Time: 7450.5

So it arrived in the form of a weapon,

Time: 7451.889

in the middle of World War II.

Time: 7453.92

The first thing they did with it was the atomic bomb they dropped on Japan.

Time: 7456.26

And then there were all the debates that followed around

Time: 7458.04

nuclear weapons and disarmament.

Time: 7459.56

And there's a whole conversation to be had, by the way, about

Time: 7461.42

that, because there's different views you could have on that.

Time: 7464.5

And then, later on,

Time: 7465.88

they started to roll out civilian nuclear power.

Time: 7467.48

And then there were accidents.

Time: 7469.25

There was, like, Three Mile Island melted down, and then Chernobyl melted

Time: 7473.16

down in the Soviet Union, and then even recently, Fukushima melted down.

Time: 7476.65

And so there have been meltdowns.

Time: 7477.96

And so I think it was a combination of it's a weapon.

Time: 7480.41

It is sort of icky. Science sometimes comes with the ick factor, right.

Time: 7487.719

It glows green.

Time: 7488.53

And by the way, it becomes like a mythical fictional thing.

Time: 7493.33

And so you have all these movies of horrible supervillains powered by

Time: 7496.209

nuclear energy and all this stuff.

Time: 7498.07

Andrew Huberman: Well, the intro to the Simpsons, right,

Time: 7499.71

is the nuclear power plant and the three-eyed fish and all the negative

Time: 7504.99

implications of this nuclear power plant run by, at least in the Simpsons, idiots.

Time: 7510.08

And that is the dystopia, where people are unaware of just how bad it is.

Time: 7517.27

Marc Andreessen: And who owns the nuclear power plant.

Time: 7518.86

Right. This evil capitalist.

Time: 7521.42

Right.

Time: 7521.96

So it's connected to capitalism.

Time: 7523.92

Right.

Time: 7525.08

Andrew Huberman: We're blaming Matt Groening for the demise of a particular--

Time: 7528.19

Marc Andreessen: --He certainly didn't help.

Time: 7531.43

But it's literally this amazing thing where,

Time: 7533.96

if you're just thinking rationally, scientifically, you're like, okay,

Time: 7536.389

we want to get rid of carbon.

Time: 7537.349

This is the obvious way to do it.

Time: 7538.96

Okay, fun fact.

Time: 7540.5

Richard Nixon did two things that really mattered on this.

Time: 7543.78

So one is he defined in 1971 something called Project Independence, which

Time: 7547.45

was to create 1000 new state of the art nuclear plants, civilian

Time: 7550.2

nuclear plants, in the US by 1980.

Time: 7552.13

And to get the US completely off of oil and cut the entire US energy grid

Time: 7556.04

over to nuclear power, electricity, cut over to electric cars, the whole

Time: 7558.68

thing, like, detach from carbon.

Time: 7561.34

You'll notice that didn't happen.

Time: 7563.45

Why did that not happen?

Time: 7564.35

Because he also created the EPA and the Nuclear Regulatory Commission, which

Time: 7567.74

then prevented that from happening.

Time: 7568.889

Right.

Time: 7569.119

And the Nuclear Regulatory Commission did not authorize a new nuclear

Time: 7571.71

plant in the US for 40 years.

Time: 7573.19

Andrew Huberman: Why would he hamstring himself like that?

Time: 7576.219

Marc Andreessen: He got distracted by Watergate and Vietnam.

Time: 7581.88

Andrew Huberman: I think Ellsberg just died recently, right?

Time: 7584.02

The guy who released the Pentagon papers.

Time: 7585.39

Marc Andreessen: Yeah. Andrew Huberman: So complicated.

Time: 7587.91

Marc Andreessen: Yeah, exactly.

Time: 7588.69

It's this thing.

Time: 7590.07

He left office shortly thereafter.

Time: 7591.23

He didn't have time to fully figure this out.

Time: 7593.299

I don't know whether he would have figured it out or not.

Time: 7595.51

Look, Ford could have figured it out.

Time: 7596.61

Carter could have figured it out.

Time: 7597.56

Reagan could have figured it out.

Time: 7598.53

Any of these guys could have figured it out.

Time: 7599.76

It's like the most obvious.

Time: 7600.72

Knowing what we know today, it's the most obvious thing in the world.

Time: 7603.59

The Russia thing is the amazing thing.

Time: 7604.849

It's like Europe is literally funding Russia's invasion of Ukraine

Time: 7607.38

by paying them for oil, right?

Time: 7609.4

And they can't shut off the oil because they won't cut over to nuclear, right?

Time: 7612.639

And then, of course, what happens?

Time: 7613.799

Okay, so then here's the other kicker of what happens, right?

Time: 7615.76

Which is they won't do nuclear, but they want to do renewables, right?

Time: 7619.35

Sustainable energy.

Time: 7620.24

And so what they do is they do solar and wind.

Time: 7622.91

Solar and wind are not reliable because it sometimes gets dark out

Time: 7626.49

and sometimes the wind doesn't blow.

Time: 7628.139

And so then what happens is they fire up the coal plants, right?

Time: 7631.23

And so the actual consequence of the precautionary principle for

Time: 7634.68

the purpose it was invented is a massive spike in use of coal.

Time: 7638.01

Andrew Huberman: That's taking us back over 100 years.

Time: 7639.79

Marc Andreessen: Yes.

Time: 7640.19

Correct.

Time: 7640.82

That is the consequence of the precautionary principle.

Time: 7643.36

That's the consequence of that mentality.

Time: 7645.66

And so it's a failure of a principle on its own merits

Time: 7648.059

for the thing it was designed for.

Time: 7648.99

Then, you know, there's a whole movement of people who want to

Time: 7652.3

apply it to every new thing.

Time: 7653.28

And this is the hot topic on AI right now in Washington, which is like, oh

Time: 7657.04

my God, these people have to prove that this can never get used for bad things.

Time: 7659.87

Andrew Huberman: Sorry, I'm hung up on this nuclear thing.

Time: 7661.9

And I wonder, can it just be?

Time: 7666.289

I mean, there is something about the naming of things.

Time: 7670.42

We know this in, I mean, you know, Lamarckian evolution and things like that.

Time: 7676.15

These are bad words in biology.

Time: 7677.79

But we had a guest on this podcast, Oded Rechavi, who's over in Israel,

Time: 7680.57

who's shown inheritance of acquired traits.

Time: 7683.219

But if you describe his work as Lamarckian, then it has all sorts of negative implications.

Time: 7687.29

But his discoveries have important implications for everything from

Time: 7691.18

inherited trauma to treatment of disease.

Time: 7694.23

I mean, there's all sorts of positives that await us if we are able to reframe

Time: 7698.17

our thinking around something that, yes, indeed, could be used for evil,

Time: 7702.34

but that has enormous potential and that is in agreement with nature, right?

Time: 7707.23

This fundamental truth that at least to my knowledge, no one is revising

Time: 7710.98

in any significant way anytime soon.

Time: 7713.07

So what if it were called something else?

Time: 7715.34

It could be nuclear.

Time: 7716.34

It's just called sustainable, right?

Time: 7719.679

I mean, it's amazing how marketing can shift our perspective

Time: 7722.709

of robots, for instance.

Time: 7724.53

Or anyway, I'm sure you can come up with better examples than I

Time: 7727.77

can, but is there a good, solid PR firm working from the nuclear side?

Time: 7735.11

Marc Andreessen: Thunbergian.

Time: 7737.01

Greta Thunberg.

Time: 7738.12

Andrew Huberman: Thunbergian.

Time: 7738.57

Marc Andreessen: Thunbergian.

Time: 7740.47

Like if she was in favor of it, which by the way, she's not.

Time: 7743.47

She's dead set against it.

Time: 7744.77

Andrew Huberman: She said that 100%.

Time: 7746.02

Marc Andreessen: Yeah.

Time: 7746.68

Andrew Huberman: Based on.

Time: 7747.38

Marc Andreessen: Based on Thunbergian principles.

Time: 7749.58

The prevailing ethic in environmentalism for 50 years is that nuclear is evil.

Time: 7753.59

Like, they won't consider it.

Time: 7754.88

There are, by the way, certain environmentalists who disagree with this.

Time: 7757.156

And so Stuart Brand is the one that's been the most public, and he has

Time: 7759.57

impeccable credentials in the space.

Time: 7760.84

Andrew Huberman: And he wrote the Whole Earth Catalog.

Time: 7762.49

Marc Andreessen: Whole Earth Catalog guy.

Time: 7763.05

Yeah.

Time: 7763.36

And he's written a whole bunch of really interesting books since.

Time: 7765.53

And he wrote a recent book that goes through in detail.

Time: 7767.639

He's like, yes, obviously the correct environmental

Time: 7769.74

thing to do is nuclear power.

Time: 7772.01

And we should be implementing project independence.

Time: 7773.98

We should be building a thousand.

Time: 7775.459

Specifically, he didn't say this, but this is what I would say.

Time: 7777.62

We should hire Charles Koch.

Time: 7780.8

We should hire Koch Industries and they should build us a thousand nuclear

Time: 7783.88

power plants, and then we should give them the presidential Medal of

Time: 7786.36

Freedom for saving the environment.

Time: 7788.58

Andrew Huberman: And that would put us independent of our reliance on oil.

Time: 7790.81

Marc Andreessen: Yeah.

Time: 7791.29

Then we're done with it.

Time: 7792.26

Just think about what happens.

Time: 7793.24

We're done with oil, zero emissions, we're done with the Middle East.

Time: 7795.8

We're done.

Time: 7797.15

We're done.

Time: 7797.67

We're not drilling on American land anymore.

Time: 7800.21

We're not drilling on foreign land.

Time: 7801.62

Like, we have no military entanglements in those places. We're not despoiling Alaska.

Time: 7806.17

We're not, nothing.

Time: 7807.08

No offshore rigs, no nothing.

Time: 7808.21

We're done.

Time: 7809.38

And basically just you build state of the art plants, engineered properly,

Time: 7811.93

you have them just completely contained.

Time: 7813.26

When there's nuclear waste, you just entomb the waste in concrete.

Time: 7816.55

So it just sits there forever.

Time: 7819.57

It's just a very small footprint kind of thing.

Time: 7822.25

And you're just done.

Time: 7823.629

And so to me, it's like scientifically, technologically, this is just like

Time: 7827.33

the most obvious thing in the world.

Time: 7829.09

It's a massive tell on the part of the people who claim to be pro-environment

Time: 7832.05

that they're not in favor of this.

Time: 7833.65

Andrew Huberman: And if I were to, say, tweet that I'm pro nuclear power

Time: 7837.74

because it's the more sustainable form of power, if I hypothetically did that

Time: 7841.29

today, what would happen to me in this...

Time: 7844.474

Marc Andreessen: You'd be a cryptofascist. [LAUGHS] Dirty,

Time: 7848.83

evil, capitalist monster.

Time: 7850.38

How dare you?

Time: 7851.32

Andrew Huberman: I'm unlikely to run that experiment.

Time: 7852.91

I was just curious.

Time: 7853.486

That was what we call a Gedanken experiment.

Time: 7855.29

Marc Andreessen: Andrew, you're a terrible human being.

Time: 7858.889

We were looking for evidence that you're a terrible human being, and now we know it.

Time: 7862.11

This is a great example of... I gave Andrew a book on the way in

Time: 7865.79

here, my favorite new book.

Time: 7867.18

The title of it is When Reason Goes on Holiday, and this is a great example

Time: 7870.82

of it: the people who simultaneously say they're environmentalists and

Time: 7874.89

say they're anti-nuclear power.

Time: 7875.93

Like the positions just simply don't reconcile.

Time: 7878.29

But that doesn't bother them at all.

Time: 7880.58

So be clear.

Time: 7881.5

I predict none of this will happen.

Time: 7884.029

Andrew Huberman: Amazing.

Time: 7884.86

I need to learn more about nuclear power.

Time: 7887.09

Marc Andreessen: Long coal.

Time: 7888.33

Andrew Huberman: Long coal.

Time: 7888.96

Marc Andreessen: Long coal. Invest in coal.

Time: 7890.98

Andrew Huberman: Because you think we're just going to revert?

Time: 7892.28

Marc Andreessen: It's the energy source of the future.

Time: 7895.37

Well, because it can't be solar and wind, because they're not reliable.

Time: 7897.559

So you need something.

Time: 7899.309

If it's not nuclear, it's going to be either like oil, natural gas, or coal.

Time: 7902.48

Andrew Huberman: And you're unwilling to say bet on nuclear because you don't

Time: 7905.4

think that the sociopolitical elitist trends that are driving against nuclear

Time: 7911.84

are likely to dissipate anytime soon.

Time: 7913.53

Marc Andreessen: Not a chance.

Time: 7914.23

I can't imagine it. It would be great if they did.

Time: 7916.7

But the powers that be are very locked in on this as a position.

Time: 7922.48

And look, they've been saying this for 50 years, and so they'd have

Time: 7924.57

to reverse themselves off of a bad position they've had for 50 years.

Time: 7927.15

And people really don't like to do that.

Time: 7930.12

Andrew Huberman: One thing that's good about this and other podcasts

Time: 7932.129

is that young people listen and they eventually will take over.

Time: 7935.41

Marc Andreessen: And by the way, I will say also there are nuclear entrepreneurs.

Time: 7938.09

So on the point of young kids, there are a bunch of young entrepreneurs who are

Time: 7942.33

basically not taking no for an answer.

Time: 7944.09

And they're trying to develop, in particular, there's people trying

Time: 7946.27

to develop new, very small form factor nuclear power plants with

Time: 7951.16

a variety of possible use cases.

Time: 7952.4

So, look, maybe they show up with a better mousetrap and people

Time: 7956.77

take a second look, but we'll see.

Time: 7959.17

Andrew Huberman: Just rename it.

Time: 7961.15

So, my understanding is that you think we should go all in on

Time: 7966.12

AI with the constraints that we discover we need in order to rein

Time: 7971.92

in safety and things of that sort.

Time: 7973.129

Not unlike social media, not unlike the Internet.

Time: 7976.43

Marc Andreessen: Not unlike what we should have done with nuclear power.

Time: 7980.68

Andrew Huberman: And in terms of the near infinite number of ways that AI can be

Time: 7985.84

envisioned to harm us, how do you think we should cope with that psychologically?

Time: 7990.549

Because I can imagine a lot of people listening to this conversation are

Time: 7993.42

thinking, okay, that all sounds great, but there are just too many

Time: 7997.03

what ifs that are terrible, right?

Time: 7999.44

What if the machines take over?

Time: 8000.79

What if the silly example I gave earlier, but what if one day I

Time: 8004.85

could log into my hard earned bank account and it's all gone?

Time: 8009.27

The AI version of myself ran off with someone else, and with all my money, my

Time: 8016.44

AI coach abandoned me for somebody else.

Time: 8019.44

After it learned all the stuff that I taught it.

Time: 8022.3

It took off with somebody else stranded.

Time: 8025.36

And it has my bank account numbers, like this kind of thing.

Time: 8029.209

Marc Andreessen: You could really make this scenario horrible,

Time: 8030.88

right, if you kept going?

Time: 8031.84

Andrew Huberman: Yeah, well, we can throw in a benevolent example as well to

Time: 8037.03

counter it, but it's kind of fun to think about where the human mind goes, right?

Time: 8041.53

Marc Andreessen: Yeah.

Time: 8041.95

So first I say we've got to separate the real problems from the fake problems.

Time: 8044.75

And so there's a lot.

Time: 8045.54

A lot of the science fiction scenarios I think are just not real.

Time: 8047.87

And the ones that you described as an example, like...

Time: 8049.93

That's not what is going to happen.

Time: 8051

And I can explain why that's not what's going to happen.

Time: 8051.651

There's a set of fake ones, and the fake ones are the ones that just

Time: 8056.25

aren't, I think, technologically grounded, that aren't rational.

Time: 8059.13

It's the AI is going to wake up and decide to kill us all.

Time: 8061.389

It's going to develop the kind of agency where it's going to steal our money and

Time: 8065.6

our spouse and everything else, our kids.

Time: 8068.19

That's not how it works.

Time: 8070.37

And then there's also all these concerns, destruction of society concerns.

Time: 8073.809

And this is misinformation, hate speech, deepfakes, like all that stuff, which I

Time: 8077.92

don't think is actually a real problem.

Time: 8080.51

And then people have a bunch of economic concerns around what's going to take all

Time: 8084.16

the jobs and all those kinds of things.

Time: 8086.25

We could talk about that.

Time: 8087.04

I don't think that's actually the thing that happens.

Time: 8090.55

But then there are two actual real concerns that I actually

Time: 8093.02

do very much agree with.

Time: 8094

And one of them is what you said, which is bad people doing bad things.

Time: 8097.559

And there's a whole set of things to be done inside there.

Time: 8101.05

The big one is we should use AI to build defenses against

Time: 8103.92

all the bad things, right?

Time: 8105.56

And so, for example, there's a concern AI is going to make it easier

Time: 8108.48

for bad people to build pathogens, design pathogens in labs, which bad

Time: 8112.34

scientists can do today, but this is going to make it easier to do.

Time: 8115.389

Well, obviously, we should have the equivalent of an Operation Warp Speed

Time: 8118.5

operating in perpetuity anyway.

Time: 8120.99

But then we should use AI to build much better bio defenses.

Time: 8124.31

And we should be using AI today to design, like, for example, full spectrum vaccines

Time: 8127.68

against every possible form of pathogen.

Time: 8130.49

So, defensive mechanisms for hacking: you can use AI to build

Time: 8133.74

better defense tools, right?

Time: 8135.069

And so you should have a whole new kind of security suite wrapped around

Time: 8137.87

you, wrapped around your data, wrapped around your money, where you're having

Time: 8141.2

AI repel attacks, disinformation, hate speech, deepfakes, all that stuff.

Time: 8146.02

You should have an AI filter when you use the Internet, where you shouldn't

Time: 8149.97

have to figure out whether it's really me or whether it's a made up thing.

Time: 8152.99

You should have an AI assistant that's doing that for you.

Time: 8155.12

Andrew Huberman: Oh, yeah.

Time: 8155.42

I mean, these little banners and cloaks that you see on social media like

Time: 8158.77

"this has been deemed misinformation."

Time: 8161.5

If you're me, you always click because you're like, what's behind the scrim?

Time: 8167.57

I don't always look at the "this image is gruesome" type thing.

Time: 8171.37

Sometimes I just pass on that.

Time: 8173.19

But if it's something that seems debatable, of course you look.

Time: 8177.58

Marc Andreessen: And you should have an AI assistant with you

Time: 8179.53

when you're on the Internet.

Time: 8180.12

And you should be able to tell that AI assistant what you want, right?

Time: 8182.75

So, yes, I want the full experience.

Time: 8185.1

Show me everything.

Time: 8186.299

I want it from a particular point of view.

Time: 8187.879

And I don't want to hear from these other people who I don't like, by the way.

Time: 8191.049

It's going to be, my eight year old is using this.

Time: 8192.44

I don't want anything that's going to cause a problem.

Time: 8194.309

And I want everything filtered and AI based filters like that that you

Time: 8197.78

program and control are going to work much better and be much more honest

Time: 8201.41

and straightforward and clear and so forth than what we have today.
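
As a rough sketch of the kind of user-programmed AI filter described here, the snippet below has a model classify each incoming item against a policy the user writes in plain language. The model name, the policy text, and the OpenAI client usage are illustrative assumptions, not a production moderation system.

```python
# Minimal sketch of a user-programmable AI filter: the user writes their own
# policy in plain language, and a model checks each incoming item against it
# before it is shown.
from openai import OpenAI

client = OpenAI()

USER_POLICY = (
    "Hide obvious scams, impersonation, and graphic violence. "
    "Do NOT hide political opinions I might disagree with."
)

def allow_item(text: str) -> bool:
    """Return True if the item passes the user's own filter policy."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a personal content filter. Apply ONLY the user's policy. "
                    "Answer with a single word: ALLOW or HIDE.\n\nPolicy:\n" + USER_POLICY
                ),
            },
            {"role": "user", "content": text},
        ],
        temperature=0,
    )
    return response.choices[0].message.content.strip().upper().startswith("ALLOW")

if __name__ == "__main__":
    print(allow_item("Congratulations!!! Send 0.1 BTC to claim your prize."))  # likely False
```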

Time: 8204.469

Anyway, basically, what I want people to do is think, every time you think

Time: 8207

of a risk of how it can be used, just think of like, okay, we can

Time: 8209.4

use it to build a countermeasure.

Time: 8210.98

And the great thing about the countermeasures is they

Time: 8212.72

can not only offset AI risks, they can offset other risks.

Time: 8215.45

Right?

Time: 8215.74

Because we already live in a world where pathogens are a problem, right?

Time: 8219.41

We ought to have better vaccines anyway.

Time: 8221.929

We already live in a world where there's cyber hacking and cyber terrorism.

Time: 8224.329

We already live in a world where there's bad content on the Internet.

Time: 8226.559

And we have the ability now to build much better AI powered tools

Time: 8229.61

to deal with all those things.

Time: 8232.1

Andrew Huberman: I also love the idea of the AI physicians.

Time: 8236.28

Getting decent health care in this country is so difficult, even for

Time: 8240.36

people who have means or insurance.

Time: 8242.15

I mean, the number of phone calls and waits that you have to go through to get a

Time: 8245.59

referral to see a specialist, it's absurd.

Time: 8248.83

The process is absurd.

Time: 8250.97

I mean, it makes one partially or frankly ill just to go through the

Time: 8255.01

process of having to do all that.

Time: 8256.599

I don't know how anyone does it.

Time: 8258.509

And granted, I don't have the highest degree of patience, but I'm pretty

Time: 8262.309

patient, and it drives me insane to even just get remedial care.

Time: 8268.629

So I can think of a lot of benevolent uses of AI.

Time: 8271.943

And I'm grateful that you're bringing this up here and that you've

Time: 8275.5

tweeted about it in that thread.

Time: 8276.849

Again, we'll refer people to that.

Time: 8278.48

And that you're thinking about this.

Time: 8279.709

I have to imagine that in your role as investor nowadays, that

Time: 8283.24

you're also thinking about AI quite often in terms of all these roles.

Time: 8288.09

And so does that mean that there are a lot of young people who are really

Time: 8293.099

bullish on AI and are going for it?

Time: 8295.23

Marc Andreessen: Yeah. Okay.

Time: 8295.87

Andrew Huberman: This is here to stay.

Time: 8297.199

Marc Andreessen: Okay.

Time: 8299.32

Andrew Huberman: Unlike CRISPR, which is sort of in this liminal place where

Time: 8301.77

biotech companies aren't sure if they should invest or not in CRISPR because

Time: 8307.08

it's unclear whether or not the governing bodies are going to allow gene editing,

Time: 8310.599

just like it was unclear 15 years ago if they were going to allow gene therapy.

Time: 8313.95

But now we know they do allow gene therapy and immunotherapy.

Time: 8317.33

Marc Andreessen: Okay, so there is a fight.

Time: 8318.32

Having said that, there is a fight.

Time: 8319.35

There's a fight happening in Washington right now over exactly

Time: 8322.25

what should be legal or not legal.

Time: 8323.91

And there's quite a bit of risk, I think, attached to that fight right

Time: 8325.94

now because there are some people in there that are telling a very effective

Time: 8328.93

story to try to get people to either outlaw AI or specifically limit it to

Time: 8332.48

a small number of big companies, which I think is potentially disastrous.

Time: 8337.16

By the way, the EU also is, like, super negative.

Time: 8339.929

The EU has turned super negative on basically all new technology, so they're

Time: 8342.53

moving to try to outlaw AI. They flat out don't want it.

Time: 8347.349

Andrew Huberman: But that's like saying you're going to outlaw the Internet.

Time: 8349.4

I don't see how you can stop this train.

Time: 8350.959

Marc Andreessen: And frankly, they're not a big fan of the Internet either.

Time: 8352.65

So I think they regret it. The EU, especially the EU bureaucrats, the people

Time: 8358.37

who run the EU in Brussels, have a very negative view on a lot of modernity.

Time: 8364.309

Andrew Huberman: But what I'm hearing calls to mind things that

Time: 8367.259

I've heard people like David Goggins say, which is, you know, there's

Time: 8369.869

so many lazy, undisciplined people out there that nowadays it's easier

Time: 8373.92

and easier to become exceptional.

Time: 8375.35

I've heard him say something to that extent.

Time: 8376.64

It almost sounds like there's so many countries that are just backing off of

Time: 8380.599

particular technologies because it just sounds bad from the PR perspective that

Time: 8387.18

it's creating great, kind of, low-hanging-fruit opportunities for people to barge

Time: 8390.87

forward and countries to barge forward.

Time: 8392.49

If they're willing to embrace this stuff.

Time: 8394.15

Marc Andreessen: It is, but number one, you have to have a

Time: 8395.63

country that wants to do that.

Time: 8398.28

Those exist, and there are countries like that.

Time: 8400.38

And then the other is, look, they need to be able to withstand the

Time: 8402.8

attack from stronger countries that don't want them to do it, right?

Time: 8406.91

So the EU, the EU has nominal control over whatever it is, 27

Time: 8411.31

or whatever member countries.

Time: 8412.71

So even if you're like, whatever the Germans get all fired up about,

Time: 8415.46

whatever, Brussels can still, in a lot of cases, just like flat out, basically

Time: 8418.323

control them and tell them not to do it.

Time: 8420.03

And then the US, you know, we have a lot of control over a lot of the world.

Time: 8424.33

Andrew Huberman: But it sounds like we sit somewhere sort of in between.

Time: 8426.8

Like right now, people are developing AI technologies in US companies, right?

Time: 8432.5

So it is happening.

Time: 8433.52

Marc Andreessen: Yeah, today it's happening.

Time: 8434.73

But like I said, there's a set of people who are very focused in Washington

Time: 8437.67

right now about trying to either ban it outright or trying to, as I said, limit

Time: 8441.93

it to a small number of big companies.

Time: 8444.39

And then, look, China's got a whole, the other part of this is China's got a whole

Time: 8447.6

different kind of take on this than we do.

Time: 8449.34

And so they're, of course, going to allow it for sure, but they're

Time: 8451.7

going to allow it in the ways that their system wants it to happen.

Time: 8455.53

Right.

Time: 8455.7

Which is much more for population control and to implement authoritarianism.

Time: 8460.459

And then, of course, they are going to spread their technology

Time: 8463.21

and their vision of how society should run across the world.

Time: 8466.37

So we're back in a Cold War dynamic like we were with the Soviet Union,

Time: 8469.36

where there are two different systems that have fundamentally different views

Time: 8472.3

on issues, concepts like freedom and individual choice and freedom of speech.

Time: 8475.98

And so, you know, we know where the Chinese stand.

Time: 8479.18

We're still figuring out where we stand.

Time: 8480.72

I'm having specifically a lot of schizophrenic conversations with

Time: 8485.4

people in DC right now, where if I talk to them and China doesn't

Time: 8488.49

come up, they just hate tech.

Time: 8490.389

They hate American tech companies, they hate AI, they hate social media,

Time: 8493.54

they hate this, they hate that, they hate crypto, they hate everything,

Time: 8495.629

and they just want to punish and ban, and they're just very negative.

Time: 8499.86

But then if we have a conversation a half hour later and we talk about China, then

Time: 8503.37

the conversation is totally different.

Time: 8504.62

Now we need a partnership between the US government and American

Time: 8506.93

tech companies to defeat China.

Time: 8508.959

It's like the exact opposite discussion.

Time: 8511.25

Right?

Time: 8511.4

Andrew Huberman: Is that fear or competitiveness on China specifically

Time: 8514.049

in terms of the US response? You know, you bring up these technologies,

Time: 8520.68

and I'll lump CRISPR in there: things like CRISPR, nuclear power, AI.

Time: 8524.34

It all sounds very cold, very dystopian to a lot of people.

Time: 8527.969

And yet there are all these benevolent uses as we've been talking about.

Time: 8532.549

And then you say you raise the issue of China and then it sounds

Time: 8535.23

like this big dark cloud emerging.

Time: 8537.52

And then all of a sudden, we need to galvanize and develop these

Time: 8542

technologies to counter their effort.

Time: 8543.84

So is it fear of them or is it competitiveness or both?

Time: 8548.15

Marc Andreessen: Well, so without them in the picture, you just have this.

Time: 8551.59

Basically, there's an old Bedouin saying: me against my brother, me and my

Time: 8556.19

brother against my cousin, me and my brother and my cousin against the world.

Time: 8560.2

It's actually evolution in action, the way I'd think about it,

Time: 8564.43

which is, if there's no external threat, then the conflict turns inward, and then

Time: 8568.14

at that point, there's a big fight between, specifically, tech and,

Time: 8571.78

I would just say, politics generally.

Time: 8573.21

And my interpretation of that fight is it's a fight for status.

Time: 8576.08

It's fundamentally a fight for status and for power, which is like, if you're

Time: 8579.2

in politics, you like the status quo of how power and status work in our society.

Time: 8583.77

You don't want these new technologies to show up and change things,

Time: 8586.37

because change is bad, right?

Time: 8588.23

Change threatens your position.

Time: 8589.68

It threatens the respect that people have for you and your control over things.

Time: 8593.24

And so I think it's primarily a status fight, which we could talk about.

Time: 8597.52

But the China thing is just like a straight up geopolitical us versus them.

Time: 8601.509

Like I said, it's like a Cold War scenario.

Time: 8603.1

And look, 20 years ago, the prevailing view in Washington was, we need

Time: 8606.61

to be friends with China, right?

Time: 8607.8

And we're going to be trading partners with China.

Time: 8609.3

And yes, they're a totalitarian dictatorship, but if we trade with them,

Time: 8612.32

over time, they'll become more democratic.

Time: 8614.2

In the last five to ten years, it's become more and more clear

Time: 8617.24

that that's just not true.

Time: 8619.37

And now there's a lot of people in both political parties in DC who very much

Time: 8623.23

regret that and want to change to much more of a sort of a Cold War footing.

Time: 8627.18

Andrew Huberman: Are you willing to comment on TikTok and technologies

Time: 8630.57

that emerge from China that are in widespread use within the US, like how

Time: 8634.7

much you trust them or don't trust them?

Time: 8636.32

I can go on record myself by saying that early on, when TikTok was released,

Time: 8641.62

we were told, as Stanford faculty, that we should not and could not have

Time: 8645.66

TikTok accounts nor WeChat accounts.

Time: 8649.429

Marc Andreessen: So to start with, there are a lot of really bright

Time: 8652

Chinese tech entrepreneurs and engineers who are trying to do good things.

Time: 8654.95

I'm totally positive about that.

Time: 8656.77

So I think many of the people mean very well, but the Chinese have

Time: 8661.03

a specific system, and the system is very clear and unambiguous.

Time: 8665.52

And the system is, everything in China is owned by the party.

Time: 8668.329

It's not even owned by the state.

Time: 8669.963

It's owned by the party. It's owned by the Chinese Communist Party.

Time: 8671.21

So the Chinese Communist Party owns everything, and they control everything.

Time: 8673.92

By the way, it's actually illegal to this day.

Time: 8675.549

It's illegal for a foreign investor to buy equity in a Chinese company.

Time: 8679.07

There's all these basically legal machinations that people do to try

Time: 8682.45

to do something that's like the economic equivalent to that, but it's

Time: 8684.54

actually still illegal to do that.

Time: 8687.28

The Chinese Communist Party has no intention of letting

Time: 8689.469

foreigners own any of China.

Time: 8691.13

Like, zero intention of that.

Time: 8692.549

And they regularly move to make sure that that doesn't happen.

Time: 8696.45

So they own everything.

Time: 8697.28

They control everything.

Time: 8698

Andrew Huberman: Sorry to interrupt you, but people in China can invest

Time: 8701.29

in American companies all the time.

Time: 8702.88

Marc Andreessen: Well, they can, subject to US government constraints.

Time: 8705.389

There is a US government system that attempts to mediate that called

Time: 8709.72

CFIUS, and there are more and more limitations being put on that.

Time: 8713.3

But if you can get through that approval process, then legally you

Time: 8716.25

can do that, whereas the same is not true with respect to China.

Time: 8719.67

So they just have a system.

Time: 8723.48

And so if you're the CEO of a Chinese company, it's not optional.

Time: 8727.28

If you're the CEO of ByteDance, CEO of Tencent, your relationship

Time: 8730.71

with the Chinese Communist Party is not optional, it's required.

Time: 8733.93

And what's required is you are a unit of the party and you and your

Time: 8737.4

company do what the party says.

Time: 8738.75

And when the party says we get full access to all user data in America, you say yes.

Time: 8743.16

When the party says you change the algorithm to optimize to a certain

Time: 8746.06

social result, you say yes.

Time: 8749.9

It's whatever Xi Jinping and his party cadres decide, and

Time: 8753.89

that's what gets implemented.

Time: 8755.66

If you're the CEO of a Chinese tech company, there is a political

Time: 8758.31

officer assigned to you who has an office down the hall.

Time: 8761.74

And at any given time, he can come down the hall, he can grab you out of

Time: 8764.789

your staff meeting or board meeting, and he can take you down the hall

Time: 8766.749

and he can make you sit for hours and study Marxism and Xi Jinping thought

Time: 8770.18

and quiz you on it and test you on it, and you'd better pass the test, right?

Time: 8774.38

So it's like a straight political control thing.

Time: 8777.28

And then, by the way, if you get crossways with them, like...

Time: 8782.379

Andrew Huberman: So when we see tech founders getting called up

Time: 8785.94

to Congress for what looks like interrogation, but it's probably

Time: 8790.61

pretty light interrogation compared to what happens in other countries.

Time: 8793.58

Marc Andreessen: Yeah, it's state power.

Time: 8795.73

They just have this view of top-down state power, and they view that it's

Time: 8798.66

their system, and they view that it's necessary for lots of historical

Time: 8801.15

and moral reasons that they've defined, and that's how they run.

Time: 8803.09

And then they've got a view that says how they want to propagate

Time: 8805.18

that vision outside the country.

Time: 8806.7

And they have these programs like Belt and Road that basically are intended to

Time: 8810.21

propagate kind of their vision worldwide.

Time: 8812.9

And so they are who they are.

Time: 8815.58

I will say that they don't lie about it.

Time: 8817.679

They're very straightforward.

Time: 8819.059

They give speeches, they write books.

Time: 8820.24

You can buy Xi Jinping speeches.

Time: 8821.48

He goes through the whole thing.

Time: 8822.359

They have their tech 2025 plan.

Time: 8824.07

This is like ten years ago.

Time: 8825.72

Their whole AI agenda, it's all in there.

Time: 8827.81

Andrew Huberman: And is their goal that in 200 years, 300 years, that China is

Time: 8832.04

the superpower controlling everything?

Time: 8834.79

Marc Andreessen: Yeah.

Time: 8835.03

Or 20 years, 30 years, or two years, three years.

Time: 8837.16

Andrew Huberman: Yeah, but they've got a shorter horizon.

Time: 8838.359

Marc Andreessen: I don't know.

Time: 8841.859

Everybody's a little bit like this, I guess, but, yeah, they want to win.

Time: 8845.93

Andrew Huberman: Well, the CRISPR in humans example that I gave earlier

Time: 8848.94

was interesting to me because, first of all, I'm a neuroscientist and

Time: 8851.889

they could have edited any genes, but they chose to edit the genes

Time: 8856.78

involved in the attempt to create super memory babies, which presumably

Time: 8861.76

would grow into super memory adults.

Time: 8865.23

And whether or not they succeeded in that isn't clear.

Time: 8868.19

Those babies are alive and presumably by now, walking, talking.

Time: 8872.77

As far as I know, whether or not they have super memories isn't clear.

Time: 8875.56

But China is clearly unafraid to augment biology in that way.

Time: 8883.309

And I believe that that's inevitable, that's going to happen elsewhere, probably

Time: 8890.26

first for the treatment of disease.

Time: 8891.869

But at some point, I'm assuming people are going to augment biology to make

Time: 8895.64

smarter kids, not always, but often will select mates based on the traits they

Time: 8901.34

would like their children to inherit.

Time: 8903.4

So this happens far too frequently for it to be deemed bad.

Time: 8907.809

Either that or people are bad, because people do this all the time,

Time: 8910.279

selecting mates that have physical and psychological and cognitive traits that

Time: 8914.41

they would like their offspring to have.

Time: 8916.75

CRISPR is a more targeted approach.

Time: 8918.12

Of course, the reason I'm kind of giving this example and examples

Time: 8921.759

like it is that I feel like so much of the way that governments and

Time: 8926.799

the public react to technologies is to just take that first glimpse.

Time: 8931.38

And it just feels scary.

Time: 8933.26

You think about the old Apple ad, the 1984 ad.

Time: 8938.05

I mean, there was one very scary version of the personal computer

Time: 8941.08

and computers and robots taking over and everyone like automatons.

Time: 8944.5

And then there was the Apple version where it's all about creativity,

Time: 8947.55

love and peace, and it had the pseudo psychedelic California thing going for it.

Time: 8952.37

Again, great marketing seems to convert people's thinking about technology

Time: 8958.67

such that what was once viewed as very scary and dangerous and dystopian

Time: 8964.549

is like an oasis of opportunity.

Time: 8966.93

So why are people so afraid of new technologies?

Time: 8970.15

Marc Andreessen: So this is the thing I've tried to understand for a

Time: 8972.23

long time, because the history is so clear and the history basically is

Time: 8976.554

that every new technology is greeted by what's called a moral panic.

Time: 8979.91

And so it's basically this hysterical freak out of some kind that causes people

Time: 8984.16

to basically predict the end of the world.

Time: 8985.48

And you go back in time, and actually, this is a historical sort of effect,

Time: 8989.26

it happens even in things now where you just look back and it's ludicrous.

Time: 8991.79

And so you mentioned earlier the satanic panic, the concern

Time: 8995.66

around, like, heavy metal music.

Time: 8997.66

Before that, there was, like, a freak out around comic books

Time: 9000.05

in the 50s. There was a freak out around jazz music in the

Time: 9002.91

20s and 30s: it's devil music.

Time: 9005.9

The arrival of bicycles caused a moral panic

Time: 9008.63

in the, like, 1860s, 1870s.

Time: 9009.97

Bicycles?

Time: 9010.59

Bicycles, yeah.

Time: 9011.25

So there was this thing at the time.

Time: 9012.62

So bicycles were the first.

Time: 9015.099

They were the first very easy to use personal transportation thing that

Time: 9017.42

basically let kids travel between towns quickly without any overhead.

Time: 9022.48

With a horse, you have to take care of it.

Time: 9024.196

You just jump on a bike and go.

Time: 9025.88

And so there was a historical panic at the time, specifically around

Time: 9028.85

young women who for the first time, were able to venture outside the

Time: 9032.759

confines of the town to maybe go have a boyfriend in another town.

Time: 9036.51

And so the magazines at the time ran all these stories on this phenomenon,

Time: 9039.56

this medical phenomenon, called bicycle face.

Time: 9041.9

And the idea of bicycle face was the exertion caused by pedaling

Time: 9045.07

a bicycle would cause your face to grimace,

Time: 9046.46

and then if you were on the bicycle for too

Time: 9048.91

long, your face would lock into place.

Time: 9053.94

Andrew Huberman: [LAUGHS] Sorry.

Time: 9054.11

Marc Andreessen: And then you would be unattractive, and therefore, of

Time: 9055.99

course, unable to then get married.

Time: 9059.11

Cars, there was a moral panic around them: red flag laws.

Time: 9062.75

There were all these laws that greeted the automobile.

Time: 9064.59

Automobiles freaked people out.

Time: 9066.21

So there were all these laws in the early days of the automobile. In a

Time: 9069.02

lot of places, you would take a ride in an automobile and automobiles,

Time: 9073.7

they broke down all the time.

Time: 9074.55

So only rich people had automobiles.

Time: 9076.54

It'd be you and your mechanic in the car.

Time: 9078.92

Right, for when it broke down.

Time: 9080.599

And then you had to hire another guy to walk 200 yards in front of the car with a

Time: 9085.09

red flag, and he had to wave the red flag.

Time: 9088.02

And so you could only drive as fast as he could walk because the red flag was

Time: 9090.64

to warn people that the car was coming.

Time: 9094.26

I think it was Pennsylvania.

Time: 9095.23

They had the most draconian version, which was they were very worried

Time: 9098.48

about the car scaring the horses.

Time: 9100.09

And so there was a law that said if you saw a horse coming,

Time: 9103.719

you needed to stop the car.

Time: 9105.009

You had to disassemble the car, and you had to hide the pieces of the

Time: 9108.199

car behind the nearest hay bale, wait for the horse to go by, and then you

Time: 9112.58

could put your car back together.

Time: 9114.76

Anyways, an example is electric lighting.

Time: 9116.809

There was a panic around, like, whether this was going to bring complete ruin.

Time: 9118.425

This is going to completely ruin the romance of the dark.

Time: 9121.71

And it was going to cause a whole new kind of terrible civilization where

Time: 9125.119

everything is always brightly lit.

Time: 9126.66

So there's just all these examples.

Time: 9128.46

And so it's like, okay, what on earth is happening?

Time: 9130.24

That this is always what happens?

Time: 9132.42

And so I finally found this book that I think has a good model for it.

Time: 9135.96

The book is called Men, Machines, and Modern Times. And it's written by

Time: 9138.31

this MIT professor, like, 60 years ago.

Time: 9140.33

So it predates the Internet, but it uses a lot of historical examples.

Time: 9145.059

And what he says, basically, is, he says there's actually a three stage response.

Time: 9148.2

There's a three stage societal response to new technologies.

Time: 9150.56

It's very predictable.

Time: 9151.51

He said, stage one is basically just denial.

Time: 9154.69

Just ignore.

Time: 9155.389

Like, we just don't pay attention to this.

Time: 9156.9

Nobody takes it seriously.

Time: 9157.92

There's just a blackout on the whole topic.

Time: 9160.789

He says, that's stage one.

Time: 9162.35

Stage two is rational counterargument.

Time: 9165.21

So stage two is where you line up all the different reasons

Time: 9167.47

why this can't possibly work.

Time: 9168.6

It can't possibly ever get cheap, or it's not fast

Time: 9172.469

enough, or whatever the thing is.

Time: 9174.36

And then he says, stage three, he says, is when the name calling begins.

Time: 9177.54

So he says, stage three is like when they fail to ignore it and they've

Time: 9183.48

failed to argue society out of it.

Time: 9185.1

Andrew Huberman: I love it.

Time: 9185.8

Marc Andreessen: They move to the name calling.

Time: 9187.1

And what's the name calling?

Time: 9188.11

The name calling is, this is evil.

Time: 9189.41

This is moral panic.

Time: 9190.13

This is evil.

Time: 9190.679

This is terrible.

Time: 9191.26

This is awful.

Time: 9191.8

This is going to destroy everything.

Time: 9193.259

Don't you understand?

Time: 9194.599

All this is horrifying.

Time: 9197.26

And you, the person working on it, are being reckless and evil and all

Time: 9200.43

this stuff, and you must be stopped.

Time: 9202.15

And he said the reason for that is because, basically,

Time: 9204.7

fundamentally, what these things are is they're a war over status.

Time: 9208.179

It's a war over status, and therefore a war over power.

Time: 9210.73

And then, of course, ultimately money.

Time: 9212.679

But human status is the thing, because what he says is, what is the

Time: 9217.58

societal impact of a new technology?

Time: 9219.53

The societal impact of a new technology is it reorders status in the society.

Time: 9223.51

So the people who are specialists in that technology become high status, and the

Time: 9227.28

people who are specialists in the previous way of doing things become low status.

Time: 9230.62

And generally, people don't adapt.

Time: 9233.049

Generally, if you're the kind of person who is high status because

Time: 9235.97

you're highly adapted to an existing technology, you're probably

Time: 9239.68

not the kind of person that's going to enthusiastically try to replant

Time: 9242.78

yourself onto a new technology.

Time: 9245.12

This is like every politician who's just like in a complete

Time: 9247.29

state of panic about social media.

Time: 9248.559

Like, why are they so freaked out about social media?

Time: 9250.09

It's because they all know that the whole nature of modern politics has changed.

Time: 9253.16

The entire battery of techniques that you use to get elected before

Time: 9255.82

social media are now obsolete.

Time: 9257.71

Obviously, the best new politicians of the future are going to be

Time: 9260.27

100% creations of social media.

Time: 9261.94

Andrew Huberman: And podcasts.

Time: 9262.58

Marc Andreessen: And podcasts.

Time: 9263.41

Andrew Huberman: And we're seeing this now as we head towards

Time: 9265.36

the next presidential election.

Time: 9266.58

That podcasts clearly are going to be featured very heavily in that next

Time: 9270.78

election, because long form content is a whole different landscape.

Time: 9277.68

Marc Andreessen: Rogan's had, like, what?

Time: 9278.51

He's had, like Bernie, he's had like Tulsi, he's had like a whole series.

Time: 9281.21

Andrew Huberman: And RFK most recently.

Time: 9282.639

And that's created a lot of controversy.

Time: 9284.09

Marc Andreessen: A lot of controversy.

Time: 9284.45

But also my understanding, I'm sure he's invited everybody.

Time: 9287.459

I'm sure he'd love to have Biden on.

Time: 9287.92

I'm sure he'd love to have Trump on.

Time: 9290.03

Andrew Huberman: I'm sure, you'd have to ask him.

Time: 9291.18

I mean, I think every podcaster has their own ethos around who

Time: 9296.37

they invite on and why and how.

Time: 9298.74

So I certainly can't speak for him, but I have to imagine that any

Time: 9303.6

opportunity to have true, long form discourse that would allow people to

Time: 9308.27

really understand people's positions on things, I have to imagine that he

Time: 9312.31

would be in favor of that sort of thing.

Time: 9314.12

Marc Andreessen: Yeah. Or somebody else would, right?

Time: 9315.849

Some other top podcaster would.

Time: 9319.95

Exactly.

Time: 9320.32

I totally agree with you.

Time: 9321.02

But my point is, if you're a politician, if you're a legacy

Time: 9323.879

politician, you have the option of embracing the new technology.

Time: 9327.55

You can do it anytime you want.

Time: 9328.93

Right.

Time: 9329.83

But you don't.

Time: 9331.15

They're not, they won't.

Time: 9333.01

They won't do it.

Time: 9333.95

And why won't they do it?

Time: 9334.75

Well, okay, first of all, they want to ignore it.

Time: 9336.59

They want to pretend that things aren't changing.

Time: 9338.53

Second is they want to have rational counterarguments for why the

Time: 9341.359

existing campaign system works the way that it does, and this and that

Time: 9343.465

and the existing media networks.

Time: 9344.79

And here's how you do things, and here's how you give speeches, and here's the

Time: 9347.62

clothes you wear and the tie and the thing and the pocket square, and all of that.

Time: 9351.07

That's how you succeeded, coming up through that system.

Time: 9352.99

So you've got all your arguments as to why that won't work anymore.

Time: 9355.4

And then we've now proceeded to the name calling phase,

Time: 9358.58

which is now it's evil, right?

Time: 9359.869

Now it's evil for somebody to show up on a stream, God forbid, for three

Time: 9364.65

hours and actually say what they think.

Time: 9366.7

It's going to destroy society, right?

Time: 9367.86

So it's exactly like, it's a classic example of this pattern.

Time: 9371.929

Anyway, so Morison says in the book, basically, this is the forever pattern.

Time: 9376.549

This will never change.

Time: 9377.789

This is one of those things where you can learn about it,

Time: 9381.05

the entire world could learn about this, and still nothing changes.

Time: 9383.8

Because at the end of the day, it's not the tech that's the question,

Time: 9387.17

it's the reordering of status.

Time: 9391.06

Andrew Huberman: I have a lot of thoughts about the podcast component.

Time: 9392.96

I'll just say this because I want to get back to the topic

Time: 9396.64

of innovation of technology.

Time: 9399.53

But on a long form podcast, there's no safe zone.

Time: 9405.3

The person can get up and walk out.

Time: 9407.12

But the person interviewing them, and certainly Joe is the best of the very

Time: 9412.34

best, if not the most skilled podcaster in the entire universe at continuing

Time: 9418.19

to press people on specific topics when they're trying to bob and weave and

Time: 9423.07

wriggle out, he'll just keep either drilling down or altering the question somewhat

Time: 9428.25

in a way that forces them to finally come up with an answer of some sort.

Time: 9432.07

And I think that probably puts certain people's cortisol levels

Time: 9436.8

through the roof, such that they just would never go on there.

Time: 9440.34

Marc Andreessen: I think there's another deeper question also, or another question

Time: 9442.429

along with that, which is how many people actually have something to say.

Time: 9447.69

Andrew Huberman: Real substance.

Time: 9448.42

Marc Andreessen: Right.

Time: 9448.85

Like how many people can actually talk in a way that's actually interesting

Time: 9451.69

to anybody else for any length of time.

Time: 9454.32

How much substance is there, really?

Time: 9455.47

And a lot of historical politics was being able to manufacture a facade where you

Time: 9459.77

honestly can't tell how deep the thoughts are, and even if they have

Time: 9464.24

deep thoughts, it's kept away from you.

Time: 9466.01

They would certainly never cop to it.

Time: 9467.92

Andrew Huberman: It's going to be an interesting next, what

Time: 9469.76

is it, about 20 months or so.

Time: 9473.05

Marc Andreessen: So panic and the name calling have already started?

Time: 9475.9

Andrew Huberman: Yeah, I was going to say this list of three things, denial,

Time: 9480.48

the counterargument, and name calling.

Time: 9482.21

It seems like with AI, it's already just jumped to numbers two and three.

Time: 9486.5

Marc Andreessen: Yes, correct.

Time: 9487.129

Andrew Huberman: We're already at two and three, and it's kind of leaning three.

Time: 9490.64

Marc Andreessen: That's correct.

Time: 9490.99

AI is unusual just because new technologies that take off, they

Time: 9496.24

almost always have a prehistory.

Time: 9497.49

They almost always have a 30 or 40 year history where people tried and failed to

Time: 9500.22

get them to work before they took off.

Time: 9501.75

AI has an 80 year prehistory, so it has a very long one.

Time: 9504.959

And then it all of a sudden started to work dramatically

Time: 9508.59

well, seemingly overnight.

Time: 9511.02

And so it went from basically as far as most people were concerned,

Time: 9514.11

it went from it doesn't work at all to it works incredibly well in one

Time: 9516.776

step, and that almost never happens.

Time: 9519.24

I actually think that's exactly what's happening.

Time: 9520.619

I think it's actually speed running this progression just because if you

Time: 9523.47

use Midjourney or you use GPT or any of these things for five minutes, you're

Time: 9526.96

just like, wow, obviously this thing is going to be like, obviously in my life,

Time: 9530.76

this is going to be the best thing ever.

Time: 9531.99

This is amazing.

Time: 9532.58

There's all these ways that I can use it.

Time: 9533.88

And then therefore, immediately you're like, oh my God, this is

Time: 9536.9

going to transform everything.

Time: 9537.82

Therefore, step three, straight to the name calling.

Time: 9543.98

Andrew Huberman: In the face of all this.

Time: 9545.4

There are innovators out there.

Time: 9547.8

Maybe they are aware they are innovators.

Time: 9550.86

Maybe they are already starting companies, or maybe they are just some

Time: 9555.33

young or older person who has these five traits in abundance or doesn't,

Time: 9561.1

but knows somebody who does and is partnering with them in some sort of idea.

Time: 9566.31

And you have an amazing track record at identifying these people.

Time: 9570.969

I think in part because you have those same traits yourself.

Time: 9575.32

I've heard you say the following: the world is a very malleable place.

Time: 9579.87

If you know what you want and you go for it with maximum energy and drive

Time: 9583.46

and passion, the world will often reconfigure itself around you much more

Time: 9587.46

quickly and easily than you would think.

Time: 9589.98

That's a remarkable quote because it says at least two things to me.

Time: 9595.439

One is that you have a very clear understanding of the inner

Time: 9599.14

workings of these great innovators.

Time: 9602.309

We talked a little bit about that earlier, these five traits, etc.,

Time: 9606.53

but that also you have an intense understanding of the world landscape.

Time: 9611.549

And the way that we've been talking about it for the last hour or

Time: 9613.66

so is that it is a really intense and kind of oppressive landscape.

Time: 9617.81

You've got countries and organizations and elites and journalists that are

Time: 9623.759

trying to, not necessarily trying, but are suppressing the innovation process.

Time: 9628.529

I mean, that's sort of the picture that I'm getting.

Time: 9630.27

So it's like we're trying to innovate inside of a vise that's

Time: 9633.79

getting progressively tighter.

Time: 9635.48

And yet this quote argues that it is the person, the boy or girl, man or

Time: 9642.57

woman, who says, well, you know what?

Time: 9645.139

That all might be true, but my view of the world is the way the world's

Time: 9649.59

going to bend, or I'm going to create a dent in that vise that allows

Time: 9653.39

me to exist the way that I want.

Time: 9655.12

Or you know what, I'm actually going to uncurl the vise in the other direction.

Time: 9658.74

And so I'm at once picking up a sort of pessimistic, glass half empty view of the

Time: 9666.27

world, as well as a glass half full view.

Time: 9669.95

So tell me about that.

Time: 9672.62

Could you tell us about that from the perspective of someone listening who is

Time: 9676.07

thinking, I've got an idea, and I know it's a really good one, because I just

Time: 9682.06

know. I might not have the confidence of extrinsic reward yet, but I just

Time: 9686.37

know there's a seed of something.

Time: 9688.86

What does it take to foster that?

Time: 9690.82

And how do we foster real innovation in the landscape that we're talking about?

Time: 9696.2

Marc Andreessen: Yeah, so part of it is, I think, one of the ways to square it is,

Time: 9698.92

I think you as the innovator need to be signed up to fight the fight, right?

Time: 9703.21

And again, this is where the fictional portrayals of startups, I think, take

Time: 9705.82

people off course, or even scientists or whatever, because when there's

Time: 9708.81

great success stories, they get kind of prettified after the fact and

Time: 9712.98

they get made to be cute and fun, and it's like, yeah, no, if you talk

Time: 9716.265

to anybody who actually did any of these, like, these things are always

Time: 9720.11

just like brutal exercises and just like sheer willpower and fighting

Time: 9723.44

forces that are trying to get you.

Time: 9723.513

So part of it is you have to be signed up for the fight.

Time: 9723.83

And this kind of goes to the conscientiousness

Time: 9730.54

thing we were talking about also.

Time: 9732.69

My partner, Ben, uses the term courage a lot, which is some combination of

Time: 9736.879

just stubbornness, but coupled with a willingness to take pain and not stop

Time: 9741.68

and have people think very bad things of you for a long time until it turns out

Time: 9746.04

you hopefully prove yourself correct.

Time: 9748.88

And so you have to be willing to do that.

Time: 9751.49

It's a contact sport.

Time: 9752.83

These aren't easy roads, right?

Time: 9754.109

It's a contact sport, so you have to be signed up for the fight.

Time: 9757.38

The advantage that you have as an innovator is that at the end of the

Time: 9762.48

day, the truth actually matters.

Time: 9764.82

And all the arguments in the world, the classic Victor Hugo quote is, "There's

Time: 9768.63

nothing more powerful in the world than an idea whose time has come."

Time: 9772.84

If it's real, right?

Time: 9774.63

And this is just pure substance, if the thing is real, if the idea is

Time: 9778.46

real, if it's a legitimately good scientific discovery about how

Time: 9782.92

nature works, if it's a new invention, if it's a new work of art, and if

Time: 9786.96

it's real, then you do, at the end of the day, you have that on your side.

Time: 9792.34

And all of the people who are fighting you and arguing with you and telling you

Time: 9795.15

no, they don't have that on their side.

Time: 9797.27

It's not that they're showing up with some other thing and they're like,

Time: 9800.5

my thing is better than your thing.

Time: 9801.75

That's not the main problem.

Time: 9803.4

The main problem is I have a thing.

Time: 9805.63

I'm convinced everybody else is telling me it's stupid, wrong, it should

Time: 9808.41

be illegal, whatever the thing is.

Time: 9810

But at the end of the day, I still have the thing, right?

Time: 9813.45

So at the end of the day, the truth really matters.

Time: 9816.06

The substance really matters if it's real.

Time: 9817.679

I'll give you an example.

Time: 9818.23

It's really hard historically to find an example of a new technology that came

Time: 9823.6

into the world that was then pulled back.

Time: 9827.32

Nuclear is maybe an example of that.

Time: 9829.35

But even still, there are still nuclear plants, like, running today.

Time: 9833.75

That still exists.

Time: 9835.31

I would say the same thing about science, or at least, let me ask you this.

Time: 9838.67

I don't know of any scientific discovery that was made, and then

Time: 9841.37

people like, I know there are areas of science that are not politically

Time: 9845.05

correct to talk about today, but every scientist knows the truth.

Time: 9849.83

The truth is still the truth.

Time: 9851.14

I mean, even the geneticists in the Soviet Union who were forced to buy in, like,

Time: 9854.79

knew the whole time that it was wrong.

Time: 9856.4

That I'm completely convinced of.

Time: 9858.12

Andrew Huberman: Yeah, they couldn't delude themselves, especially because

Time: 9860.59

the basic training that one gets in any field establishes some core truths upon

Time: 9864.32

which even the crazy ideas have to rest.

Time: 9866.54

And if they don't, as you pointed out, things fall to pieces.

Time: 9869.99

I would say that even the technologies that did not pan out and in some cases

Time: 9874.66

were disastrous, but that were great ideas at the beginning, are starting to pan out.

Time: 9882.219

So the example I'll give is that most people are aware of the Elizabeth Holmes

Time: 9885.73

Theranos debacle, to put it lightly, analyzing what's in a single drop of

Time: 9892.74

blood as a way to analyze hormones and disease and antibodies, etc.

Time: 9896.5

I mean, that's a great idea, it's a terrific idea.

Time: 9900.64

As opposed to having a phlebotomist come to your house or you have to

Time: 9902.9

go in and you get tapped and then pulling vials and the whole thing.

Time: 9907.15

There's now a company born out of Stanford that is doing exactly what she

Time: 9912.52

sought to do, except that at least the courts ruled that she fudged the thing,

Time: 9917.569

and that's why she's in jail right now.

Time: 9920

But the idea of getting a wide array of markers from a single drop of blood

Time: 9926.16

is an absolutely spectacular idea.

Time: 9928.17

The biggest challenge that company is going to confront is the idea

Time: 9930.84

that it's just the next Theranos.

Time: 9932.179

But if they've got the thing and they're not fudging it, as apparently

Time: 9936.129

Theranos was, I think everything will work out à la Victor Hugo.

Time: 9942.91

Marc Andreessen: Yeah, exactly.

Time: 9944.26

Because who wants to go back if they get it to work, if it's real?

Time: 9950.18

This is the thing.

Time: 9950.94

The opponents, they're not bringing their own ideas.

Time: 9955.73

They're not bringing their, oh, my idea is better than yours.

Time: 9957.8

That's not what's happening.

Time: 9958.91

They're bringing the silence or counterargument or name calling.

Time: 9964.88

Andrew Huberman: Well, this is why I think people who need to be loved probably

Time: 9968.94

stand a reduced chance of success.

Time: 9972.59

And maybe that's also why having people close to you that do

Time: 9975.38

love you and allowing that to be sufficient can be very beneficial.

Time: 9978.91

This gets back to the idea of partnership and family around innovators, because

Time: 9984.17

if you feel filled up by those people local to you in your home, then you don't

Time: 9989.66

need people on the Internet saying nice things about you or your ideas, because

Time: 9993.3

you're good and you can forge forward.

Time: 9996.77

Another question about innovation is the teams that you assemble around you,

Time: 10000.28

and you've talked before about the sort of small squadron model, sort of David

Time: 10005.339

and Goliath examples as well, where a small group of individuals can create a

Time: 10011.35

technology that frankly outdoes what a giant like Facebook might be doing or what

Time: 10018.14

any other large company might be doing.

Time: 10020.96

There are a lot of theories as to why that would happen, but I know

Time: 10023.89

you have some unique theories.

Time: 10026.299

Why do you think small groups can defeat large organizations?

Time: 10031.31

Marc Andreessen: So the conventional explanation is, I think, correct, and

Time: 10033.88

it's just that large organizations have a lot of advantages, but they just have

Time: 10037.68

a very hard time actually executing anything because of all the overhead.

Time: 10043.02

So large organizations have combinatorial communication overhead.

Time: 10047.37

The number of people who have to be consulted, who have to agree

Time: 10049.36

on things, gets to be staggering.

Time: 10051.18

The amount of time it takes to schedule the meeting gets to be staggering.
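A quick aside to make that combinatorial communication overhead concrete: below is a minimal sketch in Python, assuming the simple model that every pair of people who must stay aligned counts as one communication channel, so coordination cost grows roughly with the square of headcount. The headcounts are illustrative, borrowing the IBM division and company sizes mentioned later in the conversation.

# Minimal sketch (not from the conversation): treat each pair of people who
# must stay aligned as one communication channel. With n people there are
# n * (n - 1) / 2 such pairs, so overhead grows roughly quadratically.
# Headcounts are illustrative: an 8-person team, a 100-person startup,
# a 6,000-person division, and the 440,000-person IBM of the early 90s.

def pairwise_channels(n: int) -> int:
    """Number of distinct person-to-person communication channels."""
    return n * (n - 1) // 2

if __name__ == "__main__":
    for headcount in (8, 100, 6_000, 440_000):
        print(f"{headcount:>7,} people -> {pairwise_channels(headcount):>14,} channels")

Running it shows 28 channels for 8 people versus roughly 18 million for a 6,000-person division and nearly 97 billion for 440,000 people, which is the sense in which a small team can still get everybody it needs into one room.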

Time: 10054.71

You get these really big companies and they have some issue they're dealing

Time: 10057.32

with, and it takes like a month to schedule the pre meeting, to plan for

Time: 10060.49

the meeting, which is going to happen two months later, which is then going to

Time: 10062.93

result in a post meeting, which will then result in a board presentation, which

Time: 10066.3

will then result in a planning off site.

Time: 10068.88

Andrew Huberman: I thought academia was bad.

Time: 10070.02

But what you're describing is giving me hives.

Time: 10071.79

Marc Andreessen: Kafka was a documentary.

Time: 10073.66

Yeah.

Time: 10077.17

Look, you have these organizations of 100,000 people or more, and you're

Time: 10080.19

more of a nation state than a company.

Time: 10083.7

And you've got all these competing internal factions; it's the Bedouin

Time: 10086.21

thing I was saying before.

Time: 10086.86

You've got all these internal rivalries; at most big companies, your internal

Time: 10089.47

enemies are way more dangerous to you than anybody on the outside.

Time: 10092.82

Andrew Huberman: Can you elaborate on that?

Time: 10093.63

Marc Andreessen: Oh, yeah.

Time: 10094.01

At a big company, the big competition is for the next promotion, right?

Time: 10098.53

And the enemy for the next promotion is the next executive over in your company.

Time: 10103.1

That's your enemy.

Time: 10104.85

The competitor on the outside is like an abstraction.

Time: 10107.11

Like, maybe they'll matter someday, whatever.

Time: 10108.81

I've got to beat that guy inside my own company.

Time: 10111.179

Right?

Time: 10111.779

And so the internal warfare is at least as intense as the external warfare.

Time: 10115.48

This is just all the iron law of all these big bureaucracies and how they function.

Time: 10120.97

So if a big bureaucracy ever does anything productive, I think it's like a miracle.

Time: 10124.639

It's like a miracle to the point where there should be like a celebration,

Time: 10128.17

there should be parties, there should be like ticker tape parades for big, large

Time: 10130.83

organizations that actually do things.

Time: 10132.59

That's great because it's so rare.

Time: 10134.93

It doesn't happen very often anyway.

Time: 10137.7

So that's the conventional explanation, whereas, look, small companies, small

Time: 10140.98

teams, there's a lot that they can't do because they're not operating

Time: 10143.88

at scale and they don't have global coverage and all these kind of, they

Time: 10146.86

don't have the resources and so forth.

Time: 10148.19

But at least they can move quickly, right?

Time: 10150.369

They can organize fast.

Time: 10151.969

If there's an issue today, they can have a meeting today,

Time: 10154.71

they can solve the issue today.

Time: 10156.32

And everybody they need to solve the issue is in the room today.

Time: 10159.17

So they can just move a lot faster.

Time: 10161.2

I think that's part of it.

Time: 10161.99

But I think there's another deeper thing underneath that, that people

Time: 10164.53

really don't like to talk about.

Time: 10165.75

That takes us back full circle to where we started, which is just the

Time: 10168.18

sheer number of people in the world who are capable of doing new things

Time: 10171.07

is just a very small set of people.

Time: 10173.83

And so you're not going to have 100 of them in a company or 1000 or 10,000.

Time: 10177.83

You're going to have three, eight or ten, maybe.

Time: 10182.41

Andrew Huberman: And some of them are flying too close to the sun.

Time: 10184.65

Marc Andreessen: Some of them are blowing themselves up, right?

Time: 10186.549

Some of them are.

Time: 10187.189

So IBM.

Time: 10187.769

I actually first learned this at IBM.

Time: 10189.4

My first actual job job was at IBM when IBM was still on top of the world

Time: 10193.4

right before it caved in in the early 90s.

Time: 10195.09

And so when I was there, it was 440,000 employees, and again, if you

Time: 10199.4

inflation adjust like today for that same size of business, inflation

Time: 10202.83

adjusted, market size adjusted, it would be the equivalent today of like a two

Time: 10205.75

or three million person organization.

Time: 10206.91

It was a nation state.

Time: 10209.47

There were 6000 people in my division and we were next door to

Time: 10212.12

another building that had another 6000 people in another division.

Time: 10214.429

So you could work there for years and never meet anybody

Time: 10216.93

who didn't work for IBM.

Time: 10218.349

The first half of every meeting was just IBMers introducing

Time: 10220.93

themselves to each other.

Time: 10222.769

It was just mind boggling, the level of complexity.

Time: 10225.21

But they were so powerful that, four years before I got there, in 1985,

Time: 10231.34

they were 80% of the market capitalization of the entire tech industry.

Time: 10235.3

So they were at a level of dominance that even Google or Apple today

Time: 10238.299

is not even close to at the time.

Time: 10240.629

So that's how powerful they were.

Time: 10241.829

And so they had a system and it worked really well for like 50 years.

Time: 10245.22

They had a system, which was:

Time: 10246.72

Most of the employees in the company were expected to basically follow rules.

Time: 10250.8

So they dressed the same, they acted the same, they did

Time: 10252.51

everything out of the playbook.

Time: 10253.88

They were trained very specifically but they had this category of

Time: 10257.11

people they called Wild Ducks.

Time: 10259.24

And this was an idea that the founder Thomas Watson

Time: 10261.15

had come up with, Wild Ducks.

Time: 10262.85

And the Wild Ducks were, they often had the formal title of an IBM Fellow, and

Time: 10267.33

they were the people who could make new things and there were eight of them.

Time: 10272.139

And they got to break all the rules and they got to invent new products.

Time: 10276.029

They got to go off and work on something new.

Time: 10277.539

They didn't have to report back.

Time: 10279.02

They got to pull people off of other projects to work with them.

Time: 10282.62

They got budget when they needed it.

Time: 10284.26

They reported directly to the CEO, they got whatever they needed.

Time: 10286.97

He supported them in doing it.

Time: 10287.96

And they were glass breakers.

Time: 10289.41

And the one in Austin at the time was this guy Andy Heller.

Time: 10292.849

And he would show up in jeans and cowboy boots and amongst an ocean of men in

Time: 10298.14

blue suits, white shirts, red ties and put his cowboy boots up on the table and

Time: 10303.16

it was fine for Andy Heller to do that.

Time: 10304.75

And it was not fine for you to do that, right.

Time: 10306.27

And so they very specifically identified, we have almost like an

Time: 10311.15

aristocratic class within our company that gets to play by different rules.

Time: 10315.03

Now the expectation is they deliver, right?

Time: 10317.76

Their job is to invent the next breakthrough product.

Time: 10319.64

But we, IBM management, know that the 6000 person division is not

Time: 10323.61

going to invent the next product.

Time: 10324.95

We know it's going to be crazy

Time: 10325.88

Andy Heller in his cowboy boots.

Time: 10328.37

And so I was always very impressed.

Time: 10330.32

Again, ultimately, IBM had its issues, but that model worked for 50 years.

Time: 10334.71

Right? Like, worked incredibly well.

Time: 10335.9

And I think that's basically the model that works.

Time: 10339.76

But it's a paradox, right?

Time: 10340.929

Which is like, how do you have a large, bureaucratic, regimented organization,

Time: 10344.72

whether it's academia or government or business or anything, that has all

Time: 10348.52

these rule followers in it and all these people who are jealous of their status

Time: 10351.4

and don't want things to change, but then still have that spark of creativity?

Time: 10357.6

I would say mostly it's impossible.

Time: 10359.809

Mostly it just doesn't happen.

Time: 10361.469

Those people get driven out.

Time: 10362.8

And in tech, what happens is those people get driven out because we will fund them.

Time: 10366.789

These are the people we fund.

Time: 10367.511

Andrew Huberman: I was going to say, rather, that you are in the business

Time: 10370.67

of finding and funding the wild ducks.

Time: 10372.63

Marc Andreessen: The wild ducks.

Time: 10373.09

That's exactly right.

Time: 10373.88

And actually, to close the loop:

Time: 10376.29

This is actually, I think, the simplest explanation for why IBM

Time: 10378.86

ultimately caved in, and then HP sort of in the 80s also caved in.

Time: 10382.689

IBM and HP kind of were these incredible, monolithic, incredible

Time: 10385.959

companies for 40 or 50 years, and then they kind of both caved in.

Time: 10389.29

I actually think it was the emergence of venture capital, it was

Time: 10393.01

the emergence of a parallel funding system where the wild ducks, or in

Time: 10396.19

HP's case, their superstar technical people, could actually leave and start

Time: 10399.43

their own companies. And again, it goes back to the university discussion

Time: 10402.449

we were having: this is what doesn't exist at the university level.

Time: 10405.75

This certainly does not exist at the government level.

Time: 10407.91

Andrew Huberman: And until recently in media, it didn't exist until there's

Time: 10410.69

this thing that we call podcasts.

Time: 10413.139

Marc Andreessen: Exactly right.

Time: 10414.25

Andrew Huberman: Which clearly have picked up some momentum, and

Time: 10417.01

I would hope that these other wild duck models will move quickly.

Time: 10422.34

Marc Andreessen: Yeah, but the one thing you know, and you know this, the

Time: 10424.41

one thing you know is the people on the other side are going to be mad as hell.

Time: 10427.35

Andrew Huberman: Yeah, they're going to, well, I think they're past denial.

Time: 10431.59

The counterarguments continue.

Time: 10433.79

The name calling is prolific.

Time: 10435.19

Marc Andreessen: Name calling is fully underway.

Time: 10438.619

Andrew Huberman: Well, Marc, we've covered a lot of topics, but as with every time

Time: 10444.41

I talk to you, I learn oh, so very much.

Time: 10447.64

I'm so grateful for you taking the time out of your schedule to talk about

Time: 10450.86

all of these topics in depth with us.

Time: 10454.53

I'd be remiss if I didn't say that.

Time: 10455.59

It is clear to me now that you are hyper realistic about the landscape, but you

Time: 10462.14

are also intensely optimistic about the existence of wild ducks and those

Time: 10467.29

around them that support them and that are necessary for the implementation

Time: 10469.94

of their ideas at some point.

Time: 10472.09

And that also, you have a real rebel inside you.

Time: 10475.299

So that is oh, so welcome on this podcast.

Time: 10478.49

And it's also needed in these times and every time.

Time: 10482.44

So on behalf of myself and the rest of us here at the podcast, and especially

Time: 10487.15

the listeners, thank you so much.

Time: 10488.99

Marc Andreessen: Thanks for having me.

Time: 10490.429

Andrew Huberman: Thank you for joining me for today's

Time: 10491.7

discussion with Marc Andreessen.

Time: 10493.92

If you're learning from and or enjoying this podcast, please

Time: 10496.65

subscribe to our YouTube channel.

Time: 10498.11

That's a terrific, zero cost way to support us.

Time: 10500.65

In addition, please subscribe to the podcast on both Spotify and Apple.

Time: 10504.65

And on both Spotify and Apple, you can leave us up to a five star review.

Time: 10508.219

If you have questions for me or comments about the podcast or guests that you'd

Time: 10511.199

like me to consider hosting on the Huberman Lab Podcast, please put those

Time: 10514.93

in the comments section on YouTube.

Time: 10516.38

I do read all the comments.

Time: 10518.39

Please also check out the sponsors mentioned at the beginning and

Time: 10520.87

throughout today's episode.

Time: 10522.2

That's the best way to support this podcast. Not on today's podcast, but on

Time: 10526.65

many previous episodes of the Huberman Lab Podcast, we discuss supplements.

Time: 10530.18

While supplements aren't necessary for everybody, many people

Time: 10532.8

derive tremendous benefit from them for things like improving

Time: 10535.67

sleep, hormone support and focus.

Time: 10537.91

The Huberman Lab Podcast has partnered with Momentous Supplements.

Time: 10540.82

If you'd like to access the supplements discussed on the Huberman Lab podcast, you

Time: 10544.33

can go to livemomentous, spelled O-U-S.

Time: 10546.69

So it's livemomentous.com/huberman, and you can also receive 20% off.

Time: 10551.85

Again, that's livemomentous, spelled O-U-S, .com/huberman.

Time: 10556.059

If you haven't already subscribed to our Neural Network Newsletter, our Neural

Time: 10559.57

Network Newsletter is a completely zero cost monthly newsletter that includes

Time: 10563.509

summaries of podcast episodes as well as protocols, that is, short PDFs

Time: 10568.59

describing, for instance, tools to improve sleep, tools to improve neuroplasticity.

Time: 10573.2

We talk about deliberate cold exposure, fitness, various aspects of mental

Time: 10577.05

health, again, all completely zero cost.

Time: 10579.099

And to sign up, you simply go to hubermanlab.com, go over to the

Time: 10582.379

menu in the corner, scroll down to newsletter, and provide your email.

Time: 10585.67

We do not share your email with anybody.

Time: 10588.06

If you're not already following me on social media, I am

Time: 10590.24

Huberman Lab on all platforms.

Time: 10592.38

So that's Instagram, Twitter, Threads, LinkedIn, and Facebook.

Time: 10596.359

And at all of those places I talk about science and science related

Time: 10599.6

tools, some of which overlaps with the content of the Huberman Lab podcast,

Time: 10602.699

but much of which is distinct from the content of the Huberman Lab podcast.

Time: 10606.01

Again, it's Huberman Lab on all social media platforms.

Time: 10609.17

Thank you once again for joining me for today's discussion with Marc Andreessen.

Time: 10613.17

And last but certainly not least, thank you for your interest in science.
