Mark Zuckerberg & Dr. Priscilla Chan: Curing All Human Diseases & the Future of Health & Technology

Time: 0

ANDREW HUBERMAN: Welcome to the Huberman Lab podcast,

Time: 2.208

where we discuss science and science-based tools

Time: 4.61

for everyday life.

Time: 5.36

[MUSIC PLAYING]

Time: 9.09

I'm Andrew Huberman.

Time: 10.33

And I'm a professor of neurobiology and ophthalmology

Time: 13.41

at Stanford School of Medicine.

Time: 15.06

My guests today are Mark Zuckerberg and Dr. Priscilla

Time: 18.03

Chan.

Time: 18.81

Mark Zuckerberg, as everybody knows,

Time: 21

founded the company Facebook.

Time: 22.62

He is now the CEO of Meta, which includes Facebook, Instagram,

Time: 27.07

WhatsApp, and other technology platforms.

Time: 29.49

Dr. Priscilla Chan graduated from Harvard

Time: 32.009

and went on to do her medical degree at the University

Time: 34.98

of California San Francisco.

Time: 36.75

Mark Zuckerberg and Dr. Priscilla Chan

Time: 39.03

are married and the co-founders of the CZI,

Time: 41.91

or Chan Zuckerberg Initiative, a philanthropic organization

Time: 45.45

whose stated goal is to cure all human diseases.

Time: 48.81

The Chan Zuckerberg Initiative is accomplishing that

Time: 51.3

by providing critical funding not available elsewhere,

Time: 54.39

as well as a novel framework for discovery

Time: 56.97

of the basic functioning of cells,

Time: 59.43

cataloging all the different human cell

Time: 61.68

types, as well as providing AI, or artificial intelligence,

Time: 65.129

platforms to mine all of that data

Time: 67.02

to discover new pathways and cures for all human diseases.

Time: 71.17

The first hour of today's discussion

Time: 73.18

is held with both Dr. Priscilla Chan and Mark Zuckerberg,

Time: 76.45

during which we discuss the CZI and what it really

Time: 79.48

means to try and cure all human diseases.

Time: 81.97

We talk about the motivational backbone for the CZI

Time: 84.64

that extends well into each of their personal histories.

Time: 87.61

Indeed, you'll learn quite a lot about Dr. Priscilla Chan, who

Time: 90.82

has, I must say, an absolutely incredible family story leading

Time: 94.42

up to her role as a physician and her motivations

Time: 97.12

for the CZI and beyond.

Time: 98.74

And you'll learn from Mark, how he is bringing an engineering

Time: 101.62

and AI perspective to the discovery

Time: 103.69

of new cures for human disease.

Time: 105.89

The second half of today's discussion

Time: 107.47

is just between Mark Zuckerberg and me, during which we discuss

Time: 110.83

various Meta Platforms, including, of course,

Time: 113.21

social media platforms, and their effects on mental health

Time: 116.18

in children and adults.

Time: 117.43

We also discuss VR, Virtual Reality, as well as

Time: 120.52

augmented and mixed reality.

Time: 122.47

And we discuss AI, Artificial Intelligence,

Time: 125.29

and how it stands to transform not just our online experiences

Time: 128.65

with social media and other technologies,

Time: 130.75

but how it stands to potentially transform

Time: 133.09

every aspect of everyday life.

Time: 135.37

Before we begin, I'd like to emphasize

Time: 137.26

that this podcast is separate from my teaching and research

Time: 139.84

roles at Stanford.

Time: 140.87

It is, however, part of my desire and effort

Time: 142.87

to bring zero-cost-to-consumer information

Time: 144.76

about science and science-related tools

Time: 146.65

to the general public.

Time: 147.88

In keeping with that theme, I'd like

Time: 149.38

to thank the sponsors of today's podcast.

Time: 151.94

Our first sponsor is Eight Sleep.

Time: 154.24

Eight Sleep makes smart mattress covers with cooling, heating,

Time: 156.8

and sleep tracking capacity.

Time: 158.75

I've spoken many times before on this podcast about the fact

Time: 161.47

that getting a great night's sleep

Time: 163.12

really is the foundation of mental health, physical health

Time: 165.65

and performance.

Time: 166.36

One of the key things to getting a great night's sleep

Time: 168.55

is to make sure that the temperature of your sleeping

Time: 170.758

environment is correct.

Time: 171.73

And that's because in order to fall and stay deeply asleep,

Time: 174.25

your body temperature actually has

Time: 175.667

to drop by about 1 to 3 degrees.

Time: 177.62

And in order to wake up feeling refreshed and energized,

Time: 180.693

your body temperature actually has

Time: 182.11

to increase by about 1 to 3 degrees.

Time: 184.088

With Eight Sleep, you can program the temperature

Time: 186.13

of your sleeping environment in the beginning, middle,

Time: 188.48

and end of your night.

Time: 189.483

It has a number of other features,

Time: 190.9

like tracking the amount of rapid eye movement

Time: 192.76

and slow wave sleep that you get,

Time: 194.135

things that are essential to really dialing

Time: 196.15

in the perfect night's sleep for you.

Time: 197.692

I've been sleeping on an Eight Sleep mattress

Time: 199.567

cover for well over two years now.

Time: 201.04

And it has greatly improved my sleep.

Time: 203.24

I fall asleep far more quickly.

Time: 204.76

I wake up far less often in the middle of the night.

Time: 207.1

And I wake up feeling far more refreshed

Time: 209.11

than I ever did prior to using an Eight Sleep mattress cover.

Time: 212.32

If you'd like to try Eight Sleep,

Time: 213.82

you can go to eightsleep.com/huberman to save

Time: 216.85

$150 off their Pod 3 cover.

Time: 219.25

Eight Sleep currently ships to the USA,

Time: 221.06

Canada, UK, select countries in the EU, and Australia.

Time: 224.33

Again, that's eightsleep.com/huberman.

Time: 226.96

Today's episode is also brought to us by LMNT.

Time: 229.69

LMNT is an electrolyte drink that has everything you need

Time: 232.51

and nothing you don't.

Time: 233.51

That means plenty of electrolytes-- sodium,

Time: 235.4

magnesium and potassium-- and no sugar.

Time: 237.738

The electrolytes are absolutely essential for the functioning

Time: 240.28

of every cell in your body.

Time: 241.492

And your neurons, your nerve cells,

Time: 242.95

rely on sodium, magnesium and potassium

Time: 245.17

in order to communicate with one another electrically and

Time: 247.66

chemically.

Time: 248.26

LMNT contains the optimal ratio of electrolytes

Time: 250.39

for the functioning of neurons and the other cells

Time: 252.7

of your body.

Time: 253.33

Every morning, I drink a packet of LMNT dissolved

Time: 255.64

in about 32 ounces of water.

Time: 257.62

I do that just for general hydration

Time: 259.839

and to make sure that I have adequate electrolytes

Time: 262.15

for any activities that day.

Time: 263.62

I'll often also have an LMNT packet, or even two packets,

Time: 266.62

in 32 to 60 ounces of water if I'm exercising very hard

Time: 270.55

and certainly if I'm sweating a lot, in order

Time: 272.77

to make sure that I replace those electrolytes.

Time: 274.9

If you'd like to try LMNT, you can go

Time: 276.64

to drinklmnt.com/huberman to get a free sample pack with

Time: 281.35

your purchase.

Time: 282.05

Again, that's drinklmnt.com/huberman.

Time: 285.7

I'm pleased to announce that we will

Time: 287.26

be hosting four live events in Australia, each of which

Time: 290.62

is entitled The Brain Body Contract, during which I will

Time: 293.74

share science and science-related tools

Time: 295.72

for mental health, physical health, and performance.

Time: 298.39

There will also be a live question and answer session.

Time: 301.49

We have limited tickets still available

Time: 303.13

for the event in Melbourne on February 10,

Time: 305.62

as well as the event in Brisbane on February 24.

Time: 309.17

Our event in Sydney, at the Sydney Opera House,

Time: 311.42

sold out very quickly.

Time: 313.02

So as a consequence, we've now scheduled

Time: 315.05

a second event in Sydney at the Aware Super Theatre

Time: 318.02

on February 18.

Time: 319.52

To access tickets to any of these events,

Time: 321.74

you can go to hubermanlab.com/events and use

Time: 325.58

the code Huberman at checkout.

Time: 327.44

I hope to see you there.

Time: 328.77

And as always, thank you for your interest in science.

Time: 331.49

And now, for my discussion with Mark Zuckerberg

Time: 334.19

and Dr. Priscilla Chan.

Time: 335.75

Priscilla, Mark, so great to meet you.

Time: 337.7

And thank you for having me here in your home.

Time: 339.92

MARK ZUCKERBERG: Oh, thanks for having us on the podcast.

Time: 341.36

PRISCILLA CHAN: Yeah.

Time: 342.235

ANDREW HUBERMAN: I'd like to talk about the CZI, the Chan

Time: 344.9

Zuckerberg Initiative.

Time: 346.13

I learned about this a few years ago,

Time: 348.08

when my lab was-- and still is now-- at Stanford,

Time: 351.92

as a very exciting philanthropic effort

Time: 354.71

that has a truly big mission.

Time: 357.74

I can't imagine a bigger mission.

Time: 359.443

So maybe you could tell us what that big mission is.

Time: 361.61

And then we can get into some of the mechanics of how

Time: 364.19

that big mission can become a reality.

Time: 369.21

PRISCILLA CHAN: So like you're mentioning, in 2015,

Time: 374.15

we launched the Chan Zuckerberg Initiative.

Time: 376.82

And what we were hoping to do at CZI

Time: 379.79

was think about how do we build a better future for everyone

Time: 382.82

and looking for ways where we can contribute

Time: 385.4

the resources that we have to bring philanthropically

Time: 388.37

and the experiences that Mark and I have had,

Time: 391.25

for me as a physician and educator,

Time: 393.26

for Mark as an engineer, and then

Time: 395.84

our ability to bring teams together to build the builders.

Time: 400.34

Mark has been a builder throughout his career.

Time: 403.82

And what could we do if we actually

Time: 405.89

put together a team to build tools, do great science?

Time: 410.39

And so within our science portfolio,

Time: 413

we've really been focused on what some people think

Time: 416.87

is either an incredibly audacious goal

Time: 419.63

or an inevitable goal.

Time: 422

But I think about it as something

Time: 423.53

that will happen if we continue focusing on it, which

Time: 426.92

is to be able to cure, prevent, or manage

Time: 429.09

all disease by the end of the century.

Time: 430.89

ANDREW HUBERMAN: All disease?

Time: 431.61

PRISCILLA CHAN: All disease.

Time: 432.58

So that's important, right?

Time: 433.74

And so a lot of times, people ask like, which disease?

Time: 436.26

And the whole point is that there is not one disease.

Time: 439.66

And it's really about taking a step back to where I always

Time: 443.94

found the most hope as a physician, which

Time: 446.4

is new discoveries and new opportunities

Time: 449.76

and new ways of understanding how to keep people well come

Time: 453.12

from basic science.

Time: 454.35

So our strategy at CZI is really to build tools, fund science,

Time: 461.28

change the way basic scientists can see the world

Time: 465.21

and how they can move quickly in their discoveries.

Time: 469.35

And so that's what we launched in 2015.

Time: 473.64

We do work in three ways.

Time: 475.99

We fund great scientists.

Time: 478.68

We build tools-- right now, software tools

Time: 482.97

to help move science along and make it easier for scientists

Time: 487.05

to do their work.

Time: 488.04

And we do science.

Time: 489.69

You mentioned Stanford being an important pillar

Time: 492.06

for our science work.

Time: 493.32

We've built what we call biohubs, institutes where teams

Time: 497.88

can take on grand challenges to do work that

Time: 502.68

wouldn't be possible in a single lab

Time: 505.17

or within a single discipline.

Time: 506.88

And our first biohub was launched

Time: 509.04

in San Francisco, a collaboration between Stanford,

Time: 513.27

UC Berkeley, and UCSF.

Time: 515.88

ANDREW HUBERMAN: Amazing.

Time: 517.559

Curing all diseases implies that there will either

Time: 522.45

be a ton of knowledge gleaned from this effort, which

Time: 525.06

I'm certain there will be-- and there already has been.

Time: 527.61

We can talk about some of those early successes in a moment.

Time: 531.06

But it also sort of implies that if we can understand

Time: 534.21

some basic operations of diseases and cells

Time: 539.04

that transcend autism, Huntington's, Parkinson's,

Time: 543.6

cancer, and any other disease, then perhaps there

Time: 547.41

are some core principles that would make the big mission

Time: 551.31

a real reality, so to speak.

Time: 554.1

What I'm basically saying is, how are you attacking this?

Time: 556.86

My belief is that the cell sits at the center of all discussion

Time: 561.73

about disease, given that our body is made up of cells

Time: 564.79

and different types of cells.

Time: 566.27

So maybe you could just illuminate for us

Time: 569.47

a little bit of what the cell is, in your mind,

Time: 574.66

as it relates to disease and how one goes about understanding

Time: 578.26

disease in the context of cells because, ultimately, that's

Time: 581.26

what we're made up of.

Time: 582.28

MARK ZUCKERBERG: Yeah.

Time: 582.76

Well, let's get to the cell thing in a moment.

Time: 585.01

But just even taking a step back from that,

Time: 587.5

we don't think, at CZI, that we're

Time: 589.963

going to cure, prevent or manage all diseases.

Time: 591.88

The goal is to basically give the scientific community

Time: 594.94

and scientists around the world the tools

Time: 597.43

to accelerate the pace of science.

Time: 599.05

And we spent a lot of time, when we

Time: 601.63

were getting started with this, looking

Time: 604.17

at the history of science and trying to understand the trends

Time: 606.75

and how they've played out over time.

Time: 608.292

And if you look over this very long-term arc,

Time: 611.64

most large-scale discoveries are preceded

Time: 615.15

by the invention of a new tool or a new way to see something.

Time: 618.03

And it's not just in biology, right?

Time: 619.53

It's like having a telescope came

Time: 622.02

before a lot of discoveries in astronomy and astrophysics.

Time: 626.4

But similarly, the microscope and just different ways

Time: 630.36

to observe things or different platforms,

Time: 632.1

like the ability to do vaccines preceded the ability

Time: 634.5

to cure a lot of different things.

Time: 637.87

So this is the engineering part that you were talking about,

Time: 640.555

about building tools.

Time: 641.43

We view our goal as trying to bring together

Time: 645.96

some scientific and engineering knowledge to build tools

Time: 648.9

that empower the whole field.

Time: 651.15

And that's the big arc and a lot of the things

Time: 654.06

that we're focused on, including the work in single cell

Time: 657.09

and cell understanding, which you can jump in and get

Time: 662.033

into that if you want.

Time: 662.95

But yeah, I think we generally

Time: 665.44

agree with the premise that if you

Time: 667.39

want to understand this stuff from first principles--

Time: 670.91

people study organs a lot, right?

Time: 673.5

You study how things present across the body.

Time: 676.15

But there's not a very widespread understanding

Time: 679.36

of how each cell operates.

Time: 681.49

And this is a big part of some of the initial work

Time: 684.648

that we tried to do on the Human Cell Atlas and understanding

Time: 687.19

what are the different cells.

Time: 688.84

And there's a bunch more work that we want

Time: 690.73

to do to carry that forward.

Time: 693.31

But overall, I think, when we think about the next 10 years

Time: 696.67

here of this long arc to try to empower the community

Time: 699.91

to be able to cure, prevent or manage all diseases,

Time: 704.71

we think that the next 10 years should really

Time: 706.81

be primarily about being able to measure and observe

Time: 710.08

more things in human biology.

Time: 711.373

There are a lot of limits to that.

Time: 712.79

It's like, if you want to look at something through a microscope,

Time: 715.332

you usually can't see living tissues

Time: 717.25

because it's hard to see through skin or things like that.

Time: 720.26

So there are a lot of different techniques

Time: 723.46

that will help us observe different things.

Time: 725.39

And this is where the engineering background

Time: 727.99

comes in a bit because--

Time: 729.76

I mean, when I think about this, it's from the perspective of how

Time: 732.7

you'd write code or something, the idea of trying

Time: 735.34

to debug or fix a code base, but not be able to step

Time: 738.19

through the code line by line, it's

Time: 740.02

not going to happen, right?

Time: 741.28

And at the beginning of any big project that we do at Meta,

Time: 745.175

we like to spend a bunch of the time up front just trying

Time: 747.55

to instrument things and understand

Time: 749.35

what are we going to look at and how are we

Time: 751.142

going to measure things so we know we're making progress

Time: 753.94

and know what to optimize.

Time: 755.12

And this is such a long-term journey

Time: 757.36

that we think that it actually makes sense to take the next 10

Time: 760

years to build those kinds of tools for biology

Time: 763.72

and understanding just how the human body works in action.

Time: 768.52

And a big part of that is, cells.

Time: 769.988

I don't know.

Time: 770.53

Do you want to jump in and talk about some of the efforts?

Time: 772.915

PRISCILLA CHAN: Sure.

Time: 773.2

ANDREW HUBERMAN: Could I just interrupt briefly and just ask

Time: 775.7

about the different interventions, so to speak,

Time: 780.55

that CZI is in a unique position to bring to the quest

Time: 785.32

to cure all diseases?

Time: 786.71

So I can think of--

Time: 787.94

I mean, I know, as a scientist, that money is necessary but not

Time: 791.92

sufficient, right?

Time: 792.703

When you have money, you can hire more people.

Time: 794.62

You can try different things.

Time: 796.52

So that's critical.

Time: 797.35

But a lot of philanthropy includes money.

Time: 801.83

The other component is you want to be able to see things,

Time: 806.03

as you pointed out.

Time: 807.32

So you want to know that normal disease process--

Time: 809.96

like, what is a healthy cell?

Time: 811.25

What's a diseased cell?

Time: 812.645

Are cells constantly being bombarded with challenges

Time: 815.39

and then repairing those?

Time: 816.59

And then is what we call cancer just

Time: 818.37

a runaway train of those challenges

Time: 819.92

not being met by the cell itself or something like that?

Time: 822.84

So better imaging tools.

Time: 824.21

And then it sounds like there's not just a hardware component,

Time: 827.51

but a software component.

Time: 828.86

This is where AI comes in.

Time: 830.1

So maybe, at some point, we can break this up

Time: 832.52

into two, three different avenues.

Time: 834.06

One is understanding disease processes

Time: 836.33

and healthy processes.

Time: 837.41

We'll lump those together.

Time: 838.55

Then there's hardware-- so microscopes,

Time: 841.19

lenses, digital deconvolution, ways

Time: 844.25

of seeing things in bolder relief and more precision.

Time: 848.24

And then there's how to manage all the data.

Time: 851.93

And then I love the idea that maybe AI

Time: 854.66

could do what human brains can't do alone,

Time: 857.36

like manage understanding of the data

Time: 858.943

because it's one thing to organize data.

Time: 860.61

It's another to say, oh, as you point out

Time: 863.37

in the analogy with code, that this particular gene

Time: 866.1

and that particular gene are potentially interesting,

Time: 869.22

whereas a human being would never

Time: 870.78

make that potential connection.

Time: 873.013

MARK ZUCKERBERG: Yeah.

Time: 873.93

PRISCILLA CHAN: So the tools that CZI

Time: 875.25

can bring to the table--

Time: 876.45

we fund science, like you're talking about.

Time: 881.5

There's lots of ways to fund science.

Time: 883.56

And just to be clear, what we fund

Time: 886.71

is a tiny fraction of what the NIH funds, for instance.

Time: 889.92

ANDREW HUBERMAN: So you guys have been generous enough

Time: 892.17

that it definitely holds weight next to NIH's contribution.

Time: 897.805

PRISCILLA CHAN: Yeah.

Time: 898.68

But I think every funder has its own role in the ecosystem.

Time: 902.183

And for us, it's really, how do we

Time: 903.6

incentivize new points of view?

Time: 905.65

How do we incentivize collaboration?

Time: 907.26

How do we incentivize open science?

Time: 909.76

And so a lot of our grants include inviting people

Time: 914.49

to look at different fields.

Time: 917.04

Our first neuroscience RFA was aimed towards incentivizing

Time: 922.65

people from different backgrounds-- immunologists,

Time: 924.96

microbiologists-- to come and look

Time: 927.36

at how our nervous system works and how to keep it healthy.

Time: 931.2

Or we ask that our grantees participate

Time: 935.28

in the pre-print movement to accelerate

Time: 937.08

the rate of sharing knowledge and actually others being

Time: 940.47

able to build upon science.

Time: 942.21

So that's the funding that we do.

Time: 944.97

In terms of building, we build software and hardware,

Time: 949.98

like you mentioned.

Time: 950.91

We put together teams that can build

Time: 954.27

tools that are more durable and scalable than someone

Time: 959.46

in a single lab might be incentivized to do.

Time: 961.8

There's a ton of great ideas.

Time: 963.57

And nowadays, most scientists can tinker and build

Time: 967.53

something useful for their lab.

Time: 969.24

But it's really hard for them to be

Time: 971.55

able to share that tool sometimes

Time: 974.01

beyond their own laptop, let alone the next lab

Time: 977.61

over or across the globe.

Time: 979.75

So we partner with scientists to see what is useful,

Time: 983.34

what kinds of tools.

Time: 984.42

In imaging, Napari is a useful image annotation

Time: 989.82

tool born from an open-source community.

Time: 994.17

And how can we contribute to that?

Time: 996.06

Or CELLxGENE, which works on single-cell data sets.

Time: 1000.35

And how can we make it a useful tool so that scientists

Time: 1004.22

can share data sets, analyze their own

Time: 1006.65

and contribute to a larger corpus of information?

Time: 1010.74

So we have software teams that are building, collaborating

Time: 1015.05

with scientists to make sure that we're building

Time: 1017.12

easy to use, durable, translatable tools

Time: 1020.81

across the scientific community in the areas that we work in.

Time: 1024.51

We also have institutes-- this is where the imaging work comes

Time: 1028.94

in-- where we are proud owners of an electron microscope

Time: 1032.99

right now.

Time: 1034.349

It's going to be installed at our imaging institute.

Time: 1038.93

And that will really contribute to the way

Time: 1041.39

where we can see work differently.

Time: 1042.859

But more hardware does need to be developed.

Time: 1046.78

We're partnering with the fantastic scientists

Time: 1050.56

in the biohub network to build a mini-phase plate

Time: 1056.8

that aligns the electrons through the electron microscope

Time: 1062.95

to increase the resolution,

Time: 1064.6

so we can see in sharper detail.

Time: 1067.58

So there's a lot of innovative work within the network that's

Time: 1070.51

happening.

Time: 1071.95

And these institutes have grand challenges

Time: 1075.88

that they're working on.

Time: 1077.62

Back to your question about cells,

Time: 1079.84

cells are just the smallest units that are alive.

Time: 1083.89

And your body, all of our bodies,

Time: 1087.88

have many, many, many cells.

Time: 1090.43

Some estimates say around 37 trillion

Time: 1094.27

different cells in your body.

Time: 1095.62

And what are they all doing?

Time: 1097.21

And what do they look like when you're healthy?

Time: 1099.88

What do they look like when you're sick?

Time: 1101.61

And where we're at right now with our understanding of cells

Time: 1108.07

and what happens when you get sick

Time: 1110.76

is basically we've gotten pretty good at, from the Human Genome

Time: 1115.77

Project, looking at how different mutations

Time: 1118.47

in your genetic code make you

Time: 1120.69

more susceptible to getting sick or directly

Time: 1123.36

cause you to get sick.

Time: 1124.69

So we go from a mutation in your DNA to, wow,

Time: 1129.12

you now have Huntington's disease, for instance.

Time: 1133.44

And there's a lot that happens in the middle.

Time: 1136.53

And that's one of the questions that we're going after at CZI,

Time: 1140.64

is what actually happens.

Time: 1142.75

So an analogy that I like to use to share with my friends

Time: 1145.8

is, right now, say we have a recipe for a cake.

Time: 1148.86

We know there's a typo in the recipe.

Time: 1151.08

And then the cake is awful.

Time: 1154.32

That's all we know.

Time: 1155.58

We don't know how the chef interprets the typo.

Time: 1158.79

We don't know what happens in the oven.

Time: 1160.77

And we don't actually know how it's exactly

Time: 1164.17

connected to how the cake didn't turn out

Time: 1166.06

or how you had expected it.

Time: 1168.61

A lot of that is unknown.

Time: 1172

But we can actually systematically try

Time: 1174.31

to break this down.

Time: 1175.39

And one segment of that journey that we're looking at

Time: 1178.72

is how that mutation gets translated and acted

Time: 1182.35

upon in your cells.

Time: 1183.76

And all of your cells have what's called mRNA.

Time: 1187.87

mRNA are the actual instructions that are taken from the DNA.

Time: 1193.58

And our work in Single-Cell is looking

Time: 1196.45

at how every cell in your body is actually interpreting

Time: 1202.3

your DNA slightly differently and what

Time: 1204.76

happens when healthy cells are interpreting the DNA

Time: 1207.58

instructions and when sick cells are

Time: 1209.68

interpreting those directions.

Time: 1211.21

And that is a ton of data.

Time: 1213.55

I just told you, there's 37 trillion cells.

Time: 1215.98

There's a different large set of mRNAs in each cell.
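For readers who want a concrete picture: the single-cell data described here is conventionally stored as a cells-by-genes count matrix, with a cell-type label attached to each cell. Below is a toy sketch of that representation; the gene names, counts, and cell-type labels are invented for illustration, not taken from any real atlas.

```python
import numpy as np

# Toy cells-by-genes count matrix. Real atlases hold millions of cells
# and ~20,000 genes; every name and number here is invented.
genes = ["CFTR", "INS", "ACTB"]
cell_types = ["ionocyte", "ionocyte", "beta", "beta"]  # one label per cell (row)
counts = np.array([
    [9, 0, 5],   # ionocyte, cell 1
    [7, 0, 4],   # ionocyte, cell 2
    [0, 8, 6],   # beta, cell 1
    [1, 9, 5],   # beta, cell 2
])

def mean_expression_by_type(counts, cell_types, genes):
    """Average each gene's counts within every cell type."""
    profiles = {}
    for t in sorted(set(cell_types)):
        rows = [i for i, label in enumerate(cell_types) if label == t]
        profiles[t] = dict(zip(genes, counts[rows].mean(axis=0)))
    return profiles

profiles = mean_expression_by_type(counts, cell_types, genes)
print(profiles["ionocyte"]["CFTR"])  # 8.0
```

Comparing these per-type profiles between healthy and sick donors is, in miniature, the "healthy cell versus diseased cell" question raised earlier in the conversation.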

Time: 1221.21

But the work that we've been funding is looking at how--

Time: 1225.55

first of all, gathering that information.

Time: 1227.95

We've been incredibly lucky to be

Time: 1231.46

part of a very fast-moving field where we've gone from,

Time: 1236.92

in 2017, funding some methods work to now

Time: 1241.63

having really not complete, but nearly complete

Time: 1244.96

atlases of how the human body works, how flies work, how mice

Time: 1249.85

work at the single-cell level and being

Time: 1253.21

able to then try to piece together

Time: 1255.94

how does that all come together when you're healthy

Time: 1258.85

and when you're sick.

Time: 1259.99

And the neat thing about the inflection point

Time: 1265

where we're at in AI is that I can't look at this data

Time: 1269.2

and make sense of it.

Time: 1270.26

There's just too much of it.

Time: 1271.45

And biology is complex.

Time: 1273.88

Human bodies are complex.

Time: 1275.18

We need this much information.

Time: 1277.21

But the use of large language models

Time: 1280.15

can help us actually look at that data

Time: 1282.7

and gain insights, look at what trends

Time: 1285.82

are consistent with health and what trends are unexpected.

Time: 1292.96

And eventually, our hope, through the use

Time: 1296.92

of these data sets that we've helped curate

Time: 1299.05

and the application of large language models,

Time: 1301.27

is to be able to formulate a virtual cell, a cell that's

Time: 1304.78

completely built off of the data sets of what

Time: 1309.25

we know about the human body, but allows us to manipulate,

Time: 1312.82

and learn faster and try new things to help

Time: 1316.24

move science and then medicine along.

Time: 1319.382

ANDREW HUBERMAN: Do you think we've

Time: 1320.84

cataloged the total number of different cell types?

Time: 1325.25

Every week, I look at great journals

Time: 1327.56

like Cell, Nature, and Science.

Time: 1328.82

And for instance, I saw recently that, using single cell

Time: 1332.15

sequencing, they've categorized 18 plus different types

Time: 1336.98

of fat cells.

Time: 1338

We always think of like a fat cell versus a muscle cell.

Time: 1340.43

So now, you've got 18 types.

Time: 1342.44

Each one is going to express many, many different genes

Time: 1345.56

and mRNAs.

Time: 1347.36

And perhaps one of them is responsible

Time: 1351.35

for what we see in advanced type 2 diabetes,

Time: 1355.01

or in other forms of obesity, or where people can't lay down

Time: 1358.82

fat cells, which turns out to be just as detrimental

Time: 1361.19

in those extreme cases.

Time: 1362.76

So now, you've got all these lists of genes.

Time: 1364.79

But I always thought of single cell sequencing as necessary,

Time: 1370.187

but not sufficient, right?

Time: 1371.27

You need the information, but it doesn't resolve the problem.

Time: 1374.34

And I think of it more as a hypothesis-generating

Time: 1377.99

experiment.

Time: 1379.097

OK, so you have all these genes.

Time: 1380.43

And you can say, well, this gene is particularly

Time: 1382.43

elevated in the diabetic cell type of, let's say,

Time: 1387.3

one of these fat cells or muscle cells for that matter,

Time: 1389.94

whereas it's not in non-diabetics.

Time: 1391.99

So then of the millions of different cells,

Time: 1395.76

maybe only five of them differ dramatically.

Time: 1399.58

So then you generate a hypothesis.

Time: 1401.28

Oh, it's the ones that differ dramatically

Time: 1402.63

that are important.

Time: 1403.422

But maybe one of those genes, when it's only 50% changed,

Time: 1409.11

has a huge effect because of some network biology effect.

Time: 1412.48

And so I guess what I'm trying to get to here

Time: 1415.08

is how does one meet that challenge.

Time: 1417.66

And can AI help resolve that challenge

Time: 1420.03

by essentially placing those lists of genes

Time: 1422.22

into 10,000 hypotheses?

Time: 1424.63

Because I'll tell you that the graduate students

Time: 1426.63

and postdocs in my lab get a chance to test one

Time: 1429.06

hypothesis at a time.

Time: 1430.17

PRISCILLA CHAN: I know.

Time: 1430.57

ANDREW HUBERMAN: And that's really the challenge,

Time: 1431.55

let alone one lab.

Time: 1432.75

And so for those that are listening

Time: 1434.705

to this-- and hopefully, it's not

Time: 1436.08

getting outside the scope of standard understanding

Time: 1439.26

or the understanding we've generated here.

Time: 1441.155

But what I'm basically saying is,

Time: 1442.53

you have to pick at some point.

Time: 1444.3

More data always sounds great.

Time: 1445.71

But then how do you decide what to test?
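One simple version of the triage being described here, ranking candidate genes by how strongly their expression differs between diseased and healthy cells, can be sketched as follows. The gene names and expression values are hypothetical, and absolute log2 fold change is just one common first-pass score, not the method CZI uses.

```python
import math

# Hypothetical mean expression per gene in diseased vs. healthy cells
# (gene names and values invented for illustration).
diseased = {"GENE_A": 40.0, "GENE_B": 6.0, "GENE_C": 5.0}
healthy  = {"GENE_A": 5.0,  "GENE_B": 4.0, "GENE_C": 5.1}

def rank_by_fold_change(diseased, healthy, eps=1e-9):
    """Rank genes by |log2 fold change|; eps avoids division by zero."""
    scores = {
        gene: abs(math.log2((diseased[gene] + eps) / (healthy[gene] + eps)))
        for gene in diseased
    }
    return sorted(scores, key=scores.get, reverse=True)

print(rank_by_fold_change(diseased, healthy))  # ['GENE_A', 'GENE_B', 'GENE_C']
```

As the discussion notes, a ranking like this is only a hypothesis generator: a gene with a modest 50% change could still matter through network effects, so the score narrows the search rather than settling it.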

Time: 1448.2

PRISCILLA CHAN: So no, we don't know all the cell types.

Time: 1450.93

I think one thing that was really exciting when we first

Time: 1455.07

launched this work was cystic fibrosis.

Time: 1458.79

Cystic fibrosis is caused by mutation in CFTR.

Time: 1461.67

That's pretty well known.

Time: 1463.06

It affects a certain channel that makes it hard for mucus

Time: 1466.08

to be cleared.

Time: 1466.83

That's the basics of cystic fibrosis.

Time: 1468.9

When I went to medical school, it was taught as fact.

Time: 1471.24

ANDREW HUBERMAN: So their lungs fill up with fluid.

Time: 1472.86

These are people who are carrying around

Time: 1474.527

sacks of fluid filling up.

Time: 1475.83

PRISCILLA CHAN: Yep.

Time: 1476.88

ANDREW HUBERMAN: I've worked with people like that.

Time: 1477.97

And they have to literally dump the fluid out.

Time: 1479.49

PRISCILLA CHAN: Exactly.

Time: 1479.82

ANDREW HUBERMAN: They can't run or do intense exercise.

Time: 1481.92

Life is shorter.

Time: 1482.76

PRISCILLA CHAN: Life is shorter.

Time: 1484.093

And when we applied single-cell methodologies to the lungs,

Time: 1488.04

they discovered an entirely new cell type

Time: 1491.55

that actually is affected by the CFTR mutation,

Time: 1496.53

the cystic fibrosis mutation, that

Time: 1498.57

actually changes the paradigm of how

Time: 1500.22

we think about cystic fibrosis.

Time: 1501.66

ANDREW HUBERMAN: Amazing.

Time: 1502.02

PRISCILLA CHAN: So I don't think

Time: 1504.437

we know all the cell types.

Time: 1506.323

I think we'll continue to discover them.

Time: 1507.99

And we'll continue to discover new relationships between cell

Time: 1510.87

and disease, which leads me to the second example I want

Time: 1513.99

to bring up, is this large data set

Time: 1517.71

that the entire scientific community has

Time: 1519.96

built around single cell.

Time: 1521.16

It's starting to allow us to say this mutation, where is it

Time: 1526.03

expressed?

Time: 1526.53

What types of cells is it expressed in?

Time: 1528.7

And we actually have built a tool

Time: 1531.54

at CZI called CELLxGENE, where you can put in the mutation

Time: 1536.587

that you're interested in.

Time: 1537.67

And it gives you a heat map across cell types

Time: 1540.39

of which cell types are expressing the gene that you're

Time: 1545.19

interested in.

Time: 1546.16

And so then you can start looking at, OK,

Time: 1548.49

if I look at gene X and I know it's related to heart disease--

Time: 1555.4

but if you look at the heat map, it's

Time: 1557.1

also spiking in the pancreas.

Time: 1559.14

That allows you to generate a hypothesis.

Time: 1561.18

Why?

Time: 1562.25

And what happens, when this gene is mutated,

Time: 1564.86

to the function of your pancreas?

Time: 1567.38

Really exciting way to look and ask questions differently.

Time: 1571.91

And you can also imagine a world where

Time: 1574.61

if you're trying to develop a therapy, a drug, and the goal

Time: 1579.89

is to treat the function in the heart,

Time: 1581.87

but you know that it's also really

Time: 1583.82

active in the pancreas again.

Time: 1585.56

So is there going to be an unexpected side effect

Time: 1589.28

that you should think about as you're bringing

Time: 1592.04

this drug to clinical trials?

Time: 1594

So it's an incredibly exciting tool

Time: 1596.99

and one that's only going to get better

Time: 1598.67

as we get more and more sophisticated

Time: 1601.01

ways to analyze the data.
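
To make that kind of query concrete, here is a toy sketch in Python. This is not the real CELLxGENE tool or its API; the cell-type names, expression values, and threshold are all invented for illustration. The idea is the one described above: given a gene, rank the cell types that express it, and notice when it spikes somewhere unexpected.

```python
# Toy sketch of the kind of lookup a tool like CELLxGENE enables:
# given a gene, show which cell types express it most strongly.
# All cell-type names and numbers here are made up for illustration.

# Hypothetical mean expression of "gene X" across cell types
# (e.g., log-normalized counts averaged per cell type).
expression_by_cell_type = {
    "cardiomyocyte": 8.1,
    "pancreatic beta cell": 7.6,   # unexpected spike outside the heart
    "hepatocyte": 0.9,
    "lung epithelial cell": 0.4,
    "T cell": 0.2,
}

def high_expressors(expr, threshold=5.0):
    """Return cell types expressing the gene above a threshold,
    sorted from highest to lowest expression."""
    hits = [(ct, v) for ct, v in expr.items() if v >= threshold]
    return sorted(hits, key=lambda kv: kv[1], reverse=True)

hits = high_expressors(expression_by_cell_type)
print(hits)

# A hit outside the expected organ (here, pancreas for a "heart" gene)
# is a hypothesis generator: why is it there, and could a heart drug
# targeting this gene have pancreatic side effects?
unexpected = [ct for ct, _ in hits if ct != "cardiomyocyte"]
print(unexpected)
```

The off-target hit is exactly the kind of signal that would prompt the side-effect question raised above before bringing a drug to clinical trials.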

Time: 1602.162

ANDREW HUBERMAN: I must say, I love

Time: 1603.62

that because if I look at the advances in neuroscience

Time: 1605.96

over the last 15 years, most of them

Time: 1609.08

didn't necessarily come from looking at the nervous system.

Time: 1611.735

They came from the understanding that the immune system

Time: 1614.84

impacts the brain.

Time: 1615.59

Everyone prior to that talked about the brain

Time: 1617.54

as an immune-privileged organ.

Time: 1619.26

What you just said also bridges the divide

Time: 1622.38

between single cells, organs and systems, right?

Time: 1625.47

Because ultimately, cells make up organs.

Time: 1627.602

Organs make up systems.

Time: 1628.56

And they're all talking to one another.

Time: 1629.89

And everyone nowadays is familiar with gut-brain axis

Time: 1632.31

or the microbiome being so important.

Time: 1634.05

But rarely is the discussion between organs discussed,

Time: 1640.95

so to speak.

Time: 1641.68

So I think it's wonderful.

Time: 1643.71

So that tool was generated by CZI.

Time: 1646.5

Or CZI funded that tool?

Time: 1647.758

MARK ZUCKERBERG: We built that.

Time: 1649.05

PRISCILLA CHAN: We built it.

Time: 1649.56

ANDREW HUBERMAN: You built it.

Time: 1650.81

So is it built by Meta?

Time: 1651.78

Is this Meta?

Time: 1652.397

MARK ZUCKERBERG: No, no, it has its own engineers.

Time: 1654.48

ANDREW HUBERMAN: Got it.

Time: 1655

MARK ZUCKERBERG: Yeah.

Time: 1655.917

They're completely different organizations.

Time: 1657.96

ANDREW HUBERMAN: Incredible.

Time: 1659.52

And so a graduate student or postdoc

Time: 1661.352

who's interested in a particular mutation

Time: 1663.06

could put this mutation into this database.

Time: 1665.25

That graduate student or postdoc might

Time: 1666.9

be in a laboratory known for working on heart,

Time: 1669.81

but suddenly find that they're collaborating

Time: 1671.79

with other scientists that work on the pancreas, which also

Time: 1675.45

is wonderful because it bridges the divide

Time: 1677.665

between these fields.

Time: 1678.54

Fields are so siloed in science--

Time: 1680.47

not just different buildings, but people

Time: 1682.14

rarely talk, unless things like this are happening.

Time: 1684.33

PRISCILLA CHAN: I mean, the graduate student is someone

Time: 1685.89

that we want to empower because, one, they're

Time: 1687.765

the future of science, as you know.

Time: 1689.43

And within CELLxGENE, if you put in the gene

Time: 1692.43

you're interested in and it shows you the heat map,

Time: 1694.59

we also will pull up the most relevant papers to that gene.

Time: 1698.56

And so read these things.

Time: 1700.733

ANDREW HUBERMAN: That's fantastic.

Time: 1702.15

As we all know, quality nutrition

Time: 1704.1

influences, of course, our physical health, but also

Time: 1706.5

our mental health and our cognitive functioning--

Time: 1708.78

our memory, our ability to learn new things and to focus.

Time: 1711.6

And we know that one of the most important features

Time: 1713.82

of high quality nutrition is making sure

Time: 1715.83

that we get enough vitamins and minerals from high quality,

Time: 1718.83

unprocessed, or minimally processed

Time: 1720.54

sources, as well as enough probiotics, and prebiotics

Time: 1723.6

and fiber to support basically all

Time: 1726.06

the cellular functions in our body,

Time: 1727.69

including the gut microbiome.

Time: 1729.43

Now, I, like most everybody, try to get optimal nutrition

Time: 1733.32

from whole foods, ideally mostly from minimally processed

Time: 1737.88

or non-processed foods.

Time: 1738.627

However, one of the challenges that I and so many other people

Time: 1741.21

face is getting enough servings of high quality fruits

Time: 1743.82

and vegetables per day, as well as

Time: 1745.5

fiber and probiotics that often accompany those fruits

Time: 1748.14

and vegetables.

Time: 1748.86

That's why, way back in 2012, long before I ever

Time: 1751.95

had a podcast, I started drinking AG1.

Time: 1754.83

And so I'm delighted that AG1 is sponsoring the Huberman Lab

Time: 1757.77

podcast.

Time: 1758.59

The reason I started taking AG1 and the reason I still

Time: 1761.28

drink AG1 once or twice a day is that it

Time: 1764.07

provides all of my foundational nutritional needs.

Time: 1766.57

That is, it provides insurance that I

Time: 1768.75

get the proper amounts of those vitamins, minerals, probiotics

Time: 1771.87

and fiber to ensure optimal mental health, physical

Time: 1775.23

health and performance.

Time: 1776.92

If you'd like to try AG1, you can go to drinkag1.com/huberman

Time: 1781.65

to claim a special offer.

Time: 1783.12

They're giving away five free travel

Time: 1784.68

packs plus a year's supply of vitamin D3 K2.

Time: 1787.83

Again, that's drinkag1.com/huberman to claim

Time: 1791.67

that special offer.

Time: 1792.462

MARK ZUCKERBERG: I just think going back to your question

Time: 1794.837

from before: are there going to be more cell types that

Time: 1797.46

get discovered?

Time: 1798.12

I mean, I assume so, right?

Time: 1799.5

I mean, no catalog of this stuff is ever--

Time: 1801.828

it doesn't seem like we're ever done.

Time: 1803.37

We keep on finding more.

Time: 1804.81

But I think that that gets to one of the things

Time: 1809.18

that I think are the strengths of modern LLMs,

Time: 1812.78

is the ability to imagine different states that things

Time: 1816.32

can be in.

Time: 1817.1

So from all the work that we've done and funded

Time: 1820.61

on the Human Cell Atlas, there is a large corpus of data

Time: 1824.09

that you can now train a kind of large-scale model on.

Time: 1828.93

And one of the things that we're doing at CZI,

Time: 1832.255

which I think is pretty exciting,

Time: 1833.63

is building what we think is one of the largest non-profit life

Time: 1838.46

sciences AI clusters.

Time: 1841.1

It's on the order of 1,000 GPUs.

Time: 1844.19

And it's larger than what most people have access

Time: 1847.28

to in academia, one that you can do serious engineering work on.

Time: 1852.83

And by basically training a model

Time: 1855.74

with all of the Human Cell Atlas data

Time: 1857.36

and a bunch of other inputs as well,

Time: 1860.69

we think you'll be able to basically imagine

Time: 1864.2

all of the different types of cells and all

Time: 1866.27

the different states that they can be in, and when they're

Time: 1868.92

healthy and diseased, and how they'll

Time: 1870.57

interact with different--

Time: 1874.62

interact with each other, interact

Time: 1876.39

with different potential drugs.

Time: 1878.01

But I think the state of LLMs, I think

Time: 1880.32

this is where it's helpful to understand--

Time: 1882.42

have a good understanding and be grounded

Time: 1884.16

in the modern state of AI.

Time: 1885.87

I mean, these things are not foolproof.

Time: 1888.18

I mean, one of the flaws of modern LLMs

Time: 1890.4

is they hallucinate.

Time: 1891.96

So the question is, how do you make it

Time: 1893.76

so that that can be an advantage rather than a disadvantage?

Time: 1897.42

And I think the way that it ends up being an advantage

Time: 1899.82

is when they help you imagine a bunch of states

Time: 1902.4

that someone could be in, but then you, as the scientist

Time: 1905.098

or engineer, go and validate that those are true,

Time: 1907.14

whether they're solutions to how a protein can

Time: 1909.36

be folded or possible states that a cell could

Time: 1911.94

be in when it's interacting with other things.

Time: 1913.92

But we're not yet at the state with AI

Time: 1916.83

that you can just take the outputs of these things

Time: 1919.41

as gospel and run from there.

Time: 1922.71

But they are very good, I think as you said,

Time: 1925.35

hypothesis generators or possible solution generators

Time: 1929.52

that then you can go validate.

Time: 1930.93

So I think that that's a very powerful thing

Time: 1934.14

that we can basically--

Time: 1935.46

building on the first five years of science work

Time: 1937.68

around the Human Cell Atlas and all the data that's

Time: 1939.54

been built out-- carry that forward into something

Time: 1941.64

that I think is going to be a very novel tool going forward.

Time: 1945.822

And that's the type of thing that I

Time: 1947.28

think we're set up to do well.

Time: 1949.29

I mean, you had this exchange a little while back about funding

Time: 1954.99

levels and how CZI is just a drop in the bucket compared

Time: 1960.21

to NIH.

Time: 1962.73

The thing that I think we can do that's different

Time: 1965.37

is funding some of these longer term, bigger projects.

Time: 1970.5

It is hard to galvanize and pull together

Time: 1973.2

the energy to do that.

Time: 1974.65

And it's a lot of what most science funding is: relatively

Time: 1978.66

small projects that are exploring

Time: 1980.28

things over relatively short time horizons.

Time: 1982.71

And one of the things that we try to do

Time: 1984.48

is build these tools over 5, 10, 15-year periods.

Time: 1989.04

They're often projects that require

Time: 1990.8

hundreds of millions of dollars of funding

Time: 1992.55

and world-class engineering teams and infrastructure to do.

Time: 1995.52

And that, I think, is a pretty cool contribution to the field

Time: 1999.3

that I think is--

Time: 2001.82

there aren't as many other folks who

Time: 2003.832

are doing that kind of thing.

Time: 2005.04

But that's one of the reasons why

Time: 2005.93

I'm personally excited about the virtual cell stuff

Time: 2008.33

because it's just this perfect intersection of all the

Time: 2011.228

stuff that we've done in single cell,

Time: 2012.77

the previous collaborations that we've done with the field

Time: 2016.25

and bringing together the industry and AI

Time: 2019.94

expertise around this.

Time: 2021.14

ANDREW HUBERMAN: Yeah, I completely

Time: 2022.82

agree that the model of science that you're putting together

Time: 2026.51

with CZI isn't just unique from NIH,

Time: 2029.24

but it's extremely important that

Time: 2030.71

the independent investigator model is what's

Time: 2034.398

driven the progression of science in this country

Time: 2036.44

and, to some extent, in Northern Europe for the last 100 years.

Time: 2040.22

And it's wonderful, on the one hand,

Time: 2044.12

because it allows for that image we have of a scientist

Time: 2048.29

tinkering away or the people in their lab, and then

Time: 2050.48

the eurekas.

Time: 2052.429

And that hopefully translates to better human health.

Time: 2056.69

But I think, in my opinion, we've moved past that model

Time: 2061.159

as the most effective model or the only model that

Time: 2063.65

should be explored.

Time: 2064.554

MARK ZUCKERBERG: Yeah, I just think it's a balance.

Time: 2066.679

You want that.

Time: 2067.04

But you want to empower those people.

Time: 2068.21

I think that these tools empower those folks.

Time: 2070.01

ANDREW HUBERMAN: Sure.

Time: 2070.1

And there are mechanisms to do that, like NIH.

Time: 2072.26

But it's hard to do collaborative science.

Time: 2074.57

It's interesting that we're sitting here not far--

Time: 2077.81

because I grew up right near here as well.

Time: 2080.059

I'm not far from the garage model of tech, right?

Time: 2083.81

The Hewlett-Packard model, not far from here at all.

Time: 2088.55

And the idea was the tinkerer in the garage, the inventor.

Time: 2091.94

And then people often forget that to implement

Time: 2094.1

all the technologies they discovered

Time: 2095.6

took enormous factories and warehouses.

Time: 2098.84

So there's a similarity there to Facebook, Meta, et cetera.

Time: 2101.58

But I think, in science, we imagine

Time: 2103.31

the scientist alone in their laboratory

Time: 2105.38

and those eureka moments.

Time: 2106.52

But I think, nowadays, the big questions really require

Time: 2110.21

extensive collaboration and certainly tool development.

Time: 2113.508

And one of the tools that you keep coming back to

Time: 2115.55

is these LLMs, these large language models.

Time: 2117.5

And maybe you could just elaborate,

Time: 2119.06

for those that aren't familiar.

Time: 2120.8

What is a large language model?

Time: 2124.7

For the uninformed, what is it?

Time: 2127.85

And what does it allow us to do that other types

Time: 2133.85

of AI don't allow?

Time: 2135.44

Or more importantly, perhaps what

Time: 2136.97

does it allow us to do that a bunch of really smart people,

Time: 2140.51

highly informed in a given area of science,

Time: 2142.58

staring at the data--

Time: 2143.72

what can it do that they can't do?

Time: 2146.613

MARK ZUCKERBERG: Sure.

Time: 2147.53

So I think a lot of the progression of machine learning

Time: 2151.16

has been about building systems, neural networks or otherwise,

Time: 2156.08

that can basically make sense and find patterns in larger

Time: 2160.28

and larger amounts of data.

Time: 2161.75

And there was a breakthrough a number of years

Time: 2164.51

back that some folks at Google actually made

Time: 2168.68

called the transformer model architecture.

Time: 2171.47

And it was this huge breakthrough

Time: 2173.81

because before then there was somewhat of a cap

Time: 2177.38

where if you fed more data into a neural network

Time: 2180.2

past some point, it didn't really

Time: 2183.2

glean more insights from it, whereas transformers

Time: 2186.14

just-- we haven't seen the end of how big that

Time: 2188.39

can scale to yet.

Time: 2189.17

I mean, I think that there's a chance

Time: 2190.712

that we run into some ceiling.

Time: 2192.95

ANDREW HUBERMAN: So it never asymptotes?

Time: 2194.63

MARK ZUCKERBERG: We haven't observed it yet.

Time: 2195.65

But we just haven't built big enough systems yet.

Time: 2197.692

So I would guess that--

Time: 2199.618

I don't know.

Time: 2200.16

I think that this is actually one

Time: 2201.535

of the big questions in the AI field today,

Time: 2203.78

is basically, are transformers and are the current model

Time: 2207.3

architectures sufficient?

Time: 2208.47

If you just build larger and larger clusters,

Time: 2210.21

do you eventually get something that's

Time: 2211.5

like human intelligence or super intelligence?

Time: 2213.6

Or is there some kind of fundamental limit

Time: 2217.26

to this architecture that we just haven't reached yet?

Time: 2219.94

And once we get a little bit further in building them out,

Time: 2223.723

then we'll reach that.

Time: 2224.64

And then we'll need a few more leaps

Time: 2226.14

before we get to the level of AI that I

Time: 2230.28

think will unlock a ton of really

Time: 2232.8

futuristic and amazing things.

Time: 2234.18

But there's no doubt that even just being

Time: 2235.89

able to process the amount of data

Time: 2237.87

that we can now with this model architecture

Time: 2241.59

has unlocked a lot of new use cases.

Time: 2243.583

And the reason why they're called large language models is

Time: 2246

because one of the first uses of them is people basically

Time: 2250.47

feed in all of the language from, basically, the world

Time: 2254.58

wide web.

Time: 2255.27

And you can think about them as basically prediction machines.

Time: 2261.21

You put in a prompt.

Time: 2262.71

And it can basically predict a version

Time: 2266.01

of what should come next.

Time: 2267.13

So you type in a headline for a news story.

Time: 2270.75

And it can predict what it thinks the story should be.

Time: 2274.46

Or you could train it so that it could

Time: 2276.41

be a chatbot where, OK, if you're

Time: 2278.87

prompted with this question, you can get this response.
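
The "prediction machine" framing can be made concrete with a deliberately tiny sketch: count which word follows which in a toy corpus, then predict the most likely next word. Real LLMs use transformers over subword tokens rather than word-bigram counts, and the corpus here is invented, but the predict-what-comes-next idea is the same.

```python
# Minimal sketch of "language model as prediction machine":
# a word-bigram counter standing in for a trained model.
from collections import Counter, defaultdict

corpus = "the cell divides . the cell grows . the cell divides again".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Most frequent continuation seen in the training data."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("cell"))  # "divides" (seen twice vs. "grows" once)
print(predict_next("the"))   # "cell"
```

Prompting with a headline and getting a story back is this same mechanism scaled up enormously, with the model predicting one token at a time.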

Time: 2282.385

But one of the interesting things

Time: 2283.76

is it turns out that there's actually nothing specific

Time: 2287

to using human language in it.

Time: 2288.51

So if instead of feeding it human language, if you

Time: 2291.32

use that model architecture for a network and instead

Time: 2294.77

you feed it all of the Human Cell Atlas data,

Time: 2298.64

then if you prompt it with a state of a cell,

Time: 2301.67

it can spit out different versions

Time: 2304.79

of how that cell can interact or different states

Time: 2308.55

that the cell could be in next when it interacts

Time: 2310.55

with different things.
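
The same next-state idea, applied to cells rather than words, can be sketched with a toy transition table. To be clear, this is not CZI's model: the cell states and probabilities below are invented, and a simple Markov-style lookup stands in for a large trained model. It only illustrates "prompt with a state of a cell, get plausible next states back."

```python
# Hedged sketch: "predict what comes next" over cell states instead
# of words. A hand-written transition table stands in for a trained
# model; states and probabilities are invented for illustration.
transitions = {
    "healthy": {"healthy": 0.90, "stressed": 0.10},
    "stressed": {"healthy": 0.30, "stressed": 0.50, "apoptotic": 0.20},
    "apoptotic": {"apoptotic": 1.00},
}

def possible_next_states(state, min_p=0.05):
    """States the model considers plausible continuations,
    most likely first."""
    nxt = transitions.get(state, {})
    ranked = sorted(nxt.items(), key=lambda kv: kv[1], reverse=True)
    return [s for s, p in ranked if p >= min_p]

print(possible_next_states("stressed"))
```

In the real setting, the candidate next states are exactly the hypotheses a scientist would then go validate at the bench.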

Time: 2311.4

ANDREW HUBERMAN: Does it have to take a genetics class?

Time: 2312.99

So for instance, if you give it a bunch of genetics data,

Time: 2314.9

do you have to say, hey, by the way, and then

Time: 2316.73

you give it a genetics class so it understands that you've got

Time: 2319.313

DNA, RNA, mRNA, and proteins?

Time: 2321.315

MARK ZUCKERBERG: No, I think that the basic nature of all

Time: 2323.93

these machine learning techniques is they're

Time: 2326.42

basically pattern recognition systems.

Time: 2328.74

So there are these very deep statistical machines

Time: 2333.96

that are very efficient at finding patterns.

Time: 2337.86

So it's not actually--

Time: 2339.65

you don't need to teach a language model that's

Time: 2342.21

trying to speak a language a lot of specific things

Time: 2347.37

about that language either.

Time: 2348.613

You just feed it in a bunch of examples.

Time: 2350.28

And then let's say you teach it about something in English,

Time: 2354.93

but then you also give it a bunch of examples

Time: 2357.06

of people speaking Italian.

Time: 2359.1

It'll actually be able to explain the thing that it

Time: 2361.53

learned in English in Italian.

Time: 2363.78

So the crossover and just the pattern recognition

Time: 2367.59

is the thing that is pretty profound and powerful

Time: 2370.86

about this.

Time: 2371.55

But it really does apply to a lot of different things.

Time: 2374.5

Another example in the scientific community

Time: 2377.58

has been the work that AlphaFold,

Time: 2380.82

basically the folks at DeepMind, have done on protein folding.

Time: 2385.32

It's just basically a lot of the same model architecture.

Time: 2389.59

But instead of language, there they

Time: 2391.47

fed in all of this protein data.

Time: 2394.653

And you can give it a state.

Time: 2395.82

And it can spit out solutions to how those proteins get folded.

Time: 2399.57

So it's very powerful.

Time: 2401.02

I don't think we know yet, as an industry, what

Time: 2405.81

the natural limits of it are.

Time: 2407.907

I think that that's one of the things that's

Time: 2409.74

pretty exciting about the current state.

Time: 2411.6

But it certainly allows you to solve problems

Time: 2415.32

that just weren't solved with the generation of machine

Time: 2419.19

learning that came before it.

Time: 2421.302

ANDREW HUBERMAN: It sounds like CZI

Time: 2422.76

is moving a lot of work that was just done in vitro, in dishes,

Time: 2427.26

and in vivo, in living organisms,

Time: 2430.53

model organisms or humans, to in silico, as we say.

Time: 2434.04

So do you foresee a future where a lot of biomedical research,

Time: 2438.6

certainly the work of CZI included, is done by machines?

Time: 2444.42

I mean, obviously, it's much lower cost.

Time: 2446.97

And you can run millions of experiments, which,

Time: 2449.16

of course, is not to say that humans are not

Time: 2450.993

going to be involved.

Time: 2451.92

But I love the idea that we can run experiments in silico

Time: 2456.69

en masse.

Time: 2458.25

PRISCILLA CHAN: I think in silico experiments are

Time: 2461.37

going to be incredibly helpful to test things quickly,

Time: 2465.24

cheaply and just unleash a lot of creativity.

Time: 2471.18

I do think you need to be very careful about making

Time: 2473.76

sure it still translates and matches the humans.

Time: 2478.65

One thing that's funny in basic science

Time: 2482.04

is we've basically cured every single disease in mice.

Time: 2487.08

We know what's going on when they have a number of diseases

Time: 2490.47

because they're used as a model organism.

Time: 2492.75

But they are not humans.

Time: 2494.49

And a lot of times, that research

Time: 2496.8

is relevant, but not directly one-to-one

Time: 2500.73

translatable to humans.

Time: 2502.24

So you just have to be really careful about making sure

Time: 2505.05

that it actually works for humans.

Time: 2508.593

ANDREW HUBERMAN: Sounds like what CZI is doing

Time: 2510.51

is actually creating a new field.

Time: 2513.87

As I'm hearing all of this, I'm thinking, OK,

Time: 2515.76

this transcends immunology department, cardiothoracic

Time: 2520.35

surgery, I mean neuroscience.

Time: 2522.27

I mean, the idea of a new field, where you certainly embrace

Time: 2526.113

the realities of universities and laboratories

Time: 2528.03

because that's where most of the work that you're funding

Time: 2529.74

is done.

Time: 2530.24

Is that right?

Time: 2531.045

MARK ZUCKERBERG: Mm-hmm.

Time: 2531.24

ANDREW HUBERMAN: So maybe we need

Time: 2532.86

to think about what it means to do science differently.

Time: 2536.53

And I think that's one of the things that's most exciting.

Time: 2539.46

Along those lines, it seems that bringing together

Time: 2541.643

a lot of different types of people

Time: 2543.06

at different major institutions is going

Time: 2547.905

to be especially important.

Time: 2549.03

So I know that the initial CZI Biohub, gratefully,

Time: 2554.09

included Stanford.

Time: 2555.74

We'll put that first in the list,

Time: 2557.6

but also UCSF, forgive me.

Time: 2559.85

I have many friends at UCSF and also Berkeley.

Time: 2562.91

But there are now some additional institutions

Time: 2567.2

involved.

Time: 2567.84

So maybe you could talk about that,

Time: 2569.298

and what motivated the decision to branch outside the Bay Area

Time: 2572.84

and why you selected those particular additional

Time: 2576.5

institutions to be included.

Time: 2577.955

MARK ZUCKERBERG: Well, I'll just say it.

Time: 2580.22

A big part of why we wanted to create additional biohubs

Time: 2582.693

is we were just so impressed by the work

Time: 2584.36

that the folks who were running the first biohub did.

Time: 2587.33

PRISCILLA CHAN: Yeah.

Time: 2588.77

And you should walk through the work

Time: 2591.86

of the Chicago Biohub and the New York Biohub

Time: 2594.35

that we just announced.

Time: 2595.53

But I think it's actually an interesting set of examples

Time: 2598.64

that balance the limits of what you want

Time: 2602

to do with physical material engineering

Time: 2605.21

and where things are purely biological

Time: 2608.76

because the Chicago team is really building more

Time: 2611.407

sensors to be able to understand what's going on in your body.

Time: 2613.99

But that's more of a physical kind of engineering challenge,

Time: 2617.37

whereas the New York team-- we basically

Time: 2620.01

talk about this as like a cellular endoscope of being

Time: 2623.01

able to have an immune cell or something that

Time: 2625.8

can go and understand, what's the thing that's

Time: 2630.283

going on in your body?

Time: 2631.2

But it's not a physical piece of hardware.

Time: 2632.95

It's a cell that you can basically have just go report

Time: 2638.07

out on different things that are happening inside the body.

Time: 2640.783

ANDREW HUBERMAN: Oh, so making the cell the microscope.

Time: 2643.241

PRISCILLA CHAN: Totally.

Time: 2644.085

MARK ZUCKERBERG: And then eventually actually

Time: 2644.91

being able to act on it.

Time: 2645.78

But I mean, you should go into more detail on all this.

Time: 2648.16

PRISCILLA CHAN: So a core principle

Time: 2649.618

of how we think about biohubs is that it has to be--

Time: 2653.91

when we invited proposals, it has

Time: 2655.68

to be at least three institutions,

Time: 2657.72

so really breaking down the barrier of a single university,

Time: 2662.13

oftentimes asking for the people designing the research

Time: 2665.52

aim to come from all different backgrounds and to explain why

Time: 2670.43

the problem that they want to solve

Time: 2672.92

requires interdisciplinary, inter-university, inter-institution

Time: 2678.05

collaboration to actually make happen.

Time: 2680.69

We just put that request for proposal

Time: 2682.67

out there with our San Francisco Biohub

Time: 2685.37

as an example, where they've done

Time: 2687.17

incredible work in single cell biology and infectious disease.

Time: 2692.33

And we got--

Time: 2694.4

I want to say-- like 57 proposals

Time: 2696.8

from over 150 institutions.

Time: 2700.4

A lot of ideas came together.

Time: 2702.2

And we were so, so excited that we've

Time: 2705.44

been able to launch Chicago and New York.

Time: 2708.6

Chicago is a collaboration between UIUC,

Time: 2711.74

University of Illinois Urbana-Champaign,

Time: 2714.32

and University of Chicago and Northwestern.

Time: 2718.73

Obviously, these universities are multifaceted.

Time: 2721.22

But if I were to describe them by their stereotypical

Time: 2725.27

strength, Northwestern has an incredible medical system

Time: 2730.25

and hospital system.

Time: 2731.66

University of Chicago brings to the table

Time: 2735.71

incredible basic science strengths.

Time: 2737.96

University of Illinois is a computing powerhouse.

Time: 2741.68

And so they came together and proposed

Time: 2744.027

that they were going to start thinking

Time: 2745.61

about cells in tissue, so one of the layers

Time: 2750.38

that you just alluded to.

Time: 2752.16

So how do the cells that we know behave and act differently when

Time: 2757.13

they come together as a tissue?

Time: 2758.78

And one of the first tissues that they're starting with

Time: 2761.27

is skin.

Time: 2762.18

So they've already been able to, as a collaboration

Time: 2766.04

under the leadership of Shana Kelley, design engineered

Time: 2771.29

skin tissue.

Time: 2772.73

The architecture looks the same as what's in you and I.

Time: 2777.02

And what they've done is built these super, super thin

Time: 2781.58

sensors.

Time: 2782.12

And they embed these sensors throughout the layers

Time: 2785.6

of this engineered tissue.

Time: 2787.1

And they read out the data.

Time: 2789.12

They want to see what these cells are secreting,

Time: 2793.11

how these cells talk to each other

Time: 2794.8

and what happens when these cells get inflamed.

Time: 2797.55

Inflammation is an incredibly important process

Time: 2799.98

that drives 50% of all deaths.

Time: 2803.35

And so this is another disease-agnostic approach.

Time: 2806.4

We want to understand inflammation.

Time: 2808.14

And they're going to get a ton of information

Time: 2810.33

out from these sensors that tell you what happens when something

Time: 2815.4

goes awry because right now we can say,

Time: 2818.28

when you have an allergic reaction,

Time: 2820.08

your skin gets red and puffy.

Time: 2821.82

But what is the earliest signal of that?

Time: 2824.4

And these sensors can look at the behaviors

Time: 2828.15

of these cells over time.

Time: 2829.56

And then you can apply a large language model

Time: 2831.66

to look at the earliest statistically significant

Time: 2834.78

changes that can allow you to intervene as early as possible.
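
As a toy version of "find the earliest statistically significant change" in a sensor stream: the snippet below uses a simple z-score change detector rather than a large language model, and the readings and thresholds are made up. A real pipeline on tissue-sensor data would be far richer, but the shape of the question is the same: how early, relative to a baseline, does the signal depart from normal?

```python
# Toy earliest-change detection on a sensor time series.
# A z-score test stands in for whatever learned model would be
# used in practice; all numbers are invented for illustration.
from statistics import mean, stdev

def earliest_change(readings, baseline_n=5, z_threshold=3.0):
    """Index of the first reading more than z_threshold standard
    deviations from the baseline window's mean, or None."""
    base = readings[:baseline_n]
    mu, sigma = mean(base), stdev(base)
    for i, x in enumerate(readings[baseline_n:], start=baseline_n):
        if abs(x - mu) > z_threshold * sigma:
            return i
    return None

# Hypothetical cytokine-like signal: flat baseline, then a spike
# well before any visible symptom like redness or swelling.
signal = [1.0, 1.1, 0.9, 1.0, 1.0, 1.1, 2.5, 4.0, 6.0]
print(earliest_change(signal))  # flags index 6, the first clear spike
```

The earlier that index lands relative to visible symptoms, the earlier an intervention could begin, which is the point made above about catching an allergic reaction before the skin ever gets red and puffy.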

Time: 2839.71

So that's what Chicago's doing.

Time: 2841.8

They're starting in the skin cells.

Time: 2844.8

They're also looking at the neuromuscular junction, which

Time: 2848.03

is the connection between where a neuron attaches to a muscle

Time: 2852.05

and tells the muscle how to behave--

Time: 2853.64

super important in things like ALS, but also in aging.

Time: 2858.68

The slowed transmission of information

Time: 2861.29

across that neuromuscular junction

Time: 2862.82

is what causes old people to fall.

Time: 2864.71

Their brain cannot trigger their muscles to react fast enough.

Time: 2867.9

And so we want to be able to embed

Time: 2870.2

these sensors to understand how these different, interconnected

Time: 2875.12

systems within our bodies work together.

Time: 2877.61

In New York, they're doing a related, but equally exciting

Time: 2883.82

project where they're engineering individual cells

Time: 2888.32

to be able to go in and identify changes in a human body.

Time: 2894.99

So what they'll do is--

Time: 2898.13

they're calling it--

Time: 2899.27

ANDREW HUBERMAN: It's wild.

Time: 2900.06

I mean, I love that.

Time: 2900.893

I mean, this is--

Time: 2902.077

I don't want to go on a tangent.

Time: 2903.41

But for those that want to look up adaptive optics--

Time: 2905.9

there's a lot of distortion and interference

Time: 2908.812

when you try and look at something really

Time: 2910.52

small or really far away.

Time: 2911.72

And really smart physicists figured out,

Time: 2914.54

well, use the interference as part of the microscope.

Time: 2917.51

Make those actually lenses of the microscope.

Time: 2920.34

MARK ZUCKERBERG: We should talk about imaging

Time: 2921.14

separately after you talk about the New York Biohub.

Time: 2923.42

ANDREW HUBERMAN: It's extremely clever, along those lines.

Time: 2925.34

It's not intuitive.

Time: 2926.15

But then when you hear it, it's like it makes so much sense.

Time: 2929.06

It's not immediately intuitive.

Time: 2930.53

Make the cells that already can navigate to tissues

Time: 2933.26

or embed themselves in tissues be the microscope

Time: 2935.298

within that tissue.

Time: 2936.09

I love it.

Time: 2937.085

PRISCILLA CHAN: Totally.

Time: 2938.38

The way that I explain this to my friends

Time: 2940.52

and my family is this is Fantastic Voyage,

Time: 2943.76

but real life.

Time: 2946.13

We are going into the human body.

Time: 2947.81

And we're using the immune cells, which are privileged

Time: 2951.02

and already working to keep your body healthy,

Time: 2953.3

and being able to target them to examine certain things.

Time: 2957

So you can engineer an immune cell to go in your body

Time: 2961.19

and look inside your coronary arteries and say,

Time: 2964.47

are these arteries healthy?

Time: 2966.62

Or are there plaques?

Time: 2967.76

Because plaques lead to blockage,

Time: 2970.4

which lead to heart attacks.

Time: 2972.35

And the cell can then record that information

Time: 2975.35

and report it back out.

Time: 2976.48

That's the first half of what the New York

Time: 2978.23

Biohub is going to do.

Time: 2979.43

ANDREW HUBERMAN: Fantastic.

Time: 2979.97

PRISCILLA CHAN: The second half is can you

Time: 2981.72

then engineer the cells to go do something about it.

Time: 2984.27

Can I then tell a different

Time: 2986.27

immune cell, one that is able to move through your body,

Time: 2989.15

to go in and clean that up in a targeted way?

Time: 2992.49

And so it's incredibly exciting.

Time: 2996.08

They're going to study things that

Time: 2998.06

are immune privileged, that your immune system normally

Time: 3001.18

doesn't have access to--

Time: 3003.25

things like ovarian and pancreatic cancer.

Time: 3006.37

They'll also look at a number of neurodegenerative diseases,

Time: 3009.7

since the immune system doesn't presently have a ton of access

Time: 3014.44

into the nervous system.

Time: 3017.02

But it's both mind blowing and it feels like sci-fi.

Time: 3022

But science is actually in a place

Time: 3023.74

where if you really push a group of incredibly

Time: 3026.92

qualified scientists and say, could you do this

Time: 3029.62

if given the chance, the answer is like probably.

Time: 3033.04

Given enough time, the right team, and resources.

Time: 3036.82

It's doable.

Time: 3037.66

MARK ZUCKERBERG: Yeah.

Time: 3038.577

I mean, it's a 10 to 15-year project.

Time: 3040.48

But it's awesome, engineered cells, yeah.

Time: 3043.258

ANDREW HUBERMAN: I love the optimism.

Time: 3044.8

And the moment you said make the cell the microscope,

Time: 3047.26

so to speak, I was like yes, yes and yes.

Time: 3050.11

It just makes so much sense.

Time: 3052.27

What motivated the decision to do the work of CZI

Time: 3056.98

in the context of existing universities as opposed to--

Time: 3060.19

there's still some real estate up in Redwood City

Time: 3062.59

where there's a bunch of space to put biotech companies

Time: 3064.93

and just hiring people from all backgrounds

Time: 3068.05

and saying, hey, have at it and doing this stuff from scratch?

Time: 3071.65

I mean, it's a very interesting decision

Time: 3074.68

to do this in the context of an existing

Time: 3076.36

framework of graduate students that need to do their thesis

Time: 3078.73

and get a first author paper because there's

Time: 3080.563

a whole set of structures within academia

Time: 3082.3

that I think both facilitate, but also limit

Time: 3084.61

the progression of science.

Time: 3086.23

That independent investigator model

Time: 3088.27

that we talked about a little bit earlier,

Time: 3090.46

it's so core to the way science has been done.

Time: 3092.842

This is very different and frankly sounds

Time: 3094.55

far more efficient, if I'm to be completely honest.

Time: 3097.19

And we'll see if I renew my NIH funding after saying that.

Time: 3101.36

But I think we all want the same thing.

Time: 3103.7

As scientists and as humans, we want

Time: 3107.85

to understand the way we work.

Time: 3109.1

And we want healthy people to persist to be healthy.

Time: 3113.305

And we want sick people to get healthy.

Time: 3114.93

I mean, that's really ultimately the goal.

Time: 3116.68

It's not super complicated.

Time: 3117.84

It's just hard to do.

Time: 3118.67

PRISCILLA CHAN: So the teams at the biohub

Time: 3120.42

are actually independent of the universities.

Time: 3123.25

ANDREW HUBERMAN: Got it.

Time: 3124.25

PRISCILLA CHAN: So each biohub will probably

Time: 3126.083

have in total maybe 50 people working on deep efforts.

Time: 3131.06

However, it's an acknowledgment that not all of the best

Time: 3134.84

scientists who can contribute to this area

Time: 3137.12

are actually going to want to leave a university

Time: 3141.5

or want to take on the full-time scope of this project.

Time: 3145.52

So the ability to partner with universities

Time: 3149.48

and to have the faculty at all the universities

Time: 3153.83

be able to contribute to the overall project,

Time: 3157.25

is how the biohub is structured.

Time: 3159.22

ANDREW HUBERMAN: Got it.

Time: 3160.22

MARK ZUCKERBERG: But a lot of the way that we're approaching

Time: 3162.72

CZI is this long-term, iterative project

Time: 3165.92

to figure out-- try a bunch of different things,

Time: 3168.2

figure out which things produce the most interesting results,

Time: 3171.68

and then double down on those in the next five-year push.

Time: 3175.61

So we just went through this period

Time: 3177.8

where we wrapped up the first five

Time: 3180.2

years of the science program.

Time: 3181.695

And we tried a lot of different models,

Time: 3183.32

all kinds of different things.

Time: 3184.57

And it's not that the biohub model--

Time: 3187.31

we don't think it's the best or only model.

Time: 3190.13

But we found that it was a really interesting way

Time: 3194.09

to unlock a bunch of collaboration

Time: 3196.37

and bring some technical resources that

Time: 3198.95

allow for this longer term development.

Time: 3200.84

And it's not something that is widely being pursued

Time: 3205.37

across the rest of the field.

Time: 3206.6

So we figured, OK, this is an interesting thing

Time: 3209.09

that we can help push on.

Time: 3210.5

But I mean, yeah, we do believe in the collaboration.

Time: 3214.62

But I also think that we come at this with--

Time: 3217.91

we don't think that the way that we're pursuing this

Time: 3220.34

is the only way to do this or the way

Time: 3222.345

that everyone should do it.

Time: 3223.47

We're pretty aware of what is the rest of the ecosystem

Time: 3229.01

and how we can play a unique role in it.

Time: 3231.11

ANDREW HUBERMAN: It feels very synergistic

Time: 3232.25

with the way science is already done

Time: 3233.81

and also fills an incredibly important niche that,

Time: 3236.78

frankly, wasn't filled before.

Time: 3239.42

Along the lines of implementation--

Time: 3241.35

so let's say your large language models combined with imaging

Time: 3245.93

tools reveal that a particular set of genes acting

Time: 3250.79

in a cluster--

Time: 3252.11

I don't know-- set up an organ crash.

Time: 3255.32

Let's say the pancreas crashes at a particular stage

Time: 3258.17

of pancreatic cancer.

Time: 3259.98

I mean, it's still one of the deadliest of the cancers.

Time: 3262.85

And there are others that you certainly wouldn't want to get.

Time: 3266.39

But that's among the ones you wouldn't want to get the most.

Time: 3269.9

So you discover that.

Time: 3270.883

And then the idea is that, OK,

Time: 3272.3

then AI reveals some potential drug

Time: 3274.88

targets that then bear out in vitro, in a dish

Time: 3278.46

and in a mouse model.

Time: 3280.7

How is the actual implementation to drug discovery?

Time: 3283.97

Or maybe this target is druggable, maybe it's not.

Time: 3287.15

Maybe it requires some other approach--

Time: 3289.76

laser ablation approach or something.

Time: 3292.73

We don't know.

Time: 3293.69

But ultimately, is CZI going to be

Time: 3295.79

involved in the implementation of new therapeutics?

Time: 3298.07

Is that the idea?

Time: 3299.308

MARK ZUCKERBERG: Less so.

Time: 3300.35

PRISCILLA CHAN: Less so.

Time: 3302.13

This is where it's important to work in an ecosystem

Time: 3304.64

and to know your own limitations.

Time: 3306.65

There are groups, and startups and companies

Time: 3310.07

that take that and bring it to translation very effectively.

Time: 3314.33

I would say the place where we have

Time: 3316.85

a small window into that world is actually

Time: 3319.88

our work with rare disease groups.

Time: 3322.49

We have, through our Rare As One portfolio,

Time: 3325.94

funded patient advocates to create rare disease

Time: 3330.47

organizations where patients come together and actually pool

Time: 3337.18

their collective experience.

Time: 3339.13

They build bioregistries, registries

Time: 3341.47

of their natural history.

Time: 3342.85

And they both partner with researchers

Time: 3346.36

to do the research about their disease

Time: 3348.28

and with drug developers to incentivize drug developers

Time: 3353.44

to focus on what they may need for their disease.

Time: 3356.84

And one thing that's important to point out

Time: 3359.35

is that rare diseases aren't rare.

Time: 3361.03

There are over 7,000 rare diseases

Time: 3363.82

that collectively impact many, many individuals.

Time: 3368.42

And I think the thing that's, from a basic science

Time: 3374.24

perspective, the incredibly fascinating thing

Time: 3376.85

about rare diseases is that they're actually windows to how

Time: 3380.03

the body normally should work.

Time: 3382.65

And so there are often genes that, when

Time: 3388.34

they're mutated, cause very specific diseases,

Time: 3392

but that tell you how the normal biology works as well.

Time: 3396.2

ANDREW HUBERMAN: Got it.

Time: 3397.25

So you discussed basically the major goals and initiatives

Time: 3402.71

of the CZI for the next, say, 5 to 10 years.

Time: 3406.46

And then beyond that, the targets

Time: 3408.74

will be explored by biotech companies.

Time: 3411.17

They'll grab those targets, and test them and implement them.

Time: 3414.48

MARK ZUCKERBERG: There's also, I think,

Time: 3416.105

been a couple of teams from the initial biohub that

Time: 3418.55

were interested in spinning out ideas into startups.

Time: 3421.14

So even though it's not a thing that we're

Time: 3424.07

going to pursue because we're a philanthropy,

Time: 3427.88

we want to enable the work that gets

Time: 3430.4

done to be able to get turned into companies and things

Time: 3433.49

that other people go take and run

Time: 3436.07

towards building ultimately therapeutics.

Time: 3440.002

So that's another zone.

Time: 3440.96

But that's not a thing that we're going to do.

Time: 3442.97

ANDREW HUBERMAN: Got it.

Time: 3444.38

I gather you're both optimists.

Time: 3446.63

Yeah?

Time: 3447.26

Is that part of what brought you together?

Time: 3450.21

Forgive me for switching to a personal question.

Time: 3452.39

But I love the optimism that seems

Time: 3455

to sit at the root of the CZI.

Time: 3456.523

PRISCILLA CHAN: I will say that we

Time: 3457.94

are incredibly hopeful people.

Time: 3460.28

But it manifests in different ways between the two of us.

Time: 3464.45

MARK ZUCKERBERG: Yeah.

Time: 3466.137

PRISCILLA CHAN: How would you describe

Time: 3467.72

your optimism versus mine?

Time: 3469.67

It's not a loaded question.

Time: 3472.445

MARK ZUCKERBERG: I don't know.

Time: 3478.057

Huh.

Time: 3482.74

I mean, I think I'm more probably technologically

Time: 3485.47

optimistic about what can be built.

Time: 3488

And I think you, because of your focus as an actual doctor,

Time: 3495.43

have more of a sense of how that's

Time: 3498.79

going to affect actual people in their lives,

Time: 3502.66

whereas, for me, it's like--

Time: 3504.58

I mean, a lot of my work is we touch a lot

Time: 3508.96

of people around the world.

Time: 3510.88

And the scale is immense.

Time: 3512.48

And I think, for you, it's like being

Time: 3514.57

able to improve the lives of individuals,

Time: 3518.63

whether it's students at any of the schools that you've started

Time: 3521.56

or any of the stuff that we've supported through the education

Time: 3524.143

work, which isn't the goal here, or just

Time: 3527.83

being able to improve people's lives in that way I think

Time: 3530.44

is the thing that I've seen you be super passionate about.

Time: 3533.71

I don't know.

Time: 3534.473

Do you agree with that characterization?

Time: 3536.14

I'm trying-- I'm trying to--

Time: 3537.37

PRISCILLA CHAN: Yeah, I agree with that.

Time: 3539.037

I think that's very fair.

Time: 3540.79

And I'm sort of giggling to myself

Time: 3542.84

because in day-to-day life, as life partners,

Time: 3546.74

our relative optimism comes through

Time: 3549.35

as Mark just is overly optimistic about his time

Time: 3554.12

management and will get engrossed in interesting ideas.

Time: 3557.192

MARK ZUCKERBERG: I'm late.

Time: 3558.275

PRISCILLA CHAN: And he's late.

Time: 3559.708

ANDREW HUBERMAN: Physicians are very punctual, yeah.

Time: 3561.875

PRISCILLA CHAN: And because he's late,

Time: 3563.458

I have to channel Mark is an optimist whenever

Time: 3566.36

I'm waiting for him.

Time: 3567.2

MARK ZUCKERBERG: That's such a nice way of--

Time: 3569.033

OK, I'll start using that.

Time: 3570.32

PRISCILLA CHAN: That's what I think

Time: 3571.25

when I'm in the driveway with the kids waiting for you.

Time: 3573.542

I'm like, Mark is an optimist.

Time: 3576

And so his optimism translates to some tardiness,

Time: 3579.92

whereas I'm like, how is this going to happen?

Time: 3585.607

I'm going to open a spreadsheet.

Time: 3586.94

I'm going to start putting together a plan

Time: 3589.13

and pulling together all the pieces,

Time: 3591.41

calling people to bring something to life.

Time: 3595.503

MARK ZUCKERBERG: But it is one of my favorite quotes, that

Time: 3597.92

is optimists tend to be successful

Time: 3601.34

and pessimists tend to be right.

Time: 3603.94

And yeah, I mean, I think it's true in a lot

Time: 3606.55

of different aspects of life.

Time: 3609.308

ANDREW HUBERMAN: Who said that?

Time: 3610.6

Did you say that, Mark Zuckerberg?

Time: 3611.86

MARK ZUCKERBERG: No, I did not.

Time: 3613.03

PRISCILLA CHAN: Absolutely not.

Time: 3613.65

MARK ZUCKERBERG: No, no, no.

Time: 3614.89

I like it.

Time: 3615.43

I did not invent it.

Time: 3617.05

ANDREW HUBERMAN: We'll give it to you.

Time: 3617.84

We'll put it out there.

Time: 3618.4

MARK ZUCKERBERG: No, no, no.

Time: 3619.66

ANDREW HUBERMAN: Just kidding, just kidding.

Time: 3621.01

MARK ZUCKERBERG: But I do think that there's really

Time: 3622.69

something to it, right?

Time: 3623.648

I mean, if you're discussing any idea,

Time: 3626.11

there's all these reasons why it might not work.

Time: 3631.12

And those reasons are probably true.

Time: 3634.06

The people who are stating them probably have some validity

Time: 3637.5

to it.

Time: 3638

But the question is, is that the most productive way to view

Time: 3640.63

the world?

Time: 3642.64

Across the board, I think the people

Time: 3644.56

who tend to be the most productive

Time: 3646.21

and get the most done--

Time: 3647.92

you kind of need to be optimistic

Time: 3649.355

because if you don't believe that something can get done,

Time: 3651.73

then why would you go work on it?

Time: 3653.42

ANDREW HUBERMAN: The reason I ask

Time: 3654.795

the question is that these days we hear a lot about the future

Time: 3658.33

is looking so dark in these various ways.

Time: 3660.95

And you have children.

Time: 3662.93

So you have families.

Time: 3664.16

And you are a family, excuse me.

Time: 3666.77

And you also have families independently

Time: 3669.38

that are now merged.

Time: 3670.49

But I love the optimism behind the CZI

Time: 3674.27

because, behind all this, there's

Time: 3678.62

a set of big statements on the wall.

Time: 3681.02

One, the future can be better than the present,

Time: 3683.87

in terms of treating disease, maybe even, you said,

Time: 3686.99

eliminating diseases, all diseases.

Time: 3689.93

I love that optimism.

Time: 3691.94

And there's a tractable path to do it.

Time: 3695.21

We're going to put literally money, and time, and energy,

Time: 3698.48

and people, and technology and AI behind that.

Time: 3702.15

And so I have to ask, was having children

Time: 3707.51

a significant modifier in terms of your view of the future?

Time: 3711.598

Like wow, you hear all this doom and gloom.

Time: 3713.39

What's the future going to be like for them?

Time: 3715.25

Did you sit back and think, what would it

Time: 3718.4

look like if there was a future with no diseases?

Time: 3721.097

Is that the future we want our children in?

Time: 3722.93

I mean, I'm voting a big yes.

Time: 3724.49

So we're not going to debate that at all.

Time: 3726.63

But was having children an inspiration

Time: 3729.32

for the CZI in some way?

Time: 3731.403

MARK ZUCKERBERG: Yeah.

Time: 3732.32

So--

Time: 3732.82

PRISCILLA CHAN: I think my answer to that--

Time: 3735.6

I would dial backwards for me.

Time: 3738.53

And I'll just tell a very brief story about my family.

Time: 3742.01

I'm the daughter of Chinese-Vietnamese refugees.

Time: 3745.7

My parents and grandparents were boat people,

Time: 3749.27

if you remember, people left Vietnam

Time: 3751.37

during the war in these small boats into the South China Sea.

Time: 3754.91

And there were stories about how these boats would sink

Time: 3760.55

with whole families on them.

Time: 3761.88

And so my grandparents, both sets

Time: 3763.85

of grandparents who knew each other,

Time: 3765.5

decided that there was a better future out there.

Time: 3768.98

And they were willing to take risks for it.

Time: 3771.48

But they were afraid of losing all of their kids.

Time: 3774.89

My dad is one of six.

Time: 3776.45

My mom is one of 10.

Time: 3777.9

And so they decided that there was something

Time: 3782.63

out there in this bleak time.

Time: 3784.43

And they paired up their kids, one from each family,

Time: 3788.09

and sent them out on these little boats

Time: 3790.79

before the internet, before cell phones, and just said,

Time: 3795.93

we'll see you on the other side.

Time: 3797.515

ANDREW HUBERMAN: Wow.

Time: 3798.39

PRISCILLA CHAN: And the kids were

Time: 3800.31

between the ages of like 10 to 25, so young kids.

Time: 3805.68

My mom was a teenager, early teen when this happened.

Time: 3810.3

And everyone made it.

Time: 3812.25

And I get to sit here and talk to you.

Time: 3814.6

So how could I not believe that better is possible?

Time: 3819.19

And like I hope that that's in my epigenetics somewhere

Time: 3822.63

and that I carry on.

Time: 3824.145

ANDREW HUBERMAN: That is a spectacular story.

Time: 3826.02

PRISCILLA CHAN: Isn't that wild?

Time: 3826.77

ANDREW HUBERMAN: It is spectacular.

Time: 3827.7

PRISCILLA CHAN: How can I be a pessimist with that?

Time: 3829.53

ANDREW HUBERMAN: I love it.

Time: 3830.43

And I so appreciate that you became a physician

Time: 3832.41

because you're now bringing that optimism,

Time: 3835.29

and that epigenetic understanding,

Time: 3837.768

and cognitive understanding and emotional understanding

Time: 3840.06

to the field of medicine.

Time: 3841.62

So I'm grateful to the people that made that decision.

Time: 3845.005

PRISCILLA CHAN: Yeah.

Time: 3848.4

I've always known that story.

Time: 3850.2

But you don't understand how wild that feels

Time: 3853.11

until you have your own child.

Time: 3854.61

And you're like, well, I can't even--

Time: 3856.44

I refuse to let her use glass bottles only or something

Time: 3861.09

like that.

Time: 3861.66

And you're like, oh my God, the risk and the willingness

Time: 3866.22

of my grandparents to believe in something bigger and better

Time: 3870.24

is just astounding.

Time: 3872.35

And our own children give it a sense of urgency.

Time: 3876.127

ANDREW HUBERMAN: Again, a spectacular story.

Time: 3877.96

And you're sending knowledge out into the fields of science

Time: 3880.827

and bringing knowledge into the fields of science.

Time: 3882.91

And I love this.

Time: 3884.05

We'll see you on the other side.

Time: 3885.82

I'm confident that it will all come back.

Time: 3890.05

Well, thank you so much for that.

Time: 3893.593

Mark, you have the opportunity to talk about--

Time: 3895.51

did having kids change your worldview?

Time: 3897.67

MARK ZUCKERBERG: It's really tough to beat that story.

Time: 3899.99

ANDREW HUBERMAN: It is tough to beat that story.

Time: 3901.57

And they are also your children.

Time: 3903.08

So in this case, you get two for the price of one, so to speak.

Time: 3908.535

MARK ZUCKERBERG: Having children definitely changes

Time: 3910.66

your time horizon.

Time: 3911.84

So I think that that's one thing.

Time: 3914.08

There are all these things that I think we had talked about,

Time: 3917.003

for as long as we've known each other, that you eventually

Time: 3919.42

want to go do.

Time: 3920.07

But then it's like, oh, we're having kids.

Time: 3921.82

We need to get on this, right?

Time: 3923.74

So I think that there's--

Time: 3924.88

PRISCILLA CHAN: That was actually

Time: 3925.6

one of the checklists, the baby checklist before the first.

Time: 3928.51

MARK ZUCKERBERG: It was like, the baby's coming.

Time: 3930.51

We have to start CZI.

Time: 3932.153

PRISCILLA CHAN: Truly.

Time: 3933.07

MARK ZUCKERBERG: I'm like sitting in the hospital

Time: 3935.38

delivery room finishing editing the letter that we

Time: 3939.28

were going to publish to announce the work.

Time: 3941.313

PRISCILLA CHAN: Some people think that is an exaggeration.

Time: 3943.73

It was not.

Time: 3944.53

We really were editing the final draft.

Time: 3946.895

ANDREW HUBERMAN: Birthed CZI before you

Time: 3948.52

birthed the human child.

Time: 3950.89

Well, it's an incredible Initiative.

Time: 3953.26

I've been following it since its inception.

Time: 3956.86

And it's already been tremendously successful.

Time: 3960.22

And everyone in the field of science--

Time: 3962.23

and I have a lot of communication with those

Time: 3963.55

folks--

Time: 3964.09

feels the same way.

Time: 3964.96

And the future is even brighter for it, it's clear.

Time: 3967.63

And thank you for expanding to the Midwest and New York.

Time: 3970.21

And we're all very excited to see where all of this goes.

Time: 3974.68

I share in your optimism.

Time: 3976.6

And thank you for your time today.

Time: 3978.26

PRISCILLA CHAN: Yeah, thank you.

Time: 3979.33

MARK ZUCKERBERG: Thank you.

Time: 3980.455

A lot more to do.

Time: 3981.163

ANDREW HUBERMAN: I'd like to take a quick break

Time: 3983.122

and thank our sponsor, InsideTracker.

Time: 3984.82

InsideTracker is a personalized nutrition platform

Time: 3987.46

that analyzes data from your blood and DNA

Time: 3990.01

to help you better understand your body

Time: 3991.78

and help you reach your health goals.

Time: 3993.71

I've long been a believer in getting regular blood work done

Time: 3996.233

for the simple reason that many of the factors that impact

Time: 3998.65

your immediate and long-term health

Time: 4000.24

can only be analyzed from a quality blood test.

Time: 4003.103

Now, a major problem with a lot of blood

Time: 4004.77

tests out there, however, is that you get information

Time: 4007.38

back about metabolic factors, lipids, and hormones

Time: 4009.9

and so forth.

Time: 4010.443

But you don't know what to do with that information.

Time: 4012.61

With InsideTracker, they make it very easy

Time: 4014.89

because they have a personalized platform that

Time: 4017.05

allows you to see the levels of all those things--

Time: 4019.42

metabolic factors, lipids, hormones, et cetera.

Time: 4021.76

But it gives you specific directives

Time: 4023.83

that you can follow that relate to nutrition,

Time: 4025.99

behavioral modification, supplements,

Time: 4027.77

et cetera that can help you bring

Time: 4029.2

those numbers into the ranges that are optimal for you.

Time: 4032.03

If you'd like to try InsideTracker,

Time: 4033.49

you can go to insidetracker.com/huberman

Time: 4036.58

to get 20% off any of InsideTracker's plans.

Time: 4039.47

Again, that's insidetracker.com/huberman.

Time: 4042.64

And now for my discussion with Mark Zuckerberg.

Time: 4046

Slight shift of topic here--

Time: 4048.46

you're extremely well-known for your role

Time: 4050.95

in technology development.

Time: 4052.36

But by virtue of your personal interests

Time: 4055.57

and also where Meta technology interfaces

Time: 4059.59

with mental health and physical health,

Time: 4061.51

you're starting to become synonymous with health,

Time: 4065.09

whether you realize it or not.

Time: 4067.03

Part of that is because there's posts, footage

Time: 4070.21

of you rolling jiu jitsu.

Time: 4071.83

You won a jiu jitsu competition recently.

Time: 4074.68

You're doing other forms of martial arts, water sports,

Time: 4079.01

including surfing, and on and on.

Time: 4082.34

So you're doing it yourself.

Time: 4085.09

But maybe we could just start off with technology

Time: 4088.27

and get this issue out of the way first, which

Time: 4092.83

is that I think many people assume that technology,

Time: 4096.62

especially technology that involves a screen, excuse

Time: 4099.43

me, of any kind is going to be detrimental to our health.

Time: 4103.25

But that doesn't necessarily have to be the case.

Time: 4106.79

So could you explain how you see technology

Time: 4111.189

meshing with, inhibiting, or maybe even promoting

Time: 4115

physical and mental health?

Time: 4116.873

MARK ZUCKERBERG: Sure.

Time: 4117.79

I mean, I think this is a really important topic.

Time: 4123.43

The research that we've done suggests that it's not

Time: 4127.84

all good or all bad.

Time: 4129.229

I think how you're using the technology has

Time: 4131.47

a big impact on whether it is basically

Time: 4134.56

a positive experience for you.

Time: 4136.189

And even within technology, even within social media,

Time: 4139.51

there's not one type of thing that people do.

Time: 4142.51

I think, at its best, you're forming meaningful connections

Time: 4148

with other people.

Time: 4149.95

And there's a lot of research that basically suggests

Time: 4152.439

that it's the relationships that we have

Time: 4155.529

and the friendships that bring the most happiness in our lives

Time: 4161.21

and, at some level, end up even correlating

Time: 4163.51

with living a longer and healthier life

Time: 4165.22

because that grounding that you have in community

Time: 4168.354

ends up being important for that.

Time: 4169.729

So I think that aspect of social media,

Time: 4172.729

which is the ability to connect with people, to understand

Time: 4177.303

what's going on in people's lives,

Time: 4178.72

have empathy for them, communicate what's

Time: 4180.97

going on with your life, express that, that's

Time: 4183.85

generally positive.

Time: 4184.93

There are ways that it can be negative,

Time: 4187.39

in terms of bad interactions, things like bullying,

Time: 4190.507

which we can talk about because there's a lot that we've

Time: 4192.84

done to basically make sure that people can be safe from that

Time: 4195.72

and give people tools and give kids the ability to have

Time: 4198.54

the right parental controls.

Time: 4199.8

Their parents can oversee that.

Time: 4201.18

But that's the interacting with people side.

Time: 4204.27

There's another side of all of this,

Time: 4206.95

which I think of as just passive consumption, which,

Time: 4211.74

at its best, is entertainment.

Time: 4215.16

And entertainment is an important human thing, too.

Time: 4217.8

But I don't think that that has quite the same association

Time: 4222.3

with the long-term well-being and health benefits

Time: 4227.28

as being able to help people connect with other people does.

Time: 4231.33

And I think, at its worst, some of the stuff we see online--

Time: 4239.27

I think, these days, a lot of the news

Time: 4241.1

is just so relentlessly negative that it's just

Time: 4244.34

hard to come away from an experience

Time: 4246.95

of looking at the news for half an hour

Time: 4250.61

and feel better about the world.

Time: 4252.87

So I think that there's a mix on this.

Time: 4257.34

I think the more that social media

Time: 4259.02

is about connecting with people and the more

Time: 4263.31

that, when you're consuming the media

Time: 4267.72

part of social media, you're learning about things that

Time: 4271.38

enrich you and can provide inspiration or education, as

Time: 4275.52

opposed to things that just leave you with a more

Time: 4278.7

toxic feeling-- that's the balance that we try to get

Time: 4283.38

right across our products.

Time: 4284.62

And I think we're pretty aligned with the community

Time: 4287.01

because, at the end of the day, I mean, people

Time: 4288.93

don't want to use a product and come away feeling bad.

Time: 4292.38

There's a lot that people talk about--

Time: 4296.032

evaluate a lot of these products in terms

Time: 4297.74

of information and utility.

Time: 4299.54

But I think it's as important, when

Time: 4301.37

you're designing a product, to think

Time: 4303.32

about what kind of feeling you're creating

Time: 4305.69

with the people who use it, whether that's

Time: 4308.3

an aesthetic sense when you're designing hardware,

Time: 4310.58

or just what do you make people feel.

Time: 4314.18

And generally, people don't want to feel bad, right?

Time: 4318.74

That doesn't mean that we want to shelter people

Time: 4320.82

from bad things that are happening in the world.

Time: 4322.82

But I don't really think that--

Time: 4325.79

it's not what people want, for us to just

Time: 4329.15

be showing all this super negative stuff all day long.

Time: 4334.917

So we work hard on all these different problems-- making

Time: 4337.25

sure that we're helping connect people as best as possible,

Time: 4340.14

helping make sure that we give people good tools

Time: 4343.288

to block people who might be bullying them,

Time: 4345.08

or harass them, or especially for younger folks,

Time: 4348.02

anyone under the age of 16 defaults into an experience

Time: 4350.45

that is private.

Time: 4352.04

We have all these parental tools.

Time: 4354.26

So that way, parents can understand what their children

Time: 4358.92

are up to in a good balance.

Time: 4361.5

And then on the other side, we try

Time: 4363.6

to give people tools to understand how

Time: 4365.7

they're spending their time.

Time: 4367.44

We try to give people tools so that if you're a teen

Time: 4371.04

and you're stuck in some loop of just looking

Time: 4375.022

at one type of content, we'll nudge you and say, hey,

Time: 4377.23

you've been looking at content of this type for a while.

Time: 4379.62

How about something else?

Time: 4380.662

And here's a bunch of other examples.
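
The nudge Mark describes-- noticing that a session is dominated by one type of content and suggesting something else-- can be sketched in a few lines. This is purely illustrative: the category labels, the 80% threshold, and the minimum session length are assumptions for the example, not Meta's actual logic.

```python
from collections import Counter

def should_nudge(recent_views, threshold=0.8, min_views=20):
    """Return the dominant content category if a viewing session is
    dominated by a single category, else None.

    recent_views: list of category labels for recently viewed items.
    threshold and min_views are illustrative tuning parameters.
    """
    if len(recent_views) < min_views:
        return None  # too little data to judge the session
    category, count = Counter(recent_views).most_common(1)[0]
    if count / len(recent_views) >= threshold:
        return category  # session is dominated by one category: nudge
    return None

# A session of 25 views, 22 of one type, triggers a nudge:
views = ["street_fights"] * 22 + ["animals"] * 3
print(should_nudge(views))  # → street_fights
```

In practice the interesting design work is in choosing the categories and thresholds; the detection itself is simple.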

Time: 4382.45

So I think that there are things that you

Time: 4383.67

can do to push this in a positive direction.

Time: 4385.59

But I think it just starts with having

Time: 4387.78

a more nuanced view of this isn't all good or all bad.

Time: 4391.65

And the more that you can make it a positive

Time: 4394.02

thing, the better this will be for all the people

Time: 4396.12

who use our products.

Time: 4397.53

ANDREW HUBERMAN: That makes really good sense.

Time: 4399.447

In terms of the negative experience, I agree.

Time: 4401.47

I don't think anyone wants a negative experience

Time: 4403.56

in the moment.

Time: 4404.31

I think where some people get concerned perhaps--

Time: 4406.96

and I think about my own interactions with, say,

Time: 4408.96

Instagram, which I use all the time for getting information

Time: 4412.47

out, but also consuming information.

Time: 4414.27

And I happen to love it.

Time: 4415.27

It's where I essentially launched

Time: 4416.82

the non-podcast segment of my podcast and continue to.

Time: 4421.66

I can think of experiences that are a little bit

Time: 4424.03

like highly processed food, where

Time: 4426.67

it tastes good at the time.

Time: 4428.5

It's highly engrossing.

Time: 4430.87

But it's not necessarily nutritious.

Time: 4433.18

And you don't feel very good afterwards.

Time: 4435.95

So for me, that would be the little collage

Time: 4439.42

of default options to click on in Instagram.

Time: 4441.917

Occasionally, I notice-- and this just

Time: 4443.5

reflects my failure, not Instagram's, that there

Time: 4447.34

are a lot of street fight things,

Time: 4450.07

like people beating people up on the street.

Time: 4452.17

And I have to say, these have a very strong gravitational pull.

Time: 4455.68

I'm not somebody that enjoys seeing violence, per se.

Time: 4458.09

But, you know, I find myself--

Time: 4459.692

I'll click on one of these, like what happened?

Time: 4461.65

And I'll see someone get hit.

Time: 4463.63

And there's a little melee on the street or something.

Time: 4466.13

And those seem to be offered to me a lot lately.

Time: 4468.34

And again, this is my fault. It reflects

Time: 4470.29

my prior searching experience.

Time: 4472.31

But I noticed that it has a bit of a gravitational pull, where

Time: 4477.63

I didn't learn anything.

Time: 4479.31

It's not teaching me any useful street self-defense

Time: 4483.24

skills of any kind.

Time: 4485.43

And at the same time, I also really enjoy

Time: 4489.06

some of the cute animal stuff.

Time: 4490.845

And so I get a lot of those also.

Time: 4492.22

So there's this polarized collage

Time: 4494.58

that's offered to me that reflects my prior search

Time: 4496.83

behavior.

Time: 4498.24

You could argue that the cute animal stuff is just

Time: 4501.75

entertainment.

Time: 4502.62

But actually, it fills me with a feeling,

Time: 4504.99

in some cases, that truly delights me.

Time: 4506.79

I delight in animals.

Time: 4507.665

And we're not just talking about kittens.

Time: 4509.373

I mean, animals I've never seen before,

Time: 4511.08

interactions between animals I've never seen

Time: 4512.7

before that truly delight me.

Time: 4513.99

They energize me in a positive way

Time: 4515.43

that when I leave Instagram, I do think I'm better off.

Time: 4519.01

So I'm grateful for the algorithm in that sense.

Time: 4521.41

But I guess, the direct question is, is the algorithm just

Time: 4526.29

reflective of what one has been looking at a lot

Time: 4529.08

prior to that moment where they log on?

Time: 4531.33

Or is it also trying to do exactly what you described,

Time: 4535.24

which is trying to give people a good-feeling experience that

Time: 4538.54

leads to more good feelings?

Time: 4540.443

MARK ZUCKERBERG: Yeah.

Time: 4541.36

I mean, I think we try to do this in a long-term way.

Time: 4544.27

I think one simple example of this

Time: 4546.55

is we had this issue a number of years back

Time: 4549.76

about clickbait news, so articles

Time: 4552.76

that would have basically a headline that grabbed

Time: 4557.913

your attention, that made you feel

Time: 4559.33

like, oh, I need to click on this.

Time: 4560.45

And then you click on it.

Time: 4561.492

And then the article is actually about something that's

Time: 4563.89

somewhat tangential to it.

Time: 4565.85

But people clicked on it.

Time: 4567.79

So the naive version of this stuff, the 10-year-old version

Time: 4571.028

was like, oh, people seem to be clicking on this.

Time: 4573.07

Maybe that's good.

Time: 4574.03

But it's actually a pretty straightforward exercise

Time: 4576.94

to instrument the system to realize that, hey, people

Time: 4579.73

click on this, and then they don't really

Time: 4583.21

spend a lot of time reading the news after clicking on it.

Time: 4587.36

And after they do this a few times,

Time: 4590.26

it doesn't really correlate with them saying that they're

Time: 4594.07

having a good experience.

Time: 4597.25

Some of how we measure this is just

Time: 4600.07

by looking at how people use the services.

Time: 4602.11

But I think it's also important to balance

Time: 4604.24

that by having real people come in and tell us,

Time: 4608.14

OK-- we show them, here are the stories that we could have

Time: 4610.6

showed you, which of these are most meaningful to you,

Time: 4614.98

or would make it so that you have the best experience,

Time: 4617.23

and just mapping the algorithm and what

Time: 4620.338

we do to that ground truth of what people say that they want.

Time: 4622.88

So I think that, through a set of things like that,

Time: 4625.96

we really have made large steps to minimize things

Time: 4629.5

like clickbait over time.

Time: 4630.7

It's not like gone from the internet.

Time: 4631.93

But I think we've done a good job of minimizing it

Time: 4634.09

on our services.
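
The instrumentation Mark describes-- clicks that aren't followed by any reading time-- can be approximated with a simple dwell-time signal. A minimal sketch, assuming per-click dwell times are logged; the threshold and penalty values here are invented for illustration, not production numbers.

```python
def clickbait_score(dwell_times, min_dwell=10.0):
    """Fraction of clicks where the reader bounced almost immediately.

    A high score suggests the headline promised more than the article
    delivered. min_dwell (seconds) is an illustrative threshold.
    """
    if not dwell_times:
        return 0.0
    bounces = sum(1 for t in dwell_times if t < min_dwell)
    return bounces / len(dwell_times)

def adjusted_rank(base_score, dwell_times, penalty=0.5):
    """Downrank items whose clicks rarely lead to sustained reading."""
    return base_score * (1.0 - penalty * clickbait_score(dwell_times))

# Five clicks, four immediate bounces: the item is heavily downranked.
print(adjusted_rank(1.0, [2.0, 3.0, 1.0, 45.0, 2.0]))  # → 0.6
```

The second step Mark mentions-- validating against what people *say* is meaningful-- would then be used to calibrate thresholds like these against survey ground truth.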

Time: 4636.13

Within that though, I do think that we

Time: 4638.332

need to be pretty careful about not

Time: 4639.79

being paternalistic about what makes different people feel

Time: 4642.76

good.

Time: 4643.69

So I mean, I don't know that everyone

Time: 4645.52

feels good about cute animals.

Time: 4647.662

I mean, I can't imagine that people

Time: 4649.12

would feel really bad about it.

Time: 4650.47

But maybe they don't have as profound of a positive reaction

Time: 4653.44

to it as you just expressed.

Time: 4656.53

And I don't know.

Time: 4658.3

Maybe people who are more into fighting

Time: 4660.01

would look at the street fighting videos--

Time: 4661.942

assuming that they're within our community standards.

Time: 4664.15

I think that there's a level of violence

Time: 4665.47

that we just don't want to be showing at all.

Time: 4667.345

But that's a separate question.

Time: 4669.79

But if they are, I mean, then it's like--

Time: 4671.86

I mean, I'm pretty into MMA.

Time: 4673.417

I don't get a lot of street fighting videos.

Time: 4675.25

But if I did, maybe I'd feel like I was learning something

Time: 4677.667

from that.

Time: 4679.78

I think at various times in the company's history,

Time: 4682.84

we've been a little bit too paternalistic about saying,

Time: 4686.98

this is good content, this is bad, you should like this,

Time: 4691.276

this is unhealthy for you.

Time: 4693.11

And I think that we want to look at the long-term effects.

Time: 4697.33

You don't want to get stuck in a short term

Time: 4699.16

loop of like, OK, just because you

Time: 4700.33

did this today doesn't mean it's what you

Time: 4702.13

aspire for yourself over time.

Time: 4703.81

But I think, as long as you look at the long-term of what

Time: 4708.28

people both say they want and what they do, giving people

Time: 4711.91

a fair amount of latitude to like the things that they like,

Time: 4715

I just think feels like the right set of values

Time: 4717.43

to bring to this.

Time: 4719.052

Now, of course, that doesn't go for everything.

Time: 4721.01

There are things that are truly off limits and things that--

Time: 4725.7

like bullying, for example, or things that are really inciting

Time: 4729.415

violence, things like that.

Time: 4730.54

I mean, we have the whole community standards

Time: 4732.13

around this.

Time: 4732.82

But I think, except for those things

Time: 4735.76

which I would hope that most people can agree, OK,

Time: 4738.4

bullying is bad--

Time: 4739.33

I hope that 100% of people agree with that.

Time: 4742.9

Or if not 100%, maybe 99%.

Time: 4746.11

Except for the things that kind of get that very--

Time: 4751.61

that feel pretty extreme and bad like that,

Time: 4753.808

I think you want to give people space

Time: 4755.35

to like what they want to like.

Time: 4757.468

ANDREW HUBERMAN: Yesterday, I had the very good experience

Time: 4760.39

of learning from the Meta team about safety protections that

Time: 4763.81

are in place for kids who are using Meta Platforms.

Time: 4768.37

And frankly, I was really positively surprised

Time: 4772.72

at the huge number of filter-based tools and just

Time: 4777.64

ability to customize the experience so that it can stand

Time: 4781.48

the best chance of enriching-- not just remaining neutral,

Time: 4784.39

but enriching their mental health status.

Time: 4788.02

One thing that came about in that conversation,

Time: 4790.54

however, was I realized there are all these tools.

Time: 4794.2

But do people really know that these tools exist?

Time: 4796.485

And I think about my own experience with Instagram.

Time: 4798.61

I love watching Adam Mosseri's Friday Q&As because he explains

Time: 4803.56

a lot of the tools that I didn't know existed.

Time: 4808.49

And if people haven't seen that, I highly

Time: 4810.37

recommend they watch that.

Time: 4812.44

I think he takes questions on Thursdays

Time: 4814.27

and answers them most every Friday.

Time: 4816.88

So if I'm not aware of the tools without watching that, which

Time: 4821.41

exists for adults, how does Meta look

Time: 4824.862

at the challenge of making sure that people know that there

Time: 4827.32

are all these tools--

Time: 4828.25

I mean, dozens and dozens of very useful tools?

Time: 4830.56

But I think most of us just know the hashtag, the tag,

Time: 4833.98

the click, stories versus feed.

Time: 4837.37

We now know that--

Time: 4839.08

I also post to Threads.

Time: 4840.16

I mean, so we know the major channels and tools.

Time: 4842.84

But this is like owning a vehicle that

Time: 4844.75

has incredible features that one doesn't

Time: 4847.45

realize can take you off road, can allow your vehicle to fly.

Time: 4850.63

I mean, there's a lot there.

Time: 4852.82

So what do you think could be done

Time: 4854.26

to get that information out?

Time: 4855.19

Maybe this conversation could cue people to [INAUDIBLE].

Time: 4857.32

MARK ZUCKERBERG: I mean, that's part of the reason why I wanted

Time: 4859.6

to talk to you about this.

Time: 4860.92

I mean, I think most of the narrative around social media

Time: 4864.16

is not, OK, all of the different tools

Time: 4866.605

that people have to control their experience.

Time: 4868.48

It's the narrative of is this just negative

Time: 4872.83

for teens or something.

Time: 4874.875

And I think, again, a lot of this

Time: 4876.25

comes down to how is the experience being tuned.

Time: 4883.15

Are people using it to connect in positive ways?

Time: 4885.62

And if so, I think it's really positive.

Time: 4888.31

So yeah, I mean, I think part of this

Time: 4890.022

is we probably just need to get out and talk to people more

Time: 4892.48

about it.

Time: 4894.07

And then there's an in-product aspect,

Time: 4896.02

which is if you're a teen and you sign up,

Time: 4899.63

we take you through a pretty extensive experience that

Time: 4904.51

tries to outline some of this.

Time: 4906.16

But that has limits, too, because when you sign up

Time: 4908.68

for a new thing, if you're bombarded with here's

Time: 4911.65

a list of features, you're like, OK, I just signed up for this.

Time: 4914.38

I don't really understand much about what the service is.

Time: 4917.26

Let me go find some people to follow

Time: 4920.14

who are my friends on here before I learn

Time: 4922.18

about controls to prevent people from harassing me or something.

Time: 4927.76

That's why I think it's really important to also show

Time: 4931.87

a bunch of these tools in context.

Time: 4933.77

So if you're looking at comments,

Time: 4936.88

and if you go to delete a comment,

Time: 4940.67

or you go to edit something, try to give people prompts in line.

Time: 4944.633

It's like, hey, did you know that you can manage things

Time: 4946.55

in these ways around that?

Time: 4949.16

Or when you're in the inbox and you're filtering something,

Time: 4952.91

remind people in line.

Time: 4954.23

So just because of the number of people

Time: 4957.11

who use the products and the level of nuance

Time: 4959.36

around each of the controls, I think the vast majority

Time: 4962.69

of that education, I think, needs to happen in the product.

Time: 4966.98

But I do think that through conversations like this

Time: 4969.47

and others that we need to be doing,

Time: 4972.9

I think we can create a broader awareness that those things

Time: 4975.98

exist so that way at least people are primed

Time: 4978.2

so that when those things pop up in the product,

Time: 4980.24

they're like, oh yeah, I knew that there was this control.

Time: 4982.657

And here's how I would use that.

Time: 4985.235

ANDREW HUBERMAN: I find the restrict function

Time: 4987.11

to be very useful, more than the block function in most cases.

Time: 4990.92

I do sometimes have to block people.

Time: 4992.42

But the restrict function is really useful

Time: 4994.17

in that you can filter specific comments.

Time: 4997.1

You might recognize that someone has a tendency

Time: 4999.62

to be a little aggressive.

Time: 5000.94

And I should point out that I actually don't really

Time: 5002.71

mind what people say to me.

Time: 5003.77

But I try and maintain what I call classroom rules

Time: 5005.853

in my comment section, where I don't like people attacking

Time: 5008.44

other people because I would never tolerate that

Time: 5010.09

in the university classroom.

Time: 5011.14

I'm not going to tolerate that in the comments section,

Time: 5012.82

for instance.

Time: 5013.63

MARK ZUCKERBERG: Yeah.

Time: 5014.547

And I think that the example that you just used about

Time: 5017.59

restrict versus block gets to something about product design

Time: 5020.59

that's important, too, which is that block is this very

Time: 5025.87

powerful tool that if someone is giving you a hard time

Time: 5028.162

and you just want them to disappear from the experience,

Time: 5030.495

you can do it.

Time: 5031.33

But the design trade-off with that is that in order to make

Time: 5036.7

it so that the person is just gone from the experience

Time: 5040.78

and that you don't show up to them,

Time: 5043.24

they don't show up to you--

Time: 5045.1

inherent to that is that they will have

Time: 5047.68

a sense that you blocked them.

Time: 5049.55

And that's why I think some stuff like restrict or just

Time: 5052.785

filtering, like I just don't want

Time: 5054.16

to see as much stuff about this topic--

Time: 5056.65

people like using different tools for very subtle reasons.

Time: 5060.44

I mean, maybe you want the content to not show up,

Time: 5063.58

but you don't want the person who's

Time: 5065.11

posting the content to know that you don't want it to show up.

Time: 5067.83

Maybe you don't want to get the messages in your main inbox,

Time: 5070.33

but you don't want to tell the person actually that you're not

Time: 5075.01

friends or something like that.

Time: 5076.72

You actually need to give people different tools that

Time: 5079.09

have different levels of power and nuance

Time: 5082.6

around how the social dynamics around using them

Time: 5085.93

play out in order to really allow

Time: 5089.17

people to tailor the experience in the ways that they want.

Time: 5091.63

ANDREW HUBERMAN: In terms of trying

Time: 5093.088

to limit total amount of time on social media,

Time: 5097.25

I couldn't find really good data on this.

Time: 5101.32

How much time is too much?

Time: 5102.82

I mean, I think it's going to depend

Time: 5104.32

on what one is looking at, the age of the user, et cetera.

Time: 5107.128

MARK ZUCKERBERG: I agree.

Time: 5108.17

ANDREW HUBERMAN: I know that you have

Time: 5109.712

tools that cue the user to how long

Time: 5112.75

they've been on a given platform.

Time: 5115.03

Are there tools to self-regulate--

Time: 5117.64

I'm thinking about the Greek myth of the sirens and people

Time: 5121.083

tying themselves to the mast and covering their eyes

Time: 5123.25

so that they're not drawn in by the sirens.

Time: 5125.042

Is there a function aside from deleting the app temporarily

Time: 5128.29

and then reinstalling it every time you want to use it again?

Time: 5131.56

Is there a true lockout, self-lockout function

Time: 5134.56

where one can lock themselves out of access to the app?

Time: 5137.215

MARK ZUCKERBERG: Well, I think we give people tools

Time: 5139.34

that let them manage this.

Time: 5141.76

And there's the tools that you get to use.

Time: 5143.51

And then there's the tools that the parents

Time: 5145.302

get to use to basically see how usage works.

Time: 5148.61

But yeah, I think that there's different--

Time: 5151.46

I think, for now, we've mostly focused

Time: 5153.17

on helping people understand this,

Time: 5155.3

and then give people reminders and things like that.

Time: 5159.38

It's tough, though, to answer the question that you

Time: 5163.58

were talking about before.

Time: 5164.99

Is there an amount of time which is too much?

Time: 5167.39

Because it does really get to what you're doing.

Time: 5169.91

If you fast forward beyond just the

Time: 5172.49

apps that we have today to an experience that

Time: 5175.67

is like a social experience in the future

Time: 5177.74

of the augmented reality glasses or something

Time: 5180.41

that we're building, a lot of this

Time: 5182.78

is going to be you're interacting with people

Time: 5187.61

in the way that you would physically

Time: 5189.29

as if you were like hanging out with friends

Time: 5191.39

or working with people.

Time: 5193.37

But now, they can show up as holograms.

Time: 5196.4

And you can feel like you're present right there with them,

Time: 5198.98

no matter where they actually are.

Time: 5200.79

And the question is, is there too much

Time: 5203.24

time to spend interacting with people like that?

Time: 5205.61

Well, at the limit, if we can get

Time: 5207.86

that experience to be as rich and giving you

Time: 5211.25

as good of a sense of presence as you would have if you were

Time: 5216.08

physically there with someone, then I

Time: 5218.63

don't see why you would want to restrict the amount that people

Time: 5221.91

use that technology to any less than what

Time: 5225.27

would be the amount of time that you'd be comfortable

Time: 5228.63

interacting with people physically,

Time: 5230.687

which obviously is not going to be 24 hours a day.

Time: 5232.77

You have to do other stuff.

Time: 5234.51

You have work.

Time: 5235.47

You need to sleep.

Time: 5236.34

But I think it really gets to how you're using these things,

Time: 5239.43

whereas if what you're primarily using the services for

Time: 5242.46

is you're getting stuck in loops reading news or something that

Time: 5247.8

is really getting you into a negative mental state,

Time: 5250.68

then I don't know.

Time: 5252.018

I mean, I think that there's probably

Time: 5253.56

a relatively short period of time

Time: 5255.03

that maybe that's a good thing that you want to be doing.

Time: 5258.54

But again, even then it's not zero

Time: 5260.34

because just because news might make you unhappy

Time: 5263.388

doesn't mean that the answer is to be

Time: 5264.93

unaware of negative things that are happening in the world.

Time: 5267.388

I just think that different people

Time: 5269.04

have different tolerances for what they can take on that.

Time: 5272.16

And I think it's generally having

Time: 5274.26

some awareness is probably good, as long as it's not more

Time: 5277.41

than you're constitutionally able to take.

Time: 5279.84

So I don't know.

Time: 5281.19

I try not to be too paternalistic about this in our approach.

Time: 5285.27

But we want to empower people by giving them

Time: 5288.03

tools-- both you and, if you're a teen, your parents--

Time: 5292.41

to understand what you're experiencing

Time: 5295.08

and how you're using these things, and then go from there.

Time: 5298.453

ANDREW HUBERMAN: Yeah.

Time: 5299.37

I think it requires of all of us some degree of self-regulation.

Time: 5302.8

I like this idea of not being too paternalistic.

Time: 5304.8

I mean, it seems like the right way to go.

Time: 5307.09

I find myself occasionally having

Time: 5309.24

to make sure that I'm not just passively scrolling,

Time: 5311.94

that I'm learning.

Time: 5312.78

I like foraging for, organizing, and dispersing information.

Time: 5316.56

That's been my life's career.

Time: 5318.45

So I've learned so much from social media.

Time: 5320.71

I find great papers, great ideas.

Time: 5323.65

I think comments are a great source of feedback.

Time: 5325.65

And I'm not just saying that because you're sitting here.

Time: 5328.025

I mean, Instagram in particular, but other Meta platforms

Time: 5330.72

have been tremendously helpful for me to get science

Time: 5333.96

and health information out.

Time: 5335.46

One of the things that I'm really excited about,

Time: 5338.32

which I only had the chance to try for the first time today,

Time: 5341.22

is your new VR platform, the newest Oculus.

Time: 5345.18

And then we can talk about the glasses, the Ray-Bans.

Time: 5347.88

MARK ZUCKERBERG: Sure.

Time: 5349.177

ANDREW HUBERMAN: Those two experiences

Time: 5350.76

are still kind of blowing my mind, especially

Time: 5352.95

the Ray-Ban glasses.

Time: 5356.293

And I have so many questions about this.

Time: 5357.96

So I'll resist.

Time: 5358.62

But--

Time: 5359.317

MARK ZUCKERBERG: We can get into that.

Time: 5360.9

ANDREW HUBERMAN: OK.

Time: 5361.17

Well, yeah, I have some experience with VR.

Time: 5363.03

My lab has used VR.

Time: 5365.07

Jeremy Bailenson's Lab at Stanford

Time: 5367.56

is one of the pioneering labs of VR and mixed reality.

Time: 5370.638

I guess they used to call it augmented reality, but now

Time: 5372.93

mixed reality.

Time: 5373.93

I think what's so striking about the VR

Time: 5376.08

that you guys had me try today is how well it interfaces

Time: 5380.58

with the real room, let's call it, the physical room.

Time: 5383.247

MARK ZUCKERBERG: Physical.

Time: 5384.33

ANDREW HUBERMAN: I could still see people.

Time: 5385.53

I could see where the furniture was.

Time: 5386.73

So I wasn't going to bump into anything.

Time: 5388.35

I could see people's smiles.

Time: 5389.517

I could see my water on the table

Time: 5392.49

while I was doing what felt like a real martial arts

Time: 5397.23

experience, except I wasn't getting hit.

Time: 5399.57

Well, I was getting hit virtually.

Time: 5401.61

But it's extremely engaging.

Time: 5404.43

And yet, on the good side of things,

Time: 5407.04

it really bypasses a lot of the early concerns

Time: 5410.07

that Bailenson Lab--

Time: 5411.3

again, Jeremy's Lab-- was early to say that, oh, there's

Time: 5414.12

a limit to how much VR one can or should use each day,

Time: 5418.3

even for the adult brain because it can really

Time: 5421.29

disrupt your vestibular system, your sense of balance.

Time: 5424.17

All of that seems to have been dealt

Time: 5425.67

with in this new iteration of VR.

Time: 5428.265

I didn't come out of it feeling dizzy at all.

Time: 5430.14

I didn't feel like I was reentering the room in a way

Time: 5432.348

that was really jarring.

Time: 5433.71

Going into it, it's obviously, whoa,

Time: 5435.21

this is a different world.

Time: 5436.293

But you can look to your left and say, oh, someone just

Time: 5439.842

came in the door.

Time: 5440.55

Hey, how's it going?

Time: 5441.12

Hold on, I'm playing this game, just

Time: 5442.26

as it was when I was a kid playing Nintendo

Time: 5444.42

and someone would walk in.

Time: 5445.23

It's fully engrossing.

Time: 5446.1

But you'd be like, hold on.

Time: 5446.88

And you see they're there.

Time: 5447.963

So first of all, bravo, incredible.

Time: 5453.3

And then the next question is, what do we even

Time: 5457.05

call this experience?

Time: 5458.1

Because it is truly really mixed.

Time: 5459.97

It's a truly mixed reality experience.

Time: 5461.663

MARK ZUCKERBERG: Yeah.

Time: 5462.58

I mean, mixed reality is the umbrella term

Time: 5465.31

that refers to the combined experience

Time: 5468.28

of virtual and augmented reality.

Time: 5470.3

So augmented reality is what you're eventually

Time: 5473.32

going to get with some future version of the smart glasses,

Time: 5476.44

where you're primarily seeing the world,

Time: 5480.43

but you can put holograms in it.

Time: 5482.23

So we'll have a future where you're

Time: 5485.887

going to walk into a room.

Time: 5486.97

And there are going to be as many holograms

Time: 5489.19

as physical objects.

Time: 5490.99

If you just think about all the paper, the art, physical games,

Time: 5494.71

media, your workstation--

Time: 5496.3

ANDREW HUBERMAN: If we refer to, let's

Time: 5497.23

say, an MMA fight, we could just draw it up on the table right

Time: 5499.78

here and just see it repeat as opposed to us turning

Time: 5501.88

and looking at a screen.

Time: 5502.73

MARK ZUCKERBERG: Yeah.

Time: 5502.93

I mean, pretty much any screen that exists

Time: 5504.68

could be a hologram in the future with smart glasses.

Time: 5507.67

There's nothing that actually physically needs

Time: 5509.83

to be there for that when you have glasses

Time: 5511.72

that can put a hologram there.

Time: 5514.5

And it's an interesting thought experiment

Time: 5516.25

to just go around and think about, OK, what of the things

Time: 5518.625

that are physical in the world need to actually be physical.

Time: 5521.692

Your chair does, right?

Time: 5522.65

Because you're sitting on it.

Time: 5523.46

A hologram isn't going to support you.

Time: 5525.09

But like that art on the wall, I mean,

Time: 5526.97

that doesn't need to physically be there.

Time: 5530.84

So I think that that's the augmented reality experience

Time: 5535.52

that we're moving towards.

Time: 5536.76

And then we've had these headsets that historically we

Time: 5540.59

think about as VR.

Time: 5542.12

And that has been something that is like a fully

Time: 5546.02

immersive experience.

Time: 5547.53

But now, we're getting something that's

Time: 5550.1

a hybrid in between the two and capable

Time: 5551.84

of both, which is a headset that can do both virtual reality

Time: 5554.99

and some of these augmented reality experiences.

Time: 5558.18

And I think that that's really powerful,

Time: 5560.9

both because you're going to get new applications that allow

Time: 5564.92

people to collaborate together.

Time: 5566.24

And maybe the two of us are here physically,

Time: 5568.73

but someone joins us and it's their avatar there.

Time: 5571.7

Or maybe it's some version in the future.

Time: 5574.46

You're having a team meeting.

Time: 5575.98

And you have some people there physically.

Time: 5577.73

And you have some people dialing in.

Time: 5579.23

And they're basically like a hologram, there virtually.

Time: 5581.522

But then you also have some AI personas

Time: 5583.617

that are on your team that are helping

Time: 5585.2

you do different things.

Time: 5585.95

And they can be embodied as avatars and around the table

Time: 5588.283

meeting with you.

Time: 5589.04

ANDREW HUBERMAN: Are people going

Time: 5589.61

to be doing first dates that are physically separated?

Time: 5592.74

I could imagine that some people would--

Time: 5594.6

do an is-it-even-worth-leaving-the-house type date?

Time: 5596.672

And then they find out.

Time: 5597.63

And then they meet for the first time.

Time: 5599.37

MARK ZUCKERBERG: I mean, maybe.

Time: 5600.662

I think dating has physical aspects to it, too.

Time: 5605.882

ANDREW HUBERMAN: Right.

Time: 5606.84

Some people might not be-- they want

Time: 5608.46

to know whether or not it's worth

Time: 5609.835

the effort to head out or not.

Time: 5612.06

They want to bridge the divide, right?

Time: 5613.827

MARK ZUCKERBERG: It is possible.

Time: 5615.16

I mean, I know some of my friends

Time: 5617.46

who are dating basically say that in order

Time: 5622.23

to make sure that they have a safe experience, if they're

Time: 5625.68

going on a first date, they'll schedule

Time: 5627.54

something that's shorter and maybe in the middle of the day.

Time: 5630.565

So maybe it's coffee.

Time: 5631.44

So that way, if they don't like the person,

Time: 5633.232

they can just get out before going and scheduling

Time: 5635.34

a dinner or a real, full date.

Time: 5636.983

So I don't know.

Time: 5637.65

Maybe in the future, people will have

Time: 5639.48

that experience where you can feel like you're

Time: 5641.79

kind of sitting there.

Time: 5642.84

And it's even easier, and lighter weight, and safer.

Time: 5645.998

And if you're not having a good experience,

Time: 5647.79

you can just teleport out of there and be gone.

Time: 5651.45

But yeah, I think that this will be an interesting question

Time: 5654.69

in the future.

Time: 5657.47

There are clearly a lot of things that are only possible

Time: 5660.05

physically that--

Time: 5661.7

or are so much better physically.

Time: 5663.722

And then there are all these things

Time: 5665.18

that we're building up that can be digital experiences.

Time: 5667.76

But it's this weird artifact of how

Time: 5671.42

this stuff has been developed that the digital world

Time: 5674

and the physical world exist in these completely

Time: 5676.25

different planes.

Time: 5677.017

When you want to interact with the digital world--

Time: 5679.1

we do it all the time.

Time: 5679.82

But we pull out a small screen.

Time: 5681.23

Or we have a big screen.

Time: 5682.55

And just basically, we're interacting

Time: 5683.6

with the digital world through these screens.

Time: 5685.475

But I think if we fast forward a decade

Time: 5689.27

or more, I think one of the really interesting questions

Time: 5694.22

about what is the world that we're

Time: 5696.133

going to live in, I think it's going to increasingly

Time: 5698.3

be this mesh of the physical and digital worlds

Time: 5700.61

that will allow us to feel, A, that the world that we're in

Time: 5705.02

is just a lot richer because there can be all

Time: 5706.97

these things that people create that are just so much easier

Time: 5709.47

to do digitally than physically.

Time: 5711.77

But B, you're going to have a real physical sense of presence

Time: 5718.47

with these things and not feel like interacting

Time: 5720.45

in the digital world is taking you away

Time: 5722.37

from the physical world, which today is just

Time: 5724.86

so much viscerally richer and more powerful.

Time: 5727.62

I think the digital world will be embedded in that

Time: 5732.27

and will feel just as vivid in a lot of ways.

Time: 5735.25

So that's why I always think-- when

Time: 5736.77

you were saying before, you felt like you could look

Time: 5738.937

around and see the real room.

Time: 5740.817

I actually think there's an interesting kind

Time: 5742.65

of philosophical distinction between the real room

Time: 5744.93

and the physical room, which historically I

Time: 5748.02

think people would have said those are the same thing.

Time: 5750.3

But I actually think, in the future,

Time: 5751.92

the real room is going to be the combination

Time: 5754.65

of the physical world with all the digital artifacts

Time: 5756.817

and objects that are in there that you can interact with them

Time: 5759.358

and feel present, whereas the physical world is just the part

Time: 5762

that's physically there.

Time: 5763.2

And I think it's possible to build a real world that's

Time: 5765.51

the sum of these two that will actually

Time: 5767.135

be a more profound experience than what we have today.

Time: 5769.602

ANDREW HUBERMAN: Well, I was struck

Time: 5771.06

by the smoothness of the interface between the VR

Time: 5773.73

and the physical room.

Time: 5775.2

Your team had me try a--

Time: 5777.555

I guess it was an exercise class in the [INAUDIBLE]..

Time: 5780.71

But it was essentially like hitting mitts boxing,

Time: 5783.567

so hitting targets boxing.

Time: 5784.65

MARK ZUCKERBERG: Yeah, Supernatural.

Time: 5785.52

ANDREW HUBERMAN: Yeah, and it comes at a fairly fast pace

Time: 5787.895

that then picks up.

Time: 5788.7

It's got some tutorial.

Time: 5789.78

It's very easy to use.

Time: 5790.92

And it certainly got my heart rate up.

Time: 5792.63

And I'm in at least decent shape.

Time: 5794.64

And I have to be honest, I've never

Time: 5797.28

once desired to do any of these on-screen fitness things.

Time: 5800.55

I mean, I can't think of anything more aversive than a--

Time: 5804.89

I don't want to insult any particular products,

Time: 5807.51

but riding a stationary bike while looking

Time: 5809.82

at a screen pretending I'm on a road outside.

Time: 5811.98

I can't think of anything worse for me.

Time: 5814.838

MARK ZUCKERBERG: I do like the leaderboard.

Time: 5816.63

Maybe I'm just a very competitive person.

Time: 5818.4

If you're going to be running on a treadmill,

Time: 5820.44

at least give me a leaderboard so I can beat

Time: 5822.39

the people who are ahead of me.

Time: 5823.5

ANDREW HUBERMAN: I like moving outside and certainly

Time: 5825.72

an exercise class or aerobics class,

Time: 5827.467

as they used to call them.

Time: 5828.55

But the experience I tried today was extremely engaging.

Time: 5832.98

And I've done enough boxing to at least know

Time: 5835.41

how to do a little bit of it.

Time: 5837.33

And I really enjoyed it.

Time: 5838.38

It gets your heart rate up.

Time: 5839.13

And I completely forgot that I was

Time: 5840.75

doing an on-screen experience in part because, I believe,

Time: 5845.37

I was still in that physical room.

Time: 5848.2

And I think there's something about the mesh

Time: 5851.64

of the physical room and the virtual experience that

Time: 5856.53

makes it neither one world nor the other.

Time: 5858.85

I mean, I really felt at the interface of those.

Time: 5860.85

And I certainly got presence, this feeling

Time: 5862.92

of forgetting that I was in a virtual experience

Time: 5865.05

and got my heart rate up pretty quickly.

Time: 5866.717

We had to stop because we were going to start recording.

Time: 5869.05

But I would do that for a good 45 minutes in the morning.

Time: 5871.65

And there's no amount of money you could pay me truly

Time: 5874.77

to look at a screen while pedaling on a bike

Time: 5877.38

or running on a treadmill.

Time: 5878.58

So again, bravo, I think it's going to be very useful.

Time: 5881.537

It's going to get people moving their bodies more,

Time: 5883.62

which certainly--

Time: 5885.06

social media, up until now, and a lot of technologies

Time: 5889.02

have been accused of limiting the amount of physical activity

Time: 5893.16

that both children and adults are engaged in.

Time: 5895.98

And we know we need physical activity.

Time: 5897.66

You're a big proponent of and practitioner

Time: 5899.855

of physical activity.

Time: 5900.73

So is this a major goal of Meta, to get people

Time: 5903.51

moving their bodies more and getting their heart

Time: 5906.27

rates up and so on?

Time: 5908.313

MARK ZUCKERBERG: I think we want to enable it.

Time: 5910.23

And I think it's good.

Time: 5911.85

But I think it comes more from a philosophical view of the world

Time: 5918.68

than it is necessarily--

Time: 5921.29

I mean, I don't go into building products

Time: 5923.3

to try to shape people's behavior.

Time: 5925.74

I believe in empowering people to do what they want

Time: 5929.9

and be the best version of themselves that they can be.

Time: 5933.57

ANDREW HUBERMAN: So no agenda?

Time: 5934.82

MARK ZUCKERBERG: That said, I do believe that there's

Time: 5938.36

the previous generation of computers

Time: 5940.28

were devices for your mind.

Time: 5942.17

And I think that we are not brains in tanks.

Time: 5947.42

I think that there's a philosophical view of people

Time: 5950.09

of like, OK, you are primarily what you think about

Time: 5954.5

or your values or something.

Time: 5955.71

It's like, no, you are that and you

Time: 5957.5

are a physical manifestation.

Time: 5959.31

And people were very physical.

Time: 5962.06

And I think building a computer for your whole body and not

Time: 5968.84

just for your mind is very fitting with this worldview

Time: 5973.52

that the actual essence of you, if you want

Time: 5976.243

to be present with another person,

Time: 5977.66

if you want to be fully engaged in an experience, is not just--

Time: 5982.22

it's not just a video conference call that looks at your face

Time: 5985.55

and where you can share ideas.

Time: 5987.8

It's something that you can engage your whole body.

Time: 5991.13

So, yeah I mean, I think being physical

Time: 5993.71

is very important to me.

Time: 5995.11

I mean, that's a lot of the most fun stuff that I get to do.

Time: 6001.84

It's a really important part of how I personally

Time: 6004.27

balance my energy levels and just get

Time: 6007.84

a diversity of experiences because I could spend all

Time: 6010.57

my time running the company.

Time: 6013.07

But I think it's good for people to do some different things

Time: 6016.69

and compete in different areas or learn different things.

Time: 6019.33

And all of that is good.

Time: 6022.96

If people want to do really intense workouts with the work

Time: 6028.66

that we're doing with Quest or with eventual AR glasses,

Time: 6032.98

great.

Time: 6033.8

But even if you don't want to do a really intense workout,

Time: 6037.78

I think just having a computing environment and platform which

Time: 6040.57

is inherently physical captures more of the essence of what

Time: 6044.05

we are as people than any of the previous computing platforms

Time: 6047.65

that we've had to date.

Time: 6049.1

ANDREW HUBERMAN: I was even thinking just

Time: 6050.808

of the simple task of getting better range of motion a.k.a.

Time: 6054.85

flexibility.

Time: 6055.9

I could imagine, inside of the VR experience,

Time: 6058.36

leaning into a stretch, standard type of lunge-type stretch,

Time: 6061.93

but actually seeing a meter of whether you are

Time: 6064.45

approaching new levels of flexibility

Time: 6066.1

in that moment where it's actually

Time: 6067.517

measuring some kinesthetic elements

Time: 6070.21

on the body in the joints, whereas normally, you

Time: 6074.04

might have to do that in front of a camera, which then would

Time: 6076.54

give you the data on a screen that you'd look at afterwards

Time: 6078.998

or hire an expensive coach to look at form in resistance

Time: 6082.96

training.

Time: 6083.69

So you're actually lifting physical weights.

Time: 6085.523

But it's telling you whether or not you're breaking form.

Time: 6087.898

I mean, there's just so much that could

Time: 6089.54

be done inside of there.

Time: 6090.42

And then my mind just starts to spiral

Time: 6092.003

into, wow, this is very likely to transform

Time: 6095.12

what we think of as, quote unquote, "exercise."

Time: 6097.403

MARK ZUCKERBERG: Yeah, I think so.

Time: 6098.82

I think there's still a bunch of questions

Time: 6100.57

that need to get answered.

Time: 6103.16

I don't think most people are going to necessarily want

Time: 6106.19

to install a lot of sensors or cameras

Time: 6109.64

to track their whole body.

Time: 6110.76

So we're just over time getting better

Time: 6112.79

with the sensors that are on the headset at being able to do

Time: 6116.33

very good hand tracking.

Time: 6118.22

So we have this research demo where

Time: 6120.05

you now, just with the hand tracking from the headset,

Time: 6123.02

you can type.

Time: 6123.68

It just projects a little keyboard onto your table.

Time: 6125.852

And you can type.

Time: 6126.56

And people type like 100 words a minute with that.

Time: 6129.23

ANDREW HUBERMAN: With a virtual keyboard?

Time: 6130.28

MARK ZUCKERBERG: Yeah.

Time: 6131.197

We're starting to be able to--

Time: 6133.46

using some modern AI techniques, be able to simulate

Time: 6138.44

and understand where your torso's position is.

Time: 6141.38

Even though you can't always see it,

Time: 6143.18

you can see it a bunch of the time.

Time: 6144.835

And if you fuse together what you

Time: 6146.21

do see with the accelerometer and understanding

Time: 6150.02

how the thing is moving, you can kind of

Time: 6151.94

understand what the body position is going to be.

Time: 6155.54

But some things are still going to be hard.

Time: 6158.34

So you mentioned boxing.

Time: 6161.42

That one works pretty well because we understand your head

Time: 6164.6

position.

Time: 6165.15

We understand your hands.

Time: 6167.03

And now, we're increasingly understanding your body

Time: 6170.27

position.

Time: 6171.548

But let's say you want to expand that

Time: 6173.09

to Muay Thai or kickboxing.

Time: 6176.25

OK.

Time: 6176.75

So legs, that's a different part of tracking.

Time: 6178.76

That's harder because that's out of the field of view

Time: 6181.85

more of the time.

Time: 6183.18

But there's also the element of resistance.

Time: 6185.15

So you can throw a punch, and retract it,

Time: 6187.07

and shadow box and do that without upsetting

Time: 6191.42

your physical balance that much.

Time: 6193.32

But if you want to throw a roundhouse kick

Time: 6195.44

and there's no one there, then, I

Time: 6197.18

mean, the standard way that you do it when you're shadowboxing

Time: 6200.27

is you basically do a little 360.

Time: 6202.19

But I don't know.

Time: 6203.543

Is that going to feel great?

Time: 6204.71

I mean, I think there's a question about what

Time: 6207.23

that experience should be.

Time: 6209.103

And then if you want to go even further,

Time: 6210.77

if you want to get grappling to work,

Time: 6214.74

I'm not even sure how you would do

Time: 6216.447

that without having resistance of understanding what the force

Time: 6219.03

is applied to you would be.

Time: 6220.165

And then you get into, OK, maybe you're

Time: 6221.79

going to have some kind of bodysuit that

Time: 6224.07

can apply haptics.

Time: 6225.87

But I'm not even sure that even a pretty advanced haptic system

Time: 6229.35

is going to be able to be quite good enough to simulate

Time: 6232.26

the actual forces that would be applied to you in a grappling

Time: 6235.86

scenario.

Time: 6236.41

So this is part of what's fun about technology,

Time: 6238.65

though, is you keep on getting new capabilities.

Time: 6241.432

And then you need to figure out what things you

Time: 6243.39

can do with them.

Time: 6244.12

So I think it's really neat that we can do boxing.

Time: 6246.6

And we can do the Supernatural thing.

Time: 6248.19

And there's a bunch of awesome cardio,

Time: 6250.08

and dancing and things like that.

Time: 6251.893

And then there's also still so much more

Time: 6253.56

to do that I'm excited to get to over time.

Time: 6257.67

But it's a long journey.

Time: 6259.38

ANDREW HUBERMAN: And what about things like painting,

Time: 6262.23

and art and music?

Time: 6263.82

I imagine-- of course, different mediums--

Time: 6268.44

I like to draw with pen and pencil.

Time: 6269.972

But I could imagine trying to learn how to paint virtually.

Time: 6272.43

And of course, you could print out a physical version

Time: 6275.118

of that at the end.

Time: 6275.91

This doesn't have to depart from the physical world.

Time: 6278.08

It could end in the physical world.

Time: 6279.31

MARK ZUCKERBERG: Did you see the demo,

Time: 6280.893

the piano demo where you--

Time: 6283.38

either you're there with a physical keyboard

Time: 6286.04

or it could be a virtual keyboard.

Time: 6287.57

But the app basically highlights what keys

Time: 6291.96

you need to press in order to play the song.

Time: 6295.26

So it's basically like you're looking at your piano.

Time: 6298.14

And it's teaching you how to play a song that you choose.

Time: 6301.955

ANDREW HUBERMAN: An actual piano?

Time: 6303.33

MARK ZUCKERBERG: Yeah.

Time: 6303.72

ANDREW HUBERMAN: But it's illuminating certain keys

Time: 6305.845

in the virtual space.

Time: 6306.87

MARK ZUCKERBERG: Yeah.

Time: 6307.787

And it could either be a virtual piano or a keyboard

Time: 6310.08

if you don't have a piano or keyboard.

Time: 6311.85

Or it could use your actual keyboard.

Time: 6315.15

So yeah, I think stuff like that is

Time: 6318.84

going to be really fascinating for education and expression.

Time: 6323.7

ANDREW HUBERMAN: And excuse me, but for broadening access

Time: 6326.49

to expensive equipment.

Time: 6328.11

I mean, a piano is no small expense.

Time: 6330.578

MARK ZUCKERBERG: Exactly.

Time: 6331.62

ANDREW HUBERMAN: And it takes up a lot of space

Time: 6333.578

and needs to be tuned.

Time: 6334.495

You can think of all these things, the kid that

Time: 6336.453

has very little income or their family

Time: 6338.04

has very little income could learn

Time: 6339.457

to play a virtual piano at a much lower cost.

Time: 6341.383

MARK ZUCKERBERG: Yeah.

Time: 6342.3

And it gets back to the question I

Time: 6343.29

was asking before about this thought experiment of how

Time: 6345.87

many of the things that we physically have

Time: 6348.18

today actually need to be physical.

Time: 6351.05

The piano doesn't.

Time: 6352.75

Maybe there's some premium where--

Time: 6356.44

maybe it's a somewhat better, more tactile experience

Time: 6360.91

to have a physical one.

Time: 6362.32

But for people who don't have the space for it,

Time: 6365.23

or who can't afford to buy a piano,

Time: 6367.133

or just aren't sure that they would want

Time: 6368.8

to make that investment at the beginning of learning how

Time: 6370.48

to play piano, I think, in the future,

Time: 6372.68

you'll have the option of just buying an app

Time: 6375.13

or a hologram piano which will be a lot more affordable.

Time: 6380.11

And I think that's going to unlock a ton of creativity too

Time: 6384.82

because instead of the market for piano makers

Time: 6389.35

being constrained to like a relatively small set of experts

Time: 6393.22

who have perfected that craft, you're

Time: 6395.8

going to have kids or developers all around the world designing

Time: 6400.99

crazy designs for potential keyboards and pianos

Time: 6404.143

that look nothing like what we've seen before,

Time: 6406.06

but maybe bring even more joy or even more

Time: 6409.69

fun into the world where you have fewer

Time: 6411.52

of these physical constraints.

Time: 6412.82

So I think there's going to be a lot of wild stuff to explore.

Time: 6416.23

ANDREW HUBERMAN: There's definitely

Time: 6416.77

going to be a lot of wild stuff to explore.

Time: 6418.562

I just had this idea/image in my mind

Time: 6423.64

of what you were talking about merged with our earlier

Time: 6425.98

conversation when Priscilla was here.

Time: 6427.99

I could imagine a time not too long from now

Time: 6430.15

where you're using mixed reality to run experiments in the lab,

Time: 6433.42

literally mixing virtual solutions,

Time: 6435.64

getting potential outcomes, and then picking the best

Time: 6438.075

one to then go actually do in the real world, which

Time: 6440.2

is very costly, both financially and time-wise.

Time: 6445.043

MARK ZUCKERBERG: Yeah.

Time: 6445.96

I mean, people are already using VR for surgery and education

Time: 6452.41

on it.

Time: 6454.03

And there's some study that was done that basically tried

Time: 6458.578

to do a controlled experiment of people who learned how

Time: 6460.87

to do a specific surgery through just the normal textbook

Time: 6465.7

and lecture method versus you show the knee

Time: 6470.02

and you have it be a large, blown-up model.

Time: 6473.47

And people can manipulate it and practice

Time: 6475.72

where they would make the cuts.

Time: 6478.09

And like the people in that class did better.

Time: 6483.58

Yeah, I think that it's going to be profound

Time: 6485.89

for a lot of different areas.

Time: 6487.103

ANDREW HUBERMAN: And the last example that leaps to mind--

Time: 6489.52

I think social media and online culture

Time: 6492.1

has been accused of creating a lot of real world--

Time: 6494.98

let's call it physical world social anxiety for people.

Time: 6497.45

But I could imagine practicing a social interaction.

Time: 6500.8

Or a kid that has a lot of social anxiety

Time: 6502.922

or that needs to advocate for themselves better

Time: 6504.88

learning how to do that progressively

Time: 6507.043

through a virtual interaction, and then taking

Time: 6508.96

that to the real world because, in my very recent experience

Time: 6512.35

today, it's so blended now with real experience

Time: 6515.8

that the kid that feels terrified

Time: 6517.27

of advocating for themselves, or just talking

Time: 6519.55

to another human being, or an adult,

Time: 6521.05

or being in a new circumstance of a room full of kids, you

Time: 6523.467

could really experience that in silico

Time: 6525.91

first and get comfortable, let the nervous system

Time: 6528.55

attenuate a bit, and then take it into the, quote unquote,

Time: 6531.48

"physical world."

Time: 6532.55

MARK ZUCKERBERG: Yeah, I think we'll

Time: 6534.05

see experiences like that.

Time: 6535.57

I mean, I also think that some of the social dynamics

Time: 6538.07

around how people interact in this kind

Time: 6541.58

of blended digital world will be more nuanced in other ways.

Time: 6545.31

So I'm sure that there will be new anxieties that people

Time: 6549.32

develop too, just like teens today need to navigate dynamics

Time: 6554.48

around texting constantly that we just

Time: 6558.92

didn't have when we were kids.

Time: 6561.2

So I think it will help with some things.

Time: 6562.908

I think that there will be new issues that hopefully we can

Time: 6565.367

help people work through too.

Time: 6566.9

But overall, yeah, I think it's going to be

Time: 6568.97

really powerful and positive.

Time: 6570.593

ANDREW HUBERMAN: Let's talk about the glasses.

Time: 6572.51

MARK ZUCKERBERG: Sure.

Time: 6573.08

ANDREW HUBERMAN: This was wild.

Time: 6574.67

Put on a Ray-Bans--

Time: 6576.38

I like the way they look.

Time: 6578.34

They're clear.

Time: 6579.12

They look like any other Ray-Ban glasses,

Time: 6582.84

except that I could call out to the glasses.

Time: 6586.56

I could just say, hey Meta, I want

Time: 6589.14

to listen to the Bach variations--

Time: 6591.33

the Goldberg Variations of Bach.

Time: 6593.46

And Meta responded.

Time: 6596.67

And no one around me could hear.

Time: 6598.59

But I could hear with exquisite clarity.

Time: 6601.837

And by the way, I'm not getting paid to say any of this.

Time: 6604.17

I'm just still blown away by this.

Time: 6606.06

Folks, I want these very badly.

Time: 6609.61

I could hear, OK, I'm selecting those now--

Time: 6612.09

or that music now.

Time: 6612.998

And then I could hear it in the background.

Time: 6614.79

But then I could still have a conversation.

Time: 6616.582

So this was neither headphones in nor headphones out.

Time: 6620.16

And I could say, wait, pause the music.

Time: 6621.87

And it would pause.

Time: 6623.52

And the best part was I didn't have to, quote unquote,

Time: 6625.77

"leave the room" mentally.

Time: 6627.33

I didn't even have to take out a phone.

Time: 6629.13

It was all interfaced through this very local environment

Time: 6632.19

in and around the head.

Time: 6633.322

And as a neuroscientist, I'm fascinated by this

Time: 6635.28

because, of course, all of our perceptions-- auditory,

Time: 6637.2

visual et cetera--

Time: 6637.98

are occurring inside the casing of this thing we call a skull.

Time: 6641.88

But maybe you could comment on the origin

Time: 6646

of that design for you, the ideas behind that,

Time: 6648.32

and where you think it could go because I'm sure

Time: 6650.32

I'm just scratching the surface.

Time: 6652.715

MARK ZUCKERBERG: The real product

Time: 6654.09

that we want to eventually get to is

Time: 6656.55

this full augmented reality product

Time: 6659.94

in a stylish and comfortable normal glasses form factor.

Time: 6664.5

ANDREW HUBERMAN: Not a dorky VR headset, so to speak?

Time: 6666.93

MARK ZUCKERBERG: No, I mean--

Time: 6667.53

ANDREW HUBERMAN: Because the VR headset does

Time: 6668.34

feel kind of big on the face.

Time: 6670.26

MARK ZUCKERBERG: There's going to be a place for that,

Time: 6671.61

too, just like you have your laptop

Time: 6673.2

and you have your workstation.

Time: 6675.167

Or maybe the better analogy is you have your phone

Time: 6677.25

and you have your workstation.

Time: 6679.35

These AR glasses are going to be like your phone in that you

Time: 6682.773

have something on your face.

Time: 6683.94

And you will, I think, be able to, if you want,

Time: 6687.33

wear it for a lot of the day and interact

Time: 6689.55

with it very frequently.

Time: 6692.502

I don't think that people are going

Time: 6693.96

to be walking around the world wearing VR headsets.

Time: 6696.813

ANDREW HUBERMAN: Let's hope.

Time: 6697.98

MARK ZUCKERBERG: But yeah, that's certainly not the future

Time: 6700.397

that I'm hoping we get to.

Time: 6702.75

But I do think that there is a place for having--

Time: 6707.1

because it's a bigger form factor,

Time: 6708.57

it has more compute power.

Time: 6710.05

So just like your workstation or your bigger computer

Time: 6713.89

can do more than your phone can do,

Time: 6715.967

there's a place for that when you want

Time: 6717.55

to settle into an intense task.

Time: 6719.508

If you have a doctor who's doing a surgery,

Time: 6721.3

I would want them doing it through the headset

Time: 6723.217

not through the phone equivalent or the lower powered glasses.

Time: 6727.39

But just like phones are powerful enough

Time: 6729.653

to do a lot of things, I think the glasses will eventually

Time: 6732.07

get there, too.

Time: 6734.42

Now, that said, there's a bunch of really hard technology

Time: 6737.81

problems to address in order to be able to get to this point

Time: 6741.98

where you can put full holograms in the world.

Time: 6746.24

You're basically miniaturizing a supercomputer

Time: 6749.21

and putting it into a pair of glasses so that the glasses still

Time: 6755.09

look stylish and normal.

Time: 6756.65

And that's a really hard technology problem.

Time: 6760.46

Making things small is really hard.

Time: 6763.31

A holographic display is different from what

Time: 6768.17

our industry has optimized for over the past 30 or 40 years,

Time: 6772.1

building screens.

Time: 6773.54

There's a whole industrial process

Time: 6775.94

around that which goes into phones, and TVs, and computers,

Time: 6780.11

and increasingly so many things that have different screens.

Time: 6783.48

There's a whole pipeline that's gotten very good

Time: 6785.48

at making that kind of screen.

Time: 6786.92

And the holographic displays are just

Time: 6790.56

a completely different thing because it's not a screen.

Time: 6794.01

It's a thing that you can shoot light

Time: 6796.05

into through a laser or some other kind of projector.

Time: 6798.96

And it can place that as an object in the world.

Time: 6801.55

So that's going to need to be this whole other industrial

Time: 6804.72

process that gets built up to doing that in an efficient way.

Time: 6808.54

So all that said, we're basically

Time: 6812.46

taking two different approaches towards building this at once.

Time: 6815.95

One is we are trying to keep in mind what is the long-term

Time: 6821.32

thing that--

Time: 6822.19

it's not super far off.

Time: 6823.72

Within a few years, I think we'll

Time: 6826.24

have something that's a first version of this full vision

Time: 6829.822

that I'm talking about.

Time: 6830.78

I mean, we have something that's working internally

Time: 6832.905

that we use as a dev kit.

Time: 6835.75

But that one, that's a big challenge.

Time: 6841.13

It's going to be more expensive.

Time: 6843.98

And it's harder to get all the pieces working.

Time: 6847.19

The other approach has been, all right, let's

Time: 6849.17

start with what we know we can put

Time: 6851.18

into a pair of stylish sunglasses

Time: 6854.06

today and just make them as smart as we can.

Time: 6858.67

So for the first version, we worked with--

Time: 6862.6

we did this collaboration with Ray-Ban

Time: 6864.64

because that's well-accepted.

Time: 6867.13

These are well-designed glasses.

Time: 6869.11

They're classic.

Time: 6869.8

People have used them for decades.

Time: 6872.02

For the first version, we got a sensor on the front,

Time: 6874.605

so you could capture moments without having to take

Time: 6876.73

your phone out of your pocket.

Time: 6877.98

So you got photos and videos.

Time: 6880.57

You had the speaker and the microphone,

Time: 6882.195

so you can listen to music.

Time: 6884.86

You could communicate with it.

Time: 6887.05

But that was the first version of it.

Time: 6891.015

We had a lot of the basics there.

Time: 6892.39

But we saw how people used it.

Time: 6894.07

And we tuned it.

Time: 6896.23

We made the camera twice as good for this new version

Time: 6900.19

that we made.

Time: 6900.85

The audio is a lot crisper for the use cases

Time: 6902.86

that we saw that people actually used, which is-- some of it

Time: 6905.17

is listening to music.

Time: 6906.087

But a lot of it is people want to take calls on their glasses.

Time: 6909.25

They want to listen to podcasts.

Time: 6912.19

But the biggest thing that I think is interesting

Time: 6914.71

is the ability to get AI running on it, which doesn't just

Time: 6919.58

run on the glasses.

Time: 6920.66

It also kind of proxies through your phone.

Time: 6923.66

But I mean, with all the advances in LLMs--

Time: 6928.607

we talked about this a bit in the first part

Time: 6930.44

of the conversation.

Time: 6933.23

Having the ability to have your Meta AI assistant

Time: 6935.715

that you can just talk to and basically

Time: 6937.34

ask any question throughout the day is--

Time: 6940.43

I think it'd be really fascinating.

Time: 6942.05

And like you were saying about how

Time: 6945.98

we process the world as people, eventually, I

Time: 6949.813

think you're going to want your AI

Time: 6951.23

assistant to be able to see what you see and hear what you hear.

Time: 6955.012

Maybe not all the time.

Time: 6955.97

But you're going to want to be able to tell

Time: 6957.762

it to go into a mode where it can see what you see and hear

Time: 6960.5

what you hear.

Time: 6961.28

And what's the device design that

Time: 6965.98

best positions an AI assistant to be

Time: 6968.175

able to see what you see and hear

Time: 6969.55

what you hear so it can best help you?

Time: 6971.29

Well, that's glasses, where it basically

Time: 6973.63

has a sensor to be able to see what you see

Time: 6975.76

and a microphone that is close to your ears that

Time: 6980.38

can hear what you hear.

Time: 6983.47

The other design goal is, like you said,

Time: 6985.96

to keep you present in the world.

Time: 6988.75

So I think one of the issues with phones

Time: 6991.87

is they pull you away from what's physically happening

Time: 6995.84

around you.

Time: 6996.34

And I don't think that the next generation of computing

Time: 6998.632

will do that.

Time: 6999.293

ANDREW HUBERMAN: I'm chuckling to myself

Time: 7000.96

because I have a friend.

Time: 7001.68

He's a very well known photographer.

Time: 7003.18

And he was laughing about how people go to a concert.

Time: 7006.18

And everyone's filming the concert on their phone

Time: 7008.573

so that they can be the person that posts the thing.

Time: 7010.74

But there are literally millions of other people

Time: 7012.72

who posted the exact same thing.

Time: 7014.053

But somehow, it feels important to post our unique experience.

Time: 7018.33

With glasses, that would essentially

Time: 7020.49

smooth that gap completely.

Time: 7023.61

You could just worry about it later, download it then.

Time: 7025.86

There are issues, I realize, with glasses

Time: 7028.29

because they are so seamless with everyday experience,

Time: 7031.023

even though you and I aren't wearing them now.

Time: 7032.94

It's very common for people to wear glasses--

Time: 7036.06

issues of recording and consent.

Time: 7037.747

[INTERPOSING VOICES]

Time: 7038.58

ANDREW HUBERMAN: Like if I go to a locker room at my gym,

Time: 7041.065

I'm assuming that the people with glasses aren't filming.

Time: 7043.44

Whereas right now, because there's a sharp transition when

Time: 7047.04

there's a phone in the room and someone's pointing it,

Time: 7050.37

people generally say, no phones in locker rooms and recording.

Time: 7055.45

So that's just one instance.

Time: 7056.86

I mean, there are other instances.

Time: 7057.51

MARK ZUCKERBERG: We have the whole privacy light.

Time: 7059.22

Did you get--

Time: 7059.8

ANDREW HUBERMAN: I didn't get a chance to explore that.

Time: 7061.275

MARK ZUCKERBERG: Yeah.

Time: 7062.192

So anytime that it's active, that the camera

Time: 7064.56

sensor is active, it's basically pulsing a bright white light.

Time: 7069.95

ANDREW HUBERMAN: Got it.

Time: 7070.95

MARK ZUCKERBERG: Which is, by the way, more than cameras do.

Time: 7073.982

ANDREW HUBERMAN: Right.

Time: 7074.94

Someone could be holding a phone.

Time: 7076.23

MARK ZUCKERBERG: Yeah.

Time: 7077.147

I mean, phones aren't showing a bright light

Time: 7080.43

when you're taking a photo.

Time: 7082.013

ANDREW HUBERMAN: People oftentimes

Time: 7083.43

will pretend they're texting and they're actually recording.

Time: 7085.225

I actually saw an instance of this in a barber shop

Time: 7087.61

once, where someone was recording

Time: 7089.17

and they were pretending that they were texting.

Time: 7091.33

And it was interesting.

Time: 7092.29

There was a pretty intense interaction that ensued.

Time: 7095.38

And it was like, wow, it's pretty easy for people

Time: 7097.87

to feign texting while actually recording.

Time: 7100.313

MARK ZUCKERBERG: Yeah.

Time: 7101.23

So I think when you're evaluating

Time: 7104.74

a risk with a new technology, the bar shouldn't be whether it's

Time: 7110.07

possible to do anything bad.

Time: 7112.29

It's whether this new technology makes it easier

Time: 7116.19

to do something bad than what people already had.

Time: 7118.95

And I think because you have this privacy light that is just

Time: 7122.58

broadcasting to everyone around you, hey,

Time: 7124.48

this thing is recording now--

Time: 7126.45

I think that makes it actually less discreet

Time: 7130.62

to do it through the glasses than what you could

Time: 7132.72

do with a phone already, which I think is basically the bar

Time: 7136.23

that we wanted to get over from a design perspective.
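
The design invariant Mark describes can be sketched in a few lines of code: the indicator must come on before the sensor goes live and must never be off while recording. This is a hypothetical illustration of that coupling, not Meta's actual firmware; the class name and `set_led` callback are invented for the sketch.

```python
# Hypothetical sketch: couple a recording-indicator LED to camera state,
# so capture can never be active while the light is off.
# (Invented names; not Meta's actual implementation.)

class CaptureIndicator:
    """Ties an indicator LED to the camera sensor's active state."""

    def __init__(self, set_led):
        # set_led: callable that drives the hardware LED (True = on).
        self._set_led = set_led
        self.recording = False

    def start_capture(self):
        # Light on *before* the sensor goes live.
        self._set_led(True)
        self.recording = True

    def stop_capture(self):
        self.recording = False
        self._set_led(False)


led_state = {"on": False}
cam = CaptureIndicator(lambda on: led_state.update(on=on))
cam.start_capture()
print(led_state["on"], cam.recording)  # True True
cam.stop_capture()
print(led_state["on"], cam.recording)  # False False
```

The point of ordering the calls this way is that there is no state in which the sensor is active but the light has not yet been switched on.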

Time: 7138.91

ANDREW HUBERMAN: Thank you for pointing out

Time: 7139.8

that it has the privacy light.

Time: 7141.05

I didn't get long enough in the experience

Time: 7143.01

to explore all the features.

Time: 7144.42

But again, I can think of a lot of uses--

Time: 7148.65

being able to look at a restaurant from the outside

Time: 7151.2

and see the menu, get a status on how crowded it is.

Time: 7156.42

As much as I love--

Time: 7157.908

I don't want to call out-- let's just

Time: 7159.45

say app-based map functions that allow you to navigate

Time: 7163.68

and the audio is OK.

Time: 7165.36

It's nice to have a conversation with somebody on the phone

Time: 7168.45

or in the vehicle.

Time: 7169.32

And it'd be great if the road was traced where I should turn.

Time: 7171.893

MARK ZUCKERBERG: Yeah, absolutely.

Time: 7173.31

ANDREW HUBERMAN: These kinds of things

Time: 7174.893

seem like it's going to be straightforward for Meta

Time: 7177.09

engineers to create.

Time: 7177.97

MARK ZUCKERBERG: Yeah, in one version, we'll have it

Time: 7179.61

so it'll also have the holographic display, where

Time: 7181.652

it can show you the directions.

Time: 7183.04

But I think that there will basically just

Time: 7185.1

be different price points that pack different amounts

Time: 7189.04

of technology.

Time: 7190

The holographic display part, I think,

Time: 7191.65

is going to be more expensive than doing

Time: 7193.84

one that just has the AI, but is primarily communicating

Time: 7198.25

with you through audio.

Time: 7200.05

So I mean, the current Ray-Ban Meta glasses are $299.

Time: 7205.09

I think when we have one that has a display in it,

Time: 7207.925

it'll probably be some amount more than that.

Time: 7209.8

But it'll also be more powerful.

Time: 7211.34

So I think that people will choose

Time: 7213.46

what they want to use based on what the capabilities are

Time: 7216.64

that they want and what they can afford.

Time: 7219.82

But a lot of our goal in building things

Time: 7223.09

is we try to make things that can be accessible to everyone.

Time: 7229.26

Our game as a company isn't to build things and then charge

Time: 7234.33

a premium price for it.

Time: 7235.8

We try to build things that then everyone can use, and then

Time: 7239.79

become more useful because a very large number of people

Time: 7242.58

are using them.

Time: 7244.45

So it's just a very different approach.

Time: 7247.38

We're not like Apple or some of these companies that just

Time: 7250.83

try to make something and then sell it for as much

Time: 7253.83

as they can, which, I mean, they're a great company.

Time: 7257.02

So I mean, I think that model is fine, too.

Time: 7261.06

But our approach is going to be we

Time: 7262.98

want stuff that can be affordable

Time: 7264.42

so that way everyone in the world can use it.

Time: 7266.295

ANDREW HUBERMAN: Along the lines of health,

Time: 7267.878

I think the glasses will also potentially solve

Time: 7270.03

a major problem in a real way, which

Time: 7272.67

is the following for both children and adults.

Time: 7275.02

It's very clear that viewing objects, screens in particular,

Time: 7278.43

up close for too many hours per day leads to myopia.

Time: 7281.46

It literally changes the length of the eyeball

Time: 7284.1

and causes nearsightedness.

Time: 7286.18

And on the positive side, we know,

Time: 7288.82

based on some really large clinical trials,

Time: 7291.4

that kids who spend--

Time: 7293.11

and adults who spend two hours a day or more out of doors

Time: 7297.82

don't experience that and maybe even reverse their myopia.

Time: 7300.61

And it has something to do with exposure to sunlight.

Time: 7302.89

But it has a lot to do with long viewing distances-- viewing

Time: 7305.14

things at a distance greater than three or four feet away.

Time: 7307.69

And with the glasses, I realize, one

Time: 7309.64

could actually do digital work out of doors.

Time: 7313.6

It could measure and tell you how much time

Time: 7316.24

you've spent looking at things up close versus far away.

Time: 7318.97

I mean, this is just another example that leaps to mind.

Time: 7322.18

But in accessing the visual system,

Time: 7324.55

you're effectively accessing the whole brain

Time: 7326.497

because the eyes are the only two bits of brain that

Time: 7328.33

are outside the cranial vault. So it just

Time: 7330.038

seems like putting technology right at the level of the eyes,

Time: 7332.77

seeing what the eyes see, has just

Time: 7335.11

got to be the best way to go.
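
The near-versus-far tally described above is simple to express in code. This is a toy sketch under stated assumptions: the 1-meter threshold (roughly the three-to-four-foot cutoff mentioned), the function name, and the sample data are all invented for illustration.

```python
# Hypothetical sketch: tally time spent viewing near vs far objects
# from gaze samples of (seconds_since_previous_sample, distance_m).
# Threshold and data are assumptions, not a real product spec.

NEAR_THRESHOLD_M = 1.0  # ~3-4 ft, the cutoff discussed above

def near_far_minutes(samples):
    """Return (near_minutes, far_minutes) from gaze-distance samples."""
    near = far = 0.0
    for dt_seconds, distance_m in samples:
        if distance_m < NEAR_THRESHOLD_M:
            near += dt_seconds
        else:
            far += dt_seconds
    return near / 60.0, far / 60.0

# e.g. 40 minutes at phone distance, 20 minutes looking across a room
samples = [(2400, 0.35), (1200, 4.0)]
print(near_far_minutes(samples))  # (40.0, 20.0)
```

A daily summary like this is exactly the kind of signal a phone in a pocket cannot capture but eye-level hardware could.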

Time: 7336.683

MARK ZUCKERBERG: Yeah.

Time: 7337.6

Well, multimodal, I think, is-- you want the visual sensation.

Time: 7344.27

But you also want text or language.

Time: 7348.05

ANDREW HUBERMAN: Sure.

Time: 7349.1

That all can be brought to the level of the eyes, right?

Time: 7351.44

MARK ZUCKERBERG: What do you mean by that?

Time: 7353.19

ANDREW HUBERMAN: Well, I mean, I think

Time: 7353.9

what we're describing here is essentially

Time: 7356.24

taking the phone, the computer, and bringing it

Time: 7359.27

all to the level of the eyes.

Time: 7360.56

And of course, one would like--

Time: 7361.61

MARK ZUCKERBERG: Oh, physically at your eyes?

Time: 7362.45

ANDREW HUBERMAN: Physically at your eyes, right?

Time: 7363.8

MARK ZUCKERBERG: Yeah.

Time: 7364.01

ANDREW HUBERMAN: And one would like more kinesthetic

Time: 7365.63

information, as you mentioned before-- where the legs are,

Time: 7367.52

maybe even lung function.

Time: 7368.72

Hey, have you taken enough steps today?

Time: 7370.36

But all of that can be-- if it can be figured out

Time: 7372.86

by the phone, it can be figured out by the glasses.

Time: 7376.37

But there's additional information there,

Time: 7378.14

such as what are you focusing on in your world.

Time: 7380.63

How much of your time is spent looking at things far away

Time: 7383.27

versus up close?

Time: 7383.96

How much social time did you have today?

Time: 7385.89

It's really tricky to get that with a phone.

Time: 7388.298

If my phone were right in front of us

Time: 7389.84

as if we were at a standard lunch

Time: 7391.215

nowadays, certainly in Silicon Valley,

Time: 7392.9

and then we're peering at our phones, I mean,

Time: 7394.775

how much real, direct attention was on the conversation

Time: 7397.52

at hand versus something else?

Time: 7398.99

You can get issues of where are you

Time: 7400.64

placing your attention by virtue of where

Time: 7403.798

you're placing your eyes.

Time: 7404.84

And I think that information is not accessible

Time: 7406.97

with a phone in your pocket or in front of you.

Time: 7409.05

Yeah, I mean, a little bit, but not nearly as rich and complete

Time: 7413.3

information as one gets when you're really

Time: 7415.28

pulling the data from the level of vision

Time: 7417.32

and what kids and adults are actually

Time: 7419.51

looking at and attending to.

Time: 7421.703

MARK ZUCKERBERG: Yeah, yeah.

Time: 7422.87

ANDREW HUBERMAN: It seems extremely valuable.

Time: 7424.85

You get autonomic information, size of the pupils.

Time: 7427.43

So you get information about internal states.

Time: 7429.32

MARK ZUCKERBERG: I mean, there are internal sensors and external ones.

Time: 7433.07

So the sensor on the Ray-Ban Meta glasses is external.

Time: 7437.99

So it basically allows you to see what you see--

Time: 7441.95

sorry, the AI system to see what you're seeing.

Time: 7444.41

There's a separate set of things which

Time: 7446.03

are eye tracking, which are also very powerful for enabling

Time: 7452.27

a lot of interfaces.

Time: 7453.8

So if you want to just look at something

Time: 7456.17

and select it by looking at it with your eyes

Time: 7459.65

rather than having to drag a controller over or pick up

Time: 7463.28

a hologram or anything like that,

Time: 7465.87

you can do that with eye tracking.

Time: 7468.37

So that's a pretty profound and cool experience, too, as well

Time: 7473.675

as just understanding what you're

Time: 7475.05

looking at so that way you're not wasting compute power

Time: 7477.93

drawing pixels in high resolution

Time: 7479.925

in a part of the world that's going to be

Time: 7482.64

in your peripheral vision.
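
The compute-saving idea here is what graphics engineers call foveated rendering: spend full resolution only near the tracked gaze point and progressively less toward the periphery. The toy function below illustrates that falloff; the eccentricity bands and rate values are invented for the sketch, not Meta's actual numbers.

```python
# Hypothetical sketch of foveated rendering's core idea: shading rate
# (fraction of full resolution) drops with angular distance from the
# gaze point. Band edges and rates are illustrative assumptions.

import math

def shading_rate(region_deg_x, region_deg_y, gaze_deg_x, gaze_deg_y):
    """Fraction of full resolution for a screen region, given its
    angular offset (eccentricity) from the tracked gaze point."""
    eccentricity = math.hypot(region_deg_x - gaze_deg_x,
                              region_deg_y - gaze_deg_y)
    if eccentricity < 5.0:     # foveal region: full detail
        return 1.0
    if eccentricity < 20.0:    # mid-periphery: 1/4 of the pixels
        return 0.25
    return 0.0625              # far periphery: 1/16 of the pixels

print(shading_rate(0, 0, 0, 0))   # 1.0 at the gaze point
print(shading_rate(30, 0, 0, 0))  # 0.0625 far in the periphery
```

This is why eye tracking pays for its own sensor cost: knowing where the fovea is pointed lets the renderer skip most of the work everywhere else.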

Time: 7485.05

So yeah, all of these things, there

Time: 7489.38

are interesting design and technology trade-offs,

Time: 7491.75

where if you want the external sensor, that's one thing.

Time: 7495.71

If you also want the eye tracking,

Time: 7497.697

now that's a different set of sensors.

Time: 7499.28

Each one of these consumes compute,

Time: 7502.31

which consumes battery.

Time: 7504.08

They take up more space.

Time: 7505.138

So it's like, where are the eye tracking sensors going to be?

Time: 7507.68

It's like, well, you want to make sure

Time: 7509.263

that the rim of the glasses is actually quite thin because--

Time: 7512.57

I mean, there's a limit to how thick glasses can

Time: 7517.52

be before they look more like goggles than glasses.

Time: 7521.64

So I think that there's this whole space.

Time: 7523.393

And I think people are going to end up choosing what

Time: 7525.56

product makes sense for them.

Time: 7526.74

Maybe they want something that's more powerful,

Time: 7528.698

that has more of the sensors, but it's

Time: 7530.45

going to be a little more expensive,

Time: 7531.95

maybe like slightly thicker.

Time: 7533.75

Or maybe you want a more basic thing

Time: 7535.58

that just looks very similar to what Ray-Ban glasses are

Time: 7539.33

that people have been wearing for decades but has AI in it

Time: 7542.78

and you can capture moments without having

Time: 7545.082

to take your phone out and send them to people.

Time: 7547.04

In the latest version, we got the ability to live stream in.

Time: 7552.21

I think that that's pretty crazy, that now you can be--

Time: 7555.57

going back to your concert case or whatever else you're doing,

Time: 7558.42

you can be doing sports or watching

Time: 7562.56

your kids play something.

Time: 7564.15

And you can be watching.

Time: 7565.71

And you can be live streaming it to your family group,

Time: 7570.39

so people can see it.

Time: 7571.47

I think that stuff is--

Time: 7575.163

I think that's pretty cool, that you basically

Time: 7577.08

have normal-looking glasses at this point that can live stream

Time: 7581.19

and has an AI assistant.

Time: 7582.72

So this stuff is making a lot faster progress

Time: 7586.02

in a lot of ways than I would have thought.

Time: 7588.97

And I don't know.

Time: 7589.87

I think people are going to like this version.

Time: 7590.97

But there's a lot more still to do.

Time: 7592.98

ANDREW HUBERMAN: I think it's super exciting.

Time: 7594.21

And I see a lot of technologies.

Time: 7595.62

This one's particularly exciting to me

Time: 7597.203

because of how smooth the interface is

Time: 7599.1

and for all the reasons that you just mentioned.

Time: 7602.28

What's happening with and what can we expect around

Time: 7605.37

AI interfaces and maybe even avatars

Time: 7608.37

of people within social media?

Time: 7610.29

Are we not far off from a day where

Time: 7613.95

there are multiple versions of me

Time: 7616.53

and you, or of other people, on the internet?

Time: 7618.145

For instance, I get asked a lot of questions.

Time: 7620.02

I don't have the opportunity to respond to all those questions.

Time: 7622.645

But with things like ChatGPT, people

Time: 7624.598

are trying to generate answers to those questions

Time: 7626.64

on other platforms.

Time: 7627.69

Will I have the opportunity to soon

Time: 7629.31

have an AI version of myself where people

Time: 7631.29

can ask me questions about what I recommend for sleep

Time: 7634.56

and circadian rhythm, fitness, mental health, et cetera based

Time: 7637.895

on content I've already generated

Time: 7639.27

that will be accurate so they could just ask my avatar?

Time: 7642.45

MARK ZUCKERBERG: Yeah, this is something

Time: 7644.82

that I think a lot of creators are going

Time: 7646.5

to want that we're trying to build

Time: 7651.58

and I think we'll probably have a version of it next year.

Time: 7655

But there's a bunch of constraints

Time: 7657.623

that I think we need to make sure that we get right.

Time: 7659.79

So for one, I think it's really important that--

Time: 7663.21

it's not that there's a bunch of versions of you.

Time: 7665.67

It's that if anyone is creating an AI assistant version of you,

Time: 7670.54

it should be something that you control.

Time: 7672.625

I think there are some platforms that are out there today

Time: 7675

that just let people like make--

Time: 7677.56

I don't know-- an AI bot of me or other figures.

Time: 7681.943

And it's like, I don't know.

Time: 7683.11

I mean, we have platform policies--

Time: 7688.47

and have for decades, since the beginning

Time: 7692.538

of the company at this point, which is almost 20 years,

Time: 7694.83

that basically don't allow impersonation.

Time: 7698.58

Real identity is like one of the core aspects

Time: 7701.1

that our company was started on.

Time: 7703.95

You want to authentically be yourself.

Time: 7705.91

So yeah, I think if you're almost any creator,

Time: 7711.06

being able to engage your community--

Time: 7714.66

and there's just going to be more demand

Time: 7717.87

to interact with you than you have hours in the day.

Time: 7720.67

So there are both people out there

Time: 7722.7

who would benefit from being able to talk

Time: 7724.44

to an AI version of you.

Time: 7726.52

And I think you, and other creators,

Time: 7728.7

would benefit from being able to keep your community engaged

Time: 7731.46

and service that demand that people have to engage with you.

Time: 7734.43

But you're going to want to know that that AI version of you

Time: 7739.89

or assistant is going to represent you

Time: 7742.833

the way that you would want.

Time: 7744

And there are a lot of things that

Time: 7746.49

are awesome about these modern LLMs.

Time: 7749.59

But having perfect predictability

Time: 7753.31

about how it's going to represent something

Time: 7755.17

is not one of the current strengths.

Time: 7757.01

So I think that there's some work that

Time: 7759.09

needs to get done there.

Time: 7760.09

I don't think it needs to be 100% perfect all the time.

Time: 7762.838

But you need to have very good confidence, I would say,

Time: 7765.13

that it's going to represent you the way that you'd

Time: 7767.32

want before you'd want to turn it on,

Time: 7769.29

which, again, you should have control over

Time: 7771.04

whether you turn it on.

Time: 7772.64

So we wanted to start in a different place, which

Time: 7775.3

I think is a somewhat easier problem, which is creating

Time: 7779.83

new characters for AI personas.

Time: 7783.76

So that way, it's not--

Time: 7786.4

one of the AIs we built is like a chef.

Time: 7791.38

And they can help you come up with things

Time: 7795.52

that you could cook and can help you cook them.

Time: 7798.97

There are a couple for people who are

Time: 7800.995

interested in different types of fitness that

Time: 7802.87

can help you plan out your workouts

Time: 7805.66

or help with recovery or different things like that.

Time: 7810.33

There's an AI that's focused on DIY crafts.

Time: 7814.958

There's somebody who's a travel expert that

Time: 7816.75

can help you make travel plans or give you ideas.

Time: 7819.27

But the key thing about all of these

Time: 7820.8

is they're not modeled off of existing people.

Time: 7824.76

So they don't have to have 100% fidelity to making sure

Time: 7830.25

that they never say something that the real person they're

Time: 7833.623

modeled after would never say because they're just made up

Time: 7836.04

characters.

Time: 7836.95

So I think that that's a somewhat easier problem.

Time: 7842.38

And we actually got a bunch of different well-known people

Time: 7847.86

to play those characters because we thought

Time: 7849.733

that would make it more fun.

Time: 7850.9

So there's like Snoop Dogg is the dungeon master.

Time: 7853.41

So you can drop him into a thread

Time: 7854.85

and play text-based games.

Time: 7856.81

And I do this with my daughter when I tuck her in at night.

Time: 7861.06

And she just loves storytelling.

Time: 7864.96

And it's like Snoop Dogg, as the dungeon master,

Time: 7868.092

will come up with here's what's happening next.

Time: 7870.05

And she's like, OK, I turn into a mermaid.

Time: 7872.35

And then I like swim across the bay.

Time: 7874.03

And I go and find the treasure chest and unlock it.

Time: 7877.273

And it's like, and then Snoop Dogg just always

Time: 7879.19

will have a next version of the--

Time: 7881.08

a next iteration on the story.

Time: 7882.55

So I mean, it's stuff that's fun.

Time: 7884.54

But it's not actually Snoop Dogg.

Time: 7886.132

He's just the actor who's playing the dungeon master,

Time: 7888.34

which makes it more fun.

Time: 7889.34

So I think that's probably the right place to start,

Time: 7891.91

is you can build versions of these characters

Time: 7896.32

that people can interact with doing different things.

Time: 7898.847

But I think where you want to get over

Time: 7900.43

time is to the place where any creator or any small business

Time: 7905.26

can very easily just create an AI assistant that can represent

Time: 7909.91

them and interact with their community or customers,

Time: 7914.05

if you're a business, and basically just help

Time: 7918.25

you grow your enterprise.

Time: 7920.93

So I think that's going to be cool.

Time: 7923.59

It's a long-term project.

Time: 7924.64

I think we'll have more progress on it to report on next year.

Time: 7929.35

But I think that's coming.

Time: 7931.398

ANDREW HUBERMAN: I'm super excited about it

Time: 7933.19

because we hear a lot about the downsides of AI.

Time: 7935.83

I mean, I think people are now coming around to the reality

Time: 7938.92

that AI is neither good nor bad.

Time: 7940.448

It can be used for good or bad.

Time: 7941.74

And there are a lot of life-enhancing spaces

Time: 7944.02

that it's going to show up in and really, really

Time: 7946.18

improve the way that we engage socially, what we learn,

Time: 7950.95

and that mental health and physical health

Time: 7952.903

don't have to suffer and, in fact,

Time: 7954.32

can be enhanced by the sorts of technologies

Time: 7956.248

we've been talking about.

Time: 7957.29

So I know you're extremely busy.

Time: 7959.9

I so appreciate the large amount of time

Time: 7963.49

you've given me today to sort through all these things.

Time: 7966.25

MARK ZUCKERBERG: Yeah, it's been fun.

Time: 7966.85

ANDREW HUBERMAN: And to talk with you and Priscilla

Time: 7968.975

and to hear what's happening and where things are headed,

Time: 7972.49

the future certainly is bright.

Time: 7973.93

I share in your optimism.

Time: 7975.88

And it's been only strengthened by today's conversation.

Time: 7979.07

So thank you so much.

Time: 7980.92

And keep doing what you're doing.

Time: 7982.72

And on behalf of myself and everyone listening,

Time: 7986.18

thank you because, regardless of what people say,

Time: 7988.63

we all use these platforms excitedly.

Time: 7991.73

And it's clear that there's a ton of intention,

Time: 7994.34

and care, and thought about what could be in the positive sense.

Time: 8001.54

And that's really worth highlighting.

Time: 8003.46

MARK ZUCKERBERG: Awesome, thank you.

Time: 8004.96

I appreciate it.

Time: 8005.627

ANDREW HUBERMAN: Thank you for joining me

Time: 8007.335

for today's discussion with Mark Zuckerberg and Dr. Priscilla

Time: 8009.89

Chan.

Time: 8010.49

If you're learning from and/or enjoying this podcast,

Time: 8013.2

please subscribe to our YouTube channel.

Time: 8014.97

That's a terrific zero-cost way to support us.

Time: 8017.46

In addition, please subscribe to the podcast

Time: 8019.46

on both Spotify and Apple.

Time: 8020.99

And on both Spotify and Apple, you

Time: 8022.67

can leave us up to a five star review.

Time: 8024.74

Please also check out the sponsors

Time: 8026.33

mentioned at the beginning and throughout today's episode.

Time: 8028.83

That's the best way to support this podcast.

Time: 8031.26

If you have questions for me, or comments about the podcast,

Time: 8033.86

or guests that you'd like me to consider

Time: 8035.527

hosting on the Huberman Lab podcast,

Time: 8037.17

please put those in the comment section on YouTube.

Time: 8039.63

I do read all the comments.

Time: 8041.15

Not during today's episode, but on many previous episodes

Time: 8043.64

of the Huberman Lab podcast, we discuss supplements.

Time: 8046.32

While supplements aren't necessary for everybody,

Time: 8048.45

many people derive tremendous benefit from them for things

Time: 8051.26

like enhancing sleep, hormone support, and improving focus.

Time: 8054.292

If you'd like to learn more about the supplements discussed

Time: 8056.75

on the Huberman Lab podcast, you can go to live momentous--

Time: 8059.66

spelled O-U-S--

Time: 8060.71

so livemomentous.com/huberman.

Time: 8063.785

If you're not already following me on social media,

Time: 8065.91

it's hubermanlab on all social media platforms.

Time: 8068.46

So that's Instagram, Twitter-- now called X--

Time: 8071.64

Threads, Facebook, LinkedIn.

Time: 8073.36

And on all those places, I discuss

Time: 8074.94

science and science-related tools, some of which

Time: 8076.98

overlaps with the content of the Huberman Lab podcast,

Time: 8079.323

but much of which is distinct from the content

Time: 8081.24

on the Huberman Lab podcast.

Time: 8082.44

So again, it's hubermanlab on all social media platforms.

Time: 8085.655

If you haven't already subscribed

Time: 8087.03

to our monthly Neural Network Newsletter,

Time: 8089.1

the Neural Network Newsletter is a completely zero-cost

Time: 8091.65

newsletter that gives you podcast summaries

Time: 8094.23

as well as toolkits in the form of brief PDFs.

Time: 8096.93

We have toolkits related to optimizing sleep, regulating

Time: 8100.83

dopamine, deliberate cold exposure,

Time: 8102.93

fitness, mental health, learning, and neuroplasticity

Time: 8106.38

and much more.

Time: 8107.37

Again, it's completely zero-cost to sign up.

Time: 8109.26

You simply go to hubermanlab.com, go over

Time: 8111.24

to the Menu tab, scroll down to newsletter

Time: 8113.22

and supply your email.

Time: 8114.72

I should emphasize that we do not

Time: 8116.1

share your email with anybody.

Time: 8118.08

Thank you once again for joining me for today's discussion

Time: 8120.54

with Mark Zuckerberg and Dr. Priscilla Chan.

Time: 8123.09

And last but certainly not least,

Time: 8125.32

thank you for your interest in science.

Time: 8127.05

[MUSIC PLAYING]

Copyright © 2024. All rights reserved.