Silicon and Sparks: How AI and Crypto Are Remaking the Grid (Part 2 of 2)
Ep. 31


Episode description

Welcome back for Part 2! If you’re just joining us, be sure to listen to last week’s episode first.

This week on The Overlap, we’re diving deeper into the paradox of AI and crypto. We’ve established they’re incredibly energy-hungry, but could their massive consumption actually force us to innovate our way out of our energy problems? From the industrial revolution’s steam engine to the promise of nuclear fusion, we’ll explore historical parallels and future possibilities.

We’ll also tackle the dark side of this new tech: the volatility of crypto, the ethical minefield of AI, and the very real dangers of bias, misinformation, and lack of accountability. Is the need for more power the only thing that will push us toward a clean energy future, or does that path lead to a global scramble for resources and deepening inequality?

Join Joshua and Will as they navigate this technological trilemma of innovation, consumption, and responsibility. It’s a powerful conversation about the choices we’re making today that will determine our tomorrow.

Download transcript (.srt)
0:00

If this is where you picked up the podcast today, you are in the wrong place.

0:07

This is a part two of a two-part episode.

0:11

Go back to last week's episode and check that one out, give it a listen, and come back

0:15

right here and join us here at The Overlap as we talk about AI, cryptocurrency, electricity,

0:22

and the future of innovation.

0:41

All right.

0:42

Now welcome back to part two of The Overlap podcast about AI, cryptocurrency, and electricity.

0:47

Will, why don't you kick it off where we left off from?

0:51

Right.

0:52

So, why the hope for these two technologies? They're very energy hungry, and we understand

1:00

now why they're so energy hungry.

1:03

How do we come up with this idea that perhaps they could rescue us from the energy scarce

1:06

or energy consuming future that we fear and actually give us a positive way out?

1:12

I asked that question, I'm actually going to ask you that question.

1:14

Are you asking me?

1:15

I mean, I have my answer, but.

1:18

Well, what we have is we have examples from the past, right?

1:21

Yeah.

1:22

And the fact that new technologies, although they appear to at first be energy consuming,

1:26

actually enable us to create things that are energy producing, right?

1:29

The very steam turbines we were just talking about, if we could bury these AI and use them

1:34

to heat water and all that, those turbines exist because of the industrial revolution

1:38

and the things that that made available.

1:40

Our ability to harness steam in engines was what gave us the capacity to generate much

1:47

more energy than we actually used ever prior to the industrial revolution.

1:51

And while there's.

1:52

Yeah, I think.

1:53

Go ahead.

1:54

Go ahead.

1:55

We both do that to each other all the time.

1:57

I think that it's important to evaluate this, right?

2:01

I think we're at the precipice of a junction in which we're looking at a major categorical

2:08

shift in how we do things, right?

2:10

For millennia, the human progress was limited by the power that could be generated by human

2:17

and animal muscle.

2:19

Right.

2:20

And that was that was the limit, whether that was swinging of an axe or the walking of an

2:25

ox.

2:27

So the development of the water wheel, for example, produced an increase.

2:33

We had it.

2:34

We had it there that we were now able to extend ourselves past our own physical limitations

2:40

and the limitations of work animals.

2:43

So we have the water wheel, right.

2:45

But they were then geographically tethered.

2:48

They were tethered to rivers.

2:49

If there was no river in the area, there could be no water wheel.

2:52

Right.

2:53

And then you have the steam engine like you were talking about.

2:57

That thing by itself transformed the power available to a society, multiplying it by

3:03

about 600 times.

3:06

So a steam engine had about 600 times the maximum amount of power that a human being

3:11

or an animal could produce.

3:13

Exactly.

3:14

And that's good.

3:16

But because the coal used to fuel the steam engines could be transported almost

3:22

anywhere, right.

3:23

Now you can have a steam engine, with energy production completely decoupled

3:28

from geography for the first time in the history of the modern world.

3:33

Which is the industrial revolution, which gave way to factories and railroads and mass

3:38

production.

3:40

Exactly.

3:41

And now we have access to a technology that can simulate potentially one day simulate

3:46

new types of engines and new types of energy retrieving and generating devices that can

3:52

potentially unlock more power than we can even conceive of now to not only feed the

3:57

future AI and future cryptocurrency devices, but also many other things that we may not

4:03

have even dreamt of yet.

4:05

Perhaps those starships that you were talking about before.

4:08

Yeah, I think I think one way that crypto is similar in this scenario is that it decouples

4:14

our system of trading from any one country.

4:18

Right.

4:19

And allows money to flow freely across borders without needing middlemen.

4:26

We're using a network in place of that.

4:28

So we're increasing our ability to exchange goods and services.

4:33

And then on the AI side, I see it as you know, in the same way that the steam engine was

4:38

an answer to overcoming the limitations of human physical ability, AI is now the

4:49

hope. I don't know that it does this now.

4:51

I don't think it does yet. But the hope is that it can overcome our mental capacity and our

4:56

mental abilities, not just by 600 times, but by even more than 600 times.

5:01

I mean, if we had AGI, we could put AGI to building its own ASI, and we would potentially

5:10

see even bigger gains.

5:13

But go ahead.

5:14

Go ahead.

5:15

I didn't have anything else.

5:16

I was going to let you go with that thought.

5:18

Oh, no, I think I lost my train of thought there.

5:21

Well, I think, I mean, to your point, like this idea that by setting free the value system,

5:27

right.

5:28

The thing that the sort of to use the analogy or to torture the analogy a little bit, the

5:33

engine and the input of our economic systems.

5:37

By setting that free, we can enable people to utilize the technology and the intelligence

5:43

that we hope AGI will eventually generate to take us to that level.

5:48

Like you said, take us far beyond anything we've conceived and way beyond 600 times our

5:53

output productivity.

5:54

Because I think you could probably argue that GPTs and LLMs have already done that,

5:59

to some extent, with our thought output or content outputs, right?

6:04

I think there are studies about how much output there is from GPTs even currently.

6:09

And we haven't even begun to scratch the surface of what they're capable of.

6:12

Yeah.

6:13

Because I find in general our data tends to be reflexive, you know.

6:16

Right.

6:17

So I mean, that's kind of, you know, the history though, I guess to dive a little bit more

6:21

into that or to elaborate a little bit more on that is just that, you know, the new technologies

6:25

enable things that were never thought of, right.

6:28

So the harnessing of electricity or the harnessing of a steam turbine gave the possibility of

6:32

steady enough electricity or steady enough power output to use incandescent light bulbs,

6:37

right.

6:38

Which basically freed up the 24-hour world that we all live

6:41

in now, right.

6:42

Whereas before you were limited to, you know, how many candles you could put around you

6:46

at a given time, and there was a limit to how much light that

6:51

generated.

6:52

You know, eventually now we have skyscrapers full of incandescent light bulbs and all sorts

6:56

of other, you know, technologies that were never even dreamt up before the industrial

7:00

revolution.

7:02

The hope is that that will repeat itself but even on a much larger scale, an incomprehensively

7:06

larger scale than it did the first time around.

7:09

Yeah.

7:10

I think another example of that is sort of like the fossil fuel generation industry,

7:15

right.

7:16

I think none of us disagree that fossil burning and consumption of fossil fuels is bad for

7:21

the planet.

7:22

It's bad for our overall air quality.

7:24

It affects our ability to breathe and to succeed in the far future.

7:28

Maybe not as far as I think but that persistent demand for energy drove relentless technological

7:36

advancement.

7:38

So over the past century you've seen improvements in extraction techniques, extraction technologies,

7:43

horizontal drilling in wells, right, and look I'm not a fan of fracking.

7:47

I, especially where I live, I think it actively hurts me.

7:52

I've sat through earthquakes that I've never, never experienced an earthquake before in

7:57

the middle of the United States but because of fracking we're seeing those things increase.

8:01

US coal miners, for instance:

8:05

their production, their output, increased ninefold in 50 years.

8:08

So these innovations have always pushed back the predictions of resource scarcity by giving

8:17

us a higher output and driving innovation through demonstrating that reserves

8:24

are really not a fixed geological quantity but a dynamic sort of economic and technological

8:32

one.

8:33

Right, and essentially what it amounts to as I see it is that humanity is hurtling down

8:39

the road at an ever-increasing speed.

8:42

The question is where are we headed towards?

8:43

Paradise and utopia or are we headed towards destruction?

8:47

And the signs change pretty frequently depending on what you look around and see as to which

8:53

road we're on or whether we're on the road that could lead to both.

8:56

And the decisions we make today are probably going to determine where we end up.

9:00

Yeah, 100%.

9:01

And technology's just going to accelerate that speed and multiply that effort.

9:05

Yeah, and I mean I guess sometimes I think we do have to kind of sit back and say to

9:10

what end.

9:11

Right.

9:12

And I think that reason differs for everyone.

9:16

I think I have my own reasons for wanting to see it.

9:20

I think a couple of ways that we've already seen that I think are really good examples

9:26

already, with text generation, is the ability to use specific AI models to mimic the replication

9:37

and synthesis of proteins, which can produce new chemicals and new cures; it helps to

9:47

fight diseases.

9:49

Now I think that there's pluses and minuses to that both.

9:53

Maybe, through these sort of protein synthesis, we can overcome and help eventually cure cancer,

10:01

but at the same time that also might open up a whole new world of toxic chemicals that

10:07

people can use for biological warfare.

10:10

Right?

10:11

I think that there are tradeoffs that we have which is, I mean obviously it's a great reason

10:16

for regulation, which I'm pro-regulation around these sorts of things for that very reason.

10:22

But it's like saying, look, we could find the cure to every known world illness by employing

10:29

an AI model to do it.

10:30

Yes, it takes a lot of electricity, but we could save lives in the long run.

10:34

So should we completely ignore our ability to cure these diseases because it uses a lot

10:40

of electricity and that could eventually kill the planet as a whole?

10:44

Or do we say, no, why don't we point it at those problems and develop those technologies

10:50

to solve those problems as they arise at the speed of faster than human thinking?

10:56

Right.

10:57

And who do we allow to utilize those computers?

10:59

Because on the one hand, there's the worry about a rogue actor who might discover some

11:03

sort of terrible new virus and unleash it on the world.

11:07

But then on the other hand, there are pharmaceutical companies who know that they'll be putting

11:10

themselves out of business if they cure all the illnesses.

11:12

And are they going to be able to resist the temptation to create new illnesses on their

11:17

pipeline?

11:18

Hey, that's why I'm a socialist, I'm just saying, you know what I mean?

11:21

Not right there, that's reasonable.

11:22

A democratic socialist, not by force, but through democratic means.

11:26

I think one of the things that I wanted to point out, and look, there's detriments to

11:29

this and I hope we can poke holes in it.

11:34

And what I think you're saying, Will, is that as we get down this road that there kind of

11:43

is no way back from, right?

11:44

You wouldn't be able to just reel AI back in and reel cryptocurrency back in.

11:49

Obviously, in the cryptocurrency sake, part of the design was such that it could not be

11:55

reeled in by any single entity or any single force.

12:01

But in the sense of AI, I mean, there are ways of already putting measures in place

12:07

to limit harm. I think, as things have come around, we have also increased, A,

12:17

our knowledge, right?

12:18

Like scientifically, when we were putting coal out in the world, we didn't think, "Oh,

12:22

this is going to cause global climate change."

12:26

We basically just thought we have a need and that is we need more energy.

12:31

We need to put light bulbs in everybody's houses so we could free them from having

12:36

only eight or 12 hours of daylight every day. And now it's:

12:42

we don't need people dying of preventable diseases, but we also don't need to trade our ecological

12:48

wealth for curing diseases, because either way, we're going to die.

12:57

Which one's going to be the worst?

12:58

And I think that doesn't have to be a decision that we live with every day.

13:02

I mean, sorry, that we live with one time.

13:05

I think it's something that we can live with every day and say, "Look, are we getting closer?"

13:11

That to me is kind of the epitome of science is collecting that data and making meaningful

13:17

observations, and then adjusting to those observations.

13:22

Right, it's feedback.

13:24

It's using our feedback and using it to take us to the next step and make the next decision,

13:30

the scientific method.

13:31

So let's talk a little bit about some of the, and even before AI and cryptocurrency, we've

13:36

had a need of wanting and needing to generate more and more and more electricity as we increase

13:44

our technology, right?

13:47

The current power grid is in a miserable state.

13:50

We know that.

13:51

We need lots of infrastructure improvements.

13:54

And we're seeing major swaths of Texas being browned out at certain times.

13:59

They talk about blackouts in California, and we've seen demand-related problems.

14:06

And I think it's more than just one thing.

14:09

First of all, I think that it's because we generate electricity for money, we create false

14:16

scarcity.

14:17

And if we don't create a whole lot of generation, if we don't have a lot of means of generating

14:23

electricity, we keep the costs higher, right?

14:27

Because then we keep the threat of losing it that much more real.

14:30

And as we increase not just these technologies, but as the temperature rises globally, there

14:38

will be an even greater need for what?

14:41

Temperature control, for air conditioning, for heating, in incredibly cold winters and

14:46

air conditioning in hot summers.

14:49

That demand is going to increase regardless, as long as we continue to increase as a species.

14:56

We've agreed.

14:57

We're not getting any further away from utilizing that energy.

14:59

That's where we're headed, is more and more consumption.

15:03

And the hope is that these technologies will allow us to keep pushing the limits further

15:08

and further while not destroying ourselves in the process.

15:11

Yeah.

15:12

So let's talk about some of the advances that we've already seen in the current electricity

15:18

and power generation space that are sort of driving innovation.

15:22

I think one thing that has been kind of flip-flopped back and forth and has somehow been made political

15:29

is something like nuclear power generation.

15:36

Ultimately, it's terrible, but it really does come down to steam engines.

15:41

And nuclear power generation facilities are essentially just using nuclear power to boil

15:48

water because it has a greater output to input ratio.

15:55

And then steam engines turn and that generates electricity.

15:58

We're still on the same standard, right?

16:03

But it's incredibly difficult to enrich plutonium, to find plutonium, and it's also very dangerous

16:11

to mine it.

16:12

It's a very dirty process, both on the front end and the back end.

16:15

And then there's also the problems, obviously, Chernobyl, if there's some sort of radioactive

16:21

chemical spill in this scenario or radioactive condition.

16:25

Right.

16:26

But we're able to hopefully utilize these, again, these advancing technologies to offset

16:31

those dangers to give us new safety procedures, new development.

16:34

There's a lot of ways, and they're also advising us or simulating these sorts of reactions

16:40

and telling us how to make them safer, how to make them more efficient, and how to increase

16:44

the output without endangering ourselves or blowing ourselves into oblivion.

16:48

Yeah.

16:49

And there's also been alternatives proposed, right?

16:53

There's two approaches when you're faced with the reality of the dangers of a particular

17:00

type of energy generation.

17:01

In the case of nuclear, yeah, nuclear waste and nuclear spills and enrichment of nuclear

17:08

material is dangerous, inherently.

17:11

What can we do to overcome that?

17:13

And you see that has already driven productivity gains in energy.

17:20

China, this year, brought on their first thorium nuclear reactor.

17:26

Now thorium is a little bit easier to cultivate, to enrich.

17:30

I don't think the returns are as high, but the dangers are lower.

17:33

So they're bringing on this thorium.

17:36

From what I understand, thorium is still pretty dirty.

17:38

It's similar to plutonium, but not as much so.

17:42

Again, a trade off, right?

17:45

For additional progress, you got to have a little bit of malaise.

17:51

We've made significant, in the past three years, we've made a significant movement toward

17:56

nuclear fusion.

17:58

For the first time ever, I believe China, and then it was reproduced in California,

18:06

there was a nuclear fusion reaction that, for the first time ever, produced a net energy gain.

18:17

It put out more energy than it consumed, which is enormous.
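If this refers to the December 2022 inertial-confinement result at the National Ignition Facility in California (an assumption on my part; the commonly cited figures are roughly 2.05 megajoules of laser energy delivered to the target and 3.15 megajoules of fusion yield, and the gain is measured against the laser energy, not wall-plug electricity), the "net gain" arithmetic looks like this:

```python
# Approximate, commonly reported figures for the 2022 ignition-style result;
# used here only to illustrate what a "net energy gain" means.
laser_energy_mj = 2.05   # energy delivered to the fuel target (MJ)
fusion_yield_mj = 3.15   # energy released by the fusion reaction (MJ)

gain = fusion_yield_mj / laser_energy_mj
print(f"target gain Q = {gain:.2f}")  # about 1.54, i.e. more energy out than in
```

A gain above 1.0 at the target is the milestone; getting net *electricity* out of a whole plant is a much higher bar, since the lasers themselves draw far more power than they deliver.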

18:23

I mean, that's massive because in a fusion reaction, you're not going to have the actual,

18:31

the explosive part, right?

18:33

If you think about it, it goes inward.

18:36

It implodes, as opposed to exploding.

18:40

You're joining two atoms together rather than exploding them apart.

18:47

Which again, is going to, I mean, it could be massive.

18:50

If we could get nuclear fusion to a point where the net gains are just interminable,

18:58

you could literally put a fusion reactor, in terms of safety, you could put a fusion

19:02

reactor at everybody's house.

19:04

It could be just as common as a hot water heater in 30 years.

19:08

I don't know why you'd want to heat hot water, but.

19:11

I'm looking for the Mr. Fusion, you know, from Back to the Future, where you just throw

19:15

your trash in the engine in your car, and it generates all the power you need.

19:20

But realistically, that's not far from what we're moving toward.

19:24

Yeah.

19:25

But what has driven that progress?

19:30

The need for power, right?

19:31

The need for more power.

19:32

The necessity for that electricity.

19:33

Exactly.

19:34

Now...

19:35

Thus, the source and the potential solution to our problem.

19:39

Exactly.

19:40

Now, I'm not saying fusion necessarily will, but it could end the scarcity of electricity.

19:45

But like you said earlier about pharmaceutical companies not wanting to find cures for,

19:51

you know, the cancer drugs and things of this nature, there are also gigantic lobbies, gigantic,

19:58

billionaire, trillionaire lobbies in the energy creation and energy dispersion space that

20:10

absolutely would not want that technology to get into the hands of the average human

20:13

being.

20:14

That's right.

20:15

I think it's fortunate in some sense that OpenAI may be worth more than ExxonMobil,

20:22

once it's publicly traded, in the sense of where that economic energy flows.

20:28

Where it doesn't flow, right?

20:30

Now, I would think that somebody like Sam Altman and OpenAI might be incentivized to

20:38

invest in that sort of thing.

20:41

But if supposedly, and this is what they tell us about capitalism, right?

20:45

The necessity drives the innovation.

20:47

So if OpenAI has a necessity for unlimited amounts of power, they have no choice but

20:55

to invest in things that will continue to allow them to consume that power.

21:00

Then they will invest in the clean creation of it.

21:05

And hopefully it trickles down to us.

21:07

Now, we know that doesn't happen.

21:09

That's why I think we as a society should say, let's fund, let's self-fund as a country,

21:16

as a nation, as a world.

21:18

Let's figure out how to self-fund fusion so that we all can have a fusion reactor.

21:23

Everybody's car is powered by fusion and we're driving around in a cleaner world because

21:26

we're not burning coal, we're not burning gas, and we're not burning diesel.

21:32

And then we own the means.

21:34

So then what do you do?

21:35

Now, in my opinion, I don't have a problem with private companies making money by generating

21:40

electricity and putting it into a nationally owned grid.

21:44

As a democratic socialist, I believe that we should own the grid as a whole and we should

21:50

also invest as a people in reactors of whatever sort, even fine with a nuclear reactor, like

21:58

a fission reactor to generate electricity because that is the best technology that we

22:03

currently have at a large scale.

22:06

Thorium is still not at a large scale.

22:07

It's still being test bedded in China and fusion is still not even remotely in the realm

22:15

of being able to scale.

22:17

So we should be in the generation business because we're seeing rising power costs across

22:26

the nation now and they're saying it's because of the rising cost of fuel.

22:31

But they're using coal and they're using diesel to drive generators, to generate electricity.

22:39

And they're not going to have a reason to change from those things if they can just

22:44

pass the costs on to the people who have to pay the electricity bills.

22:47

Right, as long as they can keep up, they'll continue to burn the most available and the

22:53

most profitable source for them.

22:55

Exactly.

22:56

So I will say it is my position that innovation will be driven by the need in this space.

23:04

So we as a show would be remiss if we didn't talk about the very real, very raw reality

23:13

of how much this sucks in the everyday for specific people.

23:19

Grok has built a giant AI farm in Georgia.

23:26

OpenAI... no, Meta. Meta is building a giant AI farm in Louisiana.

23:32

Because why?

23:34

Low cost of energy.

23:36

Right?

23:37

Relative to the rest of the country.

23:39

And it's putting people out of house and home, obviously, by scooping up land.

23:46

Nobody wants to live near those things.

23:47

They're loud, they're hot, and they're hard to be around.

23:53

Right.

23:55

And they're not really approved by the people.

23:59

Now obviously, who would want, who would sign up to have an AI farm in their backyard?

24:04

Nobody, unless they were somehow benefiting from it.

24:08

So I think it's important in these scenarios to use legislation to say you can build an

24:13

AI farm, but you also have to build a power generation plant, or you have to build a power

24:18

generation facility and contribute, not just remove.

24:28

But the real fact is, exactly.

24:31

And the municipalities are saying, yeah, come on in, we'll give you tax breaks even.

24:35

The problem is, those costs get passed through, sent down to the average people who live in

24:42

that community.

24:43

And it's wrong, it's not right.

24:45

But I want to talk a little bit from a perspective of skepticism, right?

24:51

Let's talk about a little bit of cryptocurrency.

24:55

Right?

24:57

I think there are some practical and ethical challenges of crypto.

25:07

Do you have any opinions on these?

25:09

We'll see.

25:10

I'll ask you too specifically.

25:11

I was talking to the audience, but I also am talking to you.

25:17

So first of all, I think that here's why crypto sucks.

25:20

Right?

25:21

Volatility.

25:22

Crypto, I said the spot price today was 111,000.

25:26

About 112,000.

25:28

It was 127,000, 123,000 two weeks ago.

25:33

So if you own one Bitcoin, in the last couple of weeks, you've lost $12,000.

25:38

$15,000.

25:39

If that's your 401k, then that could be pretty scary.
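Using the rough figures quoted in this exchange (a spot price around $112,000 today versus roughly $127,000 two weeks ago; both are approximate numbers from the conversation, not live data), the swing on a single Bitcoin works out like this:

```python
# Rough illustration of the volatility being described; the prices are
# the approximate figures quoted in the conversation, not market data.
price_two_weeks_ago = 127_000  # USD per BTC (approximate)
price_today = 112_000          # USD per BTC (approximate)

loss = price_two_weeks_ago - price_today
pct_drop = loss / price_two_weeks_ago * 100

print(f"Loss on one Bitcoin: ${loss:,}")  # Loss on one Bitcoin: $15,000
print(f"Two-week drop: {pct_drop:.1f}%")  # Two-week drop: 11.8%
```

An 11.8% drop in two weeks is the kind of move a broad stock index might take a bad year to produce, which is the point being made about retirement savings.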

25:42

Exactly.

25:43

And guess who just legalized alternative assets inside a 401(k).

25:49

Yeah.

25:50

So I think that we're not talking about just volatility, guys.

25:54

We're not talking about like the index goes up a point or two and your 401k gains $12

26:01

one day and $127 the next or whatever.

26:04

We're talking about extreme price volatility.

26:08

That kind of limits cryptocurrency's usefulness as a stable medium of exchange.

26:12

They've tried to fix this by introducing stablecoins.

26:16

There's actually USDC, the US Dollar Coin, that's pegged to the dollar.

26:21

There's also Tether, which is another cryptocurrency that's directly tied to the value of the US

26:27

dollar.

26:28

But it's an asset that can fluctuate by double digit percentages.

26:33

We're talking 30, 40% in a single day.

26:37

And that's really impractical for commercial transactions.

26:40

If you take a hundred bucks and you buy a hundred dollars worth of Bitcoin, that's not

26:45

a lot of Bitcoin.

26:46

It's about a thousandth of a Bitcoin, right?

26:50

And that is worth a hundred dollars when you buy it.

26:54

At the end of the day, that hundred dollars could only have the spending power of 40 when

26:59

you're talking about that kind of volatility.
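To make the hundred-dollar example concrete, here's a sketch using the roughly $111,000 spot price and the 40% single-day swing mentioned above (both taken as rough assumptions from the conversation):

```python
# Sketch of the purchasing-power example; figures are the approximate
# numbers from the conversation, not live quotes.
spot_price = 111_000  # USD per BTC (approximate)
purchase_usd = 100

btc_bought = purchase_usd / spot_price
print(f"{btc_bought:.6f} BTC")  # about 0.0009 BTC, roughly a thousandth of a coin

# If the price falls 40% in a day, the dollar value of that stake falls with it:
value_after_drop = purchase_usd * (1 - 0.40)
print(f"${value_after_drop:.2f}")  # $60.00 of spending power left
```

That kind of intraday repricing is what makes it hard to quote a sandwich, a paycheck, or a lease in Bitcoin.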

27:01

So unlike payments made with a credit card or a debit card, those sorts of crypto transactions

27:09

have no consumer protections either.

27:11

So if you get defrauded of hundreds of thousands of dollars of cryptocurrency,

27:17

ain't nobody can help you.

27:20

They have those.

27:21

They have those coins now.

27:23

They're tied to their cryptographic keys and there's no protection.

27:29

If you make a mistake, just fat finger an address that you're trying to send somebody

27:34

money to.

27:35

You can hope they send it back voluntarily, but there's no mechanism to get that money

27:40

back.

27:41

There's also the risks: the idea of a decentralized system is contrasted against the practical

27:50

reality of how we interact with it.

27:53

So the underlying blockchain, the ledger that we were talking about earlier, it is really

28:00

secure by design.
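A minimal sketch of why the ledger itself is tamper-evident: each block stores a hash of the previous block, so rewriting history breaks every link after the edit. This is toy Python and assumes nothing about Bitcoin's actual block format:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash,
    # so changing any earlier block changes every hash after it.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
prev = "0" * 64  # placeholder hash for the genesis block
for tx in ["alice->bob:5", "bob->carol:2"]:
    block = {"tx": tx, "prev": prev}
    prev = block_hash(block)
    chain.append(block)

# Tampering with the first block breaks the link stored in the second:
chain[0]["tx"] = "alice->bob:500"
print(block_hash(chain[0]) == chain[1]["prev"])  # False: the chain no longer verifies
```

The real network adds proof-of-work and thousands of independent copies on top of this linking, but the tamper-evidence comes from exactly this structure. None of that protects coins sitting on a centralized exchange, which is the contrast being drawn here.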

28:02

Unfortunately, most people store their assets on exchanges which are centralized.

28:10

Why? For convenience: the ability to withdraw, to buy more, to add to it, to purchase it, to

28:16

trade it, to make money with that money.

28:19

Or, in my case, where I got hung up is that it guaranteed a return greater than the market

28:26

average, 5.9% or something to that effect, to basically let them borrow your cryptocurrency

28:33

and use it to trade inside of a volatile market and hopefully get something back from it.

28:43

There was a whole lot of mess with the Gemini exchange and you can feel free to look that

28:48

up.

28:49

I was very lucky in that I was able to get it all back eventually, but it took two and

28:54

a half years.

28:56

Otherwise it was just sitting there in flux and it was not a small amount of money.

29:01

When you put money in a bank, if there's a large scale hack, if you're defrauded, you

29:07

go to the bank and you say, "Hey, look, this is fraud.

29:09

I didn't do this.

29:10

This is not my transaction."

29:12

You were hacked or somebody generated my credit card number.

29:16

That's not mine and you get your money back because that is a built-in protection.

29:20

Our system builds in those protections through FDIC insurance.

29:25

Your money is guaranteed up to $250,000 or whatever.

29:28

Same thing with the NCUA for credit unions.

29:31

There is no organization like this.

29:33

Can we make it?

29:34

Absolutely.

29:35

Could it be done?

29:36

Yes, absolutely.

29:37

But who would guarantee it?

29:39

Essentially, it would be a form of insurance, which what do we know about insurance?

29:43

The only people who ever make money using insurance are the people who create insurance.

29:48

They don't stay around that long to give away all their money, all the premiums.

29:54

Exactly.

29:56

In order to get some sort of regulation, we would have to reasonably create a cryptocurrency

30:04

that needed to be centralized in order to protect it.

30:07

It's like the Wild West in the sense that you can use it to trade money, but if you

30:12

mess up, that's on you.

30:13

It's why the libertarians love it.

30:15

They're like, "It's all about your decisions."

30:17

Also, there are smart contracts.

30:20

I didn't really talk about that, but there are smart contracts that automate transactions

30:24

on the Ethereum platform, and they can contain actual coding errors or flaws that can lead

30:34

to leaking of coins, which could lead to catastrophic losses, to the tune of 100%.
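The kind of coding flaw being described can be sketched with a toy example. This is plain Python standing in for a smart contract, not real Solidity or the Ethereum API, and the contract, accounts, and amounts are all invented for illustration; it mimics the classic reentrancy bug, where a contract pays out before updating its own ledger:

```python
# Toy stand-in for a smart contract with a reentrancy-style bug.
class ToyContract:
    def __init__(self, balances):
        self.balances = dict(balances)

    def withdraw(self, account, amount, on_send):
        # BUG: the external call (on_send) happens BEFORE the balance
        # update, so a malicious callback can re-enter withdraw() while
        # the ledger still shows the old balance.
        if self.balances[account] >= amount:
            on_send(amount)                    # external call first...
            self.balances[account] -= amount   # ...state update second

contract = ToyContract({"attacker": 10, "pool": 90})
stolen = []

def malicious_send(amount):
    stolen.append(amount)
    if len(stolen) < 3:  # re-enter before the ledger is updated
        contract.withdraw("attacker", 10, malicious_send)

contract.withdraw("attacker", 10, malicious_send)
print(sum(stolen))                    # 30 paid out from a 10-unit balance
print(contract.balances["attacker"])  # -20: the ledger went negative
```

An ordering mistake of exactly this shape is what drained The DAO on Ethereum in 2016; the standard defense is to update state before making any external call.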

30:44

Also, obviously, I'm going to point it out, it's almost not even worth considering illicit

30:49

use.

30:50

Cryptocurrency has been the number one way to do something illegal in the world since its

30:55

inception, mainly because nobody can really track it or see it.

31:03

It's not tied to your identity.

31:05

It's just a thing that can be traded.

31:08

But I would argue, cash is too.

31:12

Cash is absolutely the same concept.

31:15

That's why in the '70s and '80s movies we saw about the cocaine trade and when the CIA and

31:23

the FBI were bringing in massive amounts of drugs into the United States, you saw everything

31:27

was dealt with in cash.

31:29

So it's the same concept.

31:30

It just became the new cash.

31:31

It was an easy way to conduct it electronically without having to carry around large sums.

31:37

It's why we can't take a flight with more than $10,000 in cash without declaring it

31:42

and letting somebody know that we're leaving with massive amounts of cash.

31:45

So it is used for illicit reasons.

31:49

To kind of tackle the downsides of AI as well, now we're talking a high level from an ethical

31:58

and moral standpoint on these topics.

32:01

So it's not just about the power consumption of it.

32:05

Algorithmic bias, you know, ultimately these models are just giant repositories of things

32:11

that have already been written in the world and on the internet.

32:14

These models are all trained on those datasets and scraped

32:20

images, which are a reflection of our world.

32:25

That means all of those social biases, all those prejudices and stereotypes about gender

32:30

and race and culture and everything else, those models reflect those biases and then

32:38

can recreate them and sometimes even amplify them in their output.

32:43

There are a lot of examples of this.

32:45

You can check the internet for that, especially around like AI powered hiring tools.

32:51

They've specifically shown bias against female candidates.

32:55

Facial recognition systems have lower accuracy in regards to people of color.

33:00

AI models used in the criminal justice system have actually been shown to disproportionately

33:05

flag black defendants as being at high risk of reoffending.

33:10

That leads to discriminatory outcomes and the reinforcement of those inequalities.
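For a sense of how bias claims like these get quantified, one common yardstick is the "four-fifths rule" from US employment-selection guidance: each group's selection rate should be at least 80% of the most-favored group's. A minimal sketch with invented numbers (nothing here is real hiring data):

```python
def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> selection rate per group."""
    totals, selected = {}, {}
    for group, was_selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def impact_ratios(decisions):
    """Each group's rate divided by the best-treated group's rate.
    Ratios under 0.8 flag possible disparate impact (four-fifths rule)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Invented example: a screening model passes 6/10 of group A, 3/10 of group B.
decisions = ([("A", True)] * 6 + [("A", False)] * 4
             + [("B", True)] * 3 + [("B", False)] * 7)
ratios = impact_ratios(decisions)
print(ratios["A"])  # 1.0
print(ratios["B"])  # 0.5, well below the 0.8 threshold
```

The metric is deliberately simple; real audits go much further, but even this ratio is enough to surface the kind of disparities the hiring-tool studies reported.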

33:17

Then there's the all-encompassing and very well publicized misinformation and hallucinations

33:23

that have been well documented; AI models have a tendency to hallucinate.

33:28

Why?

33:29

Well, it's a great question.

33:30

We're asking it to generate something new from what it was trained on.

33:34

That means there's a possibility that whatever it's making up is completely nonfactual and

33:41

nonsensical.

33:42

Well, most of the time it's sensical, but it might not be based in fact.

33:46

It doesn't have a comprehension of the data.

33:49

So if you ask it a question about the speed limit in Wisconsin, it doesn't understand that

33:55

the speed limit in Wisconsin has a specific legislative paper trail.

34:00

You could ask it about the speed limit in Zootopia and it'll tell you something because you're

34:05

asking it a question, right?

34:07

Exactly.

34:08

Exactly.

34:09

There will be some sort of determination.

34:10

Exactly.

34:11

It doesn't know that there's no such place.
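You can make that failure mode concrete with a toy sketch. This hypothetical Python "answerer" is not a language model, but it fails the same way one does: it fills a fluent template from surface features of the prompt, and nothing in it checks whether the place exists.

```python
def confident_answer(place):
    # Pick a plausible-looking number from surface features of the prompt,
    # the way a model picks tokens from learned patterns rather than facts.
    limit = [55, 65, 70, 75][len(place) % 4]
    return f"The highway speed limit in {place} is {limit} mph."

print(confident_answer("Wisconsin"))  # fluent, and may or may not be factual
print(confident_answer("Zootopia"))   # just as fluent, and there is no such place
```

Real models are vastly more sophisticated, but the structural point stands: generation is pattern completion, so fluency is no evidence of truth.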

34:12

Think of it kind of like my nine-year-old, who hasn't learned the power of "I don't know"

34:16

yet.

34:17

You know, like that's the power to shut down a conversation.

34:19

You ask them a question, they say, "I don't know."

34:20

That's a teenage thing.

34:21

But as a nine-year-old, he will answer every question you ask him.

34:24

He just will answer it with total confidence whether he knows it or not, and that's how

34:28

AI responds.

34:29

So hopefully it'll grow out of it before it destroys us all.

34:33

I do that myself now.

34:34

I do that all the time.

34:37

I say if you just say it with confidence, they'll surely believe it.

34:40

Also...

34:41

Go ahead.

34:42

As I was saying, unfortunately, hallucinations didn't originate with AI.

34:46

No, they did not.

34:48

And the truth is, if we're trying to create these language models to sound and be more

34:53

like we are, we hallucinate, we dream, we have ideas that are nonsensical and not based in

35:00

fact.

35:01

But what does this do?

35:02

It means that scammers, people who are nefariously using AI, sound more convincing.

35:08

They sound more real.

35:11

Those letters that we... the emails we used to get from the Zimbabwe government that reminded

35:16

us how well off we were, and how we had magically either won their lottery or something to that

35:25

effect.

35:26

They were hoping for some sort of barrister situation where you had to send them money.

35:33

Those were easily determined to be AI.

35:38

And for a very specific reason...

35:41

I mean, sorry, not easily determined to be AI.

35:44

They were easily sussed out as being fraudulent.

35:48

And why do you think that is?

35:50

Well, because there were spelling errors, there were grammar problems because they were

35:55

clearly from people who didn't speak English as their first language.

35:59

But you know, now with AI, they can use that same script that they had before and run it

36:05

through an AI and say, "Hey, clean this up.

36:07

Make it more formal.

36:08

Make it match American English."

36:09

Those sorts of things.

36:11

And still get away... and more now than ever, they're actually targeting people and getting

36:17

away with it because it looks more legitimate than it did before.

36:21

You know, I remember Kim Komando, who was a popular radio host who helped older people

36:27

with computers back in the day.

36:29

And my grandparents when they were alive loved to listen to her radio show.

36:34

And she would say like, "Here's how you pick out these things and look for grammar mistakes

36:39

and look for spelling errors."

36:41

But in today's scams, there are no spelling errors.

36:45

There are no grammar mistakes.

36:46

It sounds exactly like it's coming from a real person, and they're using proper language.

36:51

You know, they're using barrister instead of lawyer, and they're using, you know, these

36:55

sorts of things.

36:57

Another thing, privacy and data, right?

37:00

When you train these large language models, it requires a colossal amount of information.

37:07

Most of that is scraped from public websites.

37:10

It's scraped from forums like Reddit and Substack and those sorts of things.

37:15

Stack trace, Stack Overflow, lots of stacks.

37:18

And from social media... all of them.

37:22

And it's pulled from social media without the knowledge or consent of the people who created

37:29

that content.

37:31

That raises a lot of privacy concerns, not to mention people's personal stories, their

37:36

sensitive information, pictures of their children, copyrighted artworks, copyrighted music.

37:42

All of these things are ingested into these models.

37:45

So there is a not insignificant risk that models could inadvertently leak, right?

37:53

Or reproduce private data in their responses, or worse, be maliciously coerced into producing

38:01

this information that was used to train them.

38:03

There's also a good bit of accountability and transparency issues in these AI models, right?

38:10

We as consumers don't have the ability to produce an AI model on the equipment that we have

38:14

in our homes.

38:16

They require massive amounts of information.

38:18

Training the average model right now runs right at $22 million.

38:23

You need that amount of money to train one of these models.

38:27

So many of the most powerful of these really are just black boxes, like we were talking

38:32

about earlier with the computational aspects of it.

38:36

But in the informational aspect of it, they're just black boxes.

38:40

Their internal workings are so complex that even the people who do this for a living cannot

38:46

fully explain why a model produced the specific output that you're seeing on your chat.

38:54

Like you said, when model 4.0 was so friendly, and then 5.0 was cold and objective, people

39:00

were like, "What just happened?" and OpenAI is shrugging their shoulders.

39:04

They don't know.

39:06

Exactly.

39:07

And that opaqueness is a real challenge for accountability for a very specific reason.

39:16

When an AI system hurts someone and we're seeing it hurt people, whether by providing

39:23

dangerously incorrect medical advice, causing a self-driving car to crash, or making a biased

39:30

loan decision,

39:32

it's really often unclear who's responsible for that.

39:37

Is it the user who prompted the AI?

39:40

Is it the people who train the AI?

39:43

Is it the company that made the AI available for you to use?

39:47

Or was it the creators of the biased data that were used to train it?

39:54

So without transparency in how these systems make decisions, establishing any

40:01

sort of accountability is almost impossible.

40:05

All right, let's talk about the innovation, Fowl.

40:08

We have built for you a narrative of historical information that presents progress as being

40:15

a driver for additional progress.

40:18

Progress in one area creates the need, which drives innovation in another area.

40:23

And it is a valid one.

40:25

I do want to say that, but it's incomplete.

40:29

Innovation is really not a moral force.

40:35

It's kind of a neutral process.

40:38

It's a tool.

40:39

Gosh, where have you heard that argument before?

40:41

It's a tool, and it can be used for good or bad.

40:46

It just magnifies the effort.

40:51

And that's the point of a tool to begin with: to expand our capabilities further.

40:58

So the final and really the most fundamental critique we can make of ourselves challenges this

41:07

sort of techno-optimistic premise that a high demand for these resources will lead to positive

41:16

and beneficial innovation.

41:18

That is my belief.

41:20

I will say that there is data that actively backs it up.

41:24

Will, is it your belief that you've seen it come to pass in your lifetime?

41:31

I have, but more importantly, I think it's more of just hope that I have to hold on to,

41:35

because the alternative is just the dark ages again.

41:38

Right?

41:39

We've already gone so far at this point.

41:40

The hope is, to completely borrow something entirely unrelated:

41:44

If you're going through hell, keep on going, as they say, right?

41:47

Like the idea is the only way out is through.

41:50

The only way out is through.

41:51

Exactly.

41:52

I think high demand, I think if I had to word it correctly, I would say high demand provides

41:58

a motive for innovation, but it doesn't actually dictate the ethics of the outcome.

42:06

Exactly.

42:08

So demand for cheap textiles in the 19th century drove innovation in factory machinery and

42:16

the steam engine, but it also led to horrific labor conditions, child labor, dangerous urban

42:23

pollution, both in the water and in the air. And the demand for agricultural efficiency

42:31

in the 20th century drove innovation in chemical fertilizers and pesticides, but also led to

42:37

widespread water contamination and ecological damage that we're still dealing with the consequences

42:42

of.

42:44

Historical innovations that power our modern world right now, the steam engine, the internal

42:49

combustion engine, were based on fossil fuels, the extraction and use of which have

42:56

led directly to our current climate crisis, a massive and potentially catastrophic negative

43:03

externality of that progress.

43:05

So a closer look at the consequences of high resource demand kind of reveals this pattern

43:11

of negative outcomes, right?

43:13

That often come along with the advancement of technology.

43:19

The extraction of raw materials, whether they be coal for a steam engine or lithium and

43:24

cobalt for batteries is a very energy intensive and environmentally destructive process.

43:31

It very often leads to really terrible and sometimes irreversible ecological damage.

43:38

I mean, soil degradation, water shortages.

43:41

It pollutes the air and the water both.

43:44

And the less obvious one for you biologists out there is the loss of biodiversity.

43:49

The global pursuit of resources is inherently intertwined with social conflict and inequality.

43:56

The process of material extraction has been linked to severe human rights violations and

44:02

forced displacement of local and indigenous populations and acute health problems for

44:08

the communities that surround these things from contamination.

44:11

We're already looking at the prospect of all of us losing the ability to work in white

44:17

collar jobs at all because of the possibility of an eventual AGI.

44:22

Which makes alignment all that much more important, right?

44:27

Right.

44:28

The economic benefits of the resource extraction flow to multinational corporations and governments

44:37

in developed nations, while the ecological and social costs are actually borne

44:45

by the actual populations of less developed countries.

44:48

According to the UN actually, natural resources play a key role in 40% of intrastate

44:58

conflicts with profits from their sale being used to finance armed militias.

45:05

And look, this is a crucial point in us building up our current moment.

45:15

I mean, this immense demand for energy and materials generated by AI and by cryptocurrency

45:23

could spur development, right, of clean fusion power which effectively would end scarcity

45:33

for energy on our planet as a whole.

45:36

But it could also lead, and if I'm also looking at history, to a giant global grab for resources

45:45

needed to build that new energy infrastructure and the computing hardware that it will power.

45:52

And that outcome is not guaranteed to be positive.

45:56

So it's not going to be determined by like… go ahead.

45:59

It's guaranteed not to be positive for some people, even if it's generally positive for

46:04

the majority of us.

46:05

Some of us will have to pay the cost.

46:08

That is a reality.

46:09

I view it kind of like I view war.

46:13

And look, I'm not a fan of war, but the theory behind it was that a few put up their own

46:20

lives at stake to protect the majority.

46:25

And I don't think it's necessarily right to equate this with war.

46:31

But it's not going to be determined by the technology itself, AI or crypto.

46:37

It's going to be determined by the choices that we make as people regarding the regulation,

46:43

the governance and the ethical deployment of these technologies.

46:48

Right.

46:49

We've got access to a lot of power.

46:51

Now it comes with a responsibility.

46:53

Thanks, Spider-Man.

46:54

And Peter Parker over here.

46:56

Hey, Uncle Ben.

46:57

No, it was Uncle Ben.

46:59

I was about to say it was the uncle.

47:00

It was not Peter Parker.

47:02

So okay.

47:03

So where does that leave us, right?

47:04

In terms of a conclusion, I came across a fun word that I really liked in doing the

47:09

research for this and it was trilemma.

47:11

I mean, it makes sense.

47:12

I heard of dilemma.

47:14

Most of us have heard of dilemma, but trilemma is a fun word that I really enjoy.

47:21

I think it creates this trilemma of innovation and consumption and responsibility.

47:29

So I feel like we're currently standing dead center, and we have to take care of

47:35

all of these things because of the profound implications of the twin revolutions of crypto

47:41

and AI that, regardless of whether you want them, are being put into the real world

47:48

for us to contend with.

47:50

So this podcast has been an effort to show that these technologies aren't like abstract

47:56

concepts, but they're real powerful forces with real world consequences, right?

48:05

Especially with their immense and seemingly endless growth in desire for consumption

48:13

of energy.

48:14

So where we go from here is not like a simple choice between, "Hey, I'm going to embrace

48:21

AI and I'm going to embrace cryptocurrency" or flat out reject it, right?

48:25

And decide to be Amish and go live in Pennsylvania.

48:28

But it's really, it's a very complex navigation of this trilemma.

48:35

And opting out is no guarantee that you'll avoid the consequences; in fact, it may be

48:38

just abdicating your ability to do something about it.

48:41

Exactly.

48:42

So if you were to take each branch of this, right, innovation, we have potential benefits

48:50

of AI and decentralized technology that we talked about earlier.

48:56

Not only that, we're talking about the possibility of increased productivity.

49:00

Now, productivity for what?

49:02

You know, again, this is a tool.

49:04

We get to determine that.

49:06

We've got new forms of economic organization, accelerated scientific discovery and solutions,

49:13

possibly to some of humanity's most pressing problems.

49:16

So if we turn our back on the potential innovation, we would literally be rejecting like a pretty

49:22

powerful engine for future progress.

49:26

The second point is consumption, right?

49:28

Like as we talked about earlier, that innovation comes at a steep environmental cost.

49:34

It doesn't matter how you look at it.

49:35

From the silicon of the chips to the coal that we are apparently now, once

49:42

again, all excited about burning to produce electricity inside of our electrical grid.

49:48

The energy demand of these technologies is already on the scale of whole nations,

49:54

right?

49:55

Like, and growing exponentially.

49:56

That is, I mean, that drives a voracious appetite, not just for that electricity, but for the

50:02

raw materials needed to build the data centers, to build the processors, the new energy infrastructure

50:08

required to sustain them.

50:11

And if we don't instill and enforce a system of checks and balances, it really could threaten

50:18

to undermine our climate goals.

50:21

And I don't mean the ones that our current president has decided to remove us from.

50:28

I mean our individual personal climate goals and also exacerbate the destruction that we're

50:37

already seeing at an environmental level, which brings up the most important part of

50:42

all of this.

50:43

It's about responsibility.

50:45

We cannot have the benefits of innovation without managing the consequences of consumption.

50:52

And those have a tangible effect on everyday people, not just multinational corporations.

50:59

And that requires a distinctly profound sense of responsibility to develop and deploy these

51:09

technologies in a way that is ethical and sustainable, but equitable too.

51:15

Absolutely.

51:17

And that means confronting the risks of the algorithmic bias, the misinformation and yes,

51:24

the illegal use, right?

51:27

And I mean, whatever.

51:29

I guess I personally don't really care if some guy from Montana really needs LSD and

51:34

can figure out a way to get it through the internet.

51:36

That doesn't really affect me that much.

51:40

But it's about ensuring that the pursuit of new energy sources doesn't just recreate all

51:48

of the crappy historical patterns, right?

51:51

Of social and environmental exploitation that often come along with resource extraction.

51:59

And we have real world examples of those.

52:01

Plenty of historical examples.

52:03

Yes.

52:05

But even looking toward the possibilities, when we look at possible hiccups in the road,

52:13

it's pretty clear that the future is not just a predetermined outcome of technological inevitability.

52:20

The pattern of demand driving innovation is powerful.

52:24

I mean, you can't deny that either, but it is not a moral guarantee, right?

52:29

It doesn't mean that it is...

52:31

A scientific fact.

52:32

Right.

52:33

It is morally good.

52:34

It does not mean that.

52:35

It is just a fact.

52:38

So that same demand for electricity that could actually get us there faster for clean fusion

52:46

power could also trigger a destructive scramble for the resources to build fusion reactors.

52:52

That same AI that could cure diseases can also, as we've already seen with Russian propaganda,

52:58

entrench societal biases and completely obliterate public trust.

53:03

So our challenge really is to manage this trilemma, you're welcome, with intention and foresight.

53:10

Right?

53:11

So the path forward is really not just about the breakthroughs that we're seeing from this

53:16

technology.

53:19

And they're cool.

53:20

They're cool.

53:22

The idea of endless clean energy generation is amazing.

53:27

Not to mention computational efficiency of having a constant brain more powerful than

53:33

a human's, continually working on a problem.

53:36

But it demands conscious, deliberate, and international conversations about the kind

53:43

of society we want to build.

53:45

And it has to be one that balances the power of that innovation with the very finite reality

53:52

of our planet and the enduring importance of our collective human values.

53:58

And that is the overlap.

54:03

We hope you enjoyed this podcast here today.

54:06

We worked really hard on coming up with all the information and doing all the research

54:10

to get it to you.

54:11

If you have any questions or comments, you can email us the overlap@fof.foundation.

54:22

You can hit us up and give us a follow over there on Blue Sky, the overlap podcast, as

54:26

well as Mastodon, the overlap podcast.

54:30

You can also check out our website, fof.foundation, and keep up with us there.

54:36

We will see you again soon, folks.

54:39

Well, maybe not see you.

54:41

Maybe we'll hear from you again soon, but you'll definitely hear from us later.

54:45

Thanks, everybody.

54:47

And thank you again, Will, for being here.

54:49

Thanks everybody.

54:50

Bye now.

54:50

[MUSIC]
