1 00:00:00,000 --> 00:00:07,320 If this is where you picked up the podcast today, you are in the wrong place. 2 00:00:07,320 --> 00:00:11,120 This is part two of a two-part episode. 3 00:00:11,120 --> 00:00:15,320 Go back to last week's episode and check that one out and give it a start and come back 4 00:00:15,320 --> 00:00:22,280 right here and join us here at The Overlap as we talk about AI, cryptocurrency, electricity, 5 00:00:22,280 --> 00:00:41,000 and the future of innovation. 6 00:00:41,000 --> 00:00:42,000 All right. 7 00:00:42,000 --> 00:00:47,420 Now welcome back to part two of The Overlap podcast about AI, cryptocurrency, and electricity. 8 00:00:47,420 --> 00:00:51,320 Will, why don't you kick it off where we left off from? 9 00:00:51,320 --> 00:00:52,320 Right. 10 00:00:52,320 --> 00:01:00,760 So, why the hope that these two technologies, although they're very energy hungry, we understand 11 00:01:00,760 --> 00:01:03,040 now, why they're so energy hungry? 12 00:01:03,040 --> 00:01:06,820 How do we come up with this idea that perhaps they could rescue us from the energy scarce 13 00:01:06,820 --> 00:01:12,280 or energy consuming future that we fear and actually give us a positive way out? 14 00:01:12,280 --> 00:01:14,920 I asked that question, I'm actually going to ask you that question. 15 00:01:14,920 --> 00:01:15,920 Are you asking me? 16 00:01:15,920 --> 00:01:18,260 I mean, I have my answer, but. 17 00:01:18,260 --> 00:01:21,320 Well, what we have is we have examples from the past, right? 18 00:01:21,320 --> 00:01:22,320 Yeah. 19 00:01:22,320 --> 00:01:26,260 And the fact that new technologies, although they appear to at first be energy consuming, 20 00:01:26,260 --> 00:01:29,760 actually enable us to create things that are energy producing, right? 21 00:01:29,760 --> 00:01:34,040 The very steam turbines we were just talking about, if we could bury these AI and use them 22 00:01:34,040 --> 00:01:38,640 to heat water and all that, those turbines exist because of the industrial revolution 23 00:01:38,640 --> 00:01:40,780 and the things that that made available. 24 00:01:40,780 --> 00:01:47,180 Our ability to harness steam in engines was what gave us the capacity to generate much 25 00:01:47,180 --> 00:01:51,120 more energy than we actually used ever prior to the industrial revolution. 26 00:01:51,120 --> 00:01:52,440 And while there's. 27 00:01:52,440 --> 00:01:53,440 Yeah, I think. 28 00:01:53,440 --> 00:01:54,440 Go ahead. 29 00:01:54,440 --> 00:01:55,440 Go ahead. 30 00:01:55,440 --> 00:01:57,720 We both do that to each other all the time. 31 00:01:57,720 --> 00:02:01,640 I think that it's important to evaluate this, right? 32 00:02:01,640 --> 00:02:08,280 I think we're at the precipice of a junction in which we're looking at a major categorical 33 00:02:08,280 --> 00:02:10,480 shift in how we do things, right? 34 00:02:10,480 --> 00:02:17,340 For millennia, human progress was limited by the power that could be generated by human 35 00:02:17,340 --> 00:02:19,120 and animal muscle. 36 00:02:19,120 --> 00:02:20,680 Right. 37 00:02:20,680 --> 00:02:25,940 And that was that was the limit, whether that was swinging of an axe or the walking of an 38 00:02:25,940 --> 00:02:27,320 ox. 39 00:02:27,320 --> 00:02:33,320 So the development of the water wheel, for example, produced an increase. 40 00:02:33,320 --> 00:02:34,320 We had it. 
41 00:02:34,320 --> 00:02:40,640 We had it there that we were now able to extend ourselves past our own physical limitations 42 00:02:40,640 --> 00:02:43,960 and the limitations of work animals. 43 00:02:43,960 --> 00:02:45,920 So we have the water wheel, right. 44 00:02:45,920 --> 00:02:48,100 But they were then geographically tethered. 45 00:02:48,100 --> 00:02:49,720 They were tethered to rivers. 46 00:02:49,720 --> 00:02:52,640 If there was no river in the area, there could be no water wheel. 47 00:02:52,640 --> 00:02:53,640 Right. 48 00:02:53,640 --> 00:02:57,200 And then you have the steam engine like you were talking about. 49 00:02:57,200 --> 00:03:03,800 That thing by itself transformed and multiplied the power available to a society by 50 00:03:03,800 --> 00:03:06,200 about 600 times. 51 00:03:06,200 --> 00:03:11,960 So a steam engine had about 600 times the maximum amount of power that a human being 52 00:03:11,960 --> 00:03:13,480 or an animal could produce. 53 00:03:13,480 --> 00:03:14,480 Exactly. 54 00:03:14,480 --> 00:03:16,380 And that's good. 55 00:03:16,380 --> 00:03:22,020 But because the coal used to power the steam engines could be transported almost 56 00:03:22,020 --> 00:03:23,600 anywhere, right. 57 00:03:23,600 --> 00:03:28,660 Now you can have a steam engine, with energy production completely decoupled 58 00:03:28,660 --> 00:03:33,180 from geography for the first time in the history of the modern world. 59 00:03:33,180 --> 00:03:38,400 Which is the industrial revolution, which gave way to factories and railroads and mass 60 00:03:38,400 --> 00:03:40,400 production. 61 00:03:40,400 --> 00:03:41,540 Exactly. 62 00:03:41,540 --> 00:03:46,280 And now we have access to a technology that can potentially one day simulate 63 00:03:46,280 --> 00:03:52,140 new types of engines and new types of energy retrieving and generating devices that can 64 00:03:52,140 --> 00:03:57,720 potentially unlock more power than we can even conceive of now to not only feed the 65 00:03:57,720 --> 00:04:03,520 future AI and future cryptocurrency devices, but also many other things that we may not 66 00:04:03,520 --> 00:04:05,480 have even dreamt of yet. 67 00:04:05,480 --> 00:04:08,520 Perhaps those starships that you were talking about before. 68 00:04:08,520 --> 00:04:14,560 Yeah, I think I think one way that crypto is similar in this scenario is that it decouples 69 00:04:14,560 --> 00:04:18,800 our system of trading from any one country. 70 00:04:18,800 --> 00:04:19,800 Right. 71 00:04:19,800 --> 00:04:26,480 And allows money to move freely across borders without needing middlemen. 72 00:04:26,480 --> 00:04:28,460 We're using a network in place of that. 73 00:04:28,460 --> 00:04:33,740 So we're increasing our ability to exchange goods and services. 74 00:04:33,740 --> 00:04:38,620 And then on the AI side, I see it as, you know, in the same way that the steam engine was 75 00:04:38,620 --> 00:04:49,440 an answer to overcoming the limitations of human ability physically, AI is now the 76 00:04:49,440 --> 00:04:51,560 hope, and I don't know that it does now. 77 00:04:51,560 --> 00:04:56,200 I don't think it does, but it is the hope that it can now extend our mental capacity and our 78 00:04:56,200 --> 00:05:01,900 mental abilities, not just by 600 times, but even more than 600 times. 
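As a rough sanity check on that 600-times figure, and assuming a sustained human power output of about 75 W, which is a common textbook estimate rather than a number quoted in the episode:

\[ P_{\text{steam}} \;\approx\; 600 \times P_{\text{human}} \;\approx\; 600 \times 75\,\mathrm{W} \;\approx\; 45\,\mathrm{kW} \;\approx\; 60\ \text{horsepower} \]

which is in the range of a large stationary steam engine of that era, so the multiplier quoted above is at least the right order of magnitude.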
79 00:05:01,900 --> 00:05:10,080 I mean, if we had AGI, we could put AGI to building its own ASI and we would potentially 80 00:05:10,080 --> 00:05:13,000 see an even bigger savings. 81 00:05:13,000 --> 00:05:14,000 But go ahead. 82 00:05:14,000 --> 00:05:15,000 Go ahead. 83 00:05:15,000 --> 00:05:16,000 I didn't have anything else. 84 00:05:16,000 --> 00:05:18,280 I was going to let you go with that thought. 85 00:05:18,280 --> 00:05:21,760 Oh, no, I think I lost my train of thought there. 86 00:05:21,760 --> 00:05:27,620 Well, I think, I mean, to your point, like this idea that by setting free the value system, 87 00:05:27,620 --> 00:05:28,620 right. 88 00:05:28,620 --> 00:05:33,820 The thing that the sort of to use the analogy or to torture the analogy a little bit, the 89 00:05:33,820 --> 00:05:37,720 engine and the input of our economic systems. 90 00:05:37,720 --> 00:05:43,000 By setting that free, we can enable people to utilize the technology and the intelligence 91 00:05:43,000 --> 00:05:48,000 that we hope that AGI will eventually generate to take us to that level. 92 00:05:48,000 --> 00:05:53,280 Like you said, take us far beyond anything we've conceived and way beyond 600 times our 93 00:05:53,280 --> 00:05:54,280 output productivity. 94 00:05:54,280 --> 00:05:59,580 Because I think you could probably argue that GPTs and LLMs have already done that in some, 95 00:05:59,580 --> 00:06:04,960 to some extent, with our thought output or content outputs, right? 96 00:06:04,960 --> 00:06:09,840 I think there are studies about how much output there is from GPTs even currently. 97 00:06:09,840 --> 00:06:12,480 And we haven't even begun to scratch the surface of what they're capable of. 98 00:06:12,480 --> 00:06:13,480 Yeah. 99 00:06:13,480 --> 00:06:16,760 Because I find in general our data tends to be reflexive, you know. 100 00:06:16,760 --> 00:06:17,760 Right. 101 00:06:17,760 --> 00:06:21,380 So I mean, that's kind of, you know, the history though, I guess to dive a little bit more 102 00:06:21,380 --> 00:06:25,760 into that or to elaborate a little bit more on that is just that, you know, the new technologies 103 00:06:25,760 --> 00:06:28,020 enable things that were never thought of, right. 104 00:06:28,020 --> 00:06:32,780 So the harnessing of electricity or the harnessing of a steam turbine gave the possibility of 105 00:06:32,780 --> 00:06:37,520 steady enough electricity or steady enough power output to use incandescent light bulbs, 106 00:06:37,520 --> 00:06:38,520 right. 107 00:06:38,520 --> 00:06:41,680 Which allowed us like to, which basically freed up the 24 hour world that we all live 108 00:06:41,680 --> 00:06:42,680 in now, right. 109 00:06:42,680 --> 00:06:46,740 Whereas before you were limited to, you know, how many candles you could put around you 110 00:06:46,740 --> 00:06:51,080 and, you know, utilize at a given time and there was limitation to how much light that 111 00:06:51,080 --> 00:06:52,080 generated. 112 00:06:52,080 --> 00:06:56,700 You know, eventually now we have skyscrapers full of incandescent light bulbs and all sorts 113 00:06:56,700 --> 00:07:00,380 of other, you know, technologies that were never even dreamt up before the industrial 114 00:07:00,380 --> 00:07:02,220 revolution. 115 00:07:02,220 --> 00:07:06,600 The hope is that that will repeat itself but even on a much larger scale, an incomprehensibly 116 00:07:06,600 --> 00:07:09,520 larger scale than it did the first time around. 117 00:07:09,520 --> 00:07:10,520 Yeah. 
118 00:07:10,520 --> 00:07:15,540 I think another example of that is sort of like the fossil fuel generation industry, 119 00:07:15,540 --> 00:07:16,540 right. 120 00:07:16,540 --> 00:07:21,600 I think none of us disagree that fossil burning and consumption of fossil fuels is bad for 121 00:07:21,600 --> 00:07:22,600 the planet. 122 00:07:22,600 --> 00:07:24,040 It's bad for our overall air quality. 123 00:07:24,040 --> 00:07:28,920 It affects our ability to breathe and to succeed in the far future. 124 00:07:28,920 --> 00:07:36,700 Maybe not as far as I think but that persistent demand for energy drove relentless technological 125 00:07:36,700 --> 00:07:38,340 advancement. 126 00:07:38,340 --> 00:07:43,220 So over the past century you've seen improvements in extraction techniques, extraction technologies, 127 00:07:43,220 --> 00:07:47,920 horizontal drilling in wells, right, and look I'm not a fan of fracking. 128 00:07:47,920 --> 00:07:52,660 I, especially where I live, I think it actively hurts me. 129 00:07:52,660 --> 00:07:57,060 I've sat through earthquakes, and I had never, never experienced an earthquake before in 130 00:07:57,060 --> 00:08:01,300 the middle of the United States but because of fracking we're seeing those things increase. 131 00:08:01,300 --> 00:08:05,140 The US coal miner, for instance, increased ninefold. 132 00:08:05,140 --> 00:08:08,720 Their production, their output increased ninefold in 50 years. 133 00:08:08,720 --> 00:08:17,980 So these innovations have always pushed back the predictions of resource scarcity by giving 134 00:08:17,980 --> 00:08:24,040 us a higher output and driving innovation through demonstrating that reserves 135 00:08:24,040 --> 00:08:32,200 are really not a fixed geological quantity but a dynamic sort of economic and technological 136 00:08:32,200 --> 00:08:33,200 one. 137 00:08:33,200 --> 00:08:39,180 Right, and essentially what it amounts to as I see it is that humanity is hurtling down 138 00:08:39,180 --> 00:08:42,500 the road at an ever-increasing speed. 139 00:08:42,500 --> 00:08:43,820 The question is where are we headed towards? 140 00:08:43,820 --> 00:08:47,140 Paradise and utopia or are we headed towards destruction? 141 00:08:47,140 --> 00:08:53,120 And the signs change pretty frequently depending on what you look around and see as to which 142 00:08:53,120 --> 00:08:56,040 road we're on or whether we're on the road that could lead to both. 143 00:08:56,040 --> 00:09:00,140 And the decisions we make today are probably going to determine where we end up. 144 00:09:00,140 --> 00:09:01,860 Yeah, 100%. 145 00:09:01,860 --> 00:09:05,300 And technology's just going to accelerate that speed and multiply that effort. 146 00:09:05,300 --> 00:09:10,660 Yeah, and I mean I guess sometimes I think we do have to kind of sit back and say to 147 00:09:10,660 --> 00:09:11,660 what end. 148 00:09:11,660 --> 00:09:12,660 Right. 149 00:09:12,660 --> 00:09:16,720 And I think that reason differs for everyone. 150 00:09:16,720 --> 00:09:20,220 I think I have my own reasons for wanting to see it. 151 00:09:20,220 --> 00:09:26,740 I think a couple of ways that we've already seen that I think are really good examples 152 00:09:26,740 --> 00:09:37,860 already with text generation is the ability to use specific AI models to mimic the replication 153 00:09:37,860 --> 00:09:47,580 and synthesis of proteins that can produce new chemicals, new cures; it helps to 154 00:09:47,580 --> 00:09:49,040 fight diseases. 
155 00:09:49,040 --> 00:09:53,320 Now I think that there's pluses and minuses to that both. 156 00:09:53,320 --> 00:10:01,340 Maybe, through this sort of protein synthesis, we can overcome and help eventually cure cancer, 157 00:10:01,340 --> 00:10:07,800 but at the same time that also might open up a whole new world of toxic chemicals that 158 00:10:07,800 --> 00:10:10,380 people can use for biological warfare. 159 00:10:10,380 --> 00:10:11,380 Right? 160 00:10:11,380 --> 00:10:16,380 I think that there are tradeoffs that we have which is, I mean obviously it's a great reason 161 00:10:16,380 --> 00:10:22,300 for regulation, which I'm pro-regulation around these sorts of things for that very reason. 162 00:10:22,300 --> 00:10:29,280 But it's like saying, look, we could find the cure to every known world illness by employing 163 00:10:29,280 --> 00:10:30,840 an AI model to do it. 164 00:10:30,840 --> 00:10:34,020 Yes, it takes a lot of electricity, but we could save lives in the long run. 165 00:10:34,020 --> 00:10:40,580 So should we completely ignore our ability to cure these diseases because it uses a lot 166 00:10:40,580 --> 00:10:44,020 of electricity and that could eventually kill the planet as a whole? 167 00:10:44,020 --> 00:10:50,100 Or do we say, no, why don't we point it at those problems and develop those technologies 168 00:10:50,100 --> 00:10:56,460 to solve those problems as they arise at a speed faster than human thinking? 169 00:10:56,460 --> 00:10:57,460 Right. 170 00:10:57,460 --> 00:10:59,500 And who do we allow to utilize those computers? 171 00:10:59,500 --> 00:11:03,740 Because on the one hand, there's the worry about a rogue actor who might discover some 172 00:11:03,740 --> 00:11:07,260 sort of terrible new virus and unleash it on the world. 173 00:11:07,260 --> 00:11:10,420 But then on the other hand, there are pharmaceutical companies who know that they'll be putting 174 00:11:10,420 --> 00:11:12,700 themselves out of business if they cure all the illnesses. 175 00:11:12,700 --> 00:11:17,140 And are they going to be able to resist the temptation to create new illnesses in their 176 00:11:17,140 --> 00:11:18,140 pipeline? 177 00:11:18,140 --> 00:11:21,400 Hey, that's why I'm a socialist, I'm just saying, you know what I mean? 178 00:11:21,400 --> 00:11:22,720 No, right there, that's reasonable. 179 00:11:22,720 --> 00:11:26,540 A democratic socialist, not by force, but through democratic means. 180 00:11:26,540 --> 00:11:29,960 I think one of the things that I wanted to point out, and look, there's detriments to 181 00:11:29,960 --> 00:11:34,180 this and I hope we can poke holes in it. 182 00:11:34,180 --> 00:11:43,060 And what I think you're saying, Will, is that as we get down this road that there kind of 183 00:11:43,060 --> 00:11:44,700 is no way back from, right? 184 00:11:44,700 --> 00:11:49,740 You wouldn't be able to just reel AI back in and reel cryptocurrency back in. 185 00:11:49,740 --> 00:11:55,860 Obviously, in cryptocurrency's case, part of the design was such that it could not be 186 00:11:55,860 --> 00:12:01,800 reeled in by any single entity or any single force. 187 00:12:01,800 --> 00:12:07,340 But in the sense of AI, I mean, there are ways of already putting measures in place 188 00:12:07,340 --> 00:12:17,300 to limit harm. I think what we have also done, as things have come around, is increased, A, 189 00:12:17,300 --> 00:12:18,300 our knowledge, right? 
190 00:12:18,300 --> 00:12:22,620 Like scientifically, when we were putting coal out in the world, we didn't think, "Oh, 191 00:12:22,620 --> 00:12:26,940 this is going to cause global climate change." 192 00:12:26,940 --> 00:12:31,020 We basically just thought we have a need and that is we need more energy. 193 00:12:31,020 --> 00:12:36,520 We need to put light bulbs in everybody's houses so we could take them away from having 194 00:12:36,520 --> 00:12:42,440 only eight hours of every day with daylight or 12 hours of every day with daylight. Now 195 00:12:42,440 --> 00:12:48,980 we don't need people dying of useless diseases, but we also don't need to trade our ecological 196 00:12:48,980 --> 00:12:57,500 wealth for curing diseases because either way, we're going to die. 197 00:12:57,500 --> 00:12:58,860 Which one's going to be the worst? 198 00:12:58,860 --> 00:13:02,520 And I think that doesn't have to be a decision that we live with every day. 199 00:13:02,520 --> 00:13:05,260 I mean, sorry, that we live with one time. 200 00:13:05,260 --> 00:13:11,740 I think it's something that we can live with every day and say, "Look, are we getting closer?" 201 00:13:11,740 --> 00:13:17,220 That to me is kind of the epitome of science is collecting that data and making meaningful 202 00:13:17,220 --> 00:13:22,600 observations, and then adjusting to those observations. 203 00:13:22,600 --> 00:13:24,460 Right, it's feedback. 204 00:13:24,460 --> 00:13:30,160 It's using our feedback and using it to take us to the next step and make the next decision, 205 00:13:30,160 --> 00:13:31,340 the scientific method. 206 00:13:31,340 --> 00:13:36,980 So let's talk a little bit about some of the, and even before AI and cryptocurrency, we've 207 00:13:36,980 --> 00:13:44,700 had a need of wanting and needing to generate more and more and more electricity as we increase 208 00:13:44,700 --> 00:13:47,820 our technology, right? 209 00:13:47,820 --> 00:13:50,460 The current power grid is in a miserable state. 210 00:13:50,460 --> 00:13:51,460 We know that. 211 00:13:51,460 --> 00:13:54,100 We need lots of infrastructure improvements. 212 00:13:54,100 --> 00:13:59,540 And we're seeing major swaths of Texas being browned out at certain times. 213 00:13:59,540 --> 00:14:06,100 They talk about blackouts in California, and we've seen demand-related problems. 214 00:14:06,100 --> 00:14:09,180 And I think it's more than just one thing. 215 00:14:09,180 --> 00:14:16,500 First of all, I think that it's because we generate electricity for money, we create false 216 00:14:16,500 --> 00:14:17,700 scarcity. 217 00:14:17,700 --> 00:14:23,200 And if we don't create a whole lot of generation, if we don't have a lot of means of generating 218 00:14:23,200 --> 00:14:27,240 electricity, we keep the costs higher, right? 219 00:14:27,240 --> 00:14:30,180 Because then we keep the threat of losing it that much. 220 00:14:30,180 --> 00:14:38,460 And as we increase not just these technologies, but as the temperature rises globally, there 221 00:14:38,460 --> 00:14:41,300 will be an even greater need for what? 222 00:14:41,300 --> 00:14:46,560 Temperature control, for air conditioning, for heating, in incredibly cold winters and 223 00:14:46,560 --> 00:14:49,740 air conditioning in hot summers. 224 00:14:49,740 --> 00:14:56,340 That demand is going to increase regardless, as long as we continue to increase as a species. 225 00:14:56,340 --> 00:14:57,340 We've agreed. 
226 00:14:57,340 --> 00:14:59,420 We're not getting any further away from utilizing that energy. 227 00:14:59,420 --> 00:15:03,500 That's where we're headed, is more and more consumption. 228 00:15:03,500 --> 00:15:08,140 And the hope is that these technologies will allow us to keep pushing the limits further 229 00:15:08,140 --> 00:15:11,980 and further while not destroying ourselves in the process. 230 00:15:11,980 --> 00:15:12,980 Yeah. 231 00:15:12,980 --> 00:15:18,060 So let's talk about some of the advances that we've already seen in the current electricity 232 00:15:18,060 --> 00:15:22,460 and power generation space that are sort of driving innovation. 233 00:15:22,460 --> 00:15:29,660 I think one thing that has been kind of flip-flopped back and forth and has somehow been made political 234 00:15:29,660 --> 00:15:36,700 is something like nuclear power generation. 235 00:15:36,700 --> 00:15:41,700 Ultimately, it's terrible, but it really does come down to steam engines. 236 00:15:41,700 --> 00:15:48,980 And nuclear power generation facilities are essentially just using nuclear power to boil 237 00:15:48,980 --> 00:15:55,020 water because it has a greater output to input ratio. 238 00:15:55,020 --> 00:15:58,820 And then steam engines turn and that generates electricity. 239 00:15:58,820 --> 00:16:03,500 We're still on the same standard, right? 240 00:16:03,500 --> 00:16:11,780 But it's incredibly difficult to enrich uranium, to find uranium, and it's also very dangerous 241 00:16:11,780 --> 00:16:12,780 to mine it. 242 00:16:12,780 --> 00:16:15,700 It's a very dirty process, both on the front end and the back end. 243 00:16:15,700 --> 00:16:21,220 And then there's also the problems, obviously, Chernobyl, if there's some sort of radioactive 244 00:16:21,220 --> 00:16:25,500 chemical spill in this scenario or radioactive contamination. 245 00:16:25,500 --> 00:16:26,500 Right. 246 00:16:26,500 --> 00:16:31,620 But we're able to hopefully utilize these, again, these advancing technologies to offset 247 00:16:31,620 --> 00:16:34,520 those dangers to give us new safety procedures, new development. 248 00:16:34,520 --> 00:16:40,020 There's a lot of ways, and they're also advising us or simulating these sorts of reactions 249 00:16:40,020 --> 00:16:44,100 and telling us how to make them safer, how to make them more efficient, and how to increase 250 00:16:44,100 --> 00:16:48,860 the output without endangering ourselves or blowing ourselves into oblivion. 251 00:16:48,860 --> 00:16:49,860 Yeah. 252 00:16:49,860 --> 00:16:53,700 And there's also been alternatives proposed, right? 253 00:16:53,700 --> 00:17:00,220 There's two approaches when you're faced with the reality of the dangers of a particular 254 00:17:00,220 --> 00:17:01,420 type of energy generation. 255 00:17:01,420 --> 00:17:08,980 In the case of nuclear, yeah, nuclear waste and nuclear spills and enrichment of nuclear 256 00:17:08,980 --> 00:17:11,860 material is dangerous, inherently. 257 00:17:11,860 --> 00:17:13,980 What can we do to overcome that? 258 00:17:13,980 --> 00:17:20,660 And you see that has already driven productivity gains in energy. 259 00:17:20,660 --> 00:17:26,160 China, this year, brought on their first thorium nuclear reactor. 260 00:17:26,160 --> 00:17:30,620 Now thorium is a little bit easier to cultivate, to enrich. 261 00:17:30,620 --> 00:17:33,820 I don't think the returns are as high, but the dangers are lower. 262 00:17:33,820 --> 00:17:36,140 So they're bringing on this thorium. 
263 00:17:36,140 --> 00:17:38,500 From what I understand, thorium is still pretty dirty. 264 00:17:38,500 --> 00:17:42,860 It's similar to plutonium, but not as much so. 265 00:17:42,860 --> 00:17:45,740 Again, a trade off, right? 266 00:17:45,740 --> 00:17:51,100 For additional progress, you got to have a little bit of malaise. 267 00:17:51,100 --> 00:17:56,220 We've made significant, in the past three years, we've made a significant movement toward 268 00:17:56,220 --> 00:17:58,620 nuclear fusion. 269 00:17:58,620 --> 00:18:06,180 For the first time ever, I believe China, and then it was reproduced in California, 270 00:18:06,180 --> 00:18:17,460 there was a nuclear fusion reaction that was for the first time ever a net gain of energy. 271 00:18:17,460 --> 00:18:23,100 So it put out more than it required, more than it consumed, which is enormous. 272 00:18:23,100 --> 00:18:31,880 I mean, that's massive because in a fusion reaction, you're not going to have the actual, 273 00:18:31,880 --> 00:18:33,380 the explosive part, right? 274 00:18:33,380 --> 00:18:36,980 If you think about it, it goes inward. 275 00:18:36,980 --> 00:18:40,580 It implodes as opposed to exploding. 276 00:18:40,580 --> 00:18:47,380 You're joining two atoms together rather than exploding them apart. 277 00:18:47,380 --> 00:18:50,700 Which again, is going to, I mean, it could be massive. 278 00:18:50,700 --> 00:18:58,360 If we could get nuclear fusion to a point where the net gains are just interminable, 279 00:18:58,360 --> 00:19:02,440 you could literally put a fusion reactor, in terms of safety, you could put a fusion 280 00:19:02,440 --> 00:19:04,700 reactor at everybody's house. 281 00:19:04,700 --> 00:19:08,480 It could be just as common as a hot water heater in 30 years. 282 00:19:08,480 --> 00:19:11,500 I don't know why you'd want to heat hot water, but. 283 00:19:11,500 --> 00:19:15,820 I'm looking for the Mr. Fusion, you know, from Back to the Future, where you just throw 284 00:19:15,820 --> 00:19:20,620 your trash in the engine in your car, and it generates all the power you need. 285 00:19:20,620 --> 00:19:24,740 But realistically, that's not far from what we're moving toward. 286 00:19:24,740 --> 00:19:25,740 Yeah. 287 00:19:25,740 --> 00:19:30,180 But what has driven that progress? 288 00:19:30,180 --> 00:19:31,180 The need for power, right? 289 00:19:31,180 --> 00:19:32,180 The need for more power. 290 00:19:32,180 --> 00:19:33,660 The necessity for that electricity. 291 00:19:33,660 --> 00:19:34,660 Exactly. 292 00:19:34,660 --> 00:19:35,660 Now... 293 00:19:35,660 --> 00:19:39,560 Thus, the source and the potential solution to our problem. 294 00:19:39,560 --> 00:19:40,560 Exactly. 295 00:19:40,560 --> 00:19:45,540 Now, I'm not saying fusion is necessarily, but it could end the scarcity of electricity. 296 00:19:45,540 --> 00:19:51,620 But like you said earlier about pharmaceutical companies not wanting to find cures for, 297 00:19:51,620 --> 00:19:58,940 you know, cancer and things of this nature, there are also gigantic lobbies, gigantic, 298 00:19:58,940 --> 00:20:10,140 billionaire, trillionaire lobbies in the energy creation and energy dispersion space that 299 00:20:10,140 --> 00:20:13,940 absolutely would not want that technology to get into the hands of the average human 300 00:20:13,940 --> 00:20:14,940 being. 301 00:20:14,940 --> 00:20:15,940 That's right. 
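For reference on the net-gain fusion result mentioned above: these results are usually quoted as a target gain factor, the ratio of fusion energy released to the driver energy delivered to the fuel. Using the widely reported figures from the National Ignition Facility's late-2022 shot in California (roughly 2.05 MJ of laser energy in, about 3.15 MJ of fusion energy out), a rough calculation, not taken from the episode, looks like:

\[ Q \;=\; \frac{E_{\text{fusion out}}}{E_{\text{driver in}}} \;\approx\; \frac{3.15\ \mathrm{MJ}}{2.05\ \mathrm{MJ}} \;\approx\; 1.5 \]

One caveat baked into that definition: the input counts only the laser energy that reached the target, not the hundreds of megajoules the facility drew from the wall, so it is a net gain of energy at the target rather than net electricity onto a grid, which is consistent with the later point that fusion is still nowhere near being able to scale.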
302 00:20:15,940 --> 00:20:22,300 I think it's fortunate in some sense that OpenAI may be worth more than ExxonMobil, 303 00:20:22,300 --> 00:20:28,840 once it's publicly traded, in the sense of where that economic energy flows. 304 00:20:28,840 --> 00:20:30,780 And where it doesn't flow, right? 305 00:20:30,780 --> 00:20:38,420 Now, I would think that somebody like Sam Altman and OpenAI might be incentivized to 306 00:20:38,420 --> 00:20:41,880 invest in that sort of thing. 307 00:20:41,880 --> 00:20:45,540 But if supposedly, and this is what they tell us about capitalism, right? 308 00:20:45,540 --> 00:20:47,820 The necessity drives the innovation. 309 00:20:47,820 --> 00:20:55,540 So if OpenAI has a necessity for unlimited amounts of power, they have no choice but 310 00:20:55,540 --> 00:21:00,580 to invest in things that will continue to allow them to consume that power. 311 00:21:00,580 --> 00:21:05,740 Then they will invest in the clean creation of it. 312 00:21:05,740 --> 00:21:07,740 And hopefully it trickles down to us. 313 00:21:07,740 --> 00:21:09,700 Now, we know that doesn't happen. 314 00:21:09,700 --> 00:21:16,820 That's why I think we as a society should say, let's fund, let's self-fund as a country, 315 00:21:16,820 --> 00:21:18,620 as a nation, as a world. 316 00:21:18,620 --> 00:21:23,460 Let's figure out how to self-fund fusion so that we all can have a fusion reactor. 317 00:21:23,460 --> 00:21:26,980 Everybody's car is powered by fusion and we're driving around in the cleaner world because 318 00:21:26,980 --> 00:21:32,580 we're not burning coal, we're not burning gas, and we're not burning diesel. 319 00:21:32,580 --> 00:21:34,140 And then we own the means. 320 00:21:34,140 --> 00:21:35,300 So then what do you do? 321 00:21:35,300 --> 00:21:40,420 Now, in my opinion, I don't have a problem with private companies making money by generating 322 00:21:40,420 --> 00:21:44,940 electricity and putting it into a nationally owned grid. 323 00:21:44,940 --> 00:21:50,020 I believe that we as a democratic socialist should own the grid as a whole and we should 324 00:21:50,020 --> 00:21:58,620 also invest as a people in reactors of whatever sort, I'm even fine with a nuclear reactor, like 325 00:21:58,620 --> 00:22:03,920 a fission reactor to generate electricity because that is the best technology that we 326 00:22:03,920 --> 00:22:06,420 currently have at a large scale. 327 00:22:06,420 --> 00:22:07,900 Thorium is still not at a large scale. 328 00:22:07,900 --> 00:22:15,060 It's still being test bedded in China and fusion is still not even remotely in the realm 329 00:22:15,060 --> 00:22:17,260 of being able to scale. 330 00:22:17,260 --> 00:22:26,740 So we should be in the generation business because we're seeing rising power costs across 331 00:22:26,740 --> 00:22:31,500 the nation now and they're saying it's because of the rising cost of fuel. 332 00:22:31,500 --> 00:22:39,300 But they're using coal and they're using diesel to drive generators, to generate electricity. 333 00:22:39,300 --> 00:22:44,060 And they're not going to have a reason to change from those things if they can just 334 00:22:44,060 --> 00:22:47,880 pass the costs on to the people who have to pay the electricity bills. 335 00:22:47,880 --> 00:22:53,060 Right, as long as they can keep up, they'll continue to burn the most available and the 336 00:22:53,060 --> 00:22:55,360 most profitable source for them. 337 00:22:55,360 --> 00:22:56,360 Exactly. 
338 00:22:56,360 --> 00:23:04,600 So I will say it is my position that innovation will be driven by the need in this space. 339 00:23:04,600 --> 00:23:13,420 So we as a show would be remiss if we didn't talk about the very real, very raw reality 340 00:23:13,420 --> 00:23:19,780 of how much this sucks in the everyday for specific people. 341 00:23:19,780 --> 00:23:26,140 Grok has built a giant AI farm in Georgia. 342 00:23:26,140 --> 00:23:32,540 OpenAI, no, Meta, Meta is building a giant AI farm in Louisiana. 343 00:23:32,540 --> 00:23:34,460 Because why? 344 00:23:34,460 --> 00:23:36,260 Low cost of energy. 345 00:23:36,260 --> 00:23:37,260 Right? 346 00:23:37,260 --> 00:23:39,860 Relative to the rest of the country. 347 00:23:39,860 --> 00:23:46,200 And it's putting people out of house and home, obviously, by scooping up land. 348 00:23:46,200 --> 00:23:47,540 Nobody wants to live near those things. 349 00:23:47,540 --> 00:23:53,820 They're loud, they're hot, and they're hard to be around. 350 00:23:53,820 --> 00:23:55,860 Right. 351 00:23:55,860 --> 00:23:59,580 And they're not really approved by the people. 352 00:23:59,580 --> 00:24:04,900 Now obviously, who would want, who would sign up to have an AI farm in their backyard? 353 00:24:04,900 --> 00:24:08,800 Nobody, unless they were somehow benefiting from it. 354 00:24:08,800 --> 00:24:13,740 So I think it's important in these scenarios to use legislation to say you can build an 355 00:24:13,740 --> 00:24:18,580 AI farm, but you also have to build a power generation plant, or you have to build a power 356 00:24:18,580 --> 00:24:28,380 generation facility and contribute, not just remove. 357 00:24:28,380 --> 00:24:31,440 But the real fact is, exactly. 358 00:24:31,440 --> 00:24:35,500 And the municipalities are saying, yeah, come on in, we'll give you tax breaks even. 359 00:24:35,500 --> 00:24:42,300 The problem is, those costs get passed through, sent down to the average people who live in 360 00:24:42,300 --> 00:24:43,300 that community. 361 00:24:43,300 --> 00:24:45,260 And it's wrong, it's not right. 362 00:24:45,260 --> 00:24:51,260 But I want to talk a little bit from a perspective of skepticism, right? 363 00:24:51,260 --> 00:24:55,540 Let's talk about a little bit of cryptocurrency. 364 00:24:55,540 --> 00:24:57,060 Right? 365 00:24:57,060 --> 00:25:07,540 I think there are some practical and ethical challenges of crypto. 366 00:25:07,540 --> 00:25:09,100 Do you have any opinions on these? 367 00:25:09,100 --> 00:25:10,100 We'll see. 368 00:25:10,100 --> 00:25:11,540 I'll ask you too specifically. 369 00:25:11,540 --> 00:25:17,060 I was talking to the audience, but I also am talking to you. 370 00:25:17,060 --> 00:25:20,560 So first of all, I think that here's why crypto sucks. 371 00:25:20,560 --> 00:25:21,560 Right? 372 00:25:21,560 --> 00:25:22,560 Volatility. 373 00:25:22,560 --> 00:25:26,860 Crypto, I said the spot price today was 111,000. 374 00:25:26,860 --> 00:25:28,740 About 112,000. 375 00:25:28,740 --> 00:25:33,500 It was 127,000, 123,000 two weeks ago. 376 00:25:33,500 --> 00:25:38,780 So if you own one Bitcoin, in the last couple of weeks, you've lost $12,000. 377 00:25:38,780 --> 00:25:39,780 $15,000. 378 00:25:39,780 --> 00:25:42,540 If that's your 401k, then that could be pretty scary. 379 00:25:42,540 --> 00:25:43,540 Exactly. 380 00:25:43,540 --> 00:25:49,140 And guess who just legalized external sources inside a 401k. 381 00:25:49,140 --> 00:25:50,140 Yeah. 
382 00:25:50,140 --> 00:25:54,100 So I think that we're not talking about just volatility, guys. 383 00:25:54,100 --> 00:26:01,740 We're not talking about like the index goes up a point or two and your 401k gains $12 384 00:26:01,740 --> 00:26:04,260 one day and $127 the next or whatever. 385 00:26:04,260 --> 00:26:08,380 We're talking about extreme price volatility. 386 00:26:08,380 --> 00:26:12,940 That kind of limits cryptocurrency's usefulness as a stable medium of exchange. 387 00:26:12,940 --> 00:26:16,100 They've tried to fix this by introducing stablecoins. 388 00:26:16,100 --> 00:26:21,700 The US Mint is actually talking about a USDC, US dollar coin that's tethered to the dollar. 389 00:26:21,700 --> 00:26:27,420 There's also Tether, which is another cryptocurrency that's directly tied to the value of the US 390 00:26:27,420 --> 00:26:28,420 dollar. 391 00:26:28,420 --> 00:26:33,000 But it's an asset that can fluctuate by double digit percentages. 392 00:26:33,000 --> 00:26:37,300 We're talking 30, 40% in a single day. 393 00:26:37,300 --> 00:26:40,480 And that's really impractical for commercial transactions. 394 00:26:40,480 --> 00:26:45,540 If you take a hundred bucks and you buy a hundred dollars worth of Bitcoin, that's not 395 00:26:45,540 --> 00:26:46,540 a lot of Bitcoin. 396 00:26:46,540 --> 00:26:50,180 It's about a thousandth of a Bitcoin, right? 397 00:26:50,180 --> 00:26:54,840 And that is worth a hundred dollars when you buy it. 398 00:26:54,840 --> 00:26:59,540 At the end of the day, that hundred dollars could only have the spending power of $40 when 399 00:26:59,540 --> 00:27:01,540 you're talking about that kind of volatility. 400 00:27:01,540 --> 00:27:09,340 So unlike payments made with a credit card or a debit card, those sorts of crypto transactions 401 00:27:09,340 --> 00:27:11,580 have no consumer protections either. 402 00:27:11,580 --> 00:27:17,340 So if you get defrauded of hundreds of thousands of dollars of cryptocurrency, 403 00:27:17,340 --> 00:27:20,940 no, ain't nobody can help you. 404 00:27:20,940 --> 00:27:21,940 The fraudsters have those. 405 00:27:21,940 --> 00:27:23,400 They have those coins now. 406 00:27:23,400 --> 00:27:29,300 They're tied to their cryptographic keys and there's no protection. 407 00:27:29,300 --> 00:27:34,020 If you make a mistake, just fat finger an address that you're trying to send somebody 408 00:27:34,020 --> 00:27:35,220 money to. 409 00:27:35,220 --> 00:27:40,380 You can hope they send it back voluntarily, but there's no mechanism to get that money 410 00:27:40,380 --> 00:27:41,380 back. 411 00:27:41,380 --> 00:27:50,380 There's also the risks: the idea of a decentralized system is contrasted against the practical 412 00:27:50,380 --> 00:27:53,700 reality of how we interact with it. 413 00:27:53,700 --> 00:28:00,460 So the underlying blockchain, the ledger that we were talking about earlier, it is really 414 00:28:00,460 --> 00:28:02,860 secure by design. 415 00:28:02,860 --> 00:28:10,820 Unfortunately, most people store their assets on exchanges which are centralized. 416 00:28:10,820 --> 00:28:16,340 Why? For convenience, the ability to withdraw, to buy more, to add to it, to purchase it, to 417 00:28:16,340 --> 00:28:19,660 trade it, to make money with that money. 
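To make the arithmetic in that volatility example concrete, here is a minimal back-of-the-envelope sketch in Python. The prices are the rough figures quoted above (about $111,000 spot and roughly $123,000 a couple of weeks earlier), and the percentage drops are hypothetical scenarios for illustration, not market data:

```python
# Back-of-the-envelope illustration of the volatility point above.
# Prices are the rough figures quoted in the episode, not live market data.

spot_price = 111_000    # dollars per BTC, approximate spot quoted above
recent_high = 123_000   # dollars per BTC, approximate price two weeks earlier

# Holding one whole Bitcoin over that two-week slide:
loss_per_coin = recent_high - spot_price
print(f"Drop per coin: ${loss_per_coin:,}")  # about $12,000

# Spending $100 at today's spot buys a small fraction of a coin:
btc_bought = 100 / spot_price
print(f"$100 buys about {btc_bought:.6f} BTC, roughly 1/{round(spot_price / 100)} of a coin")

# Hypothetical single-day drops in the range discussed above:
for drop in (0.30, 0.40, 0.60):
    remaining = 100 * (1 - drop)
    print(f"After a {drop:.0%} drop, $100 of BTC has about ${remaining:.0f} of spending power")
```

A 30 to 40 percent swing leaves roughly $60 to $70 of spending power out of that $100, and it takes about a 60 percent drop to reach the $40 figure mentioned above, which is the sense in which this kind of volatility makes day-to-day commerce impractical.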
418 00:28:19,660 --> 00:28:26,580 Or in my case, where I got hung up is it guaranteed a return that was greater than the market 419 00:28:26,580 --> 00:28:33,780 average, 5.9% or something to that effect, to basically let them borrow your cryptocurrency 420 00:28:33,780 --> 00:28:43,400 and use it to trade inside of a volatile market and hopefully get something back from it. 421 00:28:43,400 --> 00:28:48,260 There was a whole lot of mess with the Gemini exchange and you can feel free to look that 422 00:28:48,260 --> 00:28:49,420 up. 423 00:28:49,420 --> 00:28:54,140 I was very lucky in that I was able to get it all back eventually, but it took two and 424 00:28:54,140 --> 00:28:56,540 a half years. 425 00:28:56,540 --> 00:29:01,420 Otherwise it was just sitting there in flux and it was not a small amount of money. 426 00:29:01,420 --> 00:29:07,980 When you put money in a bank, if there's a large scale hack, if you're defrauded, you 427 00:29:07,980 --> 00:29:09,900 go to the bank and you say, "Hey, look, this is fraud. 428 00:29:09,900 --> 00:29:10,900 I didn't do this. 429 00:29:10,900 --> 00:29:12,300 This is not my transaction." 430 00:29:12,300 --> 00:29:16,060 You were hacked or somebody generated my credit card number. 431 00:29:16,060 --> 00:29:20,340 That's not mine and you get your money back because that is a built-in protection. 432 00:29:20,340 --> 00:29:25,420 Our system builds in those protections through FDIC insurance. 433 00:29:25,420 --> 00:29:28,820 Your money is guaranteed up to $250,000 or whatever. 434 00:29:28,820 --> 00:29:31,220 Same thing with the NCUA for credit unions. 435 00:29:31,220 --> 00:29:33,900 There is no organization like this. 436 00:29:33,900 --> 00:29:34,900 Can we make it? 437 00:29:34,900 --> 00:29:35,900 Absolutely. 438 00:29:35,900 --> 00:29:36,900 Could it be done? 439 00:29:36,900 --> 00:29:37,900 Yes, absolutely. 440 00:29:37,900 --> 00:29:39,620 But who would guarantee it? 441 00:29:39,620 --> 00:29:43,980 Essentially, it would be a form of insurance, which what do we know about insurance? 442 00:29:43,980 --> 00:29:48,260 The only people who ever make money using insurance are the people who create insurance. 443 00:29:48,260 --> 00:29:54,780 They don't stay around that long to give away all their money, all the premiums. 444 00:29:54,780 --> 00:29:56,060 Exactly. 445 00:29:56,060 --> 00:30:04,700 In order to get some sort of regulation, we would have to reasonably create a cryptocurrency 446 00:30:04,700 --> 00:30:07,580 that needed to be centralized in order to protect it. 447 00:30:07,580 --> 00:30:12,140 It's like the Wild West in the sense that you can use it to trade money, but if you 448 00:30:12,140 --> 00:30:13,420 mess up, that's on you. 449 00:30:13,420 --> 00:30:15,020 It's why the libertarians love it. 450 00:30:15,020 --> 00:30:17,220 They're like, "It's all about your decisions." 451 00:30:17,220 --> 00:30:20,020 Also, there are smart contracts. 452 00:30:20,020 --> 00:30:24,780 I didn't really talk about that, but there are smart contracts that automate transactions 453 00:30:24,780 --> 00:30:34,440 on the Ethereum platform, and they could contain actual coding errors or flaws that can lead 454 00:30:34,440 --> 00:30:44,300 to leaking of coins, that could lead to catastrophic losses, in terms of 100% or more. 455 00:30:44,300 --> 00:30:49,180 Also, obviously, I'm going to point it out, though it's almost not even worth mentioning: illicit 456 00:30:49,180 --> 00:30:50,180 use. 
457 00:30:50,180 --> 00:30:55,180 The currency has been the number one way to do something illegal in the world since its 458 00:30:55,180 --> 00:31:03,780 inception, mainly because nobody can really track it or see it. 459 00:31:03,780 --> 00:31:05,820 It's not tied to your identity. 460 00:31:05,820 --> 00:31:08,020 It's just a thing that can be traded. 461 00:31:08,020 --> 00:31:12,500 But I would argue, cash is too. 462 00:31:12,500 --> 00:31:15,700 Cash is absolutely the same concept. 463 00:31:15,700 --> 00:31:23,300 That's why in the '70s and '80s movies we saw about the cocaine trade and when the CIA and 464 00:31:23,300 --> 00:31:27,940 the FBI were bringing in massive amounts of drugs into the United States, you saw everything 465 00:31:27,940 --> 00:31:29,180 was dealt with in cash. 466 00:31:29,180 --> 00:31:30,660 So it's the same concept. 467 00:31:30,660 --> 00:31:31,820 It just became the new cash. 468 00:31:31,820 --> 00:31:37,020 It was an easy way to conduct it electronically without having to carry around large sums. 469 00:31:37,020 --> 00:31:42,220 It's why we can't take a flight with more than $10,000 in cash without declaring it 470 00:31:42,220 --> 00:31:45,900 and letting somebody know that we're leaving with massive amounts of cash. 471 00:31:45,900 --> 00:31:49,620 So it is used for illicit reasons. 472 00:31:49,620 --> 00:31:58,220 To kind of tackle the downsides of AI as well, now we're talking at a high level from an ethical 473 00:31:58,220 --> 00:32:01,580 and moral standpoint on these topics. 474 00:32:01,580 --> 00:32:05,100 So it's not just about the power consumption of it. 475 00:32:05,100 --> 00:32:11,040 Algorithmic bias, you know, ultimately these models are just giant repositories of things 476 00:32:11,040 --> 00:32:14,060 that have already been written in the world and on the internet. 477 00:32:14,060 --> 00:32:20,720 That means that, along with it, all of these models are trained on these datasets and scraped 478 00:32:20,720 --> 00:32:25,100 images as a reflection of our world. 479 00:32:25,100 --> 00:32:30,100 That means all of those social biases, all those prejudices and stereotypes about gender 480 00:32:30,100 --> 00:32:38,420 and race and culture and everything else, those models reflect those biases and then 481 00:32:38,420 --> 00:32:43,780 can recreate them and sometimes even amplify them in their output. 482 00:32:43,780 --> 00:32:45,860 There are a lot of examples of this. 483 00:32:45,860 --> 00:32:51,780 You can check the internet for that, especially around like AI powered hiring tools. 484 00:32:51,780 --> 00:32:55,060 They've specifically shown bias against female candidates. 485 00:32:55,060 --> 00:33:00,220 Facial recognition systems have lower accuracy in regards to people of color. 486 00:33:00,220 --> 00:33:05,540 AI models used in the criminal justice system have actually been shown to disproportionately 487 00:33:05,540 --> 00:33:10,780 flag black defendants as being high risk of reoffending. 488 00:33:10,780 --> 00:33:17,160 That leads to discriminatory outcomes and the reinforcement of those inequalities. 489 00:33:17,160 --> 00:33:23,700 Then there's the all-encompassing and very well-publicized misinformation and hallucinations; 490 00:33:23,700 --> 00:33:28,500 it has been well documented that AI models have the tendency to hallucinate. 491 00:33:28,500 --> 00:33:29,500 Why? 492 00:33:29,500 --> 00:33:30,500 Well, it's a great question. 
493 00:33:30,500 --> 00:33:34,740 We're asking it to generate something new from what it was trained on. 494 00:33:34,740 --> 00:33:41,100 That means there's a possibility that whatever it's making up is completely nonfactual and 495 00:33:41,100 --> 00:33:42,100 nonsensical. 496 00:33:42,100 --> 00:33:46,000 Well, most of the time it's sensical, but it might not be based in fact. 497 00:33:46,000 --> 00:33:49,620 It doesn't have a comprehension of the data. 498 00:33:49,620 --> 00:33:55,620 So if you ask it a question about the speed limit in Wisconsin, it doesn't understand that 499 00:33:55,620 --> 00:34:00,260 the speed limit in Wisconsin has a specific legislative paper trail. 500 00:34:00,260 --> 00:34:05,420 You could ask it about the speed limit in Zootopia and it'll tell you something because you're 501 00:34:05,420 --> 00:34:07,060 asking it a question, right? 502 00:34:07,060 --> 00:34:08,060 Exactly. 503 00:34:08,060 --> 00:34:09,060 Exactly. 504 00:34:09,060 --> 00:34:10,060 There will be some sort of determination. 505 00:34:10,060 --> 00:34:11,060 Exactly. 506 00:34:11,060 --> 00:34:12,500 It doesn't know that there's no such place. 507 00:34:12,500 --> 00:34:16,540 Think of it kind of like my nine year old who hasn't learned the power of "I don't know" 508 00:34:16,540 --> 00:34:17,540 yet. 509 00:34:17,540 --> 00:34:19,140 You know, like that's the power to shut down a conversation. 510 00:34:19,140 --> 00:34:20,840 You ask him a question, they say, "I don't know." 511 00:34:20,840 --> 00:34:21,840 That's a teenage thing. 512 00:34:21,840 --> 00:34:24,940 But as a nine year old, he will answer every question you ask him. 513 00:34:24,940 --> 00:34:28,980 He just will answer it with total confidence whether he knows it or not, and that's how 514 00:34:28,980 --> 00:34:29,980 AI responds. 515 00:34:29,980 --> 00:34:33,020 So hopefully it'll grow out of it before it destroys us all. 516 00:34:33,020 --> 00:34:34,580 I do that myself now. 517 00:34:34,580 --> 00:34:37,260 I do that all the time. 518 00:34:37,260 --> 00:34:40,020 I say if you just say it with confidence, they'll surely believe it. 519 00:34:40,020 --> 00:34:41,020 Also... 520 00:34:41,020 --> 00:34:42,020 Go ahead. 521 00:34:42,020 --> 00:34:46,980 As I was saying, unfortunately, hallucinations didn't originate with AI. 522 00:34:46,980 --> 00:34:48,380 No, they did not. 523 00:34:48,380 --> 00:34:53,580 And the truth is, if we're trying to create these language models to sound and be more 524 00:34:53,580 --> 00:35:00,900 like we are, we hallucinate, we dream, we have ideas that are nonsensical and not based in 525 00:35:00,900 --> 00:35:01,900 fact. 526 00:35:01,900 --> 00:35:02,900 But what does this do? 527 00:35:02,900 --> 00:35:08,760 It means that scammers, people who are nefariously using AI, sound more convincing. 528 00:35:08,760 --> 00:35:11,040 They sound more real. 529 00:35:11,040 --> 00:35:16,520 Those letters that we... the emails we used to get from the Zimbabwe government that reminded 530 00:35:16,520 --> 00:35:25,400 us how well off we were, and how we had magically either won their lottery or something to that 531 00:35:25,400 --> 00:35:26,400 effect. 532 00:35:26,400 --> 00:35:33,460 They were hoping for some sort of barrister situation where you had to send them money. 533 00:35:33,460 --> 00:35:38,860 Those were easily determined to be AI. 534 00:35:38,860 --> 00:35:41,960 And for a very specific reason... 535 00:35:41,960 --> 00:35:44,560 I mean, sorry, not easily determined to be AI. 
536 00:35:44,560 --> 00:35:48,200 They were easily sussed out as being fraudulent. 537 00:35:48,200 --> 00:35:50,440 And why do you think that is? 538 00:35:50,440 --> 00:35:55,520 Well, because there were spelling errors, there were grammar problems because they were 539 00:35:55,520 --> 00:35:59,520 clearly from people who didn't speak English as their first language. 540 00:35:59,520 --> 00:36:05,420 But you know, now with AI, they can use that same script that they had before and run it 541 00:36:05,420 --> 00:36:07,100 through an AI and say, "Hey, clean this up. 542 00:36:07,100 --> 00:36:08,100 Make it more formal. 543 00:36:08,100 --> 00:36:09,940 Make it match American English." 544 00:36:09,940 --> 00:36:11,360 Those sorts of things. 545 00:36:11,360 --> 00:36:17,840 And still get away... and more now than ever, they're actually targeting people and getting 546 00:36:17,840 --> 00:36:21,280 away with it because it looks more legitimate than it did before. 547 00:36:21,280 --> 00:36:27,180 You know, I remember Kim Komando, who was a popular radio host that helped old people 548 00:36:27,180 --> 00:36:29,160 with computers back in the day. 549 00:36:29,160 --> 00:36:34,360 And my grandparents when they were alive loved to listen to her radio show. 550 00:36:34,360 --> 00:36:39,760 And she would say like, "Here's how you pick out these things and look for grammar mistakes 551 00:36:39,760 --> 00:36:41,280 and look for spelling errors." 552 00:36:41,280 --> 00:36:45,180 But in today's scams, there are no spelling errors. 553 00:36:45,180 --> 00:36:46,720 There are no grammar mistakes. 554 00:36:46,720 --> 00:36:51,480 It sounds exactly like the person it would be coming from... and they're using proper language. 555 00:36:51,480 --> 00:36:55,120 You know, they're using barrister instead of lawyer, and they're using, you know, these 556 00:36:55,120 --> 00:36:57,000 sorts of things. 557 00:36:57,000 --> 00:37:00,400 Another thing, privacy and data, right? 558 00:37:00,400 --> 00:37:07,640 When you train these large language models, it requires a colossal amount of information. 559 00:37:07,640 --> 00:37:10,200 Most of that is scraped from public websites. 560 00:37:10,200 --> 00:37:15,360 It's scraped from forums like Reddit and Substack and those sorts of things. 561 00:37:15,360 --> 00:37:18,840 Stack trace, Stack Overflow, lots of stacks. 562 00:37:18,840 --> 00:37:22,520 And from social media... all of them. 563 00:37:22,520 --> 00:37:29,720 It's pulled from social media without the knowledge or consent of the people who created 564 00:37:29,720 --> 00:37:31,480 that content. 565 00:37:31,480 --> 00:37:36,840 That raises a lot of privacy concerns, not to mention people's personal stories, their 566 00:37:36,840 --> 00:37:42,680 sensitive information, pictures of their children, copyrighted artworks, copyrighted music. 567 00:37:42,680 --> 00:37:45,960 All of these things are ingested into these models. 568 00:37:45,960 --> 00:37:53,480 So there is a not insignificant risk that models could inadvertently leak, right? 569 00:37:53,480 --> 00:38:01,360 Or reproduce private data in their responses or worse, maliciously be coerced into producing 570 00:38:01,360 --> 00:38:03,840 this information that was used to train them. 571 00:38:03,840 --> 00:38:10,240 There's also a good bit of accountability and transparency issues in these AI models, right? 
572 00:38:10,240 --> 00:38:14,960 We as consumers don't have the ability to produce an AI model on the equipment that we have 573 00:38:14,960 --> 00:38:16,540 in our homes. 574 00:38:16,540 --> 00:38:18,320 They require massive amounts of information. 575 00:38:18,320 --> 00:38:23,940 The average model right now costs right at $22 million. 576 00:38:23,940 --> 00:38:27,560 You need that amount of money to train one of these models. 577 00:38:27,560 --> 00:38:32,760 So many of the most powerful of these really are just black boxes, like we were talking 578 00:38:32,760 --> 00:38:36,240 about earlier with the computational aspects of it. 579 00:38:36,240 --> 00:38:40,280 But in the informational aspect of it, they're just black boxes. 580 00:38:40,280 --> 00:38:46,940 Their internal workings are so complex that even the people who do this for a living cannot 581 00:38:46,940 --> 00:38:54,260 fully explain why a model produced the specific output that you're seeing on your chat. 582 00:38:54,260 --> 00:39:00,680 Like you said, when model 4.0 was so friendly, and then 5.0 was cold and objective, people 583 00:39:00,680 --> 00:39:04,660 were like, "What just happened?" and OpenAI is shrugging their shoulders. 584 00:39:04,660 --> 00:39:06,640 They don't know. 585 00:39:06,640 --> 00:39:07,960 Exactly. 586 00:39:07,960 --> 00:39:16,320 And that opaqueness is a real challenge for accountability for a very specific reason. 587 00:39:16,320 --> 00:39:23,000 When an AI system hurts someone, and we're seeing it hurt people, whether by providing 588 00:39:23,000 --> 00:39:30,720 dangerously incorrect medical advice, causing a self-driving car to crash, or making a biased 589 00:39:30,720 --> 00:39:32,800 loan decision, 590 00:39:32,800 --> 00:39:37,680 it's really often unclear who's responsible for that. 591 00:39:37,680 --> 00:39:40,840 Is it the user who prompted the AI? 592 00:39:40,840 --> 00:39:43,700 Is it the people who trained the AI? 593 00:39:43,700 --> 00:39:47,940 Is it the company that made the AI available for you to use? 594 00:39:47,940 --> 00:39:54,480 Or was it the creators of the biased data that were used to train it? 595 00:39:54,480 --> 00:40:01,820 So without transparency in how these systems make decisions, the ability to establish any 596 00:40:01,820 --> 00:40:05,400 sort of accountability is almost impossible. 597 00:40:05,400 --> 00:40:08,260 All right, let's talk about the innovation, Fowl. 598 00:40:08,260 --> 00:40:15,300 We have built for you a narrative of historical information that presents progress as being 599 00:40:15,300 --> 00:40:18,520 a driver for additional progress. 600 00:40:18,520 --> 00:40:23,160 Progress in one area creates the need, which drives innovation in another area. 601 00:40:23,160 --> 00:40:25,820 And it is a valid one. 602 00:40:25,820 --> 00:40:29,960 I do want to say that, but it's incomplete. 603 00:40:29,960 --> 00:40:35,960 Innovation is really not a moral force. 604 00:40:35,960 --> 00:40:38,920 It's kind of a neutral process. 605 00:40:38,920 --> 00:40:39,920 It's a tool. 606 00:40:39,920 --> 00:40:41,720 Gosh, where have you heard that argument before? 607 00:40:41,720 --> 00:40:46,520 It's a tool, and it can be used for good or bad. 608 00:40:46,520 --> 00:40:51,080 It just magnifies the effort. 609 00:40:51,080 --> 00:40:58,540 And that's the point of a tool to begin with, is to expand our capabilities even further. 
610 00:40:58,540 --> 00:41:07,060 So the final and really the most fundamental critique of ourselves challenges this, this 611 00:41:07,060 --> 00:41:16,120 sort of techno-optimistic premise that a high demand for these resources will lead to positive 612 00:41:16,120 --> 00:41:18,880 and beneficial innovation. 613 00:41:18,880 --> 00:41:20,040 That is my belief. 614 00:41:20,040 --> 00:41:24,840 I will say that there is data that actively backs it up. 615 00:41:24,840 --> 00:41:31,040 Will, is it your belief that you've seen it come to pass in your lifetime? 616 00:41:31,040 --> 00:41:35,760 I have, but more importantly, I think it's more of just hope that I have to hold on to, 617 00:41:35,760 --> 00:41:38,120 because the alternative is just the dark ages again. 618 00:41:38,120 --> 00:41:39,120 Right? 619 00:41:39,120 --> 00:41:40,860 We've already gone so far at this point. 620 00:41:40,860 --> 00:41:44,200 The hope is, to completely borrow something entirely unrelated: 621 00:41:44,200 --> 00:41:47,360 if you're going through hell, keep on going, as they say, right? 622 00:41:47,360 --> 00:41:50,000 Like the idea is the only way out is through. 623 00:41:50,000 --> 00:41:51,420 The only way out is through. 624 00:41:51,420 --> 00:41:52,420 Exactly. 625 00:41:52,420 --> 00:41:58,520 I think high demand, I think if I had to word it correctly, I would say high demand provides 626 00:41:58,520 --> 00:42:06,820 a motive for innovation, but it doesn't actually dictate the ethics of the outcome. 627 00:42:06,820 --> 00:42:08,000 Exactly. 628 00:42:08,000 --> 00:42:16,080 So demand for cheap textiles in the 19th century drove innovation in factory machinery and 629 00:42:16,080 --> 00:42:23,940 the steam engine, but it also led to horrific labor conditions, child labor, dangerous urban 630 00:42:23,940 --> 00:42:31,080 pollution, both in the water and in the air. And the demand for agricultural efficiency 631 00:42:31,080 --> 00:42:37,180 in the 20th century drove innovation in chemical fertilizers and pesticides, but also led to 632 00:42:37,180 --> 00:42:42,500 widespread water contamination and ecological damage that we're still dealing with the consequences 633 00:42:42,500 --> 00:42:44,240 of. 634 00:42:44,240 --> 00:42:49,460 Historical innovations that power our modern world right now, the steam engine, the internal 635 00:42:49,460 --> 00:42:56,180 combustion engine were based on fossil fuels, the extraction and use of which have 636 00:42:56,180 --> 00:43:03,640 led directly to our current climate crisis, a massive and potentially catastrophic negative 637 00:43:03,640 --> 00:43:05,620 externality of that progress. 638 00:43:05,620 --> 00:43:11,920 So a closer look at the consequences of high resource demand kind of reveals this pattern 639 00:43:11,920 --> 00:43:13,760 of negative outcomes, right? 640 00:43:13,760 --> 00:43:19,360 That often come along with the advancement of technology. 641 00:43:19,360 --> 00:43:24,000 The extraction of raw materials, whether they be coal for a steam engine or lithium and 642 00:43:24,000 --> 00:43:31,620 cobalt for batteries is a very energy intensive and environmentally destructive process. 643 00:43:31,620 --> 00:43:38,000 It very often leads to really terrible and sometimes irreversible ecological damage. 644 00:43:38,000 --> 00:43:41,180 I mean, soil degradation, water shortages. 645 00:43:41,180 --> 00:43:44,260 It pollutes the air and the water both. 
646 00:43:44,260 --> 00:43:49,500 And the less obvious one for you biologists out there is the loss of biodiversity. 647 00:43:49,500 --> 00:43:56,320 The global pursuit of resources is inherently intertwined with social conflict and inequality. 648 00:43:56,320 --> 00:44:02,460 The process of material extraction has been linked to severe human rights violations, 649 00:44:02,460 --> 00:44:08,280 forced displacement of local and indigenous populations, and acute health problems from 650 00:44:08,280 --> 00:44:11,900 contamination for the communities that surround these operations. 651 00:44:11,900 --> 00:44:17,500 We're already looking at the prospect of all of us losing the ability to work in white 652 00:44:17,500 --> 00:44:22,700 collar jobs at all because of the possibility of an eventual AGI. 653 00:44:22,700 --> 00:44:27,140 Which makes alignment all that much more important, right? 654 00:44:27,140 --> 00:44:28,140 Right. 655 00:44:28,140 --> 00:44:37,020 The economic benefits of resource extraction flow to multinational corporations and the governments 656 00:44:37,020 --> 00:44:45,440 of developed nations, while the ecological and social costs are actually borne 657 00:44:45,440 --> 00:44:48,540 by the actual populations of less developed countries. 658 00:44:48,540 --> 00:44:58,620 According to the UN, actually, natural resources play a key role in 40% of intrastate 659 00:44:58,620 --> 00:45:05,860 conflicts, with profits from their sale being used to finance armed militias. 660 00:45:05,860 --> 00:45:15,100 And look, this is a crucial point as we build up to our current moment. 661 00:45:15,100 --> 00:45:23,900 I mean, this immense demand for energy and materials generated by AI and by cryptocurrency 662 00:45:23,900 --> 00:45:33,020 could spur development, right, of clean fusion power, which would effectively end scarcity 663 00:45:33,020 --> 00:45:36,580 of energy on our planet as a whole. 664 00:45:36,580 --> 00:45:45,900 But it could also lead, if I'm looking at history, to a giant global grab for the resources 665 00:45:45,900 --> 00:45:52,580 needed to build that new energy infrastructure and the computing hardware that it will power. 666 00:45:52,580 --> 00:45:56,880 And that outcome is not guaranteed to be positive. 667 00:45:56,880 --> 00:45:59,740 So it's not going to be determined by like… go ahead. 668 00:45:59,740 --> 00:46:04,820 It's guaranteed not to be positive for some people, even if it's generally positive for 669 00:46:04,820 --> 00:46:05,820 the majority of us. 670 00:46:05,820 --> 00:46:08,460 Some of us will have to pay the cost. 671 00:46:08,460 --> 00:46:09,460 That is a reality. 672 00:46:09,460 --> 00:46:13,820 I view it kind of like I view war. 673 00:46:13,820 --> 00:46:20,860 And look, I'm not a fan of war, but the theory behind it was that a few put their own 674 00:46:20,860 --> 00:46:25,740 lives at stake to protect the majority. 675 00:46:25,740 --> 00:46:31,260 And I don't think it's necessarily right to equate this with war. 676 00:46:31,260 --> 00:46:37,180 But it's not going to be determined by the technology itself, AI or crypto. 677 00:46:37,180 --> 00:46:43,980 It's going to be determined by the choices that we make as people regarding the regulation, 678 00:46:43,980 --> 00:46:48,900 the governance, and the ethical deployment of these technologies. 679 00:46:48,900 --> 00:46:49,900 Right. 680 00:46:49,900 --> 00:46:51,320 We've got access to a lot of power.
681 00:46:51,320 --> 00:46:53,960 Now it comes with a responsibility. 682 00:46:53,960 --> 00:46:54,960 Thanks, Spider-Man. 683 00:46:54,960 --> 00:46:56,380 And Peter Parker over here. 684 00:46:56,380 --> 00:46:57,780 Hey, Uncle Ben. 685 00:46:57,780 --> 00:46:59,580 No, it was Uncle Ben. 686 00:46:59,580 --> 00:47:00,740 I was about to say it was the uncle. 687 00:47:00,740 --> 00:47:02,420 It was not Peter Parker. 688 00:47:02,420 --> 00:47:03,420 So okay. 689 00:47:03,420 --> 00:47:04,540 So where does that leave us, right? 690 00:47:04,540 --> 00:47:09,180 In terms of a conclusion, I came across a fun word that I really liked in doing the 691 00:47:09,180 --> 00:47:11,540 research for this, and it was trilemma. 692 00:47:11,540 --> 00:47:12,900 I mean, it makes sense. 693 00:47:12,900 --> 00:47:14,500 I've heard of dilemma. 694 00:47:14,500 --> 00:47:21,220 Most of us have heard of dilemma, but trilemma is a fun word that I really enjoy. 695 00:47:21,220 --> 00:47:29,300 I think it creates this trilemma of innovation and consumption and responsibility. 696 00:47:29,300 --> 00:47:35,340 So I feel like we're currently standing dead center, and we have to take care of 697 00:47:35,340 --> 00:47:41,380 all of these things because of the profound implications of the twin revolutions of crypto 698 00:47:41,380 --> 00:47:48,100 and AI that, regardless of whether you want it, are being put into the real world 699 00:47:48,100 --> 00:47:50,280 for us to contend with. 700 00:47:50,280 --> 00:47:56,360 So this podcast has been an effort to show that these technologies aren't like abstract 701 00:47:56,360 --> 00:48:05,420 concepts, but they're real, powerful forces with real-world consequences, right? 702 00:48:05,420 --> 00:48:13,240 Especially with their immense and seemingly endless, growing appetite for the consumption 703 00:48:13,240 --> 00:48:14,240 of energy. 704 00:48:14,240 --> 00:48:21,080 So where we go from here is not like a simple choice between, "Hey, I'm going to embrace 705 00:48:21,080 --> 00:48:25,340 AI and I'm going to embrace cryptocurrency," or flat-out rejecting them, right? 706 00:48:25,340 --> 00:48:28,020 And deciding to be Amish and go live in Pennsylvania. 707 00:48:28,020 --> 00:48:35,020 But it's really, it's a very complex navigation of this trilemma. 708 00:48:35,020 --> 00:48:38,380 And opting out is no guarantee that you'll avoid the consequences, and in fact, it may 709 00:48:38,380 --> 00:48:41,900 just be abdicating your ability to do something about it. 710 00:48:41,900 --> 00:48:42,940 Exactly. 711 00:48:42,940 --> 00:48:50,480 So if you were to take each branch of this, right, innovation first, we have the potential benefits 712 00:48:50,480 --> 00:48:56,700 of AI and decentralized technology that we talked about earlier. 713 00:48:56,700 --> 00:49:00,900 Not only that, we're talking about the possibility of increased productivity. 714 00:49:00,900 --> 00:49:02,780 Now, productivity for what? 715 00:49:02,780 --> 00:49:04,840 You know, again, this is a tool. 716 00:49:04,840 --> 00:49:06,260 We get to determine that. 717 00:49:06,260 --> 00:49:13,020 We've got new forms of economic organization, accelerated scientific discovery, and solutions, 718 00:49:13,020 --> 00:49:16,260 possibly, to some of humanity's most pressing problems. 719 00:49:16,260 --> 00:49:22,940 So if we turn our back on the potential innovation, we would literally be rejecting a pretty 720 00:49:22,940 --> 00:49:26,180 powerful engine for future progress.
721 00:49:26,180 --> 00:49:28,320 The second point is consumption, right? 722 00:49:28,320 --> 00:49:34,660 Like we talked about earlier, that innovation comes at a steep environmental cost. 723 00:49:34,660 --> 00:49:35,960 It doesn't matter how you look at it. 724 00:49:35,960 --> 00:49:42,220 From the silicon of the chips to the coal that we are apparently now, once 725 00:49:42,220 --> 00:49:48,500 again, all excited about burning to produce electricity inside of our electrical grid. 726 00:49:48,500 --> 00:49:54,020 That is, the energy demand of these technologies is already on the scale of whole nations, 727 00:49:54,020 --> 00:49:55,020 right? 728 00:49:55,020 --> 00:49:56,800 And growing exponentially. 729 00:49:56,800 --> 00:50:02,940 I mean, that drives a voracious appetite, not just for that electricity, but for the 730 00:50:02,940 --> 00:50:08,540 raw materials needed to build the data centers, the processors, and the new energy infrastructure 731 00:50:08,540 --> 00:50:11,180 required to sustain them. 732 00:50:11,180 --> 00:50:18,960 And if we don't instill and enforce a system of checks and balances, it really could threaten 733 00:50:18,960 --> 00:50:21,240 to undermine our climate goals. 734 00:50:21,240 --> 00:50:28,320 And I don't mean the ones that our current president has decided to remove us from. 735 00:50:28,320 --> 00:50:37,400 I mean our individual, personal climate goals. It could also exacerbate the destruction that we're 736 00:50:37,400 --> 00:50:42,860 already seeing at an environmental level, which brings up the most important part of 737 00:50:42,860 --> 00:50:43,860 all of this. 738 00:50:43,860 --> 00:50:45,840 It's about responsibility. 739 00:50:45,840 --> 00:50:52,140 We cannot have the benefits of innovation without managing the consequences of consumption. 740 00:50:52,140 --> 00:50:59,700 And those have a tangible effect on everyday people, not just multinational corporations. 741 00:50:59,700 --> 00:51:09,580 And that requires a distinctly profound sense of responsibility to develop and deploy these 742 00:51:09,580 --> 00:51:15,660 technologies in a way that is ethical and sustainable, but equitable too. 743 00:51:15,660 --> 00:51:17,200 Absolutely. 744 00:51:17,200 --> 00:51:24,980 And that means confronting the risks of algorithmic bias, misinformation, and yes, 745 00:51:24,980 --> 00:51:27,860 illegal use, right? 746 00:51:27,860 --> 00:51:29,780 And I mean, whatever. 747 00:51:29,780 --> 00:51:34,500 I guess I personally don't really care if some guy from Montana really needs LSD and 748 00:51:34,500 --> 00:51:36,880 can figure out a way to get it through the internet. 749 00:51:36,880 --> 00:51:40,000 That doesn't really affect me that much. 750 00:51:40,000 --> 00:51:48,300 But it's about ensuring that the pursuit of new energy sources doesn't just recreate all 751 00:51:48,300 --> 00:51:51,860 of the crappy historical patterns 752 00:51:51,860 --> 00:51:59,460 of social and environmental exploitation that often come along with resource extraction, right? 753 00:51:59,460 --> 00:52:01,420 And we have real-world examples of those. 754 00:52:01,420 --> 00:52:03,740 Plenty of historical examples. 755 00:52:03,740 --> 00:52:05,060 Yes. 756 00:52:05,060 --> 00:52:13,500 But even looking toward the possibilities, when we look at possible hiccups in the road, 757 00:52:13,500 --> 00:52:20,500 it's pretty clear that the future is not just a predetermined outcome of technological inevitability.
758 00:52:20,500 --> 00:52:24,660 The pattern of demand driving innovation is powerful. 759 00:52:24,660 --> 00:52:29,460 I mean, you can't deny that either, but it is not a moral guarantee, right? 760 00:52:29,460 --> 00:52:31,640 It doesn't mean that it is... 761 00:52:31,640 --> 00:52:32,640 A scientific fact. 762 00:52:32,640 --> 00:52:33,640 Right. 763 00:52:33,640 --> 00:52:34,640 It is morally good. 764 00:52:34,640 --> 00:52:35,640 It does not mean that. 765 00:52:35,640 --> 00:52:38,020 It is just a fact. 766 00:52:38,020 --> 00:52:46,120 So that same demand for electricity that could actually get us to clean fusion 767 00:52:46,120 --> 00:52:52,020 power faster could also trigger a destructive scramble for the resources to build fusion reactors. 768 00:52:52,020 --> 00:52:58,300 That same AI that could cure diseases, as we've already seen with Russian propaganda, can entrench 769 00:52:58,300 --> 00:53:03,260 societal biases and completely obliterate public trust. 770 00:53:03,260 --> 00:53:10,260 So our challenge really is to manage this trilemma, you're welcome, with intention and foresight. 771 00:53:10,260 --> 00:53:11,260 Right? 772 00:53:11,260 --> 00:53:16,780 So the path forward is really not just about the breakthroughs that we're seeing from this 773 00:53:16,780 --> 00:53:19,380 technology. 774 00:53:19,380 --> 00:53:20,700 And they're cool. 775 00:53:20,700 --> 00:53:22,020 They're cool. 776 00:53:22,020 --> 00:53:27,060 The idea of endless clean energy generation is amazing. 777 00:53:27,060 --> 00:53:33,080 Not to mention the computational efficiency of having a constant brain, more powerful than 778 00:53:33,080 --> 00:53:36,380 the human one, continually working on a problem. 779 00:53:36,380 --> 00:53:43,560 But it demands conscious, deliberate, and international conversations about the kind 780 00:53:43,560 --> 00:53:45,220 of society we want to build. 781 00:53:45,220 --> 00:53:52,980 And it has to be one that balances the power of that innovation with the very finite reality 782 00:53:52,980 --> 00:53:58,100 of our planet and the enduring importance of our collective human values. 783 00:53:58,100 --> 00:54:03,120 And that is The Overlap. 784 00:54:03,120 --> 00:54:06,640 We hope you enjoyed this podcast here today. 785 00:54:06,640 --> 00:54:10,480 We worked really hard on coming up with all the information and doing all the research 786 00:54:10,480 --> 00:54:11,480 to get it to you. 787 00:54:11,480 --> 00:54:22,620 If you have any questions or comments, you can email us at the overlap@fof.foundation. 788 00:54:22,620 --> 00:54:26,980 You can hit us up and give us a follow over there on Blue Sky, The Overlap Podcast, as 789 00:54:26,980 --> 00:54:30,260 well as on Mastodon, The Overlap Podcast. 790 00:54:30,260 --> 00:54:36,300 You can also check out our website, fof.foundation, and keep up with us there. 791 00:54:36,300 --> 00:54:39,220 We will see you again soon, folks. 792 00:54:39,220 --> 00:54:41,380 Well, maybe not see you. 793 00:54:41,380 --> 00:54:45,500 Maybe we'll hear from you again soon, but you'll definitely hear from us later. 794 00:54:45,500 --> 00:54:47,580 Thanks, everybody. 795 00:54:47,580 --> 00:54:49,420 And thank you again, Will, for being here. 796 00:54:49,420 --> 00:54:50,420 Thanks, everybody. 797 00:54:50,420 --> 00:54:50,560 Bye now. 798 00:54:50,560 --> 00:55:00,560 [MUSIC] 799 00:55:00,560 --> 00:55:10,560 [MUSIC] 800 00:55:10,560 --> 00:55:17,980 [MUSIC] 801 00:55:17,980 --> 00:55:20,820 [MUSIC]