Theories of Everything with Curt Jaimungal

Wayne Myrvold: A 2 Hour Deep Dive Into Entropy

September 29, 2025 · 2:05:34


Transcript

[1:06] Even though a lot of physicists will say the second law says the total entropy is never decreasing, that can't actually be the second law; that can only be a consequence of the second law.
[1:18] This is a two-hour deep dive into entropy and the second law. Most talks on this subject are 10 minutes long, but today, Professor Wayne Myrvold gives a tour de force, explaining entropy from multiple angles, dispelling myths, and arriving at the stunning realization that the second law is the opposite of what you think.
[1:39] You will get both answers from perfectly competent physicists, each absolutely certain that theirs is the right answer. Questions explored: why is entropy not the same as disorder? What do popular accounts, and even undergraduate texts, on entropy and thermodynamics consistently get wrong? Is the universe subject to the second law? Can you break these supposed entropic limits? And why does quantum mechanics change everything?
[2:09] There's plenty of confusion, and there are puzzles, about entropy. Today I would like to talk about exactly what entropy is, and one of the questions that gets conflicting answers is this: imagine in front of you, you have some physical system. I believe this example comes from Shelly Goldstein.
[2:23] Let's just say you take a glass of water, for instance, and its physical state is clearly not completely known to you. But then something else appears to you (it could be an intelligent person, an angel, or what have you) and gives you a much better approximation of the state of this glass of water than what you had before. Then the question is: has the system, the glass of water, had its entropy decreased? So there are, broadly speaking, two answers one can give. One is:
[2:52] yes, obviously, because entropy has to do with information about the system, so if you've gained information, the entropy has changed. The other is: that's absurd, entropy is something objective about the system, so why would your information about the system change its entropy? Take it away. Yep. Yeah, absolutely. And you will get both answers from perfectly competent physicists, and they will be absolutely certain that that is the right answer.
[3:22] So I think the best way to start thinking about that is this: you posed the question as, what is entropy? And I think that's a bit of a misleading question, because entropy is one of those words used in different senses. And we're used to that; it's not unusual. Like if you open up a dictionary,
[3:47] Very often in science people coin a new technical term because they want something to have a precise, well-agreed upon meaning.
[4:08] And that's actually what Clausius did back in 1865, I think it was. He thought, okay, here's this important quantity that I and others have been batting around in thermodynamics. It's important enough it needs a dignified name. And his rationale was,
[4:28] everybody studied the dead languages, right? We all know Greek and Latin, and so we don't want something from English or German or Italian, because then it becomes a sort of nationalistic word. So let's coin something from a Greek word. So he coined the word entropy from a Greek word for transformation, and he deliberately coined it to sound like energy, because it's a closely related concept. If I had my way,
[4:57] we would respect Clausius and we would only use the word entropy in exactly the sense that Clausius defined it. That would be what everyone means by entropy. But historically, that's not what happened. There have been a number of different quantities that people call entropy, and they're all related, in some sense, to thermodynamic entropy, but they're just different things. If someone asks you the question, has entropy decreased, I think the natural response is: well, which entropy?
[5:27] So in other words, there should be entropy sub one, entropy sub two, entropy sub three. And when someone says, well, what is the entropy of the system? You say: okay, are you referring to sub one, sub two, or sub three? Yeah, exactly, exactly. Or if I had my way, people would just have coined different words for these different kinds of things. But there's a reason why some people say, well, of course, entropy has to be
[5:58] an intrinsic property of a system. Because this is physics, after all; we're not doing psychology, we're not studying people's information. We're studying physical properties of physical systems, which qualities they have no matter what anybody knows about them or what we think about them. Like if I ask, okay, what's the mass of this cup? It would seem absurd to say, well,
[6:26] how much do you know about the cup? The mass of the cup is something that the cup has. I'm talking about the rest mass, because sometimes people will talk about relativistic mass and describe that as observer-dependent. But okay, what's the rest mass of this cup? Then, yeah, that's a property of the cup, and it doesn't matter what anyone thinks about it.
[6:50] And if you think that thermodynamics is a science like that, that is just studying the physical properties of things, then it seems absurd that one of its central concepts, entropy, would be something that would be defined relative to a state of information. And I think that
[7:15] At bottom, the fact that people are inclined to think that different notions of entropy are obviously the right one, and that different answers to this question are obviously the right answer, is because, even though this gets completely blurred in the textbook tradition, there are actually different conceptions of what the science of thermodynamics is all about. Okay, so look, the second law is stated as: entropy doesn't decrease.
[7:42] Yeah. Oh, yes. Yeah. With your caveats: a closed system, or an isolated system. Yeah. Okay. Then there's a formula for entropy. Are you saying that even here there should be a sub one and a sub two? Actually, these different notions of entropy are defined differently. And if you look at different textbooks, when they introduce the concept of entropy,
[8:10] They actually will sometimes give very different definitions. So maybe I should just talk about what Clausius was doing, because that's one of the definitions that's out there. Sure. So Clausius was working in the 1850s, 1860s. Those are the early days of what we now call thermodynamics.
[8:37] And it was Kelvin who gave the science that name, and I think a lot of people actually misunderstand what that word, thermodynamics, means, because in
[8:50] Physics these days, when you talk about dynamics, you usually mean the laws of evolution, like the dynamical laws that govern the behavior of systems. And that's actually not what Kelvin meant when he decided to call this emerging science thermodynamics. This was, as I said, back in the days when everybody studied Greek in school, and it's formed from two Greek words.
[9:19] the words for heat and for power. And thermodynamics has its roots in the study of how you can get useful mechanical work out of heat. Like it really ultimately has its roots in Carnot's study of heat engines, efficiency of heat engines. If you think of that as what thermodynamics is about,
[9:46] Physicists these days have a word for a theory like that: it's a resource theory. And this comes out of quantum information theory. So what happened, what really got going a couple of decades ago, is this field of quantum information theory. It includes quantum communication and cryptography and things like that, and they were asking questions like: if you've got two agents who have access to certain resources, what can they do with those?
[10:16] So for example, these agents are always called Alice and Bob, by the way. For example, if Alice and Bob want to send a secure signal that an eavesdropper could not, as a matter of physical principle, eavesdrop on, what can they do? Can they do it if they have a certain amount of shared entanglement?
[10:38] You're using physics, in the sense that quantum physics is telling you how physical systems are going to respond to certain operations and things like that. But the questions you're asking are really questions not within physics proper. They're questions about what agents, with certain kinds of means of manipulating a system and certain resources, can do to achieve certain goals. Why is that not in physics proper?
[11:07] Because when I say physics proper, usually what I mean is what physicists usually think physics is about: the properties of physical systems, period. Right. And so these goals of these agents aren't a matter of physics; these are something that you're adding on. Hmm. Right. So one question is: what do things do,
[11:34] So there is a certain question: what do things do under certain circumstances? But then, if I'm setting out certain goals... Like, is an agent itself a part of the physical system? Agents are themselves physical systems, right? But physics studies physical systems in certain respects.
[12:02] So I'm a physical system, right? You're a physical system. You have thoughts and beliefs. Thank you. It's the nicest thing anyone's ever said about me. Some might disagree and say you're not just a physical system, you're a combination of a physical system plus an immaterial mind, but I actually think that we are all physical systems.
[12:23] So we, qua physical systems, have thoughts and desires and hopes and dreams and things like that. But if a scientist is studying my thoughts, that scientist is doing psychology, not physics. Okay. Yeah. You're just studying different aspects of things. And if I'm bringing in things like,
[12:51] Here's the game that Alice and Bob are going to play, and here's how we're going to score them. And then you give them certain resources, and physics tells you what the highest possible score is. But basically, you're not doing the sorts of thing you usually find in physics textbooks if you're talking about goals and scoring and things like that. That's what I mean. Got it. Yeah, good.
[13:17] And I think this will become important for thermodynamics. So with people who are doing quantum information theory, they said, okay, what we're doing is this is a resource theory. And then some of the same methods ended up when people started doing quantum thermodynamics, a lot of people started thinking of this as a resource theory.
[13:45] Like, so if I give you certain resources and you've got a certain task like lifting a weight or something like that, and maybe you've got some kind of system and you've got a heat bath at a certain temperature, what's the most work you can get out of it? So how high can you lift the weight?
[14:13] A lot of people working in quantum thermodynamics these days think: okay, what we're doing is a resource theory, similar to, and in some sense modeled on, quantum information theory. Okay. And that's basically how the founders of thermodynamics thought of thermodynamics. It wasn't quantum, but it's the study of: given certain physical resources,
[14:41] like heat baths and things like that, how can I exploit these resources to do work, like drive a car or something like that, or raise a weight?
[14:56] Let me see if I can summarize this. So Clausius, who coined the term entropy, was thinking of it in terms of a resource theory. Now, a resource theory asks: what can I do with these resources? And often in thermodynamics, when you take an introductory course, you hear in either the first or second lecture about pistons. So: given this system, can I move a piston? So they were thinking practically.
[15:20] Yeah, there's a sort of interesting trade-off between practical concerns and theoretical concerns, because these questions were initially raised by practical concerns, but then they sort of took on a life of their own. And you can see that already in Carnot's work.
[15:35] So Carnot wrote this little pamphlet called Reflections on the Motive Power of Fire. And he was actually responding to some issues that were going on at the time (and this is something that his father, Lazare Carnot, had done some work on): if you've got a heat engine,
[16:00] and usually that was: you've got some kind of gas in a chamber, and you heat it up and it drives a piston, right? Is it more efficient if you use a more volatile substance? Does it matter whether you're using air or steam or, say, alcohol or ether or something like that? Are you going to get more work out of the same amount of heat?
[16:25] And this was actually a practical matter because some people were thinking, okay, yeah, let's use alcohol or ether. And you can kind of imagine what happens because these things not only expand a lot faster when they're heated up than air does, they're also highly flammable. And it's kind of dangerous to have these things around fire.
[16:47] And so one of the questions that Carnot was asking is: well, does it matter what the working substance is? Does it matter what gas you have in the piston? And he argued that, actually, if I have two heat sources at different temperatures, the maximum efficiency of an engine running between them is independent of what the stuff is that you're using as the gas.
[17:15] So that had its roots in practical concerns, but Carnot considered what we now call thermodynamically reversible processes. And a thermodynamically reversible process involves only exchanging heat between two things at the same temperature. And so what you're doing is expanding the gas very, very slowly,
[17:45] and then, when you're dumping heat out, you're compressing it very, very slowly. And of course, actual engines are not anywhere close to thermodynamically reversible, because what we're actually concerned with is not just efficiency but also power, right? How much work are we getting per unit time, right? Sure. If someone tries to sell you a car and says, okay, this has amazing gas efficiency,
[18:16] but you can only go, you know, five kilometers an hour, right? Right. We're not buying it. Right. And so, even though thermodynamics grew out of the study of the efficiency of engines, a lot of the actual theoretical work, the things you can actually prove things about, involves consideration of thermodynamically reversible processes. And,
[18:45] in the real world, there actually are no thermodynamically reversible processes. I noticed that you interviewed John Norton a while back; I'm sure he emphasized that point. Yes. Right. But we can approximate thermodynamically reversible processes. They just have to go very, very slowly, right? And with actual machines, we're not interested in things that work very, very slowly.
[19:14] So you can still have a resource theory that may be motivated by practical concerns, but you could actually be considering situations that are very far from realistic ones. Talk about how the second law assumes a certain definition of entropy. Maybe it shouldn't even be called the second law. Good.
[19:40] I'm really glad you asked me that, because I think it's the other way around. When you ask people on the street, what's the second law of thermodynamics, a lot of them will say things like what you said: that the entropy of an isolated, closed system always increases. Right. Now, the interesting thing is, if what you mean is thermodynamic entropy
[20:09] as Clausius defined it, that actually is not right, for an important reason. Even though Clausius himself, sort of tongue-in-cheek at the end of one of his papers, said we can express the first and second laws as follows. He says: if we may be permitted to talk about the total energy of the universe and the total entropy of the universe, we can express the first and second laws as:
[20:35] the total energy of the universe is constant; the total entropy of the universe strives toward a maximum. That's actually not his official statement of the second law, and there's a very good reason. You said the second law of thermodynamics presupposes a certain notion of entropy; it's actually the reverse. Clausius's definition of entropy presupposes the second law. Hmm. Okay. How so? Okay. So one way of
[21:06] expressing the second law would be: suppose I've got some kind of system, and it's easiest to imagine a gas with a piston, and it goes around in a cycle, in the sense that it comes back to the same thermodynamic state that it started in. People were thinking about these cycles because they were thinking about heat engines. So what a heat engine typically does
[21:31] is this: you've got some working substance, a gas; you heat it up and it drives the piston out, and then you either expel the substance or cool it down, and you compress it and push the piston in, and then you're ready to start again. So the engine itself is working in a cycle. Suppose I've got a gas in a box, and I can change its temperature, and I can
[22:30] I'm going to say this a little bit differently because I'm going to introduce the concept of entropy
[22:42] without using the word, and let's just see if you notice where it comes in. Okay. Okay. Okay. So imagine you've got, say, for example, a gas in a container, and there's a piston you can move around, and you can put it next to a heat source and maybe expand it and use that to raise a weight or something. And so suppose I start with it at a certain temperature and pressure and a certain volume; the piston is in a certain place. And I
[23:10] slowly expand it, in a thermodynamically reversible sense, and I'm raising a weight, and it's connected to a heat source, so heat is going into it. Okay, and then I hand it to you, Curt, and say: I want you to put it back in the original state. Okay, now, here's what Clausius's claim is:
[23:33] it's a consequence of the second law of thermodynamics that you can't do that without expelling any heat at all from the system. Now, one thing I can do, if the original process was thermodynamically reversible, is just do that original process in reverse: compress the gas back to its original volume, expelling exactly the same amount of heat into the same reservoir I got it from, at the same temperature.
[24:04] And that might be the best you can do if you only have one heat source or sink at one particular temperature. But what we really want is to not lower the weight as much as we raised it. And so Clausius says: hey, look, if you've got another heat bath at a lower temperature,
[24:32] What you can do is you can expel a smaller amount of heat at a lower temperature and get the thing back to its original state that way. So he introduced this notion of equivalence value of heat.
[24:58] Heat transferred between a system in a reversible process and a heat bath is, in a certain sense, worth more for what you want to do if it's at a lower temperature than a higher temperature. Because if I want to restore the initial state, I can either use a large quantity of heat at a high temperature or less heat at a lower temperature.
[25:23] That's what he calls the equivalence value of heat. It's a function not just of the amount of heat, but of the temperature at which it's being transferred. In fact, as Kelvin realized, I can define a temperature scale, which we call the Kelvin scale or absolute temperature scale, so that the equivalence value of a quantity of heat at a given temperature is just inversely proportional to the temperature. Just define the temperature scale that way.
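In modern textbook notation, the equivalence value, along with the cycle condition and entropy definition that Myrvold goes on to state, is standardly written as follows. This is a sketch, using the common sign convention (an assumption here) that heat absorbed by the system counts as positive:

```latex
% Equivalence value of a quantity of heat Q exchanged reversibly at
% absolute temperature T; Kelvin's scale is defined so that this is
% inversely proportional to T:
\frac{Q}{T}

% Clausius's cycle condition (the Clausius inequality): summing the
% equivalence values of all heat exchanges around any cycle,
\oint \frac{\delta Q}{T} \leq 0 ,
% with equality holding exactly for a thermodynamically reversible cycle.

% That equality is what makes the entropy difference between equilibrium
% states A and B well defined, via any reversible path connecting them:
S(B) - S(A) = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T}
```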
[25:53] Okay, so one statement of the second law is: if I take something through a cycle, and there's heat being exchanged at various temperatures, and I add up all the equivalence values of those heats, the total can't be greater than zero. It's less than or equal to zero.
[26:24] And if it's a reversible cycle, then it's equal to zero. The sum total of all those equivalence values is equal to zero. Yes. Okay. Okay. Now I can define entropy, because it's a consequence of that: if I have two different thermodynamic states and I go from one to another in a reversible process,
[26:51] the total of the equivalence values over that process is not going to depend on which reversible process I use. Suppose there's more than one reversible process that gets me from state A to state B. If I go via one reversible process and add up all the equivalence values of heat, it'll be the same as for another such process. And the argument is: if these are reversible processes,
[27:22] Let me see, so you require the second law, if my understanding is correct, in order to make thermodynamic entropy a well-defined quantity. Exactly, yeah. So the definition of thermodynamic entropy is, if I want to know the entropy difference between two states, then cook up any
[27:49] reversible process to connect those two states and just add up the equivalence values of heat transferred in that process. And it's a consequence of the second law that it doesn't matter which reversible process I use. So when someone says, I've broken the second law, can that statement even be made? Or are you saying that you have to assume it in order to define the entropy? And so how are you going to break the second law? Yeah, so if you
[28:17] if you try to express the second law as "in any process, the entropy of an isolated system never decreases," that's actually incorrect, because if you could break the second law, then you wouldn't have a well-defined thermodynamic entropy. Interesting. Yeah. So, I mean, this is a point I think a lot of people miss. It's actually a consequence. What I'm saying is,
[28:45] what I'm saying about the definition of thermodynamic entropy is not anything radical; it's standard textbook stuff. Very often what people do in textbooks is give a statement of the second law, something like Clausius's version: there is no process whose net effect is nothing other than moving heat from a cold body to a hot body. And then
[29:16] that's one version of the second law, and I expressed it without mentioning entropy. Yes. And then they say: given the second law, we can define thermodynamic entropy using the standard definition, where you just take any reversible process that connects two states and add up the equivalence values of heat along that process, and it's going to be the same. So even though a lot of people will say the second law is that the
[29:46] total entropy of a bunch of systems that's isolated is never decreasing: if you mean thermodynamic entropy, that can't actually be the second law. It can be a consequence of the second law. But if you could actually break the second law, like if I could have a process that had no other effect than to move heat from a cold body to a hot body,
[30:14] then thermodynamic entropy just wouldn't be well defined. Wayne, let me ask you something. You're extremely historically informed. Yeah. So historically speaking, how did people think about entropy? What I mean is, we can make an analogy with heat. Heat we now think of as having to do with the motion of molecules; it's something about the motion of molecules. Now, temperature is also something about the motion of molecules. But 100 or 200 years ago, heat was thought of as some form of fluid.
[30:43] Okay, so how did people used to think about entropy? Did they think of it as an abstract quantity, like temperature? Did they think of it like a fluid? Did they think of it like something else? What was their mental model for entropy? That's a very good question. Because as a matter of fact, Carnot, when he wrote his book, was thinking of heat as a fluid they called the caloric, which was conserved and flowed between bodies. And
[31:11] even Kelvin, when he was writing his first papers on what we now call the Kelvin scale, was thinking of it that way. But what happened was that shortly after that, Joule did his experiments on what people called the mechanical equivalent of heat. The basic idea is: if I do a certain amount of work,
[31:40] I can generate a certain amount of heat. And you can measure the work in terms of, say, foot-pounds, and you measure the heat using a calorimeter: how much will this warm a given sample of water? And Joule determined that there's a mechanical equivalent of heat, an equivalence between work, measured in mechanical terms like foot-pounds and things like that,
[32:09] and heat, measured in calories, and you can convert between them. And also, there's no limit to how much heat you can generate if you just do enough work.
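The mechanical equivalent of heat is, in modern terms, just a unit conversion. As a rough illustrative sketch (the constants use modern values, and the function name is mine, not from the conversation):

```python
# Joule's "mechanical equivalent of heat" as a unit conversion,
# using modern values for the units involved.

FOOT_POUND_IN_JOULES = 1.3558179  # energy of one foot-pound of work
CALORIE_IN_JOULES = 4.184         # thermochemical calorie

def foot_pounds_to_calories(work_ft_lb: float) -> float:
    """Heat (in calories) equivalent to a given amount of mechanical work."""
    return work_ft_lb * FOOT_POUND_IN_JOULES / CALORIE_IN_JOULES

# Joule's classic result was that roughly 772 foot-pounds of work
# warm one pound of water by one degree Fahrenheit; with modern unit
# values that works out to about 250 thermochemical calories.
print(round(foot_pounds_to_calories(772.0)))  # 250
```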
[32:21] A precursor to that was Count Rumford doing experiments with grinding cannon bores. Not cannonballs, cannon bores. Like if you're grinding away... Oh, okay. I don't know what that is. What's a cannon bore? Okay, so how do you make a cannon? I don't know how people make cannons these days, but the way people made cannons back then is you would make a cylinder of iron or steel and then drill a hole in it.
[32:50] So the cannon bore is the path that the projectile travels down. And unsurprisingly, if you're drilling away at a piece of metal, it gets hot, right? So the process involved, I think, horses driving this drill bit.
[33:14] Bore. Yeah. Yeah. Yeah. Horses driving this drill bit, and everything got hot, so you had to cool it off with water. And Count Rumford did experiments, and he got convinced that if you've got enough horsepower, there's no limit to how much heat you can generate. And that didn't fit well with the idea that heat is this caloric fluid that you're squeezing out of the substance,
[33:41] because if there's a finite amount of heat in any given substance, eventually you'd think you would run out and not be able to generate any more heat. Right. Right. Like you can't sweat infinitely. Yeah, exactly. You can't sweat it out. Perfect. That's a wonderful analogy. Yes. Eventually you get dehydrated. Right. And so, yeah,
[34:05] so what happened was that people, largely due to Joule's experiments on the mechanical equivalent of heat, actually became convinced that heat was a form of energy, similar to the mechanical energy of things moving around. And that became known as the kinetic theory of heat. And along with that
[34:27] comes this picture of gases, for example, being full of molecules bouncing around, and when they're hotter, they're moving faster. Now, the interesting thing is that even though Carnot, when he initiated what we now call the theory of thermodynamics, and Kelvin, in his very first papers on this, were thinking of heat as this conserved fluid,
[34:56] Very quickly, the people who are working on thermodynamics got converted to the kinetic theory of heat. And so a lot of the same people who are developing what we now call thermodynamics were also working on the mechanical theory, the kinetic theory of heat, or Clausius had called it the mechanical theory of heat. And so Clausius was thinking of heat as involving molecules bouncing around. And so that
[35:23] raises the question of how entropy might actually be realized in terms of what's going on with the molecules. And Clausius had some ideas about that, which are mostly forgotten because none of them were very satisfactory. But the interesting thing is that I can develop the science of thermodynamics
[35:48] independently of the molecular hypothesis. Like, I can talk about work being done at the macroscopic level and heat being exchanged, without really being committed to what's happening at the microphysical level. Yes. And so Maxwell was one of these people who was at the same time participating in the development of what we now call thermodynamics and participating in the development of what we now call statistical mechanics.
[36:18] And of course, these are very often together in the same textbook these days. Maxwell said: well, thermodynamics is the study of the thermal and dynamical properties of matter, without any hypotheses as to the molecular constitution of matter. So according to Maxwell, as long as I'm doing thermodynamics, I should be kind of neutral about whether or not matter is composed of molecules. I can just talk about
[36:49] heat and work as different kinds of energy exchange, without being committed to how that's realized in the microphysics. And some thermodynamics textbooks actually do that. Or actually, sometimes you have a thermodynamics course and then a statistical mechanics course, and the thermodynamics course can be completely independent of talk about molecules. Yes.
[37:18] But in other books, and these usually have the title of thermal physics rather than thermodynamics, the two go hand in hand. I actually think there's something to be said for the kind of old-fashioned way of thinking about it that's still in some textbooks: let's treat thermodynamics as
[37:43] the science of the exchange of heat and energy, and try to express the basic principles of thermodynamics independently of any particular theory about the molecular structure of matter. And then you can say: okay, once we acknowledge that matter has a molecular structure, how does that have to be modified? Okay. Why do you prefer that?
[38:13] because it highlights a difference between two different forms of the second law of thermodynamics. And here's why. So one consequence of the second law of thermodynamics is if I have a heat engine
[38:38] operating between two heat reservoirs, things at different temperatures, there's a maximum efficiency. So the efficiency is: if I pull out a quantity of heat, how much work can I get out? What you want to do is get as much work out as possible and then dump as small a quantity of heat as possible back into the lower temperature thing. And one consequence of the second law of thermodynamics is, given any two temperatures,
[39:06] there's a maximum efficiency of a heat engine operating between a heat source and sink at those temperatures. Okay. And that's known as the Carnot efficiency, the Carnot bound on efficiency. Okay. Now Maxwell, I think, was the first. And so one way of stating the second law of thermodynamics is: no heat engine is going to have an efficiency that exceeds the Carnot bound. All right.
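The Carnot bound mentioned here is a one-line formula. A minimal sketch in Python (illustrative only; the function name and example temperatures are my own):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of the heat drawn from the hot reservoir
    that any engine can convert to work, for reservoirs at absolute
    temperatures t_hot > t_cold (kelvin): eta = 1 - t_cold / t_hot."""
    if not (t_hot > t_cold > 0):
        raise ValueError("need t_hot > t_cold > 0 (absolute temperatures)")
    return 1.0 - t_cold / t_hot

# An engine running between 500 K and 300 K can convert at most
# 40% of the heat it draws into work; the rest goes to the sink.
eta = carnot_efficiency(500.0, 300.0)  # 0.4
```

Note that only the ratio of absolute temperatures matters, which is why the bound is the same for any working substance.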
[39:36] Okay, as Maxwell was the first to articulate, clearly, if the molecular kinetic theory of heat is right, that actually can't be strictly true. Why? Yeah, that's of course the right question, why?
[39:56] Um, because if the molecular theory, the kinetic theory, is right, there's going to be a certain kind of unpredictability about how much work you're going to actually get, because these molecules are bouncing around more or less at random. And the pressure of the gas on the piston is a matter of the molecules hitting the piston and bouncing off. And
[40:24] on a fine enough scale, that force per unit area is going to be fluctuating, because the molecules are bouncing around more or less at random. Yes. And it could happen that you just happen to get lucky: during the time you're expanding the piston, more molecules than usual, or maybe with a higher average energy than usual, happen to hit the piston and you get more work out than you would have expected. Right. Right.
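One reason these fluctuations never troubled 19th-century engineers is that their relative size shrinks like 1/sqrt(N). A rough sketch of that scaling (my own illustration, not anything stated in the conversation):

```python
import math

def relative_fluctuation(n_molecules):
    """For n molecules each independently found on one side of a box
    with probability 1/2, the standard deviation of the count on that
    side is sqrt(n)/2; relative to the mean n/2 that is 1/sqrt(n)."""
    return 1.0 / math.sqrt(n_molecules)

# 100 molecules: ~10% fluctuations, easily noticeable.
# A mole of gas (~6e23 molecules): ~1e-12, utterly negligible in bulk.
small = relative_fluctuation(100)
bulk = relative_fluctuation(6.0e23)
```

So at laboratory scales the pressure looks perfectly steady, even though at the molecular scale it never is.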
[40:53] If you say to a physicist that, okay, that shows that the second law of thermodynamics can't be strictly true, they'll go nonsense, right? Because what you can't do is completely rely on that, right? Because it could happen that I get less than the Carnot efficiency, right? Because I'm- Yes. And so what physicists these days accept is, okay,
[41:21] On a fine enough scale there's going to be these fluctuations in the amount of work you get, but if you do it again and again and again, on average you're not going to be reliably able to exceed it. It's a statistical law. It's a statistical law, yes. Perfect, yes. And so why I'm so
[41:42] flummoxed as to why you like the thermodynamics one is because I personally don't like the thermodynamics view. I very much like statistical physics, but dislike thermodynamics. Okay. And so even the definition of entropy as a weighted sum of logs of probabilities, that makes intuitive sense to me; I can derive something with that. And I can make sense of picturing billiard balls bouncing around. But
[42:08] I would do something dangerous by accident.
[42:30] I don't build things with my hands, but I do actually have heat engines that other people have built on my shelf back there.
[43:01] Thermodynamic entropy as Clausius defined it is totally independent of any hypothesis of a molecular structure, and it presupposes for its definition the second law of thermodynamics in one of its formulations. So if the second law of thermodynamics is not right, then
[43:27] there is no such quantity as entropy as Clausius defined it, because it's just not going to be true anymore that it doesn't matter which reversible process you pick to go from A to B to define your entropy. Okay, here's why I think it's important to make clear that that's what thermodynamic entropy as Clausius defined it is: because
[43:56] It helps us realize that when you now make the move to statistical mechanics, the second law of thermodynamics as originally conceived actually isn't quite right. And it has to be replaced by, as you say, a statistical law. Yes. And it's this statistical version, not the original version of the second law of thermodynamics, that physicists these days accept?
[44:24] And then you use words like tendency. Yeah, right. And Maxwell himself said, and I think he was the first one to express it this way, the second law of thermodynamics is a statistical regularity.
[44:42] And this is at a time, this is the middle of the 19th century, or actually he said this in 1878. But this is a time when people are getting really impressed by statistical regularities, because it was early in the 19th century that people really started gathering statistics about populations and noticing that there were these regularities, say the number of murders per capita in Paris year after year. And,
[45:12] that's interesting. That's interesting because these are statistical regularities that you can depend on year after year; they are averages over aggregates of individually unpredictable events, right? Like, you know, if people could predict exactly when and where a murder would take place, then it wouldn't take place. Right. Yeah. So, um,
[45:42] Yeah, so this is Maxwell saying the second law really is just a statistical regularity and it has to do with, it's similar to the statistical regularities that the statisticians who are out there gathering data about populations are doing. And he actually gave a talk on molecules to the British Association for the Advancement of Science
[46:12] which had only recently created a section for statistics, and the hard scientists were kind of looking down on the social sciences. And he gave this talk on molecules saying, we the physicists have adopted methods from the statisticians, because we're taking averages over large numbers of things. So, interesting. Yeah. So the version of the second law that
[46:43] Most people accept these days as some kind of probabilistic or statistically qualified thing.
[46:51] The way that Szilard put it in the 1920s is: someone who is trying to exceed the Carnot bound of efficiency is kind of like a gambler who's trying to break the bank at a casino. You might have occasional wins and losses, but in no way are you, on average, going to be able to reliably win.
[47:15] Yeah, and there are, you know, theorems coming from probability theory about the impossibility of doing that, because the expectation value of your winnings is always negative if the casino is doing things right. And the law of large numbers says that your winnings per game are, with high probability, going to get closer and closer to the expectation value. So the second law should be thought of as a similar thing.
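Szilard's casino analogy is easy to simulate: a game with negative expectation allows occasional lucky runs, but the per-game average converges to the (negative) expectation value, exactly as the law of large numbers says. A toy simulation (the win probability and stake are made up for illustration):

```python
import random

def average_winnings(n_games, p_win=0.48, stake=1.0, seed=0):
    """Per-game average winnings after n_games of a fixed-stake game
    with win probability p_win. Expectation per game is
    stake * (2 * p_win - 1), which is negative for p_win < 0.5."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_games):
        total += stake if rng.random() < p_win else -stake
    return total / n_games

# Short runs can come out ahead; long runs hug the expectation (-0.04).
short_run = average_winnings(10)
long_run = average_winnings(200_000)
```

The analogy to the second law: a single fluctuation can beat the Carnot bound, but no strategy beats it reliably on average.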
[47:46] This actually brings in a connection with information if you're still thinking about thermodynamics as some kind of a resource theory. Suppose you've got a fluctuating pressure on your piston. If you knew when those fluctuations were going to happen,
[48:11] If you could reliably say, I'm going to pull the piston out only when the pressure is momentarily higher than average, then you could violate even the statistical version of the second law. And think of it this way: rather than a piston, imagine you've got a box with a gas in it, and there's a partition down the middle and a little hole in the middle so the gas can go through, right?
[48:36] Okay, so there's going to be continual small fluctuations in the number of molecules on each side as they go back and forth, and there's going to be continuous small fluctuations in the pressure. And suppose it's been doing this for a while and then we close the hole, and you're fairly certain that the pressure is greater on one side than the other, right? If you know which side the pressure is greater on,
[49:03] then you could exploit that to get a little bit of work out. And if you could do this again and again and reliably know where the greater pressure was, you could violate even a statistical version of the second law. Okay, so let's go back to the question. So suppose you've got a box of gas in front of you
[49:29] and it initially starts out with the same pressure on both sides, and then it fluctuates a bit and there's a higher pressure on one side than the other, and you close the hole so it's stuck like that. Has the entropy decreased? Well, if you do the standard calculations of entropy, it doesn't matter which side the pressure is higher on,
[49:56] because you can just do the standard calculation, and if you've got a box with higher pressure on this side and lower pressure on that side, it has a lower entropy than a gas with the same pressure on both sides, right? And so if you're thinking of entropy as something that's supposed to be just, you know, a property of the gas itself,
[50:20] Then, yeah, the entropy is lower. It has decreased, right? And you will say that even when you... Sorry, you mean the property of the physical system, not the gas, like not the molecules, individual molecules of the gas, but the physical system, the whole system. The physical system is the whole gas, you know, the whole thing. Yes, okay. So the physical system is the gas in the box, right? Got it. And
[50:48] Standard calculation: if I tell you here's a box with two chambers, and let's say it's the same temperature on both sides, and you had this much gas at this pressure on this side and this much gas at that pressure on that side, then the maximum of entropy is when the two pressures are the same. A standard calculation that you learn to do in your intro thermodynamics courses. Okay.
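That standard calculation can be sketched in a few lines: for an ideal gas at fixed temperature split between two equal chambers, the configurational entropy (in units of k_B, with constant terms dropped) is largest for the even split. Illustrative only; the function and molecule counts are my own:

```python
import math

def two_chamber_entropy(n_left, n_right, v_each=1.0):
    """Configurational entropy, in units of k_B and up to additive
    constants, of an ideal gas at fixed temperature with n_left and
    n_right molecules in two chambers of equal volume v_each:
    S = sum over chambers of n * ln(v_each / n)."""
    s = 0.0
    for n in (n_left, n_right):
        if n > 0:
            s += n * math.log(v_each / n)
    return s

# For 1000 molecules, the even 500/500 split has the highest entropy;
# any pressure imbalance lowers it, and larger imbalances lower it more.
even = two_chamber_entropy(500, 500)
uneven = two_chamber_entropy(600, 400)
extreme = two_chamber_entropy(900, 100)
```

So on this calculation, a spontaneous fluctuation toward unequal pressure really is a state of lower entropy.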
[51:16] Okay, so when you have a spontaneous fluctuation, one thing you can say is, well, look, what's happening is the entropy is actually spontaneously fluctuating around some kind of mean value, right? And so there are actually spontaneous decreases of entropy. However, here's another way to phrase the question.
[51:44] I'm going to phrase it not in terms of entropy but in terms of what Kelvin and Maxwell called available energy. So here's a question about available energy. Available energy is: imagine you've got some physical system in front of you and you've got a heat bath at some fixed temperature, and I task you with taking it in the state it's in and
[52:14] trying to get as much work out of it as you can, but I'm going to specify the state you have to leave the system in at the end. And the available energy is a measure of how much work you can get out. And it's equivalent to what we now call the Helmholtz free energy, which is total internal energy minus temperature times the entropy. If a gas
[52:42] spontaneously fluctuates to a situation where there's more pressure on one side and we close the hole, has the available energy increased? I'd say yes. You'd say yes, you can now use it to do something. Okay, so how are you going to use it to do something? If all the molecules are now on one side, or just more of them? Yeah, okay, more. Yeah, then you can place something here and it will spontaneously start to push this guy
[53:12] to the left. Which side are you going to do that on? What do you mean? So if all the molecules are on one side, then you can just have a piston on that side. So yes. So I have the gas and, you know, let's just say all the molecules are on one side. Okay. So they're either all on the left side or on the right side. Yeah. Okay. So what do you do to get work out of it?
[53:41] You place the piston at the middle point that divides it and then you watch the piston go. Right. So suppose you want to raise a... so if I just let the piston go, then I haven't gotten any useful work. Well, suppose I want to raise a weight. Uh-huh. What do I do? I don't know. Well, here's the thing. If I hook the thing up, if I put in a piston, attach your weight to the piston,
[54:10] then, and I want to raise the weight. So say I've got the piston, I've got a string and a pulley, and you've got a weight that can go up or down, right? You know, that string and pulley can be either on the left side or the right side of the piston. And I don't know which side of the box the gas is in. Oh, sorry, I didn't realize that you don't know which side. Yeah, all I said was it's either on the left side or the right side, right?
[54:39] I see. Okay. Okay. So right. If you don't know, then you, you know, if you guess right, you might say, okay, I'm going to guess. Right. And I'm going to put the piston, I'm going to put the weight on this side of the piston and you could end up raising it. Right. But if you guess wrong, you could end up lowering the weight. Yes. Yes. So this is why, according to some people, it makes sense.
[55:08] to have entropy be a function not only of, like, the physical state of the system, but of what somebody knows about it. Because if entropy is supposed to have the connection with available energy that I just said, that is, if available energy has to do with how much work you can get out of a system, that depends not only on the physical state of the system, but
[55:39] your means of manipulating the system available to you and what you know about it. And what I just said is non-controversial. If the question is, how much work can I get out of a system? That depends on my means of manipulating the system and what I know about the system. This is a non-controversial thing.
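The available energy just defined, the Helmholtz free energy A = U - TS, is itself a one-line formula. A trivial sketch (the units and numbers are arbitrary placeholders):

```python
def available_energy(internal_energy, temperature, entropy):
    """Helmholtz free energy A = U - T*S: for a system coupled to a
    heat bath at this temperature, an upper bound on the work you can
    extract (with the final state specified, as in the task above)."""
    return internal_energy - temperature * entropy

# E.g. U = 100 J, T = 300 K, S = 0.2 J/K gives A = 40 J.
a = available_energy(100.0, 300.0, 0.2)  # 40.0
```

The formula makes the point in the text vivid: anything that changes the entropy term, including what you know about the system, changes how much work is available.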
[56:03] And if you want a notion of entropy to have this connection to available energy, then it makes sense to have a notion of entropy which is relative to the means of manipulation available and a state of information about the system. And in my experience, I can say that to people who initially answer the question,
[56:32] Oh, of course not. Entropy is patently a property of the physical system alone. It's got nothing to do with what you might know about it. If I say, oh, well, if I think of thermodynamics as a resource theory, a theory about what agents with certain goals and certain means of manipulation can do with systems, and I want this notion of available energy
[57:02] to be a measure of how much work you can get out of a system, then clearly available energy can be relative to means of manipulation and knowledge about the system. It matters if you know, because if you have to do different things to the system to get work out of it depending on which side of the box the molecules are on, then information is a resource, right? And
[57:29] And so there's a perfectly acceptable notion, and it doesn't matter whether you call it entropy or not, but there's this notion that has that connection to available energy. And what are we going to call it? Well, if you don't like to call that entropy, make up a new word for it. But it is actually very closely related to the
[57:58] concept that Clausius coined the term entropy for. And then the difference between this and the Clausius notion of entropy is in traditional thermodynamics, you know, thermodynamics 1.0, what people were doing in the 19th century, they always assumed that even if
[58:21] Matter is made up of lots of molecules and there's these fluctuations at the molecular level. We're dealing with them in bulk and any fluctuations are going to be negligible. Things might as well be predictable. And so we can just assume we know what's going to happen as a result of our manipulations. When you start getting down to the molecular level, and this is what the people who are working quantum thermodynamics are doing, they're saying, okay, you know, we're working at a level where
[58:49] These molecular fluctuations aren't negligible. And, you know, what you really want for a notion of entropy is something that's relative, say, to certain physical characteristics, but also, you know, it might be in this or this or this state.
[59:09] And what I can do with it, how it's going to respond to my manipulations, can depend on what state it is in. So it can actually be relative to some, say, probability distribution over possible states. And then what you get is a kind of thermodynamics as a sort of statistical average.
[59:39] Now what happens in textbooks these days is, even in statistical mechanics textbooks, there are textbooks that basically take that kind of approach, and whether you notice it or not, entropy is actually defined in terms of a probability distribution which can represent a state of information about the system. And then there's the other way of doing it, and this is what you were describing.
[60:07] So that is what's often called Boltzmann entropy.
[60:35] And the other one, where entropy is defined in terms of a probability distribution, is often called Gibbs entropy. And they're both perfectly good concepts. But they have different uses. They're different things and they have different uses. So if I give you a box and say, with probability
[60:58] one half all the molecules are on this side and with probability one half on the other side, that will have a certain Gibbs entropy associated with it, which will have something to do with what you can do with it, what work you can get out of it. And that will vary with what those probabilities are. It's more like, if
[61:28] Yes, right.
[61:58] You know, if it's either on this side or on that side, you calculate the Boltzmann entropy. If it's on this side, you calculate the Boltzmann entropy; if it's on the other side, it's the same. I see. And that's also correct. Right. So the Boltzmann entropy doesn't depend on what you know about the system. The Gibbs entropy does. They serve different purposes because they're different concepts.
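The distinction can be made concrete. In the "left or right" example, the Gibbs entropy of your 50/50 state of information equals k ln 2 and drops to zero the moment you learn which side the gas is on, while the Boltzmann entropy of either macrostate is unaffected by your knowledge. A sketch (my own toy setup):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates):
    """S = k ln W: a property of the macrostate (how many microstates
    it spans), independent of anyone's state of information."""
    return k_B * math.log(n_microstates)

def gibbs_entropy(probabilities):
    """S = -k * sum p ln p over a probability distribution, which can
    represent a state of information about which state the system is in."""
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# A 50/50 "left or right" distribution carries k ln 2 of Gibbs entropy;
# once you learn the side, gibbs_entropy([1.0]) is zero, while the
# Boltzmann entropy of the macrostates themselves is unchanged.
uncertain = gibbs_entropy([0.5, 0.5])
certain = gibbs_entropy([1.0])
```

Notice that the two functions take different kinds of argument, which is the point being made: one is a property of a macrostate, the other of a probability distribution.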
[62:25] And what happens when people get in arguments about whether or not it makes sense to, when people get in arguments about whether or not it makes sense for entropy to be relative to a state of information, they have in mind different concepts of entropy which are perfectly well defined but for different purposes.
[63:19] OK, when I said probability here, the person who's listening may be thinking probability of what and then what we didn't say much. Maybe you mentioned it once or twice, but not much as microstate versus macrostate. So it's the probability of a certain macrostate. What is a macrostate? It's seen as a count of microstates. What is a microstate? Well, when people say what is the physical system, most of the time on this channel, when we're speaking about quote unquote fundamental physics, we're thinking of a microstate.
[63:49] So a macro state is then what? Like what the heck defines a macro state? Is it just us as people? We say this is something we care about more. So we're going to call this a macro state. Yeah. So that's a good question. And in fact, you'll find different answers in different textbooks because the people who want entropy
[64:16] statistical mechanical entropy to be a property of the system by itself, they usually mean Boltzmann entropy. Right. But for the Boltzmann entropy, the first step you do is you partition the set of possible microstates into macrostates, and you say, whatever microstate it's in, it's going to be in some macrostate, and some macrostates
[64:44] correspond to a bigger range of possible microstates than others, and the macrostates which correspond to a bigger range of microstates have higher entropy than the ones that correspond to a narrower range of microstates. And so the entropy does change with microstate, because if the microstate changes
[65:10] within a macrostate, the entropy doesn't change, but if it crosses from one macrostate to another, then the Boltzmann entropy changes. But the entropy isn't a property of the microstate alone, because it requires this division into macrostates, which isn't there in the fundamental physics. Yes. So technically speaking, any given microstate has entropy zero. Well, if you're talking about Boltzmann entropy, right?
[65:40] Then in order to define Boltzmann entropy, I first have to partition the possible states into macrostates, right? Right. Okay. However, if I'm going to do a really, really fine partition, right? Say my partition is, I'm going to tell you every microstate is in a different element of the partition. I'm going to tell you exactly what the microstate is, right?
[66:08] Then yeah, then every one of those microstates will have zero entropy, right? Yeah. But that would be kind of useless. That would be totally useless, right? So I think that even the people who are saying no, entropy can't depend on us, it can't depend on what we know about it, it can't depend on how we can manipulate it, if what they're using as a notion of entropy is Boltzmann entropy,
[66:32] it starts with a division, dividing up the set of possible states into macrostates. And you asked yesterday the question, well, what is a macrostate? Okay. Now, one thing that people often say is, well, look,
[66:53] there are certain variables that we're going to be measuring, macro variables, and our measuring instruments are going to have a certain range of precision, and a macrostate is a set of microstates that are indistinguishable according to the measurements that we're going to do. And then it's not there in the fundamental physics because it's relative to some set of
[67:18] measurements and, you know, some set of instruments, some set of measurement precisions. And I think that that's perfectly fine. And then saying, okay, well, we're not doing fundamental physics when we're talking about entropy, I think that's perfectly fine. It bothers people because
[67:44] entropy increase is supposed to be one of the fundamental laws of the universe, and it's not supposed to depend on things that aren't there in the fundamental physics. But, you know, that just might be the right answer. Right. Like, thermodynamics is not a fundamental theory. Right. Now, another thing you could say is, well, what really matters is, you know, if I'm thinking about this as a resource theory,
[68:11] a distinction between macrostates, well, you know, I'm going to pay attention to distinctions if they make a difference to what I can do with them, and not if they don't. So if I tell you how many molecules are on one side of the box and how many molecules are on the other, okay, that's really useful to know, because
[68:41] I can use that to expand one side or another. That's good to know. If all I've got is a piston that can expand the things in bulk and I don't have the means to manipulate things at the micro level, you tell me anything beyond that. You tell me the exact microstate, it doesn't affect what I can do with it. It doesn't affect what I can get out of it.
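The partition-dependence of Boltzmann entropy described above can be shown in a few lines: the same microstate gets a different entropy depending on how coarsely you carve up the state space, and the finest partition assigns every microstate zero entropy. A toy sketch (eight discrete "microstates", entirely made up):

```python
import math

def boltzmann_entropy_of(microstate, partition):
    """Boltzmann entropy (in units of k_B) of the macrostate containing
    the given microstate, for a chosen partition of the state set.
    The value depends on the partition, not on the microstate alone."""
    for cell in partition:
        if microstate in cell:
            return math.log(len(cell))
    raise ValueError("microstate not covered by the partition")

states = list(range(8))
coarse = [set(range(4)), set(range(4, 8))]  # two macrostates of 4 states
finest = [{s} for s in states]              # every microstate its own cell

# Same microstate, different partitions: ln 4 under the coarse one,
# 0 under the finest (and, as the text says, useless) one.
s_coarse = boltzmann_entropy_of(2, coarse)
s_finest = boltzmann_entropy_of(2, finest)
```

A useful partition, on the resource-theory view, is one whose cells track distinctions you can actually exploit.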
[69:10] And this is something that a lot of people misunderstand about Maxwell's demon example. The demon example was meant to illustrate the dependence of thermodynamic concepts like entropy on means of manipulation available. So in the
[69:35] The first appearance of what we now call Maxwell's Demon was a letter from Maxwell to his friend, Peter Guthrie Tate, who was writing a sketch of thermodynamics. And Maxwell said, you know, you might want to pick a hole in the second law because he's saying the second law, you know, if the kinetic theory of gas is true, needs to be modified.
[70:02] And, you know, imagine some, you know, little being that could manipulate molecules individually, or, you know, imagine that he's got a little trap door in between the compartments of the... I'll place a video on screen about this. Yeah, imagine that you've got gas in a box
[70:25] divided into two compartments, and there's a little trap door, and the demon can manipulate the door and let the faster molecules go one way and the slower molecules go another way. Well, that demon could, without expenditure of work, or with minimal expenditure of work, create big pressure and temperature differences that we as macroscopic beings could then exploit. And the moral of the story, according to Maxwell, is that
[70:55] The second law of thermodynamics is a statistical generalization, which is applicable only to situations where you're dealing with large numbers of molecules in bulk. And when he says statistical generalization, he's expecting his readers to be familiar with the sorts of statistical generalizations that the social sciences are coming up with, things like numbers of murders per capita per year.
[71:23] And if you think about it, there actually is a nice analogy. So if you keep sort of the macro level conditions the same, the broad scale socioeconomic conditions the same, then plausibly you're going to get a fairly stable number of murders per capita per year if you're in a given situation. But if you could
[71:51] you imagine a team of psychologists going in and talking to people, if they had the ability to identify people who were at risk for committing murder or something like that and talk to them and deal with them, you know, if they could deal with the people on an individual scale, then you might be able to change that number of murders per capita. Right. And so what
[72:20] Maxwell is saying is, this demon would be able to do what is at present impossible for us, because we do not have the ability to manipulate things at the molecular level. The way he put it made it clear that he didn't think there's any fundamental law of physics that would prevent further technological developments from getting to the point where we could do this.
[72:50] Yes. Right. Now, as a matter of fact, he didn't really see this. But if you now include the demon, make the demon operate in a cycle. So the demon, whatever it does, has to reset itself at the end of each iteration of whatever it's doing.
[73:13] Then that actually is a consequence of the laws of physics, both classical and quantum, that on average the demon can't break the second law of thermodynamics. In the classical case, you know, you take the demon plus the whole system as an isolated system. Yes. Yes. If the demon can
[73:41] operate in a cycle while reliably putting all the molecules in the left side of the box, that is incompatible with Hamiltonian evolution, which conserves phase space volume. Like, you'd actually be able to reduce the volume of phase space occupied by the system.
[74:07] And there's something similar in quantum mechanics: if you've got the whole system undergoing isolated evolution, then, you know, you can't take something that's initially spread out over a big subspace of Hilbert space and put it into a small subspace.
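The classical phase-space-volume point can be checked numerically: a Hamiltonian time step has Jacobian determinant 1 (it preserves phase-space area), while a step with friction has determinant less than 1 (it shrinks it). A sketch for a one-dimensional harmonic oscillator (the step size and friction coefficient are arbitrary choices of mine):

```python
def jac_det(f, q, p, h=1e-6):
    """Finite-difference Jacobian determinant of a 2D map f(q, p) -> (q', p').
    Determinant 1 means the map preserves phase-space area."""
    q_qp, p_qp = f(q + h, p)
    q_qm, p_qm = f(q - h, p)
    q_pp, p_pp = f(q, p + h)
    q_pm, p_pm = f(q, p - h)
    dq_dq, dp_dq = (q_qp - q_qm) / (2 * h), (p_qp - p_qm) / (2 * h)
    dq_dp, dp_dp = (q_pp - q_pm) / (2 * h), (p_pp - p_pm) / (2 * h)
    return dq_dq * dp_dp - dq_dp * dp_dq

DT = 0.1  # time step

def hamiltonian_step(q, p):
    """Symplectic Euler step for a unit-mass, unit-frequency oscillator:
    area-preserving, as Hamiltonian evolution must be."""
    p_new = p - DT * q
    return q + DT * p_new, p_new

def damped_step(q, p, gamma=0.05):
    """Same step with friction: shrinks phase-space area by (1 - gamma),
    which is only possible because this system is not isolated."""
    p_new = (1 - gamma) * (p - DT * q)
    return q + DT * p_new, p_new
```

The damped map is the pendulum-with-friction case discussed below: the shrinkage is bookkeeping for degrees of freedom (the friction medium) that were left out of the description.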
[74:31] And so, actually, Maxwell didn't realize this, but there are theorems, in both classical and quantum mechanics, to the effect that if you require the demon to act in a cycle, the demon cannot reliably and continually do this. Precisely what do you mean when you say that the demon acts in a cycle? The demon has to end up in the same physical state it started out with. Okay, why do you have to do that?
[75:00] I imagine that, look, if the demon has a brain, right, and is opening and closing this door, then the brain changes. Yeah. Yeah, right. So here's what people were thinking. And, you know, you're right to talk about the brain, and people have given simple models of this as, like, not a creature with a brain, but maybe a little device with, like, a memory device or something like that. Yeah. Yeah. So that
[75:28] the idea is that if the demon has some kind of memory storage, and it always remembers what it did on previous iterations and never ever erases anything, eventually it's just going to run out of memory. So it can't keep on doing this forever and ever and ever.
[75:58] And if it has to act in a cycle, if it has to eventually erase the memory, then there's actually an entropy cost associated with erasing the memory, and that's sometimes known as Landauer's Principle.
[76:19] Yeah. And it really is just basically a consequence of what I just said, that if you require the demon to act in a cycle, then it can't consistently or reliably violate the second law. So if I don't require the demon to act in a cycle, then yeah, what it can do is,
[76:49] okay, think about that blank memory as a resource. And it's doing this, and eventually it uses up that resource and hands you this box with a higher pressure on one side than the other and says, okay, good, you know, now you can use that to raise a piston or something like that. Okay, all you did was you took a resource and you converted it to another resource. You didn't actually violate the second law. Yes, yes, I see. Okay. So you actually have to think about that memory reserve, that blank memory reserve, as having an
[77:19] entropy of its own, so a memory which is just blank or maybe full of all zeros on this view has a lower entropy than a memory that's randomly populated by ones and zeros.
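Landauer's principle puts a number on that erasure cost: resetting n bits of memory at temperature T must dissipate at least n k_B T ln 2 of heat. A one-line sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost(temperature, n_bits=1):
    """Minimum heat (in joules) that must be dissipated into the
    environment to erase n_bits of memory at the given temperature:
    n * k_B * T * ln 2 (Landauer's principle)."""
    return n_bits * k_B * temperature * math.log(2)

# Erasing one bit at room temperature (300 K) costs roughly 3e-21 J,
# which is why the cost is invisible in everyday computing.
cost = landauer_cost(300.0)
```

The number is tiny per bit, but it is exactly what balances the books when the demon finally has to reset its memory.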
[77:35] Okay, so let me see if I got this. There are two cases: either it operates on a cycle or it doesn't. If it doesn't, it's going to use up a resource, in which case you still have a resource theory. If it does operate on a cycle, then fundamentally you'll be shrinking phase space. Now I know that most physical systems shrink phase space because there's some friction and so on, but at a fundamental level you don't shrink phase space. Sorry, you don't shrink the volume you initially started with in phase space. Absolutely, yeah. So when you've got a dissipative system, like something with friction, right?
[78:04] Then you write down equations of motion and, you know, everything in this original region of phase space will go to that, right? But that's because we're not actually thinking of that system as an isolated system; like, it's in contact with something that's a source of friction, right? And if you include everything, like, you know, the
[78:26] pendulum that's going back and forth, and whatever medium it is that's the source of friction, and you think of all that as undergoing, you know, isolated evolution, and you think, okay, ordinary Hamiltonian dynamics is going to apply, then that system as a whole is not going to shrink phase space. What's happening is, as the pendulum is damped, it goes and, you know...
[78:54] Okay. Well, you're an expert in quantum mechanics and quantum field theory.
[79:19] And I'd like to talk to you about that next time in person, because you live actually close by. So hopefully we get to meet up shortly. For my colleagues in Europe, London, Ontario and Toronto here in Ontario count as close. My colleagues in the Netherlands always find it funny when I say things like that. So there's a Heisenberg cut.
[79:40] That's a good question.
[80:05] I will really have to think about that. I sort of see what you're getting at, whether prima facie there might be a connection, but I'm not seeing exactly what the connection might be. And it's not obviously wrong. So I would have to think about that. Yeah, there might be actually. Yeah. My second funny question is: is the universe an isolated system?
[80:34] Is the universe an isolated system? Can we even talk about the universe as a whole? Presumably yes — if you literally mean the universe is everything there is, then... It seems tautologically the case. However, another question is: does the universe as a whole obey the sorts of laws that we usually think of as
[81:05] applicable to isolated systems. And here's why this is a genuine question. It might be that for relatively small systems that we can actually isolate, the physics we apply to isolated systems applies, but when things are big enough that doesn't actually apply. And what I mean by that is in quantum mechanics, what usually
[81:33] When you're asking whether the universe is an isolated system, what you usually mean is: does it evolve according to whatever the appropriate analog of the Schrödinger equation is? That can be represented by a family of unitary operators, and that preserves the Hilbert space norm. And so it can't start out in, say, a small subspace of Hilbert space and spread into a bigger one, right?
[82:01] But people who take dynamical collapse theories seriously think that this isolated Schrödinger evolution — which we apply to systems in the lab that we isolate — is actually an idealization and not quite right, and that if you have systems that meet certain criteria, either they have enough particles or they involve displacements of large masses,
[82:29] Then actually the physical law is a different one that isn't the law that we usually think of as isolated. And in fact, mathematically mimics the sort of laws that we use for systems that aren't isolated. As you know, because I know you've talked to people with this, there are dynamic collapse theories, right, which modify the usual Schrodinger equation.
[82:54] And basically, you know, the origin of those was people studying the evolution of non-isolated systems and saying, okay, here's what happens to the state of the system if it's, say, in contact with a heat bath or something like that — and then saying, well, let's just imagine that something of this form, or a similar form, actually is the fundamental law.
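For a sense of the "enough particles" criterion mentioned above, here is a back-of-the-envelope estimate using the original GRW parameter choice (λ ≈ 10⁻¹⁶ spontaneous localizations per nucleon per second — a standard textbook value, not a figure from the episode):

```python
# Rough GRW-style estimate (textbook parameter values, not from the episode):
# each nucleon undergoes spontaneous localization at rate lam ~ 1e-16 per second,
# so a superposition involving N nucleons collapses at roughly N * lam.
lam = 1e-16  # per-nucleon collapse rate, 1/s (GRW's original choice)

def collapse_time_seconds(n_nucleons):
    return 1.0 / (n_nucleons * lam)

print(collapse_time_seconds(1))     # ~1e16 s: a single particle goes undisturbed for eons
print(collapse_time_seconds(1e23))  # ~1e-7 s: a macroscopic superposition dies almost instantly
```

This is why such theories mimic ordinary quantum mechanics for small, well-isolated lab systems while departing from it for large ones.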
[83:51] If that's the case, it'll still be a tautology that the universe is isolated. But it might be that the way the universe as a whole evolves is as if it's continually being monitored by some external measurer. OK, my other funny question is:
[84:20] It's often said that at some point we'll reach the heat death of the universe and we won't be able to do anything — supposing that we're around, or whatever our descendants are. Yeah. Now, do you imagine that to be the case? Because if we're thinking in terms of a resource theory, then I could imagine there would be certain questions that would be more important, or different, for whatever our descendants are. Maybe they
[84:47] are able to utilize a system with more precision? The way we actually calculate entropy — and this is something that's not often emphasized in textbooks — is relative to a certain set of parameters that we think we're interested in or are going to manipulate. For example, if I want to
[85:10] calculate the entropy of some standard volume of gas, the question is: do I count samples of gas with different isotope ratios differently? As long as I'm only dealing with things chemically, it doesn't matter how much of my oxygen is one isotope and how much is another. But
[85:32] if I'm dealing with nuclear reactions, or if I've got some way to separate things out according to their mass, then that actually might matter, and I might want to include it in the entropy calculations. So there's a sense in which entropy is relative to what it is you're going to manipulate. However, when people talk about the heat death, the point is: no matter what it is that you want to do,
[86:01] the natural tendency for things, left to themselves, is towards diminished ability to do it. It was really in the 19th century that people started talking about the heat death of the universe. Kelvin himself wrote an article called On a Universal Tendency in Nature to the Dissipation of Mechanical Energy.
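The earlier point — that the entropy you compute depends on which variables (for instance isotope ratios) you can distinguish and manipulate — can be sketched with the standard Gibbs mixing-entropy formula. The isotope abundances below are rough illustrative values, not figures from the episode:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

# Gibbs entropy of mixing: dS = -n R sum(x_i ln x_i), where the sum runs only
# over species we can actually distinguish (and hence manipulate).
def mixing_entropy(mole_fractions, n_moles=1.0):
    return -n_moles * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Treated chemically, a mole of oxygen is one species: no mixing term at all.
print(mixing_entropy([1.0]))                     # zero: nothing to mix

# Rough abundances for oxygen isotopes (16O, 17O, 18O). If we can sort by
# mass, the very same gas carries an extra mixing entropy:
print(mixing_entropy([0.9976, 0.0004, 0.0020]))  # ~0.15 J/K, small but nonzero
```

The formula is the same either way; what changes is which partition of the gas into species you are entitled to use, which is exactly the relativity being described.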
[86:25] And so what is going to happen, if things just keep going, is the sun's going to burn out, and pretty much anything we want to do here on earth — no matter what your goals are or your means of manipulation — involves some kind of entropy difference that ultimately traces back to the fact that on one side of us we've got this high-temperature source of energy, the sun. And
even if you have more subtle ways of manipulating things, eventually everything's going to decay into black holes, and no matter what your goals are and no matter what means of manipulation you have, eventually things are going to run out and just stay that way forever. Unless Roger Penrose is right about his conformal cyclic cosmology, where, you know,
[87:38] after that happens, things get restarted. Okay. But honestly, we're talking absurdly long time scales — billions and billions of years. So I think we should be more worried about what things are going to be like for people on earth in the next few centuries. Right. That is something we can do something about.
[88:08] Right. And what's going to happen on the time scale of millions and billions of years is actually hard for us to wrap our heads around. So some people find this heat death of the universe sort of depressing, and sometimes people even say, okay, this makes everything meaningless. Well — you know,
[88:33] You've known most of your life you're going to die eventually, right? And you've got a certain limited amount of time to do stuff with, right? And do what you can with it while you have it. Make the best of the time that you have, right? And that applies on the human time scale. And I think the same thing we say, suppose
the human species is only going to be around for another million years. Well, I would say to that species: make the best of the time you have. So actually, I don't find it depressing. I mean, I do get it occasionally — everyone has to deal with the fact that you and everyone else you know is mortal, that you have a finite lifespan. It's hard not to.
[89:25] Disorder. Disorder. Entropy. Yeah. Disorder is a word that we haven't said. Yeah. And it's something that many people in the popular press, when speaking about entropy, make an equivalence between entropy and disorder. What's wrong about that?
Well, you have to be careful about what you mean by order and disorder, right? There's a real sense in which, if I've got a box of gas and there's a partition and all the gas is on one side and none on the other, that is a more ordered state than if the partition is out and there's gas all over the place. And one way of thinking about that is that if
the gas is indeed allowed to roam freely, it could spontaneously end up on one side of the box — but that corresponds to a very, very small region of its phase space. And so the idea is that there are certain kinds of states that we find to be ordered, and those are just a small percentage of all the possible states.
[90:47] There is a sense in which entropy is a measure of disorder.
[90:57] Let's say I'm generating heat by friction — like I'm boring out this cannon, right? There's a sense in which I've got some regular, ordered motion — I've got this thing going around like that — and I'm taking energy from that ordered motion and transferring it to the iron in the cannon, where it's manifested as higgledy-piggledy motion of the molecules.
I think what's not right about that is that not everything we would intuitively think of as a distinction between order and disorder actually corresponds to a distinction in energy — I'm sorry, in entropy, right? Yeah. So the easiest way that I think about it is a coffee cup.
and initially it's black — black coffee — and then you pour some milk, and there's all this turbulence, and you'd say, oh, that's extremely disordered. So you stir it, and then you're like, oh wow, now it's extremely ordered — but it actually has the highest entropy. Yeah. Right. Yeah, so, I mean, that's a really good example, because there are some things that seem more disordered to us that are actually lower entropy. You've got, you know,
Rather than milk, it's actually better if it's cream, because the cream can take some time to disperse, right? So if I take some thick cream and put it in the coffee cup, I might have these swirls of cream in there, right? And that can seem really turbulent and disordered. And then it settles down to a situation where the cream is evenly distributed. And that is a higher-entropy state than the intermediate state. But
it seems to us like a simpler state. And that's why it's important to think in terms of order and disorder at the molecular level. And also, not everything that we think of as more or less ordered really corresponds to entropy differences. So when gravity comes into play,
[93:13] The natural tendency, I've got a bunch of gas spread out in the interstellar space, the natural tendency for it is for it to gravitationally clump together. A bunch of gas uniformly spread out, which clumps together and forms a star, that's actually an entropy increasing process, even though intuitively you might think the end state is more ordered than the initial state.
And so as a rough-and-ready guide, there is a sense in which molecular disorder and entropy go together — but it's not a reliable guide. And I think what people sometimes have in mind with order and disorder is actually something a bit different, what people sometimes call complexity. So Sean Carroll was here sometime last year and he was talking about
[94:13] origins of complexity. And people who study complexity — and that is another notion that's really hard to make precise — tend to say that neither the minimum- nor the maximum-entropy states are the most complex. There's a sense... right, okay. Yeah. Something we didn't speak about that comes to mind is ergodicity.
So, are the laws of physics ergodic? Is that a well-defined statement? And also, please define ergodic. Yeah. So, classically, ergodicity pertains to a system confined to a finite region of a state space and undergoing isolated evolution. And to be ergodic means: take virtually any initial condition you want,
[95:09] and take any finite region of phase space — eventually that initial condition ends up in that region of phase space. And for actual physical systems it's very difficult to decide. If I hand you a law of dynamics and say, is this ergodic or not — that's actually a very difficult mathematical problem to decide, right?
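As a toy illustration of what ergodicity buys you (my example, not one from the conversation), take rotation of the circle by an irrational angle — a standard measure-preserving system in which the fraction of time spent in any region converges to that region's size:

```python
import math

# Toy ergodic system: the circle rotation x -> (x + alpha) mod 1 with
# irrational alpha. By Weyl equidistribution, the time average of "being in
# [a, b)" converges to the region's measure b - a, from any starting point.
def time_fraction_in_region(alpha, x0, a, b, n_steps):
    x, hits = x0, 0
    for _ in range(n_steps):
        if a <= x < b:
            hits += 1
        x = (x + alpha) % 1.0
    return hits / n_steps

alpha = math.sqrt(2) - 1  # irrational rotation number
frac = time_fraction_in_region(alpha, x0=0.123, a=0.2, b=0.5, n_steps=200_000)
print(frac)  # close to the region's measure, 0.3
```

With a rational α the orbit is periodic and visits only finitely many points, so the time average depends on where you start; that contrast is exactly what the "virtually any initial condition ends up in any region" clause rules out.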
Of course, that was a classical definition, and the laws of physics, deep down, we know aren't classical. In quantum mechanics, that definition of ergodicity — in terms of a state's trajectory — just doesn't really apply. There are
[95:59] things called quantum ergodicity theorems, which basically have the effect that any state can be approached as closely as you like. Are the actual laws of physics ergodic? Like, if I actually took, say, a box of gas and
[96:27] somehow or another isolated it and let it go according to ordinary quantum evolution. There is a sense in which something like ergodicity applies in that if you look at sufficiently long time averages,
then the amount of time it will spend in a given subspace will, for almost all initial states, be proportional to the dimension of that subspace. And something of that flavor often comes in when people are trying to prove equilibration results. What we haven't talked about is this sort of process where
you leave something alone; it starts out in a far-from-equilibrium state and then it goes to an equilibrium state. That's sometimes called the minus first law of thermodynamics. Right, right. And one of the reasons we haven't talked about that is that it's hard to say anything really precise about it: there are various results, and it's not always clear you're going to get nice clean mathematical results
whose physical significance for actual systems is a bit obscure. And then you get sort of plausibility arguments for actual physical systems. And I actually think that ergodicity in any sense — which really has to do with infinite long-term average behavior — really isn't the right question, because what I want to know, if I pour the milk in the coffee cup, right,
is not what it's going to do on average if you left it alone, isolated, for all eternity, but what it is going to do in the next few minutes. You want to know what's actually going to happen on finite time scales. So statistical mechanics textbooks are divided on whether ergodicity is actually important for statistical mechanics. Some will say, okay, this
[98:45] ergodic hypothesis — the hypothesis that actual systems are ergodic — is at the root of statistical mechanics. And then others will say, oh, there's all this really nice mathematical work having to do with ergodicity, and it's completely irrelevant to statistical mechanics. Okay, here with the cream and the coffee cup, we only have to wait a few minutes.
And so it's not infinite — it's not T goes to infinity. However, Nima Arkani-Hamed also talks about how, in particle physics, things occur at the boundary. Why? Because in the math, we're scattering from minus infinity to plus infinity. And yes, it takes place in just a few milliseconds or a few seconds or what have you. But for
For the calculations, we just use infinity. Now, he seems to be using the opposite argument to the one you just used. So would you be able to convince him — no, no, Nima, it's actually not happening at infinity, it's not at the boundary? Well, yeah. So when he says it's happening at infinity, I think one thing you have to realize is that when physicists say infinity, what they often mean is not literally infinity but
large enough that it doesn't really matter how big it is. There's a nice book called Understanding the Infinite by a philosopher named Shaughan Lavine, and he introduces what he calls the theory of zillions. A zillion is a technical term: a number that's so big it doesn't really matter how big it is.
Right. So it's context dependent. Right. And if you think about it, that's sort of how we use the word. Like, if someone says, Wayne, you go to conferences so much, why don't you just buy a Learjet? I would say, well, that costs like a zillion dollars. I have no idea what a Learjet costs, but I do know that whatever it costs is so far beyond my own financial resources that it doesn't really matter exactly how big it is. And
[100:48] one of my colleagues said, you know, in quantum field theory, asymptotic infinity is like five meters, right? Because what you do when you're doing these scattering experiments is: there's a relatively small scattering region, and far enough from that scattering region the field is effectively free, right?
And so you're basically taking in and out states as if they're free fields; you're doing your calculation and computing scattering cross sections, et cetera, for effectively free fields. And really you'll say "at infinity" — and mathematically you might take the limit as things go to infinity, because that gives a nice clean result —
[101:44] But what you really mean is this is a good approximation far enough from the scattering region that the interactions can be neglected. And I think that that's what he means when he says the interesting stuff happens at infinity, right? And so with something like that, if I've got an interaction and I have a sense of how fast the interaction falls off with distance,
I can get a sense of how far I have to be from the scattering region to say, okay, these are effectively free fields, right? What we want from equilibration results is some result about how long I have to wait till I can say, okay, we're effectively at infinity, because the thing has equilibrated. And that's what you're trying to get out of the equilibration results. And it's not as simple as in
[102:39] the scattering case, because there you've got these distance-dependent forces and you know how fast they drop off, whereas what you're trying to find out in the equilibration case is precisely how fast something equilibrates — how fast it gets to the point where I can basically ignore the fact that it was out of equilibrium at the beginning. Okay, I have another funny question. Yes, okay.
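The "effectively at infinity" idea above can be sketched numerically. Assuming, purely for illustration, an interaction falling off as A/r⁶ and a relaxation decaying as exp(−t/τ) — toy forms, not anything stated in the episode — one can solve for the distance or time beyond which the residue drops below tolerance; the catch being pointed to is that for equilibration the hard part is knowing τ at all:

```python
import math

# Illustration with toy numbers: "asymptotic infinity" as a finite cutoff.
# Assume an interaction falling off as V(r) = A / r**6 (van der Waals-like)
# and a deviation from equilibrium decaying as d(t) = d0 * exp(-t / tau).

def effectively_free_distance(A, tol):
    # Smallest r with A / r**6 < tol.
    return (A / tol) ** (1.0 / 6.0)

def effectively_equilibrated_time(d0, tau, tol):
    # Smallest t with d0 * exp(-t / tau) < tol.
    return tau * math.log(d0 / tol)

r_star = effectively_free_distance(A=1.0, tol=1e-12)           # ≈ 100 (toy units)
t_star = effectively_equilibrated_time(d0=1.0, tau=2.0, tol=1e-6)  # ≈ 27.6
print(r_star, t_star)
```

In scattering, the falloff exponent is known in advance, so r\* is easy; in equilibration, τ is the unknown you are trying to establish, which is why finite-time equilibration results are hard to come by.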
So Nati Seiberg said that one indicator of quantum field theory being on shaky — or not firm — foundations is that we teach quantum field theory differently: almost no two textbooks in quantum field theory are the same, and almost no two courses are the same. Some may say, let's start with scalar fields and then add interactions; some may say, let's start with all free fields and then add interactions; others use a functional approach; and so on.
[103:38] So
That's not a controversial statement. Right. That QFT isn't on firm foundations. Right. But what I'm wondering is: do you personally, Wayne, have an idea of a field or subfield — a particular subject in physics — that other people think is well understood, but where you think there's actually trouble? Well, I think that statistical mechanics is a case in point, because
[104:36] textbooks are written to give the impression that we understand everything and it's all worked out, and if you actually go from one textbook to another, you'll find very different approaches. In quantum field theory, everyone knows that there are different approaches; in statistical mechanics it's sort of swept under the rug.
[104:58] And so that's one case where I think there are real questions about the rationale for certain kinds of methods — questions that got kind of swept under the rug. And thermodynamics is similar, even though we don't think of it as fundamental, cutting-edge science; it's got its roots in 19th-century physics.
[105:21] Different thermodynamic textbooks will take very different approaches. And I think that the root of that is that there are different conceptions about what thermodynamics is supposed to be. So one conception is what I call the resource theoretic conception, where it really is about what you can do with various things. But what people usually want from a
[105:50] Thermodynamics textbook, especially if it's chemical, you know, preparing people for doing chemical thermodynamics is you want to figure out what the equilibrium states of a system are. And those are the ones that maximize entropy. And a textbook with that orientation will tend to minimize talk of manipulations in doing work and things like that and treat entropy as if
[106:19] It is simply a property of matter, like mass and other things. I think in a lot of areas, actually, the textbook tradition will sometimes obscure different ways of thinking about the theory. And so the question is, in what areas are there where there is a sort of settled, everyone agrees on how to do this?
Take classical electrodynamics: every single textbook in existence is a copy of J.D. Jackson's book. And I think that's possible because there is this
[107:13] theory that we call classical electrodynamics, which we think has been superseded by quantum electrodynamics. So we can all agree on what classical electrodynamics is, because it's in a sense a closed book. Quantum field theory, by contrast, is a continuing area of active research, and
yeah, one of the reasons for the difference of approaches in quantum field theory is that we just don't have as good a mathematical grip on the theory as we do in other areas of physics, right? You write down a Lagrangian — the standard model Lagrangian — and is it well defined? You put in a cutoff, and if you let the cutoff go to infinity, you
[108:05] have blow-ups, which you have certain techniques for regulating. Is that telling us that the theory we wrote down actually isn't well defined at all energies? Or are the cutoffs we're introducing just a calculational tool for getting at the consequences of a well-defined theory? I think that,
as far as I understand it — and there are people who are much more on top of the literature — that's still more or less an open question. I think the standard view is that it doesn't really matter whether the theory you're writing down is well defined at all energies, because we think it's an effective field theory, valid at certain energies, and we don't know what's going on beyond those energies.
[108:59] But you don't buy that answer or what? I think that's right. I think that's right. So that's why we can get away with actually not knowing the answers, whether the theory we write down is actually well-defined at all energies. If that's your attitude, then it doesn't really matter whether it is. Yeah. What's a lesson you learned too late? What is a lesson I learned too late?
[109:32] And I'm assuming you want to know about, you know, physics and philosophy of physics and not about my personal life.
So, okay — this idea I've been describing about thermodynamics, that there are two different conceptions of what the theory is: the resource-theoretic conception, and this other conception according to which it's more like mainstream physics. That took me a surprising amount of time to actually get clear about in my own head, but now I think it's really, you know,
it should be one of the first things anyone says when they're talking about thermodynamics. And as you know, I've given talks several times with the title A Tale of Two Sciences, Both Called Thermodynamics. And really, it's only in relatively recent years that I've thought, okay, that's the way I want to be thinking about it. Now, many people who watch this channel are researchers in math, physics, computer science, adjacent fields,
and philosophers, and there are also lay people who watch. So I'm curious what advice you give to your graduate students, but also advice that may apply to this wide gamut of people who watch. Okay. I would say, here's advice I give to my grad students, and it will apply to researchers in any field who are just starting out:
[111:12] When you're choosing what things to work on, what you should not do is look around and say, okay, what's the hot topic? What's the popular thing? And jump on the current bandwagon. And for two reasons. If you're doing something because you think it's popular and you're not particularly interested in it, well, if you're not interested in your work, there's just no way you're going to get anyone else interested in your work, right?
And also, if you're jumping on a bandwagon, and then you're applying for jobs and submitting the things you've written, or submitting parts of your dissertation for publication — my experience as an editor (I was editor of a philosophy of physics journal for a number of years) is that when we get a paper which is
[112:08] The nth minor addition to a well-worn topic, the threshold for that being worth publishing is very, very high because you don't want to publish. Even if what you're saying is correct, one of the things you're asking is, okay, if this is going to take up journal space, is this actually a significant advance over what's out there in the literature? And really, if something's a hot topic, then people are going to get tired of it fairly quick.
[112:38] So do something you're interested in, but don't choose something just so narrow that only three people are going to have an idea of what you're talking about. So there's sort of a happy medium between choosing a research topic that some people are going to have some knowledge about and jumping on the bandwagon and doing what everyone else is doing. OK, so let's imagine you were speaking to your PhD and postdoc students who want to get a job in the field.
I imagine that at some point they have to — maybe not jump on a bandwagon, but hitch a ride occasionally, because don't you have to get grants? Don't you have to be marketable? Okay, so how do you navigate that? So — my teacher Abner Shimony, whom I had the honor of working with at Boston University when I was a grad student — one thing he would say is: just as Aristotle taught us that ethical virtues are means between opposing vices, intellectual virtues are also means between opposing vices.
[113:37] And I think when you're choosing a research area, there's two opposing vices. One is choosing something that's such a niche area that only three people in the world are going to have any idea what you're talking about. And I think the other vice is jumping on a bandwagon and doing what everyone else is doing. And I think the reason I mentioned that other vice first is I think there's a mistaken impression out there that that's what you're supposed to do. That's what you should be doing.
And I think that that's a mistake. This is based on my experience as an editor, and also talking to other people in the field who edit journals, and also my experience on panels that adjudicate grants and things like that. Sure, if someone reads something and says, I have no idea what this is about — okay, that's a tough sell, right? But if you've got
[114:38] A dozen grant applications in front of you, and ten of them are minor variants on the same thing. And what they're going to do is make a minor advance in a well-worn field. And another one is an interesting and promising research project that is worth doing but relatively unexplored. That's actually going to count in favor of the one that's worth doing and relatively unexplored.
[115:07] In terms of getting jobs, let me tell you this is a true story. Many, many years ago, we were hiring at the University of Western Ontario.
[115:38] And I was on the hiring committee and we had a job ad which was fairly broad. And what had happened is in one of the areas of specialization that was included in the job ad, a big name philosopher had recently published a book that was getting a lot of attention. And what happened was everybody in the world did a grad seminar on this book. And
[116:06] I was sitting there reading these applications. This was back in the days when people actually sent us paper applications and there's a file box with all the applications in it and you take it into your office after hours and you're going through it, right? And I was reading the writing sample of one candidate and I said, I read the open paragraph and going, didn't I just read this? I'm going,
[116:33] Oh my God, one of our applicants has plagiarized the writing sample from another applicant. And then I went back and got that other file and they were in fact different, but the opening paragraphs were almost word for word the same because this was this issue that everyone was talking about and there was a very standard way of setting up the issue. I see. So if you want people to actually confuse your writing sample with someone else's, then jump on a bandwagon.
Interesting. So this also applies to film and to businesses in general: you don't want to be in the red ocean — the contested waters — you want to be in the blue ocean. I'm not familiar with that terminology, but I believe you. Yeah, right. You just said you could give me a personal lesson but instead gave a lesson that applies to philosophers and physicists. So I have to be curious: what would be the personal lesson that you learned too late — something not trivial, like,
[117:31] "Oh, I learned I should double-bag my groceries"? I see. A lesson from my personal life that I think I learned too late...
[118:05] If there are toxic people in your life, avoid them. Be around people that you're comfortable around and you feel good around and try to minimize your contact with the toxic people. Thank you so much for spending so much time with me. Well, thank you. I've really enjoyed this. Well, it's been two hours. It feels like I just flew by.
Yes, that's always a great sign. In fact, in Harry Potter, I think there was an hourglass — sands of time. Harry asks the professor what it is, because it was a different type of sand, and the person says it stands still when the conversation is engaging. Well, that's good.
I have another life lesson, which I heard early in my life but am to this day not particularly good at applying. This was an interview I heard on the radio as a teenager with David Lee Roth, who was big at the time. And he said: here's my life lesson. Don't sweat the little shit. And it's all little shit.
Interesting. I mean, I don't think it's actually true that it's all little shit, but I think "don't sweat the little shit" is something a lot of us have difficulty applying — we end up fussing too much about things that in the long run aren't really important.
Hi there, Curt here. If you'd like more content from Theories of Everything and the very best listening experience, then be sure to check out my Substack at curtjaimungal.org.
[120:07] Some of the top perks are that every week you get brand new episodes ahead of time. You also get bonus written content exclusively for our members. That's C-U-R-T-J-A-I-M-U-N-G-A-L dot org. You can also just search my name and the word sub stack on Google. Since I started that sub stack,
it somehow already became number two in the science category. Now, Substack, for those who are unfamiliar, is like a newsletter, one that's beautifully formatted. There's zero spam. This is the best place to follow the content of this channel that isn't anywhere else. It's not on YouTube. It's not on Patreon.
It's exclusive to the Substack. It's free. There are ways for you to support me on Substack if you want, and you'll get special bonuses if you do. Several people ask me, hey, Curt, you've spoken to so many people in the field of theoretical physics, of philosophy, of consciousness. What are your thoughts, man? Well, while I remain impartial in interviews, this Substack is a way to peer into my present deliberations on these topics.
[121:19] And it's the perfect way to support me directly. Go to curtjaimungal.org, or search Curt Jaimungal Substack on Google. Oh, and I've received several messages, emails, and comments from professors and researchers saying that they recommend Theories of Everything to their students. That's fantastic. If you're a professor or a lecturer or what have you, and there's a particular standout episode that students or your friends can benefit from,
please do share. And of course, a huge thank you to our advertising sponsor, The Economist. Visit economist.com slash TOE, T-O-E, to get a massive discount on their annual subscription. I subscribe to The Economist and you'll love it as well. TOE is actually the only podcast that they currently partner with, so it's a huge honor for me. And for you, you're getting an exclusive discount. That's economist.com slash TOE, T-O-E.
[122:19] And finally, you should know this podcast is on iTunes, it's on Spotify, it's on all the audio platforms. All you have to do is type in Theories of Everything and you'll find it. I know my last name is complicated, so maybe you don't want to type in Jaimungal, but you can type in Theories of Everything and you'll find it.
Personally, I gain from rewatching lectures and podcasts. I also read in the comments that TOE listeners also gain from replaying. So how about you relisten on one of those platforms, like iTunes, Spotify, Google Podcasts, whatever podcast catcher you use. I'm there with you. Thank you for listening.
View Full JSON Data (Word-Level Timestamps)
{
  "source": "transcribe.metaboat.io",
  "workspace_id": "AXs1igz",
  "job_seq": 1868,
  "audio_duration_seconds": 7375.88,
  "completed_at": "2025-11-30T21:15:32Z",
  "segments": [
    {
      "end_time": 20.896,
      "index": 0,
      "start_time": 0.009,
      "text": " The Economist covers math, physics, philosophy, and AI in a manner that shows how different countries perceive developments and how they impact markets. They recently published a piece on China's new neutrino detector. They cover extending life via mitochondrial transplants, creating an entirely new field of medicine. But it's also not just science they analyze."
    },
    {
      "end_time": 36.067,
      "index": 1,
      "start_time": 20.896,
      "text": " Culture, they analyze finance, economics, business, international affairs across every region. I'm particularly liking their new insider feature. It was just launched this month. It gives you, it gives me, a front row access to The Economist's internal editorial debates."
    },
    {
      "end_time": 64.514,
      "index": 2,
      "start_time": 36.34,
      "text": " Where senior editors argue through the news with world leaders and policy makers in twice weekly long format shows. Basically an extremely high quality podcast. Whether it's scientific innovation or shifting global politics, The Economist provides comprehensive coverage beyond headlines. As a total listener, you get a special discount. Head over to economist.com slash TOE to subscribe. That's economist.com slash TOE for your discount."
    },
    {
      "end_time": 76.664,
      "index": 3,
      "start_time": 66.135,
      "text": " Even though a lot of physicists will say second law says the total entropy is never decreasing, that can't be actually be the second law, that can't be a consequence of the second law."
    },
    {
      "end_time": 98.865,
      "index": 4,
      "start_time": 78.848,
      "text": " This is a two-hour deep dive into entropy and the second law. Most talks on this subject are 10 minutes long, but today, Professor Wayne Muirvold gives a tour de force, explaining entropy from multiple angles, dispelling myths, and even the stunning realization that the second law is opposite to what you think."
    },
    {
      "end_time": 127.363,
      "index": 5,
      "start_time": 99.309,
      "text": " You will get both answers from perfectly competent physicists on each side. We'll be absolutely certain that that is the right answer. Questions explored are why is entropy not the same as disorder? What do popular accounts and even undergraduate texts on entropy and thermodynamics consistently get incorrect? Is the universe subject to the second law? Can you break these supposed entropic limits? And why quantum mechanics changes everything?"
    },
    {
      "end_time": 143.302,
      "index": 6,
      "start_time": 129.172,
      "text": " There's plenty of confusion and puzzles about entropy. Today I would like to talk about exactly what is entropy and one of the questions that give conflicting answers is imagine in front of you, you have some physical system. I believe this comes from Shelley Goldstein."
    },
    {
      "end_time": 171.732,
      "index": 7,
      "start_time": 143.643,
      "text": " Like, let's just say you give this example like a glass of water, for instance, and its physical state is clearly not completely known to you. But then something else appears to you. You can be an intelligent person, an angel or what have you. It gives you a much better approximation of this glass of water than what you had before. Then the question is, has the systems, the glass of water has its entropy decreased? So they're broadly speaking, two answers one can give. One is"
    },
    {
      "end_time": 201.459,
      "index": 8,
      "start_time": 172.415,
      "text": " Yes, obviously, because entropy has to do with the information of the system. So if you've gained information, the entropy has changed. Then the other is that's absurd. The entropy has something objective to do with the system. So why would your information about the system change its entropy? Take it away. Yep. Yeah, absolutely. And you will get both answers from perfectly competent physicists and they will be absolutely certain that that is the right answer."
    },
    {
      "end_time": 227.056,
      "index": 9,
      "start_time": 202.261,
      "text": " So I think the best way to start reading and thinking about that is you post the question as, what is entropy? And I think that's a bit of a misleading question because entropy is one of those words used in different senses. And we're used to that. It's not unusual. Like if you open up a dictionary,"
    },
    {
      "end_time": 247.961,
      "index": 10,
      "start_time": 227.261,
      "text": " Very often in science people coin a new technical term because they want something to have a precise, well-agreed upon meaning."
    },
    {
      "end_time": 268.217,
      "index": 11,
      "start_time": 248.353,
      "text": " And that's actually what Clausius did back in 1865, I think it was. He thought, okay, here's this important quantity that I and others have been batting around in thermodynamics. It's important enough it needs a dignified name. And his rationale was,"
    },
    {
      "end_time": 295.657,
      "index": 12,
      "start_time": 268.831,
      "text": " Everybody studies the dead languages, right? We all know Greek and Latin and so we don't want something like from English or German or Italian because then it becomes sort of nationalistic words. So let's coin something from a Greek word. So he coined the word entropy from a Greek word for transformation and he deliberately coined it to sound kind of like energy because it's closely related concept. If I had my way,"
    },
    {
      "end_time": 326.92,
      "index": 13,
      "start_time": 297.705,
      "text": " We would respect Clausius and we would only use the word entropy in exactly the same sense that Clausius defined it. That would be what everyone means by entropy. But historically, that's not what happened. There's been a number of different quantities that people call entropy and they're all related and they're all related in some sense to thermodynamic entropy, but they're just different things. If someone asks you the question, has entropy decreased? I think an actual question is, well, which entropy?"
    },
    {
      "end_time": 355.691,
      "index": 14,
      "start_time": 327.722,
      "text": " So in other words, there should be entropy sub one, entropy sub two, entropy sub three. And when someone says, well, what is the entropy of the system? You say, OK, are you referring to two, three, sub one? Yeah, exactly, exactly. Or if I had my way, people would just have coined different words for these different kinds of kinds of things. But there's a reason why some people say, well, of course, entropy has to be"
    },
    {
      "end_time": 385.896,
      "index": 15,
      "start_time": 358.302,
      "text": " Um, uh, an intrinsic property of a system because, you know, this is physics after all, you know, we're not doing psychology. It's not in, you know, we're not studying people's information. We're studying physical properties of physical systems, which they, you know, qualifies they have no matter what anybody knows about them or what we think about. Like if I ask, okay, what's the mass of this cup? Um, that, you know, it would seem absurd to say, well,"
    },
    {
      "end_time": 410.111,
      "index": 16,
      "start_time": 386.459,
      "text": " How much do you know about the cup? The mass of the cup is something that the cup has, um, um, talking about the rest mass, because sometimes people will talk about relativistic mass and talk about that as observer dependent, but okay, what's the rest mass of this cup? Then, um, yeah, that's the property of the cup and, um, it doesn't matter what anyone thinks about it."
    },
    {
      "end_time": 434.224,
      "index": 17,
      "start_time": 410.691,
      "text": " And if you think that thermodynamics is a science like that, that is just studying the physical properties of things, then it seems absurd that one of its central concepts, entropy, would be something that would be defined relative to a state of information. And I think that"
    },
    {
      "end_time": 461.817,
      "index": 18,
      "start_time": 435.009,
      "text": " At bottom, the fact that people are inclined to think that different notions of entropy are obviously the right one, and different answers to this question are obviously the right answer, is even though this gets completely blurred in the textbook tradition, there are actually different conceptions about what the science of thermodynamics is all about. Okay, so look, in the second law, it stated that entropy doesn't decrease."
    },
    {
      "end_time": 489.821,
      "index": 19,
      "start_time": 462.244,
      "text": " Yeah. Oh, yes. Yeah. Your caveats closed system or isolated system. Yeah. Okay. Then there's a formula for entropy. Are you saying that even here there should be sub one and sub two? Actually, if you look these different notions of entropy are actually defined differently. And you actually if you look at different textbooks, when they introduce the concept of entropy,"
    },
    {
      "end_time": 517.108,
      "index": 20,
      "start_time": 490.316,
      "text": " They actually will sometimes give very different definitions. So maybe I should just talk about what Clausius was doing, because that's one of the definitions that's out there. Sure. So Clausius was working in the 1850s, 1860s. Those are the early days of what we now call thermodynamics."
    },
    {
      "end_time": 528.626,
      "index": 21,
      "start_time": 517.551,
      "text": " And it was Kelvin who gave the science that name and I think a lot of people actually misunderstand what that word means thermodynamics because in"
    },
    {
      "end_time": 559.053,
      "index": 22,
      "start_time": 530.401,
      "text": " Physics these days, when you talk about dynamics, you usually mean the laws of evolution, like the dynamical laws that govern the behavior of systems. And that's actually not what Kelvin meant when he decided to call this emerging science thermodynamics. This was, as I said, back in the days when everybody studied Greek in school, and it's formed from two Greek words."
    },
    {
      "end_time": 584.121,
      "index": 23,
      "start_time": 559.428,
      "text": " the words for heat and for power. And thermodynamics has its roots in the study of how you can get useful mechanical work out of heat. Like it really ultimately has its roots in Carnot's study of heat engines, efficiency of heat engines. If you think of that as what thermodynamics is about,"
    },
    {
      "end_time": 615.879,
      "index": 24,
      "start_time": 586.254,
      "text": " Physicists these days have a word for a theory like that. It's a resource theory and This comes out of quantum information theory. So what happened? You really got going a couple decades ago is This you know field of quantum information theory it includes quantum communication and cryptography and stuff like that and they were asking questions like, you know, if you've got two agents Who have access to certain resources? What can they do with those?"
    },
    {
      "end_time": 637.466,
      "index": 25,
      "start_time": 616.425,
      "text": " So for example, these agents are always called Alice Bob and Bob, by the way. For example, if Alice and Bob want to send a secure signal that an eavesdropper could not, as a matter of physical principle, eavesdrop on, what can they do? Can they do it if they have a certain amount of shared entanglement?"
    },
    {
      "end_time": 666.118,
      "index": 26,
      "start_time": 638.285,
      "text": " Using physics in the sense that quantum physics is telling you how physical systems are going to respond to certain operations and stuff like that. But the questions you're asking are really questions not within physics proper. It's questions about what agents with certain kind of means of manipulating a system and certain resources can do to achieve certain goals. Why is that not in physics proper?"
    },
    {
      "end_time": 693.592,
      "index": 27,
      "start_time": 667.073,
      "text": " Um, because when I say physics proper people, um, usually what I mean is what physics usually think, but physics is about the properties of physical systems period. Right. And if I'm talking up in a end, so these goals of these agents aren't a matter of physics, these are something that you're adding on. Hmm. Right. So is it, so there's a one, one question is what, you know, what do things do, um,"
    },
    {
      "end_time": 721.544,
      "index": 28,
      "start_time": 694.121,
      "text": " So there is a certain question, what do things do under certain circumstances? But if I'm, if I'm setting you for certain goals, like is an agent itself a part of the physical system? The agents ourselves are physical systems, right? But physics studies physical systems in certain respects."
    },
    {
      "end_time": 743.131,
      "index": 29,
      "start_time": 722.159,
      "text": " So I'm a physical system, right? You're a physical system. You have thoughts and beliefs. Thank you. It's the nicest thing anyone's ever said about me. Some might disagree and say you're not just a physical system, you're a combination of a physical system plus an immaterial mind, but I actually think that we are all physical systems."
    },
    {
      "end_time": 770.845,
      "index": 30,
      "start_time": 743.746,
      "text": " Um, so, um, and you know, we qua physical systems have, you know, thoughts and desires and hopes and dreams and stuff like that. But if a scientist is studying my thoughts, so study that side that scientist is doing psychology and not doing physics. Okay. Yeah. Um, so, um, you're just studying different aspects of things. And if I bringing in things like, um,"
    },
    {
      "end_time": 797.005,
      "index": 31,
      "start_time": 771.578,
      "text": " Here's the game that Alice and Bob are going to play, and here's how we're going to score them. And then you give them certain resources, and physics tells you what the highest possible score is. But basically, you're not doing the sorts of thing you usually find in physics textbooks if you're talking about goals and scoring and things like that. That's what I mean. Got it. Yeah, good."
    },
    {
      "end_time": 825.162,
      "index": 32,
      "start_time": 797.654,
      "text": " And I think this will become important for thermodynamics. So with people who are doing quantum information theory, they said, okay, what we're doing is this is a resource theory. And then some of the same methods ended up when people started doing quantum thermodynamics, a lot of people started thinking of this as a resource theory."
    },
    {
      "end_time": 852.927,
      "index": 33,
      "start_time": 825.811,
      "text": " Like, so if I give you certain resources and you've got a certain task like lifting a weight or something like that, and maybe you've got some kind of system and you've got a heat bath at a certain temperature, what's the most work you can get out of it? So how high can you lift the weight?"
    },
    {
      "end_time": 881.049,
      "index": 34,
      "start_time": 853.541,
      "text": " people, a lot of people working in quantum thermodynamics these days think, okay, what we're doing is a resource theory similar and in some, some sense is modeled on the quantum information theory. Okay. And, um, that's basically how the founders of thermodynamics, what wasn't quantum, it wasn't quantum, but that's basically how the founders of thermodynamics thought of thermodynamics. It's a study of given certain physical resources,"
    },
    {
      "end_time": 895.589,
      "index": 35,
      "start_time": 881.732,
      "text": " like heat baths and things like that. How can I exploit these resources to do work, like drive a car or something like that, or race?"
    },
    {
      "end_time": 919.599,
      "index": 36,
      "start_time": 896.118,
      "text": " Let me see if I can summarize this. So Clausius who coined the term entropy was thinking of it in terms of a resource theory. Now a resource theory is what can I do with these resources and often in thermodynamics, when you take an introductory course, you speak either the first or second lecture about pistons. So given this system, can I move a piston? So they were thinking practically."
    },
    {
      "end_time": 935.213,
      "index": 37,
      "start_time": 920.759,
      "text": " Yeah, there's a sort of interesting trade-off between practical concerns and theoretical concerns because these questions were initially raised by practical concerns where they sort of took on a life of their own. And you can see that already in Carnot's work."
    },
    {
      "end_time": 959.411,
      "index": 38,
      "start_time": 935.879,
      "text": " So Carnot wrote this little pamphlet called Reflections on the Motive Power of Heat. And he was actually responding to some issues, and this is something that his father, Lazaro Carnot, had done some work on, that was going on at the time is, if you've got a heat engine,"
    },
    {
      "end_time": 984.65,
      "index": 39,
      "start_time": 960.606,
      "text": " And usually that was you've got some kind of gas in a chamber and you heat it up in the drives a piston, right? Is it more efficient if you use a more volatile substance? So it doesn't matter whether you're using air or steam or say alcohol or ether or something like that. Are you going to get more work out of the same amount of heat?"
    },
    {
      "end_time": 1006.698,
      "index": 40,
      "start_time": 985.947,
      "text": " And this was actually a practical matter because some people were thinking, okay, yeah, let's use alcohol or ether. And you can kind of imagine what happens because these things not only expand a lot faster when they're heated up than air does, they're also highly flammable. And it's kind of dangerous to have these things around fire."
    },
    {
      "end_time": 1034.394,
      "index": 41,
      "start_time": 1007.244,
      "text": " And so one of the questions that Karno was asking is, well, does it matter what the working substance is? Does it matter what gas you have in the piston? And he argued that actually the maximum efficiency, if I have two heat sources at different temperatures, the maximum efficiency of an engine running between them is independent of what the stuff is that you're using in the gas."
    },
    {
      "end_time": 1065.333,
      "index": 42,
      "start_time": 1035.657,
      "text": " So that had its roots in practical concerns, but Carnot considered what we now call thermodynamically reversible processes. And the thermodynamically reversible process involves you're only exchanging heat from two things at the same temperature. And so what you're doing instead of is you're actually expanding the gas very, very slowly."
    },
    {
      "end_time": 1095.708,
      "index": 43,
      "start_time": 1065.725,
      "text": " and then when you're dumping heat out, but you're compressing it very, very slowly. And of course, actual engines are not anywhere close to thermodynamic reversibly because what we're actually concerned is not just efficiency, but also power, right? How much work we're getting per unit time, right? Sure. If someone tries to sell you a car and says, okay, this has an amazing gas efficiency,"
    },
    {
      "end_time": 1125.589,
      "index": 44,
      "start_time": 1096.459,
      "text": " but you can only go, you know, five kilometers an hour, right? Right. We're not buying it literally. Right. And so, and so a lot of the study of, um, even though the thermodynamic grew out of study of efficiency of engines, a lot of the actual theoretical work, the things you can actually prove things about involve consideration of thermodynamic reversible processes and, um,"
    },
    {
      "end_time": 1152.517,
      "index": 45,
      "start_time": 1125.896,
      "text": " In the real world, there actually are no thermodynamically reversible processes. I noticed that you interviewed John Norton a while back. I'm sure he emphasized that point. Yes. Right. But we can approximate thermodynamically reversible processes. They just have to go very, very slowly, right? And actual machines were not interested in things that work very, very slowly."
    },
    {
      "end_time": 1179.974,
      "index": 46,
      "start_time": 1154.616,
      "text": " So you can still you can have a resource theory as a more there is maybe be motivated by abstract concerns, but you could be actually considering situations that are very far from realistic ones. Talk about how the second law assumes a certain definition of entropy. Maybe it shouldn't even be called the second law. Good."
    },
    {
      "end_time": 1208.729,
      "index": 47,
      "start_time": 1180.52,
      "text": " I'm really glad you asked me that because I think it's the other way around. Um, when you ask people, what's the second law of thermodynamics, you go ask, ask people in the street and something that they'll say, you know, um, they'll, um, a lot of them will say things like what you said that the entropy of an isolated closed system always increases. Right. Now, the interesting thing is, is if you, what you mean is thermodynamic entropy,"
    },
    {
      "end_time": 1235.606,
      "index": 48,
      "start_time": 1209.36,
      "text": " as Clausius defined it, that actually is not right for an important reason. Even though Clausius himself sort of as a tongue of cheek at the end of one of his papers said we can express the first and second law as, or he says, if we may be permitted to talk about the total energy of the universe and totally entropy of the universe, we can express the first and second laws as"
    },
    {
      "end_time": 1265.009,
      "index": 49,
      "start_time": 1235.896,
      "text": " total energy of the universe is constant, the total entropy of the universe drives to a maximum. That's actually not, that's actually not as official statement of the second law. And there's very good reason. Cause you said the second law of thermodynamics presupposes a certain notion of entropy is actually the reverse. The causes is definition of energy entropy presupposes the second law. Hmm. Okay. How so? Okay. So one way of, um,"
    },
    {
      "end_time": 1290.998,
      "index": 50,
      "start_time": 1266.305,
      "text": " expressing the second law would be suppose I've got some kind of system and it's easiest to imagine a gas with a piston and it goes around in a cycle in the sense that it comes back to the same thermodynamic state that it started in and people were thinking about these cycles because they're thinking about heat engines so what heat engine typically does"
    },
    {
      "end_time": 1318.37,
      "index": 51,
      "start_time": 1291.357,
      "text": " is you've got some working substance gas, you heat it up, it drives the piston out and then you either expel the substance or cool it down and compress it and push the piston in and then you're ready to start again. So the engine itself is working in a cycle. Suppose I've got a gas in a box and I can change its temperature and I can"
    },
    {
      "end_time": 1348.558,
      "index": 52,
      "start_time": 1319.821,
      "text": " This episode is brought to you by eBay. Buying parts for your car? You'll know that. Will it work? Feeling. But on eBay, buying parts is different. eBay's free returns means if it doesn't fit, or if it isn't what you expected, you just print a label, drop it off, and get your refund fast. No haggling, no stress, and at least 30 days to return any eligible item. Millions of parts, free returns. eBay. Things people love. Eligible items only. Exclusions apply."
    },
    {
      "end_time": 1361.971,
      "index": 53,
      "start_time": 1350.043,
      "text": " I'm going to say this a little bit differently because I'm going to introduce the concept of entropy"
    },
    {
      "end_time": 1389.991,
      "index": 54,
      "start_time": 1362.534,
      "text": " without using the word and let's just see if you notice where it comes in. Okay. Okay. Okay. So imagine you've got, say, for example, a gas in a container and there's a piston you can move around and, um, you can put it next to a heat source and maybe expand it and use that to move away to something. And so suppose I start with it at a certain temperature and pressure in the pit and a certain volume, the piston is a certain place. And I,"
    },
    {
      "end_time": 1413.251,
      "index": 55,
      "start_time": 1390.299,
      "text": " Slowly expand it in a thermodynamically reversible sense and I'm raising a weight and It's connected to heat source. So heat is going into it Okay, and then I hand it to you Kurt and say I want you to put it back to the original state Okay now Here's what what I'm closest is thing"
    },
    {
      "end_time": 1443.609,
      "index": 56,
      "start_time": 1413.848,
      "text": " It's a consequence of the second law of thermodynamics. You can't do that without expelling any heat at all from the system. Now, one thing I can do is if the original process was thermodynamically reversible, is I can just do that original process in reverse, compress the gas back to its original volume, expelling exactly the same amount of heat into the same reservoir I got it from at the same temperature."
    },
    {
      "end_time": 1471.613,
      "index": 57,
      "start_time": 1444.138,
      "text": " And that might be the best you can do if you only have one heat source or sink at one particular temperature. But what we really want to do is not lower the weight as much as we raised it. And so Clausius says, hey, look, if you've got another heat bath at a lower temperature,"
    },
    {
      "end_time": 1497.79,
      "index": 58,
      "start_time": 1472.108,
      "text": " What you can do is you can expel a smaller amount of heat at a lower temperature and get the thing back to its original state that way. So he introduced this notion of equivalence value of heat."
    },
    {
      "end_time": 1522.09,
      "index": 59,
      "start_time": 1498.985,
      "text": " Heat transferred between a system in a reversible process and a heat bath is, in a certain sense, worth more for what you want to do if it's at a lower temperature than a higher temperature. Because if I want to restore the initial state, I can either use a large quantity of heat at a high temperature or less heat at a lower temperature."
    },
    {
      "end_time": 1553.097,
      "index": 60,
      "start_time": 1523.319,
      "text": " That's what he calls the equivalence value of heat. It's a function not just of the amount of heat, but the temperature which is being transferred. In fact, as Kelvin realized, I can define a temperature scale, which we call the Kelvin scale or absolute temperature scale, so that the equivalence value of a quantity of heat at a given temperature is just inversely proportional to the temperature. Just define the temperature scale that way."
    },
    {
      "end_time": 1583.609,
      "index": 61,
      "start_time": 1553.609,
      "text": " Okay, so one Statement of the second law is that if I take something in a cycle and There's heat being exchanged to various temperatures Add up all the equivalent equivalence values to those heats It can't be More greater than zero. It's less than or equal to zero"
    },
    {
      "end_time": 1610.606,
      "index": 62,
      "start_time": 1584.599,
      "text": " And if it's a reversible cycle, then it's equal to zero. The sum total of all those equivalence devalues is equal to zero. Yes. Okay. Okay. Now I can define entropy because it's a consequence of that. If I have two different thermodynamic states and I go from one to another in a reversible process,"
    },
    {
      "end_time": 1641.374,
      "index": 63,
      "start_time": 1611.852,
      "text": " The total equivalence values over those processes is not going to matter which reversible process. Suppose there's more than one reversible process that gets me from state A to state B. And if I go via one reversible process, add up all the equivalence values of heat, it'll be the same as another process. And the argument is if these are reversible processes,"
    },
    {
      "end_time": 1669.258,
      "index": 64,
      "start_time": 1642.824,
      "text": " Let me see, so you require the second law, if my understanding is correct, in order to make thermodynamic entropy a well-defined quantity. Exactly, yeah. So the definition of thermodynamic entropy is, if I want to know the entropy difference between two states, then cook up any"
    },
    {
      "end_time": 1696.578,
      "index": 65,
      "start_time": 1669.548,
      "text": " Reversible process to connect those two states and just add up the equivalence values of heat transferred in that process And it's a consequence of the second law that it doesn't matter which reversible process I use So when someone says I've broken the second law can that statement even be made Or are you saying that there's you have to assume it in order to define the entropy. And so how are you going to break the second law? Yeah, so if you"
    },
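A small numerical sketch of the definition just described (my own illustration, not from the episode), using an ideal monatomic gas as the worked example: the Clausius entropy difference between two states is the sum of the equivalence values dQ/T along any reversible path, and two different reversible routes give the same answer.

```python
import math

# Verify path-independence of the Clausius entropy difference for an ideal
# monatomic gas (illustrative states; all numbers are assumptions of mine).
R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # molar heat capacity at constant volume, monatomic ideal gas
n = 1.0            # moles
T1, V1 = 300.0, 0.010   # state A: temperature (K), volume (m^3)
T2, V2 = 450.0, 0.025   # state B

def isothermal_dS(Va, Vb):
    # Reversible isothermal expansion: dQ = p dV = nRT dV/V,
    # so the integral of dQ/T is nR ln(Vb/Va) (the T cancels).
    return n * R * math.log(Vb / Va)

def isochoric_dS(Ta, Tb):
    # Reversible constant-volume heating: dQ = n Cv dT,
    # so the integral of dQ/T is n Cv ln(Tb/Ta).
    return n * Cv * math.log(Tb / Ta)

# Route 1: expand isothermally at T1, then heat at constant volume V2.
dS_route1 = isothermal_dS(V1, V2) + isochoric_dS(T1, T2)
# Route 2: heat at constant volume V1 first, then expand isothermally at T2.
dS_route2 = isochoric_dS(T1, T2) + isothermal_dS(V1, V2)

print(dS_route1, dS_route2)
assert abs(dS_route1 - dS_route2) < 1e-9
```

The heats exchanged along the two routes differ (the isothermal heat at the higher temperature is larger), but the sums of dQ/T agree, which is exactly the path-independence the second law guarantees.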
    {
      "end_time": 1724.974,
      "index": 66,
      "start_time": 1697.875,
      "text": " If you try to express the second law as total entropy of a, in any process, the entropy of an isolated system never decreases. Um, that's actually incorrect because if you could break the second law, then you, um, you wouldn't have a well-defined thermodynamic entropy. Interesting. Yeah. So, I mean, this is a point I think that a lot of people miss. It's actually a consequence. What I'm saying is,"
    },
    {
      "end_time": 1755.026,
      "index": 67,
      "start_time": 1725.316,
      "text": " What I'm saying about the definition of thermodynamic entropy is not anything radical. It's like standard textbooks. Very often what people do in textbooks is they give a statement of the second law, something like Clausius version is there's no process whose net has no other effect than moving heat from a cold body to a hot body. And then"
    },
    {
      "end_time": 1786.049,
      "index": 68,
      "start_time": 1756.118,
      "text": " That's one version of the second law, and I express it without mentioning entropy. Yes. And then say, given the second law, we can define thermodynamic entropy using the standard definition that you just take any reversible process connects two states, add up the equivalence values of heat along those processes and it's going to be the same. So even though a lot of people will say second law is"
    },
    {
      "end_time": 1813.319,
      "index": 69,
      "start_time": 1786.869,
      "text": " Total entropy of a bunch of systems that's isolated is never decreasing. If you mean thermodynamic entropy, that can't actually be the second law. That can be a consequence of the second law. But if you actually break the second law, like if I could have a process that had no other effect than to move heat from a cold body to a hot body,"
    },
    {
      "end_time": 1842.688,
      "index": 70,
      "start_time": 1814.002,
      "text": " Then thermodynamic entropy just wouldn't be well defined. Wayne, let me ask you something. So you're extremely historically informed. Yeah. So historically speaking, how do people think about entropy? What I mean to say is we can make an analogy with heat. So heat we now think of as having to do with the motion of molecules. OK, it's something about the motion of molecules. Now, temperature is also something about the motion of molecules. But prior, 100 years ago, 200 years ago, it was thought of as some form of fluid."
    },
    {
      "end_time": 1870.93,
      "index": 71,
      "start_time": 1843.336,
      "text": " Okay, so how did people used to think about entropy? Did they think of it as a quantity like temperature was something abstract? Did they think of it like a fluid? Did they think of it like something else? What was their mental model for entropy? That's a very good question. Because as a matter of fact, Carnot, when he wrote his book, he was thinking of heat as a fluid they call the caloric that was conserved and it flowed between bodies. And"
    },
    {
      "end_time": 1899.633,
      "index": 72,
      "start_time": 1871.305,
      "text": " Even Kelvin, when he was writing his first papers on what we now call the Kelvin scale, was thinking of it that way. But what happened was that shortly after that, and this happened during Carnot's lifetime, Jewell did his experiments on what people called the mechanical equivalent of heat. The basic idea is if I do a certain amount of work,"
    },
    {
      "end_time": 1929.155,
      "index": 73,
      "start_time": 1900.026,
      "text": " I can generate a certain amount of heat and you can measure the work in terms of say foot pounds and you measure the heat with using a calorimeter. How much will this warm a given sample of water? And Jule decided there's a mechanical equivalent of heat that there's an equivalence between work measured in energy, you know, terms like foot pounds and things like that."
    },
    {
      "end_time": 1941.664,
      "index": 74,
      "start_time": 1929.616,
      "text": " and heat measured in calories and you can convert them and also there's no limit to how much heat you can generate if you just are doing enough work."
    },
    {
      "end_time": 1970.316,
      "index": 75,
      "start_time": 1941.903,
      "text": " A precursor to that was Count Rubford doing experiments with grinding cannonballs. Like if you guessed, not cannonballs, cannon bores. Like if you're grinding away... Oh, okay. Yeah, so what people have noticed is... I don't know what that is. What's a cannonbore? Okay, so how do you make a cannon? I don't know how people make cannons these days, but the way people make cannons back then is you would make a cylinder of iron or steel and then you drill a hole in it."
    },
    {
      "end_time": 1993.524,
      "index": 76,
      "start_time": 1970.93,
      "text": " So the cannon bore is the path that the projectiles and unsurprisingly, if you're drilling away the piece of metal, right? It gets hot, right? So the process involves, I think, horses driving this drill bit."
    },
    {
      "end_time": 2020.572,
      "index": 77,
      "start_time": 1994.019,
      "text": " Bore. Yeah. Yeah. Yeah. Um, horses driving this, um, drill bit and everything's got hot. So you have to cool it off with water and count. Redford did experiments and he got convinced that, um, if you've got enough horsepower, there's no limit to how much heat you can generate. And that didn't fit well with the idea that caloric heat is this fluid that you're squeezing out of the substance."
    },
    {
      "end_time": 2045.111,
      "index": 78,
      "start_time": 2021.084,
      "text": " because if there's a finite amount of heat in any given substance, eventually you think you would run out and not be able to, um, generate any more heat. Right. Right. Like you can't sweat infinitely. Yeah, exactly. You can't sweat it. Perfect. That's a wonderful analogy. Yes. Um, eventually you get to dehydrate dehydrated. Right. And so, yeah, um,"
    },
    {
      "end_time": 2067.312,
      "index": 79,
      "start_time": 2045.555,
      "text": " So what happened was people, largely due to Guell's experiments on the mechanical equivalent of heat, actually became convinced that heat was a form of energy similar to the mechanical energy of things moving around. And that became known as the kinetic theory of heat. And along with that,"
    },
    {
      "end_time": 2096.084,
      "index": 80,
      "start_time": 2067.961,
      "text": " is this picture of gases, for example, being full of molecules bouncing around and when they're hotter, they're moving faster. Now, the interesting thing is even though Carnot, when he initiated what we now call the theory of thermodynamics and Kelvin in his very first papers on this, even though they were thinking of heat as this conserved fluid, um,"
    },
    {
      "end_time": 2123.524,
      "index": 81,
      "start_time": 2096.527,
      "text": " Very quickly, the people who are working on thermodynamics got converted to the kinetic theory of heat. And so a lot of the same people who are developing what we now call thermodynamics were also working on the mechanical theory, the kinetic theory of heat, or Clausius had called it the mechanical theory of heat. And so Clausius was thinking of heat as involving molecules bouncing around. And so that"
    },
    {
      "end_time": 2148.763,
      "index": 82,
      "start_time": 2123.951,
      "text": " raises the question of how entropy might actually be realized in terms of what's going on with the molecules. And Clausius has some ideas about that which are mostly forgotten because none of them were very satisfactory. But the interesting thing is that I can develop the science of thermodynamics with"
    },
    {
      "end_time": 2178.217,
      "index": 83,
      "start_time": 2148.951,
      "text": " independently of the molecular hypothesis. Like I can talk about work being done at the macroscopic level and he being exchanged without really being committed to what was happening at the micro physical level there. Yes. And so Maxwell, who is one of these people who was at the same time participating in the development of what we now call thermodynamics and participating in the development of what we now call statistical mechanics."
    },
    {
      "end_time": 2208.387,
      "index": 84,
      "start_time": 2178.814,
      "text": " And of course, these are very often together in the same textbook these days. Maxwell said, well, thermodynamics is the study of the thermal and dynamical properties of matter without any hypotheses as the molecular constituent of matter. So according to Maxwell, as long as I'm doing thermodynamics, I should be kind of neutral about whether or not matter is composed of molecules. I can just talk about"
    },
    {
      "end_time": 2238.695,
      "index": 85,
      "start_time": 2209.326,
      "text": " heat and work as different kinds of energy exchange without being committed to how it's realized in the microphysics. And some thermodynamics textbooks actually do that. Like some thermodynamics textbooks you'll have, or actually sometimes you have a thermodynamics course and then a statistical mechanics course. And the thermodynamics course can actually be completely independent of talk about molecules. Yes."
    },
    {
      "end_time": 2262.551,
      "index": 86,
      "start_time": 2238.865,
      "text": " And, but in other books, and these usually have the title of thermal physics rather than thermodynamics, the two are go hand in hand. I actually think there's something to be said for the kind of old fashioned way of thinking about it that's still in some textbooks is, let's talk about thermodynamics as"
    },
    {
      "end_time": 2292.483,
      "index": 87,
      "start_time": 2263.148,
      "text": " sides of exchange of heat and energy, try to express the basic principles of thermodynamics independently of any particular theories about the molecular structure of matter. And then you can say, okay, once we acknowledge that matter has a molecular structure, how does it have to be modified? Okay. Why do you prefer that?"
    },
    {
      "end_time": 2318.046,
      "index": 88,
      "start_time": 2293.933,
      "text": " because it highlights a difference between two different forms of the second law of thermodynamics. And here's why. So one consequence of the second law of thermodynamics is if I have a heat engine"
    },
    {
      "end_time": 2346.681,
      "index": 89,
      "start_time": 2318.609,
      "text": " operating between two heat hole sources and things at different temperatures, there's a maximum efficiency. So the efficiency is if I pull out a quantity of heat, how much work can I get out? What you want to do is get as much work out as possible and then dump as small quantity as possible of heat back into the lower temperature thing. And one consequence of second law of thermodynamics is given any two temperatures,"
    },
    {
      "end_time": 2375.879,
      "index": 90,
      "start_time": 2346.988,
      "text": " There's a maximum efficiency of a heat engine operating between heat source and sink at those temperatures. Okay. And that's known as the Carnot efficiency, Carnot bound on efficiency. Okay. Now Maxwell, I think was the first. And so one way of staying second law of thermodynamics is no heat engine is going to have an efficiency that exceeds the Carnot bound. All right."
    },
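The Carnot bound just mentioned has a simple closed form, eta_max = 1 - T_cold/T_hot, with temperatures in kelvin. A minimal sketch (the reservoir temperatures below are illustrative assumptions of mine, not from the episode):

```python
# Carnot bound on heat-engine efficiency: eta_max = 1 - T_cold / T_hot.
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat drawn from the hot reservoir that any heat
    engine operating between these two temperatures (in kelvin) can convert
    to work."""
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("need t_hot > t_cold > 0 (kelvin)")
    return 1.0 - t_cold / t_hot

# An engine between a 500 K source and a 300 K sink can convert at most
# 40% of the heat it draws into work:
eta = carnot_efficiency(500.0, 300.0)
print(eta)  # 0.4
# So from 1000 J of heat, the work extracted is bounded by eta * Q_hot,
# i.e. at most 400 J; the rest must be dumped into the cold reservoir.
print(eta * 1000.0)
```

Note the bound depends only on the two temperatures, not on the working substance or the engine's design, which is what makes it a statement of the second law rather than of engineering.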
    {
      "end_time": 2395.896,
      "index": 91,
      "start_time": 2376.988,
      "text": " Okay, as Maxwell was the first to articulate, clearly, if the molecular kinetic theory of heat is right, that actually can't be strictly true. Why? Yeah, that's of course the right question, why?"
    },
    {
      "end_time": 2422.5,
      "index": 92,
      "start_time": 2396.271,
      "text": " Um, because, um, if the, um, molecular theory, the kinetic theory, if you're right, there's going to be a certain kind of unpredictability about how much work you're going to actually get because these molecules are bouncing around more or less at random. And the pressure of the gas on the piston is a matter of the molecules hitting the piston and bouncing off. And."
    },
    {
      "end_time": 2452.858,
      "index": 93,
      "start_time": 2424.428,
      "text": " on a fine enough scale that that force per unit area is going to fluctuating because the molecules are bouncing around more or less at random. Yes. And it could happen that you just happen to get lucky that during the time you're expanding the piston, more molecules than usual or maybe with a higher average energy than usual happen to hit the piston and you get more workouts than you would have expected. Right. Right."
    },
    {
      "end_time": 2478.814,
      "index": 94,
      "start_time": 2453.643,
      "text": " If you say to a physicist that, okay, that shows that the second law of thermodynamics can't be strictly true, they'll go nonsense, right? Because what you can't do is completely rely on that, right? Because it could happen that I get less than the Carnot efficiency, right? Because I'm- Yes. And so what physicists these days accept is, okay,"
    },
    {
      "end_time": 2502.142,
      "index": 95,
      "start_time": 2481.049,
      "text": " fine enough scale there's going to be these fluctuations in the amount of work you get but if you do it again and again and again on average you're not going to be you're not going to be reliably able to exceed it's a statistical law it's a statistical law yes perfect yes and so why I'm so"
    },
    {
      "end_time": 2527.79,
      "index": 96,
      "start_time": 2502.398,
      "text": " flummoxed as to why you like the thermodynamics one is because I personally don't like the thermodynamics view. I very much like statistical physics, but disliked thermodynamics. Okay. And so even the definition of entropy as a weighted sum of logs of probabilities, that makes intuitive sense to me, I can derive something with that. And I can make sense of picturing balls, billiard balls bouncing around. But"
    },
    {
      "end_time": 2550.265,
      "index": 97,
      "start_time": 2528.251,
      "text": " I would do something dangerous by accident."
    },
    {
      "end_time": 2580.725,
      "index": 98,
      "start_time": 2550.998,
      "text": " I don't build things by the hands but I do actually have heat engines that other people have built on my shelf back there."
    },
    {
      "end_time": 2607.022,
      "index": 99,
      "start_time": 2581.852,
      "text": " Thermodynamic entropy is closest to find it, and it's totally independent of any hypothesis of a molecular structure, and its definition presupposes for its definition the second law of thermodynamics in one of its formulations. So if the second law of thermodynamics is not right, then"
    },
    {
      "end_time": 2635.384,
      "index": 100,
      "start_time": 2607.91,
      "text": " entropy there is no such quantity as entropy as entropy as Clausius defined it because it's just not going to be true anymore that it doesn't matter which reversible process you pick to go from A to B to define your entropy. Okay here's why I think it's important to realize to make clear okay that's what thermodynamic entropy as Clausius defined it is because"
    },
    {
      "end_time": 2663.746,
      "index": 101,
      "start_time": 2636.169,
      "text": " It helps us realize that when you now make the move to statistical mechanics, the second law of thermodynamics as originally conceived actually isn't quite right. And it has to be replaced by, as you say, a statistical law. Yes. And is this statistical version not the original version of the second law of thermodynamics that physicists these days accept?"
    },
    {
      "end_time": 2680.725,
      "index": 102,
      "start_time": 2664.036,
      "text": " And then you use words like tendency. Yeah, right. And Maxwell himself said, and I think he was the first one to express it this way, the second law of thermodynamics is a statistical regularity."
    },
    {
      "end_time": 2711.374,
      "index": 103,
      "start_time": 2682.654,
      "text": " And this is at a time when people like were, this is like middle of the 19th century, or actually he said this in 1878. So, but this is a time when people are getting really impressed by statistical regularities because this is like, it was early in the 19th century that people really started gathering statistics about populations and noticing that there were these regularities, say the number of murders per capita in Paris year after year. And, um,"
    },
    {
      "end_time": 2741.254,
      "index": 104,
      "start_time": 2712.517,
      "text": " That's interesting. That's interesting because these are statistical regularities that you can depend on year after year of there are averages over, um, aggregate aggregates of individually in unpredictable events, right? Like, you know, like if, if people could predict exactly when and where a murder would take, then it would, would, would take place. Then it wouldn't take place. Right. Yeah. So, um,"
    },
    {
      "end_time": 2772.21,
      "index": 105,
      "start_time": 2742.432,
      "text": " Yeah, so this is Maxwell saying the second law really is just a statistical regularity and it has to do with, it's similar to the statistical regularities that the statisticians who are out there gathering data about populations are doing. And he actually gave a talk on molecules to the British Association for the Advancement of Science"
    },
    {
      "end_time": 2802.227,
      "index": 106,
      "start_time": 2772.858,
      "text": " which had only recently created a section for statistics and the hard scientists were kind of looking down on the social sciences and he gave this talk on on um molecules saying we the physicists have adopted methods from the statisticians because we're talking taking averages over a large number of quantities of things so um interesting yeah so yeah so the law the version of the second law that um"
    },
    {
      "end_time": 2810.469,
      "index": 107,
      "start_time": 2803.507,
      "text": " Most people accept these days as some kind of probabilistic or statistically qualified thing."
    },
    {
      "end_time": 2834.616,
      "index": 108,
      "start_time": 2811.101,
      "text": " The way that Szilard put it in the 1920s is, imagine someone who is trying to exceed the Carnot bound of efficiency is kind of like a gambler who's trying to break the bank at a casino. You might have occasional wins and losses, but in no way you're going to be reliably, on average, going to be able to win."
    },
    {
      "end_time": 2864.855,
      "index": 109,
      "start_time": 2835.742,
      "text": " Yeah, and there, you know, theorems coming from a probability theory about the impossibility of doing that because the expectation value of your winnings is always negative if the casino is doing what's right. And a lot of large numbers says that your winnings per game is going to, with high probability, get closer and closer to the expectation value. So it's using the second law should be thought of as a similar thing."
    },
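A toy illustration of the casino analogy (my own sketch, not Szilard's): in a game with negative expectation you may win individual rounds, but the law of large numbers drives your average winnings per round toward the negative expectation value, so you cannot reliably come out ahead. The roulette-style odds below are an assumption chosen for concreteness.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def play_round():
    # Win 1 unit with probability 18/38, lose 1 unit otherwise
    # (roulette-style odds; expectation = 18/38 - 20/38 = -1/19 per round).
    return 1 if random.random() < 18 / 38 else -1

n_rounds = 100_000
total = sum(play_round() for _ in range(n_rounds))
average = total / n_rounds
print(average)  # close to the expectation value of -1/19, about -0.053
```

Individual rounds fluctuate, like the pressure fluctuations on the piston, but the per-round average converges on the house's edge, which is the sense in which the second law is a statistical law.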
    {
      "end_time": 2890.452,
      "index": 110,
      "start_time": 2866.63,
      "text": " This actually brings in a connection with information if you're still thinking about thermodynamics as some kind of a resource theory. Suppose you've got a fluctuating pressure on your piston. If you knew when those fluctuations were going to happen,"
    },
    {
      "end_time": 2916.084,
      "index": 111,
      "start_time": 2891.323,
      "text": " If you could reliably say I'm going to pull the push-up piston out only when the pressure is momentarily higher than average, then you could violate even the statistical version of the second law. And think of it this way, rather than a piston, imagine you've got a box with a gas in it and there's a partition down the middle and a little hole in the middle so the gas can go through, right?"
    },
    {
      "end_time": 2942.551,
      "index": 112,
      "start_time": 2916.783,
      "text": " Okay, so there's going to be continually small fluctuations in the number of molecules on each side as they go back and forth, and there's going to be continuous small fluctuations in the pressure. And suppose it's been doing this for a while and then we close the hole and you're fairly certain that the pressure is greater on one side than other, right? If you know which side the pressure is greater,"
    },
    {
      "end_time": 2969.155,
      "index": 113,
      "start_time": 2943.473,
      "text": " Then you could exploit that to slightly increase the piston. And if you reliably knew, if you could do this again and again and reliably know where the greater pressure was, you could violate even a statistical version of the second law. Okay, so here's the, let's go back to the ancient question. So suppose you've got a box of gas in front of you."
    },
    {
      "end_time": 2995.657,
      "index": 114,
      "start_time": 2969.821,
      "text": " and it initially starts out with same pressure on both sides and then it fluctuates a bit and there's a higher pressure on one side than another and you close the hole so it's stuck like that. Has the entropy decreased? Well if you think okay you do the standard calculations of entropy you've got it doesn't matter which side the pressure is higher"
    },
    {
      "end_time": 3019.292,
      "index": 115,
      "start_time": 2996.732,
      "text": " because you can just do the standard calculation and if you've got a box with higher pressure on this side than lower pressure on this side, it has a lower entropy than a gas with the same pressure on both sides, right? And so if you're thinking of entropy as something that's supposed to be just, you know, property of the gas itself,"
    },
    {
      "end_time": 3048.183,
      "index": 116,
      "start_time": 3020.589,
      "text": " Then, yeah, the entropy is lower. It has decreased, right? And you will say that even when you... Sorry, you mean the property of the physical system, not the gas, like not the molecules, individual molecules of the gas, but the physical system, the whole system. The physical system is the whole gas, you know, the whole thing. Yes, okay. So the physical system is the gas in the box, right? Got it. And"
    },
    {
      "end_time": 3075.52,
      "index": 117,
      "start_time": 3048.626,
      "text": " Standard calculation, if I tell you here's a box with two chambers and there's this much gas at this much, let's say it's the same temperature on both sides and you had this much gas and this much pressure on this side and this much gas at this pressure on that side, then the minimum of entropy is when the two pressures are the same. Standard calculation that you learn to do in your intro thermonautics courses. Okay."
    },
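That standard calculation can be sketched numerically (an illustration of mine, not the speaker's): for an ideal gas at fixed temperature split between two equal chambers, the entropy as a function of the split is, up to constants, S = k * sum over chambers of N ln(V/N), and it peaks at the equal-pressure split. The particle number and volumes below are arbitrary assumptions.

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
N_total = 1000     # total molecules (illustrative)
V_half = 1.0       # volume of each chamber, arbitrary units

def entropy(n_left):
    """Ideal-gas configurational entropy (up to additive constants) of a
    fixed-temperature gas with n_left molecules in one half-volume and the
    rest in the other."""
    n_right = N_total - n_left
    s = 0.0
    for n in (n_left, n_right):
        if n > 0:
            s += n * math.log(V_half / n)
    return k * s

# The entropy is maximized at the equal split, i.e. equal pressures:
best = max(range(1, N_total), key=entropy)
print(best)  # 500: half the molecules on each side
# A spontaneous fluctuation to an uneven split lowers the entropy:
assert entropy(600) < entropy(500)
```

The uneven split having lower entropy is exactly the sense in which the fluctuation, once the hole is closed, leaves you with more available energy.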
    {
      "end_time": 3104.155,
      "index": 118,
      "start_time": 3076.135,
      "text": " Okay, so when you have a spontaneous fluctuation, one thing you can say is, well, look, what's happening is the entropy is actually spontaneously fluctuating around some kind of mean value, right? And so there are actually spontaneous decreases of entropy. However, here's another way to phrase the press question."
    },
    {
      "end_time": 3133.046,
      "index": 119,
      "start_time": 3104.65,
      "text": " I'm going to phrase it not in terms of entropy but in terms of what Kelvin and Maxwell called available energy. So here's a question about available energy. Available energy is, imagine I've got some physical system in front of you and you've got a heat bath at some fixed temperature and I task you with taking it in the state it's in and"
    },
    {
      "end_time": 3162.176,
      "index": 120,
      "start_time": 3134.326,
      "text": " Try to get as much work out of it as you can, but I'm going to specify the state you have to leave the system at the end. And the available energy is a measure of how much work you can get out. And it's equivalent to what we now call the Helmholtz free energy, which is total internal energy minus temperature times the entropy. If a gas"
    },
    {
      "end_time": 3192.295,
      "index": 121,
      "start_time": 3162.756,
      "text": " Spontaneously fluctuates to a situation where there's a more pressure on one side and we close the hole Has the available energy increased I'd say yes, you'd say yes, you can now use it to do something Okay, so how are you gonna use it to do something? If all the molecules are now on one side or just more of them, yeah, okay more Yeah, then you can place something here and it will spontaneously start to push this guy"
    },
    {
      "end_time": 3219.616,
      "index": 122,
      "start_time": 3192.654,
      "text": " to the left. Which side are you going to place it? Do that on? What do you mean? Like so you so if all molecules are one side, then you can just have a piston on that side. So yes. So I have the gas and you know, let's just say all the molecules on one side. Okay. So they're either all on the left side or on the right side. Yeah. Okay. So what do you what do you do to get work out of it?"
    },
    {
      "end_time": 3249.087,
      "index": 123,
      "start_time": 3221.067,
      "text": " You place the piston in the middle point that divides it and then you watch the piston grow. Right. So suppose you want to raise a, so if I, if I just let the piston go, then I haven't gotten a useful work. Well, suppose I want to raise a weight. Uh-huh. What do I do? I don't know. Well, here's the thing. If I, if I hook the thing up, if I put a piston, attach your weight to a piston,"
    },
    {
      "end_time": 3279.053,
      "index": 124,
      "start_time": 3250.247,
      "text": " Then, and I want to raise the weight. So they say, I've got the piston, I've got a string on a pulley, you're in a pulley and you've got a weight that can go up or down, right? You know, that string and pulley can be either on the left side or the right side of the piston. And if I don't know which side of the gas of the box the gas is in. Oh, sorry, I didn't realize that you don't know which side. Yeah, all I said was it's either in the left side or right side, right?"
    },
    {
      "end_time": 3307.585,
      "index": 125,
      "start_time": 3279.377,
      "text": " I see. Okay. Okay. So right. If you don't know, then you, you know, if you guess right, you might say, okay, I'm going to guess. Right. And I'm going to put the piston, I'm going to put the weight on this side of the piston and you could end up raising it. Right. But if you guess wrong, you could end up lowering the weight. Yes. Yes. So this is why, according to some people, it makes sense."
    },
    {
      "end_time": 3338.2,
      "index": 126,
      "start_time": 3308.456,
      "text": " um, to have entropy be a function, not only of like a physical state of the system, but about what somebody knows about it. Because if entropy is supposed to have the connection that with available energy that I just said, that is a measure of how if available available energy has to do with how much work you can get out of a system that depends not only the physical state of the system, but"
    },
    {
      "end_time": 3362.619,
      "index": 127,
      "start_time": 3339.309,
      "text": " your means of manipulating the system available to you and what you know about it. And what I just said is non-controversial. If the question is, how much work can I get out of a system? That depends on my means of manipulating the system and what I know about the system. This is a non-controversial thing."
    },
    {
      "end_time": 3392.398,
      "index": 128,
      "start_time": 3363.746,
      "text": " And if you want a notion of entropy to have this connection to available energy, then it makes sense to have a notion of entropy which is relative to means of manipulation available and a state of information by the system. And in my experience, I can say that to people who initially say to the angel question,"
    },
    {
      "end_time": 3421.8,
      "index": 129,
      "start_time": 3392.875,
      "text": " Oh, of course not. Entropy is patently a property of physical system alone. It's got nothing to do with what you might know about it. If I say, oh, well, if I think of thermodynamics as a resource theory about a theory about what agents with certain goals and certain means of manipulation can do with systems, and I want this notion of available energy"
    },
    {
      "end_time": 3448.336,
      "index": 130,
      "start_time": 3422.21,
      "text": " to be a measure of how much work you can get out of a system, then clearly available energy can be relative to means of manipulation and knowledge about the system. It matters if you know, because if you have to do different things to the system to get working on it, out of it, depending on which side of the box molecules are on, then information is a resource, right? And"
    },
    {
      "end_time": 3477.415,
      "index": 131,
      "start_time": 3449.241,
      "text": " And so there's a perfectly acceptable notion, and it doesn't matter whether you call it entropy or not, but there's this notion that has that connection to available energy. And what are we going to call it? Well, if you don't like to call that entropy, make up a new word for it. But it is actually very closely related to the"
    },
    {
      "end_time": 3500.367,
      "index": 132,
      "start_time": 3478.148,
      "text": " concept that Clausius coined the term entropy for. And then the difference between this and the Clausius notion of entropy is in traditional thermodynamics, you know, thermodynamics 1.0, what people were doing in the 19th century, they always assumed that even if"
    },
    {
      "end_time": 3528.251,
      "index": 133,
      "start_time": 3501.084,
      "text": " Matter is made up of lots of molecules and there's these fluctuations at the molecular level. We're dealing with them in bulk and any fluctuations are going to be negligible. Things might as well be predictable. And so we can just assume we know what's going to happen as a result of our manipulations. When you start getting down to the molecular level, and this is what the people who are working quantum thermodynamics are doing, they're saying, okay, you know, we're working at a level where"
    },
    {
      "end_time": 3548.968,
      "index": 134,
      "start_time": 3529.206,
      "text": " These molecular fluctuations aren't negligible. And if, you know, what you really want for a notion of entropy is something that's relative, say, to certain physical characteristics, but also, you know, it might be in this, this and this, this state."
    },
    {
      "end_time": 3577.688,
      "index": 135,
      "start_time": 3549.241,
      "text": " And what I can do with it, how it's going to respond to what my manipulations can depend on what state it is. So it can actually be relative to some, say, probability distribution over possible states. And then what you get is a sort of quantum, what you get is a signal of thermodynamics as a sort of statistical average."
    },
    {
      "end_time": 3607.005,
      "index": 136,
      "start_time": 3579.889,
      "text": " Now what happens in textbooks these days is there's basically, even statistical mechanics textbooks, there are textbooks that basically take that kind of approach and whether you're paying attention or not, entropy is actually defined in terms of a probability distribution which can represent a state of information about the system. And then the other way of doing it, and this is what you were regarding"
    },
    {
      "end_time": 3634.258,
      "index": 137,
      "start_time": 3607.363,
      "text": " So that is what's often called Boltzmann entropy."
    },
    {
      "end_time": 3658.251,
      "index": 138,
      "start_time": 3635.247,
      "text": " And the other one where entropy is defined in terms of probability distribution is often called Gibbs entropy. And they're both perfectly good concepts. But they have different uses. They're different. They're different things and they're different uses. So if I give you a box and say with probability"
    },
    {
      "end_time": 3688.353,
      "index": 139,
      "start_time": 3658.865,
      "text": " one half it's all the molecules are in this side and with probability one half the other side that will have a certain gives entropy associated with it which will have something to do with what you can do with it what works you can get out get out of it and then you say and and that will vary with what those probabilities are like it's more used that that it's more like if"
    },
    {
      "end_time": 3718.268,
      "index": 140,
      "start_time": 3688.763,
      "text": " Yes, right."
    },
    {
      "end_time": 3743.695,
      "index": 141,
      "start_time": 3718.865,
      "text": " You know, if it's either on this side or on this side or on that side, you calculate the Boltzmann entropy. If it's on this side, you calculate the Boltzmann entropy. If it's on the other side, it's the same. I see. And that's also correct. Right. So the Boltzmann entropy doesn't depend on what you know about the system. The Gibbs entropy does. They serve different purposes because different concepts."
    },
    {
      "end_time": 3767.619,
      "index": 142,
      "start_time": 3745.418,
      "text": " And what happens when people get in arguments about whether or not it makes sense to, when people get in arguments about whether or not it makes sense for entropy to be relative to a state of information, they have in mind different concepts of entropy which are perfectly well defined but for different purposes."
    },
    {
      "end_time": 3828.712,
      "index": 144,
      "start_time": 3799.053,
      "text": " OK, when I said probability here, the person who's listening may be thinking probability of what and then what we didn't say much. Maybe you mentioned it once or twice, but not much as microstate versus macrostate. So it's the probability of a certain macrostate. What is a macrostate? It's seen as a count of microstates. What is a microstate? Well, when people say what is the physical system, most of the time on this channel, when we're speaking about quote unquote fundamental physics, we're thinking of a microstate."
    },
    {
      "end_time": 3855.503,
      "index": 145,
      "start_time": 3829.138,
      "text": " So a macro state is then what? Like what the heck defines a macro state? Is it just us as people? We say this is something we care about more. So we're going to call this a macro state. Yeah. So that's a good question. And in fact, you'll find different answers in different textbooks because the people who want entropy"
    },
    {
      "end_time": 3883.746,
      "index": 146,
      "start_time": 3856.766,
      "text": " Statistical and mechanical entropy to be a property of the system by itself. They usually mean Boltzmann entropy Right, but the Boltzmann entropy what you do first step you do is you partition the possible the set of possible microstates into macro states and you say Whatever micro state it's in it's going to be in some macro state and some macro states"
    },
    {
      "end_time": 3910.572,
      "index": 147,
      "start_time": 3884.07,
      "text": " correspond to a bigger range of possible microstates than others, and the macrostates which correspond to a bigger range of microstates have higher entropy than the ones that correspond to a narrower range of microstates. And so the entropy does change with microstate because if the microstate changes,"
    },
    {
      "end_time": 3939.377,
      "index": 148,
      "start_time": 3910.862,
      "text": " Within a macro state, the entropy doesn't change, but if it crosses from one micro state to another, then the Boltzmann entropy changes. But the entropy isn't a property of the micro state alone because it requires this division into macro states, which isn't there in the fundamental physics. Yes. So technically speaking, any given micro state has entropy zero. Well, if you're talking about Boltzmann entropy, right?"
    },
    {
      "end_time": 3967.483,
      "index": 149,
      "start_time": 3940.469,
      "text": " Then in order to define Boltzmann entropy, I first have to partition the possible states into macro states, right? Right. Okay. And, um, however, if I, if I tell you, if I'm going to do like a really, really fine partition, right? You say my partition is I'm going to tell you, you know, every micro state is in, is in a different element of partition. I'm going to tell you exactly what the micro state is, right?"
    },
    {
      "end_time": 3992.5,
      "index": 150,
      "start_time": 3968.2,
      "text": " Then yeah, then every of those microstates will have zero entropy, right? Yeah. But that's what's, that would be kind of useful. That would be totally useless, right? So I think that even the people who are saying, no, entropy can't depend on us. It can't depend on what we know about it. It can't depend on how we can manipulate it. If what they're using as a notion of entropy is Boltzmann entropy,"
    },
    {
      "end_time": 4012.961,
      "index": 151,
      "start_time": 3992.927,
      "text": " It starts with a division, dividing up the set of possible states into macrostates. And you asked yesterday the question, well, what is a macrostate? Okay. Now, one thing that people often say is, well, look,"
    },
    {
      "end_time": 4036.664,
      "index": 152,
      "start_time": 4013.524,
      "text": " There's certain variables that we're going to be measuring, macro variables, and our measuring instruments are going to have a certain rank of precision and a macro state is a set of microstates that are indistinguishable according to the measurements that we're going to do. And then it's not there in the fundamental physics because it's relative to some set of"
    },
    {
      "end_time": 4063.865,
      "index": 153,
      "start_time": 4038.114,
      "text": " It's relative to some set of measures and some, you know, some set of instruments, some set of measurement positions. And I think that that's perfectly fine. And then saying, okay, well, we're not doing fundamental physics when we're talking about entropy. I think that's perfectly fine. It bothers people because"
    },
    {
      "end_time": 4090.725,
      "index": 154,
      "start_time": 4064.462,
      "text": " entropy increase is supposed to be one of the fundamental laws of the universe and it's not supposed to depend on things that aren't there in the fundamental physics. If you're, um, but, you know, it just might, that just might be the right answer. Right. Um, like a storm dynamics is not a fundamental theory. Right. Now, another thing you could say is, well, what really matters is, you know, if I'm thinking about this as a resource theory,"
    },
    {
      "end_time": 4121.357,
      "index": 155,
      "start_time": 4091.817,
      "text": " A distinction between macrostates, well, you know, I'm going to pay attention to distinctions if they make a difference to what I can do with them and not if they don't. So if all the molecules are on one, so if I guess, if I tell you how many molecules are on one side of the box and how many molecules are on the other, that, okay, that's really useful to know because"
    },
    {
      "end_time": 4149.019,
      "index": 156,
      "start_time": 4121.561,
      "text": " I can use that to expand one side or another. That's good to know. If all I've got is a piston that can expand the things in bulk and I don't have the means to manipulate things at the micro level, you tell me anything beyond that. You tell me the exact microstate, it doesn't affect what I can do with it. It doesn't affect what I can get out of it."
    },
    {
      "end_time": 4175.043,
      "index": 157,
      "start_time": 4150.759,
      "text": " And this is something that a lot of people misunderstand about Maxwell's demon example. The demon example was meant to illustrate the dependence of thermodynamic concepts like entropy on means of manipulation available. So in the"
    },
    {
      "end_time": 4201.101,
      "index": 158,
      "start_time": 4175.998,
      "text": " The first appearance of what we now call Maxwell's Demon was a letter from Maxwell to his friend, Peter Guthrie Tate, who was writing a sketch of thermodynamics. And Maxwell said, you know, you might want to pick a hole in the second law because he's saying the second law, you know, if the kinetic theory of gas is true, needs to be modified."
    },
    {
      "end_time": 4225.742,
      "index": 159,
      "start_time": 4202.005,
      "text": " And, you know, imagine some, you know, little being that could manipulate molecules individually or, you know, imagine that he's got a little trap door in between the compartments of the... I'll place a video on screen about this. Yeah, imagine that you've got gas in a box"
    },
    {
      "end_time": 4255.179,
      "index": 160,
      "start_time": 4225.913,
      "text": " Divide into two compartments and there's a little trap door and the demon can manipulate the door and let the faster molecules go one way and the slower molecules go another way. Well, that demon could create without expenditure of work or more minimal expenditure of work, create big pressure temperature differences that now we as macroscopic beings could exploit. And the moral of the story, according to Maxwell is that"
    },
    {
      "end_time": 4283.251,
      "index": 161,
      "start_time": 4255.845,
      "text": " The second law of thermodynamics is a statistical generalization, which is applicable only to situations where you're dealing with large numbers of molecules in bulk. And when he says statistical generalization, he's expecting his readers to be familiar with the sorts of statistical generalizations that the social sciences are coming up with, things like numbers of murders per capita per year."
    },
    {
      "end_time": 4310.486,
      "index": 162,
      "start_time": 4283.609,
      "text": " And if you think about it, there actually is a nice analogy. So if you keep sort of the macro level conditions the same, the broad scale socioeconomic conditions the same, then plausibly you're going to get a fairly stable number of murders per capita per year if you're in a given situation. But if you could"
    },
    {
      "end_time": 4339.462,
      "index": 163,
      "start_time": 4311.305,
      "text": " You imagine a team of psychologists going in and talking to people if they had the ability to identify people who were at risk for committing murder or something like that and talk to them and deal with them, you know, if they could deal with the people on a visual scale, then you might be able to change that per capita per murder, per capita murders per capita. Right. And so, but so what, um,"
    },
    {
      "end_time": 4369.633,
      "index": 164,
      "start_time": 4340.418,
      "text": " Matsuda's saying is this demon would be able to do what is at present impossible for us. Because we do not have the abilities to manipulate things at the molecular level. He didn't think, the way he put it made it clear that he didn't think there's any fundamental law of physics that would prevent further technological developments from getting to the point where we could do this."
    },
    {
      "end_time": 4392.568,
      "index": 165,
      "start_time": 4370.469,
      "text": " Yes. Right. Now, as a matter of fact, he didn't really see this. But if you now include the demon, make the demon operate in a cycle. So the demon, whatever it does, has to reset itself at the end of each iteration of whatever it's doing."
    },
    {
      "end_time": 4420.23,
      "index": 166,
      "start_time": 4393.626,
      "text": " Then that actually is a consequence of the laws of physics, both classical and quantum that on average, the demon can't break the second law of thermodynamics because the classical case would be the, you know, if you take the operation of, you know, take the demon plus the whole system as an isolated system. Yes. Yes. If the demon can,"
    },
    {
      "end_time": 4446.561,
      "index": 167,
      "start_time": 4421.544,
      "text": " operate in a cycle while reliably putting all the molecules in the left side of the box that is incompatible with Hamiltonian evolution, which conserves phase space volume. Like you'd be able to take a system that you actually be able to reduce the volume of phase space occupied with the system."
    },
    {
      "end_time": 4470.333,
      "index": 168,
      "start_time": 4447.005,
      "text": " And there's something similar in quantum mechanics where you've got, if you've got the whole system involving isolated evolution, then you've got, you know, you can't take something that's initially spread out over a big subspace of Hilbert space and put it into a small subspace."
    },
    {
      "end_time": 4500.418,
      "index": 169,
      "start_time": 4471.288,
      "text": " And so, actually, Maxwell didn't realize that if you require the demon to act in a cycle, but there's theorems to the effect of both classical and quantum mechanics, that the demon cannot reliably and continually do this. Precisely, what do you mean when you say that the demon acts in a cycle? The demon has to end up in the same physical state it started out with. Okay, why do you have to do that?"
    },
    {
      "end_time": 4528.012,
      "index": 170,
      "start_time": 4500.776,
      "text": " I imagine that look if the demon has a brain right and is opening closing this door then the brain changes. Yeah Yeah, right. So so here's what people were thinking and You know, you're right to talk about the brain and you know people have given simple models of this as like not, you know, not a A creature with a brain, but maybe a little device with like a memory device or something like that. Yeah. Yeah so that"
    },
    {
      "end_time": 4556.988,
      "index": 171,
      "start_time": 4528.899,
      "text": " The idea is that if the daemon has some kind of memory storage, and it always remembers what it did on previous iterations and never ever erases anything, eventually it's just going to run out of memory. So it can't keep on doing this forever and ever and ever."
    },
    {
      "end_time": 4579.206,
      "index": 172,
      "start_time": 4558.012,
      "text": " And if it has to act in a cycle, if it has to eventually erase the memory, then there's actually an entropy cost associated with erasing the memory, and that's sometimes known as Landauer's Principle."
    },
    {
      "end_time": 4608.439,
      "index": 173,
      "start_time": 4579.394,
      "text": " Yeah. And it really is just basically a consequence of what I just said, that if you require the demon to act in a cycle, then it can't consistently or reliably violate the second law. So if I don't require the demon to act in a cycle, then yeah, what it can do is"
    },
    {
      "end_time": 4638.899,
      "index": 174,
      "start_time": 4609.667,
      "text": " Okay, think about that blank memory as a resource. And it's doing this and eventually uses that resource and hands you this box with a higher pressure on one side than the others and said, okay, good, you know, now you can use that to raise a piston or something like that. Okay, what all you did was you took a resource and you converted it to another resource. You didn't actually violate the second law. Yes, yes, I see. Okay. So you actually have to think about that memory reserve, that blank memory reserve as having a"
    },
    {
      "end_time": 4654.838,
      "index": 175,
      "start_time": 4639.582,
      "text": " entropy of its own, so a memory which is just blank or maybe full of all zeros on this view has a lower entropy than a memory that's randomly populated by ones and zeros."
    },
    {
      "end_time": 4683.865,
      "index": 176,
      "start_time": 4655.333,
      "text": " Okay, so let me see if I got this. There are two cases. Either it operates on a cycle or it doesn't. If it doesn't, it's going to use up a resource, in which case you still have a resource theory. If it does operate on a cycle, then fundamentally you'll be shrinking phase space. Now I know the most physical systems shrink phase space because there's some friction and so on, but at a fundamental level you don't shrink phase space. Sorry, you don't shrink the volume you initially started with in phase space. Absolutely, yeah. So when you've got a dissipative system, like something that's friction, right?"
    },
    {
      "end_time": 4705.589,
      "index": 177,
      "start_time": 4684.292,
      "text": " Then you write down equations of motion and it will go from, you know, everything in this original version of Space Face will go to that, right? But that's because we're not actually thinking that system is an isolated system, like it's in contact with something that's a source of friction, right? And if you include everything, like, you know, the"
    },
    {
      "end_time": 4733.626,
      "index": 178,
      "start_time": 4706.169,
      "text": " pendulum that's going back and forth and just made and whatever medium it is that's The source of friction and you think of all that is undergoing ice You know isolated evolution and you think okay ordinary Hamiltonian dynamics is going to apply then that System as a whole is not going to show me fixed phase space what's happening is as the pendulum is stamped and it goes and you know"
    },
    {
      "end_time": 4759.224,
      "index": 179,
      "start_time": 4734.445,
      "text": " Okay. Well, you're an expert in quantum mechanics and quantum field theory."
    },
    {
      "end_time": 4779.855,
      "index": 180,
      "start_time": 4759.582,
      "text": " And I'd like to talk to you about that next time in person, because you live actually close by. So hopefully we get to meet up shortly for my colleagues in Europe, London, Ontario and Toronto here. Ontario Count is close. My colleagues in the Netherlands always find it funny when I say things like that. So there's a Heisenberg cut."
    },
    {
      "end_time": 4802.688,
      "index": 181,
      "start_time": 4780.503,
      "text": " That's a good question."
    },
    {
      "end_time": 4832.637,
      "index": 182,
      "start_time": 4805.794,
      "text": " I will really have to think about, I sort of see where you're getting at, whether prima facie there might be a connection, but I'm not seeing exactly what the connection might be. And it's not obviously wrong. So I would have to think about that. Yeah, there might be actually. Yeah. My second funny question is, is the universe an isolated system?"
    },
    {
      "end_time": 4864.275,
      "index": 183,
      "start_time": 4834.445,
      "text": " Is the universe an isolated system? Can we even talk about the universe as a whole? Presumably, yes, if you actually mean, yes, everything in the universe is, if you literally mean the universe is everything there is, then... It seems tautologically the case. However, another question is, does the universe as a whole obey the sorts of laws that we usually think of"
    },
    {
      "end_time": 4893.08,
      "index": 184,
      "start_time": 4865.196,
      "text": " applicable to isolated systems. And here's why this is a genuine question. It might be that for relatively small systems that we can actually isolate, the physics we apply to isolated systems applies, but when things are big enough that doesn't actually apply. And what I mean by that is in quantum mechanics, what usually"
    },
    {
      "end_time": 4921.152,
      "index": 185,
      "start_time": 4893.422,
      "text": " When you're asking is the universe an isolated system, what you usually mean is it evolves according to whatever the appropriate analog of the Schrodinger equation is. And that can be represented by a family of unitary operators and that preserves Hilbert space norm. And so it can't start out in say, you know, a small subspace of Hilbert space and go to bigger, right?"
    },
    {
      "end_time": 4948.712,
      "index": 186,
      "start_time": 4921.732,
      "text": " But people who take dynamical collapse theories seriously think that actually this isolated Schrodinger revolution that we apply to sort of systems in the lab that we isolate is actually an idealization and not quite right. And that if you have systems that meet certain criteria, either they have enough particles or they involve displacements of large masses,"
    },
    {
      "end_time": 4974.189,
      "index": 187,
      "start_time": 4949.326,
      "text": " Then actually the physical law is a different one that isn't the law that we usually think of as isolated. And in fact, mathematically mimics the sort of laws that we use for systems that aren't isolated. As you know, because I know you've talked to people with this, there are dynamic collapse theories, right, which modify the usual Schrodinger equation."
    },
    {
      "end_time": 4998.063,
      "index": 188,
      "start_time": 4974.855,
      "text": " and basically you know the origin of those was you know people are studying um evolution of non-isolated systems and saying okay here's what happens to the state of the system if it's say in contact with the heat bath or something like that and then saying well let's just imagine that something of this form or a similar form actually is the fundamental law"
    },
    {
      "end_time": 5060.452,
      "index": 190,
      "start_time": 5031.493,
      "text": " It will still be tautologically. If that's the case, it'll still be a tautology that the universe is isolated. But it might be that the way the universe as a whole evolves is as if it's continually be monitored by some external measure. OK, my other funny question is,"
    },
    {
      "end_time": 5086.596,
      "index": 191,
      "start_time": 5060.896,
      "text": " It's often said that at some point we'll reach the heat death of the universe and that we won't be able to do anything, even if we're supposing that we're around or whatever is our descendants. Yeah. Now, do you imagine that to be the case? Because if we're thinking in terms of a resource theory, then I could imagine that there would be certain questions that would be more important to us that would be different to us, to whatever our descendants are. Maybe they"
    },
    {
      "end_time": 5109.326,
      "index": 192,
      "start_time": 5087.654,
      "text": " are able to utilize a system with more precision? The way we actually calculate entropy is, and this is something that's not often emphasized in textbooks, is relative to a certain set of parameters that we think we're interested in or we're going to manipulate. For example, if I want to"
    },
    {
      "end_time": 5132.09,
      "index": 193,
      "start_time": 5110.094,
      "text": " Calculate the entropy of a some standard volume of gas. The question is still do I count? Get samples of gas with different isotope isotope ratios differently and as long as I'm only dealing with things chemically it doesn't matter like how much of my oxygen has one is one isotope and how much it doesn't and"
    },
    {
      "end_time": 5160.759,
      "index": 194,
      "start_time": 5132.5,
      "text": " If I'm dealing with nuclear reactions, or if I've got some way to separate out things according to their mass, then that actually might be something. And so I might want to include that in the entropy calculations. There's a sense in which entropy is relative to what is it you're going to manipulate. However, when people are talking about the heat death is, well, no matter what it is that you want to do,"
    },
    {
      "end_time": 5184.65,
      "index": 195,
      "start_time": 5161.63,
      "text": " The natural tendency for things is towards, left to themselves, is towards diminished ability to do that. So eventually, and it was really in the 19th century that people started talking about the heat death of the universe. Kelvin himself wrote an article called On a Universal Tendency Towards Dissipation of Energy."
    },
    {
      "end_time": 5213.968,
      "index": 196,
      "start_time": 5185.572,
      "text": " And so what is going to happen, if things just keep going, is the sun's going to burn out and, okay, pretty much anything that we want to do here on earth, no matter what your goals are or your means of manipulation, it involves some kind of entropy difference that ultimately traces from the fact that on one side of us, we've got this high temperature source of"
    },
    {
      "end_time": 5230.128,
      "index": 197,
      "start_time": 5214.428,
      "text": " I'm"
    },
    {
      "end_time": 5258.387,
      "index": 198,
      "start_time": 5230.538,
      "text": " even if you have more subtle ways of manipulating things, um, eventually everything's going to decay into black holes and you know, eventually no matter what your goals are and no matter what, um, means you have manipulation, eventually things are going to run out and just stay that way forever. Unless Roger Penrose is right about his conformal cyclical cosmology when you know, every,"
    },
    {
      "end_time": 5288.166,
      "index": 199,
      "start_time": 5258.831,
      "text": " after that happens, then things get restarted. Okay. But like, um, so that's true. And, but honestly, um, we're talking like absurdly long time scales, like billions and billions of years. So, um, I think we should be more worried about whether humans going, what are things going to be like for people on earth in the next few centuries? Right. You know, um, that, that is something that we can do something about."
    },
    {
      "end_time": 5312.5,
      "index": 200,
      "start_time": 5288.524,
      "text": " Right. Um, and what's going to happen on this time scale of millions and billions of years, that's actually hard for us to wrap around our heads around it. So like, some people find this, you know, heat death of the universe, they sort of depressing and, um, you know, sometimes even people even say, okay, this makes everything meaningless. Um, well, you know what you all,"
    },
    {
      "end_time": 5335.265,
      "index": 201,
      "start_time": 5313.848,
      "text": " You've known most of your life you're going to die eventually, right? And you've got a certain limited amount of time to do stuff with, right? And do what you can with it while you have it. Make the best of the time that you have, right? And that applies on the human time scale. And I think the same thing we say, suppose"
    },
    {
      "end_time": 5365.026,
      "index": 202,
      "start_time": 5335.862,
      "text": " The human species is only going to be around for another million years. Well, I would say to that species, you do make the best of what you can while you with the time you have. So actually, I don't find it. I mean, I do get occasionally like, you know, everyone has to deal with the fact that you and everyone else that you know is mortal and you're you're not you have a finite lifespan. I do, you know, it's hard not to."
    },
    {
      "end_time": 5392.381,
      "index": 203,
      "start_time": 5365.469,
      "text": " Disorder. Disorder. Entropy. Yeah. Disorder is a word that we haven't said. Yeah. And it's something that many people in the popular press, when speaking about entropy, make an equivalence between entropy and disorder. What's wrong about that?"
    },
    {
      "end_time": 5420.572,
      "index": 204,
      "start_time": 5392.927,
      "text": " that you have to be careful about what you mean about order and disorder, right? And so, um, there's a real sense of what you're, if I've got a box of gas and there's a partition and all the gases on one side and none on the other, that is a more ordered state than if the, if the partition is out and there's gases all over the place. And there is a, um, and one way of thinking about that is that if"
    },
    {
      "end_time": 5445.213,
      "index": 205,
      "start_time": 5421.357,
      "text": " The gas is indeed allowed to roam freely. It could spontaneously end up in one side of the box, but that corresponds to a very, very small region of its phase space. And so the idea is that there are certain kinds of states that we find to be ordered, and those just are a small percentage of all the possible states."
    },
    {
      "end_time": 5457.005,
      "index": 206,
      "start_time": 5447.261,
      "text": " There is a sense in which entropy is a measure of disorder."
    },
    {
      "end_time": 5486.237,
      "index": 207,
      "start_time": 5457.671,
      "text": " Let's say I'm generating heat by friction, like I'm grinding this cannonball, right? There's a sense in which I've got some regular ordered motion, I've got this thing going around like that, and I'm taking energy from that ordered motion and I'm transferring it to the iron in the cannon which is manifested as a higgledy-piggledy motion molecule."
    },
    {
      "end_time": 5509.241,
      "index": 208,
      "start_time": 5486.63,
      "text": " I think what's not right about that is that not everything that we would intuitively think of as distinction between order and disorder is is actually corresponds to a distinction in energy. I'm sorry, not everything that entropy, right? Yeah. So if the easiest way that I think about it is a coffee cup."
    },
    {
      "end_time": 5535.725,
      "index": 209,
      "start_time": 5509.65,
      "text": " and initially it's black with black coffee and then you pour some milk and then there's all this turbulence and you'd say oh that's extremely disordered and so you stir it and then you're like oh wow now it's extremely ordered but it actually has the highest entropy yeah right yeah so yeah i mean that's that's a good that's a really good example because there are some things that seem more disordered to us that are actually lower entropy and that's a really good example you've got you know"
    },
    {
      "end_time": 5563.012,
      "index": 210,
      "start_time": 5536.305,
      "text": " Before the milk, it's actually better if it's cream because the cream can take some time to sort of dissolve, disperse, right? So if I've got, you know, if I take some thick cream and put in the coffee cup, I might have these swirls of cream in there, right? And that seems, you know, can be really turbulent and disordered. And then it settles down to a situation where the cream is evenly distributed. And that is a higher entropy state than the intermediate state. But"
    },
    {
      "end_time": 5592.5,
      "index": 211,
      "start_time": 5563.473,
      "text": " It seems to us like a simpler state, like more. And that's why it's important to think actually in terms of order and disorder on the molecular level. And also, not everything that we think of as more or less ordered really corresponds to entropy differences. So when gravity comes into play,"
    },
    {
      "end_time": 5622.637,
      "index": 212,
      "start_time": 5593.2,
      "text": " The natural tendency, I've got a bunch of gas spread out in the interstellar space, the natural tendency for it is for it to gravitationally clump together. A bunch of gas uniformly spread out, which clumps together and forms a star, that's actually an entropy increasing process, even though intuitively you might think the end state is more ordered than the initial state."
    },
    {
      "end_time": 5653.012,
      "index": 213,
      "start_time": 5624.599,
      "text": " And so if you're thinking sort of as a rough and ready way, there is a sense in which molecular disorder and entropy go together. It's not a reliable guide. And I think of what sometimes people think about when they mean order disorder is actually something a bit different in what people sometimes call complexity. So Sean Carroll was here sometime last year and he was talking about"
    },
    {
      "end_time": 5680.179,
      "index": 214,
      "start_time": 5653.37,
      "text": " Origins of complexity and when people who study complexity, you know, that is another really spike precise difference is that you know tend to say is neither the Minimum nor maximum entropy states are the most complex There's a sense right, okay, right Yeah Something we didn't speak about that comes to mind is ergodicity"
    },
    {
      "end_time": 5709.309,
      "index": 215,
      "start_time": 5680.64,
      "text": " So, are the laws of physics ergodic? Is that a well-defined statement? And also, please define what ergodic is. Yeah. So, classically, ergodicity pertains to a system confined to a finite region of a state space and undergoing isolated evolution. And to be ergodic means that take virtually any initial condition you want,"
    },
    {
      "end_time": 5737.944,
      "index": 216,
      "start_time": 5709.565,
      "text": " and take any finite region of faith space, eventually that initial condition ends up in that region of faith space. And for actual physical systems, it's very difficult to decide, you know, if I hand you like, here's a lot of dynamics and say, is this ergodic or not? It's actually different, very mathematically different, difficult problem to actually decide, right? So"
    },
    {
      "end_time": 5757.773,
      "index": 217,
      "start_time": 5739.804,
      "text": " Of course, that was a classical definition. The laws of physics deep down we know aren't classical. In quantum mechanics, basically, that definition of ergodicity, like a state, just doesn't really apply. There are"
    },
    {
      "end_time": 5786.681,
      "index": 218,
      "start_time": 5759.582,
      "text": " Things that are called quantum ergodicity theorems basically have the effect that any state can be approached as closely as possible. Are the actual laws of physics, like if I actually took say a box of gas and"
    },
    {
      "end_time": 5806.578,
      "index": 219,
      "start_time": 5787.227,
      "text": " somehow or another isolated it and let it go according to ordinary quantum evolution. There is a sense in which something like ergodicity applies in that if you look at sufficiently long time averages,"
    },
    {
      "end_time": 5837.312,
      "index": 220,
      "start_time": 5807.875,
      "text": " then the amount of time it will spend in a given subspace will be for almost all initial states proportional to the dimension of that subspace. I'm not sure that, and something of that sort of flavor often comes in when people are trying to, when people are trying to prove equilibration results. So you want, you know, what we haven't talked about is this sort of process of"
    },
    {
      "end_time": 5866.886,
      "index": 221,
      "start_time": 5837.756,
      "text": " You leave something alone, it starts out from a far equilibrium state and then it goes to a equilibrium state. And that's sometimes called the minus first law of thermodynamics. Right, right. And we haven't talked about that. And one of the reasons we haven't talked about that is that it's hard to say anything really precise about that because there's various results and it's not always clear. You're going to get nice clean mathematical results."
    },
    {
      "end_time": 5895.862,
      "index": 222,
      "start_time": 5867.398,
      "text": " whose physical significance for actual systems is a bit obscure. And then you get sort of plausibility results, arguments for actual physical systems. And I actually think that air gradicity in any sense, which really has to do with infinite long-term, um, average behavior really isn't the right question because what I want to know if I pour the milk in the coffee cup, right?"
    },
    {
      "end_time": 5924.667,
      "index": 223,
      "start_time": 5896.357,
      "text": " not what it's going to do on average if you've left it alone isolated for all eternity but what is what is it going to do in the next few minutes like you want to know actually what's going to happen in finite time scales so statistical mechanics textbooks are divided on whether their gaudicity is actually important for statistical mechanics some in this reading some will say okay this"
    },
    {
      "end_time": 5948.882,
      "index": 224,
      "start_time": 5925.009,
      "text": " Ergodic hypothesis is at the root of statistical mechanics, the hypothesis that actual systems are ergodic. And then others will say, oh, there's all this math, really nice mathematical work having to do with ergodicity and is completely irrelevant to statistical mechanics. Okay, here with the cream and the coffee cup, we only have to wait a few minutes."
    },
    {
      "end_time": 5969.735,
      "index": 225,
      "start_time": 5949.633,
      "text": " And so it's not infinite, it's not T goes to infinity. However, Neema Arkani Hamed also talks about how with particle physics, particle physics occurs at the boundary. Why? Because in the math, we're scattering from minus infinity to plus infinity. And yes, it takes place in just a few milliseconds or a few seconds or what have you. But for"
    },
    {
      "end_time": 5995.401,
      "index": 226,
      "start_time": 5969.855,
      "text": " For the calculations, we just use infinity. Now, he seems to be using the opposite argument that you just used. So would you be able to convince him? No, no, no, Nima. It's actually not happening at infinity. It's not at the boundary. Well, yeah. So when he says happening at infinity, I think one thing you have to realize is that when physicists say infinity, what they often mean is literally infinity, but"
    },
    {
      "end_time": 6018.08,
      "index": 227,
      "start_time": 5996.305,
      "text": " Large enough that it doesn't really matter how big it is. There's a nice book called Understanding the Infinite by a philosopher named Sean Levine and he introduces what he calls the theory of zillions and a zillion is a technical term. A zillion is a number that's so big it doesn't really matter how big it is."
    },
    {
      "end_time": 6048.148,
      "index": 228,
      "start_time": 6018.985,
      "text": " Right. So it's context dependent. Right. And if you think about it, that's sort of how we use the word. Like if I say, you know, if someone says, Wayne, you go back, you go to conferences so much, why don't you just buy a Learjet? Right. And I would say, well, you know, that costs like a zillion dollars. Right. I have no idea what a Learjet costs. Right. But I do know that whatever that cost is, is so far beyond my own financial resources. It doesn't really matter exactly how big it is. Right. And when, and, um,"
    },
    {
      "end_time": 6074.565,
      "index": 229,
      "start_time": 6048.814,
      "text": " one of my colleagues said, you know, we have to, um, realizing quantum field theory, you know, asymptotic infinity is like five meters, right? Because what you do when you're doing these scattering experiments, right? What you're doing is there's a relatively small scattering region, right? And far enough from that scattering region, the, the field is effectively free, right?"
    },
    {
      "end_time": 6104.241,
      "index": 230,
      "start_time": 6075.299,
      "text": " And then, and so you're basically taking in and out states as if, you know, as if they're free fields, you're doing your calculation and you're calculating scabbings, cross sections, et cetera, for effectively free fields. And really what you'll say add infinity, but, and mathematically you might take the limit as things go to infinity because that's a nice clean, clean physical result."
    },
    {
      "end_time": 6132.858,
      "index": 231,
      "start_time": 6104.753,
      "text": " But what you really mean is this is a good approximation far enough from the scattering region that the interactions can be neglected. And I think that that's what he means when he says the interesting stuff happens at infinity, right? And so with something like that, if I've got an interaction and I have a sense of how fast the interaction falls off with distance,"
    },
    {
      "end_time": 6158.575,
      "index": 232,
      "start_time": 6133.268,
      "text": " I can get a sense of how far I have to be from the scattering region to say, okay, these are effectively free fields, right? What we want from equilibration results is some result about, okay, how long do I have to wait till I say, okay, yeah, we're effectively at infinity because the thing has equilibrated. And that's what you're trying to get out of the equilibration results. And it's not as simple because"
    },
    {
      "end_time": 6187.534,
      "index": 233,
      "start_time": 6159.428,
      "text": " It's not as simple as in the quantum case because in the quantum case you've got these distance-dependent forces and you know how fast they drop off. And what you're trying to find out in the equilibration case is, well, how fast does something equilibrate? How fast does it get to the point where I can basically ignore the fact that it was out of equilibrium at the beginning? Okay, I have another funny question. Yes, okay."
    },
    {
      "end_time": 6217.773,
      "index": 234,
      "start_time": 6188.66,
      "text": " So Natty Seiberg said that one of the ways that we can, an indicator for quantum field theory being on shaky foundations or not firm foundations is that we teach quantum field theory differently. So almost no textbook in quantum field theory is the same and almost no course is the same. So some person may say, let's start with scalar fields and then add interactions. And then some person may say, well, let's start with all free fields and then add interactions and others as a functional approach and so on and so on."
    },
    {
      "end_time": 6218.848,
      "index": 235,
      "start_time": 6218.37,
      "text": " So"
    },
    {
      "end_time": 6275.435,
      "index": 237,
      "start_time": 6249.991,
      "text": " That's not a controversial statement. Right. That QFT isn't firm. Right. But what I'm wondering is, do you personally, Wayne, have an idea as to a field or a subfield, this particular subject in physics that other people think, no, no, this is well understood, but you think actually there is trouble here? Well, I think that statistics mechanics is a case in point because people"
    },
    {
      "end_time": 6297.585,
      "index": 238,
      "start_time": 6276.578,
      "text": " Textbooks are written to give the impression that we understand everything and this is all worked out. And if you actually go from one textbook to another, you'll find very different approaches. In quantum field theory, everyone knows that there's different approaches and statistical mechanics is sort of swept under the rug."
    },
    {
      "end_time": 6321.135,
      "index": 239,
      "start_time": 6298.234,
      "text": " And so that is one case where I think there are real questions about the rationale for certain kind of methods that got kind of swept under the rug. And the thing about thermodynamics is this thermodynamics is sort of similar, even though we don't think of it as fundamental cutting edge science, it's got its roots in 19th century physics."
    },
    {
      "end_time": 6349.974,
      "index": 240,
      "start_time": 6321.988,
      "text": " Different thermodynamic textbooks will take very different approaches. And I think that the root of that is that there are different conceptions about what thermodynamics is supposed to be. So one conception is what I call the resource theoretic conception, where it really is about what you can do with various things. But what people usually want from a"
    },
    {
      "end_time": 6379.292,
      "index": 241,
      "start_time": 6350.316,
      "text": " Thermodynamics textbook, especially if it's chemical, you know, preparing people for doing chemical thermodynamics is you want to figure out what the equilibrium states of a system are. And those are the ones that maximize entropy. And a textbook with that orientation will tend to minimize talk of manipulations in doing work and things like that and treat entropy as if"
    },
    {
      "end_time": 6409.241,
      "index": 242,
      "start_time": 6379.701,
      "text": " It is simply a property of matter, like mass and other things. I think in a lot of areas, actually, the textbook tradition will sometimes obscure different ways of thinking about the theory. And so the question is, in what areas are there where there is a sort of settled, everyone agrees on how to do this?"
    },
    {
      "end_time": 6432.824,
      "index": 243,
      "start_time": 6410.333,
      "text": " This classical electric dynamics, you know, every single textbook in existence is a copy of JD Jackson's book. Um, you know, yeah. And, and, and I think it is a, um, you know, that's possible because there is this, there is this"
    },
    {
      "end_time": 6457.381,
      "index": 244,
      "start_time": 6433.439,
      "text": " Theory that we call classical electrodynamics that we think has been superseded by quantum electrodynamics. So we can all agree on what classical electrodynamics is because it's not, it's in a sense a closed book. And quantum field theory is a continuing area of active research and"
    },
    {
      "end_time": 6484.889,
      "index": 245,
      "start_time": 6457.824,
      "text": " Yeah, so one of the different, one of the reasons for the difference of approaches in quantum field theory is we just don't have a good mathematical, as good a mathematical grip on the theory as we do other areas of physics, right? You know, you write down a Lagrangian, standard model Lagrangian, and is this a well-defined, you put in a cutoff and, you know, if you let the cutoff go through infinity, you have"
    },
    {
      "end_time": 6513.08,
      "index": 246,
      "start_time": 6485.452,
      "text": " You have blow ups which you have certain techniques for regulating. Is that telling us that the theory we wrote down actually isn't well defined at all energies? Or are these cutoffs that we're introducing just a calculational tool for getting at the consequences of a well-defined theory? I think that"
    },
    {
      "end_time": 6537.295,
      "index": 247,
      "start_time": 6513.507,
      "text": " As far as I understand it, and there are people who are much more on the literature, I actually think that that's still more or less an open question. I think the standard view is it doesn't really matter whether the theory you're writing down is well-defined at all energies because we think that it's an effective field theory valid at certain energies and we don't know what's going on beyond those energies."
    },
    {
      "end_time": 6566.647,
      "index": 248,
      "start_time": 6539.104,
      "text": " But you don't buy that answer or what? I think that's right. I think that's right. So that's why we can get away with actually not knowing the answers, whether the theory we write down is actually well-defined at all energies. If that's your attitude, then it doesn't really matter whether it is. Yeah. What's a lesson you learned too late? What is a lesson I learned too late?"
    },
    {
      "end_time": 6582.415,
      "index": 249,
      "start_time": 6572.739,
      "text": " And I'm assuming you want to know about, you know, physics and philosophy of physics and not about my personal life."
    },
    {
      "end_time": 6612.944,
      "index": 250,
      "start_time": 6584.07,
      "text": " So this, okay, this idea that, um, you know, what I've been saying about thermodynamics, that there's two different conceptions of what, what the theory is, but we have resource theory, um, um, consider a conception and this other conception according to which is more like mainstream physics. That took me a surprising amount of time to actually get clear in my own head about, but now I think it's really, you know,"
    },
    {
      "end_time": 6643.422,
      "index": 251,
      "start_time": 6614.104,
      "text": " It should be like one of the first things that anyone says when they're talking about thermodynamics. And as you know, I've given talks several times with the title, a tale of two sciences, both called thermodynamics. And yeah, really, it only is, you know, relatively recent years. I think, OK, that's the way I want to be thinking about that. Now, many people watch this channel who are researchers in math, physics, computer science, adjacent fields."
    },
    {
      "end_time": 6670.691,
      "index": 252,
      "start_time": 6643.916,
      "text": " and philosophers and there's also lay people that watch. So I'm curious what advice you have that you give to your graduate students, but also advice that may apply to this wide gamut of people that watch. Okay. I would say here's advice I give to my, um, grad students when they're in this will apply to like researchers in any field of the just starting out or something like that is,"
    },
    {
      "end_time": 6700.486,
      "index": 253,
      "start_time": 6672.654,
      "text": " When you're choosing what things to work on, what you should not do is look around and say, okay, what's the hot topic? What's the popular thing? And jump on the current bandwagon. And for two reasons. If you're doing something because you think it's popular and you're not particularly interested in it, well, if you're not interested in your work, there's just no way you're going to get anyone else interested in your work, right?"
    },
    {
      "end_time": 6726.664,
      "index": 254,
      "start_time": 6700.759,
      "text": " And also if you're jumping on a bandwagon and then you're, you know, applying for jobs and you're submitting your things you've written or you're submitting parts of your dissertation for public education. My experience as an editor, I was editor of a philosophy, physical kernel for a number of years. My experience as an editor, if you get, when we get a paper, which is, um,"
    },
    {
      "end_time": 6757.91,
      "index": 255,
      "start_time": 6728.746,
      "text": " The nth minor addition to a well-worn topic, the threshold for that being worth publishing is very, very high because you don't want to publish. Even if what you're saying is correct, one of the things you're asking is, okay, if this is going to take up journal space, is this actually a significant advance over what's out there in the literature? And really, if something's a hot topic, then people are going to get tired of it fairly quick."
    },
    {
      "end_time": 6787.722,
      "index": 256,
      "start_time": 6758.507,
      "text": " So do something you're interested in, but don't choose something just so narrow that only three people are going to have an idea of what you're talking about. So there's sort of a happy medium between choosing a research topic that some people are going to have some knowledge about and jumping on the bandwagon and doing what everyone else is doing. OK, so let's imagine you were speaking to your PhD and postdoc students who want to get a job in the field."
    },
    {
      "end_time": 6817.346,
      "index": 257,
      "start_time": 6788.456,
      "text": " I imagine that at some point they have to maybe not jump on a bandwagon, but hitch a ride occasionally because don't you have to get grants? Don't you have to be marketable? Okay. So how do you navigate that? So my teacher Abner Shimoni had the honor of working with him at Boston University when I was a grad student. One thing he would say is just as Aristotle taught us that ethical virtues are means between opposing vices, intellectual virtues are also means between opposing vices."
    },
    {
      "end_time": 6846.493,
      "index": 258,
      "start_time": 6817.585,
      "text": " And I think when you're choosing a research area, there's two opposing vices. One is choosing something that's such a niche area that only three people in the world are going to have any idea what you're talking about. And I think the other vice is jumping on a bandwagon and doing what everyone else is doing. And I think the reason I mentioned that other vice first is I think there's a mistaken impression out there that that's what you're supposed to do. That's what you should be doing."
    },
    {
      "end_time": 6876.869,
      "index": 259,
      "start_time": 6847.005,
      "text": " And I think that that's a mistake in the this is based on my experience as an editor and also talking to other people in the field who edit journals and also talking in my experience as On panels to the qtk grants and things like that is Sure, like if someone reading something it says I have no idea what this is about. Okay, that's that's a you know, that's a tough sell, right? but if you've got"
    },
    {
      "end_time": 6907.637,
      "index": 260,
      "start_time": 6878.012,
      "text": " A dozen grant applications in front of you, and ten of them are minor variants on the same thing. And what they're going to do is make a minor advance in a well-worn field. And another one is an interesting and promising research project that is worth doing but relatively unexplored. That's actually going to count in favor of the one that's worth doing and relatively unexplored."
    },
    {
      "end_time": 6937.722,
      "index": 261,
      "start_time": 6907.927,
      "text": " In terms of getting jobs, let me tell you this is a true story. Many, many years ago, we were hiring at the University of Western Ontario."
    },
    {
      "end_time": 6966.63,
      "index": 262,
      "start_time": 6938.336,
      "text": " And I was on the hiring committee and we had a job ad which was fairly broad. And what had happened is in one of the areas of specialization that was included in the job ad, a big name philosopher had recently published a book that was getting a lot of attention. And what happened was everybody in the world did a grad seminar on this book. And"
    },
    {
      "end_time": 6992.568,
      "index": 263,
      "start_time": 6966.8,
      "text": " I was sitting there reading these applications. This was back in the days when people actually sent us paper applications and there's a file box with all the applications in it and you take it into your office after hours and you're going through it, right? And I was reading the writing sample of one candidate and I said, I read the open paragraph and going, didn't I just read this? I'm going,"
    },
    {
      "end_time": 7022.637,
      "index": 264,
      "start_time": 6993.046,
      "text": " Oh my God, one of our applicants has plagiarized the writing sample from another applicant. And then I went back and got that other file and they were in fact different, but the opening paragraphs were almost word for word the same because this was this issue that everyone was talking about and there was a very standard way of setting up the issue. I see. So if you want people to actually confuse your writing sample with someone else's, then jump on a bandwagon."
    },
    {
      "end_time": 7050.759,
      "index": 265,
      "start_time": 7023.217,
      "text": " Interesting. So this also applies to film and businesses in general. You don't want to be in the red is said to be the red contested waters. You want to be in the blue ocean. I'm not familiar with that terminology, but I believe you. Yeah, right. You just referenced I could be I could give you a personal lesson, but instead I'll give you a lesson that applies to philosophers and physicists. And that has to be curious. What would be the personal lesson that you learned too late and something not trivial like"
    },
    {
      "end_time": 7074.684,
      "index": 266,
      "start_time": 7051.561,
      "text": " Oh, I learned to I should double bag my groceries. I see. Listen to my personal life that I think I learned too late."
    },
    {
      "end_time": 7114.258,
      "index": 267,
      "start_time": 7085.725,
      "text": " If there are toxic people in your life, avoid them. Be around people that you're comfortable around and you feel good around and try to minimize your contact with the toxic people. Thank you so much for spending so much time with me. Well, thank you. I've really enjoyed this. Well, it's been two hours. It feels like I just flew by."
    },
    {
      "end_time": 7130.879,
      "index": 268,
      "start_time": 7114.565,
      "text": " Yes, that's always a great site. In fact, in Harry Potter, I think there was a sands of time. And then Harry asks the professor, what is this? Because it was a different type of sands. And then the person said it stands still when the conversations engaging. Well, that's good."
    },
    {
      "end_time": 7160.794,
      "index": 269,
      "start_time": 7131.288,
      "text": " I have another life lesson, which I heard early in my life, but I am to this day, not particularly good at applying. And this is some, this was an interview that I heard on the radio as a teenager with David Lee Roth, who was big at the time when I was a teenager. And he said, here's my life lesson. Don't sweat the little shit. And it's all little shit."
    },
    {
      "end_time": 7178.592,
      "index": 270,
      "start_time": 7161.305,
      "text": " Interesting. I mean, I don't think it's actually true that it's all little shit, but I think a lot of I think they don't sweat the little shit is something a lot of us have difficulty applying that we end up fussing too much about things in the long run aren't really important."
    },
    {
      "end_time": 7207.329,
      "index": 271,
      "start_time": 7179.582,
      "text": " Hi there, Kurt here. If you'd like more content from Theories of Everything and the very best listening experience, then be sure to check out my sub stack at KurtGymungle.org."
    },
    {
      "end_time": 7230.043,
      "index": 272,
      "start_time": 7207.568,
      "text": " Some of the top perks are that every week you get brand new episodes ahead of time. You also get bonus written content exclusively for our members. That's C-U-R-T-J-A-I-M-U-N-G-A-L dot org. You can also just search my name and the word sub stack on Google. Since I started that sub stack,"
    },
    {
      "end_time": 7250.196,
      "index": 273,
      "start_time": 7230.333,
      "text": " It's somehow already became number two in the science category. Now, Substack for those who are unfamiliar is like a newsletter, one that's beautifully formatted. There's zero spam. This is the best place to follow the content of this channel that isn't anywhere else. It's not on YouTube. It's not on Patreon."
    },
    {
      "end_time": 7279.275,
      "index": 274,
      "start_time": 7250.418,
      "text": " It's exclusive to the Substack. It's free. There are ways for you to support me on Substack if you want, and you'll get special bonuses if you do. Several people ask me like, hey, Kurt, you've spoken to so many people in the field of theoretical physics, of philosophy, of consciousness. What are your thoughts, man? Well, while I remain impartial in interviews, this Substack is a way to peer into my present deliberations on these topics."
    },
    {
      "end_time": 7308.968,
      "index": 275,
      "start_time": 7279.531,
      "text": " And it's the perfect way to support me directly. KurtJaymungle.org or search KurtJaymungle sub stack on Google. Oh, and I've received several messages, emails and comments from professors and researchers saying that they recommend theories of everything to their students. That's fantastic. If you're a professor or a lecturer or what have you, and there's a particular standout episode that students can benefit from or your friends,"
    },
    {
      "end_time": 7339.07,
      "index": 276,
      "start_time": 7309.309,
      "text": " Please do share. And of course, a huge thank you to our advertising sponsor, The Economist. Visit Economist.com slash Toe, T-O-E, to get a massive discount on their annual subscription. I subscribe to The Economist and you'll love it as well. Toe is actually the only podcast that they currently partner with. So it's a huge honor for me. And for you, you're getting an exclusive discount. That's Economist.com slash Toe, T-O-E."
    },
    {
      "end_time": 7357.534,
      "index": 277,
      "start_time": 7339.326,
      "text": " And finally, you should know this podcast is on iTunes, it's on Spotify, it's on all the audio platforms. All you have to do is type in theories of everything and you'll find it. I know my last name is complicated, so maybe you don't want to type in Jymungle, but you can type in theories of everything and you'll find it."
    },
    {
      "end_time": 7375.879,
      "index": 278,
      "start_time": 7357.637,
      "text": " Personally, I gain from rewatching lectures and podcasts. I also read in the comment that toll listeners also gain from replaying. So how about instead you relisten on one of those platforms like iTunes, Spotify, Google podcasts, whatever podcast catcher you use. I'm there with you. Thank you for listening."
    }
  ]
}
