Theories of Everything with Curt Jaimungal

Jacob Barandes: "There is No Quantum Multiverse"

February 18, 2025 · 2:54:41


Transcript

[0:00] The Economist covers math, physics, philosophy, and AI in a manner that shows how different countries perceive developments and how they impact markets. They recently published a piece on China's new neutrino detector. They cover extending life via mitochondrial transplants, creating an entirely new field of medicine. But it's also not just science, they analyze culture, they analyze finance, economics, business, international affairs across every region.
[0:26] I'm particularly liking their new Insider feature, which just launched this month. It gives me front-row access to The Economist's internal editorial debates, where senior editors argue through the news with world leaders and policymakers, plus twice-weekly long-format shows, basically an extremely high-quality podcast. Whether it's scientific innovation or shifting global politics, The Economist provides comprehensive coverage beyond the headlines.
[0:53] We don't have a single interpretation of quantum mechanics that doesn't have serious problems.
[1:12] I traveled to the oldest laboratory in the United States to meet with theoretical physicist Jacob Barandes at Harvard, where he's a co-director of graduate studies. We delved into the technical depths of his innovative reformulation of quantum theory based on more fundamental mechanisms called indivisible stochastic processes.
[1:29] My name is Curt Jaimungal, and this was part of my three-day tour at Harvard, Tufts, and MIT, where I recorded five podcasts, one of which you're seeing now with Jacob Barandes. It was actually over seven hours long, so we're splitting it into two parts, and this is part two. Part one is also linked in the description. The others are with Michael Levin, Anna Ciaunica, Manolis Kellis, and William Hahn. Subscribe to get notified.
[1:51] In this episode, we talk about the misconceptions around wave-particle duality and entanglement. Is gravity indeed quantum? What about non-locality and Bell's theorem? And what exactly are indivisible stochastic processes? Curt, it's good to see you again. Good to see you. It's been so long. Wave-particle duality. What is that? All right.
[2:16] When Schrodinger introduced the idea of his wave function in that paper in early 1926, building out of Hamilton-Jacobi theory, his undulatory theory of mechanics, this wave function that lived in high-dimensional configuration space, he had provided a new methodology, a technique for computing things in quantum mechanics. He used the wave function as an indirect way to calculate energy levels: what are the energy levels of atoms, which then corresponded to the frequencies of radiation that came out of atoms?
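For concreteness, the levels-to-frequencies link he mentions can be sketched with the hydrogen spectrum. This is my own illustration with standard textbook constants, not a calculation from the episode:

```python
# Hydrogen energy levels E_n = -Ry / n^2; a transition n_upper -> n_lower
# emits radiation at frequency (E_upper - E_lower) / h, which is the
# levels-to-spectral-frequencies link Schrodinger's method delivered.
H_PLANCK_EV_S = 4.135667696e-15   # Planck constant in eV*s
RYDBERG_EV = 13.605693            # hydrogen ground-state binding energy in eV

def energy_level(n: int) -> float:
    """Energy of hydrogen level n in eV (negative = bound)."""
    return -RYDBERG_EV / n**2

def transition_frequency_hz(n_upper: int, n_lower: int) -> float:
    """Frequency of the photon emitted in the n_upper -> n_lower transition."""
    return (energy_level(n_upper) - energy_level(n_lower)) / H_PLANCK_EV_S

# Balmer-alpha (3 -> 2): about 4.57e14 Hz, the red hydrogen line near 656 nm.
f_balmer = transition_frequency_hz(3, 2)
```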
[2:47] Einstein had a lot of problems with this.
[3:06] And part of the reason that Einstein in particular was concerned was because Schrodinger embraced a kind of what we would call wave function realism, that the wave function is a real thing, physically, metaphysically, real thing in a high dimensional configuration space that somehow projects its meaning into three dimensions of physical space and that really where everything was happening was in this high dimensional abstract possibility space, this configuration space, that's where the waves were. Eventually Schrodinger recanted that view
[3:32] In one of our earlier conversations, I talked about how in 1928, in his fourth lecture on wave mechanics, Schrodinger expressed some doubt about wave function realism. He indicated that maybe you could think of the wave function as playing out all the possible realities of what could happen to the system, a very embryonic version of the many-worlds interpretation. But Schrodinger recanted that view in 1928.
[3:56] In the face of things like Born saying that the wave function should be understood as a tool for computing measurement probabilities. But in the period from 1926 to 1928 when Schrodinger was still pushing this idea of the wave function as being sort of physically fundamental, Einstein was very confused. There's a very famous letter from December 4th 1926 from Einstein to his colleague Max Born, the same Born of the Born rule.
[4:26] In which Einstein famously says that he doesn't believe that God plays dice. This is famous: I just don't believe that God plays dice. What people don't often know is that the very next sentence in that letter is a criticism of Schrodinger's wave function realism. He says: waves in 3N-dimensional space, as if
[4:56] rubber bands, and he even writes an ellipsis in the letter. He doesn't even know what to say. What's interesting is that in the canonical translation of the Einstein-Born letters, the collection of correspondence between Einstein and Max Born, the letters were translated into English by Irene Born, and the end is missing.
[5:20] Einstein just says waves in three-dimensional space as if by rubber, you know, the end is missing. And without the end, you don't realize that his concern is not waves per se. His concern is 3N-dimensional wave functions in configuration space. That's what he was nervous about. So you miss something very important. But if you look at the original German, the end is there. So, you know, Einstein had a lot of misgivings about this idea.
[5:50] But the idea has origins that go back earlier, right? De Broglie's sort of matter-wave idea that particles and electrons had waves associated with them in analogy with how light was thought to be a wave classically and then there was evidence coming starting from Planck and Einstein that light had a particle-like character. This idea that certain phenomena had both particle-like and wave-like features became known as wave-particle duality.
[6:17] And when people study, for example, the double slit experiment, and they approach it in the traditional way, one particle at a time, with a wave function that we can pretend is moving in three-dimensional space (really just an artifact of the fact that configuration space for one particle looks three-dimensional), it looks like you should treat the particle as a wave as it goes through the slits in order to get the correct pattern of landing sites over many repetitions.
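A toy simulation of this pattern-of-landing-sites picture (the slit parameters here are made up for illustration): each run of the experiment yields a single dot, and only the histogram over many runs traces out the interference fringes.

```python
import numpy as np

rng = np.random.default_rng(0)

def double_slit_intensity(x, wavelength=1.0, slit_sep=5.0, screen_dist=100.0):
    """Far-field two-slit pattern (envelope ignored): P(x) ~ cos^2(pi d x / (lambda L))."""
    return np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2

def sample_landing_sites(n, x_max=40.0):
    """Each accepted sample is one 'run' of the experiment: one dot on the screen."""
    sites = []
    while len(sites) < n:
        x = rng.uniform(-x_max, x_max)
        if rng.uniform(0, 1) < double_slit_intensity(x):
            sites.append(x)
    return np.array(sites)

dots = sample_landing_sites(2000)
counts, edges = np.histogram(dots, bins=80, range=(-40, 40))
# With many repetitions, the histogram of individual dots traces out the
# cos^2 fringes; no wave is ever observed directly, only inferred.
```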
[6:42] You know, we don't actually see a wave on the other side. What we see is dots, many, many landing sites over many repetitions of the experiment. The wave is inferred. But when you measure where the particle is at the end of the experiment or you measure which hole it goes through, you get a definite result and that makes it look more like a particle. So this is the idea that sometimes things are particle-like. Sometimes they're wave-like depending on what feature of the system we're trying to study. This became known as wave-particle duality. This is further complicated by the fact that there are waves
[7:12] of a different kind in physics. Electromagnetic waves, for example. Light is a disturbance in the electromagnetic field that propagates like a wave through three dimensional space. And those are waves. I mean, like I said, I teach Jackson electromagnetism. We talk about waves moving through three dimensional space. It's very easy to confuse the waves of a field
[7:37] like the electromagnetic field with the wave functions or Schrodinger waves of quantum mechanics, but they're not the same thing. And this has bled into wave-particle duality. When Planck in 1900 and Einstein in 1905 and various people were proposing that light came in quanta, discrete particle-like quanta called photons, the wave
[8:02] that they were imagining as corresponding to photons was a three-dimensional electromagnetic wave, a wave of the familiar kind. The wave functions that Schrodinger introduced in 1926 were not like those waves. They were not three-dimensional waves of a field in physical space.
[8:26] They were these abstract complex valued functions in a high dimensional configuration space. And when you measured them, they collapsed. Now, if you're in an MRI machine and they've turned on a very strong magnetic field, you don't have to worry that if you do the wrong measurement you're going to collapse the magnetic field in the MRI machine. It's not that kind of field. The waves they're beaming at you are not those kinds of waves. So you have to make a distinction between
[8:55] field waves, the waves of a field, and Schrodinger waves. And I want to make super clear that in the indivisible stochastic approach to quantum mechanics that we've been talking about, I'm saying Schrodinger waves are not real things. These abstract things that live in this high-dimensional configuration space, those are not physically real. But classical waves, or the waves of a field, which are a conceptually different kind of wave,
[9:21] Those are perfectly valid. And if you're studying a system that's not made of particles, but a system made of fields, you're going to see wave-like behavior as well. But those are a different kind of wave. And these are the kinds of subtleties that I think get lost when someone just says wave-particle duality. So again, just to summarize, the relationship between a photon, a particle of light, and an electromagnetic wave is not like the relationship between an electron and a Schrodinger wave function for the electron.
[9:51] Now, what makes this even more confusing is that electrons do have fields also. There's a so-called Dirac field that plays a very important role in the Standard Model. And this is a field, a field in three dimensions for the electron, but the Dirac field for the electron is not the Schrodinger wave for an electron. So, you know, these are super subtle distinctions, but it's important to keep them in mind.
[10:17] What makes it even more confusing is that particles like electrons, which are called fermions, these are particles that have an intrinsic half-integer spin. They're the particles that obey the Pauli exclusion principle: you can't put them all in the same energy state. They make chemistry possible by preventing all the electrons from collapsing into the ground state. Electrons are like this; so are quarks, protons, and neutrons. Although they have fields associated with them, those fields are not classical fields like the electromagnetic field. They are much more bizarre and weird.
[10:46] And I'm not going to have time to talk very much about them except to say that one of the limitations of Bohmian mechanics is that it has a great deal of difficulty dealing with the kinds of fields associated with fermions. And that's one reason why Bohmian mechanics, the Bohm pilot-wave theory, has difficulty. I'm getting way ahead of myself, but I just wanted to clarify what's going on in wave-particle duality. So in the indivisible stochastic approach, there are no Schrodinger waves as part of the fundamental physics. Of course, when you go to the Hilbert space picture,
[11:15] you can mathematically write down wave functions, write down Schrodinger waves, and use them, but they're not physically there. You don't need them to explain the interference patterns. The indivisible stochastic dynamics itself generically predicts that over many repetitions of the experiment, you'll have dots that look like they're following some kind of wave equation. But there is no wave actually involved in those experiments. That said, I'm not saying that field waves, the waves in fields, are not there. That's a different kind of wave.
[11:43] So, speaking of these waves, you mentioned quantum field theory indirectly with Dirac. Does your approach illuminate any aspect of quantum field theory or the standard model? We've been talking about quantum mechanics, sure, especially in part one and part two. What about QFT? Yeah, so one of the nice things about Bohm's pilot wave theory is that it works really beautifully for systems of fixed numbers of finitely many non-relativistic particles. That's a lot of qualifications.
[12:13] doesn't work so easily for fields. You end up either having to do very complicated things or maybe even introducing stochasticity of some kind. It gets kind of messy, and there's a lot of difficulty handling fermionic fields, in particular the fields associated with particles like electrons. One of the advantages of this approach is
[12:36] Although, okay, let me just say something very quickly about Bohmian mechanics first, because it's related. In Bohmian mechanics, again for systems of fixed numbers of finitely many non-relativistic particles, we have deterministic equations. There's a pilot wave that guides the particles around. The wave function, the pilot wave, obeys the Schrodinger equation.
[12:55] Then another equation called the guiding equation is how the wave function, the pilot wave, guides the particles around. And everything is deterministic. There's no fundamental probabilities. There are some initial uncertainties in the initial configuration of the system. And these evolve to become the Born Rule probabilities later. But the dynamics is fundamentally deterministic and is not generating the probabilities in a fundamental, law-like way. This picture is in some ways very elegant.
[13:25] Provided you're okay with a pilot wave living in a high-dimensional configuration space. Although I should say that Goldstein, Dürr, and Zanghì have already proposed the idea that the Bohmian pilot wave is law-like and not a physical thing. So there are other ways to read this theory. The problem is it helps itself to a lot of very special features
[13:46] of models that consist of fixed numbers of finitely many non-relativistic particles, features that are unavailable when you go to more general systems like fields. So you end up having to write down a very different looking model, in some cases models where you now need to deal with stochasticity and indeterministic dynamics, and they just don't really work very well when you try to go beyond.
[14:09] One of the other things that Bohmian mechanics requires is a preferred foliation of space-time. Last time we spoke, we talked about how in special relativity there's no preferred way to take space-time and divide it up into moments of time; there are different ways to do it. The guiding equation, the equation that explains how the pilot wave, obeying the Schrodinger equation, guides the particles around, depends on there being a preferred foliation of space-time, a slicing of space-time into moments of time.
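For reference, the two Bohmian equations he describes can be written explicitly in their standard textbook form: the Schrodinger equation for the pilot wave \(\Psi\) and the guiding equation for the particle positions \(\mathbf{Q}_k\):

```latex
i\hbar\,\frac{\partial \Psi}{\partial t}
  = \Big(-\sum_{k=1}^{N} \frac{\hbar^{2}}{2m_{k}}\nabla_{k}^{2} + V\Big)\Psi,
\qquad
\frac{d\mathbf{Q}_{k}}{dt}
  = \frac{\hbar}{m_{k}}\,
    \operatorname{Im}\!\left(\frac{\nabla_{k}\Psi}{\Psi}\right)
    \Bigg|_{(\mathbf{Q}_{1},\dots,\mathbf{Q}_{N})}
```

Both equations are deterministic; the Born-rule probabilities arise only from uncertainty in the initial configuration \((\mathbf{Q}_1,\dots,\mathbf{Q}_N)\), exactly as described above.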
[14:39] It's also not really great. It works fine in the non-relativistic case, but we want to do relativistic physics like we often do when we want to do quantum field theory, which is the kind of models we use when we want to deal with special relativity and quantum mechanics together, as in the standard model. Preferred foliation is really difficult to deal with. Not impossible, but it'd be nice if we didn't need it. In the indivisible stochastic approach, there's no guiding equation. There's no pilot wave.
[15:06] It's not that you solve the Schrodinger equation, get a pilot wave, and then take the pilot wave and plug it into a guiding equation, which depends on a preferred foliation. None of that happens. There's just the indivisible stochastic dynamics, which can be represented in Hilbert space language. But the dynamics is just directly happening. There's no middleman; there's no pilot wave and guiding equation in the middle. This means the theory is not going to be deterministic. I think one question in the comments was: is this fundamentally deterministic or not? It's indeterministic. It's not a deterministic theory.
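A minimal numerical illustration of indivisibility (a toy of my own construction, not Barandes's full formalism): a stochastic process whose transition matrices are defined only from the initial time, and which cannot be factored through any intermediate time.

```python
import numpy as np

# Column-stochastic transition matrices valid only from the initial time,
# acting as p(t) = Gamma_t @ p(0). The process would be "divisible" if
# Gamma_2 = M @ Gamma_1 for some stochastic intermediary matrix M.
Gamma_1 = np.array([[0.5, 0.5],
                    [0.5, 0.5]])   # complete mixing by t = 1
Gamma_2 = np.array([[0.9, 0.1],
                    [0.1, 0.9]])   # nearly unmixed again by t = 2

# Because Gamma_1's columns are identical, M @ Gamma_1 has identical columns
# for EVERY matrix M, while Gamma_2's columns differ. So no intermediary M
# exists at all: the process is indivisible. A divisible (Markov) chain can
# never "re-sharpen" like this, but quantum dynamics can.
identical_cols = bool(np.allclose(Gamma_1[:, 0], Gamma_1[:, 1]))
divisible_here = bool(np.allclose(Gamma_2[:, 0], Gamma_2[:, 1]))  # required if above holds
```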
[15:36] But because there's no guiding equation, there's no preferred foliation. And because we're not relying on all these special features of the particle case, it's perfectly easy to now generalize this to more general kinds of systems. Have you done it? Have I done it? Good question. So there's this thing called time, and time is bounded and limited. Is it? It is, amazingly.
[16:02] In your framework? At least in my life. Okay.
[16:22] The claim here is that it's straightforward in principle to generalize this to quantum fields, because none of the obstructions are there like they were before. One of the problems with Bohmian mechanics is that your wave function has to live in a configuration space.
[16:37] And fermionic particles don't have a familiar kind of configuration space. This is what makes it so hard to do Bohmian mechanics. But there's no pilot wave here, so you just don't even have that obstruction. So many of the things that would have obstructed us from applying this to any kind of system just aren't there anymore. So if you want to deal with a field theory, you just replace particle positions with localized field intensities. These become your degrees of freedom. And then you just apply the stochastic laws to them, and it works the usual way.
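A sketch of the substitution he describes, assuming a simple lattice discretization of my own choosing: the configuration is a list of field intensities, and a stochastic law updates the whole configuration at once.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy discretization (my own, not a claim about the actual theory): the
# "configuration" is one field intensity per lattice site, and a stochastic
# law updates the entire configuration at once, exactly as it would update
# a particle position.
n_sites = 8
phi = np.zeros(n_sites)               # initial field configuration

def stochastic_step(phi: np.ndarray, noise_scale: float = 0.1) -> np.ndarray:
    """One stochastic update of the whole field configuration."""
    return phi + noise_scale * rng.standard_normal(phi.shape)

for _ in range(100):
    phi = stochastic_step(phi)
# phi is now one sampled trajectory; as with particle landing sites,
# probabilities come from repeating the run many times.
```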
[17:04] The problem with quantum fields in general is that they have infinitely many degrees of freedom, infinitely many moving parts. There's a whole renormalization and effective-field-theory story here, but at the simplest bird's-eye view, you have a degree of freedom at every point in space, so you have infinitely many of them.
[17:24] And this makes them very mathematically difficult to deal with. Even in the traditional Hilbert space or path integral formulation, quantum field theories are really mathematically tricky. And there are very few, if any, I think there are none, rigorously defined quantum field theories that are also empirically adequate. Like none of the quantum field theories that make up the standard model have been rigorously defined. This means that anytime you mention quantum field theory, you're going to run into mathematical difficulties that are just because quantum field theory is
[17:54] mathematically very complicated. So I think there's a research direction for enterprising students: not only formulate quantum field theory in this language, but also see whether it makes any of the mathematical difficulties easier. Do any of them become harder? What exactly does it look like when you do this super carefully?
[18:15] As you know, on Theories of Everything we delve into some of the most reality-spiraling concepts, from theoretical physics and consciousness to AI and emerging technologies.
[18:37] To stay informed in an ever-evolving landscape, I see The Economist as a wellspring of insightful analysis and in-depth reporting on the various topics we explore here and beyond.
[18:49] The Economist's commitment to rigorous journalism means you get a clear picture of the world's most significant developments, whether it's in scientific innovation or the shifting tectonic plates of global politics. The Economist provides comprehensive coverage that goes beyond the headlines. What sets The Economist apart is their ability to make complex issues accessible and engaging, much like we strive to do in this podcast.
[19:13] If you're passionate about expanding your knowledge and gaining a deeper understanding of the forces that shape our world, then I highly recommend subscribing to The Economist. It's an investment in intellectual growth, one that you won't regret. As a listener of TOE, you get a special 20% off discount. Now you can enjoy The Economist and all it has to offer for less.
[19:35] What is it about quantum field theory that makes it not rigorously defined other than the path integral? Because there are other approaches to quantizing than the path integral.
[20:05] So what makes it hard? Uh, not hard, not rigorously defined. Well, I mean, we have rigorously defined quantum field theories, but they tend to be quantum field theories defined in very low numbers of spacetime dimensions, where you can rigorously define all the integrations and take all the limits. We have quantum field theories defined by what are called the Wightman axioms.
[20:35] But these axioms are very strong and preclude the kinds of quantum field theories that seem most apt to describe sort of nature. There's so many different angles I could take for this. Let me just, I'll just pick one. So here's one way to see what can go wrong. If you take quantum electrodynamics, which is the quantum field theory that best describes electrons, and if you want you can add some of the heavier cousins of electrons like muons,
[21:02] and interacting with photons with electromagnetic fields. I should say, by the way, that most of what we do when we do quantum field theory is not look at particles dancing around. What we do is we introduce in the asymptotically distant past a so-called in-state, a quantum state vector that consists of some menu, some assortment of particles that are supposed to come into the experiment, and then we write down some menu of outgoing particles.
[22:02] You might go, how do we know a particle is going to come out? Well, we don't. We're going to be computing a probability that this goes to that. So we start with some incoming particles. We start with some proposed outgoing particles.
[22:11] And then, using either the path integral or other calculational methods, we compute the so-called complex-valued scattering amplitude. It's the complex number you get when you put these things together. It's the complex number that when you mod squared is supposed to be connected to the probability you'll get that particular outcome. In practice, what we do is compute what are called scattering cross-sections, which is like what fraction come out one way, what fraction come out another way.
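The mod-squared rule he describes, amplitudes adding before squaring, can be sketched numerically. The amplitude values here are invented for illustration; this is not a real QED calculation:

```python
# Toy scattering computation: two contributions to the same
# in-state -> out-state process are added as complex amplitudes FIRST,
# and only the total is mod-squared into a probability weight.
A1 = 0.30 + 0.40j
A2 = 0.10 - 0.20j

total_amplitude = A1 + A2
probability_weight = abs(total_amplitude) ** 2        # |A1 + A2|^2 = 0.20

# Contrast: adding probabilities instead of amplitudes kills interference.
no_interference = abs(A1) ** 2 + abs(A2) ** 2         # 0.25 + 0.05 = 0.30
```

The gap between the two numbers is exactly the interference cross-term that has no classical-probability analogue.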
[22:35] Notice these are all phrased in a way that is perfectly consistent with the textbook formulation of quantum mechanics. We're not asking what's going on in between. We're not dealing with macroscopic systems. We're doing exactly what the textbook axioms ask us to do which is
[22:47] You prepare, you compute probabilities of measurement outcomes. All the averages and numbers you're computing correspond to experiments repeated large numbers of times. So for the most part, you're not going to run into any of the fundamental inconsistencies or ambiguities in textbook quantum theory. So it's very easy to do quantum field theory and think there's no problem. Everything is great. We're doing quantum field theory. What's the problem? It's because most of what you're doing just doesn't run into any of those ambiguities you run into with the axioms. Now,
[23:16] This theory is very useful and we can compute a lot of stuff. We can't compute everything. There's some ingredients that you have to take from experiments, right? The so-called physical couplings you have to go out and measure and you plug them into the model. Because if you just sort of naively try to compute everything from first principles, what you discover is that certain quantities you might want to compute depend very sensitively on sort of the upper boundary of what you've put on the theory.
[23:46] So when you study a theory like this, you recognize you can't access arbitrarily high energy physics. Our experiments don't pump in more than a certain amount of energy. So we shouldn't extrapolate the theory to arbitrarily high energies. We're going to put a cutoff on the theory. We're only going to discuss what's happening to theory up to some energy level, some energy cutoff. It's just that some of the things you might want to calculate from first principles depend sensitively on the cutoff, and those are things your theory cannot provide you with.
[24:11] So we have to take some things from empirical data and put them in. We plug them in. They become some of the parameters in our theory. The standard model has about 20 or so of these parameters you have to take from the experiment and plug them in. And once you have them, you can now make huge numbers of non-trivial, highly accurate predictions about what happens. But you still have this upper boundary. And if you try to calculate things at arbitrarily high energies, eventually your calculations stop working.
[24:38] So one of the dirty secrets of physics is that much of the calculating we do is highly approximate. A lot of it, when we do it by hand, is using a tool called perturbation theory, which I cover in some of my courses. Perturbation theory is a systematic recipe for predicting, for calculating things. And this recipe just doesn't work very well once you start trying to push your predictions beyond a certain energy level. There is a trick you can do to change what the theory looks like as you study the different energy levels. It's called renormalization.
[25:07] And what you find is that some of the parameters in the theory, they stop having values that make it possible to do consistent perturbation theory. Now, if you wanted to rigorously define a quantum field theory, what you want to do is take some kind of a limit where you can study the theory at arbitrarily high energies. This corresponds, roughly speaking, to being able to assign degrees of freedom to arbitrarily small points in space.
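A standard one-loop illustration of why a cutoff is unavoidable (this is the textbook running of the QED coupling, not a calculation from the episode): the coupling grows with energy until perturbation theory fails entirely.

```python
import math

# One-loop running of the QED fine-structure constant (one lepton flavor):
#   1/alpha(mu) = 1/alpha(mu0) - (2 / (3*pi)) * ln(mu / mu0)
# The coupling grows with energy, and 1/alpha reaches zero at a finite scale
# (the Landau pole): one concrete sign that the theory cannot be extrapolated
# to arbitrarily high energies.
ALPHA_0 = 1 / 137.036                 # low-energy value, taken from experiment
B = 2.0 / (3.0 * math.pi)

def inverse_alpha(log_mu_ratio: float) -> float:
    """1/alpha at scale mu, where log_mu_ratio = ln(mu / mu0)."""
    return 1.0 / ALPHA_0 - B * log_mu_ratio

# ln(mu/mu0) at which the perturbative expansion breaks down entirely:
landau_log = (1.0 / ALPHA_0) / B      # roughly 646, i.e. mu ~ mu0 * e^646
```

The astronomically large scale of the pole is why QED works superbly in practice while still failing to be defined at arbitrarily high energies.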
[25:31] And you see there's immediately an obstruction here. For most of our theories, there's a cutoff. There's a boundary we can't go beyond. The theory simply doesn't let us go to arbitrarily high energy scales. And so we're not going to be able to write down a so-called ultraviolet complete, like perfectly fine-grained version of this theory. We can only use a theory up to some cutoff. The standard model, for example, is not expected to hold to arbitrarily high energy scales. We think that the theory is reliable only up to
[26:01] All of this takes us pretty far afield, no pun intended, of what we were talking about before, but these are the kinds of things like
[26:24] Maybe quantum field theories in the real world, real life out there in the wild, quantum field theories are never perfectly defined. Maybe all we have is a cascade of so-called effective field theories that are all well defined within some bounds. And there is no ultimate theory that is like perfect and fine-grained and, you know, perfectly fine-grained and ultraviolet complete. Maybe there's just like a tower of these theories. This makes questions of ontology and what's physically out there, I think, very murky.
[26:51] Because if we think there's never going to be some fundamental theory at the bottom of all of this, what really is out there in nature? That's, I think, an open question. Whether or not you want to phrase that question in a language of this sort of indivisible stochastic approach, I don't know. Or it could be that these theories tap out to some ultimate quantum field theory or some very different kind of theory like string theory or something like that. I mean, there are many proposals for maybe where this terminates. But
[27:19] I don't know. What I will say is this, though. There is a view that nature is fundamentally described by Hilbert spaces and quantum mechanics, the Hilbert space formulation of quantum mechanics. And if that's fundamental, then we already know what nature fundamentally is. Nature fundamentally is some wave function. That's it. Some wave function in some Hilbert space. We don't exactly know the features of the wave function. We don't know whether it's best described in terms of fields or something else.
[27:48] But we already know the fundamental ontology of nature. It's a wave function. So we're good. I think that's too ambitious. In the indivisible stochastic approach, there's no wave function, so the wave function is not what the ontology is. The ontology is whatever your choice of configurations is. And if you're modeling particles, you use particle configurations. If you're modeling fields, you use field configurations. If there's some
[28:11] Ultimate system that grounds everything else, some system at the bottom, some system that if we understood it and understood its laws, we would have the unified theory of all the physics. There would presumably be some configurations for that and we would use those instead, but we don't know that theory yet. And so I think it's premature to think that we know the right fundamental degrees of freedom. So if someone asks me, what do I think is fundamentally out there?
[28:36] I don't know, but then I'm just where we were 100 years ago or 150 years ago. We don't know the ultimate theory of nature yet, and I think it's premature at this point to guess what the ultimate ontology is going to be until we have that theory, should we ever have it. Well, I'm interested in ultimate theories. Theory of everything is the name of your podcast. Sorry, I don't have one for you. Well, I'm interested in your thoughts on how to merge quantum field theory with gravity.
[29:00] So I know we have a slew of audience questions and we're going to get to them, but they're going to have to wait because I have these questions first. These are close to what I wanted to talk about anyway. So go on. Yeah. Great. Okay. So two questions here. People say that, okay, look, we have this Heisenberg uncertainty, and that applies even in QFTs, and so thus spacetime is shaky. Okay.
[29:28] But spacetime in QFT is defined. You can perfectly well pick out an (x, t); the values of the fields themselves, which create particles, vibrate or are uncertain, but the spacetime itself is there and given. However, some people say that if you were to zoom in and you follow QFT, then because of Heisenberg's uncertainty you get to uncertainties of spacetime itself.
[29:54] Is that argument valid? Now, I imagine that one of the ways that they get to this argument is by saying you have an energy time uncertainty in general relativity, space time itself has energy. And so the space time itself must have some uncertainty to it. But then you could also say, well, in QFT, you don't know if the energy that it's talking about is the same of, well, okay.
[30:18] If you have a statement that applies to all natural numbers, you can't just apply it to anything. So x squared is always going to be a natural number if you're pulling x from the natural numbers, but not every number squared is a natural number. So it depends on the scope of what you're quantifying over. So is it the case that in QFT we can apply the energy-time uncertainty to GR? I don't know. So that's one question. We should address that question first before we ask any more questions. Yeah. So,
[30:47] It's important to step back here and just make sure we know what we're all talking about. So a quantum field, just because maybe not everyone knows what they are. So in a typical Hilbert space formulation of a quantum system, we have observables. Observables are these self-adjoint operators. And in a quantum field, we have operators associated with all points in space.
[31:09] And if we work in a formulation in which we move the time evolution out of the state vectors and into the observables, we have what's called the Heisenberg picture, and then our field operators depend on space and time. It's a fancy way of saying everywhere in space-time we've got these sort of local operators that are associated with points in space and time. Quantum field theories like QED, we talked about quantum electrodynamics, they presuppose this classical background space-time. There's no gravity, space-time is
[31:36] usually treated as flat. We call flat, ordinary special-relativistic spacetime Minkowski spacetime. You can do quantum field theory in a static curved spacetime, still not treating gravity as dynamical, but that gets very complicated. Let's start with quantum field theories like QED on special-relativistic, flat, non-dynamical spacetime.
[32:01] In that case, you're right. x, y, z, the coordinates of where you are, and t, do not fluctuate. They are fixed features of the background architecture of space-time. They're the stage on which the action is happening. Your question about the uncertainty principle and about fluctuations of fields is an interesting question. In the Dirac-von Neumann formulation of quantum mechanics, nothing is fluctuating between measurements because nothing is happening between measurements. The only things that are happening are measurements in the Dirac-von Neumann formulation.
[32:30] So to say, oh, when you're not measuring it, the fields are just dancing: the Dirac-von Neumann axioms don't say that. They say nothing about it. They don't say that particles are zooming around. The Dirac-von Neumann axioms don't let you say, oh, the reason this happened was an electron emitted a photon. All that is for color. I said this in one of our earlier conversations: physicists often talk this way. They're like, oh, this happened because an electron emitted that and did this and the field was fluctuating. If you're only working from the Dirac-von Neumann axioms, all of that is just
[32:59] fluff. None of it is really legitimated by the axioms. Now, if you're frustrated by that, you're like, well, but surely something is happening, I want to be able to say something is happening. Well, then you're on my side, which is that we need to do something to the Dirac-von Neumann axioms. You're making my case for me. Okay, so the uncertainty principle. We talked about this a little earlier: an observable corresponds to a certain basis, and when the state vector of your system is aligned with one axis of that basis, you definitely get that result when you measure it.
[33:29] If the state vector is not aligned with that basis vector, if it's got components along multiple basis vectors, then you're going to have probabilistic measurement results given by the Born rule. And you can be aligned along the axis of one observable and have a definite result, but not along any axis of another observable, and then you don't have a definite result for it. And if you change the state vector so that it's aligned along an axis of one, it's not aligned along an axis of the other. And this is the uncertainty principle: some observables will have sharp values, so that when you measure them, you always get a definite result, and others won't. And if you try to make one observable sharp, others will become unsharp. This is the uncertainty principle.
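This pattern, a state aligned with one observable's eigenbasis giving a definite outcome while another observable's outcomes stay probabilistic, can be checked numerically. A minimal textbook-style sketch (the Pauli matrices and the helper function here are my own illustration, not anything from the conversation):

```python
import numpy as np

# Pauli Z and X: two observables whose eigenbases are misaligned.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

# State vector aligned with the +1 eigenvector of Z.
psi = np.array([1, 0], dtype=complex)

def born_probabilities(observable, state):
    """Probability of each eigenvalue when measuring `observable` on `state`."""
    eigvals, eigvecs = np.linalg.eigh(observable)
    # Born rule: p_i = |<e_i|psi>|^2 for each eigenvector e_i.
    probs = np.abs(eigvecs.conj().T @ state) ** 2
    return dict(zip(np.round(eigvals, 10), probs))

print(born_probabilities(Z, psi))  # Z is sharp: outcome +1 with certainty
print(born_probabilities(X, psi))  # X is unsharp: 50/50 over its outcomes
```

Making Z sharp (by aligning the state with one of its eigenvectors) automatically makes X unsharp, which is the trade-off described above.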
[34:00] But notice, again, this is all phrased at the level of measurements. We're not saying that between measurements anything is fluctuating. So honestly, there's no way to say that the field is fluctuating on top of the space-time, or to say anything more about the Heisenberg uncertainty principle, other than: this is the pattern of measurement results we get when we do measurements on the system. Now, if you want to do something like Bohmian mechanics or the indivisible stochastic approach
[34:25] or the many worlds approach or something like that. Now we can actually begin to talk about what's happening between measurements because these are all theories that describe things happening that are not just the narrow class of measurements. In some of the theories, like in the indivisible stochastic approach, there's stochastic behavior. The fields really are fluctuating.
[34:46] In the Bohmian approach, it kind of depends on whether you're trying to get fields into a deterministic kind of Bohmian approach or whether you're going to allow the fields to be stochastic in some sense. There are many different formulations of Bohmian mechanics for fields, and I can't do justice to all of them. In some of them the fields would be fluctuating, in some of them they wouldn't be. In some of them you deny that there are fields and try to do everything with particles somehow. Many-worlds is a little more subtle, because in many-worlds
[35:10] There's not one reality in which things are fluctuating. It's just more subtle and we'll talk about the many worlds approach a little bit later. So I wanted to just get that out of the way before we then talk about the more subtle question about is space-time fluctuating. So when you go from field theory like quantum field theory like QED, where again the thing you're primarily computing is scattering amplitudes. You set up a preparation, you get measurement results, you're computing cross-sections, decay rates, those sorts of things. Now you want to talk about gravity.
[35:39] So in general relativity, gravity is a manifestation of the curvature of space-time. Space-time doesn't stay flat. It curves. People often ask, where is it curving? Is it curving in some other dimension? There's a way to define curvature, called intrinsic curvature, that does not make reference to other dimensions. You can define it solely in terms of the four dimensions of space plus time, so you don't need an extra dimension for curvature. And if
[36:08] gravity is quantum mechanical, does that mean that the curvature, or the shape of space-time, or the geometry, is also fluctuating in some sense? Now, there's this discussion about, well, if you zoom in, does it look... I mean, I don't exactly know what zooming in means. Do you mean if you're doing measurements or something like that? I mean, if we do very precise measurements on quantum fields on a background space-time, we may get a large variance of results.
[36:32] But those are measurement outcomes. It's not the field doing anything between measurements, because, again, without augmenting the Dirac-von Neumann axioms, we can't talk about what the fields are doing. Is space-time fluctuating? Well, according to the Dirac-von Neumann axioms, we can't say that. We could only say something like, if you do some kind of measurement of space, it's fluctuating. But I don't quite know how to measure space the way we would measure the intensity of a field. It's kind of subtle because
[37:02] The relationship between the gravitational field and the curvature of space-time and the behavior of test particles, of particles moving around on space-time, it's really subtle. Even the notion of energy is super subtle in general relativity. There isn't an invariant non-trivial definition of local energy density in the gravitational field itself in general relativity.
[37:25] It's actually really hard to pin down even what we mean by all of this. And we're not going to be guided by experiments, because unless there's some miracle, we wouldn't expect to see distinctly quantum mechanical features of the gravitational field until you're talking about Planck-scale physics. Planck-scale physics is the physics associated with extremely small distance scales.
[37:51] Right. I mean, they're as far from an atom as an atom is from the observable universe, something like that. I may not have the exact orders of magnitude, but the Planck scale is really, really small. The Planck time is 10 to the minus 43 seconds, and the Planck length is 10 to the negative 35 meters. So these are super duper tiny distance scales. And we can't do experiments there. So we have no experimental data to guide us.
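For readers who want to check those orders of magnitude, the Planck length and Planck time follow from combining hbar, G, and c (standard formulas; the constant values below are the usual SI values, rounded):

```python
import math

# Standard SI constants (rounded).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

# Planck length and time: sqrt(hbar*G/c^3) and sqrt(hbar*G/c^5).
planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
planck_time   = math.sqrt(hbar * G / c**5)   # ~5.4e-44 s

print(f"Planck length ~ {planck_length:.2e} m")
print(f"Planck time   ~ {planck_time:.2e} s")
```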
[38:20] So this is exactly a situation in which we need the kind of careful, rigorous scrutiny that one gets from, yes, understanding the physics as well as we can, but also having a strong background in philosophy, because it's very easy to make statements that are super speculative, that build speculations on top of speculations, to make what I call speculative metaphysical hypotheses, SMHs, and the acronym is not an accident,
[38:46] just to tower them on top of each other, and then not to know whether what you're saying is genuine and reliable. So I don't have any idea whether we should be thinking about space-time as truly fluctuating. The indivisible stochastic approach, like all approaches to quantum mechanics, faces fundamental first-order conceptual difficulties in dealing with a space-time that fluctuates, a curving, dynamical space-time.
[39:15] Let me explain why. In order to talk about stochastic probabilities and division events and all this stuff, you need some notion of what you mean by time, by slices of the universe at constant time. You need the ability to talk about which directions in space-time are the directions that are space-like directions and which directions are the time-like directions.
[39:44] When you want to specify your configuration of your system, you're doing it at a time over some region of space. And so you really need to know which slices of space-time are the space slices. And that's all well and good when you're doing Newtonian or non-relativistic space-time, even in quantum mechanics, not necessarily Newtonian, or even special relativistic space-time. In special relativistic space-time, you're given which directions are time and which directions are space, and they're just fixed.
[40:14] But when you consider dynamical space-times, space-times in which the so-called metric tensor, the kind of field-ish thing associated with space-time in general relativity, is itself fluctuating: the metric tensor is the thing that tells you which directions are time and which are space. If that is itself fluctuating, you don't know a priori which directions are the space directions and which directions are the time directions. So you can't even obviously phrase a probabilistic theory. And this is very curious.
[40:45] For one thing, it means it's very difficult to understand whether space-time fluctuates, even in an indivisible stochastic theory, because it's hard to even specify: where do I put my probabilities? What is my initial condition? These probabilities are conditional. They connect one configuration at one time to another configuration at another time. But if I don't know which directions are time directions, how do I do this if the space-time is itself fluctuating? That's interesting. But what's also interesting is it highlights a gap in
[41:16] the scientific study of quantum gravity. So here's a very interesting thing. We can take classical Newtonian physics, we can numerically simulate it in a computer, and we can also model many Newtonian systems probabilistically as Markov processes. Often the Markov approximation works perfectly well and can be used all the time to model Newtonian systems, and to model other kinds of systems.
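As one illustration of modeling a Newtonian system as a Markov process, here is a minimal toy sketch of my own (not anything from the conversation): a damped harmonic oscillator given small random kicks, stepped forward with an Euler-Maruyama-style update. The Markov property is visible in the code: the next (x, v) depends only on the current (x, v), not on the earlier history.

```python
import random

def step(x, v, dt=0.01, k=1.0, gamma=0.5, noise=0.1):
    """One Markov step: the next (x, v) depends only on the current (x, v).
    Damped harmonic oscillator with a small random kick."""
    a = -k * x - gamma * v                                # Newtonian force per unit mass
    v_new = v + a * dt + noise * random.gauss(0.0, dt ** 0.5)
    x_new = x + v * dt
    return x_new, v_new

random.seed(0)
x, v = 1.0, 0.0
trajectory = [(x, v)]
for _ in range(1000):
    x, v = step(x, v)
    trajectory.append((x, v))

# Damping plus weak noise: the particle relaxes toward the origin.
print(len(trajectory), round(trajectory[-1][0], 3))
```

An indivisible (non-Markovian) law, by contrast, could not be written as a single `step` function of the current state alone, which is exactly the distinction at issue.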
[41:42] And there are stochastic methods, stochastic formulations, of other physical theories beyond Newtonian mechanics. There isn't one for general relativity. So Einstein's theory of general relativity is not a probabilistic theory. Einstein's theory of general relativity is a deterministic theory.
[42:06] It's more subtle. There's some question over whether it can always be formulated in a Markovian way. So there's even some evidence from general relativity, even just ordinary general relativity, that maybe the Markovian picture is not quite the right picture. And there are some amazing people, like Emily Adlam, who's at Chapman University, a philosopher of physics, and Eddy Chen at UCSD, and Shelly Goldstein at Rutgers, who
[42:34] are trying to think about laws of physics in a different more sort of global space-time sense that would fit better with theories like general relativity. And there may be some connections to indivisible non-Markovian type laws. But in any event, general relativity is phrased in sort of a deterministic non-probabilistic way. And people are trying to work on quantum gravity now, but you might have asked, shouldn't we have worked on an intermediate step first? What about just a probabilistic version of general relativity?
[43:04] Like a formulation of general relativity that is stochastic, that is, we take Einstein's field equations, the equations that describe the deterministic shape of space-time, and replace them with a probabilistic version, not a quantum version, just a probabilistic version, like as a stepping stone. You might have thought that would be the natural thing to do before trying to go to a fully quantum version of the theory.
[43:28] As far as I know, there's been very little work done in that area. I could be missing something. I haven't seen everything that's been written and maybe people will see this and they'll chime in in the comments and say, wait a second, there's a theory where this is happening. And there's current work. I mean, I know that Oppenheim is working on a stochastic version of general relativity, but this is recent, right? Like this is not 50 years ago. So I think this is a huge target for research. And it kind of makes sense this would happen. I mean, general relativity, you know,
[43:57] finished, at the level of the Einstein field equations being fully formulated, in November of 1915. Einstein is giving these super high-stakes lectures at the Prussian Academy of Sciences, and he's scrambling to finish the theory in between the lectures, and he manages to do it. And then Schwarzschild comes along and writes down the Schwarzschild solution shortly after, at the beginning of 1916. There's this whole story that Schwarzschild was doing it in the trenches of World War One; he was not in the trenches.
[44:26] Yeah, there's this really great paper, I think by Dennis Lehmkuhl, a historian of science, who's great. Schwarzschild was actually stationed in this very nice house. He was in the war, but he wasn't in the trenches doing it. But anyway, so general relativity was developed around 1915, 1916. Stochastic process theory was not developed at the time. Right? I mean, even
[44:54] Kolmogorov's axiomatization of probability theory comes in 1933, right? So that's like 17, 18 years after general relativity. And that's not even stochastic processes. I mean, random variables don't start becoming prominently used until the 40s and 50s, maybe. And I think a well-developed theory of stochastic processes,
[45:18] if I'm not mistaken (and again, my history on the theory of stochastic processes may be somewhat mistaken, so people can correct us), I believe it wasn't until later, like the 50s and 60s. I mean, Markov introduced Markov matrices already in 1906, but fully building out an actual comprehensive theory of stochastic processes, that comes decades later. And people had already been working on quantum gravity for decades by this point. People began trying to do quantum gravity
[45:47] in the 1920s. I mean, Pauli is already trying to quantize general relativity by the late 1920s. And people are already giving up and pulling their hair out and saying you can't do it, decades before there's a theory of stochastic processes.
[46:00] So it's actually not so surprising, historically, that no one said maybe before we do quantum gravity, we should do probabilistic general relativity and see if we can do that. And there have been a lot of proposals to do this: maybe what you do is you want an ensemble of space-times, of block universes. But it's not clear that any of these are really the right way to do it. I have some suspicions. And this is now me
[46:24] doing something I don't want to do, which is just speculation. But you know what, let's just speculate. Surmise away. I think a fully probabilistic version of general relativity, and I don't mean taking general relativity and adding some small noise terms, small corrections, I mean a fully, fully probabilistic generalization of general relativity, I think that would either teach us a lot about quantum gravity or even potentially be quantum gravity. Because, remember,
[46:54] the indivisible stochastic approach doesn't start with Hilbert spaces. It's just probability, just very non-Markovian probability. There's a sense in which general relativity, in its most general formulation, is not exactly an initial-value problem. I mean, depending on the nature of the space-time, if you've got certain kinds of spaces with certain properties, you can formulate it as a kind of initial-value problem. But there's something about general relativity that's a little different from the laws of other theories. And I have a suspicion that if you could fully probabilize the theory,
[47:21] you would basically be doing indivisible stochastic mechanics, but for the gravitational field, and that would already be the theory of quantum gravity. Now, that is super conjectural. I want to be super clear: I have not worked on this in any depth. It would be very interesting to study this problem more, but this is the kind of question you can begin to ask. Because if you think that you have to start with Hilbert spaces, you'd go, well, it must be the case that quantum gravity is going to be some Hilbert-space thing or some generalization of Hilbert spaces.
[47:48] But because we didn't have to start with Hilbert spaces, we can now ask much more basic questions like what's just probabilistic general relativity, indivisible probabilistic general relativity, and is that already all we need? That's not to say it's easy because again, when you have a dynamical space time, it's very hard to talk about where you even put the conditional probabilities, but at least it centers the question on something a little more basic. And I think this comports with a couple of other principles I think that one gets from thinking philosophically about doing physics. One is
[48:17] It's usually better to isolate problems as much as you can and deal with them in the simplest circumstances. I would much rather try to deal with probabilities in general relativity first before I try to do all of quantum gravity, right? Let's study problems in their simplest initial incarnation. Let's not teach people quantum mechanics by starting with quantum field theory. Let's start with the simplest kinds of systems and add complexity step by step, rather than doing it all at once.
[48:41] That's one thing and the other thing is the idea that when approaching problems or conceptual confusions or trying to progress on a very thorny set of theoretical questions involving one of our best physical theories, sometimes you don't want to just build stuff on the end.
[48:56] Sometimes what you have to do is go down into the deep programming of the model and do some debugging. For people who've done computer programming, you know that sometimes when a program isn't working, it's not because the end of your code is wrong. Sometimes it's because way at the beginning of your code you made some mistakes, and to debug it, you have to go all the way back to the beginning and really start with the definitions, like how you've defined certain variables or how you've defined certain functions, and make sure all those definitions are really good before you proceed.
[49:24] And that's kind of what I'm doing here. Rather than trying to glom gravity onto Hilbert space quantum mechanics, I'm saying maybe we need to go and ask some very foundational questions first. Debug this program all the way down to the roots of the axioms. Make sure the axioms really make sense. And I can give a very concrete example of where this breaks down. So we talked about the uncertainty principle. Another thing that you compute in quantum mechanics are expectation values. Now in a previous conversation we talked about an expectation value is an average.
[49:54] You have some observable thing you want to measure, and you know the quantum state of the system, and you can compute its average. And there is this way of thinking about those averages, that they're averages of just stuff happening, of phenomena happening, but they're not. They're defined by the Dirac von Neumann axioms as statistical averages of numerical measurement outcomes weighted by their corresponding measurement outcome probabilities, and that's it.
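In symbols, that definition is the weighted sum over outcomes: the expectation value of A is the sum of p_i times a_i, where the a_i are the possible measurement outcomes (eigenvalues) and the p_i are their Born-rule probabilities. A small numeric check (the matrix and state below are arbitrary illustrations of mine) confirms that this weighted average of outcomes equals the usual bracket formula:

```python
import numpy as np

# An arbitrary observable; its eigenvalues are the possible outcomes.
A = np.array([[2, 1], [1, 2]], dtype=complex)   # eigenvalues 1 and 3
psi = np.array([0.6, 0.8], dtype=complex)       # a normalized state vector

# Dirac-von Neumann reading: <A> is a statistical average of measurement
# outcomes weighted by their Born-rule probabilities.
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.conj().T @ psi) ** 2     # Born-rule probabilities p_i
weighted_average = float(np.real(probs @ eigvals))

# The same number via the bracket notation <psi|A|psi>.
bracket = float(np.real(psi.conj() @ A @ psi))

print(weighted_average, bracket)
```

The two numbers agree, but as the discussion stresses, the bracket is by definition an average over measurement outcomes, not an average over stuff happening unobserved.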
[50:27] If you're not measuring stuff, there's no average there. But there are a lot of physicists who think that when you put brackets around something, which is the notation for an expectation value, we no longer have to think about measurements anymore. We can just think about it as stuff happening.
[50:48] So people will say something like, well, you know, quantum mechanics predicts measurements, and if you measure something, you'll get one of the eigenvalues, you'll get it with the Born rule. How do we get the classical limit? Oh, what we do is we take expectation values, we average everything, and then we show that those averages evolve in time the way that classical observables evolve in time, and this is how the classical limit happens. But this is clearly wrong, because
[51:12] at least if you don't think everything is a measurement. I mean, if you want to argue that every phenomenon happening is a kind of measurement, then you can do this, but then you have the onus of trying to show that. If you're not willing to say that everything happening is a measurement, you've got a problem, because things are happening all over the place. Objects are sitting on Mars, not falling down, and primordial gases are mixing.
[51:36] And you can't just put brackets around the quantum mechanical things and say these are things happening, because those brackets only refer to measurement averages, and if there are no measurements happening, then those averages aren't defined. The conflation of a quantum mechanical measurement expectation value with "on average, this is what's going on", the conflation of those two things, measurement averages and on-average stuff just happening in a certain way, is pervasive in the literature.
[52:04] So take, for example, and I'm sorry to mention this because I really like this book, Shankar's book, Principles of Quantum Mechanics. It's a wonderful book, a big pink book on quantum mechanics. And chapter six is called "The Classical Limit." And the entire chapter is based on this fallacy: that you just put brackets around things and then you can treat them like classical variables that are just happening and no one's measuring them. But it's just wrong, at least according to the Dirac-von Neumann axioms. Now,
[52:32] if you're willing to augment or change the Dirac-von Neumann axioms and turn quantum theory from a theory of only measurements into a theory of phenomena happening generally, like in the indivisible stochastic approach or Bohmian mechanics or the Everettian many-worlds interpretation, then you are legitimated in doing this. But you need something to take measurement averages and turn them into just averages of things happening. This is just a category problem again: we're only referring the Dirac-von Neumann axioms to this very narrow category of measurement outcomes,
[52:58] not the larger category of things that we want to be able to say are happening. So how does this then affect quantum gravity? In quantum gravity, we often take quantum mechanical things, put brackets around them, and then plug them into the Einstein field equation and treat them like they're classical things. So one of the starting assumptions of semi-classical quantum gravity, which is where we try to mix in a little quantum mechanics, is that we take the distribution of matter, broadly construed. Broadly construed, matter is
[53:29] like massive particles, massive objects, but also electromagnetic fields are considered a form of matter. Really anything that's not the gravitational field that can source gravitational fields or respond to gravitational fields we call matter. And what we do is we take the quantum mechanical observables, these self-adjoint operators, we put brackets around them, call them averages, pretend that they're classical averages, and then put them into the Einstein field equation.
[54:10] And they will stick a bunch of functions into one of these integrals. Sometimes to make things more well-defined, they will take time and give it an imaginary part and even rotate the time axis in the complex plane to imaginary time to make the integrals more well-defined. And they'll compute these things called correlation functions. And sometimes I'll have a conversation with someone who does this and I'll say, what is this correlation function? They'll be like, oh, it's a correlation function. It's an average. And I'm like, but
[54:40] your universe has no observers in it. And you're describing a situation in quantum gravity in which there are no planets or people or measurements happening. So what is this an average of? Are you saying that these quantities are just doing things and we're averaging them? That's not legitimated by the Dirac-von Neumann axioms. So what is the physical meaning of these quantities that you're writing down?
[55:08] And sometimes we have a very sophisticated conversation about this, and we actually make some progress on it. But a lot of times people say, I actually don't even know what I'm doing. Right? So this is what I mean when I say that applying rigorous scrutiny to the things we're computing, beyond the mathematics, asking what they mean, is actually kind of important. Because otherwise you might find yourself writing things down without even knowing what exactly it is you're writing down. I think what this gets across is that the difference between quantum mechanics and general relativity is actually much deeper than people appreciate.
[55:37] I mean, we all know that there are differences. Quantum mechanics is supposed to be this sort of fluctuating probabilistic theory and general relativity is supposed to be based on smooth spacetimes and things like this and how do we reconcile them? But I think that the difference between them is even deeper. General relativity is a theory of things happening.
[55:53] General relativity is a theory in which you have an Einstein field equation, you impose appropriate boundary conditions, you introduce whatever distribution of matter and energy and sources you want in your space time, and then you find a space time with the right geometry that satisfies all the constraints and obeys the Einstein field equation. And this is the space time which things happen.
[56:15] Projectiles follow what are called geodesics if they're only subject to gravitational forces. Geodesics can cross, they can meet, they can intersect. You can compute various invariant quantities. Not everything in general relativity is relative; some things are invariant. Things are happening in this universe. In quantum mechanics, at least in the textbook Dirac-von Neumann picture, all you've got are measurement outcomes
[56:44] at the beginning and end of the experiment.
[57:14] You set things up and you take measurements at the end. In the asymptotic past, you set up your initial state; in the asymptotic future, you take your measurements. We take the times to be infinitely in the past, infinitely in the future. These are obviously just approximations. And we're just computing measurement results, cross-sections, scattering rates, you know. But in quantum gravity, we're trying to describe what space-time is doing. We're trying to understand what's happening to space-time. And those are questions that are just beyond the kinds of things we usually do when we're doing QED.
[57:44] We're demanding more of quantum gravity. We're demanding more of a picture, more of a description than the textbook quantum mechanics has been designed to provide. And so I think that if you want to do quantum gravity and really tell a story, tell a picture, paint a rigorous picture of what's happening in space-time, you're just not going to be able to do it with textbook Dirac von Neumann-Hilbert space quantum mechanics. You're going to need a theory of something in order to describe a space-time where something is actually happening.
[58:13] Hope that makes sense. So I think there are reasons why a conceptual shift in how we think about quantum mechanics may be necessary before we are able to address certain deep problems in quantum gravity.
[58:24] Hi everyone, hope you're enjoying today's episode. If you're hungry for deeper dives into physics, AI, consciousness, philosophy, along with my personal reflections, you'll find it all on my Substack. Subscribers get first access to new episodes, new posts as well, behind-the-scenes insights, and the chance to be a part of a thriving community of like-minded pilgrimers.
[59:09] Great answer. Okay, so let's get to some of the questions about Bell. Yes. So people had questions about Bell's inequalities and how they're represented in your framework. Good. Yeah. So ultimately Bell's theorem is about entangled systems, so I have to say a little bit about entanglement. We've got to talk about entanglement first. What is entanglement, according to usual textbook quantum mechanics? Entanglement is about
[59:42] composite systems, systems where you've got two systems, not one system anymore. Not the superposition of two states, but two systems. So suppose that I have system A and it's in the state 1, and I've got system B and it's in the state 1 prime, and that's all I have. Well, then we would say, okay, the composite system is in the state (1, 1 prime), and
[60:09] system A is in state 1, system B is in state 1 prime. That's all there is to say. I could also imagine that system A is in state 2 and system B is in state 2 prime, and the composite system is in the state (2, 2 prime). Perfectly fine. I could also imagine that system A alone is in a superposition of state 1 and state 2,
[60:30] let's say 1 over root 2 times state 1 plus 1 over root 2 times state 2, because in quantum mechanics, when we superpose, we put a number in front, and that number, when you square it, is supposed to be related to a measurement probability. The 1-over-root-2s have the property that when you square them, they become halves, and when you add those, you get 1. That's probabilities adding up. And you can imagine that system B is in the state 1 over root 2 times 1 prime
[60:58] plus 1 over root 2 times 2 prime, and you could imagine that those are the states of the two systems. Now the composite system is also in a state. Let me call the first state psi, the Greek letter psi, the trident symbol: psi is the state 1 over root 2 state 1 plus 1 over root 2 state 2. And psi prime, which corresponds to system B,
[61:47] is the state 1 over root 2 1 prime plus 1 over root 2 2 prime. And I can say that the composite system is in the state (psi, psi prime). If I multiply everything out, I'll get four terms: one half (1, 1 prime), plus one half (1, 2 prime), plus one half (2, 1 prime), plus one half (2, 2 prime).
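The multiplying-out step is just a tensor (Kronecker) product of the two state vectors, and a two-line check (illustrative only) shows the four terms, each with coefficient one half:

```python
import numpy as np

s = 1 / np.sqrt(2)
psi   = np.array([s, s])   # system A: (state 1 + state 2) / sqrt(2)
psi_p = np.array([s, s])   # system B: (state 1' + state 2') / sqrt(2)

# Composite state: the tensor (Kronecker) product. Multiplying out gives
# four terms, ordered (1,1'), (1,2'), (2,1'), (2,2'), each with amplitude 1/2.
composite = np.kron(psi, psi_p)
print(composite)
```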
[62:17] We would say this is not an entangled state because it's factorizable. I can factorize it into psi next to psi prime. Psi for system A, psi prime for system B, I would say these are not entangled. And you can show that when they're not entangled, they also have statistical independence. If you do measurements on them and compute measurement probabilities, you'll find that they are statistically uncorrelated. But now let me propose a different quantum state. This quantum state is going to be
[62:48] 1 over root 2, 1, 1 prime plus 1 over root 2, 2, 2 prime, and that's it. Just those two terms. Notice this is a superposition, but over both systems. And now I've got 1, 1 prime in one term and 2, 2 prime in the other term. And I don't have all those. I don't have the 1, 2 prime term. I don't have the 2, 1. They're not there. I only have 1, 1 prime plus 2, 2 prime. That's it. I cannot factorize that into two different states.
[63:17] There's no psi for the first system and psi prime for the second that would let me describe them both as having their own states. We would now say those are entangled.
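The factorizability question Barandes raises here has a standard numerical answer: the Schmidt rank. Reshape the two-system state into a matrix and count its nonzero singular values; rank 1 means a product state, rank greater than 1 means entangled. This is an illustrative aside in Python using numpy, not part of the spoken conversation:

```python
import numpy as np

# Basis ordering: |1,1'>, |1,2'>, |2,1'>, |2,2'>.
# Reshaping a two-qubit state vector into a 2x2 matrix and taking its
# rank (the Schmidt rank) decides factorizability: rank 1 means the
# state is a product psi (x) psi'; rank > 1 means entangled.

def schmidt_rank(state, tol=1e-12):
    """Number of nonzero singular values of the reshaped state."""
    matrix = np.asarray(state, dtype=complex).reshape(2, 2)
    singular_values = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(singular_values > tol))

# Product state: all four cross terms with coefficient 1/2,
# (1/2)(|1,1'> + |1,2'> + |2,1'> + |2,2'>).
product = np.array([0.5, 0.5, 0.5, 0.5])

# Entangled state: only two of the four terms,
# (|1,1'> + |2,2'>) / sqrt(2).
entangled = np.array([1, 0, 0, 1]) / np.sqrt(2)

print(schmidt_rank(product))    # 1 -> factorizable, not entangled
print(schmidt_rank(entangled))  # 2 -> no factorization exists: entangled
```

So rather than trying factorizations by hand and failing, the singular value decomposition certifies that none exists.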
[63:25] Just a quick question here. So for people who are driving and listening to this, or people who maybe have a pen and paper and are thinking, okay, I'm going to try to multiply some states to get that, and then they don't. So then they wonder, okay, just because I tried some and didn't get it, is there a way I can look at this and prove that there exists no factorization? Yeah, there is. And the way to think about this is that it's just like FOILing when you do arithmetic. So if someone gives you, for example, I've got X plus Y over here,
[63:53] and I've got W plus Z over here, and I multiply X plus Y as a quantity times W plus Z as a quantity, I get four terms. I get XW plus XZ plus YW plus YZ, four terms. If I see those four terms, I know I can refactorize them and write them as one thing, X plus Y, times the other thing, W plus Z. But if I only give you XY plus
[64:21] Sorry, not XY. XW plus ZY. If I only give you those two things, you can't factorize them. They don't factorize into a thing times a thing. This is like an arithmetic example of entanglement, basically. Now, entanglement is usually phrased as something that has no classical correspondence. There is nothing like entanglement classically. In fact, in a 1935 paper, Schrodinger wrote
[64:52] that entanglement was not a but the characteristic feature of quantum mechanics, the one that enforces its departure from the classical case. You can also link to that paper. I'll send you a link to it. Now, you might go, well, there are certainly some things that are like entanglement. For example, you know, John Bell has this
[65:16] paper, Bertlmann's socks. He talks about this guy Bertlmann who's got socks, and the socks are always different colors. If you know what color one sock is, you know the other is not the same. There are systems in which, for example, someone prepares coins and always prepares them so that when one is heads, the other is always tails. Always. And you discover one is heads, and you know the other is tails. They're correlated, even if the coins are very far apart when you look at them. If you prepare the coins and send them far apart and you look at one coin and it's heads, you know the other one, even if it's very far away, is tails.
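The prepared-coins setup can be simulated directly, which makes the contrast with entanglement sharper later on: each coin alone looks random, but the pair is perfectly anti-correlated by preparation. A minimal sketch (my own illustration, not from the conversation):

```python
import random

# Classical correlation: a preparer sets each pair of coins to opposite
# faces. Each coin's marginal looks like a fair flip, but jointly the
# coins are perfectly anti-correlated, however far apart they are read.

def prepare_pair(rng):
    first = rng.choice(['heads', 'tails'])
    second = 'tails' if first == 'heads' else 'heads'
    return first, second

rng = random.Random(0)
pairs = [prepare_pair(rng) for _ in range(10_000)]

# Marginals are roughly 50/50 ...
heads_fraction = sum(a == 'heads' for a, _ in pairs) / len(pairs)
# ... but the anti-correlation is exact: knowing one fixes the other.
always_opposite = all(a != b for a, b in pairs)
print(heads_fraction, always_opposite)
```

Nothing spooky is needed here: the correlation was fixed at preparation time, which is exactly the kind of "common cause" explanation discussed below.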
[65:45] This is called correlation. And if you do this over many coins, you don't always know whether you'll get heads or tails, but you know the results will be correlated, statistically correlated. So statistical correlation certainly happens classically, but entanglement is stronger than that. And that's one of the things that Einstein, Podolsky, and Rosen, and Bell, were trying to get at: this
[66:08] feature of entanglement that is somehow stronger. You get correlations that are stronger than you would think possible based on how we usually reason about classical probability theory for systems that are widely separated from each other. To explain the Bell inequality, I have to start with where it came from. So Bell's theorem in 1964 is in a paper called "On the Einstein-Podolsky-Rosen Paradox." He's referring to a 1935 paper by Einstein, Podolsky, and Rosen.
[66:37] So I have to talk about that paper and what they did, and then what Bell set out to do. You should link to a copy of that paper. People should read it. I don't know how many physicists have actually sat down and read that paper really carefully, and even Einstein wasn't super happy with it. He was a little upset about how it finally came out. But it is a very important paper to read. You mean the EPR paper? This is the famous EPR paper. Yep. Yeah. It's a very subtle argument, but it basically boils down to this.
[67:05] If I've got two quantum systems, and they're entangled, I prepare them, I prepare them in some state that's entangled, and to get them entangled, something has to be local between them. Either they have to be together initially, or you have to send something from one to the other, but some kind of, at some point, local thing should happen in order to get them entangled with each other. And then you send one of the systems very, very far away. This is a weird thing about entanglement.
[67:35] When I measure the first system, usually people do these thought experiments, they imagine Alice and Bob. Alice has the first system and Bob is very far away with the second system. Alice does a measurement on her system and she could measure a variety of different observable features. She could measure some observable feature and when she does it, she will know if you have the right kind of entanglement, she'll know exactly what Bob will get when he does his measurement.
[68:05] She'll measure observable A, she'll get some answer, and then she'll know, ah, I got this answer because of the entanglement, I know what Bob will get. Bob will definitely get this other answer. But Alice didn't have to measure that thing. She could instead have measured a different observable. She could have measured observable A prime, a different observable that is not compatible with A in the same way that position is not compatible with momentum, which is what they originally used in the EPR paper. The original EPR paper was written in terms of position and momentum, but these are incompatible observables, they obey an uncertainty principle. If you know one, you don't know the other with certainty.
[68:36] And so she measures A'. She can make Bob's system collapse, have a definite answer for a different observable. Okay? So she can steer Bob's system. This is called quantum steering. The word quantum steering was introduced by Schrodinger shortly after the EPR paper. Because it feels like Alice's choice of measurement, she measures A or A' is like steering Bob's system.
[69:02] Now, the steering does not send signals. Again, there's this theorem called the no signaling and no communication theorem that shows that Alice cannot deliberately send controllable messages this way. The steering is something more subtle and can't be used to send signals or communication. This is rigorously established because of this theorem. Nonetheless, there is some sense in which she is somehow steering Bob's system. She'll measure observable A. She doesn't control what answer she gets. A is uncertain. She could get this, she could get that.
[69:29] Depending on what she gets, Bob will get a certain corresponding thing, but because she can't control what she gets, she can't control what Bob gets. She just knows that once she's done her measurement and gets a certain result, she knows that Bob, if he decided to measure the same thing, she'd know exactly what he would get. If Alice instead measures A', she'll collapse Bob's system to a different basis, and whatever result she gets, she'll know Bob, if he measured that corresponding observable, she'd know exactly what he would get.
[69:59] Now there are two possibilities, as far as EPR, Einstein, Podolsky, and Rosen, were concerned. Either Alice's decision is really changing Bob's system. And Bob's system, when they do the experiments, could be a light year away. And that would seem to be something superluminal, something unacceptable, happening faster than light. But if not, Bob's system must already have known what answer it would yield if he measured the first observable,
[70:26] and what answer he'd get if he measured the second observable, because Alice could measure either of hers, and depending on what she measures, she can make Bob's system have a definite value of one observable or a definite value of the other. And if Alice is not really changing Bob's system, Bob's system must have known all along what values it was going to have. They called their paper, and they leave out the "the": "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?"
[70:53] They're saying that unless you allow something faster than light to be happening, Bob's system must already know the answers it should yield for all of his measurements because Alice cannot possibly, by her choice of measurement, be doing the steering. So an EPR basically establishes that there is a logical fork. Either you allow faster than light influences of some kind, causal influences of some kind, or
[71:22] there are hidden extra parameters, and the wave function, the standard approach to quantum theory, is incomplete. There's more to the story than just the wave function. That's where Bell starts. In 1964, he says, well, here's what they said. They said that either you have some kind of non-local causal influence happening that's going from Alice to Bob,
[71:51] Or there's more to the story than just the wave function. There are some hidden variables. Bob's system already knew what answers it would yield. What Bell wanted to do was show that that fork was actually not really there. That there wasn't an escape from the non-local causation. That if you tried to escape the non-local causation the way that EPR argued you should, assume there's more to the story, hidden variables, extra things,
[72:50] then you could derive an inequality, a constraint that any such hidden-variables theory would have to satisfy,
[72:50] And then write down a simple example of a quantum mechanical system that violates it, that you can go out and do an experiment and check that it violates it.
[73:19] So in other words, Bell is trying to close a possible way out of the non-local causation. EPR says there's either non-local causation or hidden variables, and Bell is saying, well, even with hidden variables, you still get non-local causation. Therefore, quantum theory is simply a non-local theory, and that's the end of the story. That's what he did. This theorem has gone through a giant game of telephone. So first of all, I should say that the paper
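The violation Bell predicted can be made concrete with a short calculation. Bell's own 1964 inequality is usually illustrated today via the closely related CHSH form (Clauser-Horne-Shimony-Holt, 1969), which is the version tested in the Nobel-winning experiments; the substitution of CHSH for Bell's original inequality is my choice for concreteness. For the entangled singlet state, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between spin measurements along angles a and b, while any locally causal hidden-variables theory of the kind Bell considered obeys |S| ≤ 2:

```python
import numpy as np

# CHSH combination: locally causal hidden-variables theories give
# |S| <= 2, while the quantum singlet state reaches 2*sqrt(2) at the
# standard measurement angles below.

def E(a, b):
    """Singlet-state correlation for measurement angles a and b."""
    return -np.cos(a - b)

a, a_prime = 0.0, np.pi / 2            # Alice's two settings
b, b_prime = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))  # 2*sqrt(2), about 2.828, exceeding the classical bound of 2
```

This is the sense in which the quantum correlations are "stronger than you would think possible" from ordinary classical reasoning about separated systems.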
[73:48] was published in a rather obscure journal, and Bell was a particle physicist doing this foundational work on the side. And he would caution people against doing foundational work, because it was considered very bad for your career, which is really shameful. I mean, physics is supposed to be an intellectual enterprise, and closing down avenues of intellectual investigation, of exploration, is just anti-intellectual. That's a shame. But his paper
[74:17] somehow eventually became more widely known. And it's like through a game of telephone. Eventually, people began thinking that what he did was prove there couldn't be hidden variables. And people would say, oh, you have a hidden variables theory? That's ruled out. Bell said there can't be hidden variables. In fact, the Nobel Prize was given for experimental tests of the violation of the Bell inequality, right? There's this Nobel Prize that was given to Clauser and Zeilinger
[74:46] and Aspect, I think, as well. And if you look at the press release for the Nobel Prize, it says that Bell proved there couldn't be hidden variables, and this Nobel Prize is given because they proved hidden variables are impossible. That's not what Bell showed at all. In fact, not only did Bell not show that, but he said in the paper that's not what he was showing. In fact, he begins the paper by talking about Bohmian mechanics. He says Bohmian mechanics is, at least for
[75:13] systems of fixed numbers of finitely many non-relativistic particles, a perfectly empirically adequate theory of quantum mechanics. It is grossly non-local; those are the words he used for it. Could there be a hidden variables theory that is better behaved than Bohmian mechanics when it comes to locality?
[75:29] And what he was showing was that there wasn't. But his argument wasn't that, okay, well, as long as you get rid of hidden variables, you can keep locality. He thought EPR had shown that if you don't have hidden variables, then you definitely have non-locality. So it wasn't like he was saying, well, it's hidden variables or locality. He was saying, EPR said it was
[75:54] non-locality or hidden variables, and in fact, even with hidden variables, you still get non-locality. Non-locality is just all you get. That's what he thought he was doing. And this paper has been widely misinterpreted. Bell himself, in later writings, complained about how people kept misinterpreting his paper, either not reading it carefully or getting it second hand. Or, I guess, like in the opening of what we talked about, the textbook that said, oh, Bell showed that the orthodox approach is the only approach, right? I mean, that's not what Bell said. So
[76:25] Okay, but then where do we go from here? Bell claimed that he'd shown that quantum mechanics was just non-local, full stop. But the EPR paper, the original EPR paper, and Bell's 1964 paper, these are arguments. They're mathematical arguments, especially Bell's paper, which is a theorem.
[76:55] And you have to be very careful when you talk about theorems in a physical context. So we were talking earlier about inductive, deductive, all these different arguments. In pure mathematics, a theorem begins with premises. The premises should be rooted ultimately, if you have to, in whatever the axioms are of the field you're working in. Maybe they go back to the axioms of set theory, who knows. And then you go through
[77:25] a sequence of logically valid mathematical arguments that culminate in some conclusion
[77:37] If you had correct premises and your logic was valid, you have a sound proof. You have a sound deductive argument, and you're done. And if anyone wants to claim there's something wrong, they're going to have to either challenge your premises or challenge your reasoning. And if they're both good, then you're just good. So Euclid proves the infinitude of the prime numbers. That's a great example. You begin with certain premises about how the natural numbers work, and then you have this logical argument that leads to the conclusion that there cannot be a biggest prime number,
[78:03] as long as you're willing to take on the axioms, the standard axioms that we use for arithmetic. But physical theorems, theorems like Bell's theorem, theorems that are about physics, the Kochen-Specker theorem, the PBR theorem, that's the Pusey-Barrett-Rudolph theorem, there are all these other so-called physical theorems. And these can suffer from another problem.
[78:33] They can succeed as mathematical theorems. They begin with mathematically formulated ingredients that you use in the premises and then you proceed through rigorous logical deductive reasoning and you arrive at a conclusion that's the theorem you've claimed to prove. And that can all be fine. But your theorem is just floating out in math world unless it connects to something in the physical world. And that connection is where there can be a problem. So your mathematical ingredients
[79:03] aren't just supposed to be pure math anymore. They're supposed to have physical referents. And, I'm sorry, to say it the right way: referent is the singular, referents is the plural. They're supposed to have things out in the world that they are representing, and the things they're representing need to be sufficiently rigorously defined, and the connection between those referents
[79:29] and the mathematical representations. The connections have to be sufficiently rigorous. And if either of those two things breaks down, we have a connection problem. I call it the connection problem. So let's take Bell's 1964 theorem as a good example of this, okay? Well, let's even go back to EPR. Let's go back to the EPR paper. The EPR paper is a good example. So the EPR paper has premises. There are premises to the EPR paper. One premise is that wave functions collapse.
[79:57] when we do measurements on them. Another premise is, of course, the Dirac-von Neumann axioms, which include collapse. Another premise is that we have a notion of causal influence that can be cashed out in terms of interventions by agents. I needed an Alice and a Bob to talk about this. Alice is an agent who does an intervention on her system. We call it a measurement in this case. Bob is also an agent who does an intervention.
[80:25] The interventionist theory of causation is one particular way to talk about causal influences. According to the interventionist conception of causation, to say that a thing A causally influences another thing B is just to say that if an agent comes along and intervenes in some way on A, there will also be a change in B. That's what it means to say that A causally influences B. But if there are no agents and there are no interventions,
[80:55] Then what do we do with this theory of causal influence? And you might go, well, there are observers. But if you want a theory of quantum mechanics or theory of physics in which observers and measurements and measuring devices are not part of the fundamental axioms, you're going to have a lot of trouble talking about causation in that kind of a theory.
[81:18] If you try to do EPR and subtract out the agents and subtract out the interventions and subtract out the wave function collapse, it's actually really hard to talk about what's happening. To the extent that you take all these things on, sure, you have this rigorous statement, sort of rigorous statement, about what's going on. It could fail because the reasoning is bad, but it could also fail because there aren't agents out there and there aren't interventions out there. And you might go, well, again, what do you mean? I mean, there are people, Alice and Bob,
[81:52] but phrase it for me at the level of the atoms. Are atoms intervening? Are atoms agents? And then you actually run into kind of a deep problem. Like, if you really are asking me to phrase this not with people, not with measuring devices, but at the level of the atoms, the individual atoms that are not making decisions and freely choosing to do things and doing interventions and acting as agents, I don't even know what this theory of causation is supposed to mean.
[82:17] If you don't have a theory of causation, you don't have a theory of causal influence and you don't have a theory of non-local causal influence and the whole argument just breaks down. This is a thorny problem because causation
[82:31] is just like a nightmare subject in metaphysics. People have been trying to understand causation for a very, very long time. Causation is one of these things where we feel like we kind of intuitively understand causation. In fact, Kant even argued that cause and effect was like built into our brain architecture. We needed to think of the world in this way. But it's really hard to pin down what you mean rigorously by cause and effect, especially if you're trying to start from physics. So there's a view of physics.
[82:58] Around the beginning of the 1800s, the sort of Laplacian view of physics: all there is is just the state of the universe, all the particles in the universe with their positions and velocities, that's it, at one snapshot in time. And then there is just a giant differential equation, the laws of physics as a giant Markovian differential equation,
[83:19] that takes this state of the whole universe, all the particle positions and velocities, and tells you the state the next infinitesimal step later in time, and the next one, and also the previous ones. And that's all there is. That's all there is to physics. That's all there is to the evolution of physical systems. From this point of view, there's no sense in which that rock over there
[83:47] is causally responsible for the motion of that rock exactly, right? Because you don't need that. You have the overall state, and it's just sort of propagating forward and backward with this giant differential equation. There is no role to be played by these extra ingredients, these idle wheels, these notions of causal influences. Now, when we teach Newtonian mechanics, we often talk about, oh, why did that rock begin to accelerate? Because this other rock exerted a force on it, right? This other rock
[84:16] exerted a force on this rock and therefore caused it to move. But if you step back and look at the entire universe, there's just some giant state evolving forward by some differential equation. It doesn't look like there's any place for causation in this picture, at least at a fundamental microphysical level. Bertrand Russell said, in the beginning of the 20th century, you know, causation is a relic. I think he said it was like the British monarchy.
[84:44] It's something that continues to persist under the erroneous assumption that it does no harm. He thought you didn't need causation anymore in physics, at least at the micro-physical level, we should just get rid of it. Of course, if you get rid of causation, then there's no non-local causation, and then what is Bell's theorem even about? What is EPR even about? If there's no causation, there's no superluminal causation, and then the problem is just solved.
[85:10] I think if one takes the point of view, as some philosophers do, John Norton, for example, has a paper which you should also link to, it's called Causation as Folk Science, that there is no fundamental causation in nature, that science in the early days was about looking for cause and effect, but we've really become more sophisticated in that we're not trying to phrase things in cause and effect anymore.
[85:30] You know, cause and effect is language we can introduce later just to simplify how we describe things, but we shouldn't be looking for physical theories fundamentally phrased in terms of cause and effect anymore. That's a relic of an old time. I think if you want to take that point of view, that's a self-consistent view, but then you're not going to be able to appeal to Bell's theorem and say that there's non-local causation happening in hidden-variables theories. If you want to talk about non-local causation, you need a theory of causation. You need to actually bite the bullet and say, we're going to talk about causal influences.
[85:59] And if you rely on interventionist causation, you run into the problem that interventionist causation just doesn't seem like the kind of fundamental microphysical definition of causation that we should be talking about. We're talking about microphysical theories like quantum mechanics. A lot of the no-go theorems that are related to Bell's original 1964 theorem, that theorem itself, the EPR paper, the GHZ argument, a lot of them help themselves to interventionist causation. They ultimately involve agents manipulating things, doing interventions.
[86:29] In a fundamental microphysical theory, which is just atoms doing the things that they're doing with no agents, no fundamental role of agents or interventions or measurements, it's not clear what these theorems are even saying. Bell wrote another version of his theorem, a generalization in 1975. It's a remarkable paper and I think not widely read enough by physicists. A lot of people, I think, tend to focus on the 1964 paper.
[86:57] The 1975 paper is much more sophisticated. I'll put the link on screen. Yeah, you should put the link. It's a great paper. It's beautifully written. And in this paper, he tries to deal with this problem. I mean, he doesn't use the words interventionism, but he tries to get away from the reliance on measurements and on collapse. He retreats to a much more primordial notion. He just says, look, even textbook quantum theory is committed to some ontology, things physically existing, measurement results,
[87:26] Textbook quantum theory says there are measurement results. That's a thing it's committed to. Those are the beables, the things that are really out there according to textbook quantum theory. There are actual facts of the matter about how the measurements happen, and they're really out there in the world. He calls them beables. Not observables, but beables, the way things can be. Maybe that's all the beables you have. Maybe there are more beables in your theory, but textbook quantum theory only has those. And to be clear, a beable is what? An ontological entity? Yes. A beable is what you think is real.
[87:56] What you think is actually physically out there. And according to textbook quantum theory, you're committed at least to measurement results. Now, this of course raises some questions. If measuring devices are out there and they're really real, what are they made out of? In textbook quantum theory, there's just nothing, and you can't say, well, they're emergent, because emergence requires a substrate.
[88:16] Water, fluid water emerges from water molecules. You need to have the things out of which the emergence is constructed. And in textbook quantum theory, you can't just say measuring devices are emergent without saying what are the things that it emerges from. In a theory like Bohmian mechanics or many worlds or indivisible stochastic formulation, you have those ingredients out of which the emergence is supposed to take place. But in textbook quantum theory, you don't. Okay, but that's putting all that aside. You're at least committed to the measuring devices, measurement results as the beables of the theory.
[88:46] And so Bell rephrases the premises of his theorem differently. He doesn't rely on interventionism. He doesn't even propose any theory of causation. He just says, look, I don't have a good theory of causation. I'm not going to give you a full comprehensive theory of causation. But I think any good theory of causation should have a certain feature. It should have a feature which today we would call Reichenbachian common cause factorization. Right.
[89:14] This is just the statement that if A is a thing correlated with another thing B, they statistically rise and fall together in some way, and A and B do not causally influence each other directly, maybe because they're so far apart when they happen that they can't communicate with light, then there must be some other variable C that is causally influencing both of them. You know, so, for example, there's the famous correlation:
[89:44] Nicolas Cage movies are released when people tend to die from drowning in swimming pools.
[90:03] Sure. I don't know if that's an interesting suggestion. Well, it turns out that it's because Nicolas Cage releases movies in the summertime. Ah, good. Yes. So the common cause is summertime. Good. Yes, exactly. It'd be like saying, well, barometers show low pressure, and that's correlated with hurricanes. But it doesn't seem that the barometers are causing the hurricanes. Right. Or the hurricanes, which haven't happened yet, are causing the barometers. But there's a low pressure system that happens first, and this leads to both of them. OK. So this is the so-called common cause principle.
[90:32] And what Bell asserts is that any good theory of local causation should have the property that local beables, whatever they are, they can be measurement results, they can be beables in some other sense, he's being very general about this, but local beables associated with distant places, if they're correlated, there must exist other beables in the past, in the causal past, in the so-called overlap of their light cones. That's the fancy way of saying it. And there must be a rich enough set of those beables, a rich enough set of them,
[91:02] that if you specify them all and know them all, then they explain the correlation in a very rigorous mathematical sense: they lead the joint probability distribution for the two things, the two beables, A and B, to factorize cleanly when you condition on the local beables in the past, the common cause local beables. This is called Reichenbachian factorization. I don't know that Bell knew about Reichenbach's work. Reichenbach had formulated this idea in the 50s.
[91:33] And it's certainly something you could imagine a good theory of causation should have. Bell needed this factorization in order to derive his inequality with these weaker, more general assumptions. And this 1975 theorem was general enough that it could encompass probabilistic theories, theories with stochastic hidden variables where the hidden variables didn't uniquely determine measurement outcomes but only determined them probabilistically. So this is a more general theorem, but he's changed his premises.
[92:00] And now he's taking on this premise that in order for a theory to count as locally causal, his principle of local causality, again, is: whenever we have statistically correlated local variables A and B that are far enough apart when they occur that they can't have causally influenced each other, then there must be a rich enough set of causal variables in the past such that when you condition on them, the correct joint probability distribution factorizes in this neat way. And this is necessary to get the theorem.
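Reichenbachian factorization is easy to see in a toy model. Here a hidden common cause lam sets independent biases for two distant outcomes A and B: unconditionally they are correlated, but conditioned on lam the joint probability factorizes by construction. This is my own generic illustration of the condition, with made-up numbers, not anything from Bell's papers:

```python
# Toy Reichenbachian common cause: a hidden variable lam in {0, 1} sets
# independent biases for two distant outcomes A and B. Unconditionally
# A and B are correlated; conditioned on lam, the joint factorizes:
#   P(A, B | lam) = P(A | lam) * P(B | lam).

p_lam = {0: 0.5, 1: 0.5}           # distribution of the common cause
p_a_given_lam = {0: 0.9, 1: 0.1}   # P(A = 1 | lam)
p_b_given_lam = {0: 0.9, 1: 0.1}   # P(B = 1 | lam)

def joint(a, b):
    """P(A = a, B = b), marginalizing over the common cause lam."""
    total = 0.0
    for lam, pl in p_lam.items():
        pa = p_a_given_lam[lam] if a == 1 else 1 - p_a_given_lam[lam]
        pb = p_b_given_lam[lam] if b == 1 else 1 - p_b_given_lam[lam]
        total += pl * pa * pb
    return total

p_a1 = sum(joint(1, b) for b in (0, 1))
p_b1 = sum(joint(a, 1) for a in (0, 1))

# Without conditioning on lam, the joint does NOT factorize into the
# product of the marginals:
print(joint(1, 1), p_a1 * p_b1)  # about 0.41 versus 0.25
```

Bell's premise is that for any locally causal theory, some such lam must exist for the distant correlations; his inequality then shows that no lam can reproduce the quantum predictions.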
[92:31] Now I'm not the first to suggest that Reichenbachian factorization is too strong a requirement, too strong a condition to impose on a theory
[93:45] You know, Bill Unruh, for example, in a 2002 paper, which I can also link, has this long explanation. He says, well, yeah, I mean, the things got entangled. There was some interaction that entangled them. But in quantum mechanics, interactions are not variables; they're not the kinds of things that you can condition on. There was a common cause, the interaction in the past, but it's not the right kind of common cause to get a factorization. So there's no problem here. And various philosophers of science have made this argument also. There's a bunch of papers by Jeremy Butterfield, who's a philosopher of physics at the University of Cambridge,
[94:14] who also has cast doubt on Reichenbachian factorization. Why would we even think Reichenbachian factorization is good? Well, it kind of works for everyday macro-world joint probabilities, but that's not a strong argument that it should also hold for microphysical probabilities. And there are already good reasons to be suspicious about whether it should hold. But this just sets up a target.
[94:38] If you deny that Reichenbachian factorization is a good requirement of any good theory of local causation, then Bell's theorem has no teeth. It simply doesn't work anymore. Now, in a lecture Bell gave in the early 90s called La Nouvelle Cuisine,
[94:57] which is in his collected work, Speakable and Unspeakable, is this collection of all these papers, but not the first edition, the second edition of Speakable and Unspeakable. He has this lecture, it's called La Nouvelle Cuisine, which we can also link to. He tells the 1975 theorem story over again and he modifies the premises a little bit.
[95:12] I've been having an email correspondence with a philosopher of physics, Joanna Luc, about this. She's working on a paper where she's looking at all the different formulations of Bell's theorem. And in 1991, he slightly changes the premises, so he's not relying on exactly the same kind of Reichenbachian factorization, but he still needs all these sorts of assumptions about what a good theory of causation could be. He's not proposing a theory of causation. There are many theories of causation historically.
[95:41] there are regularity theories, on which to say that A causally influences B is to say that when A happens, B happens at the same time or later. There are counterfactual theories, on which A causes B just in case B would not have happened if A had not happened. There's conservation-law causation, and there's probability-raising causation. There are all these theories of causation. Bell doesn't propose a theory of causation. He just says: I think a good theory of causation should have this feature.
[96:09] And if you assume this feature, you get this inequality, the inequality is violated by quantum mechanics. Therefore, whatever quantum mechanics is, it doesn't have this feature, therefore cannot have a good theory of local causation. But he didn't propose a theory of local causation. So this is a very long way of saying, in the indivisible stochastic approach, we replace the differential equations.
[96:29] We no longer have the Schrödinger equation as a fundamental equation, or Newton's laws, or Maxwell's equations, or any of that. We don't have those things anymore. Instead, we have these conditional probabilities, the sparse set of what I call directed conditional probabilities. I'll explain the directedness in a moment. But these directed conditional probabilities are exactly the kinds of ingredients that show up in the literature on causal modeling. When you do probabilistic causal modeling, you've got
[96:52] Random variables, these are the things that can change and they've got links between them that describe causal relationships and those causal relationships take the form of conditional probabilities. The probability of B having certain values given that these other variables have their values and we would say therefore that those variables causally influence B. This is exactly the language in which the laws are formulated in indivisible stochastic formulation of quantum mechanics.
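As a toy illustration of that language (with invented numbers, just to show the shape of a probabilistic causal model): a common cause C feeding two effects A and B, where the directed conditional probabilities are the model. Conditioned on C, the two effects factorize, which is exactly the Reichenbachian pattern discussed above.

```python
from itertools import product

# A minimal causal model: C -> A and C -> B, specified entirely by
# directed conditional probabilities (the numbers are invented).
p_C = {0: 0.5, 1: 0.5}
p_A_given_C = {0: 0.1, 1: 0.9}  # P(A=1 | C)
p_B_given_C = {0: 0.1, 1: 0.9}  # P(B=1 | C)

def p_bit(p1, value):
    return p1 if value == 1 else 1.0 - p1

# Joint distribution built from the causal structure.
joint = {(a, b, c): p_C[c] * p_bit(p_A_given_C[c], a) * p_bit(p_B_given_C[c], b)
         for a, b, c in product((0, 1), repeat=3)}

# Marginally, A and B are correlated; conditioned on the common cause C,
# they factorize by construction.
p_AB = sum(joint[(1, 1, c)] for c in (0, 1))
p_A = sum(joint[(1, b, c)] for b in (0, 1) for c in (0, 1))
p_B = sum(joint[(a, 1, c)] for a in (0, 1) for c in (0, 1))

print(p_AB, p_A * p_B)  # 0.41 vs 0.25: correlated without conditioning
```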
[97:17] So you might think, well, they're phrased in a way that provides a very hospitable domain for talking about causal relationships. Maybe we should read those conditional probabilistic relationships through a causal lens. And now you have the opportunity that maybe you could build a theory of microphysical causation out of these ingredients. They're no longer based on a Laplacian paradigm of differential equations. Now they're based on exactly the kinds of
[97:48] conditional relationships that we might think have a causal gloss to them. So in one of my later papers, this is the paper New Prospects for a Causally Local Formulation of Quantum Theory, I run with this. I say, okay, well, let's take these and use these to talk about causal influences between things. And now let's say that what it means for a theory to be causally local is that when you have two systems that are at space-like separation, they're far enough apart that they can't influence each other,
[98:18] then there's a clean factorization of the conditional probabilities between them. Right. And I'm phrasing this very vaguely because it's a little technical to write it down, but you can read the paper. But this is basically taking a stand. It's proactively proposing a microphysical theory of causation, and then asking, on that theory of microphysical causation,
[98:44] Do we get non-local causal influences in the EPR experiments in particular? And the answer is we don't.
[98:52] So you can read this. This is in the paper. I also have some talks online. People can go and watch the talks where I go through all the technical details. I very precisely define what I mean by causal influences based on these conditional probabilities. And then I carefully define what I mean for two things to be causally independent of each other. And I define what I mean, rigorously, by saying that two things are not exerting a non-local causal influence on each other. And then I carefully go through the EPR experiment.
[99:20] And I show that in the EPR experiment, there is a causal influence that goes from the instantiation of the entangled pair to the two particles, which makes sense because the instantiation is in their past light cone. But there is no causal influence that goes from whatever Alice does to whatever Bob does. So I'll put a link to all of your talks on screen and in the description as well. And maybe at some point,
[99:51] When you have another talk planned, I'd like you to give it on TOE so that people can see some of the math behind what you're saying. That would be very cool. I hope that was all somewhat clear and understandable. Well, many people have questions about Bell, so I'm glad that you were able to give this explanation. Yeah. So that's a brief summation of how to think about Bell's theorem, but it's a general
[100:14] kind of care one has to take whenever approaching any theorem about physics. It's not enough to check that the theorem is mathematically sound as a mathematical argument. You have to ask: do the things it refers to out in the world, the referents, have rigorous definitions? In the case of Bell, he needs local causation. Are those terms sufficiently well defined? I would argue they're actually not.
[100:38] And then you have to worry about the connection between those referents and the mathematical ingredients. Is that sufficiently established? That's where there's a weakness. If Bell's definition of causation is not sufficiently rigorously established, then the theorem just doesn't have any teeth. And if you can provide a theory of microphysical causation, and a theory of what it means, on that theory of microphysical causation, for things to not be able to causally influence each other non-locally,
[101:05] then that's all you need. If people still don't like it and still think, well, there still seems to be too much correlation, well, maybe that doesn't feel great. Maybe it's unintuitive, but it's not a source of brokenness. Okay. Now, this all leads to the question: what do we even mean by entanglement? What is entanglement in this picture, this indivisible stochastic picture? What is going on in entanglement? If there's no state vector, if there's no superposition actually happening, what do we mean by entanglement? There's actually a very nice picture of what's going on with entanglement now.
[101:35] Suppose I start with two systems. Think of two particles, let's say, or two qubits, two simple systems. And suppose that these systems initially are independent of each other: they have their own configurations, and they are not interacting with each other in any way. Well then, according to the indivisible stochastic approach, what it means for them to be independent and not interacting is, by definition, that they each have their own indivisible stochastic laws. Now let's suppose that there's a certain time, we'll call this time
[102:04] T prime. At this time, T prime, they interact in some way. And because interactions happen locally, whether you're doing quantum mechanics or not, they have to be nearby each other, or sharing some intermediary, in order to communicate. But in some way they begin to interact. What does that interaction mean? Well, even in Newtonian physics, when two systems are interacting, they no longer have their own separate potentials anymore. There's one potential function for both of them that doesn't factorize.
[102:29] In the indivisible stochastic approach, the interaction is represented by the fact that now there's an overall stochastic dynamics for the two systems, and that overall stochastic dynamics does not factorize while they're interacting. Now, what you might imagine happens is that once you separate the systems and take them to far-distant separations in space, they'll each have their own separate stochastic dynamics again. And that's what would happen in the Newtonian case? That's what would happen in the Newtonian case, but it doesn't happen here. And it doesn't happen here because
[102:57] the overall stochastic map is indivisible. It goes all the way back to before they interacted. It cumulatively encodes all the statistical effects between the time before they interacted and all future times. And if there was a moment when it stopped factorizing, it's not going to start factorizing again.
[103:15] So the two systems will not have their own separate laws. There will be one overall indivisible stochastic dynamics that's not factorizable for the two systems. But there's a common cause. The common cause was their interaction. But the common cause is not the kind of common cause that would be plugged into Reichenbach's principle of common causes.
[103:35] Now, if you have an agent, if you want, Alice or Bob, or an environment, or even just one of those little qubits we talked about, like the detector bit from when we were talking about the double-slit experiment, that interacts with one of the systems and reads off its configuration at some later time, T double prime, when they're far apart, it will produce a division event. That division event lets us restart the overall stochastic dynamics, but the systems are now separated.
[104:04] And so when the indivisible stochastic dynamics restarts, starts cleanly, they're no longer interacting. It's going to begin factorized and it will remain factorized. And this is the breaking of entanglement. So: the two systems are not initially interacting, and they have their own separate indivisible stochastic dynamics. We would say they're not entangled. Then they begin interacting for some amount of time. During the interaction, and afterward until the next division event, they no longer have their own separate indivisible stochastic dynamics that factorizes.
[104:33] Then, when there's a division event later on, once they're far separated, we can restart the stochastic evolution. We can stop, look at what configurations they're in, and then write down new laws for them. They're separated now, so they'll have their own independent laws again, and that's the breaking of entanglement. Notice this is a picture of entanglement phrased entirely in terms of ordinary probability theory, with no Hilbert spaces. That's the claim about what's going on, about what's happening under entanglement.
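A toy version of the non-factorization claim (my own illustration, using a deterministic CNOT-style map as the simplest possible "interaction" between two bits, which is a stand-in, not an example from the papers): the joint transition matrix of two interacting systems need not equal the product of per-system transition matrices.

```python
# Toy joint dynamics for two bits: a CNOT-style deterministic map,
# written as transition probabilities G[(next, current)].
# States are pairs (bit1, bit2).
def cnot(state):
    i, j = state
    return (i, j ^ i)  # bit1 controls, bit2 flips

states = [(0, 0), (0, 1), (1, 0), (1, 1)]
G = {(s2, s1): 1.0 if cnot(s1) == s2 else 0.0
     for s1 in states for s2 in states}

# Marginal transition matrix for each bit, averaging over a uniform
# input for the other bit.
def marginal(bit):
    M = {}
    for x in (0, 1):        # this bit's input value
        for x2 in (0, 1):   # this bit's output value
            total = 0.0
            for other in (0, 1):
                s1 = (x, other) if bit == 0 else (other, x)
                for s2 in states:
                    if s2[bit] == x2:
                        total += G[(s2, s1)] * 0.5
            M[(x2, x)] = total
    return M

A, B = marginal(0), marginal(1)

# If the joint dynamics factorized, G would equal the product of A and B.
factorizes = all(
    abs(G[(s2, s1)] - A[(s2[0], s1[0])] * B[(s2[1], s1[1])]) < 1e-12
    for s1 in states for s2 in states)
print(factorizes)  # False: the interacting dynamics does not factorize
```

Each bit still has perfectly sensible marginal dynamics on its own; it is only the joint law that refuses to split into a product, which is the stochastic-level fingerprint of the correlation.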
[104:58] And this picture of what's happening with entanglement comports with the microphysical theory of causation I was describing before, a theory on which whatever agent or environment or measuring system acts on one system cannot have a causal influence, at space-like separation, on the other. So that's what I would say is happening with entanglement. It's a picture of entanglement at the level of ordinary probability theory.
[105:22] Whether you call it classical probability theory is subtle. It depends on whether you think that indivisibility is a classical property or not. But it's certainly ordinary probability theory, and it doesn't require Hilbert spaces and so forth. So that's one way to think about how entanglement is ultimately happening at the deeper level of the indivisible stochastic process. So you mentioned that the stochastic dynamics somehow encoded, had the memory of, I know you don't like this word memory, but somehow encoded what happened before
[105:52] into itself. So yeah, if I were to think of that as information that's being encoded, well, information, if you accumulate enough of it in a small enough region, you form a black hole. So does this mean that if entangled particles are entangled for long enough, then they'll just form a black hole, because the dynamics between them encode so much information? Help me decode this question. Yeah. So I think, um,
[106:20] There is a sense in which the overall indivisible stochastic map is encoding sort of cumulative like statistical connections. But even that, I mean, I'm really sort of fishing for metaphors here when I say that because it's not really memory in the traditional sense. Again, a traditional non-Markovian process, the way we usually talk about non-Markovian processes, we have this hierarchy, this tower
[106:44] [inaudible]
[107:14] description of the later configuration of the system depends on its initial configuration and that can happen in the past. It's not that information is being encoded in a literal sense. It's not the kind of information that
[107:34] you know, the Bekenstein bound concerns, where exceeding the maximum amount of information you can have in some region of space might necessitate a black hole forming. So I would just say that I don't think it's information in the sense of being encoded on physical qubits in space, the kind of thing that would back-react on spacetime and have gravitational effects.
[108:01] It's just that the laws are a little weird and stranger than we might have thought. I see. Okay, so tell me about the loss of phase information. We talked about this off air, but explain it on air. So one question you might ask is: okay, well, what happens when I do this change of representation between the stochastic process, which has no complex numbers in it, no phases, but indivisible dynamics, and I go to this sort of quantum system where I've got phases and all that sort of thing, right?
[108:27] It seems like the phase information is really important. I mean, we need it in order to make predictions about interference. How could it be missing from the indivisible side? Well, the point is it's not missing. The phases on the Hilbert-space side correspond to the indivisibility on the stochastic side, so they're there. They're just manifesting somewhat differently. But even then you might say, well, but come on. I mean, I can indirectly measure those phases if I
[108:52] like take a unitary time evolution matrix that I'm using to describe evolution on the Hilbert space side and I like mod square the entries and I lose all the phase information. How can that possibly still capture the same information? How can it possibly do it? And the answer is in this picture when you model a measurement process you have to bring the measuring device in just like Bohm did when he was writing those later chapters in his
[109:19] 1951 textbook on the measurement process or in his Bohmian mechanics pair of papers in 1952. You have to bring the measuring device in and when you do that and describe the whole thing as one giant indivisible stochastic process, you don't need the phases. You just run the overall indivisible stochastic process with the measuring device and it will probabilistically end up in one of its measurement reading outcome configurations with probabilities that agree with the predictions of the Born Rule.
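One way to see, numerically, how the mod-squared description can be self-consistent yet still carry the fingerprint of the phases: in a toy two-level model (with the Hamiltonian chosen as the Pauli-X matrix, purely an assumption for illustration), mod-squaring the unitary at each time gives legitimate transition probabilities, but those matrices fail the Chapman-Kolmogorov composition law, and that failure is the indivisibility.

```python
import math

# Toy two-level system under H = Pauli-X (an illustrative choice):
# U(t) = [[cos t, -i sin t], [-i sin t, cos t]].
# Mod-squaring the entries gives a valid stochastic transition matrix.
def gamma(t):
    c, s = math.cos(t) ** 2, math.sin(t) ** 2
    return [[c, s], [s, c]]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

g2 = gamma(2.0)                        # transition probabilities, time 0 to 2
g1g1 = matmul(gamma(1.0), gamma(1.0))  # Chapman-Kolmogorov composition

# Each gamma(t) has rows summing to 1, so it is a legitimate stochastic
# matrix, but gamma(2) != gamma(1) @ gamma(1): the process is indivisible.
print(abs(g2[0][0] - g1g1[0][0]) > 1e-6)  # True
```

So the phase information isn't destroyed by mod-squaring the full time-evolution family; it survives as the failure of the transition matrices to compose.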
[109:49] And then the phases are immaterial. You don't need them. If, however, I want to excise the measuring device from my formal description of the system, if I don't want to deal with the whole measuring device, if I just want to remove it and just look at the subject system and ignore the measuring device, treat the measuring device as kind of like a background character, not someone who's in the foreground of the story, then I need the phases to make predictions.
[110:16] And then I would replace the detailed physical measurement process with a von Neumann-Lüders collapse. I would use the textbook Dirac-von Neumann axioms. So what I'm saying is, the textbook Dirac-von Neumann axioms aren't going away. We're just identifying them as describing a certain regime of validity. When you're doing a standard measurement with a big measuring device on some microscopic system,
[110:41] You could model the whole thing and include the measuring device and do everything and then you don't need all those phase factors. You can just run the whole thing as some overall giant stochastic process and you'll get the right answer. This is all done out in detail in the first paper, the stochastic quantum correspondence paper. But if we don't want to go to all that trouble, if we want to simplify our description and ignore the measuring device, treat it as a background character and just focus on the system in question,
[111:03] Then, and the system is microscopic, so we don't run into the ambiguities that we might run into, well then we can ignore the measuring device, we can treat the measurement as an instantaneous collapse process, and then we do need to worry about those phase factors. So the phase factors are a way of encoding not just the indivisibility, but also the unseen measuring device. That's one way to think about what happens to those phase factors. This sounds like Copenhagen still. So how is this not Copenhagen? Okay, so
[111:31] I mentioned that Heisenberg wrote a lot of philosophy. He wrote a book called Physics and Philosophy, and there's a chapter in it called The Copenhagen Interpretation, which we can also link to. People can find it. In it he describes what he saw as the Copenhagen interpretation. Now, there's not agreement or consensus on exactly what the Copenhagen interpretation means, and different people who are responsible for what we think of as the Copenhagen interpretation had somewhat different views on it. Let me just describe how Heisenberg basically described it. He said that
[112:01] He basically said, well, Kant told us that our human brains can only understand the world in certain ways. We understand the world in terms of three-dimensional geometry and cause and effect; there are certain ways in which we just understand things. This is how our brains are set up to work. And the quantum world simply doesn't work in those ways. It doesn't work in ways that our brains can understand. The classical macroscopic world does, and we have good theories for the classical macroscopic world. We've got classical mechanics, classical physics.
[112:29] The microscopic world is simply beyond our comprehension. So we use the mathematics of quantum mechanics, Hilbert spaces, wave functions, the Schrödinger equation, not because we think the world literally is these things, that the wave function is real, but merely because they give us a formal, instrumentalist framework, instrumentalist meaning just a tool set for making predictions. They give us a set of mathematical tools for predicting what will happen
[112:55] Back on the macroscopic classical scale, a big macroscopic system sets up the experiment, a big macroscopic measuring device measures it. What's happening in between, we have no ability to understand. We use the weird mathematics of quantum mechanics to make the predictions about what will happen. But really, at the end of the day, everything has to then show up in some classical results.
[113:16] And that's the picture, that's the Copenhagen interpretation, at least according to Heisenberg. And he had some words about where the probabilities came from. He's like, well, there's an uncertainty principle, and about big macroscopic systems we're always somewhat uncertain. And when microscopic systems interact with macroscopic systems, that's where the probabilities come in. He had a somewhat more sophisticated picture about all of this, and people can go and read his chapter. This is not Copenhagen because I'm not
[113:41] practicing the same kind of quietism about the micro world that he was practicing. I'm not saying we don't know what's happening in the micro world. I'm not saying we just basically only have classical physics and then the micro world is inscrutable to us. We need this other theory to describe the micro world and all it does is make predictions. I'm saying the micro world has an ontology. I'm saying that classical things have physical configurations, measuring devices have physical configurations. Measuring devices are emergent from atoms
[114:08] And that's okay now because the atoms also have an ontology. The atoms really exist. They really do have configurations. And when you're doing the experiment, the particles are really doing things. They're moving in particular ways. The laws are these indivisible stochastic laws, which are a little bit unintuitive, but things are really happening between the measurements. And now I can hopefully tell, at least in broad outlines, a picture of emergence, a story about emergence.
[114:33] where we have the particles or whatever the ontology is, fields particles, whatever, and then larger macro scale things emerge from them the way, in spirit at least, that fluid water emerges from water molecules. The Copenhagen interpretation doesn't do that.
[115:14] You can't talk about how the classical world is emergent, because the Copenhagen interpretation practices quietism about the micro world. It doesn't say what is there in the micro world. It doesn't posit any kind of substrate, any lower-level physical reality, out of which the emergence of classical things is supposed to happen. So these are all ways in which this picture is distinct from the Copenhagen interpretation. And of course the Copenhagen interpretation also has this weird,
[115:44] unspecified boundary between what is quantum and microscopic and what is classical and macroscopic. This is the so-called Heisenberg cut. There's a threshold above which you're classical and below which you're quantum, and that's a murky line. People have debated whether it's really there, or whether the idea is that you can move it around, but in any event it's not part of the indivisible stochastic approach. Are electrons single particles? Are they composite, or are they point particles, in your picture?
[115:44] I don't know what they're made out of. Our best theory, the standard model, describes electrons as not composites. So I don't know if they're made of anything else. I mean, there's also this interaction between electrons and Higgs field, which is, you know, complicated. But they're not any more or less composite in this picture than they would be according to the standard model. Okay. So something I'm interested in is research. What open questions does this pose? Where can people come in to help you with this theory? Yeah.
[116:14] What I find exciting about this project is it doesn't often happen that you stumble on like a blank canvas in an area of what you might have thought was settled fundamental physics. Where you can ask questions that really have no answers yet and there are a lot of directions that people can pursue when it comes to research.
[116:39] This project opens up a lot of these directions. One of them is just the mathematics of this new class of processes, these indivisible stochastic processes, which only showed up in the research literature around 2021, in this review article by Simon Milz and Kavan Modi, which we can also link to so people can look at it. It shows up in a figure in their paper, figure five. You know, mathematics has all these very simple ideas, like
[117:09] functions, matrices, limits, derivatives, that are reasonably simple to define, but yet have profound implications. It's not super often that you see relatively simple ideas, simple mathematical ideas that have big applications and ramifications. Indivisible stochastic processes are a fairly simple idea that I guess people just didn't really think about.
[117:37] And so there's just some interesting work to be done on trying to understand the mathematics of these processes. That could be interesting work for someone interested in math, applied math, theory of stochastic processes. We talked about how you would model real world systems like quantum field theories, like the standard model. There's a lot of work to be done in taking this picture and applying it to systems that show up in solid state physics and high energy physics and the standard model.
[118:04] to make sure it works for one thing and also to see if it reveals any interesting features of these theories that might have been difficult to see otherwise. Dynamical symmetries are a really important subject in physics. Dynamical symmetries show up in a very interesting way in this approach and so there's a lot of work to be done there. There are old problems in statistical mechanics. So one of the outstanding problems in the philosophy and foundations of statistical mechanics is
[118:32] Where do the probabilities in statistical mechanics come from? In classical statistical mechanics, you're imagining you've got particles; a gas is made of particles. The particles are all evolving, because it's classical, according to the rules of Newtonian mechanics. But Newtonian mechanics is not a probabilistic theory. There's this lovely argument by the philosopher of physics David Albert that there's nothing whatsoever in the laws of Newtonian physics that would preclude a bunch of rocks
[119:01] spontaneously falling together to form a bunch of statuettes of the royal family. You might think that's impossible, but it's not impossible. I mean, after all, you could start with statues of the royal family and have them crumble into rocks. And because Newtonian physics is time-reversal invariant, the opposite should be possible. And yet we would just say that's unlikely somehow. But Newtonian mechanics doesn't come with probabilities. So where do those probabilities come from?
[119:28] One argument is that the probabilities come from the initial state of the universe. The universe began in some initial state, but of course there was one initial state of the universe, not a probabilistic collection of initial states. So there's some work to be done in understanding how we go from the beginning of the universe, in some sense, to some notion of a probability distribution. And it has to be the right kind of probability distribution. On the one hand, it should be the kind of probability distribution that doesn't lead to rocks forming statuettes of
[119:58] the royal family, because we don't see that around us. We don't see that happening. We look around and we don't see rocks spontaneously assembling into statuettes of the royal family. And so hopefully we're looking for some kind of explanation for why we don't see that happening. Oh, what I mean is, if you were to wait around for long enough, wouldn't you see it? Maybe, but only if the set of possibilities is bounded in the right sense.
[120:22] If the number of possible configurations of the universe is unbounded, there's no requirement you ever have to revisit or visit every possibility. If there's only a bounded, a so-called compact space of possibilities, then there are arguments that eventually you have to get recurrences or you have to visit everything. But in any event, in the time we've had since our universe has existed, we have not seen that happen. We've not seen rocks spontaneously form. What I mean is, even if we have this space that's not bounded,
[120:51] some events will occur that will be extremely, extremely unlikely. Yes. Events of the same order of improbability as the royal-family statuettes, if not greater. That's right. But we don't expect them to happen all the time, right? We live in a universe where they happen, but only rarely, not all the time. It'd be very weird if this were happening all the time all around us. How do we explain why it's not happening all the time around us? Somehow this is connected with how the universe began. The universe began in some kind of configuration that was very
[121:19] typical in some sense. It was very generic. It was very boring. It didn't have the very special arrangements that would lead to us seeing strange, unlikely things happening all the time. But we can't make it too typical. Because there's some sense in which the most typical initial configuration is just very random and in some loose sense, very high entropy.
[121:41] We actually need the initial beginning of the universe to begin in a low entropy configuration so that we get a well-defined thermodynamic arrow of time. David Albert calls this the past hypothesis. So there's something mysterious going on about the beginning of the universe if you're living in a deterministic universe where the laws are deterministic because how else do we get probabilities out? They must come
[122:01] from some statement about the initial conditions, but those initial conditions of the universe must be such that we began in low entropy and are rising toward high entropy, and yet are typical enough that we don't see surprising things happening all the time. In a theory in which the laws themselves are probabilistic, stochastic, we don't have the same kinds of problems. If the laws are themselves stochastic, we're getting probabilities out of the laws. We don't need to get them out of the initial conditions of the universe.
[122:29] So this gives a whole other way to think about where the probabilities of statistical mechanics can come from. Now, one might ask, okay, does that mean that all statistically fluctuating things in statistical mechanics and in thermodynamics are ultimately quantum mechanical in origin? That's not quite the way I would phrase it. The way I would say it is we need some source of probabilities in order to get things like
[122:58] To get statistical mechanics off the ground, you need some statement like, all else equal, all of the possible configurations or states of a system that are energetically accessible are in some sense equally probable, right? The technical term for this assumption is that we're assuming the microcanonical ensemble, but it's basically all else equal if a system can have lots of states and they're all available, the system can get to them, we should treat them all as being equally probable unless we have some good reason to think otherwise. How do we get that off the ground?
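The equal-probability assumption itself is easy to state concretely. A minimal sketch with toy numbers of my own choosing (four two-level units with total energy fixed at two quanta): every energetically accessible microstate gets the same weight, and everything else follows by counting.

```python
from itertools import product

# Microcanonical sketch: four two-level units, total energy fixed at 2.
# Every energetically accessible microstate gets equal probability.
N, E_total = 4, 2
microstates = [s for s in product((0, 1), repeat=N) if sum(s) == E_total]

p = 1.0 / len(microstates)  # equal a priori probabilities
prob_unit0_excited = sum(p for s in microstates if s[0] == 1)
print(len(microstates), prob_unit0_excited)  # 6 microstates, P = 0.5
```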
[123:29] There were arguments for a while that maybe systems just rapidly oscillated and changed, even according to Newtonian mechanics, in a way that was called ergodic. Ergodic systems are systems that rapidly explore their possibility or state space. Very rapidly. So rapidly that you can sort of pretend that the system is equally likely to be in any of its states. Unfortunately, proving that systems are ergodic is very hard. And there are many systems that are known not to be ergodic. So the ergodic hypothesis turns out not to hold for a lot of systems.
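A toy contrast between an ergodic-like and a non-ergodic dynamics (my own example, not from the conversation): rotating a point around the circle by an irrational angle spreads the orbit uniformly, so time averages match the uniform ensemble average; a rational rotation angle traps the orbit on finitely many points, and the ergodic reasoning fails.

```python
import math

# Fraction of time an orbit of the circle rotation x -> (x + alpha) mod 1
# spends in the window [0, 0.3).
def time_fraction(alpha, x0=0.1, steps=100_000, window=0.3):
    x, hits = x0, 0
    for _ in range(steps):
        if x < window:
            hits += 1
        x = (x + alpha) % 1.0
    return hits / steps

irrational = time_fraction((math.sqrt(5) - 1) / 2)  # golden-ratio rotation
rational = time_fraction(0.25)  # orbit visits only 4 points

print(round(irrational, 2), rational)  # ~0.3 vs 0.25
```

The irrational rotation's time average approaches the window's measure, 0.3; the rational one settles at 0.25 because only one of its four orbit points lies in the window.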
[123:57] There have been some information-theoretic arguments to get this off the ground. But then you run into some very deep questions, like: if the probabilities are all just in my head, how can the probabilities actually lead to coffee boiling, or something like that? It feels like the probabilities should somehow be out there in nature, because they seem to be doing physical work in some general sense. So information-theoretic approaches to deriving the equal probability of all the microstates are very hard.
[124:25] But theories that have probabilities in the laws provide a different way to get probabilistic behavior at this necessary level. Once you've got this probabilistic behavior and can talk about Boltzmannian statistical mechanical systems, you can take these Boltzmannian statistical mechanical systems, with all the states being assigned probabilities in roughly equal amounts, and you can couple them together. You can take big, big systems called reservoirs, which model the environment, and little systems.
[124:55] And you can, from these interactions, derive notions like thermal equilibrium at some temperature. And then you can derive what's called the canonical ensemble, which is the probability distribution we would associate to a system that is energetically interacting with a larger, a very large environment called a reservoir.
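That derivation can be sketched in miniature (with toy numbers of my own choosing): couple a two-level system to a reservoir of two-level units at fixed total energy, count reservoir microstates for each system energy, and the ratio of the system's probabilities comes out in Boltzmann form.

```python
import math

# Toy derivation of canonical-ensemble weights: a two-level system
# (energies 0 and 1, in some unit) exchanges energy with a reservoir of
# N two-level units, with the total energy fixed. The system's
# probabilities come from counting reservoir microstates.
N, E_total = 500, 100

def omega(energy):
    # number of reservoir microstates carrying this much energy
    return math.comb(N, energy)

w0 = omega(E_total)      # reservoir count when the system has energy 0
w1 = omega(E_total - 1)  # reservoir count when the system has energy 1

ratio = w1 / w0
# Exactly C(N, E-1)/C(N, E) = E/(N - E + 1), which plays the role of the
# Boltzmann factor exp(-beta) set by the reservoir's temperature.
print(round(ratio, 4))  # about 0.25 for these toy numbers
```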
[125:18] And these systems will exhibit fluctuations that are thermal fluctuations and those thermal fluctuations are distinct from quantum mechanical fluctuations. So there's like a higher level of fluctuations, thermal fluctuations that you get for these systems. It's not that you need the indivisible stochastic approach to explain that higher level of emergence of thermal fluctuations.
[125:37] Earlier, when you said that it's not just all in our heads because the water is boiling and doing something, are you referring to the view that randomness is about our ignorance?
[126:07] Right. Right. Yeah. So one way to think about probability is that probability is objective chance type probability, that nature is really behaving in kind of a chancy, unpredictable way, that phenomena are happening in an unpredictable way. Another view is that the probabilities are all in our heads.
[126:25] Right. When we assign probabilities to things, we're talking about what are called subjective credences, credences or degrees of belief. When we assign probabilities to things, we're not saying the probabilities are really out there in any sense. We're just describing our belief in whether something is actually a particular way or not. And there's a relationship between objective chancy probabilities and subjective credence probabilities. It's most famously formulated as what David Lewis called his Principal Principle.
[126:52] The first word is "principal," P-A-L, and the second is "principle," P-L-E, which is just to say that if you happen to know the objective chance for something and you condition on that, then your credence should be equal to the objective chance. There's a connection between objective chance and credence. But in these sorts of pictures, we acknowledge that there are different kinds of probability. There are objective chance probabilities. There are subjective credence probabilities.
[127:19] From time to time people have tried to say there is only one kind of probability. Maybe all there is is just subjective credence probability, and there is no fundamental objective chance probability, or vice versa, I guess. Maybe we'll talk a little bit about that in the context of Everettian quantum mechanics in a little bit, because it does show up in that context.
[127:41] But the question is, if all probability is really just subjective credence probability, then how can subjective credence probability in our heads underlie Boltzmannian statistical mechanics, which underlies thermodynamics and thermal fluctuations and all this stuff that happens in the world around us? I mean, if you just happen to know the exact specific state of a system, then that specific state now has 100% probability or nearly 100% probability. Have I, just by changing my knowledge, changed all the probabilities, and does that suddenly make thermodynamics stop working?
[128:11] That's obviously too quick a statement, but there is a little bit of a mystery here: could it be that all the probabilities are just in our heads, or is there something random in some sense actually happening in the physical world? The reason this is very tricky is because coming up with a self-consistent, unambiguous, rigorous theory of objective chance turns out to be very hard.
[128:37] That's one reason why people have retreated to thinking that probability is all credence, because if it's credence, it's okay if it's not perfectly rigorous. Objective chance probability is very hard to specify. It runs into all kinds of basic problems. What does it mean to say that something out in the world objectively has a chance of 72%, of 0.72? You might say, well, it means that in the long run,
[129:09] 72% of the time it will come out a certain way. But that's actually not true, right? If you take a coin, for example, and you believe the coin is a 50-50 coin, and you flip it 10,000 times, if it's a fair coin, it's not going to be heads exactly 5,000 times. It'll be heads a little off of 5,000 times.
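As a quick illustrative sketch (my own simulation, not part of the conversation), a simulated fair coin makes the point: over 10,000 flips, the heads count lands near 5,000 but almost never exactly on it.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

n = 10_000
# Flip a fair coin n times and count the heads.
heads = sum(random.random() < 0.5 for _ in range(n))

# The count is close to n/2, with typical fluctuations on the order
# of sqrt(n)/2 = 50 flips, but hitting exactly 5,000 is unlikely.
print(heads)
```

Running this repeatedly with different seeds shows the count wandering around 5,000, which is exactly the behavior the transcript describes.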
[129:34] But if you think about it hard enough, you realize that actually there's a chance it could be heads every time. It's very unlikely, but it could be heads every time. And if you try to say something like, well, okay, we need to take some kind of limit, maybe in the limit as the number of flips goes to infinity it's exactly 50%, but that's not how limits work. What about if the coin has a propensity to be 50-50? Well, propensity theories of chance are tricky, because what is a propensity?
[130:02] A tendency to yield results 50% of the time? But you see, it's circular. It's very hard to pin down what you mean by propensity. Propensity theories of chance say that there are just certain objects that want to do something in a certain way.
[130:19] But then, what does the 50% mean? Are you saying that they want to do it a certain way this fraction of the time? But then we run into the same problems we had before. This theory of probability, that it's about frequency ratios, whether they come from propensities of the object or from the laws or whatever, that probability is about the frequency with which you get certain results, is called frequentism. And frequentism is tough to make rigorous. You might say, well, just take the limit as n goes to infinity, take the number of trials to infinity, but that's not how limits work.
[130:47] A limit, when you say that a certain sequence of things has a certain limit, what you're saying is this: you give me some error, some epsilon, and I can find a far enough point along the sequence, the nth term, say, such that all the later terms are closer to the claimed limiting value than epsilon.
[131:16] If you make epsilon smaller, I just go farther down the line. If you make epsilon smaller, I go farther down the line. As long as I go far enough down the line, everything later down the line will be closer to the limit than epsilon. Probability doesn't work that way. Frequentist probability doesn't work that way. There's no number of times you can flip a coin that will guarantee, for sure, that its frequencies get closer to 50%. You could flip a coin a billion times and it could land heads every single time. It's unlikely, but it could happen.
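Written out formally (standard notation, not spoken in the transcript), the epsilon-N definition of a limit being described is:

```latex
\lim_{n \to \infty} a_n = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0,\ \exists N,\ \forall n > N:\ |a_n - L| < \varepsilon.
```

The point being made is that no such N exists for coin-flip frequencies: no finite number of flips guarantees that the observed frequency stays within epsilon of 1/2.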
[131:45] Given some epsilon, you can't give me any n, any number of flips, that will guarantee the frequency falls closer to 50% than that. You might roll your eyes and say, oh, come on, but it's unlikely to do that. It's likely to be closer than epsilon. But the word "likely" is probability. What you can say is that if I flip the coin enough times, I can make the probability that it lands farther than epsilon away from 50% as small as I like.
[132:15] But that's just relating one probability to another. It's a totally circular definition. The law of large numbers is phrased this way. It's just a circularity relating one kind of probability to another. And the formal way to describe this is that when you do a limit, you have to have a notion of a metric. You have to have a notion of how far away something is from something else. And for probabilistic systems, the metric itself is a probabilistic metric. That's what we're using for distance.
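For reference (standard notation, not from the transcript), the weak law of large numbers for a fair coin reads:

```latex
\lim_{n \to \infty} \Pr\!\left( \left| \frac{S_n}{n} - \frac{1}{2} \right| > \varepsilon \right) = 0,
```

where S_n is the number of heads in n flips. The circularity being described is visible right in the statement: the "convergence" of the frequency S_n/n is itself expressed in terms of another probability, Pr.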
[132:42] And so any attempt to use limits with a probabilistic metric to describe probability is going to run into the circularity objection. Nonetheless, even though we don't have a rigorous theory of frequentist probability, we certainly have an intuition that when we look at a long sequence of coin flips, or a long sequence of ones and zeros, we can distinguish a highly random sequence from a non-random sequence.
[133:05] If we look at 10,000 zeros and ones, and we discover that about 50% of them are zeros and 50% of them are ones, and furthermore, runs of zeros, runs of three or four or five zeros in a row, or of ones, three or four or five in a row, occur with certain frequencies, and the sequence obeys a number of other criteria for randomness,
[133:33] There's the right kinds of lack of correlation over time. There are all these things you can run on a sequence of 10,000 digits. We would look at that and we'd go, that to me looks like a random sequence that was generated by a 50-50 coin. It's not rigorous. You can't make it rigorous. And maybe there will never be a perfectly rigorous theory of probability at the level of frequentist probability. But when you look at a long sequence, there's at least an approximate notion that certain sequences seem to have all the hallmarks of randomness.
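As an illustrative sketch (my own example, not from the conversation), here are three of the informal checks just described, the fraction of ones, the longest run, and the lag-1 correlation, applied to a simulated 10,000-bit sequence:

```python
import random

random.seed(1)
bits = [random.randint(0, 1) for _ in range(10_000)]

# Check 1: about half the bits should be ones.
frac_ones = sum(bits) / len(bits)

# Check 2: the longest run of identical bits; for a fair sequence of
# length n this is typically around log2(n), i.e. roughly 13 here.
longest = run = 1
for prev, cur in zip(bits, bits[1:]):
    run = run + 1 if cur == prev else 1
    longest = max(longest, run)

# Check 3: lag-1 autocorrelation, which should be near zero if
# successive bits are uncorrelated.
n, mean = len(bits), sum(bits) / len(bits)
cov = sum((bits[i] - mean) * (bits[i + 1] - mean) for i in range(n - 1)) / (n - 1)
var = sum((b - mean) ** 2 for b in bits) / n
lag1 = cov / var

print(frac_ones, longest, lag1)
```

None of this is a rigorous definition of randomness, which is exactly the point being made, but a sequence that passes many such checks looks like the output of a fair coin.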
[133:59] So maybe we don't need a theory of probability for statistical mechanics. Maybe it's enough to rely on randomness, suitably defined. There are terms that come up when people talk about Kolmogorov complexity for characterizing how random a sequence is. Maybe we can rely on those instead of relying on probability. Maybe probability is all in our heads, and what's out there in nature is something like complexity, Kolmogorov complexity, or randomness. Or maybe nature is just inherently chancy. So there are a lot of ways to think about these kinds of problems.
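Kolmogorov complexity itself is uncomputable, but a crude and commonly used proxy is compressed length. As an illustrative sketch (my own example, not from the conversation), a random byte string barely compresses, while a highly regular one collapses to almost nothing:

```python
import random
import zlib

random.seed(0)

# 10,000 random bytes versus 10,000 identical zero bytes.
random_bytes = bytes(random.getrandbits(8) for _ in range(10_000))
regular_bytes = bytes(10_000)

# Compressed length is a rough stand-in for descriptive complexity:
# "random" data has no description much shorter than itself.
c_random = len(zlib.compress(random_bytes))
c_regular = len(zlib.compress(regular_bytes))

print(c_random, c_regular)
```

The random string compresses to roughly its original size, while the all-zeros string shrinks to a few dozen bytes, which gives a rough operational sense of the "complexity" language in the transcript.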
[134:29] Now, you just mentioned measure, incidentally, but there's a problem of a measure in the many-worlds interpretation. We should talk about these other... So why bother introducing a new interpretation of quantum theory at all? Don't we already have enough interpretations? I mean, there are a lot of people who are like, we don't need any more interpretations. The world just keeps adding more and more of them. Why do we need any of them? Here is the reason I think we need a new interpretation of quantum theory.
[134:56] The existing interpretations suffer from one or more of the following problems. Vagueness: they're vague about things they shouldn't be vague about. Or they're instrumentalist, which means they only talk about what happens in measurements. But then what are measurements, and what are measuring devices, and measuring devices are made of things, and, you know, you run into all these circularity problems. You run into measurement problems, basically.
[135:20] Or they're ambiguous when trying to deal with systems that are macro-sized. We've talked about the Wigner's friend thought experiment. Once you've got systems that are of the same size as, you know, big classical measuring devices, does the theory render unique or unambiguous predictions? Or the theory is empirically inadequate. It works for some systems; Bohmian mechanics works pretty well for systems of fixed numbers of finitely many non-relativistic particles, but doesn't appear to be empirically adequate enough to be able to handle the Standard Model.
[135:51] Or finally, the theory relies on too many extra-empirical assumptions, axioms, and speculative metaphysical hypotheses. That is, to get the interpretation to work, we have to take on a whole collection of assumptions that cannot be verified empirically and that seem kind of like desperate measures or seem very far-fetched or seem difficult to justify, except that they give us the interpretation we want.
[136:20] Those are the problems I think that all the existing interpretations have. All of them have at least one of them. I mean, Bohmian mechanics suffers from the fact that it doesn't appear to be empirically adequate. The philosopher of physics David Wallace, who is at the University of Pittsburgh, wrote a paper that I think characterizes this very neatly. He says, you know, the sky is blue.
[136:42] And our best theory of why the sky is blue is based on what's called Rayleigh scattering. I teach Jackson electromagnetism; we cover Rayleigh scattering. When electromagnetic radiation impinges on charged particles, the charged particles vibrate and re-radiate radiation, and they radiate power according to a certain frequency dependence that favors high-frequency radiation, so you get much more scattering from high-frequency radiation than from low-frequency radiation.
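The frequency dependence being described is the standard Rayleigh result (textbook formula, not spoken here): the scattering cross-section scales as the fourth power of the frequency,

```latex
\sigma_{\text{Rayleigh}} \propto \omega^4 \propto \frac{1}{\lambda^4},
```

so blue light at roughly 450 nm scatters about $(700/450)^4 \approx 6$ times more strongly than red light at roughly 700 nm, which is why the sky looks blue.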
[137:14] Bohmian mechanics at this point doesn't seem capable of explaining Rayleigh scattering. And it's been around. I mean, de Broglie first introduced pilot-wave theories in the late 1920s. Bohm independently rediscovered them and then eventually began talking with de Broglie in the early '50s. It's been over 70 years now. And the inability of Bohmian mechanics to account for these sorts of familiar
[137:42] features of our physical world is a sign of empirical inadequacy, and that's a problem. Copenhagen? Well, instrumentalism, vagueness, what is a measurement? The Copenhagen interpretation has lots of problems. We've talked about all of those. There are spontaneous dynamical collapse approaches to quantum mechanics.
[138:05] And there are some of those, I think, that are still viable, that haven't been ruled out empirically. Some of them have now been empirically ruled out. That means they're not empirically adequate.
[138:13] They often involve some ad hoc choices you have to make; you have to introduce sort of ad hoc parameters, like the time scale over which collapse is supposed to be taking place. But some of those are still live possibilities. People are working on them, and people should work on them, and I'm not saying people should stop working on any of these things. We should see if Bohmian mechanics can be made empirically adequate. We should see if dynamical collapse approaches can work, but so far they don't yet. They don't yet work. And then there are other things that are even farther away from these things, like QBism.
[138:43] So QBism, which comes from quantum Bayesianism, is associated with Chris Fuchs, who is at the University of Massachusetts at Boston.
[138:54] Quantum Bayesianism begins with the idea that probability really is in our heads, there isn't really physical probability out there, and that quantum mechanics, the formalism of quantum mechanics, is really a methodology for dealing with uncertainty, for dealing with uncertainty about the world, and it's a particular mathematical framework that you need to use to do this. It purports to not be anti-realist, it purports to be compatible with the idea that there is in fact a fact of the matter about what's going on behind quantum mechanics,
[139:23] but it hasn't yet been able to formulate what that picture is supposed to look like. And I feel so bad because every time Chris gives a talk at some point, you know, in the question session, I'll raise my hand and I'll ask Chris this question about, well, where's the picture? What's the ontology? What's going on? He says they're just not ready to provide that yet. I feel very bad whenever I ask him that because he's so nice and patient with me when I say these things. But so I think the problem is we kind of don't have a place to stand. Right.
[139:51] I think one view is, what's the hurry? What's the emergency? Why do we need another interpretation? Just do the textbook quantum theory, Dirac-von Neumann, or Copenhagen, or, you know, Bohmian mechanics, or whatever it is you want. There's no rush. There's no problem. There are too many interpretations already. Actually, I would say there are too few. We do not have a problem of underdetermination, with too many viable interpretations for one theory.
[140:17] We have a problem of overdetermination or at least a potential problem. We don't have a single interpretation in my view that works, that meets all the requirements I laid out, that doesn't have these serious problems. And without one, we're in danger. We're like at sea without a raft. We need something and that's why I think that the time is due for a new interpretive approach.
[140:42] Now, I've talked about the indivisible stochastic approach. We've talked about many of its features. We've talked about open questions, and there's more open questions, right? I mean, there are potential applications to quantum simulation and quantum computing that people should think about. I mean, after all, if Hilbert space pictures are dual to stochastic pictures, that may mean that quantum hardware could be very good at simulating certain kinds of stochastic systems efficiently.
[141:12] Just a moment. It's not exactly dual because you said it's many to many. That's right. It's many to many. But the idea is that a given Hilbert space picture can describe many different stochastic systems. That's good. It may mean that with quantum hardware, we can simulate many kinds of stochastic systems that might have been difficult to simulate otherwise. So one area of inquiry people can look into is, you know, and I'm certainly thinking about this is are there applications of this picture to finding new ways to simulate
[141:38] more general kinds of stochastic systems, especially stochastic systems outside the Markov approximation, using quantum hardware in an efficient way. And then there's more formal stuff. There's a whole formulation of quantum theory in the language of C star algebras we talked about in our first talk. What's the C star algebraic formulation that's appropriate for this kind of a theory? And do we need something like that to talk about certain kinds of physical systems? If we're not starting with Hilbert spaces anymore,
[142:06] Then we're not beholden to Hilbert spaces. We're not trying to build on top of them or modify Hilbert spaces. We're starting at a different place. We're starting just with ordinary probability theory. Does this lend itself to generalizations of quantum theory that would have been impossible to get to if we'd started with Hilbert spaces? So when we start with the Hilbert space, the worry is that if you modify the Hilbert space picture in the wrong way, you'll get nonsense. You'll get probabilities that are negative or probabilities that sum to more than one or things that don't make any sense.
[142:31] But if you begin with a theory phrased from the beginning in the language of old-fashioned probability theory, you're not at risk in the same way that generalizations are going to lead to inconsistent or nonsensical probabilities. You don't need to get to probability from something else. When you start with Hilbert spaces, the path you take to probability could break down.
[142:52] If you modify Hilbert spaces in the wrong way, the path to get to good probability breaks down. If you begin with probability, you're already there and you're just not at the same risk of running into inconsistencies with how you formulate probability. And finally, as we've already talked about, there could be some potential avenues for rethinking our approaches to quantum gravity. At this point, it would be great to talk about what is the many worlds interpretation and what is the fundamental problem or problems with it. Okay, so open questions, other interpretations,
[143:19] I haven't said very much about Everettian quantum theory. What about Everettian quantum theory? What about the many worlds interpretation? Here is the story of the Everettian approach. There's a cartoon picture. In the cartoon picture of Everettian quantum theory, every time you do a quantum measurement, the universe splits into branches. You have a cat. The cat's superposition alive and dead. This is the cartoon version. You measure the cat and now you split. There's a universe in which
[143:45] There's a you and a live cat, and there's a universe in which there's a different you and a dead cat. This is how the cartoon picture is supposed to work. And it seems kind of intuitive. And if you want to take wave functions to be fundamental, it seems like, well, this is the natural thing to do with them, if you sort of want to take them seriously. But you run into problems almost immediately with this cartoon picture. One problem is that it's not always 50-50.
[144:10] If the wave function is root two-thirds alive cat and root one-third dead cat, you still get two branches. So in what sense is one of them now two-thirds likely and one of them is one-third likely? If there are two branches, what does it mean to say that one of the two branches has a two-thirds probability and the other one is a one-third probability? How do we connect the branches with the notion of probability I was talking about before, randomness? If you've got a 50-50 random sequence, we expect to see zeros and ones according to some distribution that looks random.
[144:39] How do we get that picture of probability out of the branch picture of probability? This is not obvious. One thing you might try to do is argue that somehow when you have a root one-third branch and a root two-third branches, we should think of the root two-thirds branches as really two branches and there's like three branches now.
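A small numerical sketch of the mismatch just described (my own illustration, using the amplitudes from the example): the Born rule squares the amplitudes to get probabilities of 2/3 and 1/3, yet the branching produces exactly two branches either way.

```python
import math

# Amplitudes from the example: sqrt(2/3) for "alive", sqrt(1/3) for "dead".
alive_amp = math.sqrt(2 / 3)
dead_amp = math.sqrt(1 / 3)

# Born rule: probability = |amplitude| squared.
p_alive = alive_amp ** 2  # 2/3
p_dead = dead_amp ** 2    # 1/3

# The probabilities are unequal and sum to one...
print(p_alive, p_dead, p_alive + p_dead)

# ...but the number of branches is just 2, regardless of the amplitudes.
num_branches = 2
```

This is the puzzle in miniature: the branch count doesn't track the Born-rule weights, so counting branches can't by itself recover the probabilities.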
[144:57] But it turns out that branch-counting arguments don't work very well. There's a well-known paper from 1989 in Annals of Physics by Farhi, Goldstone, and Gutmann called "How Probability Arises in Quantum Mechanics." And you can link it; people can look at it. They try to get this sort of counting picture. You consider infinitely many, or large numbers of, experiments.
[145:23] large numbers of repeated trials of experiments and somehow argue that certain branches in the long run survive and others don't and you can sort of count them in some sense and this is where probability comes from. These sorts of arguments just, they fall out of favor because they don't work very well. So what do you do? Well, you could just add an axiom. You could just say axiomatically when there are branches, the Born Rule tells you what probabilities they have.
[145:51] The problem is how do we relate these probabilities back to the randomness probabilities we're talking about? Like, what does it mean to say just by fiat there's a probability here? But there's actually a deeper problem. You see, remember we talked about different bases you could use? In the Everettian approach, there's just a giant universal wave function. And there are infinitely many bases you could pick. And if you change what bases you pick, then the branches change.
[146:16] Right? All the components of the universal state vector are the branches. And if you change your basis, you change the branches. Which basis are the probabilities referring to? If there are, in fact, parallel universes with probabilities assigned to them, in which basis do we do this? This is known as the preferred basis problem. And I would add one more thing.
[146:42] Probability, when you say something is a certain probability, what you're saying is that there are n possible ways it could happen, only one of which is realized. In the Many Worlds approach, they all happen. So is this even a probability at all? Is it even coherent to talk about this using probabilistic language? And Many Worlds interpretation forces us to be skeptical about some things that we just see around us. I mean, we do experiments, we get a single outcome. The outcomes appear to be happening probabilistically.
[147:11] And the many-worlds interpretation denies that that's true, right? If you're going to do that, you better have good evidence for it. OK, so what do you do with all these problems? One argument is to say, OK, the preferred basis problem is a problem, but maybe nature dynamically picks out a basis. Maybe as you let the universe evolve,
[147:41] Decoherence works out well in only one basis. There's a particular basis, a particular way to decompose the universal wave function, such that when you decompose it in that way, decoherence gives you branches that no longer interfere with each other noticeably. I think that's Sean Carroll's argument in the Mad Dog Everett lecture, and I think you were there. It's also the view that is at the center of David Wallace's 2012 book, The Emergent Multiverse.
[148:11] This idea is that we don't presuppose a particular basis in which the branching happens. The universe just evolves, and decoherence just doesn't work in most bases. But in a certain basis, we get nice, emergent, decoherent, no-longer-interfering branches. And that's what dynamically picks out the correct branching. And the branches are not fundamental. The worlds are not fundamental. They're not fundamentally there. They're just useful, convenient ways to describe the wave function. But now we have a problem.
[148:41] If the branches are not fundamental, if they're emergent, we can't have a probability axiom that assigns them probabilities. You see, the axioms, the fundamental axioms of your theory are supposed to refer to fundamental things. If the branches are emergent, approximate things, not fundamental things, the axioms cannot say, oh, if at some point in the future we develop these emergent approximate branches, then by axiom they'll be assigned probabilities.
[149:06] If the branches are now not fundamental, but merely emergent, merely just convenient ways to describe what's going on, then it's very difficult to think about how you would make an axiom that they should be assigned probabilities. If we're not going to get the probabilities from the axioms, we now have a fundamental problem. And this is where so much of the work in Everettian quantum theory has happened, this problem of probabilities. If the branches are emergent things, not fundamental, and we can't assign them probabilities by fiat through the axioms,
[149:35] How do probabilities happen? Now, I think the argument I would make here is that they don't. If you were compelled to believe in an outlandish metaphysical picture like the many worlds interpretation because you had to, because it was empirically unavoidable, like we look out into outer space and we see galaxies, many, many, many billions of light years away, we see countless galaxies billions of light years away, that leads us to believe that there is a big universe out there.
[150:06] We see clocks on airplanes move at slightly different rates, atomic clocks move at slightly different rates. That's hard to believe, but we can do the experiments and we see this repeated rigorously many times. It's not that we should never believe outlandish things, but as Carl Sagan said, extraordinary claims require extraordinary evidence. The Many-Worlds Interpretation says that there is an uncountable, you know, an uncountable profusion of universes that are coming out of every single moment, not even just measurements, but all the time.
[150:35] That's an outlandish statement, and sure, we could believe it if we were compelled to, by either rigorous logical reasoning or by just unavoidable empirical results. But we're just not. And when you're formulating the many-worlds interpretation, you run into this problem of, well, I have the preferred basis problem; I guess I can deal with that by letting the branches be emergent through decoherence, but then I can't axiomatically assign them probabilities anymore. At that point, you should just give up.
[151:04] Because you're no longer compelled through rigorous logic or empirical data that you have to believe in many worlds. So why are you still trying to chase it down? That is, this extravagant, outlandish metaphysical picture is no longer forced upon us logically or by experiment. So why are we chasing it down? Why are we starting with the assumption that they should be there and we need to somehow
[151:29] gerrymander our axioms and principles and assumptions to get the many-worlds picture to come out. And that's the impression that I get when I see some of the work going on right now, right? We're not compelled to take many worlds on as a serious idea. We can only get it off the ground by adding lots more stuff. Why are we doing this? So let me just describe a couple of the routes people have taken, and then we can quit, because that's basically the end of it. One route is the route that David Wallace takes in his book,
[151:59] The Emergent Multiverse. It is an excellent book. You can link it on the YouTube channel, and I recommend everybody interested should read it. David Wallace is a fantastic, brilliant philosopher and also trained in physics. And the book is a beautiful book. I recommend it to everybody who's interested in quantum foundations. In that book, he tries to solve this problem of probability. How do we get probabilities assigned to these things?
[152:27] by introducing a large number of additional assumptions. And when I have people read this book, I tell them, read it and then just make a list of every extra assumption he has to make. He assumes that we should have the same metaphysical relationship to many copies of ourselves as we would if there were only a unique individual we were to become. That means you have to take kind of a stand on old questions like the metaphysical teleporter problem in metaphysics. The theorem he uses
[152:57] requires invoking a notion of free will that requires taking a compatibilist stance, because in the many-worlds interpretation there's just a deterministically evolving universal wave function. And yet he has, in his proof of the Born rule, agents, which is already a dangerous idea. We're bringing back agents making choices about which unitary operations they're going to perform. This is a crucial part of the proof. And there's a little footnote where he admits, yes, this does entail certain assumptions about free will, but free will is a big problem no one has solved.
[153:26] But that doesn't make the case. If you're resting on an unsolved problem, it doesn't make the case that what you're doing is going to work. He introduces a number of what he calls richness axioms and rationality axioms. The rationality axioms are supposed to be general good practices of what it means to be a rational observer. These were developed in a one world kind of picture. And the assumption is that they also work in a many worlds picture.
[153:53] Basically the way that one tries to proceed here is one says, what does it mean to be rational? It means that you want to use the tools of decision theory, the formal, precise, probabilistic tools for making good decisions called decision theory. And people who use the tools of decision theory, who are rational, will end up assigning probabilities to branches according to the Born Rule.
[154:21] That's, in rough and very coarse outline, how this argument is supposed to work. Now, John Norton, again a philosopher at the University of Pittsburgh, raised an objection to really any such approach to trying to get probability out. In a deductive argument, the conclusion cannot be any stronger than the premises. If you're trying to get probability to emerge as a conclusion, there must have been probability already in your premises.
[154:46] In this proof of the Born rule, one is trying to get probability out, so there must be probability somewhere in the premises. If you don't assume probability somewhere in the premises, somewhere you must be doing something that is not legitimate. And you can see how this unfolds for this decision-theoretic argument, which goes back to David Deutsch. There's an earlier version of it in a 1999 paper by David Deutsch called "Quantum Theory of Probability and Decisions." You can also link to that.
[155:15] The argument is that if you obey the rules of being a rational observer and use decision theory, you're going to end up assigning probabilities according to the Born Rule. But you can ask, why is that the definition of rationality? I mean, in a many worlds type universe, there are going to be observers who behave rationally according to the dictates of decision theory. Some of those observers are going to be very successful over 10 years.
[155:40] And others are going to be very unsuccessful, because in the many-worlds interpretation everything will happen on some branch. But there are also observers who do not obey the rules of decision theory. There are some very irrational observers who just choose not to follow any of the rules of decision theory. And there are going to be branches in which they're unsuccessful over 10 years, and there are going to be branches in which they're successful over 10 years. All those observers are just there. Right.
[156:05] And to say, well, you should just be rational and obey decision theory by axiom does not solve the probability problem. In a one-world picture where only one future actually happens, it seems to be the case that people who are rational and think very carefully about their decisions and use something like a decision-theoretic approach, in the long run, over 10 years, tend to make more money, be healthier, live better lives, whatever it is that you want.
[156:36] And that gives us reason to think, oh, these are good rational principles. If people who follow these principles tend to do better, I see people who exercise and people who make good financial decisions and hedge their investments, and they do better. I go, oh, well, there are good reasons, therefore, to do what they do and take on their principles. But you can't turn it around and say we're going to start axiomatically with "this is the way to be rational" and then go backward and show
[157:06] that that then entails how probability should work. That's the sort of reverse argument that's taking place. I should say that not all Everettians take this decision-theoretic view. Simon Saunders, for example, tries to do probability in a more Boltzmannian, statistical-mechanical way by coarse-graining and actually counting in some sense, but it's still in its embryonic form. So there are a lot of approaches to the many-worlds interpretation, and
[157:33] At present, none of them seem to find a way to get probability off the ground, and I don't think that you can. And to the extent that you can by just taking on more and more assumptions, you're doing the thing where you're adding on extra-empirical assumptions that can't be verified in an experiment. I mean, I don't know how to experimentally test that I should have the right relationship to many copies of myself. That's an extra-empirical statement.
[157:55] If you take many of those on in order to get the picture off the ground, I don't know how credible it is. How much credence should I give to a theoretical picture that relies on a tower of SMHs, speculative metaphysical hypotheses? I feel like if you have to do all that work to get the theory off the ground, then it lowers your credence that we should take on such an outlandish idea that there are all these many worlds.
[158:19] So that's basically where I end up with the many-worlds approach. And this is one of the reasons why I think there's room for another interpretation that's much more conservative, that says: well, we do experiments, we see one outcome; maybe that's because there is just one outcome. And the experiments look probabilistic; maybe that's because they are in fact probabilistic. Nature is telling us it's probabilistic. We should listen to nature, rather than saying, nope, nope, nope, it's got to be deterministic, there's a universal wave function evolving deterministically, it's got to be Markovian.
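An aside on the "Markovian" contrast just mentioned: in a Markovian (divisible) stochastic process, the transition matrix over a full interval factors into stochastic transition matrices over sub-intervals, while indivisible dynamics need not factor this way. A toy numerical illustration of my own (not from the conversation): the deterministic swap matrix is a perfectly valid stochastic matrix, yet it has no stochastic square root.

```python
import itertools
import numpy as np

# The swap (bit-flip) matrix is a valid, here deterministic, stochastic
# matrix for a transition over a full interval [0, T]: entries are
# nonnegative and each row sums to 1.
swap = np.array([[0.0, 1.0],
                 [1.0, 0.0]])

def half_step(a, b):
    # General 2x2 row-stochastic matrix, parameterized by its first column.
    return np.array([[a, 1.0 - a],
                     [b, 1.0 - b]])

# If the dynamics were divisible, some stochastic matrix M for [0, T/2]
# would satisfy M @ M == swap. A grid search over all such M shows the
# best achievable error stays bounded well away from zero.
grid = np.linspace(0.0, 1.0, 201)
best = min(
    np.abs(half_step(a, b) @ half_step(a, b) - swap).max()
    for a, b in itertools.product(grid, grid)
)
print(best)  # ≈ 0.5, nowhere near 0: the swap has no stochastic square root
```

So a process whose full-interval transition is the swap cannot be broken into stochastic half-steps; its law over [0, T] is indivisible in exactly the sense being discussed.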
[158:48] You know, maybe we should just listen to nature and build a theory around what nature is telling us. That's, I think, the conservative, non-outlandish approach that one should take. I want to know, how is it that you got so great at being articulate and smooth with your speech? That's a very, very kind thing to say. I really appreciate that. That's really nice of you to say.
[159:16] I think we all have different strengths. I'm bad at many, many, many things. There are a few things I've gotten good at through practice. There's some things we're all born kind of a little bit good at. We've like embryonic things that we're sort of good at and then we hone those things. I've taught many classes over many years here. I've interacted with such amazing students, brilliant, idealistic, just wonderful students who ask all kinds of great questions.
[159:45] I just think it's practice. You just talk a lot with people about very intricate topics and over time it gets easier. That's the best answer I think I can give. There's an Aesop fable I like to bring up with people. It's about a stag and its antlers. So there's the stag who's drinking from a pool.
[160:11] and admiring his beautiful antlers. He thinks his antlers are so magnificent, so glorious. He goes on and on, like, antlers are really the envy of the animal kingdom. Then he looks at his legs and says, but my legs are bony and ugly, and if only my legs could be as remarkable as my antlers. As the stag is pondering this, he suddenly becomes aware that a pack of wolves is chasing him. So he
[160:37] gets up and runs from the water. He's trying to get away from the wolves, and he sees a forest. He's going to run into the forest to hide. And as he runs into the forest, his antlers start getting tangled in all the vines. And before he knows it, he can't run anymore. He's stuck. And as the wolves approach him, he realizes that the thing he was praising, his antlers, was his undoing, and the thing he thought was his weakest feature, his legs, was the thing that would have saved him.
[161:07] So the reason I bring this up is that, in addition to saying that I think we're all good at a few things and maybe have difficulty with a lot of things, some of the things we think we're bad at, seen in another way, are things we're good at, and sometimes vice versa. So I'm going to say something that anyone who has known me growing up will laugh at because it's so obvious. I came into this world profoundly lacking in common sense.
[161:34] Okay? Anyone who's ever known me growing up would say that's the most obvious statement I've ever made, okay? Profoundly lacking in common sense. And as I grew up, you know, you get made fun of, you make a lot of mistakes, you do a lot of silly things because you lack common sense, and you see it as kind of a weak feature. You see it as something you're a little bit embarrassed about. When you get into philosophy and foundations of science, philosophy of physics,
[162:00] What you see is a lot of people whose common sense takes them in directions they shouldn't go. You see a lot of people who make arguments or speculations and claims that just seem very commonsensical to them. And sometimes those are not really rigorously supported. Their common sense can lead them into error. Suddenly, lacking common sense becomes a huge advantage, because when I read a philosophy paper or I listen to a seminar or I'm trying to formulate an argument,
[162:29] I don't have the kind of common sense that makes the answers obvious to me. So I see every argument and I have to take it apart and really disassemble it and understand what all the pieces do because I don't have an intuition, a common sense for how things are supposed to work. And what this means is that to some extent, and obviously, I mean, we all make mistakes. I make errors too. But I feel like some of the errors I might have made if I had more common sense, I'm less likely to make. So a thing that I thought was my weakest feature
[162:59] the stag's legs, in a different context, turns out to be really useful. It's like being on land and having only flippers for your arms and legs, and then one day you discover the ocean, and suddenly what you thought was your weakest feature now becomes your greatest asset. So that's a general lesson I think that everyone needs to take to heart. Many of the things we think are
[163:22] And now you're speaking to researchers and potential researchers, people who are younger students
[163:49] Even from the perspective of older students, there are some people who are 70 and getting their PhD who watch this. Yeah. So what is a method that they can use to help figure out or distinguish what is actually a good feature versus an actually bad feature that they thought was good? The best I can say there is experience. Put yourself in different contexts. If I had never become
[164:15] Someone who worked in philosophy and foundations of physics, I might have gone my whole life thinking that lacking common sense was really bad.
[164:21] Maybe it is really bad in some contexts, but I wouldn't have seen that there are in fact flip sides to it. Another thing is just talk to lots of people and ask them, are there any aspects of themselves that in some contexts they see as bad and other contexts they see as very helpful? And if you talk to enough people, you'll begin to hear them say things that remind you about things about yourself. And you'll go, wow, I have this feature that I'm not feeling great about, but this person has found a way to really use it really well. Maybe I should follow their example and do what they do.
[164:51] So yeah, that's probably my best advice there for how to do it. But let me add one last thing in closing, right? When I teach a class, and I just finished teaching this class this fall term, I said to the students: look, we've talked about a lot of physics in this class. This was a physics class. I sometimes teach physics classes, sometimes philosophy classes. This was a physics class.
[165:22] If in a year you don't remember some or most or maybe even all of the physics that we've talked about, I won't be disappointed. But we have to be human to each other. You know, we have to be human beings to each other. And if you forget to do that, then I'll be super disappointed. You never know when you're interacting with somebody. Is this somebody who five years from now
[165:52] is going to be the right person at the right moment to play a really important role in your life, your career, your well-being. You have to treat everybody like they could potentially be super important to you. I mean, obviously, if you're getting a hundred emails a day, you can't respond to everyone, just given the sheer amount. But to the extent that you can treat everybody with basic respect and treat people like human beings and be human to them, you should always do that, because you just don't ever know.
[166:22] I mean, obviously, people should be treated like human beings anyway, just on its own merits, but it's also just a good strategy, because you never know if someone ultimately down the line is going to end up being important to you. When students first start here in our PhD program, one of the things I tell them is that one of the most important assets they have is their reputation. And
[166:46] A lot of people think that the right scientific reputation to have is to be intimidating, for everyone to think you're the smartest person in the room, for everyone to be, you know, in awe of your intellect and almost afraid to talk to you, right? People think that's the kind of reputation you're supposed to develop. Not everybody does, but some people think that's what you're aspiring to. And people can often think of figures in their lives, role models in some cases that have that kind of reputation.
[167:14] I would argue that's not the right reputation that you should cultivate, that you should seek to have. I talked about treating people like human beings. Your reputation in science, and this is for any students or researchers who want to go into science, your reputation is worth its weight in gold. The kind of reputation you want to have is someone people want to work with, someone people want to go and talk to and ask questions of.
[167:43] You want to be the kind of person who, when people come to you and ask you questions and talk with you, when they leave, they feel smarter than they did before. Because if people come to you and they leave feeling smarter, feeling happier, feeling like they can go out and do things more confidently, they're going to want to come back and work with you again, talk with you again. Yes, there are very successful people who don't have that kind of reputation, who are very intimidating, and they're successful. But they would be even more successful in my view
[168:07] if they cultivated the kind of reputation that made people want to collaborate with them, work with them, and, importantly, support them. Because everybody in every walk of life at some point will need someone to come along and help them out with something. And if people see you as someone who is collaborative and helpful, someone who builds people up, and someone who treats people like human beings, then they'll be more likely to support you when you need help.
[168:33] And that's the kind of investment in your own career and your own future that I think everybody needs to take very seriously and think very seriously about. Thank you, Jacob. I appreciate you spending seven hours. Kurt, it was a delight. It was a complete delight. And Addie and Will, it was really just a delight.
[168:58] I've received several messages, emails, and comments from professors saying that they recommend theories of everything to their students and that's fantastic. If you're a professor or a lecturer and there's a particular standout episode that your students can benefit from, please do share. And as always, feel free to contact me.
[169:15] New update! Started a substack. Writings on there are currently about language and ill-defined concepts as well as some other mathematical details. Much more being written there. This is content that isn't anywhere else. It's not on Theories of Everything. It's not on Patreon. Also, full transcripts will be placed there at some point in the future. Several people ask me, hey Kurt, you've spoken to so many people in the fields of theoretical physics, philosophy, and consciousness. What are your thoughts?
[169:42] Also, thank you to our partner, The Economist.
[169:56] Firstly, thank you for watching, thank you for listening. If you haven't subscribed or clicked that like button, now is the time to do so. Why? Because each subscribe, each like helps YouTube push this content to more people like yourself, plus it helps out Kurt directly, aka me. I also found out last year that external links count plenty toward the algorithm,
[170:19] Which means that whenever you share on Twitter, say on Facebook or even on Reddit, et cetera, it shows YouTube. Hey, people are talking about this content outside of YouTube, which in turn greatly aids the distribution on YouTube. Thirdly, you should know this podcast is on iTunes. It's on Spotify. It's on all of the audio platforms. All you have to do is type in theories of everything and you'll find it. Personally, I gained from rewatching lectures and podcasts.
[170:45] I
[171:05] and donating whatever you like. There's also PayPal, there's also crypto, there's also just joining on YouTube. Again, keep in mind it's support from the sponsors and you that allows me to work on TOE full time. You also get early access to ad-free episodes, whether it's audio or video. It's audio in the case of Patreon, video in the case of YouTube. For instance, this episode that you're listening to right now was released a few days earlier.
[171:28] Every dollar helps far more than you think. Either way, your viewership is generosity enough. Thank you so much.
View Full JSON Data (Word-Level Timestamps)
{
  "source": "transcribe.metaboat.io",
  "workspace_id": "AXs1igz",
  "job_seq": 3271,
  "audio_duration_seconds": 10295.3,
  "completed_at": "2025-11-30T21:52:05Z",
  "segments": [
    {
      "end_time": 134.923,
      "index": 5,
      "start_time": 111.817,
      "text": " In this episode, we talk about what are the misconceptions of the wave-particle duality and entanglement? Is gravity indeed quantum? What about non-locality and Bell's theorem? And what exactly are indivisible stochastic processes? Kurt, it's good to see you again. Good to see you. It's been so long. Wave-particle duality. What is that? All right."
    },
    {
      "end_time": 165.657,
      "index": 6,
      "start_time": 136.305,
      "text": " When Schrodinger introduced the idea of his wave function in that paper in early 1926, building out of Hamilton-Jacobi theory, his undulatory theory of mechanics, this wave function that lived in high-dimensional configuration space, he had provided a new methodology, a technique for computing things in quantum mechanics. He used the wave function as an indirect way to calculate energy levels: the energy levels of atoms, which then corresponded to the frequencies of radiation that came out of atoms."
    },
    {
      "end_time": 185.282,
      "index": 7,
      "start_time": 167.363,
      "text": " Einstein had a lot of problems."
    },
    {
      "end_time": 212.227,
      "index": 8,
      "start_time": 186.049,
      "text": " And part of the reason that Einstein in particular was concerned was because Schrodinger embraced a kind of what we would call wave function realism, that the wave function is a real thing, physically, metaphysically, real thing in a high dimensional configuration space that somehow projects its meaning into three dimensions of physical space and that really where everything was happening was in this high dimensional abstract possibility space, this configuration space, that's where the waves were. Eventually Schrodinger recanted that view"
    },
    {
      "end_time": 236.118,
      "index": 9,
      "start_time": 212.807,
      "text": " In one of our earlier conversations, I talked about how in 1928, in his fourth lecture on wave mechanics, Schrodinger expressed some doubt about wave function realism. He indicated that maybe you could think of the wave function as playing out all the possible realities of what could happen to the system, a sort of very embryonic version of the many-worlds interpretation. But Schrodinger recanted that view in 1928."
    },
    {
      "end_time": 265.896,
      "index": 10,
      "start_time": 236.886,
      "text": " In the face of things like Born saying that the wave function should be understood as a tool for computing measurement probabilities. But in the period from 1926 to 1928 when Schrodinger was still pushing this idea of the wave function as being sort of physically fundamental, Einstein was very confused. There's a very famous letter from December 4th 1926 from Einstein to his colleague Max Born, the same Born of the Born rule."
    },
    {
      "end_time": 295.52,
      "index": 11,
      "start_time": 266.425,
      "text": " In which Einstein famously says that he doesn't believe that God plays dice. This is famous. I just don't believe that God plays dice. What people don't often know is that the very next sentence in that letter is a criticism of Schrodinger's wave function realism. He says waves in 3N dimensional space as if"
    },
    {
      "end_time": 318.865,
      "index": 12,
      "start_time": 296.032,
      "text": " rubber bands, and he even has, like, dots; he writes an ellipsis in the letter. He doesn't even know what to say. What's interesting is that in the canonical translation of the Einstein-Born letters, the collection of correspondence between Einstein and Max Born, the letters were translated into English by Irene Born, and the ending is missing."
    },
    {
      "end_time": 348.114,
      "index": 13,
      "start_time": 320.64,
      "text": " Einstein just says waves in three-dimensional space as if by rubber, you know, and the ending is missing. And without the ending, you don't realize that his concern is not waves per se. His concern is 3N-dimensional wave functions in configuration space. That's what he was nervous about. So you miss a very important point. But if you look at the original German, the ending is there. So, you know, Einstein had a lot of misgivings about this idea."
    },
    {
      "end_time": 376.596,
      "index": 14,
      "start_time": 350.196,
      "text": " But the idea has origins that go back earlier, right? De Broglie's sort of matter-wave idea that particles and electrons had waves associated with them in analogy with how light was thought to be a wave classically and then there was evidence coming starting from Planck and Einstein that light had a particle-like character. This idea that certain phenomena had both particle-like and wave-like features became known as wave-particle duality."
    },
    {
      "end_time": 402.227,
      "index": 15,
      "start_time": 377.244,
      "text": " And when people do a study of, for example, the double slit experiment, and they approach the double slit experiment in the traditional way, one particle at a time, there's a wave function that we can pretend is moving in three-dimensional space, but this is really just an artifact of the fact that configuration space for one particle looks three-dimensional. It looks like you should treat the particle as a wave as it goes through the slits to get the correct pattern of landing sites over many repetitions."
    },
    {
      "end_time": 431.971,
      "index": 16,
      "start_time": 402.773,
      "text": " You know, we don't actually see a wave on the other side. What we see is dots, many, many landing sites over many repetitions of the experiment. The wave is inferred. But when you measure where the particle is at the end of the experiment or you measure which hole it goes through, you get a definite result and that makes it look more like a particle. So this is the idea that sometimes things are particle-like. Sometimes they're wave-like depending on what feature of the system we're trying to study. This became known as wave-particle duality. This is further complicated by the fact that there are waves"
    },
    {
      "end_time": 457.551,
      "index": 17,
      "start_time": 432.398,
      "text": " of a different kind in physics. Electromagnetic waves, for example. Light is a disturbance in the electromagnetic field that propagates like a wave through three dimensional space. And those are waves. I mean, like I said, I teach Jackson electromagnetism. We talk about waves moving through three dimensional space. It's very easy to confuse the waves of a field"
    },
    {
      "end_time": 482.637,
      "index": 18,
      "start_time": 457.995,
      "text": " like the electromagnetic field with the wave functions or Schrodinger waves of quantum mechanics, but they're not the same thing. And this has bled into wave-particle duality. When Planck in 1900 and Einstein in 1905 and various people were proposing that light came in quanta, discrete particle-like quanta called photons, the wave"
    },
    {
      "end_time": 505.811,
      "index": 19,
      "start_time": 482.927,
      "text": " that they were imagining, the wave corresponding to photons, was a three-dimensional electromagnetic wave, a wave of the familiar kind. The wave functions that Schrodinger introduced in 1926 were not like those waves. They were not three-dimensional waves of a field in physical space."
    },
    {
      "end_time": 535.452,
      "index": 20,
      "start_time": 506.561,
      "text": " They were these abstract complex valued functions in a high dimensional configuration space. And when you measured them, they collapsed. Now, if you're in an MRI machine and they've turned on a very strong magnetic field, you don't have to worry that if you do the wrong measurement you're going to collapse the magnetic field in the MRI machine. It's not that kind of field. The waves they're beaming at you are not those kinds of waves. So you have to make a distinction between"
    },
    {
      "end_time": 561.442,
      "index": 21,
      "start_time": 535.828,
      "text": " field waves, the waves of a field, and Schrodinger waves. And I want to make super clear that in the indivisible stochastic approach to quantum mechanics that we've been talking about, I'm saying Schrodinger waves are not real things. These abstract things that live in this high dimensional configuration space, those are not physically real. But classical waves, or the waves of a field, which are a conceptually different kind of wave,"
    },
    {
      "end_time": 590.265,
      "index": 22,
      "start_time": 561.783,
      "text": " Those are perfectly valid. And if you're studying a system that's not made of particles, but a system made of fields, you're going to see wave-like behavior as well. But those are a different kind of wave. And these are the kinds of subtleties that I think get lost when someone just says wave-particle duality. So again, just to summarize, the relationship between a photon, a particle of light, and an electromagnetic wave is not like the relationship between an electron and a Schrodinger wave function for the electron."
    },
    {
      "end_time": 616.817,
      "index": 23,
      "start_time": 591.101,
      "text": " Now, what makes this even more confusing is that electrons do have fields also. There's a so-called Dirac field that plays a very important role in the Standard Model. And this is a field, a field in three dimensions for the electron, but the Dirac field for the electron is not the Schrodinger wave for an electron. So, you know, these are super subtle distinctions, but it's important to keep them in mind."
    },
    {
      "end_time": 646.22,
      "index": 24,
      "start_time": 617.483,
      "text": " What makes it even more confusing is that particles like electrons, which are called fermions, these are particles that have an intrinsic half-integer spin. They're the particles that obey the Pauli exclusion principle. You can't put them all in the same energy state. They make chemistry possible by not having all the atoms collapse to the ground state. Electrons are like this, quarks, protons, neutrons. Although they have fields associated with them, the fields associated with them are not classical fields like the electromagnetic field. The fields are much more bizarre and weird."
    },
    {
      "end_time": 675.572,
      "index": 25,
      "start_time": 646.698,
      "text": " And I'm not going to have time to talk very much about them except to say that one of the limitations of Bohmian mechanics is that it has a great deal of difficulty dealing with the kinds of fields associated with fermions. And that's one reason why Bohm mechanics has difficulty, the Bohm pilot wave theory. I'm getting way ahead of myself, but I just wanted to just clarify what's going on in wave-particle duality. So in the indivisible stochastic approach, there are no Schrodinger waves as part of the fundamental physics. Of course, when you go to the Hilbert space picture,"
    },
    {
      "end_time": 701.254,
      "index": 26,
      "start_time": 675.811,
      "text": " you can mathematically write down wave functions and use them, write down Schrodinger waves, but they're not physically there. You don't need them to explain the interference patterns. The indivisible stochastic dynamics itself generically predicts that, over many repetitions of the experiment, you'll have what look like dots following some kind of wave equation. But there is no wave actually involved in those experiments. But I'm not saying that field waves, the waves in fields, are not there. That's a different kind of wave."
    },
    {
      "end_time": 732.619,
      "index": 27,
      "start_time": 703.336,
      "text": " So, speaking of these waves, you mentioned quantum field theory indirectly with Dirac. Does your approach illuminate any aspect of quantum field theory or the standard model? We've been talking about quantum mechanics, sure, especially in part one and part two. What about QFT? Yeah, so one of the nice things about Bohm's pilot wave theory is that it works really beautifully for systems of fixed numbers of finitely many non-relativistic particles. That's a lot of qualifications."
    },
    {
      "end_time": 755.452,
      "index": 28,
      "start_time": 733.2,
      "text": " doesn't work so easily for fields. You end up either having to do very complicated things or maybe even introducing stochasticity of some kind. It gets kind of messy, and there's a lot of difficulty handling fermion fields, in particular the fields associated with particles like electrons. One of the advantages of this approach is"
    },
    {
      "end_time": 775.213,
      "index": 29,
      "start_time": 756.425,
      "text": " Although, okay, so let me just say something very quickly about Bohm mechanics, because this is also related. In Bohm mechanics, for, again, systems of fixed numbers of finitely many non-relativistic particles, we have deterministic equations. There's a pilot wave that guides the particles around. The wave function, the pilot wave, obeys the Schrodinger equation."
    },
    {
      "end_time": 804.497,
      "index": 30,
      "start_time": 775.503,
      "text": " Then another equation called the guiding equation is how the wave function, the pilot wave, guides the particles around. And everything is deterministic. There's no fundamental probabilities. There are some initial uncertainties in the initial configuration of the system. And these evolve to become the Born Rule probabilities later. But the dynamics is fundamentally deterministic and is not generating the probabilities in a fundamental, law-like way. This picture is in some ways very elegant."
    },
    {
      "end_time": 826.408,
      "index": 31,
      "start_time": 805.589,
      "text": " Provided you're okay with a pilot wave living in a high dimensional configuration space. Although I should say that Goldstein, Dürr, and Zanghì have already proposed the idea that the Bohmian pilot wave is law-like and not a physical thing. So there are other ways to read this theory. The problem is it helps itself to a lot of very special features"
    },
    {
      "end_time": 848.217,
      "index": 32,
      "start_time": 826.817,
      "text": " of models that consist of fixed numbers of finitely many non-relevant particles, features that are unavailable when you go to more general systems like fields. So you end up having to write down a very different looking model, including in some cases models that you need to now deal with stochasticity and indeterministic dynamics, and they just don't really work very well when you try to go beyond."
    },
    {
      "end_time": 878.712,
      "index": 33,
      "start_time": 849.275,
      "text": " One of the other things that Bohm mechanics requires is a preferred foliation of space-time. So last time we spoke, we talked about how in special relativity, there's no preferred way to take space and time and divide it up into moments of time, like different ways to do it. The guiding equation, the equation that takes the pilot wave and explains how the pilot wave obeying the Schrodinger equation, how the pilot wave guides the particles around, they call the guiding equation, depends on there being a preferred foliation of space-time, a slicing of space into moments of time."
    },
    {
      "end_time": 905.538,
      "index": 34,
      "start_time": 879.394,
      "text": " It's also not really great. It works fine in the non-relativistic case, but we want to do relativistic physics like we often do when we want to do quantum field theory, which is the kind of models we use when we want to deal with special relativity and quantum mechanics together, as in the standard model. Preferred foliation is really difficult to deal with. Not impossible, but it'd be nice if we didn't need it. In the indivisible stochastic approach, there's no guiding equation. There's no pilot wave."
    },
    {
      "end_time": 936.067,
      "index": 35,
      "start_time": 906.408,
      "text": " It's not that you solve the Schrodinger equation, get a pilot wave, and then take the pilot wave and plug it into a guiding equation, which depends on a portfolio and then the guiding. None of that happens. There's just the indivisible stochastic dynamics, which can be represented in Hilbert space language. But the dynamics is just directly happening. There's no middleman. There's no pilot wave and guiding equation in the middle. This means the theory is not going to be deterministic. I think one question the comments is, is this fundamentally deterministic or not? It's indeterministic. It's not a deterministic theory."
    },
    {
      "end_time": 961.817,
      "index": 36,
      "start_time": 936.749,
      "text": " But because there's no guiding equation, there's no preferred foliation. And because we're not relying on all these special features of the particle case, it's perfectly easy to now generalize this to more general kinds of systems. Have you done it? Have I done it? Good question. So there's this thing called time, and time is bounded and limited. Is it? It is, amazingly."
    },
    {
      "end_time": 982.261,
      "index": 37,
      "start_time": 962.329,
      "text": " In your framework? At least in my life. Okay."
    },
    {
      "end_time": 997.398,
      "index": 38,
      "start_time": 982.927,
      "text": " The term here is straightforward in principle to generalize this to quantum fields because none of the obstructions are there like they were before. One of the problems with Bohmian mechanics is your wave function has to live in a space, configuration space."
    },
    {
      "end_time": 1023.456,
      "index": 39,
      "start_time": 997.978,
      "text": " And fermionic particles don't have a familiar kind of configuration space. This is what makes it so hard to do bone mechanics. But there's no pilot wave here, so you just don't even have that obstruction. So many of the things that would have obstructed us from just applying this to any kind of system are just, they're just not there anymore. So if you want to deal with a field theory, you just replace particle positions with localized field intensities. These become your degrees of freedom. And then you just apply the stochastic laws to them and it works the usual way."
    },
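What "indivisible stochastic dynamics that can be represented in Hilbert space language" can mean is easiest to see in a toy two-state model in the spirit of the approach being discussed: transition probabilities come from mod-squared unitary matrix elements. The specific Hamiltonian (Pauli-X) is an arbitrary illustrative assumption.

```python
import numpy as np

def unitary(t):
    # U(t) = exp(-i t X) for the Pauli-X Hamiltonian (arbitrary toy choice)
    return np.array([[np.cos(t), -1j * np.sin(t)],
                     [-1j * np.sin(t), np.cos(t)]])

def transition_matrix(t):
    # Transition probabilities from time 0 to t: Gamma_ij(t) = |U_ij(t)|^2
    return np.abs(unitary(t))**2

dt = 0.3
g_one, g_two = transition_matrix(dt), transition_matrix(2 * dt)

# Each Gamma(t) is a genuine stochastic matrix (columns sum to 1) ...
assert np.allclose(g_one.sum(axis=0), 1.0)

# ... but the dynamics is indivisible: Gamma(2 dt) is not Gamma(dt) applied
# twice, so the process cannot be chopped into intermediate stochastic steps.
print(np.allclose(g_two, g_one @ g_one))  # False
```

The failure of Γ(2Δt) = Γ(Δt)·Γ(Δt) is exactly the "indivisibility" that distinguishes these processes from ordinary Markov chains.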
    {
      "end_time": 1044.053,
      "index": 40,
      "start_time": 1024.155,
      "text": " The problem with quantum field theory is that quantum fields in general is that they have infinitely many degrees of freedom, infinitely many moving parts. At every sort of point in space in the most sort of – this is a whole renormalization story of effective field theory, but at a simplest sort of bird's eye view, you have a degree of freedom at every point in space, you have infinitely many of them."
    },
    {
      "end_time": 1074.616,
      "index": 41,
      "start_time": 1044.787,
      "text": " And this makes them very mathematically difficult to deal with. Even in the traditional Hilbert space or path integral formulation, quantum field theories are really mathematically tricky. And there are very few, if any, I think there are none, rigorously defined quantum field theories that are also empirically adequate. Like none of the quantum field theories that make up the standard model have been rigorously defined. This means that anytime you mention quantum field theory, you're going to run into mathematical difficulties that are just because quantum field theory is"
    },
    {
      "end_time": 1094.94,
      "index": 42,
      "start_time": 1074.77,
      "text": " Mathematically very complicated. So I think there's a research direction for an enterprising students to not only formulate quantum field theory in this language, but also see does it make any of the mathematical difficulties easier? Do any of them become harder? Like what exactly does it look like when you do this super carefully?"
    },
    {
      "end_time": 1117.449,
      "index": 43,
      "start_time": 1095.367,
      "text": " As you know on theories of everything we delve into some of the most reality spiraling concepts from theoretical physics and consciousness to AI and emerging technologies to stay informed"
    },
    {
      "end_time": 1128.916,
      "index": 44,
      "start_time": 1117.637,
      "text": " In an ever-evolving landscape, I see The Economist as a wellspring of insightful analysis and in-depth reporting on the various topics we explore here and beyond."
    },
    {
      "end_time": 1153.524,
      "index": 45,
      "start_time": 1129.377,
      "text": " The economist's commitment to rigorous journalism means you get a clear picture of the world's most significant developments, whether it's in scientific innovation or the shifting tectonic plates of global politics. The economist provides comprehensive coverage that goes beyond the headlines. What sets the economist apart is their ability to make complex issues accessible and engaging, much like we strive to do in this podcast."
    },
    {
      "end_time": 1175.299,
      "index": 46,
      "start_time": 1153.524,
      "text": " If you're passionate about expanding your knowledge and gaining a deeper understanding of the forces that shape our world, then I highly recommend subscribing to The Economist. It's an investment into intellectual growth, one that you won't regret. As a listener of Toe, you get a special 20% off discount. Now you can enjoy The Economist and all it has to offer for less."
    },
    {
      "end_time": 1204.957,
      "index": 47,
      "start_time": 1175.299,
      "text": " What is it about quantum field theory that makes it not rigorously defined other than the path integral? Because there are other approaches to quantizing than the path integral."
    },
    {
      "end_time": 1235.145,
      "index": 48,
      "start_time": 1205.64,
      "text": " So what makes it hard? Uh, not hard, not rigorously defined, not rigorously defined. So, well, I mean, we have rigorously defined quantum field theories, but they tend to be quantum field theories defined in very low numbers of space time dimensions where you can like rigorously define all the integrations and take all the limits. Um, we have quantum field theories defined by what are called the Whiteman axioms."
    },
    {
      "end_time": 1262.329,
      "index": 49,
      "start_time": 1235.691,
      "text": " But these axioms are very strong and preclude the kinds of quantum field theories that seem most apt to describe sort of nature. There's so many different angles I could take for this. Let me just, I'll just pick one. So here's one way to see what can go wrong. If you take quantum electrodynamics, which is the quantum field theory that best describes electrons, and if you want you can add some of the heavier cousins of electrons like muons,"
    },
    {
      "end_time": 1291.92,
      "index": 50,
      "start_time": 1262.841,
      "text": " and interacting with photons with electromagnetic fields. I should say, by the way, that most of what we do when we do quantum field theory is not look at particles dancing around. What we do is we introduce in the asymptotically distant past a so-called in-state, a quantum state vector that consists of some menu, some assortment of particles that are supposed to come into the experiment, and then we write down some menu of outgoing particles."
    },
    {
      "end_time": 1330.811,
      "index": 52,
      "start_time": 1322.5,
      "text": " You might go, how do we know a particle is going to come out? Well, we don't. We're going to be computing a probability that this goes to that. So we start with some incoming particles. We start with some proposed outgoing particles."
    },
    {
      "end_time": 1355.009,
      "index": 53,
      "start_time": 1331.288,
      "text": " And then, using either the path integral or other calculational methods, we compute the so-called complex-valued scattering amplitude. It's the complex number you get when you put these things together. It's the complex number that when you mod squared is supposed to be connected to the probability you'll get that particular outcome. In practice, what we do is compute what are called scattering cross-sections, which is like what fraction come out one way, what fraction come out another way."
    },
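The mod-squared step is simple to show concretely. A hedged sketch with made-up amplitude values (in real calculations these complex numbers come out of the path integral or perturbation theory):

```python
import numpy as np

# Hypothetical complex scattering amplitudes for three mutually exclusive
# outgoing-particle menus (illustrative numbers only)
amplitudes = np.array([0.6 + 0.3j, -0.2 + 0.5j, 0.1 - 0.4j])

weights = np.abs(amplitudes)**2    # mod-squared amplitude per outcome
probs = weights / weights.sum()    # normalized: what fraction comes out each way

print(probs)
assert np.isclose(probs.sum(), 1.0)
```

The normalized fractions are the cross-section-like numbers the speaker describes: relative frequencies of outcomes over many repetitions of the experiment.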
    {
      "end_time": 1366.749,
      "index": 54,
      "start_time": 1355.009,
      "text": " Notice these are all phrased in a way that is perfectly consistent with the textbook formulation of quantum mechanics. We're not asking what's going on in between. We're not dealing with macroscopic systems. We're doing exactly what the textbook axioms ask us to do which is"
    },
    {
      "end_time": 1395.469,
      "index": 55,
      "start_time": 1367.125,
      "text": " You prepare, you compute probabilities of measurement outcomes. All the averages and numbers you're doing are like experiments repeated large numbers of times. So you're not going to run in for the most part to any of the fundamental inconsistencies or ambiguities in textbook quantum theory. So it's very easy to do quantum field theory and think there's no problem. Everything is great. We're doing quantum field theory. What's the problem? It's because most of what you're doing just doesn't run into any of those ambiguities you run into with the axioms. Now,"
    },
    {
      "end_time": 1425.742,
      "index": 56,
      "start_time": 1396.732,
      "text": " This theory is very useful and we can compute a lot of stuff. We can't compute everything. There's some ingredients that you have to take from experiments, right? The so-called physical couplings you have to go out and measure and you plug them into the model. Because if you just sort of naively try to compute everything from first principles, what you discover is that certain quantities you might want to compute depend very sensitively on sort of the upper boundary of what you've put on the theory."
    },
    {
      "end_time": 1450.572,
      "index": 57,
      "start_time": 1426.237,
      "text": " So when you study a theory like this, you recognize you can't access arbitrarily high energy physics. Our experiments don't pump in more than a certain amount of energy. So we shouldn't extrapolate the theory to arbitrarily high energies. We're going to put a cutoff on the theory. We're only going to discuss what's happening to theory up to some energy level, some energy cutoff. It's just that some of the things you might want to calculate from first principles depend sensitively on the cutoff, and those are things your theory cannot provide you with."
    },
    {
      "end_time": 1477.79,
      "index": 58,
      "start_time": 1451.032,
      "text": " So we have to take some things from empirical data and put them in. We plug them in. They become some of the parameters in our theory. The standard model has about 20 or so of these parameters you have to take from the experiment and plug them in. And once you have them, you can now make huge numbers of non-trivial, highly accurate predictions about what happens. But you still have this upper boundary. And if you try to calculate things at arbitrarily high energies, eventually your calculations stop working."
    },
    {
      "end_time": 1506.63,
      "index": 59,
      "start_time": 1478.183,
      "text": " So one of the dirty secrets of physics is that much of the calculating we do is highly approximate. A lot of it, when we do it by hand, is using a tool called perturbation theory, which I cover in some of my courses. Perturbation theory is a systematic recipe for predicting, for calculating things. And this recipe just doesn't work very well once you start trying to push your predictions beyond a certain energy level. There is a trick you can do to change what the theory looks like as you study the different energy levels. It's called renormalization."
    },
    {
      "end_time": 1530.708,
      "index": 60,
      "start_time": 1507.09,
      "text": " And what you find is that some of the parameters in the theory, they stop having values that make it possible to do consistent perturbation theory. Now, if you wanted to rigorously define a quantum field theory, what you want to do is take some kind of a limit where you can study the theory at arbitrarily high energies. This corresponds, roughly speaking, to being able to assign degrees of freedom to arbitrarily small points in space."
    },
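A classic concrete instance of parameters "stopping having values that make consistent perturbation theory possible" is the one-loop running of the QED coupling, which blows up at the so-called Landau pole. The sketch below uses the standard leading-log formula; treat the exact beta coefficient as a from-memory assumption.

```python
import math

ALPHA_0 = 1 / 137.036     # fine-structure constant at the reference scale
BETA = 2 / (3 * math.pi)  # one-loop QED coefficient for a single fermion species

def alpha(log_q):
    # Leading-log running coupling; log_q = ln(Q / reference scale)
    return ALPHA_0 / (1 - ALPHA_0 * BETA * log_q)

# The coupling creeps upward with energy ...
assert alpha(10.0) > alpha(0.0) > 0

# ... until the denominator hits zero and perturbation theory breaks down
landau_log = 1 / (ALPHA_0 * BETA)
print(landau_log)  # ln(Q / reference) at the pole: an absurdly high scale
```

The pole sits at ln(Q/reference) of several hundred, an energy far beyond anything physical, which is why QED works so well in practice despite not being ultraviolet complete.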
    {
      "end_time": 1560.691,
      "index": 61,
      "start_time": 1531.34,
      "text": " And you see there's immediately an obstruction here. For most of our theories, there's a cutoff. There's a boundary we can't go beyond. The theory simply doesn't let us go to arbitrarily high energy scales. And so we're not going to be able to write down a so-called ultraviolet complete, like perfectly fine-grained version of this theory. We can only use a theory up to some cutoff. The standard model, for example, is not expected to hold to arbitrarily high energy scales. We think that the theory is reliable only up to"
    },
    {
      "end_time": 1583.882,
      "index": 62,
      "start_time": 1561.647,
      "text": " All of this takes us pretty far, no pun intended, a field of what we were talking about before, but these are the kinds of things like"
    },
    {
      "end_time": 1610.93,
      "index": 63,
      "start_time": 1584.411,
      "text": " Maybe quantum field theories in the real world, real life out there in the wild, quantum field theories are never perfectly defined. Maybe all we have is a cascade of so-called effective field theories that are all well defined within some bounds. And there is no ultimate theory that is like perfect and fine-grained and, you know, perfectly fine-grained and ultraviolet complete. Maybe there's just like a tower of these theories. This makes questions of ontology and what's physically out there, I think, very murky."
    },
    {
      "end_time": 1638.831,
      "index": 64,
      "start_time": 1611.834,
      "text": " Because if we think there's never going to be some fundamental theory at the bottom of all of this, what really is out there in nature? That's, I think, an open question. Whether or not you want to phrase that question in a language of this sort of indivisible stochastic approach, I don't know. Or it could be that these theories tap out to some ultimate quantum field theory or some very different kind of theory like string theory or something like that. I mean, there are many proposals for maybe where this terminates. But"
    },
    {
      "end_time": 1668.456,
      "index": 65,
      "start_time": 1639.787,
      "text": " I don't know. What I will say is this, though. There is a view that nature is fundamentally described by Hilbert spaces and quantum mechanics, the Hilbert space formulation of quantum mechanics. And if that's fundamental, then we already know what nature fundamentally is. Nature fundamentally is some wave function. That's it. Some wave function in some Hilbert space. We don't exactly know the features of the wave function. We don't know whether it's best described in terms of fields or something else."
    },
    {
      "end_time": 1691.561,
      "index": 66,
      "start_time": 1668.797,
      "text": " But we already know the fundamental ontology of nature. It's a wave function. So we're good. I think that's too ambitious. In the indivisible stochastic approach, there's no wave function. So the wave function is not what the ontology is. The wave function is whatever your choice of configurations are. And if you're modeling particles, you use particle configurations. If you're modeling fields, you use field configurations. If there's some"
    },
    {
      "end_time": 1715.469,
      "index": 67,
      "start_time": 1691.988,
      "text": " Ultimate system that grounds everything else, some system at the bottom, some system that if we understood it and understood its laws, we would have the unified theory of all the physics. There would presumably be some configurations for that and we would use those instead, but we don't know that theory yet. And so I think it's premature to think that we know the right fundamental degrees of freedom. So if someone asks me, what do I think is fundamentally out there?"
    },
    {
      "end_time": 1739.77,
      "index": 68,
      "start_time": 1716.032,
      "text": " I don't know, but then I'm just where we were 100 years ago or 150 years ago. We don't know the ultimate theory of nature yet, and I think it's premature to at this point guess what the ultimate ontology is going to be until we have that theory, should we ever have it. Well, I'm interested in ultimate theories. Theory of everything is the name of your podcast. Sorry, I don't have one for you. Well, I'm interested in your thoughts into how to merge quantum field theory with gravity."
    },
    {
      "end_time": 1767.483,
      "index": 69,
      "start_time": 1740.299,
      "text": " So I know we have a slew of audience questions and we're going to get to them, but they're going to have to wait because I have these questions first. These are, these are, these are like close to what I wanted to also talk about. So go on. Yeah. Great. Great. Okay. So two questions here. People say that, okay, look, we have this Heisenberg uncertainty and that applies even in QFTs. And so thus space time is, is shaky. Okay."
    },
    {
      "end_time": 1793.848,
      "index": 70,
      "start_time": 1768.012,
      "text": " But space-time in QFT is defined. You can perfectly pick out an X comma T and the values of creating particles in the fields themselves vibrate or are uncertain, but the space-time itself is there and given. However, some people say that if you were to zoom in and you follow QFT because of Heisenberg's uncertainty, you thus get to uncertainties of space-time itself."
    },
    {
      "end_time": 1818.422,
      "index": 71,
      "start_time": 1794.48,
      "text": " Is that argument valid? Now, I imagine that one of the ways that they get to this argument is by saying you have an energy time uncertainty in general relativity, space time itself has energy. And so the space time itself must have some uncertainty to it. But then you could also say, well, in QFT, you don't know if the energy that it's talking about is the same of, well, okay."
    },
    {
      "end_time": 1846.749,
      "index": 72,
      "start_time": 1818.643,
      "text": " If you have a statement that applies to all natural numbers, you can't just say that any, so X squared is always going to be a natural number if you're pulling from natural numbers, but not every number squared is a natural number. So it depends on the scope of what you're quantifying over. So is it the case that in Q of T we can apply the energy time uncertainty to GR? I don't know. Cause so that's one question. We should address that question first before we ask any more questions. Yeah. So, um,"
    },
    {
      "end_time": 1868.831,
      "index": 73,
      "start_time": 1847.705,
      "text": " It's important to step back here and just make sure we know what we're all talking about. So a quantum field, just because maybe not everyone knows what they are. So in a typical Hilbert space formulation of a quantum system, we have observables. Observables are these self-adjoint operators. And in a quantum field, we have operators associated with all points in space."
    },
    {
      "end_time": 1895.913,
      "index": 74,
      "start_time": 1869.07,
      "text": " And if we work in a formulation in which we move the time evolution out of the state vectors and into the observables, we have what's called the Heisenberg picture, and then our field operators depend on space and time. It's a fancy way of saying everywhere in space-time we've got these sort of local operators that are associated with points in space and time. Quantum field theories like QED, we talked about quantum electrodynamics, they presuppose this classical background space-time. There's no gravity, space-time is"
    },
    {
      "end_time": 1921.254,
      "index": 75,
      "start_time": 1896.152,
      "text": " usually treated as flat. We call flat ordinary special relativity spacetime. We call it Minkowski spacetime. You can do quantum field theory in a static curved spacetime. Still not treating gravity as dynamical, but that gets very complicated. Let's start to start with quantum field theories like QED on special relativity flat non-dynamical spacetime."
    },
    {
      "end_time": 1950.503,
      "index": 76,
      "start_time": 1921.578,
      "text": " In that case, you're right. x, y, z, the coordinates of where you are, and t, do not fluctuate. They are fixed features of the background architecture of space-time. They're the stage on which the action is happening. Your question about the uncertainty principle and about fluctuations of fields is an interesting question. In the Dirac-von Neumann formulation of quantum mechanics, nothing is fluctuating between measurements because nothing is happening between measurements. The only things that are happening are measurements in the Dirac-von Neumann formulation."
    },
    {
      "end_time": 1978.063,
      "index": 77,
      "start_time": 1950.981,
      "text": " So to say, oh, when you're not measuring it, the fields are just like dancing. The Dirac von Neumann axioms don't say that. They say nothing about it. They don't say that particles are zooming around. The Dirac von Neumann axioms don't let you say, oh, the reason this happened was a photon emitted an electron. All that is for color. I said this in one of our earlier conversations that physicists often talk this way. They're like, oh, this happened because an electron emitted that and did this and the field was fluctuating. If you're only working on the Dirac von Neumann axioms, all of that is just"
    },
    {
      "end_time": 2009.138,
      "index": 78,
      "start_time": 1979.292,
      "text": " fluff fluff none of it is really legitimated by the axioms now if you're frustrated by that you're like well but surely something is happening i want to be able to say something is happening well then you're on my side which is that we need to do something to the right one axioms you're making my case for me okay so the uncertainty principle that the traditional we talked about this a little earlier an observable you know corresponds to a certain basis and when the state vector of your system is aligned with one axis of that basis you're definitely that result when you measure it"
    },
    {
      "end_time": 2039.394,
      "index": 79,
      "start_time": 2009.48,
      "text": " If the state vector is not aligned with that basis vector, it's got components along multiple basis vectors, then you're going to have probabilistic measurement results given by the Born rule. And you can be aligned along the axis of one observable and have a definite result, but not along one axis of another observable. And you don't have a definite result. And if you change the state vector so that it's aligned along the axis of one, it's not aligned along the axis of the other. And this is the uncertainty principle that some observables will have sharp values that when you measure them, you always get a definite result and others won't. And if you try to make one observable sharp, others will become unsharp. This is the uncertainty principle."
    },
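The aligned-with-one-basis-but-not-another picture is easy to make concrete with a single qubit and the standard Pauli Z and X observables (the choice of observables here is an illustrative assumption):

```python
import numpy as np

def born_probs(state, basis):
    # Born rule: the probability of each outcome is the mod-squared
    # component of the state vector along the corresponding basis axis
    return np.abs(basis.conj().T @ state)**2

state = np.array([1.0, 0.0])  # aligned with the +1 eigenvector of Pauli Z

z_basis = np.eye(2)                                         # eigenvectors of Z (columns)
x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # eigenvectors of X

print(born_probs(state, z_basis))  # definite Z outcome: [1, 0]
print(born_probs(state, x_basis))  # maximally unsharp X outcome: [0.5, 0.5]
```

Sharpening one observable (alignment with a Z axis) is exactly what makes the X outcome maximally uncertain; that trade-off is the uncertainty principle in this finite-dimensional setting.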
    {
      "end_time": 2065.06,
      "index": 80,
      "start_time": 2040.026,
      "text": " But notice again this is all phrase and level of measurements. We're not saying that between measurements anything is fluctuating. So honestly there's like no way to really talk about what like to say that the field is fluctuating on top of the space-time or to say anything more about the Heisenberg's theory principle other than this is the pattern of measurement results we get when we do measurements on the system. Now if you want to do something like Bohmian mechanics or the indivisible stochastic approach"
    },
    {
      "end_time": 2086.101,
      "index": 81,
      "start_time": 2065.503,
      "text": " or the many worlds approach or something like that. Now we can actually begin to talk about what's happening between measurements because these are all theories that describe things happening that are not just the narrow class of measurements. In some of the theories, like in the indivisible stochastic approach, there's stochastic behavior. The fields really are fluctuating."
    },
    {
      "end_time": 2109.326,
      "index": 82,
      "start_time": 2086.647,
      "text": " In the bohmian approach, it kind of depends on whether you're trying to get fields into a deterministic kind of bohmian approach or whether you're going to allow the fields to be stochastic in some sense. There are many different formulations of bohmian mechanics for fields and I can't do justice to all of them. Some of them the fields would be fluctuating, some of the fields wouldn't be. In some of them you deny that there are fields and try to do everything with particles somehow. Many worlds is a little more subtle because in many worlds"
    },
    {
      "end_time": 2139.258,
      "index": 83,
      "start_time": 2110.316,
      "text": " There's not one reality in which things are fluctuating. It's just more subtle and we'll talk about the many worlds approach a little bit later. So I wanted to just get that out of the way before we then talk about the more subtle question about is space-time fluctuating. So when you go from field theory like quantum field theory like QED, where again the thing you're primarily computing is scattering amplitudes. You set up a preparation, you get measurement results, you're computing cross-sections, decay rates, those sorts of things. Now you want to talk about gravity."
    },
    {
      "end_time": 2167.722,
      "index": 84,
      "start_time": 2139.906,
      "text": " So in general relativity, gravity is a manifestation of the change in curvature of space-time. Space-time doesn't stay flat. It curves. People often ask, where is it curving? Is it curving in some other dimension? There's a way to define curvature called intrinsic curvature that does not make reference to other dimensions. You can define it solely in terms of the four dimensions of space plus time, so you don't need an extra dimension for curvature. But there is this notion of intrinsic curvature, and if"
    },
    {
      "end_time": 2192.329,
      "index": 85,
      "start_time": 2168.097,
      "text": " Gravity is quantum mechanical. Does that mean that the curvature or the shape of space time or the geometry is also fluctuating in some sense? Now there's this discussion about, well, if you zoom in, does it look, I mean, I don't even, I don't exactly know what zooming in means. Do you mean if you're doing measurements or something like that? I mean, if we do very like precise measurements on a quantum fields on a background space time, we may get a large variance of results."
    },
    {
      "end_time": 2221.715,
      "index": 86,
      "start_time": 2192.875,
      "text": " But those are measurement outcomes. It's not the field doing anything between measurements because, again, without augmenting the Dirac-Vinoman axioms, we can't talk about what the fields are doing. Is space-time fluctuating? Well, according to the Dirac-Vinoman axioms, we can't say that. We could only say something like, if you do some kind of measurement of space, it's fluctuating. But I don't quite know how to measure space the way we would measure the intensity of a field. It's kind of subtle because"
    },
    {
      "end_time": 2245.128,
      "index": 87,
      "start_time": 2222.5,
      "text": " The relationship between the gravitational field and the curvature of space-time and the behavior of test particles, of particles moving around on space-time, it's really subtle. Even the notion of energy is super subtle in general relativity. There isn't an invariant non-trivial definition of local energy density in the gravitational field itself in general relativity."
    },
    {
      "end_time": 2271.374,
      "index": 88,
      "start_time": 2245.776,
      "text": " It's actually really hard to pin down even what we mean by all of this. And we're not going to be guided by experiments because we would expect to see distinctly quantum mechanical features of the gravitational field. Unless there's some miracle, we wouldn't expect to see that until you're talking about Planck scale physics. Planck scale physics is the physics associated with distance scales that are like 10 to the minus 43 meters."
    },
    {
      "end_time": 2300.111,
      "index": 89,
      "start_time": 2271.783,
      "text": " Right. I mean, they're as far from an atom as like an atom is from like the observable universe, something like that. I mean, they're so far away from maybe not have to work out the exact orders of magnitude, but the Planck scale is really, really small. I guess that's 10 to the minus 43 seconds. That's the Planck time. The Planck scale, the length scale is 10 to the negative 35 meters. So like these are these are super duper tiny, tiny distance scales. And we can't do experiments there. So we have no experimental data to guide us."
    },
    {
      "end_time": 2325.828,
      "index": 90,
      "start_time": 2300.674,
      "text": " So this is exactly a situation in which we need the kind of careful, rigorous scrutiny that one gets from, yes, understanding the physics as well as we can, but also having a strong background in philosophy, because it's very easy to make statements that are super speculative, that build speculations on top of speculations, to make what I call speculative metaphysical hypotheses, SMHs, and the acronym is not an accident,"
    },
    {
      "end_time": 2355.452,
      "index": 91,
      "start_time": 2326.186,
      "text": " Just to tower them on top of each other and then not to know whether what you're saying is something that's genuine and reliable. So I don't have any idea whether we should be thinking about space-time as truly fluctuating. The indivisible stochastic approach, like all approaches to quantum mechanics, faces fundamental first-order conceptual difficulties in dealing with space-time that fluctuates like a curving dynamical space-time."
    },
    {
      "end_time": 2383.831,
      "index": 92,
      "start_time": 2355.776,
      "text": " Let me explain why. In order to talk about stochastic probabilities and division events and all this stuff, you need some notion of what you mean by time, by slices of the universe at constant time. You need the ability to talk about which directions in space-time are the directions that are space-like directions and which directions are the time-like directions."
    },
    {
      "end_time": 2413.319,
      "index": 93,
      "start_time": 2384.241,
      "text": " When you want to specify your configuration of your system, you're doing it at a time over some region of space. And so you really need to know which slices of space-time are the space slices. And that's all well and good when you're doing Newtonian or non-relativistic space-time, even in quantum mechanics, not necessarily Newtonian, or even special relativistic space-time. In special relativistic space-time, you're given which directions are time and which directions are space, and they're just fixed."
    },
    {
      "end_time": 2443.831,
      "index": 94,
      "start_time": 2414.36,
      "text": " But when you consider dynamical space-time, so space-times in which the so-called metric tensor, which is the kind of field-ish thing associated with space-time and general relativity, the metric tensor is the thing that tells you which directions are time and which are space. If that is itself fluctuating, you don't know a priori which directions are the space directions and which directions are the time directions. So you can't even obviously phrase a probabilistic theory. And this is very curious."
    },
    {
      "end_time": 2475.179,
      "index": 95,
      "start_time": 2445.179,
      "text": " For one thing, it means it's very difficult to understand whether space-time fluctuates, even in an invisible stochastic theory, because it's hard to even specify, like, where do I put my probabilities? What is my initial? These probabilities are conditional. They connect one configuration to another at one time to another time. But if I don't know which directions are time directions, how do I do this if the space-time itself is itself fluctuating? That's interesting. But what's also interesting is it highlights a gap"
    },
    {
      "end_time": 2501.22,
      "index": 96,
      "start_time": 2476.22,
      "text": " the scientific study of quantum gravity. So here's a very interesting thing. We can take classical Newtonian physics, we can numerically simulate it in a computer, and we can also model many Newtonian systems probabilistically as Markov processes. Often the Markov approximation is perfectly well and can be used all the time to model Newtonian systems, to model other kinds of systems."
    },
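The Markov modeling mentioned above can be sketched in a few lines. This is a minimal illustration, not anything from the conversation itself; the two-state system and transition probabilities are hypothetical:

```python
import numpy as np

# A Markov process: the next-step probabilities depend only on the current
# distribution, via a fixed column-stochastic transition matrix (hypothetical values).
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])        # each column sums to 1

p = np.array([1.0, 0.0])          # start certainly in state 0

# Divisibility: evolving 50 steps is just applying T fifty times in a row.
for _ in range(50):
    p = T @ p

# The chain forgets its initial condition and relaxes toward the stationary
# distribution [2/3, 1/3], the eigenvector of T with eigenvalue 1.
print(p)
```

An indivisible process, by contrast, would not decompose into repeated applications of a single short-time map like `T`.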
    {
      "end_time": 2526.169,
      "index": 97,
      "start_time": 2502.807,
      "text": " And there are stochastic methods, stochastic formulations of other physical theories beyond Tony mechanics. There isn't one for general relativity. So Einstein's theory of general relativity is not a probabilistic theory. Einstein's theory of general relativity is a deterministic type theory."
    },
    {
      "end_time": 2554.599,
      "index": 98,
      "start_time": 2526.8,
      "text": " It's more subtle. There's some questions over whether it's always formulated in a Markovian way. So there's even some evidence from general relativity, even just ordinary general relativity, that maybe the Markovian picture is not quite the right picture. And there's some amazing people like Emily Adlam, who's at Chapman University, philosopher of physics, and Eddie Chen at UCSD, and Shelley Goldstein at Rutgers, who"
    },
    {
      "end_time": 2584.002,
      "index": 99,
      "start_time": 2554.94,
      "text": " are trying to think about laws of physics in a different more sort of global space-time sense that would fit better with theories like general relativity. And there may be some connections to indivisible non-Markovian type laws. But in any event, general relativity is phrased in sort of a deterministic non-probabilistic way. And people are trying to work on quantum gravity now, but you might have asked, shouldn't we have worked on an intermediate step first? What about just a probabilistic version of general relativity?"
    },
    {
      "end_time": 2607.466,
      "index": 100,
      "start_time": 2584.377,
      "text": " Like a formulation of general relativity that is stochastic, that is, we take Einstein's field equations, the equations that describe the deterministic shape of space-time, and replace them with a probabilistic version, not a quantum version, just a probabilistic version, like as a stepping stone. You might have thought that would be the natural thing to do before trying to go to a fully quantum version of the theory."
    },
    {
      "end_time": 2636.903,
      "index": 101,
      "start_time": 2608.507,
      "text": " As far as I know, there's been very little work done in that area. I could be missing something. I haven't seen everything that's been written and maybe people will see this and they'll chime in in the comments and say, wait a second, there's a theory where this is happening. And there's current work. I mean, I know that Oppenheim is working on a stochastic version of general relativity, but this is recent, right? Like this is not 50 years ago. So I think this is a huge target for research. And it kind of makes sense this would happen. I mean, general relativity, you know,"
    },
    {
      "end_time": 2665.401,
      "index": 102,
      "start_time": 2637.534,
      "text": " finished. The level of the Einstein field equation being fully formulated in November of 1915. You know, Einstein is giving these super high stakes lectures, the Prussian Academy of Sciences, and he's scrambling to finish the theory in between the lectures, and he manages to do it. And, you know, and then Schwarzschild comes along and writes on the Schwarzschild solution shortly, you know, in the beginning of 1916. But there's this whole story that Schwarzschild was doing it in the trenches of World War One, he was not in the trenches."
    },
    {
      "end_time": 2693.746,
      "index": 103,
      "start_time": 2666.101,
      "text": " Yeah, there's this really great paper, I think by Dennis Lemkul, who's a historian of science, who's great. He was like, Schwarzschild was actually stationed in this very nice house. And, and, and he was in the war, but he wasn't he wasn't doing it. But anyway, so, so people obviously was developed like 1915 1916. Stochastic process theory was not developed at the time. Right? I mean, even like,"
    },
    {
      "end_time": 2717.688,
      "index": 104,
      "start_time": 2694.514,
      "text": " Kolmogorov's axiomization of probability theory that comes in 1933, right? So that's, that's all that's like 17 years, 18 years after general relativity. And that's not even stochastic. I mean, random variables don't start becoming prominently used until like the 40s and 50s maybe. And I think like a, like a well developed theory of stochastic processes"
    },
    {
      "end_time": 2746.425,
      "index": 105,
      "start_time": 2718.114,
      "text": " If I'm not mistaken, and again, my history on the theory of stochastic processes may be somewhat mistaken, so people can correct us, but I believe it wasn't until later, like the 50s and 60s. I mean, Markov introduced Markov matrices already in like 1906, but like fully building out like an actual comprehensive theory of stochastic processes, that comes decades later. And people had already been working on quantum gravity for decades by this point. I mean, you know, people began trying to do quantum gravity"
    },
    {
      "end_time": 2760.299,
      "index": 106,
      "start_time": 2747.261,
      "text": " like the 1920s. I mean, Pauli is already trying to quantize general relativity by the late 1920s. And people are already like giving up and pulling their hair out and saying you can't do it right already, like decades before there's a theory of stochastic processes."
    },
    {
      "end_time": 2783.831,
      "index": 107,
      "start_time": 2760.981,
      "text": " So it's actually not so surprising historically that no one said maybe before we do quantum gravity, we should do probabilistic general relativity and see if we can do that. And there have been a lot of proposals to do this, you know, maybe what you do is you want an ensemble of space time a block universes or maybe but it's like not clear that any of these are really the right way to do it. I have some suspicions. And this is this is now me"
    },
    {
      "end_time": 2813.626,
      "index": 108,
      "start_time": 2784.821,
      "text": " doing something I don't want to do which is just like speculation but you know what let's just let's just speculate. Surmise away. I think a fully probabilistic version of general relativity and I don't mean taking general relativity and adding some small noise terms like small corrections I mean like a fully fully probabilistic generalization of general relativity I think that would either teach us a lot about quantum gravity or even potentially be quantum gravity because remember"
    },
    {
      "end_time": 2841.067,
      "index": 109,
      "start_time": 2814.172,
      "text": " the indivisible stochastic approach doesn't start with Hilbert spaces. It's just probability, just very non Markovian probability. There's a sense in which general relativity in its most general formulation is like not exactly I mean, depending on the nature of the space time, if you've got certain kinds of spaces and certain properties, you can formulate it as a kind of initial value problem. But like, there's something about general relativity that's a little different from the laws of other theories. And I have a suspicion that if you could fully probabilize the theory,"
    },
    {
      "end_time": 2867.705,
      "index": 110,
      "start_time": 2841.681,
      "text": " You basically be doing indivisible stochastic mechanics but for gravitational field and that would already be the theory of quantum gravity. Now that is super conjectural. I want to be super clear. I have not worked on this in any depth. It would be very interesting to study this problem more but this is the kind of question you can begin to ask because if you think that you have to start with Hilbert spaces you'd go well it must be the case that quantum gravity is going to be some Hilbert space thing or some generalization of Hilbert spaces"
    },
    {
      "end_time": 2896.834,
      "index": 111,
      "start_time": 2868.097,
      "text": " But because we didn't have to start with Hilbert spaces, we can now ask much more basic questions like what's just probabilistic general relativity, indivisible probabilistic general relativity, and is that already all we need? That's not to say it's easy because again, when you have a dynamical space time, it's very hard to talk about where you even put the conditional probabilities, but at least it centers the question on something a little more basic. And I think this comports with a couple of other principles I think that one gets from thinking philosophically about doing physics. One is"
    },
    {
      "end_time": 2921.22,
      "index": 112,
      "start_time": 2897.227,
      "text": " It's usually better to isolate problems as much as you can and deal with them in the simplest circumstances. I would much rather try to deal with probabilities and general relativity first before I try to do all of quantum gravity, right? Like let's study problems in their simplest initial incarnation. Let's not teach people quantum mechanics by starting with quantum field theory. Let's start with the simplest kind of systems and add complexity step by step rather than doing it all at once."
    },
    {
      "end_time": 2936.169,
      "index": 113,
      "start_time": 2921.766,
      "text": " That's one thing and the other thing is the idea that when approaching problems or conceptual confusions or trying to progress on a very thorny set of theoretical questions involving one of our best physical theories, sometimes you don't want to just build stuff on the end."
    },
    {
      "end_time": 2963.78,
      "index": 114,
      "start_time": 2936.63,
      "text": " Sometimes we have to do is you have to go down into the deep programming of the model and do some debugging. So for people who've done computer programming, you know that sometimes in a program isn't working. It's not because the end of your code is wrong. Sometimes it's because like way at the beginning of your code, you made some mistakes and to debug it, you have to go all the way back to the beginning and really start with the definitions, like how you've defined certain variables or how you define certain functions and like make sure all those definitions are really good before you proceed."
    },
    {
      "end_time": 2993.183,
      "index": 115,
      "start_time": 2964.804,
      "text": " And that's kind of what I'm doing here. Rather than trying to glom gravity onto Hilbert space quantum mechanics, I'm saying maybe we need to go and ask some very foundational questions first. Debug this program all the way down to the roots of the axioms. Make sure the axioms really make sense. And I can give a very concrete example of where this breaks down. So we talked about the uncertainty principle. Another thing that you compute in quantum mechanics are expectation values. Now in a previous conversation we talked about an expectation value is an average."
    },
    {
      "end_time": 3023.797,
      "index": 116,
      "start_time": 2994.053,
      "text": " You have some observable thing you want to measure, and you know the quantum state of the system, and you can compute its average. And there is this way of thinking about those averages, that they're averages of just stuff happening, of phenomena happening, but they're not. They're defined by the Dirac von Neumann axioms as statistical averages of numerical measurement outcomes weighted by their corresponding measurement outcome probabilities, and that's it."
    },
    {
      "end_time": 3047.688,
      "index": 117,
      "start_time": 3027.073,
      "text": " If you're not measuring stuff, there's no average there. But there are a lot of physicists who think that when you put brackets around something, which is the notation for an expectation value, we no longer have to think about measurements anymore. We can just think about it as stuff happening."
    },
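What the Dirac-von Neumann axioms do license can be checked numerically. The observable and state below are hypothetical examples, not anything specific from the discussion; the point is that the bracket expectation value equals the eigenvalue outcomes weighted by their Born-rule probabilities, and nothing more:

```python
import numpy as np

# Hypothetical self-adjoint observable and normalized state.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
psi = np.array([1.0, 0.0])

eigvals, eigvecs = np.linalg.eigh(A)        # possible measurement outcomes a_i
born = np.abs(eigvecs.conj().T @ psi) ** 2  # Born-rule outcome probabilities p_i

direct = psi.conj() @ A @ psi               # the "bracket": <psi|A|psi>
weighted = np.sum(eigvals * born)           # statistical average: sum_i p_i * a_i

# The two agree: the bracket is, by definition, an average of measurement outcomes.
print(direct, weighted)
```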
    {
      "end_time": 3071.869,
      "index": 118,
      "start_time": 3048.49,
      "text": " So people will say something like, well, you know, quantum mechanics predicts measurements, and if you measure something, you'll get one of the eigenvalues, you'll get it with the Born rule. How do we get the classical limit? Oh, what we do is we take expectation values, we average everything, and then we show that those averages evolve in time the way that classical observables evolve in time, and this is how the classical limit happens. But this is clearly wrong, because"
    },
    {
      "end_time": 3094.718,
      "index": 119,
      "start_time": 3072.824,
      "text": " at least if you don't think everything is a measurement. I mean, if you want to argue that every phenomenon happening is a kind of measurement, then you can do this. But then you have the onus of trying to show that if you're not willing to say that everything happening is a measurement, you've got a problem because things are happening all over the place. Objects are sitting on Mars, not falling down and primordial gases are mixing."
    },
    {
      "end_time": 3124.019,
      "index": 120,
      "start_time": 3096.067,
      "text": " And you can't just put brackets around them and the quantum mechanical things and just say these are things happening because those brackets only refer to measurement averages and if there's no measurements happening then those things aren't happening. The conflation of a quantum mechanical measurement expectation value with just on average this is what's going on, the conflation of those two things, measurement averages and on average stuff is just happening in a certain way, is pervasive in the literature."
    },
    {
      "end_time": 3150.981,
      "index": 121,
      "start_time": 3124.377,
      "text": " So if you take, for example, I'm sorry to mention this because I really like this book. It's Shankar's book, Principles of Quantum Mechanics. Wrote a book. It's a wonderful book. It's a big pink book on quantum mechanics. And chapter six is called the classical limit. And the entire chapter is based on this fallacy. That you just put brackets around things and then you can treat them like classical variables where they're just happening and no one's measuring them. But it's just wrong. Now, at least according to the Dirac-Von Nomen axioms. Now,"
    },
    {
      "end_time": 3178.097,
      "index": 122,
      "start_time": 3152.056,
      "text": " If you're willing to augment or change the Dirac-Vinomon axioms and turn quantum theory from a theory of only measurements to a theory of phenomena happening generally, like in an indivisible stochastic approach or bohmian mechanics or avaredian many-worlds interpretation, then you are legitimated in doing this. But you need something to take measurement averages and turn them into just averages of things happening. This is just a category problem again, that we're only referring the Dirac-Vinomon axioms to this very narrow category of measurement outcomes,"
    },
    {
      "end_time": 3208.387,
      "index": 123,
      "start_time": 3178.814,
      "text": " we're not the larger category of things that want to be able to be happening. So how does this then affect quantum gravity and quantum gravity? We often take quantum mechanical things, put brackets around them, and then plug them into the Einstein field equation and treat them like they're classical things. So one of the starting assumptions of semi classical quantum gravity, which is where we try to sort of mix a little quantum is we take the distribution of matter broadly construed, broadly construed matter is"
    },
    {
      "end_time": 3234.718,
      "index": 124,
      "start_time": 3209.36,
      "text": " like massive particles, massive objects, but also electromagnetic fields are considered a form of matter. Really anything that's not the gravitational field that can source gravitational fields or respond to gravitational fields we call matter. And what we do is we take the quantum mechanical observables, these self-adjoint operators, we put brackets around them, call them averages, pretend that they're classical averages, and then put them into the Einstein field equation."
    },
    {
      "end_time": 3250.043,
      "index": 125,
      "start_time": 3235.367,
      "text": " and"
    },
    {
      "end_time": 3279.565,
      "index": 126,
      "start_time": 3250.64,
      "text": " And they will stick a bunch of functions into one of these integrals. Sometimes to make things more well-defined, they will take time and give it an imaginary part and even rotate the time axis in the complex plane to imaginary time to make the integrals more well-defined. And they'll compute these things called correlation functions. And sometimes I'll have a conversation with someone who does this and I'll say, what is this correlation function? They'll be like, oh, it's a correlation function. It's an average. And I'm like, but"
    },
    {
      "end_time": 3306.852,
      "index": 127,
      "start_time": 3280.162,
      "text": " Your universe has no observers in it. And you're describing a situation in quantum gravity in which there's like no planets or people or measurements happening. So what is this an average of? Are you saying that these quantities are just doing things and we're averaging them? That's not legitimated by the Dirac, Feynman axioms. So what is the physical meaning of these quantities that you're writing down?"
    },
    {
      "end_time": 3336.288,
      "index": 128,
      "start_time": 3308.08,
      "text": " and you know sometimes a lot of you know very sophisticated conversation about this and we actually get you know make some progress on it but but a lot of times people like I actually don't even know what I'm doing right so this is what I mean when I when I say that like applying rigorous scrutiny to the things we're computing like beyond the mathematics like what do they mean is actually kind of important because otherwise you might find yourself writing things down you don't even know what exactly it is you're writing down I think what this gets across is that the difference between quantum mechanics and general relativity is actually much deeper than I think"
    },
    {
      "end_time": 3353.422,
      "index": 129,
      "start_time": 3337.108,
      "text": " I mean, we all know that there are differences. Quantum mechanics is supposed to be this sort of fluctuating probabilistic theory and general relativity is supposed to be based on smooth spacetimes and things like this and how do we reconcile them? But I think that the difference between them is even deeper. General relativity is a theory of things happening."
    },
    {
      "end_time": 3374.872,
      "index": 130,
      "start_time": 3353.933,
      "text": " General relativity is a theory in which you have an Einstein field equation, you impose appropriate boundary conditions, you introduce whatever distribution of matter and energy and sources you want in your space time, and then you find a space time with the right geometry that satisfies all the constraints and obeys the Einstein field equation. And this is the space time which things happen."
    },
    {
      "end_time": 3402.978,
      "index": 131,
      "start_time": 3375.555,
      "text": " Projectiles follow what are called geodesics. If they're only subject to gravitational forces, geodesics can cross, they can meet, they can intersect five times. People can, you know, you can compute various and variant quantities. Not everything in general relativity is relative. Some things are invariant. They're like things are happening in this universe. In quantum mechanics, at least the textbook Dirac-Vinoman picture, all you've got are measurement outcomes."
    },
    {
      "end_time": 3434.497,
      "index": 132,
      "start_time": 3404.531,
      "text": " at the beginning and end of"
    },
    {
      "end_time": 3464.258,
      "index": 133,
      "start_time": 3434.974,
      "text": " You set things up and you take measurements at the end. In the asymptotic past, you set up your initial state, the asymptotic future. We take the times to be infinitely in the past, infinitely in the future. These are obviously just approximations. And we're just computing measurement results, cross-section scatter, you know. But in general, in quantum gravity, we're trying to describe what space-time is doing. We're trying to understand like what's happening to space-time. And those are questions that just are beyond the kinds of things that we usually do when we're doing QED."
    },
    {
      "end_time": 3492.841,
      "index": 134,
      "start_time": 3464.787,
      "text": " We're demanding more of quantum gravity. We're demanding more of a picture, more of a description than the textbook quantum mechanics has been designed to provide. And so I think that if you want to do quantum gravity and really tell a story, tell a picture, paint a rigorous picture of what's happening in space-time, you're just not going to be able to do it with textbook Dirac von Neumann-Hilbert space quantum mechanics. You're going to need a theory of something in order to describe a space-time where something is actually happening."
    },
    {
      "end_time": 3504.275,
      "index": 135,
      "start_time": 3493.183,
      "text": " Hope that makes sense. So I think there are reasons why a conceptual shift in how we think about quantum mechanics may be necessary before we are able to address certain deep problems in quantum gravity."
    },
    {
      "end_time": 3525.794,
      "index": 136,
      "start_time": 3504.906,
      "text": " Hi everyone, hope you're enjoying today's episode. If you're hungry for deeper dives into physics, AI, consciousness, philosophy, along with my personal reflections, you'll find it all on my sub stack. Subscribers get first access to new episodes, new posts as well, behind the scenes insights, and the chance to be a part of a thriving community of like-minded pilgrimers."
    },
    {
      "end_time": 3577.09,
      "index": 138,
      "start_time": 3549.701,
      "text": " Great answer. Okay, so let's get to some of the questions about Bell. Yes. So people had questions about Bell's inequalities and how they're represented in your framework. Good. Yeah. So ultimately Bell's theorem is about entangled systems. So I have to say a little bit about entanglement. We've got to talk about entanglement first. What is entanglement according to usual textbook quantum mechanics? This is what entanglement"
    },
    {
      "end_time": 3609.599,
      "index": 139,
      "start_time": 3582.449,
      "text": " Composite systems systems where you've got two systems not one system anymore. That could be the superposition of two states, but two systems so suppose that I have system a and it's the state one and I've got system B and it's the state one prime and that's all I have Well, then we would say okay the composite system is in the state one and one prime and"
    },
    {
      "end_time": 3629.224,
      "index": 140,
      "start_time": 3609.872,
      "text": " System A is in state one, system B is in state one prime. That's all there is to say. I could also imagine that system A is in state two and system B is in state two prime and the composite system is in the state two, two prime. Perfectly fine. I could also imagine that system A alone is in a superposition of one and one prime."
    },
    {
      "end_time": 3657.978,
      "index": 141,
      "start_time": 3630.043,
      "text": " Let's say 1 over root 2 times 1 plus 1 over root 2 times 1 prime, because in quantum mechanics, when we superpose, we put a number in front, and that number when you square it is supposed to be related to a measurement probability. The 1 over root 2s have the property that you square them, they become halves, and you add them, you get 1. That's probabilities adding up. You can imagine the system A is in the state 1 over root 2 1 plus 1 over root 2 1 prime. You can imagine system B is in the state 1 over root 2 2"
    },
    {
      "end_time": 3676.544,
      "index": 142,
      "start_time": 3658.336,
      "text": " plus one over two two prime and you could imagine that those are the states two systems now the composite system is also in it so the composite system is in the state well it's hard to say let me call the first state psi the greek letter psi psi is the state one over two one plus one over two one prime and let's"
    },
    {
      "end_time": 3706.971,
      "index": 143,
      "start_time": 3677.841,
      "text": " I'm sorry, I did my numbering wrong. It's one plus two and two and sorry, because A can be in the state one or one prime, one or two and state and system B can be in state one prime or two prime. I got it wrong. My apologies. Okay. So, so psi will say the Greek letter psi, trident symbol psi will be one over root two state one plus one over root two state two and psi prime, which corresponds system B is"
    },
    {
      "end_time": 3737.227,
      "index": 144,
      "start_time": 3707.415,
      "text": " Si prime is the state that represents 1 over root 2 1 prime plus 1 over root 2 2 prime. And I can say that the composite system is in the state Si comma Si prime. If I multiply everything out, I'll get four terms. There'll be a term that's 1 half 1 1 prime plus 1 half 1 2 prime plus 1 half 2 1 prime plus 1 half 2 2 prime."
    },
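The multiplying-out step can be reproduced directly: the composite state of two independent systems is the tensor (Kronecker) product of the individual states, and for the two equal superpositions above it yields exactly the four one-half terms. A sketch, with the hypothetical basis ordering |1,1'>, |1,2'>, |2,1'>, |2,2'>:

```python
import numpy as np

# System A: (1/sqrt2)|1> + (1/sqrt2)|2>; system B: (1/sqrt2)|1'> + (1/sqrt2)|2'>.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
psi_prime = np.array([1.0, 1.0]) / np.sqrt(2)

# Tensor product, in basis order |1,1'>, |1,2'>, |2,1'>, |2,2'>.
composite = np.kron(psi, psi_prime)

print(composite)                 # four terms, each with coefficient 1/2
print(np.sum(composite ** 2))    # squared coefficients sum to 1
```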
    {
      "end_time": 3766.425,
      "index": 145,
      "start_time": 3737.671,
      "text": " We would say this is not an entangled state because it's factorizable. I can factorize it into psi next to psi prime. Psi for system A, psi prime for system B, I would say these are not entangled. And you can show that when they're not entangled, they also have statistical independence. If you do measurements on them and compute measurement probabilities, you'll find that they are statistically uncorrelated. But now let me propose a different quantum state. This quantum state is going to be"
    },
    {
      "end_time": 3797.585,
      "index": 146,
      "start_time": 3768.114,
      "text": " 1 over root 2, 1, 1 prime plus 1 over root 2, 2, 2 prime, and that's it. Just those two terms. Notice this is a superposition, but over both systems. And now I've got 1, 1 prime in one term and 2, 2 prime in the other term. And I don't have all those. I don't have the 1, 2 prime term. I don't have the 2, 1. They're not there. I only have 1, 1 prime plus 2, 2 prime. That's it. I cannot factorize that into two different states."
    },
    {
      "end_time": 3805.162,
      "index": 147,
      "start_time": 3797.79,
      "text": " There's no psi for the first system and psi prime for the second that would let me describe them both as having their own states. We would now say those are entangled."
    },
    {
      "end_time": 3832.995,
      "index": 148,
      "start_time": 3805.828,
      "text": " Just a quick question here. So people who are driving and they're listening to this or people, maybe they have a pen and paper and they're thinking, okay, well, I'm going to try to multiply some states to get that. And then they don't. So then they wonder, okay, but just because I tried some, I didn't get to it. Is there a way that I can look at this and then prove that there exists no factorizable component? Yeah, there is. And that's the way to think about this is just it's forgetting to foil when you do arithmetic. So if someone gives you, for example, I've got X plus Y over here,"
    },
    {
      "end_time": 3861.51,
      "index": 149,
      "start_time": 3833.592,
      "text": " and I've got W plus Z over here and I multiply X plus Y as a quantity times W plus Z as a quantity, I get four terms. I get X W plus X Z plus Y W plus Y get four terms. If I see those four terms, I know I can refactorize them and write them as a thing, X plus Y times other thing, Z plus W. But if I only give you X Y plus"
    },
    {
      "end_time": 3891.732,
      "index": 150,
      "start_time": 3861.766,
      "text": " Sorry, not XY. XW plus ZY. I only give you those two things. You can't factorize them. They don't factorize into a thing times a thing. This is like an arithmetic example of entanglement, basically. Now, entanglement is usually phrased as something that has no classical correspondence. There is nothing like entanglement classically. In fact, in a 1935 paper, Schrodinger wrote"
    },
    {
      "end_time": 3916.049,
      "index": 151,
      "start_time": 3892.432,
      "text": " that entanglement was not a but the feature of quantum mechanics and forced its distinction from the classical case. You can also link to that paper. I'll send you a link to it. Now, you might go well, there are certainly some things that are like entanglement. For example, you know, john bell has this"
    },
    {
      "end_time": 3945.265,
      "index": 152,
      "start_time": 3916.323,
      "text": " Paper Bertelman socks. He talks about this guy Bertelman who's got socks and the socks are always different. If you know what color one sock is, you'll know what the other color is not the same. There are systems in which, for example, if I have someone preparing coins and they always prepare the coins so that when one is heads the others are always tails. Always. And you discover one is heads and you know the other is tails. They're correlated. Even if the coins are very far apart when you look at them. If you prepare the coins and send them far apart and you look at one coin it's heads, you know the other one even is very far away as tails."
    },
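The coin-pair picture can be sketched in a few lines of Python (an invented toy, not anything from the conversation): the anticorrelation is fixed at preparation time, so learning one coin tells you the other without anything passing between them.

```python
import random

# Classical "Bertlmann" correlation: a preparer always makes one coin
# heads and the other tails. Looking at one instantly tells you the
# other, but nothing travels between them; the anticorrelation was
# baked in when the pair was prepared.

def prepare_pair(rng):
    first = rng.choice(["heads", "tails"])
    second = "tails" if first == "heads" else "heads"
    return first, second

rng = random.Random(0)
pairs = [prepare_pair(rng) for _ in range(10_000)]
print(all(a != b for a, b in pairs))  # perfectly anticorrelated, purely classically
```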
    {
      "end_time": 3967.892,
      "index": 153,
      "start_time": 3945.538,
      "text": " This is called correlation. And if you do this over many coins and the coins are flipped, you don't always know what you'll get heads or tails, but you know the results will be correlated, statistically correlated. So statistical correlation certainly happened classically, but entanglement is stronger than that. And that's one of the things that Einstein, Podolsky and Rosen and Bell, they were trying to get at this"
    },
    {
      "end_time": 3996.442,
      "index": 154,
      "start_time": 3968.268,
      "text": " feature of entanglement that is somehow stronger. You get correlations that are stronger than you would think could be possible on normal how we usually reason about classical probability theory for systems that are widely separated from each other. To explain the Bell inequality, I have to start with where it came from. So Bell's theorem in 1964 is in a paper called on the Einstein-Podolsky-Rosen paradox. He's referring to a 1935 paper by Einstein-Podolsky-Rosen."
    },
    {
      "end_time": 4024.292,
      "index": 155,
      "start_time": 3997.381,
      "text": " So I have to talk about that paper and what they did and then what Bell was supposed to do. You should link to a copy of that paper. People should read it. I don't know how many physicists have actually sat down and read that paper really carefully, but it's and even Einstein wasn't super happy with it. He was a little upset about how it finally came out. But it is a very important paper to read. You mean the EPR paper? This is the famous EPR paper. Yep. Yeah. It's a very subtle argument, but it basically boils down to this."
    },
    {
      "end_time": 4054.94,
      "index": 156,
      "start_time": 4025.589,
      "text": " If I've got two quantum systems, and they're entangled, I prepare them, I prepare them in some state that's entangled, and to get them entangled, something has to be local between them. Either they have to be together initially, or you have to send something from one to the other, but some kind of, at some point, local thing should happen in order to get them entangled with each other. And then you send one of the systems very, very far away. This is a weird thing about entanglement."
    },
    {
      "end_time": 4084.667,
      "index": 157,
      "start_time": 4055.606,
      "text": " When I measure the first system, usually people do these thought experiments, they imagine Alice and Bob. Alice has the first system and Bob is very far away with the second system. Alice does a measurement on her system and she could measure a variety of different observable features. She could measure some observable feature and when she does it, she will know if you have the right kind of entanglement, she'll know exactly what Bob will get when he does his measurement."
    },
    {
      "end_time": 4115.384,
      "index": 158,
      "start_time": 4085.623,
      "text": " She'll measure observable A, she'll get some answer, and then she'll know, ah, I got this answer because of the entanglement, I know what Bob will get. Bob will definitely get this other answer. But Alice didn't have to measure that thing. She could instead have measured a different observable. She could have measured observable A prime, a different observable that is not compatible with A in the same way that position is not compatible with momentum, which is what they originally used in the EPR paper. The original EPR paper was written in terms of position and momentum, but these are incompatible observables, they obey an uncertainty principle. If you know one, you don't know the other with certainty."
    },
    {
      "end_time": 4142.21,
      "index": 159,
      "start_time": 4116.067,
      "text": " And so she measures A'. She can make Bob's system collapse, have a definite answer for a different observable. Okay? So she can steer Bob's system. This is called quantum steering. The word quantum steering was introduced by Schrodinger shortly after the EPR paper. Because it feels like Alice's choice of measurement, she measures A or A' is like steering Bob's system."
    },
    {
      "end_time": 4169.445,
      "index": 160,
      "start_time": 4142.995,
      "text": " Now, the steering does not send signals. Again, there's this theorem called the no signaling and no communication theorem that shows that Alice cannot deliberately send controllable messages this way. The steering is something more subtle and can't be used to send signals or communication. This is rigorously established because of this theorem. Nonetheless, there is some sense in which she is somehow steering Bob's system. She'll measure observable A. She doesn't control what answer she gets. A is uncertain. She could get this, she could get that."
    },
    {
      "end_time": 4197.841,
      "index": 161,
      "start_time": 4169.753,
      "text": " Depending on what she gets, Bob will get a certain corresponding thing, but because she can't control what she gets, she can't control what Bob gets. She just knows that once she's done her measurement and gets a certain result, she knows that Bob, if he decided to measure the same thing, she'd know exactly what he would get. If Alice instead measures A', she'll collapse Bob's system to a different basis, and whatever result she gets, she'll know Bob, if he measured that corresponding observable, she'd know exactly what he would get."
    },
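A minimal numerical sketch of this point, assuming numpy and writing the entangled state (|1,1'> + |2,2'>)/sqrt(2) as a 2x2 amplitude matrix (the function name and rotation parametrization are mine): conditional on Alice's outcome, Bob's state depends on her chosen basis, but his unconditional statistics never do, which is the no-signaling part.

```python
import numpy as np

# Steering without signaling for (|1,1'> + |2,2'>)/sqrt(2). Alice
# measures in a basis rotated by angle a. Whatever she gets, Bob's
# conditional state is pinned down, but Bob's unconditional outcome
# probabilities are independent of a.

def bob_marginal(a):
    amp = np.eye(2) / np.sqrt(2)              # amp[j, k] for |j>|k'>
    c, s = np.cos(a), np.sin(a)
    U = np.array([[c, s], [-s, c]])           # rotate Alice's basis
    rotated = U @ amp                         # rows index Alice's outcomes
    return (np.abs(rotated) ** 2).sum(axis=0) # Bob's statistics alone

for a in (0.0, 0.3, np.pi / 4):
    print(bob_marginal(a))   # always [0.5, 0.5], whatever basis Alice picks
```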
    {
      "end_time": 4226.323,
      "index": 162,
      "start_time": 4199.002,
      "text": " Now there's two possibilities as far as EPR, Einstein-Belsky-Rosen were concerned. Either Alice's decision is really changing Bob's system. And Bob's system, when they do the experiments, could be a light year away. And that would seem to be something superluminal, unacceptable happening, faster than light happening. But if not, Bob's system must already have known what answer it would get if he measured the first observable."
    },
    {
      "end_time": 4252.705,
      "index": 163,
      "start_time": 4226.493,
      "text": " and what answer he'd get if measure the second observable because Alice could measure either of hers and depending on what she measures she can make Bob's system have a definite value of one measures everyone has a definite value of the other and if Alice is not really changing Bob's system Bob system must have known all along what it was going to have they called their paper can and they leave out the but can the quantum mechanical description of reality be considered it complete"
    },
    {
      "end_time": 4281.596,
      "index": 164,
      "start_time": 4253.217,
      "text": " They're saying that unless you allow something faster than light to be happening, Bob's system must already know the answers it should yield for all of his measurements because Alice cannot possibly, by her choice of measurement, be doing the steering. So an EPR basically establishes that there is a logical fork. Either you allow faster than light influences of some kind, causal influences of some kind, or"
    },
    {
      "end_time": 4311.067,
      "index": 165,
      "start_time": 4282.568,
      "text": " There are hidden extra parameters and the wave function, the standard approach to quantum theory is incomplete. There's more to the story than just the wave function. That's where Bell starts. In 1964, he says, well, so here's what they said. They said that either you have non-local or some kind of causal influence happening that's going from Alice to Bob,"
    },
    {
      "end_time": 4367.585,
      "index": 167,
      "start_time": 4342.312,
      "text": " Think Verizon, the best 5G network, is expensive? Think again. Bring in your AT&T or T-Mobile bill to a Verizon store today and we'll give you a better deal."
    },
    {
      "end_time": 4398.814,
      "index": 168,
      "start_time": 4370.606,
      "text": " And then write down a simple example of a quantum mechanical system that violates it, that you can go out and do an experiment and check that it violates it."
    },
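As a concrete illustration of the violated inequality, a hedged sketch: for polarization measurements on the entangled state discussed earlier, textbook quantum mechanics gives the correlation E(a, b) = cos(2(a - b)) between analyzers at angles a and b, and the standard CHSH combination of four settings then exceeds the bound of 2 that local hidden-variable models must obey. The angle choices below are the usual optimal ones, not numbers from the transcript.

```python
import numpy as np

# CHSH check: quantum correlation E(a, b) = cos(2(a - b)) for the
# entangled polarization state, evaluated at the standard settings.

def E(a, b):
    return np.cos(2 * (a - b))

a1, a2 = 0.0, np.pi / 4            # Alice's two measurement settings
b1, b2 = np.pi / 8, 3 * np.pi / 8  # Bob's two measurement settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)   # 2*sqrt(2) = 2.828..., above the local hidden-variable bound of 2
```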
    {
      "end_time": 4427.995,
      "index": 169,
      "start_time": 4399.718,
      "text": " So in other words, Bell is trying to close a possible way out of the non-local causation. EPR says there's either non-local causation or hidden variables, and Bell is saying, well, even with hidden variables, you still get non-local causation. Therefore, quantum theory is simply a non-local theory, and that's the end of the story. That's what he did. This theorem has gone through a giant game of telephone. So first of all, I should say that the paper"
    },
    {
      "end_time": 4456.118,
      "index": 170,
      "start_time": 4428.882,
      "text": " was like published in an underbell was not and he was a particle physicist doing this foundational work on the side and he would caution people against doing foundational work because it was considered very bad for your career which is really shameful I mean physics is supposed to be an intellectual enterprise and closing down avenues of intellectual investigation of exploration is just anti-intellectual that's a shame but his paper"
    },
    {
      "end_time": 4484.94,
      "index": 171,
      "start_time": 4457.005,
      "text": " somehow eventually became more widely known. And it's like through a game of telephone. Eventually, people began thinking that what he did was prove there couldn't be hidden variables. And people would say, oh, you have a hidden variables theory? That's ruled out. Bell said there can't be hidden variables. In fact, the Nobel Prize was given for experimental tests of the violation of the Bell inequality, right? There's this Nobel Prize that was given to Clauser and and Zeilinger and and"
    },
    {
      "end_time": 4512.722,
      "index": 172,
      "start_time": 4486.067,
      "text": " was asked by I think was asked by all. And it says if you look at the press release for the Nobel Prize, it says that Bell proved there couldn't be hidden variables. And this Nobel Prize is given because they proved hidden variables are impossible. That's not what Bell showed at all. In fact, not only did Bell not show that, but he said in the paper, that's not what he was showing. In fact, he begins the paper by talking about Bohmian mechanics. He says, Bohmian mechanics is at least for"
    },
    {
      "end_time": 4529.48,
      "index": 173,
      "start_time": 4513.541,
      "text": " Systems of fixed numbers of finitely many non-relativist particles, a perfectly empirically adequate theory of quantum mechanics. It is grossly non-local, that's the words he used for it. Could there be a hidden variables theory that is better behaved than Bohmian mechanics when it comes to locality?"
    },
    {
      "end_time": 4554.787,
      "index": 174,
      "start_time": 4529.804,
      "text": " And what he was showing was that there wasn't. But his argument wasn't that, okay, well, as long as you get rid of hidden variables, you can keep locality. He thought EPR had shown that if you don't have hidden variables, then you definitely have non-locality. So it wasn't like he was saying, well, it's hidden variables or locality. He was saying, EPR said it was"
    },
    {
      "end_time": 4584.889,
      "index": 175,
      "start_time": 4554.923,
      "text": " non-locality or hidden variables and in fact in variables still non-locality non-locality is just all you get that's what he thought he was doing and this paper has been widely misinterpreted Bell himself in later writings complained about how people kept misinterpreting his paper either not reading it carefully or getting it second hand or I guess like in the opening of what we talked about the textbook that said oh Bell prove Bell showed that the orthodox approach is the only approach right I mean that's not what Bell said so"
    },
    {
      "end_time": 4613.968,
      "index": 176,
      "start_time": 4585.998,
      "text": " Okay, but then what do we go from here? Bell claimed that he'd shown that quantum mechanics was just non-local full stop. But the EPR paper, the original EPR paper, and Bell's 1964 paper, these are arguments. They're mathematical arguments, especially Bell's paper, which is a theorem."
    },
    {
      "end_time": 4644.019,
      "index": 177,
      "start_time": 4615.265,
      "text": " And you have to be very careful when you talk about theorems in a physical context. So we were talking earlier about inductive, deductive, all these different arguments. In pure mathematics, a theorem begins with premises. The premises should be rooted ultimately, if you have to, in whatever the axioms are of the field you're working in. Maybe they go back to the axioms of set theory, who knows. And then you go through"
    },
    {
      "end_time": 4656.101,
      "index": 178,
      "start_time": 4645.128,
      "text": " a sequence of logically valid mathematical arguments that culminate in some conclusion"
    },
    {
      "end_time": 4683.302,
      "index": 179,
      "start_time": 4657.073,
      "text": " Good correct premises and your logic was valid. You have a sound proof. You have a sound deductive argument and you're done. And if anyone wants to claim there's something wrong, they're going to have to either challenge your premises or challenge your reasoning. And if they're both good, then you're just good. So Euclid proves the infinitude of the prime numbers. That's a great example. You begin with certain premises about how the natural numbers work. And then you have this logical argument that leads to the conjecture that there cannot be a biggest prime number."
    },
    {
      "end_time": 4712.534,
      "index": 180,
      "start_time": 4683.763,
      "text": " as long as you're willing to take on the axioms, the standard axioms that we use for arithmetic. But physical theorems, theorems like Bell's theorem, theorems that are about physics, the Cauchy and Specker theorem, the PBR theorem, that's the Puzi-Berut-Rudolf theorem, there are all these other theorems that are so-called physical theorems. And these can suffer from another problem."
    },
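Euclid's argument from a moment ago can even be run as a computation, a sketch of the reasoning rather than a formal proof (the helper name is mine):

```python
from math import prod

# Euclid's argument as a computation: given any finite list of primes,
# (product of the list) + 1 is divisible by none of them, so any prime
# factor of it is new. Hence no finite list contains every prime.

def a_new_prime(primes):
    n = prod(primes) + 1
    d = 2
    while d * d <= n:          # trial division to find a prime factor
        if n % d == 0:
            break
        d += 1
    else:
        d = n                  # n itself is prime
    assert d not in primes     # the factor is never on the original list
    return d

print(a_new_prime([2, 3, 5, 7]))   # 211, since 2*3*5*7 + 1 is itself prime
print(a_new_prime([2, 7]))         # 3, since 2*7 + 1 = 15 = 3 * 5
```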
    {
      "end_time": 4742.261,
      "index": 181,
      "start_time": 4713.387,
      "text": " They can succeed as mathematical theorems. They begin with mathematically formulated ingredients that you use in the premises and then you proceed through rigorous logical deductive reasoning and you arrive at a conclusion that's the theorem you've claimed to prove. And that can all be fine. But your theorem is just floating out in math world unless it connects to something in the physical world. And that connection is where there can be a problem. So your mathematical ingredients"
    },
    {
      "end_time": 4768.677,
      "index": 182,
      "start_time": 4743.148,
      "text": " Aren't just supposed to be pure math anymore. They're supposed to have physical reference and I I'm sorry the way a singular is referent Referent is singular reference is plural. They're supposed to have things out in the world that they are representing and the things they're representing Need to be sufficiently rigorously defined and the connection between those reference and"
    },
    {
      "end_time": 4796.92,
      "index": 183,
      "start_time": 4769.206,
      "text": " and the mathematical representations. The connections have to be sufficiently rigorous. And if either of those two things breaks down, we have a connection problem. I call it the connection problem. So let's take Bell's 1964 theorem as a good example of this, okay? Well, let's even go back to EPR. Let's go back to the EPR paper. The EPR paper is a good example. So the EPR paper has premises. There are premises to the EPR paper. One premise is that wave functions collapse."
    },
    {
      "end_time": 4823.285,
      "index": 184,
      "start_time": 4797.739,
      "text": " when we do measurements on them. Another premise is of course the Drakvon-Norman axioms, which include collapse. Another premise is that we have a notion of causal influence that can be cashed out in terms of interventions by agents. I needed an Alice and a Bob to talk about this. Alice is an agent who does an intervention on her system. We call it a measurement in this case. Bob is also an agent who does an intervention."
    },
    {
      "end_time": 4854.667,
      "index": 185,
      "start_time": 4825.026,
      "text": " The interventionist theory of causation is one particular way to talk about causal influences. According to the interventionist conception of causation, to say that a thing A causally influences another thing B is just to say that if an agent comes along and intervenes in some way on A, there will also be a change in B. That's what it means to say that A causal influences B. But if there are no agents and there are no interventions,"
    },
    {
      "end_time": 4877.227,
      "index": 186,
      "start_time": 4855.367,
      "text": " Then what do we do with this theory of causal influence? And you might go, well, there are observers. But if you want a theory of quantum mechanics or theory of physics in which observers and measurements and measuring devices are not part of the fundamental axioms, you're going to have a lot of trouble talking about causation in that kind of a theory."
    },
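Interventionist causation can be sketched as a toy structural model (all numbers invented): a chain C -> A -> B, where "A causally influences B" is cashed out as "intervening to set A changes the distribution of B."

```python
import random

# Toy interventionist causation: C influences A, A influences B.
# An agent's intervention do(A = value) overrides A's usual mechanism;
# the resulting shift in B's statistics is what "A causes B" means
# on this conception.

def run(do_a=None, rng=random):
    c = rng.random() < 0.5
    a = rng.random() < (0.9 if c else 0.1)
    if do_a is not None:
        a = do_a                      # the agent's intervention on A
    b = rng.random() < (0.8 if a else 0.2)
    return b

rng = random.Random(2)
n = 50_000
p_b_do_a1 = sum(run(True, rng) for _ in range(n)) / n
p_b_do_a0 = sum(run(False, rng) for _ in range(n)) / n
print(p_b_do_a1, p_b_do_a0)   # roughly 0.8 vs 0.2: intervening on A moves B
```

The definition only gets off the ground because an agent and an intervention appear in the code, which is exactly the worry raised here.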
    {
      "end_time": 4908.166,
      "index": 187,
      "start_time": 4878.456,
      "text": " If you try to do EPR and subtract out the agents and subtract out the intervention and subtract out the wave function collapse, it's actually really hard to talk about what's happening in the thing. So the extent that you take all these things on, sure, you have this rigorous statement, sort of rigorous statement about what's going on. It could fail because the reasoning is bad, but could also fail because there aren't agents out there and there aren't interventions out there. And you might go, well, again, what do you mean? I mean, there are people, Alice and Bob,"
    },
    {
      "end_time": 4936.903,
      "index": 188,
      "start_time": 4912.466,
      "text": " phrase it to meet the level of the atoms. Are atoms intervening? Are atoms agents? And then you actually run into kind of a deep problem. Like if you really are asking me to phrase this not with people, not with measuring devices, but the level of the atoms, the individual atoms that are not making decisions and freely choosing to do things and doing interventions and acting as agents, I don't even know what this theory of causation is supposed to mean."
    },
    {
      "end_time": 4950.606,
      "index": 189,
      "start_time": 4937.517,
      "text": " If you don't have a theory of causation, you don't have a theory of causal influence and you don't have a theory of non-local causal influence and the whole argument just breaks down. This is a thorny problem because causation"
    },
    {
      "end_time": 4977.841,
      "index": 190,
      "start_time": 4951.118,
      "text": " is just like a nightmare subject in metaphysics. People have been trying to understand causation for a very, very long time. Causation is one of these things where we feel like we kind of intuitively understand causation. In fact, Kant even argued that cause and effect was like built into our brain architecture. We needed to think of the world in this way. But it's really hard to pin down what you mean rigorously by cause and effect, especially if you're trying to start from physics. So there's a view of physics."
    },
    {
      "end_time": 4999.565,
      "index": 191,
      "start_time": 4978.541,
      "text": " Around the turn of the beginning of the 1800s, the sort of Laplacian view of physics, all there is is just the state of the universe, all the particles in the universe with their positions and velocities, that's it. At one snapshot in time, and then there is just a giant differential equation, the laws of physics as a giant Markovian differential equation,"
    },
    {
      "end_time": 5027.073,
      "index": 192,
      "start_time": 4999.974,
      "text": " that take this state of the whole universe, all the particle positions and velocities and tell you the next infinitesimally in time state and the next one and also the previous ones. And that's all there is. That's all there is to physics. That's all there is to the development of the evolution of physical systems. From this point of view, there's no sense in which that rock over there"
    },
    {
      "end_time": 5056.391,
      "index": 193,
      "start_time": 5027.773,
      "text": " is causally responsible for the motion of that rock exactly right because it's like you don't need that you have the overall state and it's just sort of propagating forward and backward this giant differential equation there is no role to be played by having these extra ingredients these idle wheels these notions of causal influences now when we teach Newtonian mechanics we often talk about oh why did that rock begin to accelerates because this other rock exerted a force on it right this other rock"
    },
    {
      "end_time": 5083.933,
      "index": 194,
      "start_time": 5056.869,
      "text": " exerted a force on this rock and therefore cause it to move. But if you step back and look at the entire universe, there's just some giant state evolving forward by some differential equation. It doesn't look like there's any place for causation in this picture, at least at a fundamental microphysical level. Bertrand Russell said in the beginning of the 20th century, you know, causation is a relic. I think he said it was like the British monarchy."
    },
    {
      "end_time": 5107.79,
      "index": 195,
      "start_time": 5084.582,
      "text": " It's something that continues to persist under the erroneous assumption that it does no harm. He thought you didn't need causation anymore in physics, at least at the micro-physical level, we should just get rid of it. Of course, if you get rid of causation, then there's no non-local causation, and then what is Bell's theorem even about? What is EPR even about? If there's no causation, there's no superluminal causation, and then the problem is just solved."
    },
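The Laplacian picture can be sketched as a toy: a two-particle "universe" joined by a spring (an invented example), evolved by a time-reversible integrator. Run it forward, flip the velocities, run it again, and it retraces its history, with no causal ingredient anywhere in the update rule.

```python
import numpy as np

# Laplace's picture as a toy: the whole "universe" is one state
# (positions and velocities) pushed around by a single deterministic
# update rule. Leapfrog (velocity Verlet) is exactly time-reversible.

def step(x, v, dt):
    f = -(x - x[::-1]) * 0.5          # a spring between the two particles
    v_half = v + 0.5 * dt * f
    x_new = x + dt * v_half
    f_new = -(x_new - x_new[::-1]) * 0.5
    v_new = v_half + 0.5 * dt * f_new
    return x_new, v_new

x, v = np.array([1.0, -1.0]), np.array([0.0, 0.3])
x0, v0 = x.copy(), v.copy()
for _ in range(1000):
    x, v = step(x, v, 0.01)
v = -v                                # flip the velocities...
for _ in range(1000):
    x, v = step(x, v, 0.01)
print(np.allclose(x, x0), np.allclose(-v, v0))  # ...and history retraces itself
```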
    {
      "end_time": 5130.657,
      "index": 196,
      "start_time": 5110.299,
      "text": " I think if one takes the point of view, as some philosophers do, John Norton, for example, has a paper which you should also link to, it's called Causation as Folk Science, that there is no fundamental causation in nature, that science in the early days was about looking for cause and effect, but we've really become more sophisticated in that we're not trying to phrase things in cause and effect anymore."
    },
    {
      "end_time": 5158.336,
      "index": 197,
      "start_time": 5130.947,
      "text": " You know, cause and effect is language we can introduce later just to simplify how we describe things, but like we shouldn't be looking for physical theories fundamentally phrased in terms of cause and effect anymore. That's a relic of an old time. I think if you want to take that point of view, that's a self-consistent view, but then you're not going to be able to then appeal to Bell's theorem and say that there's non-local causation happening in invariable theories. If you want to talk about non-local causation, you need a theory of causation. You need to actually bite the bullet and say, we're going to talk about causal influences."
    },
    {
      "end_time": 5188.609,
      "index": 198,
      "start_time": 5159.582,
      "text": " And if you rely on interventionist causation, you run into the problem that interventionist causation just doesn't seem like the kind of fundamental microphysical definition of causation that we should be talking about. We're talking about microphysical theories like quantum mechanics. A lot of the no-go theorems that are related to Bell's original 1964 theorem, that theorem itself, the EPR paper, the GHZ argument, a lot of them help themselves to interventionist causation. They ultimately involve agents manipulating things, doing interventions."
    },
    {
      "end_time": 5217.005,
      "index": 199,
      "start_time": 5189.258,
      "text": " In a fundamental microphysical theory, which is just atoms doing the things that they're doing with no agents, no fundamental role of agents or interventions or measurements, it's not clear what these theorems are even saying. Bell wrote another version of his theorem, a generalization in 1975. It's a remarkable paper and I think not widely read enough by physicists. A lot of people, I think, tend to focus on the 1964 paper."
    },
    {
      "end_time": 5246.22,
      "index": 200,
      "start_time": 5217.176,
      "text": " The 1975 paper is much more sophisticated. I'll put the link on screen. Yeah, you should put the link. It's a great paper. It's beautifully written. And in this paper, he tries to deal with this problem. I mean, he doesn't use the words interventionism, but he tries to get away from the reliance on measurements and on collapse. He retreats to a much more primordial notion. He just says, look, even textbook quantum theory is committed to some ontology, things physically existing, measurement results,"
    },
    {
      "end_time": 5275.947,
      "index": 201,
      "start_time": 5246.544,
      "text": " Textbook quantum theory says there are measurement results. That's a thing it's committed to. Those are the beables, the things that are really out there according to textbook quantum theory. There are actual facts of the matter about how the measurements happen, and they're really out there in the world. He calls them beables. Not observables, but beables, the way things can be. Maybe that's all the beables you have. Maybe there are more beables in your theory, but textbook quantum theory only has those. And to be clear, a beable is what? An ontological entity? Yes. A beable is what you think is real."
    },
    {
      "end_time": 5295.947,
      "index": 202,
      "start_time": 5276.203,
      "text": " What you think is actually physically out there and according to textbook quantum theory you're committed at least to measuring things now. This of course raises some questions if measuring devices are out there and they're really real what are they made out of. In textbook quantum theory there's just nothing and you can't say well they're emergence because emergence requires a substrate."
    },
    {
      "end_time": 5325.674,
      "index": 203,
      "start_time": 5296.51,
      "text": " Water, fluid water emerges from water molecules. You need to have the things out of which the emergence is constructed. And in textbook quantum theory, you can't just say measuring devices are emergent without saying what are the things that it emerges from. In a theory like Bohmian mechanics or many worlds or indivisible stochastic formulation, you have those ingredients out of which the emergence is supposed to take place. But in textbook quantum theory, you don't. Okay, but that's putting all that aside. You're at least committed to the measuring devices, measurement results as the beables of the theory."
    },
    {
      "end_time": 5354.036,
      "index": 204,
      "start_time": 5326.374,
      "text": " And so Bell just, he rephrases the premises of his theory differently. He doesn't rely on interventionism. He doesn't even propose any theory of causation. He just says, look, I don't have a good theory of causation. I'm not going to give you a full comprehensive theory of causation. But I think any good theory of causation should have a certain feature. It should have a feature which today we would call Reichenbachian common cause factorization. Right."
    },
    {
      "end_time": 5383.677,
      "index": 205,
      "start_time": 5354.94,
      "text": " This is just the statement that if A is a thing correlated with another thing B, they statistically rise and fall together in some statistical way, and A and B do not causally influence each other directly, maybe because they're so far apart when they happen that they can't communicate with light, then there must be some other variable C that is causally influencing both of them. You know, so for example, if people have one condition,"
    },
    {
      "end_time": 5401.886,
      "index": 206,
      "start_time": 5384.241,
      "text": " Nicholas Cage movies are released when people tend to die from drowning in swimming pools."
    },
    {
      "end_time": 5431.766,
      "index": 207,
      "start_time": 5403.183,
      "text": " Sure. I don't know if that's an interesting suggestion. Well, it turns out that it's because Nicolas Cage releases movies in the summertime. Ah, good. Yes. So the common cause is summertime. Good. Yes, exactly. It'd be like saying, well, barometers show low pressure, and that's correlated with hurricanes. But it doesn't seem that the barometers are causing the hurricanes. Right. Or the hurricanes, which haven't happened yet, are causing the barometers. But there's a low pressure system that happens first, and this leads to both of them. OK. So this is the so-called common cause"
    },
    {
      "end_time": 5461.715,
      "index": 208,
      "start_time": 5432.005,
      "text": " And what Bell asserts is that any good theory of local causation should have the property that local beables, whatever they are, they can be measurement results, they can be beables in some other sense, he's being very general about this, but local beables associated with distant places, if they're correlated, there must exist other beables in the past, in the causal past, in the so-called overlap of their light cones. That's the fancy way of saying it. And there must be a rich enough set of those beables, a rich enough set of them,"
    },
    {
      "end_time": 5491.988,
      "index": 209,
      "start_time": 5462.312,
      "text": " That if you specify them all and know them all, then they explain the correlation in a very rigorous mathematical sense. They lead the joint probability distribution for the two things, the two beables, A and B. They lead the joint probability distribution to factorize cleanly when you condition on the local beables in the past, the common cause local beables. This is called Reichenbachian factorization. I don't know that Bell knew about Reichenbach's work. Reichenbach had formulated this idea in the 50s."
    },
    {
      "end_time": 5520.503,
      "index": 210,
      "start_time": 5493.08,
      "text": " And it's certainly something you could imagine a good theory of causation should have. Bell needed this factorization in order to derive his inequality with these weaker, more general assumptions. And this 1975 theorem was general enough that it could encompass probabilistic theories, theories with stochastic hidden variables where the hidden variables didn't uniquely determine measurement outcomes, but only determine them probabilistically. So this is a more general theorem, but he's changed his premises."
    },
    {
      "end_time": 5548.626,
      "index": 211,
      "start_time": 5520.708,
      "text": " And now he's taking on this premise that in order for a theory to count as locally causal, his principle of local causality is, again, it's locally causal if whenever we have statistically correlated local variables A and B that are far enough apart when they occur that they can't be causally influenced each other, then there must be a rich enough set of causal variables in the past that when you condition on them, the correct joint probability distribution factorizes in this neat way. And this is necessary to get the theorem."
    },
    {
      "end_time": 5569.343,
      "index": 212,
      "start_time": 5551.869,
      "text": " Now I'm not the first to suggest that Reichenbachian factorization is too strong a requirement, too strong a condition to impose on a theory"
    },
    {
      "end_time": 5653.558,
      "index": 216,
      "start_time": 5625.179,
      "text": " You know, Bill Unruh, for example, in a 2002 paper, which I can also link, has this long explanation. He says, well, yeah, I mean, the things got entangled. There was some interaction that entangled them. But in quantum mechanics, interactions are not variables or variables. They're not the kinds of things that you can condition on. There was a common cause, the interaction in the past, but it's not the right kind of common cause to get a factorization. So there's no problem here. And various philosophers of science have made this argument also. There's a bunch of papers by Jeremy Butterfield, who's a philosopher of physics at University of Cambridge."
    },
    {
      "end_time": 5678.166,
      "index": 217,
      "start_time": 5654.036,
      "text": " who also has cast doubt on reichenbach factorization. Why would we even think reichenbach factorization is good? Well, it kind of works for everyday macro world joint probabilities, but that's not a strong argument that it should also hold for micro physical probabilities. And there are already good reasons to be suspicious that in fact it should hold. But this just sets up a target."
    },
    {
      "end_time": 5696.715,
      "index": 218,
      "start_time": 5678.609,
      "text": " If you deny that Reichenbachian factorization is a good requirement of any good theory of local causation, then Bell's theorem has no teeth. It simply doesn't work anymore. Now, in a lecture Bell gave in the early 90s called La Nouvelle Cuisine,"
    },
    {
      "end_time": 5712.108,
      "index": 219,
      "start_time": 5697.466,
      "text": " which is in his collected work, Speakable and Unspeakable, is this collection of all these papers, but not the first edition, the second edition of Speakable and Unspeakable. He has this lecture, it's called La Nouvelle Cuisine, which we can also link to. He tells the 1975 theorem story over again and he modifies the premises a little bit."
    },
    {
      "end_time": 5741.578,
      "index": 220,
      "start_time": 5712.79,
      "text": " He modifies the premises a little bit. I've been having an email correspondence with a philosopher of physics, Joanna Luke, about this. She's working on a paper where she's looking at all the different formulations of Bell's theorem. And in 1991, he slightly changes the premises a little bit, so he's not relying on exactly the same kind of Reichenbachian factorization, but he still needs all these sort of like assumptions about what a good theory of causation could be. He's not proposing a theory of causation. There are many theories of causation historically,"
    },
    {
      "end_time": 5769.07,
      "index": 221,
      "start_time": 5741.783,
      "text": " there's regularity theories to say that a causally influences b is to say that when a happens b happens or later or counterfactual theories that a causes b just in a case that b would not have happened if a had not happened and and there's there's conservation law causation and there's probability raising cause there's all these theories of causation Bell doesn't propose a theory of causation he just says I think a good theory of causation should have this feature"
    },
    {
      "end_time": 5788.285,
      "index": 222,
      "start_time": 5769.821,
      "text": " And if you assume this feature, you get this inequality, the inequality is violated by quantum mechanics. Therefore, whatever quantum mechanics is, it doesn't have this feature, therefore cannot have a good theory of local causation. But he didn't propose a theory of local causation. So this is a very long way of saying, in the indivisible stochastic approach, we replace the differential equations."
    },
    {
      "end_time": 5812.108,
      "index": 223,
      "start_time": 5789.65,
      "text": " We no longer have the Schrodinger equation as a fundamental equation or Newton's laws or Maxwell equations or any of that. We don't have those things anymore. Instead, we have these conditional probabilities, the sparse set of what I call directed conditional probabilities. I'll explain the directedness in a moment. But these directed conditional probabilities are exactly the kinds of ingredients that show up in the literature on causal modeling. When you do probabilistic causal modeling, you've got"
    },
    {
      "end_time": 5837.602,
      "index": 224,
      "start_time": 5812.483,
      "text": " Random variables, these are the things that can change and they've got links between them that describe causal relationships and those causal relationships take the form of conditional probabilities. The probability of B having certain values given that these other variables have their values and we would say therefore that those variables causally influence B. This is exactly the language in which the laws are formulated in indivisible stochastic formulation of quantum mechanics."
    },
    {
      "end_time": 5867.756,
      "index": 225,
      "start_time": 5837.91,
      "text": " So you might think, well, they're phrased in a way that provides a very hospitable domain for talking about causal relationships. Maybe we should read those conditional probabilistic relationships through a causal lens. And now you have the opportunity that maybe you could build a theory of microphysical causation out of these ingredients. They're no longer based on a Laplacian paradigm of differential equations. Now they're based on exactly the kinds of"
    },
    {
      "end_time": 5897.756,
      "index": 226,
      "start_time": 5868.131,
      "text": " conditional relationships that we might think have a causal gloss to them. So in one of my later papers, this is the paper New Prospects for a Causally Local Formulation of Quantum Theory, I run with this. I say, okay, well, let's take these and use these to talk about causal influences between things. And now let's say that what it means for a theory to be causally local is that when you have two systems that are at space-like separation, they're far enough apart that they can't influence each other,"
    },
    {
      "end_time": 5922.568,
      "index": 227,
      "start_time": 5898.131,
      "text": " then there's a clean factorization of the conditional probabilities between them. Right. And I'm phrasing this very vaguely because it's a little technical to write it down, but you can read the paper. But this is basically proposing an actual is taking a stand. It's proposing, like proactively proposing a theory of of of microphysical theory of causation and then asking on that theory of microphysical causation,"
    },
    {
      "end_time": 5931.254,
      "index": 228,
      "start_time": 5924.599,
      "text": " Do we get non-local causal influences in the EPR experiments in particular? And the answer is we don't."
    },
    {
      "end_time": 5959.787,
      "index": 229,
      "start_time": 5932.227,
      "text": " So you can read this. This is in the paper. I also have some talks online. People can go and they can watch the talks where I go through all the technical details. I very precisely define what I mean by causal influences based on these conditional probabilities. And then I carefully define what I mean for two things to be causally independent of each other. And I define what I mean rigorously to say that two things are not exerting a non-local causal influence on each other. And then I carefully go through the EPR experiment. And I"
    },
    {
      "end_time": 5990.828,
      "index": 230,
      "start_time": 5960.896,
      "text": " And I show that in the EPR experiment, there is a causal influence that goes from the instantiation of the entangled pair to the two particles, which makes sense because the instantiation is in their past light cone. But there is no causal influence that goes from whatever Alice does to whatever Bob does. So I'll put a link to all of your talks on screen and in the description as well. And maybe at some point,"
    },
    {
      "end_time": 6014.036,
      "index": 231,
      "start_time": 5991.101,
      "text": " When you have another talk planned, I'd like you to give it on tow so that people can see some of the math behind what you're saying. That would be very cool. I hope that was all somewhat clear and understandable. Well, many people have questions about Bell, so I'm glad that you were able to give this explanation. Yeah. So that's a brief summation of how to think about Bell's theorem, but it's a general"
    },
    {
      "end_time": 6037.722,
      "index": 232,
      "start_time": 6014.957,
      "text": " It's a kind of care one has to take whenever approaching any theorem about physics, any physical theorem. It's not enough to check that the theorem is mathematically sound as a mathematical argument. You have to ask, do the things it refers to out in the world, the reference, are they rigorously defined? And in the case of Bell, he needs local causation. Are those terms sufficiently well defined? And I would argue they're actually not."
    },
    {
      "end_time": 6063.814,
      "index": 233,
      "start_time": 6038.712,
      "text": " And then you have to worry about the connection between those reference and the mathematical ingredients. Is that sufficiently established? That's where there's a weakness. If Bell's definition of causation is not sufficiently rigorously established, then the theorem just doesn't have any teeth. And if you can provide a theory of microphysical causation and a theory of what it means on that theory of microphysical causation for things to not be able to causally influence each other non-locally,"
    },
    {
      "end_time": 6094.77,
      "index": 234,
      "start_time": 6065.52,
      "text": " Then that's all you have. If people still don't like it and still think, well, it still seems there's too much correlation. Well, maybe that doesn't feel great. Maybe it's unintuitive, but it's not a source of brokenness. Okay. Now, this all leads to the question, what do we even mean by it? What is entanglement in this picture? So this indivisible stochastic picture, like what is going on in entanglement? If there's no state vector, if there's no superposition actually happening, what do we mean by entanglement? There's actually a very nice picture of what's going on with entanglement now."
    },
    {
      "end_time": 6124.48,
      "index": 235,
      "start_time": 6095.162,
      "text": " Suppose I start with two systems. Think of two particles, let's say, or two qubits, two simple systems. And suppose that these systems initially are independent of each other, they have their own configurations, they are not interacting with each other in any way. Well then, according to the indivisible stochastic approach, by definition they're going to have their means to be independent and not interacting is that they have their own indivisible stochastic laws. Now let's suppose that there's a certain time, we'll call this time"
    },
    {
      "end_time": 6149.462,
      "index": 236,
      "start_time": 6124.872,
      "text": " T prime. At this time, T prime, they interact in some way. And because interactions happen locally, whether you're doing quantum mechanics or not, they have to be nearby each other or sharing some intermediary in order to communicate, but some way they begin to interact. What does that interaction mean? Well, even in Newtonian physics, when two systems are interacting, they no longer have their own separate potentials anymore. There's one potential function for both of them that doesn't factorize."
    },
    {
      "end_time": 6176.817,
      "index": 237,
      "start_time": 6149.906,
      "text": " In the indivisible stochastic approach, the interaction is represented by the fact that now there's an overall stochastic dynamics for the two systems, and that overall stochastic dynamics does not factorize while they're interacting. Now, what you might imagine happens is once you separate the systems and take them to far distance separations in space, that they'll have their own separate stochastic dynamics now. And that's what would happen in the Newtonian case? In the Newtonian case, but it doesn't happen here. And it doesn't happen here because"
    },
    {
      "end_time": 6194.275,
      "index": 238,
      "start_time": 6177.449,
      "text": " The overall stochastic map is indivisible. It goes all the way back to when they first like before they interacted. It cumulatively encodes all the statistical effects between before they interacted and all future times. And if there was a moment when it stopped factorizing, it's not going to start factorizing again."
    },
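The indivisibility being described can be given a rough numerical illustration (an assumed toy model for this transcript, not the construction from Barandes' papers): build a transition matrix for each time interval by mod-squaring the entries of a rotation unitary. Each such matrix is a perfectly valid stochastic matrix, yet the matrix for the whole interval is not the product of the matrices for its sub-intervals, so the dynamics cannot be restarted at an intermediate time.

```python
import numpy as np

def gamma(theta):
    """Column-stochastic matrix from mod-squaring a rotation unitary."""
    u = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return np.abs(u) ** 2

t, s = 0.8, 0.3
full = gamma(t)                       # map from time 0 directly to t
composed = gamma(t - s) @ gamma(s)    # chaining through a midpoint s

# Each factor is a legitimate stochastic matrix (columns sum to 1)...
assert np.allclose(gamma(s).sum(axis=0), 1.0)
assert np.allclose(full.sum(axis=0), 1.0)

# ...yet the full map does not factor through the intermediate time:
print(np.allclose(full, composed))    # False: the process is indivisible
```

A Markov chain, by contrast, would satisfy exactly this composition law at every intermediate time; its failure here is what the transcript means by the map cumulatively encoding the whole interval.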
    {
      "end_time": 6215.162,
      "index": 239,
      "start_time": 6195.503,
      "text": " So the two systems will not have their own separate laws. There will be one overall indivisible stochastic dynamics that's not factorizable for the two systems. But there's a common cause. The common cause was their interaction. But the common cause is not the kind of common cause that would be plugged into Reichenbach's principle of common causes."
    },
    {
      "end_time": 6243.916,
      "index": 240,
      "start_time": 6215.657,
      "text": " Now, if you have an agent, if you want, Alice or Bob, or an environment, or even just one of those little qubits we talked about, the detector bit that we did when we were talking about the double set experiment, that interacts with one of the systems and reads off its configuration, at some later time, T prime prime, T double prime, later, when they're far apart, it will produce a division event. That division event will let us restart the overall stochastic dynamics, but the systems are now separated."
    },
    {
      "end_time": 6272.568,
      "index": 241,
      "start_time": 6244.241,
      "text": " And so when the indivisible stochastic dynamics restarts, starts cleanly, they're no longer interacting. It's going to begin factorized and it will remain factorized. And this is the breaking of entanglement. So the two systems are not initially interacting. They have their own separate indivisible stochastic dynamics. We would say they're not entangled. They begin interacting for some amount of time during the interaction and after until the next division event. They no longer have their own separate indivisible stochastic dynamics that factorizes."
    },
    {
      "end_time": 6297.961,
      "index": 242,
      "start_time": 6273.353,
      "text": " Then when there's a division event later on, once they're far separated and we can restart the stochastic evolution, we can stop, look at what configurations they're in, and then write down new laws for them. They're separated now. Now they'll have their own independent laws again, and that's the breaking of entanglement. Notice this is a picture of entanglement phrased entirely in terms of ordinary probability theory with no Hilbert spaces. This is the claim that's going on. This is what's happening under entanglement."
    },
    {
      "end_time": 6320.879,
      "index": 243,
      "start_time": 6298.353,
      "text": " And this picture of what's happening with entanglement comports with this microphysical theory of causation I was describing before, a theory that does not permit whatever agent or environment or measuring system acting on one system having a causal influence at space like separation on the other. So that's what I would be saying is happening with entanglement. It's a picture of entanglement at the level of ordinary probability theory."
    },
    {
      "end_time": 6351.852,
      "index": 244,
      "start_time": 6322.346,
      "text": " Whether you call it classical probability theory is subtle. It depends on whether you think that indivisibility is a classical property or not. But it's certainly ordinary probability theory and it doesn't require Hilbert spaces and so forth. So that's that's one way to think about how entanglement is ultimately happening at the at the sort of deeper level of the indivisible stochastic process. So you mentioned that the stochastic dynamics somehow encoded had the memory. I know you don't like this word memory, but somehow encoded what happened before."
    },
    {
      "end_time": 6378.268,
      "index": 245,
      "start_time": 6352.295,
      "text": " into itself. So yeah, if I was to think of that as information that's being encoded, well, information, if you accumulate enough of it, you form a black hole in a small enough region. So does this mean that if entangled particles are entangled for long enough, then they'll just form a black hole because the dynamics between them encode so much information? Help me decode this question. Yeah. So I think, um,"
    },
    {
      "end_time": 6403.951,
      "index": 246,
      "start_time": 6380.452,
      "text": " There is a sense in which the overall indivisible stochastic map is encoding sort of cumulative like statistical connections. But even that, I mean, I'm really sort of fishing for metaphors here when I say that because it's not really memory in the traditional sense. Again, a traditional non-Markovian process, the way we usually talk about non-Markovian processes, we have this hierarchy, this tower"
    },
    {
      "end_time": 6434.121,
      "index": 247,
      "start_time": 6404.224,
      "text": " Hi,"
    },
    {
      "end_time": 6454.258,
      "index": 248,
      "start_time": 6434.701,
      "text": " description of the later configuration of the system depends on its initial configuration and that can happen in the past. It's not that information is being encoded in a literal sense. It's not the kind of information that"
    },
    {
      "end_time": 6481.118,
      "index": 249,
      "start_time": 6454.65,
      "text": " you know, the Bekenstein bound would say, could saturate the maximum amount of information that could happen in some region of space and might lead to the formation of like, it would exceed how much information you could have and might necessitate a black hole forming. So I would just say that I don't think that information, it's not, it's not information, I think, in in, in the sense of like, it's encoded on physical qubits in space, that would back react on on space time and have gravitational effects."
    },
    {
      "end_time": 6506.681,
      "index": 250,
      "start_time": 6481.8,
      "text": " It's just the laws are a little weird and stranger than we might have thought. I see. Okay, so tell me about the loss of phase information. We talked about this off air, but explain this on there. So one question you might ask is, okay, well, when I do this change of representation between the stochastic process that has no complex numbers in it, no phases, but indivisible dynamics, and I go to this sort of quantum system where I've got phases and all that sort of thing, right?"
    },
    {
      "end_time": 6532.534,
      "index": 251,
      "start_time": 6507.739,
      "text": " It seems like the phase information is really important. I mean, we need it in order to make predictions about interference. How could it be missing from the indivisible side? Well, the point is it's not missing. The phases on the Hilbert space side are just the indivisibility on the, so they're there. They're just manifesting somewhat differently. But even then you might say, well, but come on. I mean, I can indirectly measure those phases if I"
    },
    {
      "end_time": 6559.65,
      "index": 252,
      "start_time": 6532.927,
      "text": " like take a unitary time evolution matrix that I'm using to describe evolution on the Hilbert space side and I like mod square the entries and I lose all the phase information. How can that possibly still capture the same information? How can it possibly do it? And the answer is in this picture when you model a measurement process you have to bring the measuring device in just like Bohm did when he was writing those later chapters in his"
    },
    {
      "end_time": 6588.49,
      "index": 253,
      "start_time": 6559.991,
      "text": " 1951 textbook on the measurement process or in his Bohmian mechanics pair of papers in 1952. You have to bring the measuring device in and when you do that and describe the whole thing as one giant indivisible stochastic process, you don't need the phases. You just run the overall indivisible stochastic process with the measuring device and it will probabilistically end up in one of its measurement reading outcome configurations with probabilities that agree with the predictions of the Born Rule."
    },
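The claim about mod-squaring away the phases can be checked in a tiny case. A minimal sketch, assuming a 2-level system prepared in a definite configuration: the Born-rule probabilities computed from a unitary carrying an arbitrary phase agree exactly with the probabilities from the phase-free, mod-squared stochastic matrix.

```python
import numpy as np

# A 2x2 unitary with an arbitrary phase thrown in.
phi = 1.234
c, s = np.cos(0.6), np.sin(0.6)
U = np.array([[c, -s * np.exp(1j * phi)],
              [s * np.exp(-1j * phi), c]])
assert np.allclose(U.conj().T @ U, np.eye(2))  # check unitarity

Gamma = np.abs(U) ** 2          # mod-square the entries: phases gone

psi0 = np.array([1.0, 0.0])     # definite initial configuration
born = np.abs(U @ psi0) ** 2    # textbook Born-rule probabilities
stoch = Gamma @ psi0            # prediction from the stochastic matrix

print(np.allclose(born, stoch))  # True: the phases were never needed
```

For superposed initial states the phases do matter; the point in the transcript is that once the measuring device is included and the whole chain is run from definite configurations as one overall indivisible stochastic process, the mod-squared data suffice.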
    {
      "end_time": 6615.674,
      "index": 254,
      "start_time": 6589.787,
      "text": " And then the phases are immaterial. You don't need them. If, however, I want to excise the measuring device from my formal description of the system, if I don't want to deal with the whole measuring device, if I just want to remove it and just look at the subject system and ignore the measuring device, treat the measuring device as kind of like a background character, not someone who's in the foreground of the story, then I need the phases to make predictions."
    },
    {
      "end_time": 6641.101,
      "index": 255,
      "start_time": 6616.118,
      "text": " And then I would replace the detailed physical measurement process with a Von Neumann-Lüder's collapse. I would use the textbook Dirac-Von Neumann axioms. So what I'm saying is, the textbook Dirac-Von Neumann axioms aren't going away. We're just identifying them as describing a certain regime of validity. When you're doing a standard measurement with a big measuring device on some microscopic system,"
    },
    {
      "end_time": 6662.227,
      "index": 256,
      "start_time": 6641.869,
      "text": " You could model the whole thing and include the measuring device and do everything and then you don't need all those phase factors. You can just run the whole thing as some overall giant stochastic process and you'll get the right answer. This is all done out in detail in the first paper, the stochastic quantum correspondence paper. But if we don't want to go to all that trouble, if we want to simplify our description and ignore the measuring device, treat it as a background character and just focus on the system in question,"
    },
    {
      "end_time": 6691.22,
      "index": 257,
      "start_time": 6663.473,
      "text": " Then, and the system is microscopic, so we don't run into the ambiguities that we might run into, well then we can ignore the measuring device, we can treat the measurement as an instantaneous collapse process, and then we do need to worry about those phase factors. So the phase factors are a way of encoding not just the indivisibility, but also the unseen measuring device. That's one way to think about what happens to those phase factors. This sounds like Copenhagen still. So how is this not Copenhagen? Okay, so"
    },
    {
      "end_time": 6721.271,
      "index": 258,
      "start_time": 6691.664,
      "text": " I mentioned that Heisenberg wrote a lot of philosophy. He wrote a book called Physics and Philosophy, and it's a chapter in his book, Physics and Philosophy, which we can also link to. People can find it. And he's a chapter called the Copenhagen Interpretation. He describes what he saw as the Copenhagen Interpretation. Now, there's not agreement or consensus on exactly what the Copenhagen Interpretation means, and different people who are responsible for what we think of as the Copenhagen Interpretation had somewhat different views on it. Let me just describe how Heisenberg basically described it. He said that"
    },
    {
      "end_time": 6748.814,
      "index": 259,
      "start_time": 6721.954,
      "text": " He basically said, well, Kant told us that our human brains can only understand the world in certain ways. We understand the world in terms of three-dimensional geometry and cause and effect, but there are certain things that we just understand. This is how our brain is supposed to work. And the quantum world simply doesn't work in those ways. It doesn't work in ways that our brains can understand. The classical macroscopic world does, and we have good theories for the classical macroscopic world. We've got classical mechanics, classical physics."
    },
    {
      "end_time": 6774.718,
      "index": 260,
      "start_time": 6749.497,
      "text": " The microscopic world is simply beyond our comprehension. So we use the mathematics of quantum mechanics, Hilbert spaces, wave functions, the Schrodinger equation, not because we think the world literally is these things, the wave function is real, but merely because they just give us a formal instrumentalist, meaning just a tool set for making predictions. They give us a set of mathematical tools for predicting what will happen"
    },
    {
      "end_time": 6795.367,
      "index": 261,
      "start_time": 6775.435,
      "text": " Back on the macroscopic classical scale, a big macroscopic system sets up the experiment, a big macroscopic measuring device measures it. What's happening in between, we have no ability to understand. We use the weird mathematics of quantum mechanics to make the predictions about what will happen. But really, at the end of the day, everything has to then show up in some classical results."
    },
    {
      "end_time": 6821.374,
      "index": 262,
      "start_time": 6796.493,
      "text": " And that's the picture that's the opening interpretation, at least according to Heisenberg. And he had some words he said about where the probabilities came from. He's like, well, there's an uncertainty principle, and for big macroscopic systems, we're all kind of uncertain. And when microscopic systems interact with macroscopic systems, that's where the probabilities come in. He had a somewhat more sophisticated picture about all of this. And people can go and read his chapter on all of this. This is not Copenhagen because I'm not"
    },
    {
      "end_time": 6848.439,
      "index": 263,
      "start_time": 6821.596,
      "text": " practicing the same kind of quietism about the micro world that he was practicing. I'm not saying we don't know what's happening in the micro world. I'm not saying we just basically only have classical physics and then the micro world is inscrutable to us. We need this other theory to describe the micro world and all it does is make predictions. I'm saying the micro world has an ontology. I'm saying that classical things have physical configurations, measuring devices have physical configurations. Measuring devices are emergent from atoms"
    },
    {
      "end_time": 6872.619,
      "index": 264,
      "start_time": 6848.797,
      "text": " And that's okay now because the atoms also have an ontology. The atoms really exist. They really do have configurations. And when you're doing the experiment, the particles are really doing things. They're moving in particular ways. The laws are these indivisible stochastic laws, which are a little bit unintuitive, but things are really happening between the measurements. And now I can hopefully tell, at least in broad outlines, a picture of emergence, a story about emergence."
    },
    {
      "end_time": 6888.916,
      "index": 265,
      "start_time": 6873.541,
      "text": " where we have the particles or whatever the ontology is, fields particles, whatever, and then larger macro scale things emerge from them the way, in spirit at least, that fluid water emerges from water molecules. The Copenhagen interpretation doesn't do that."
    },
    {
      "end_time": 6914.411,
      "index": 266,
      "start_time": 6889.292,
      "text": " You can't talk about how the classical world is emergent because the Copenhagen interpretation practices quietism about the micro world. It doesn't say what is there in the micro world. It doesn't posit any kind of substrate, any lower level reality, physical reality, out of which the emergence of classical things is supposed to happen. So these are all ways in which this picture is distinct from the Copenhagen interpretation. And of course the Copenhagen definition also has this weird"
    },
    {
      "end_time": 6943.729,
      "index": 267,
      "start_time": 6914.838,
      "text": " unspecified boundary between what is quantum and microscopic and what is classical and macroscopic. This is the so-called Heisenberg cut. There's a threshold above which you're classical and below which you're quantum and that's a murky line and people have debated whether it's really there or whether the idea is you can move it around but in any event it's not part of the indivisible stochastic approach. Are electrons single particles? Are they composite or are they point particles in your picture?"
    },
    {
      "end_time": 6973.814,
      "index": 268,
      "start_time": 6944.411,
      "text": " I don't know what they're made out of. Our best theory, the standard model, describes electrons as not composites. So I don't know if they're made of anything else. I mean, there's also this interaction between electrons and Higgs field, which is, you know, complicated. But they're not any more or less composite in this picture than they would be according to the standard model. Okay. So something I'm interested in is research. What open questions does this pose? Where can people come in to help you with this theory? Yeah."
    },
    {
      "end_time": 6996.988,
      "index": 269,
      "start_time": 6974.258,
      "text": " What I find exciting about this project is it doesn't often happen that you stumble on like a blank canvas in an area of what you might have thought was settled fundamental physics. Where you can ask questions that really have no answers yet and there are a lot of directions that people can pursue when it comes to research."
    },
    {
      "end_time": 7026.852,
      "index": 270,
      "start_time": 6999.206,
      "text": " This project opens up a lot of these directions. One of them is just the mathematics of this new class of processes, these indivisible stochastic processes, which only showed up in the research literature in like 2021 in this review article by Simon Mills and Kavan Modi, which we can also link to people can look at it. It shows up in this sort of figure in their paper. There's a figure five in this paper. You know, mathematics has all these very simple ideas like"
    },
    {
      "end_time": 7056.664,
      "index": 271,
      "start_time": 7029.087,
      "text": " functions, matrices, limits, derivatives, that are reasonably simple to define, but yet have profound implications. It's not super often that you see relatively simple ideas, simple mathematical ideas that have big applications and ramifications. Indivisible stochastic processes are a fairly simple idea that I guess people just didn't really think about."
    },
    {
      "end_time": 7083.456,
      "index": 272,
      "start_time": 7057.21,
      "text": " And so there's just some interesting work to be done on trying to understand the mathematics of these processes. That could be interesting work for someone interested in math, applied math, theory of stochastic processes. We talked about how you would model real world systems like quantum field theories, like the standard model. There's a lot of work to be done in taking this picture and applying it to systems that show up in solid state physics and high energy physics and the standard model."
    },
    {
      "end_time": 7111.834,
      "index": 273,
      "start_time": 7084.258,
      "text": " to make sure it works for one thing and also to see if it reveals any interesting features of these theories that might have been difficult to see otherwise. Dynamical symmetries are a really important subject in physics. Dynamical symmetries show up in a very interesting way in this approach and so there's a lot of work to be done there. There are old problems in statistical mechanics. So one of the outstanding problems in the philosophy and foundations of statistical mechanics is"
    },
    {
      "end_time": 7140.213,
      "index": 274,
      "start_time": 7112.892,
      "text": " Where do the probabilities in system mechanics come from? In classical system mechanics, you're imagining you've got particles, like a gas is made of particles. The particles are all evolving because it's classical according to Newtonian mechanics, the rules of Newtonian mechanics. But Newtonian mechanics is not a probabilistic theory. There's this lovely argument by the philosopher of physics, David Elbert, that there's nothing whatsoever in the laws of Newtonian physics that would preclude a bunch of rocks"
    },
    {
      "end_time": 7167.056,
      "index": 275,
      "start_time": 7141.015,
      "text": " spontaneously falling together to form a bunch of statuettes of the royal family. You might think that's impossible, but it's not impossible. I mean, after all, you could start with statues of the royal family and have them crumble into rocks. And because Newtonian physics is time-reversal and variant, the opposite should be possible. And yet we would just say that's unlikely somehow. But Newtonian mechanics doesn't come with probabilities. So where do those probabilities come from?"
    },
    {
      "end_time": 7197.756,
      "index": 276,
      "start_time": 7168.592,
      "text": " One argument is the probabilities come from the initial state of the universe. The universe began in some initial state but of course there was one initial state of the universe, not a probabilistic collection of initial states. So there's some work to be done in understanding how we go from the beginning of the universe in some sense to some notion of probability distribution and it has to be the right kind of probability distribution. On the one hand it should be the kind of probability distribution that doesn't lead to rocks forming the statuettes of"
    },
    {
      "end_time": 7221.493,
      "index": 277,
      "start_time": 7198.012,
      "text": " of the royal family, because we don't see that around us. We don't see that happening. We look around and we don't see rocks spontaneously assembling into statuettes of the royal family. And so we hopefully we were looking for some kind of explanation for why we don't see that happening. Oh, what I mean is, if you were to wait around for long enough, wouldn't you see it? Maybe, but only if the set of possibilities is bounded in the right sense."
    },
    {
      "end_time": 7250.708,
      "index": 278,
      "start_time": 7222.108,
      "text": " If the number of possible configurations of the universe is unbounded, there's no requirement you ever have to revisit or visit every possibility. If there's only a bounded, a so-called compact space of possibilities, then there are arguments that eventually you have to get recurrences or you have to visit everything. But in any event, in the time we've had since our universe has existed, we have not seen that happen. We've not seen rocks spontaneously form. What I mean is, even if we have this space that's not bounded,"
    },
    {
      "end_time": 7279.121,
      "index": 279,
      "start_time": 7251.169,
      "text": " some events will occur that will be extremely, extremely unlikely. Yes. Of the same order of magnitude, if not greater than the royal family. That's right. But we don't expect them to happen all the time. Right? We live in a universe where they happen, but only rarely, not all the time. It'd be very weird if this were happening all the time all around us. How do we explain why it's not happening all the time around us? Somehow this is connected with how the universe began. The universe began in some kind of configuration that was very"
    },
    {
      "end_time": 7300.52,
      "index": 280,
      "start_time": 7279.377,
      "text": " typical in some sense. It was very generic. It was very boring. It didn't have the very special arrangements that would lead to us seeing strange, unlikely things happening all the time. But we can't make it too typical. Because there's some sense in which the most typical initial configuration is just very random and in some loose sense, very high entropy."
    },
    {
      "end_time": 7321.186,
      "index": 281,
      "start_time": 7301.084,
      "text": " We actually need the initial beginning of the universe to begin in a low entropy configuration so that we get a well-defined thermodynamic arrow of time. David Albert calls this the past hypothesis. So there's something mysterious going on about the beginning of the universe if you're living in a deterministic universe where the laws are deterministic because how else do we get probabilities out? They must come"
    },
    {
      "end_time": 7349.224,
      "index": 282,
      "start_time": 7321.664,
      "text": " from some statement about the initial conditions, but those initial conditions of the universe must be such that we began in low entropy and are rising toward high entropy, and yet are typical enough that we don't see surprising things happening all the time. In a theory in which the laws themselves are probabilistic, stochastic, we don't have the same kinds of problems. If the laws are themselves stochastic, we're getting probabilities out of the laws. We don't need to get them out of the initial conditions of the universe."
    },
    {
      "end_time": 7378.285,
      "index": 283,
      "start_time": 7349.718,
      "text": " So this gives a whole other way to think about where the probabilities of statistical mechanics can come from. Now, one might ask, okay, does that mean that all statistically fluctuating things in statistical mechanics and in thermodynamics are ultimately quantum mechanical in origin? That's not quite the way I would phrase it. The way I would say it is we need some source of probabilities in order to get things like"
    },
    {
      "end_time": 7408.524,
      "index": 284,
      "start_time": 7378.831,
      "text": " To get statistical mechanics off the ground, you need some statement like, all else equal, all of the possible configurations or states of a system that are energetically accessible are in some sense equally probable, right? The technical term for this assumption is that we're assuming the microcanonical ensemble, but it's basically all else equal if a system can have lots of states and they're all available, the system can get to them, we should treat them all as being equally probable unless we have some good reason to think otherwise. How do we get that off the ground?"
    },
    {
      "end_time": 7436.169,
      "index": 285,
      "start_time": 7409.138,
      "text": " There were arguments for a while that maybe systems just rapidly oscillated and changed, even according to Newtonian mechanics, in a way that was called ergodic. Ergodic systems are systems that rapidly explore their possibility or state space. Very rapidly. So rapidly that you can sort of pretend that the system is equally likely to be in any of its states. Unfortunately, proving that systems are ergodic is very hard. And there are many systems that are known not to be ergodic. So the ergodic hypothesis turns out not to hold for a lot of systems."
    },
    {
      "end_time": 7464.872,
      "index": 286,
      "start_time": 7437.227,
      "text": " There have been some information theoretic arguments to get this off the ground. But then you run into some very deep questions like, if the probabilities are all just in my head, how can the probabilities actually lead to coffee boiling or something like that? It feels like the probabilities should somehow be out there in nature because they seem to be doing physical work in some general sense. So information approaches toward trying to derive the equal probability of all the microstates is very hard."
    },
    {
      "end_time": 7494.957,
      "index": 287,
      "start_time": 7465.538,
      "text": " These, but theories that have probabilities in the laws provide a different way to get probabilistic behavior at this sort of necessary level. Once you've got this probabilistic behavior and can talk about Boltzmannian statistical mechanical systems, you can then take these Boltzmannian statistical mechanical systems with sort of all the states being assigned probabilities in roughly equal amounts. You can couple them together. You can take big, big, big systems called reservoirs, which model the environment and little systems."
    },
    {
      "end_time": 7517.346,
      "index": 288,
      "start_time": 7495.299,
      "text": " And you can, from these interactions, derive notions like thermal equilibrium at some temperature. And then you can derive what's called the canonical ensemble, which is the probability distribution we would associate to a system that is energetically interacting with a larger, a very large environment called a reservoir."
    },
    {
      "end_time": 7536.681,
      "index": 289,
      "start_time": 7518.063,
      "text": " And these systems will exhibit fluctuations that are thermal fluctuations and those thermal fluctuations are distinct from quantum mechanical fluctuations. So there's like a higher level of fluctuations, thermal fluctuations that you get for these systems. It's not that you need the indivisible stochastic approach to explain that higher level of emergence of thermal fluctuations."
    },
    {
      "end_time": 7566.971,
      "index": 290,
      "start_time": 7537.142,
      "text": " earlier when you said that it's not just all in our heads because the water is boiling and doing something. Are you referring to that some people think randomness is about our ignorance?"
    },
    {
      "end_time": 7585.179,
      "index": 291,
      "start_time": 7567.705,
      "text": " Right. Right. Yeah. So one way to think about probability is that probability is objective chance type probability, that nature is really behaving in kind of a chancy, unpredictable way, that phenomena are happening in an unpredictable way. Another view is that the probabilities are all in our heads."
    },
    {
      "end_time": 7611.101,
      "index": 292,
      "start_time": 7585.811,
      "text": " Right. When we assign probabilities to things, we're talking about what are called subjective credences, credences or degrees of belief. When we assign probabilities to things, we're not saying the probabilities are really out there in any sense. We're just describing like our belief in whether something is actually a particular way or not. And there's a relationship between objective chancy probabilities and subjective credence probabilities. It's most famously formulated as what David Lewis called his principle principle"
    },
    {
      "end_time": 7638.814,
      "index": 293,
      "start_time": 7612.125,
      "text": " The first principle is principle, P-A-L, and the second is principle, P-L-E, which is just to say that if you happen to know the objective chance for something and you condition on that, then your credence should be equal to the objective chance. There's a connection between objective chance and credence. But in these sorts of pictures, we acknowledge that there are different kinds of probability. There are objective chance probabilities. There are subjective credence probabilities."
    },
    {
      "end_time": 7660.094,
      "index": 294,
      "start_time": 7639.206,
      "text": " From time to time people have tried to say there is only one kind of probability. Maybe all there is is just subjective credence probability, and there is no fundamental objective chance probability, or vice versa, I guess. Maybe we'll talk a little bit about that in the context of Everettian quantum mechanics in a little bit, because it does show up in that context."
    },
    {
      "end_time": 7691.101,
      "index": 295,
      "start_time": 7661.305,
      "text": " But the question is, if all probability is really just subjective credence probability, then how can subjective credence probability in our heads underlie Boltzmannian statistical mechanics, which underlies thermodynamics and thermal fluctuations and all this stuff that happens in the world around us? I mean, if you just happen to know the exact specific state of a system, and now that specific state has 100% probability or nearly 100% probability, have we just mentally, like, now that my knowledge has changed, I've changed all the probabilities, and so that suddenly makes thermodynamics stop working."
    },
    {
      "end_time": 7717.21,
      "index": 296,
      "start_time": 7691.817,
      "text": " That's obviously too quick a statement, but there is a little bit of a mystery here around like could it be just that all the probabilities are heads or is there something random in some sense happening actually in the physical world? The reason this is very tricky is because coming up with a self-consistent, unambiguous, rigorous theory of objective chance turns out to be very hard."
    },
    {
      "end_time": 7747.807,
      "index": 297,
      "start_time": 7717.961,
      "text": " That's one reason why people have retreated to thinking that probability is all credence, because if it's credence, it's okay if it's not perfectly rigorous. Objective chance probability is very hard to specify. It runs into all kinds of basic problems. What does it mean to say that some thing out in the world objectively has a chance of 72% of .72? You might say, well, it means that in the long run,"
    },
    {
      "end_time": 7773.524,
      "index": 298,
      "start_time": 7749.241,
      "text": " times 72% of the time it will come out a certain way, but that's actually not true, right? If you take a coin, for example, and you believe the coin is a 50 50 coin, and you flip it 10,000 times, it's not going to be if it's a fair coin, it's not going to be heads 5000 times, it'll be heads a little off of 5000 times."
    },
    {
      "end_time": 7801.084,
      "index": 299,
      "start_time": 7774.036,
      "text": " But if you think about it hard enough, you realize, but actually there's a chance it could be heads every time. It's very unlikely it could be heads every time, but it could be heads every time. And if you try to say something like, well, okay, we need to take some kind of limit, maybe in the limit as the number of flips goes to infinity, it's like exactly 50%. But that's not how limits work. What about if the coin has a propensity to be 50-50? Well, propensity theories of chance are tricky because what is a propensity?"
    },
    {
      "end_time": 7818.217,
      "index": 300,
      "start_time": 7802.602,
      "text": " A tendency to yield results 50% of the time, but you see it's like circular. It's very hard to pin down what you mean by propensity. Propensity theories of chance say that they're just certain objects that they want to do something in a certain way."
    },
    {
      "end_time": 7846.869,
      "index": 301,
      "start_time": 7819.445,
      "text": " But then, what does the 50% mean? Are you saying that they want to do it a certain way, this fraction of time, but then we run into the same problems we have here? This theory of probability that it's about frequency ratios, whether they're propensity, like they're coming from the object or they're from the laws or whatever, that they're about the frequency with which you get certain results is called frequentism. And frequentism is tough to make rigorous. You might say, well, just take the limit n goes to infinity, take the number of trials to infinity, but that's not how limits work."
    },
    {
      "end_time": 7875.742,
      "index": 302,
      "start_time": 7847.534,
      "text": " A limit, when you say that a certain sequence of things has a certain limit, what you're saying is that if you go beyond a certain term in the limit, you go beyond the nth term in the limit, then all the later terms are closer to the claimed limiting value than whatever, you give me some error, some epsilon, I can find a far enough distance along the sequence that everybody farther along is closer to the claimed limit than epsilon."
    },
    {
      "end_time": 7905.401,
      "index": 303,
      "start_time": 7876.357,
      "text": " If you make Epsilon smaller, I just go farther down the line. If you make Epsilon smaller, I go farther down the line. As long as I go far enough down the line, everything later down the line will be closer to the limit than Epsilon. Probability doesn't work that way. Frequentist probability doesn't work that way. There's no number of times you can flip a coin that will make, for sure, its frequencies closer to 50%. You could flip a coin a billion times and it could land heads every single time. It's unlikely, but it could happen."
    },
    {
      "end_time": 7934.104,
      "index": 304,
      "start_time": 7905.913,
      "text": " Some epsilon, you can't give me any n, any number of flips that will guarantee that it will fall closer to 50% than that. You might roll your eyes and say, oh, come on, but it's unlikely to do that. It's likely to be closer than epsilon, but the word likely is probability. What you can say is that if I flip the coin enough times, I can make the probability that it is farther from epsilon, away from 50%, smaller than epsilon."
    },
    {
      "end_time": 7962.039,
      "index": 305,
      "start_time": 7935.179,
      "text": " But that's just relating one probability to another. It's a totally circular definition. The law of large numbers is phrased this way. It's just a circularity relating one kind of probability to another. And the formal way to describe this is that when you do a limit, you have to have a notion of a measure. A notion of a metric, I'm sorry. You have to have a metric. You have to have a notion of how far away something is from something else. And for probabilistic systems, the metric itself is a probabilistic metric. That's what we're using for distance."
    },
    {
      "end_time": 7984.394,
      "index": 306,
      "start_time": 7962.534,
      "text": " And so any attempt to use limits with a probabilistic metric to describe probability is going to run into the circularity injection. Nonetheless, even though we don't have a rigorous theory of frequent probability, we certainly have an intuition that when we look at a long sequence of coin flips, or a long sequence of ones and zeros, that we can distinguish a highly random sequence from a non-random sequence."
    },
    {
      "end_time": 8012.517,
      "index": 307,
      "start_time": 7985.162,
      "text": " If we look at 10,000 zeros and ones, and we discover that about 50% of them are zeros and 50% of them are ones, and furthermore, runs of zeros, runs of three or four or five zeros in a row, or ones, three or four or five in a row, occur with certain frequencies, and this sequence obeys a number of other criteria for randomness, like, you know, the various criteria for randomness,"
    },
    {
      "end_time": 8038.37,
      "index": 308,
      "start_time": 8013.251,
      "text": " There's the right kinds of lack of correlation over time. There's all these things you can run on a sequence of 10,000. We would look at that and we'd go, that to me looks like a random sequence that was generated by a 50-50 coin. It's not rigorous. You can't make it rigorous. And maybe there will never be a perfectly rigorous theory of probability at the level of like frequent probability. But when you look at a long sequence, there's at least an approximate notion that certain sequences seem to have all the hallmarks of probability."
    },
    {
      "end_time": 8068.473,
      "index": 309,
      "start_time": 8039.548,
      "text": " So maybe we don't need a theory of probability for system mechanics. Maybe it's enough to rely on randomness suitably defined. There are terms that come up when people talk about Kolmogorov complexity for characterizing how random a sequence is. Maybe we can rely on those instead of relying on probability. Maybe probability is all in our heads and what's out there in nature is something like complexity or Kolmogorov complexity or randomness. Or maybe nature is just inherently chancy. So there are a lot of ways to think about these kinds of problems."
    },
    {
      "end_time": 8094.991,
      "index": 310,
      "start_time": 8069.77,
      "text": " Now, you just mentioned measure, incidentally, but there's a problem of a measure in the many-worlds interpretation. We should talk about these other... So why bother introducing a new interpretation of quantum theory at all? Don't we already have enough interpretations? I mean, there are a lot of people who are like, we don't need any more interpretations. The world just keeps adding more and more of them. Why do we need any of them? Here is the reason I think we need a new interpretation of quantum theory."
    },
    {
      "end_time": 8119.633,
      "index": 311,
      "start_time": 8096.408,
      "text": " The existing interpretation suffer from one of the following or more than one of the following problems. Vagueness. They're vague about things they shouldn't be vague about. Or they're instrumentalist, which means they only talk about what happens in measurements, but then what are measurements and what are measuring devices and measuring devices made of things and, you know, you run into all these circularity problems. You run into measurement problems, basically."
    },
    {
      "end_time": 8150.213,
      "index": 312,
      "start_time": 8120.333,
      "text": " Or they're ambiguous when trying to deal with systems that are macro size. We've talked about like the Wigner's friend thought experiment. Once you've got systems that are of the same size as, you know, big classical measuring devices, does the theory render unique or unambiguous predictions? Or the theory is empirically inadequate. Like it works for some systems, like Bohm mechanics works pretty well for systems of fixed numbers of finitely many non-relativistic particles, but doesn't appear to be empirically adequate enough to be able to handle the standard model."
    },
    {
      "end_time": 8179.718,
      "index": 313,
      "start_time": 8151.323,
      "text": " Or finally, the theory relies on too many extra-empirical assumptions, axioms, and speculative metaphysical hypotheses. That is, to get the interpretation to work, we have to take on a whole collection of assumptions that cannot be verified empirically and that seem kind of like desperate measures or seem very far-fetched or seem difficult to justify, except that they give us the interpretation we want."
    },
    {
      "end_time": 8201.101,
      "index": 314,
      "start_time": 8180.333,
      "text": " those are the problems i think that all the existing interpretations have all of them have one of them i mean bohmian mechanics suffers from it doesn't appear to be empirically adequate the philosopher of physics david wallace who is at university of pittsburgh wrote a paper that i think characterizes very neatly he says you know the sky is blue"
    },
    {
      "end_time": 8232.022,
      "index": 315,
      "start_time": 8202.125,
      "text": " And the sky is blue and our best theory of why the sky is blue is based on what's called Rayleigh scattering. Rayleigh scattering is when you, I teach Jackson electromagnetism, we cover Rayleigh scattering. When electromagnetic radiation impinges on charged particles, the charged particles vibrate and re-radiate radiation and they do it, they radiate power according to a certain frequency dependence that favors high frequency radiation, so you get much more scattering from high frequency radiation to low radiation."
    },
    {
      "end_time": 8262.568,
      "index": 316,
      "start_time": 8234.104,
      "text": " Bohmian mechanics at this point doesn't seem capable of explaining Rayleigh scattering. And it's been around. I mean, de Broglie first introduced pilot wave theories in the late 1920s. Bohm again independently discovered them and then eventually began talking with de Broglie in the early 50s. It's been over 70 years now. And the inability of Bohmian mechanics to account for these sort of familiar"
    },
    {
      "end_time": 8285.316,
      "index": 317,
      "start_time": 8262.892,
      "text": " features of our physical world isn't as a sign of empirical inadequacy, and that's a problem. Copenhagen, well, instrumentalism, vagueness, what is it measuring? Copenhagen interpretation has lots of problems. We've talked about all of those. There are spontaneous dynamical collapse approaches to quantum mechanics."
    },
    {
      "end_time": 8293.507,
      "index": 318,
      "start_time": 8285.64,
      "text": " And there are some of those I think that are still viable that haven't been ruled out empirically. Some of them are now empirically been ruled out. That means they're not empirically adequate."
    },
    {
      "end_time": 8322.312,
      "index": 319,
      "start_time": 8293.899,
      "text": " They often involve some ad hoc choices you have to make, that you have to introduce sort of ad hoc parameters, what's the time scale of which collapse is supposed to be taking place, but some of those are still live possibilities. People are working on them, and people should work on them, and I'm not saying people should stop working on any of these things. We should see if Bohmian mechanics can be made empirically adequate. We should see if dynamical collapse approaches can work, but so far they don't yet. They don't yet work. And then there are other things that are even farther away from these things, like cubism."
    },
    {
      "end_time": 8334.121,
      "index": 320,
      "start_time": 8323.302,
      "text": " So Cubism, which comes from quantum Bayesianism, is associated with Chris Fuchs, who is at University of Massachusetts at Boston."
    },
    {
      "end_time": 8363.012,
      "index": 321,
      "start_time": 8334.514,
      "text": " Quantum Bayesianism begins with the idea that probability really is in our heads, there isn't really physical probability out there, and that quantum mechanics, the formalism of quantum mechanics, is really a methodology for dealing with uncertainty, for dealing with uncertainty about the world, and it's a particular mathematical framework that you need to use to do this. It purports to not be anti-realist, it purports to be compatible with the idea that there is in fact a fact of the matter about what's going on behind quantum mechanics,"
    },
    {
      "end_time": 8390.845,
      "index": 322,
      "start_time": 8363.387,
      "text": " but it hasn't yet been able to formulate what that picture is supposed to look like. And I feel so bad because every time Chris gives a talk at some point, you know, in the question session, I'll raise my hand and I'll ask Chris this question about, well, where's the picture? What's the ontology? What's going on? He says they're just not ready to provide that yet. I feel very bad whenever I ask him that because he's so nice and patient with me when I say these things. But so I think the problem is we kind of don't have a place to stand. Right."
    },
    {
      "end_time": 8416.681,
      "index": 323,
      "start_time": 8391.766,
      "text": " I think one view is, what's the hurry? What's the emergency? Why do we need another interpretation? Just do the textbook quantum theory, Dirac von Neumann, or Copenhagen, or, you know, Bohmian mechanics, or whatever it is you want. There's no rush. There's no problem. There are too many interpretations, actually, and I would say there are too few. We do not have a problem of underdetermination with too many viable interpretations for one theory."
    },
    {
      "end_time": 8441.613,
      "index": 324,
      "start_time": 8417.159,
      "text": " We have a problem of overdetermination or at least a potential problem. We don't have a single interpretation in my view that works, that meets all the requirements I laid out, that doesn't have these serious problems. And without one, we're in danger. We're like at sea without a raft. We need something and that's why I think that the time is due for a new interpretive approach."
    },
    {
      "end_time": 8471.92,
      "index": 325,
      "start_time": 8442.5,
      "text": " Now, I've talked about the indivisible stochastic approach. We've talked about many of its features. We've talked about open questions, and there's more open questions, right? I mean, there are potential applications to quantum simulation and quantum computing that people should think about. I mean, after all, if Hilbert space pictures are dual to stochastic pictures, that may mean that quantum hardware could be very good at simulating certain kinds of stochastic systems efficiently."
    },
    {
      "end_time": 8498.063,
      "index": 326,
      "start_time": 8472.381,
      "text": " Just a moment. It's not exactly dual because you said it's many to many. That's right. It's many to many. But the idea is that a given Hilbert space picture can describe many different stochastic systems. That's good. It may mean that with quantum hardware, we can simulate many kinds of stochastic systems that might have been difficult to simulate otherwise. So one area of inquiry people can look into is, you know, and I'm certainly thinking about this is are there applications of this picture to finding new ways to simulate"
    },
    {
      "end_time": 8524.497,
      "index": 327,
      "start_time": 8498.746,
      "text": " more general kinds of stochastic systems, especially stochastic systems outside the Markov approximation, using quantum hardware in an efficient way. And then there's more formal stuff. There's a whole formulation of quantum theory in the language of C star algebras we talked about in our first talk. What's the C star algebraic formulation that's appropriate for this kind of a theory? And do we need something like that to talk about certain kinds of physical systems? If we're not starting with Hilbert spaces anymore,"
    },
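The duality just described can be made concrete in a few lines. This is an editor's sketch, not code from the conversation: it assumes the simplest version of the correspondence, in which the squared moduli of a unitary matrix's entries form a stochastic transition matrix (a unistochastic matrix), and it uses an arbitrary 2x2 rotation as the unitary.

```python
import math

# Sketch of the Hilbert-space <-> stochastic correspondence discussed above:
# from a unitary matrix U, the matrix p[i][j] = |U[i][j]|^2 has columns that
# sum to 1, so it can serve as the transition matrix of an ordinary stochastic
# process. The mapping is many-to-many, as noted in the conversation: e.g.,
# changing phases of U's entries leaves p unchanged.

theta = 0.7  # arbitrary rotation angle (illustrative choice)
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Transition probabilities from the squared moduli of the unitary's entries.
p = [[abs(U[i][j]) ** 2 for j in range(2)] for i in range(2)]

# Each column of p sums to 1, so p is a bona fide stochastic matrix.
for j in range(2):
    col_sum = sum(p[i][j] for i in range(2))
    assert abs(col_sum - 1.0) < 1e-12
```

Different unitaries (for instance, the same rotation with extra phase factors on its entries) produce the same `p`, which is one direction of the many-to-many relationship mentioned above.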
    {
      "end_time": 8551.425,
      "index": 328,
      "start_time": 8526.015,
      "text": " Then we're not beholden to Hilbert spaces. We're not trying to build on top of them or modify Hilbert spaces. We're starting at a different place. We're starting just with ordinary probability theory. Does this lend itself to generalizations of quantum theory that would have been impossible to get to if we'd started with Hilbert spaces? So when we start with the Hilbert space, the worry is that if you modify the Hilbert space picture in the wrong way, you'll get nonsense. You'll get probabilities that are negative or probabilities that sum to more than one or things that don't make any sense."
    },
    {
      "end_time": 8571.647,
      "index": 329,
      "start_time": 8551.817,
      "text": " But if you begin with a theory phrased from the beginning in language of old-fashioned probability theory, you're not at risk in the same way that generalizations are going to lead to inconsistent or nonsense results probability. You don't need to get to probability from something else. When you start with Hilbert spaces, it could be the path you take to probability could break down."
    },
    {
      "end_time": 8599.155,
      "index": 330,
      "start_time": 8572.244,
      "text": " If you modify Hilbert spaces in the wrong way, the path to get to good probability breaks down. If you begin with probability, you're already there and you're just not at the same risk of running into inconsistencies with how you formulate probability. And finally, as we've already talked about, there could be some potential avenues for rethinking our approaches to quantum gravity. At this point, it would be great to talk about what is the many worlds interpretation and what is the fundamental problem or problems with it. Okay, so open questions, other interpretations,"
    },
    {
      "end_time": 8624.718,
      "index": 331,
      "start_time": 8599.633,
      "text": " I haven't said very much about Everettian quantum theory. What about Everettian quantum theory? What about the many worlds interpretation? Here is the story of the Everettian approach. There's a cartoon picture. In the cartoon picture of Everettian quantum theory, every time you do a quantum measurement, the universe splits into branches. You have a cat. The cat's superposition alive and dead. This is the cartoon version. You measure the cat and now you split. There's a universe in which"
    },
    {
      "end_time": 8649.514,
      "index": 332,
      "start_time": 8625.026,
      "text": " There's a you and a live cat, and there's a universe in which there's a different you and a dead cat. This is how the cartoon picture is supposed to work. And it seems kind of intuitive. And if you want to take wave functions to be fundamental, it seems like, well, this is the natural thing to do with them, if you sort of want to take them seriously. But you run into problems almost immediately with this cartoon picture. One problem is that it's not always 50-50."
    },
    {
      "end_time": 8678.78,
      "index": 333,
      "start_time": 8650.384,
      "text": " If the wave function is root two-thirds alive cat and root one-third dead cat, you still get two branches. So in what sense is one of them now two-thirds likely and one of them is one-third likely? If there are two branches, what does it mean to say that one of the two branches has a two-thirds probability and the other one is a one-third probability? How do we connect the branches with the notion of probability I was talking about before, randomness? If you've got a 50-50 random sequence, we expect to see zeros and ones according to some distribution that looks random."
    },
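The mismatch just described, two branches but unequal weights, can be sketched numerically. This is an editor's illustration, not code from the conversation: it takes the amplitudes from the example above and contrasts Born-rule weights with what a one-world random sequence of outcomes would look like.

```python
import math
import random

# The example above: amplitude sqrt(2/3) for the live cat, sqrt(1/3) for the
# dead cat. There are still only two branches, whatever the amplitudes are.
amp_alive = math.sqrt(2 / 3)
amp_dead = math.sqrt(1 / 3)

# Born-rule weights come from squared amplitudes, not from counting branches.
p_alive = amp_alive ** 2   # 2/3
p_dead = amp_dead ** 2     # 1/3
assert abs(p_alive + p_dead - 1.0) < 1e-12

# In a one-world stochastic picture, probability shows up as long-run relative
# frequency in a random sequence of single outcomes, as described above.
random.seed(0)
trials = 100_000
freq_alive = sum(random.random() < p_alive for _ in range(trials)) / trials
print(freq_alive)   # close to 2/3, while the branch count stays fixed at 2
```

The open question in the branching picture is how the number on the last line, an observed long-run frequency, is supposed to connect to a count of branches that is simply two.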
    {
      "end_time": 8697.244,
      "index": 334,
      "start_time": 8679.292,
      "text": " How do we get that picture of probability out of the branch picture of probability? This is not obvious. One thing you might try to do is argue that somehow when you have a root one-third branch and a root two-third branches, we should think of the root two-thirds branches as really two branches and there's like three branches now."
    },
    {
      "end_time": 8723.49,
      "index": 335,
      "start_time": 8697.688,
      "text": " But it turns out that branch counting arguments don't work very well. There's a well-known paper from 1989 in Analysts of Physics by Farhi, Goldstone, and Gutman called How Probability Arises in Quantum Mechanics. And you can link it. People can look at it. They try to get this sort of counting picture. You just consider infinitely or large numbers of experiments."
    },
    {
      "end_time": 8750.538,
      "index": 336,
      "start_time": 8723.865,
      "text": " large numbers of repeated trials of experiments and somehow argue that certain branches in the long run survive and others don't and you can sort of count them in some sense and this is where probability comes from. These sorts of arguments just, they fall out of favor because they don't work very well. So what do you do? Well, you could just add an axiom. You could just say axiomatically when there are branches, the Born Rule tells you what probabilities they have."
    },
    {
      "end_time": 8775.725,
      "index": 337,
      "start_time": 8751.374,
      "text": " The problem is how do we relate these probabilities back to the randomness probabilities we're talking about? Like, what does it mean to say just by fiat there's a probability here? But there's actually a deeper problem. You see, remember we talked about different bases you could use? In the Everettian approach, there's just a giant universal wave function. And there are infinitely many bases you could pick. And if you change what bases you pick, then the branches change."
    },
    {
      "end_time": 8801.237,
      "index": 338,
      "start_time": 8776.254,
      "text": " Right? All the components of the universal state vector are the branches. And if you change your basis, you change the branches. Which basis are the probabilities referring to? If there is, in fact, parallel universes with probabilities assigned to them, in which basis do we do this? This is known as the preferred basis problem. And I would add one more thing."
    },
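The preferred basis problem described above can be made concrete in a few lines. This is an editor's sketch with an arbitrarily chosen state, not code from the conversation: the same state vector, decomposed in two different bases, yields different branches with different weights.

```python
import math

# An example state a|0> + b|1>, using the amplitudes from earlier.
a, b = math.sqrt(2 / 3), math.sqrt(1 / 3)

# Branch weights if we decompose in the computational basis {|0>, |1>}:
weights_z = [a ** 2, b ** 2]                 # [2/3, 1/3]

# Rewrite the very same state in the Hadamard basis {|+>, |->},
# where |+> = (|0>+|1>)/sqrt(2) and |-> = (|0>-|1>)/sqrt(2):
c_plus = (a + b) / math.sqrt(2)
c_minus = (a - b) / math.sqrt(2)
weights_x = [c_plus ** 2, c_minus ** 2]

# Both decompositions are normalized ...
assert abs(sum(weights_z) - 1.0) < 1e-12
assert abs(sum(weights_x) - 1.0) < 1e-12
# ... but the branch weights disagree: which basis do probabilities refer to?
print(weights_z, weights_x)
```

Nothing in the universal state vector by itself singles out one of these decompositions, which is exactly the problem the dynamical-decoherence response discussed next is meant to address.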
    {
      "end_time": 8830.111,
      "index": 339,
      "start_time": 8802.125,
      "text": " Probability, when you say something is a certain probability, what you're saying is that there are n possible ways it could happen, only one of which is realized. In the Many Worlds approach, they all happen. So is this even a probability at all? Is it even coherent to talk about this using probabilistic language? And Many Worlds interpretation forces us to be skeptical about some things that we just see around us. I mean, we do experiments, we get a single outcome. The outcomes appear to be happening probabilistically."
    },
    {
      "end_time": 8860.64,
      "index": 340,
      "start_time": 8831.476,
      "text": " And the many worlds interpretation denies that that's true, right? If you're going to do that, you better have good evidence for it. OK, so. So what do you do with all these problems? One argument is to say. OK, the per basis problem is kind of a kind of problem, but but maybe nature dynamically picks out a basis. Maybe as you let the universe evolve,"
    },
    {
      "end_time": 8891.152,
      "index": 341,
      "start_time": 8861.152,
      "text": " Decoherence works out well in only one basis. There's a particular basis in which a particular way to decompose the universal wave function, so that when you decompose it in that way, decoherence gives you branches that no longer interfere with each other noticeably. I think that's Sean Carroll's argument in the Mad Dog Everett lecture, and I think you were there. It's also the view that is at the center of David Wallace's 2012 book, The Emergent Multiverse."
    },
    {
      "end_time": 8920.776,
      "index": 342,
      "start_time": 8891.561,
      "text": " This idea is that we don't presuppose a particular basis in which the branches happen. The universe just evolves and decoherence just doesn't work in most bases. But in a certain basis, we get nice, emergent, decoherent, no longer interfering branches. And that's what dynamically is the correct branching. And the branches are not fundamental. The world is not fundamental. They're not fundamentally there. They're just useful, convenient ways to describe the wave function. But now we have a problem."
    },
    {
      "end_time": 8945.998,
      "index": 343,
      "start_time": 8921.391,
      "text": " If the branches are not fundamental, if they're emergent, we can't have a probability axiom that assigns them probabilities. You see, the axioms, the fundamental axioms of your theory are supposed to refer to fundamental things. If the branches are emergent, approximate things, not fundamental things, the axioms cannot say, oh, if at some point in the future we develop these emergent approximate branches, then by axiom they'll be assigned probabilities."
    },
    {
      "end_time": 8975.503,
      "index": 344,
      "start_time": 8946.613,
      "text": " If the branches are now not fundamental, but merely emergent, merely just convenient ways to describe what's going on, then it's very difficult to think about how you would make an axiom that they should be assigned probabilities. If we're not going to get the probabilities from the axioms, we now have a fundamental problem. And this is where so much of the work in Everettian quantum theory has happened, this problem of probabilities. If the branches are emergent things, not fundamental, and we can't assign them probabilities by fiat through the axioms,"
    },
    {
      "end_time": 9005.299,
      "index": 345,
      "start_time": 8975.776,
      "text": " How do probabilities happen? Now, I think the argument I would make here is that they don't. If you were compelled to believe in an outlandish metaphysical picture like the many worlds interpretation because you had to, because it was empirically unavoidable, like we look out into outer space and we see galaxies, many, many, many billions of light years away, we see countless galaxies billions of light years away, that leads us to believe that there is a big universe out there."
    },
    {
      "end_time": 9035.145,
      "index": 346,
      "start_time": 9006.237,
      "text": " We see clocks on airplanes move at slightly different rates, atomic clocks move at slightly different rates. That's hard to believe, but we can do the experiments and we see this repeated rigorously many times. It's not that we should never believe outlandish things, but as Carl Sagan said, extraordinary claims require extraordinary evidence. The Many-Worlds Interpretation says that there is an uncountable, you know, an uncountable profusion of universes that are coming out of every single moment, not even just measurements, but all the time."
    },
    {
      "end_time": 9063.063,
      "index": 347,
      "start_time": 9035.845,
      "text": " Uh, that's an outlandish statement, and sure, we could believe it if we were compelled to by either rigorous logical reasoning or by just unavoidable empirical results. But we're just not. And when you're formulating manuals interpretation and you run into this problem of, well, I have the per basis problem, I guess I can deal with that by letting the branches be emerged into decoherence, but then I can't axiomatically assign them probabilities anymore. At that point, you should just give up."
    },
    {
      "end_time": 9088.012,
      "index": 348,
      "start_time": 9064.241,
      "text": " Because you're no longer compelled through rigorous logic or empirical data that you have to believe in many worlds. So why are you still trying to chase it down? That is, this extravagant, outlandish metaphysical picture is no longer forced upon us logically or by experiment. So why are we chasing it down? Why are we starting with the assumption that they should be there and we need to somehow"
    },
    {
      "end_time": 9118.848,
      "index": 349,
      "start_time": 9089.65,
      "text": " Jerry Mander are axioms and principles and assumptions to get the many worlds picture to come out. And that's the impression that I get when I see some of the work going on right now, right? We're not compelled to take many worlds on as a serious idea. We can only get it off the ground by adding lots more stuff. Why are we doing this? So let me just describe a couple of the routes people have taken and then we can quit because that's basically the end of it. One route is the route that David Wallace takes in his book."
    },
    {
      "end_time": 9145.896,
      "index": 350,
      "start_time": 9119.189,
      "text": " The Emergent Multiverse. It is an excellent book. You should list it on the YouTube channel. And I recommend everybody interested should read it. David Wallace is a fantastic, brilliant philosopher and also trained in physics. And the book is a beautiful book. I recommend it to everybody who's interested in quantum foundations. In that book, he tries to solve this problem of probability. How do we get probabilities assigned to these things?"
    },
    {
      "end_time": 9175.64,
      "index": 351,
      "start_time": 9147.961,
      "text": " by introducing a large number of additional assumptions. And when I have people read this book, I tell them, read it and then just make a list of every extra assumption he has to make. He assumes that we should have the same metaphysical relationship to many copies of ourselves as we would if there were only a unique individual we were to become. That means you have to take kind of a stand on old questions like the metaphysical teleporter problem in metaphysics. The theorem he uses"
    },
    {
      "end_time": 9205.691,
      "index": 352,
      "start_time": 9177.21,
      "text": " requires invoking a notion of free will that requires taking a combat up compatibilist stance because in many world's interpretation there's just a deterministically evolving universal wave function and yet he has in his proof of the born rule agents which is already a dangerous idea agents we're bringing back agents making choices about which unitary operations are going to perform this is a crucial part of the proof and his little footnote where he admits yes this does entail certain assumptions about free will but free will is a big problem no one solved it"
    },
    {
      "end_time": 9231.834,
      "index": 353,
      "start_time": 9206.015,
      "text": " But that doesn't make the case. If you're resting on an unsolved problem, it doesn't make the case that what you're doing is going to work. He introduces a number of what he calls richness axioms and rationality axioms. The rationality axioms are supposed to be general good practices of what it means to be a rational observer. These were developed in a one world kind of picture. And the assumption is that they also work in a many worlds picture."
    },
    {
      "end_time": 9259.206,
      "index": 354,
      "start_time": 9233.507,
      "text": " Basically the way that one tries to proceed here is one says, what does it mean to be rational? It means that you want to use the tools of decision theory, the formal, precise, probabilistic tools for making good decisions called decision theory. And people who use the tools of decision theory, who are rational, will end up assigning probabilities to branches according to the Born Rule."
    },
    {
      "end_time": 9285.247,
      "index": 355,
      "start_time": 9261.254,
      "text": " That's roughly and very gross outline how this argument is supposed to work. Now, John Norton, again, philosopher at the University of Pittsburgh, raised an objection to really any such approach to try to get probability out. In a deductive argument, the conclusion cannot be any stronger than the premises. If you're trying to get probability to emerge as a conclusion, there must have been probability already in your premises."
    },
    {
      "end_time": 9313.916,
      "index": 356,
      "start_time": 9286.237,
      "text": " In this proof-of-the-born rule, one is trying to get probability out, so there must be probability somewhere in the premises. If you don't assume probability somewhere in the premises, somewhere you must be doing something that is not legitimate. And you can see how this unfolds for this decision-theoretic argument, which goes back to David Deutsch also. There's an earlier version of it in a 1999 paper by David Deutsch. It's called Quantum Theory and Decisions. You can also link to that."
    },
    {
      "end_time": 9340.128,
      "index": 357,
      "start_time": 9315.043,
      "text": " The argument is that if you obey the rules of being a rational observer and use decision theory, you're going to end up assigning probabilities according to the Born Rule. But you can ask, why is that the definition of rationality? I mean, in a many worlds type universe, there are going to be observers who behave rationally according to the dictates of decision theory. Some of those observers are going to be very successful over 10 years."
    },
    {
      "end_time": 9365.179,
      "index": 358,
      "start_time": 9340.64,
      "text": " And others are going to be very unsuccessful because in the many rules interpretation everything will happen on some branch. But there are also observers who do not obey the rules of decision theory. There's some very irrational observers who just choose not to follow any of the rules of decision theory. And they're going to be branches in which they're going to be unsuccessful over 10 years. And they're going to be branches in which they're successful over 10 years. All those observers are just there. Right."
    },
    {
      "end_time": 9395.913,
      "index": 359,
      "start_time": 9365.998,
      "text": " And to say that, well, you should just be rational and obey decision theory by axiom does not solve the probability problem. In a one world picture where only one future actually happens, it seems to be the case that people who are rational and think very carefully about their decisions and use something like a decision theoretic approach in the long run over 10 years tend to make more money or healthier, live better lives, whatever it is that you want."
    },
    {
      "end_time": 9425.845,
      "index": 360,
      "start_time": 9396.852,
      "text": " And that gives us reason to think, oh, these are good rational principles. If people who follow these principles tend to do better, I see people who exercise and people who make good financial decisions and hedge their investments, they do better. I go, oh, well, there are good reasons, therefore, to do what they do and take on their principles. But you can't turn it around and say that we're going to start with axiomatically, this is the way to be rational, and then go backward and show that"
    },
    {
      "end_time": 9452.602,
      "index": 361,
      "start_time": 9426.34,
      "text": " that then entails that probability should work. And that's kind of the sort of reverse argument that's taking place. I should say that not all Everettians take this decision theoretic view. Simon Saunders, for example, tries to do probability in a more Boltzmannian, cisco-mechanical way by coarse-graining and actually counting in some sense, but it's still in its embryonic form. So there are a lot of approaches to many world's interpretation and"
    },
    {
      "end_time": 9475.486,
      "index": 362,
      "start_time": 9453.114,
      "text": " At present, none of them seem to find a way to get probability off the ground, and I don't think that you can. And to the extent that you can by just taking on more and more assumptions, you're doing the thing where you're adding on extra empirical assumptions that can't be verified in an experiment. I mean, I don't know how experimentally to test that I should have the right relationship to many copies of myself. That's an extra empirical statement."
    },
    {
      "end_time": 9498.643,
      "index": 363,
      "start_time": 9475.845,
      "text": " If you take many of those on in order to get the picture off the ground, I don't know how credible it is. How much credence should I give to a theoretical picture that relies on a tower of SMHs, speculative metaphysical hypotheses? I feel like if you have to do all that work to get the theory off the ground, then it lowers your credence that we should take on such an outlandish idea that there are all these many worlds."
    },
    {
      "end_time": 9527.073,
      "index": 364,
      "start_time": 9499.411,
      "text": " so that's basically where i end up with many worlds approach and this is one of the reasons why i think there's room for another interpretation that's much more conservative that says well we do experiments we see one outcome maybe that's because there is just one outcome and the experiments look probabilistic maybe that's because they are in fact probabilistic nature is telling us it's probabilistic we should listen to nature rather than saying nope nope nope got to be deterministic there's a universal wave function evolving deterministically it's got to be Markovian"
    },
    {
      "end_time": 9553.592,
      "index": 365,
      "start_time": 9528.422,
      "text": " You know, maybe we should just listen to nature and build a theory around what nature is telling us. That's, I think, the conservative, non-Outlandish approach that one should take. I want to know, how is it that you got so great at being articulate and smooth with your speech? That's a very, very kind thing to say. I really appreciate that. That's really nice of you to say."
    },
    {
      "end_time": 9585.026,
      "index": 366,
      "start_time": 9556.476,
      "text": " I think we all have different strengths. I'm bad at many, many, many things. There are a few things I've gotten good at through practice. There's some things we're all born kind of a little bit good at. We've like embryonic things that we're sort of good at and then we hone those things. I've taught many classes over many years here. I've interacted with such amazing students, brilliant, idealistic, just wonderful students who ask all kinds of great questions."
    },
    {
      "end_time": 9610.674,
      "index": 367,
      "start_time": 9585.828,
      "text": " I just think it's practice. You just talk a lot with people about very intricate topics and over time it gets easier. That's the best answer I think I can give. There's an Aesop fable I like to bring up with people. It's about a stag and its antlers. So there's the stag who's drinking from a pool."
    },
    {
      "end_time": 9636.852,
      "index": 368,
      "start_time": 9611.323,
      "text": " and admiring his beautiful antlers. He thinks his antlers are so magnificent, so glorious. He goes on and on, like, antlers are really the envy of the animal kingdom. Then he looks at his legs and says, but my legs are bony and ugly, and if only my legs could be as remarkable as my antlers. As the stag is pondering this, he suddenly becomes aware that a pack of wolves is chasing him. So he"
    },
    {
      "end_time": 9666.186,
      "index": 369,
      "start_time": 9637.466,
      "text": " gets up and runs from the water. He's trying to get away from the wolves and he sees a forest. He's going to run into the forest to hide. And as he runs into the forest, his antlers start getting tangled in all the vines. And before he knows it, he can't run anymore. He's stuck. And as the wolves approach him, he realizes that the thing that he was praising his antlers was his undoing and the thing that he thought was, you know, his weakest feature, his legs, they were the things that would have saved him. If it had just been his legs, his legs would have saved him."
    },
    {
      "end_time": 9694.172,
      "index": 370,
      "start_time": 9667.654,
      "text": " Um, so the reason I bring this up is in addition to saying that I think we're all like good at a few things and maybe difficulty, a lot of things, some of the things we think we're bad at seen in another way are the things we're good at and sometimes vice versa. So I'm going to say something that anyone who has known me growing up will laugh at because it's so obvious. I came into this world profoundly lacking in common sense. Okay."
    },
    {
      "end_time": 9719.77,
      "index": 371,
      "start_time": 9694.445,
      "text": " Okay? Anyone who's ever known me growing up would say that's the most obvious statement I've ever made, okay? Profoundly lacking in common sense. And as I grew up, you know, you get made fun of, you make a lot of mistakes, you do a lot of silly things because you lack common sense, and you see it as kind of a weak feature. You see it as something you're a little bit embarrassed about. When you get into philosophy and foundations of science, philosophy of physics,"
    },
    {
      "end_time": 9748.729,
      "index": 372,
      "start_time": 9720.555,
      "text": " What you see is a lot of people whose common sense takes them in directions they shouldn't go. You see a lot of people who make arguments or make speculations and make claims that just seem very commonsensical to them. And sometimes those are not really rigorously supported. Their commonsense can lead them into error. Suddenly, lacking commonsense becomes a huge advantage because when I read a philosophy paper or I listen to a seminar or I'm trying to formulate an argument,"
    },
    {
      "end_time": 9778.541,
      "index": 373,
      "start_time": 9749.565,
      "text": " I don't have the kind of common sense that makes the answers obvious to me. So I see every argument and I have to take it apart and really disassemble it and understand what all the pieces do because I don't have an intuition, a common sense for how things are supposed to work. And what this means is that to some extent, and obviously, I mean, we all make mistakes. I make errors too. But I feel like some of the errors I might have made if I had more common sense, I'm less likely to make. So a thing that I thought was my weakest feature"
    },
    {
      "end_time": 9801.323,
      "index": 374,
      "start_time": 9779.07,
      "text": " the stags legs in a different context try not to be really useful, like being on land and having only flippers for your arms and legs. And then one day you discover the ocean and suddenly what you thought was your weakest feature becomes now your greatest asset. So that's a general lesson I think that everyone needs to take to heart. Many of the things we think are"
    },
    {
      "end_time": 9828.114,
      "index": 375,
      "start_time": 9802.278,
      "text": " And now you're speaking to researchers and potential researchers, people who are younger students"
    },
    {
      "end_time": 9854.633,
      "index": 376,
      "start_time": 9829.326,
      "text": " Even people who are older students perspective, there are some people who are 70 and getting their PhD and watch this. Yeah. So what is a method that they can use to help figure out or distinguish between what is a an actual good feature versus an actual bad feature that they thought was good? The best I can say there is experience. Put yourself in different contexts. If I had never become"
    },
    {
      "end_time": 9860.896,
      "index": 377,
      "start_time": 9855.367,
      "text": " Someone who worked in philosophy and foundations of physics, I might have gone my whole life thinking that lacking common sense was really bad."
    },
    {
      "end_time": 9890.435,
      "index": 378,
      "start_time": 9861.817,
      "text": " Maybe it is really bad in some contexts, but I wouldn't have seen that there are in fact flip sides to it. Another thing is just talk to lots of people and ask them, are there any aspects of themselves that in some contexts they see as bad and other contexts they see as very helpful? And if you talk to enough people, you'll begin to hear them say things that remind you about things about yourself. And you'll go, wow, I have this feature that I'm not feeling great about, but this person has found a way to really use it really well. Maybe I should follow their example and do what they do."
    },
    {
      "end_time": 9920.401,
      "index": 379,
      "start_time": 9891.067,
      "text": " So yeah, that's probably my best advice there for how to do it. But let me add one last thing in closing, right? When I teach a class, I just taught this class this fall term. We just finished teaching. I just finished teaching our classes for the fall term. I said to the students, look, we've talked about a lot of physics in this class. This was a physics class. Sometimes teach physics classes, sometimes teach philosophy classes. This was a physics class."
    },
    {
      "end_time": 9951.749,
      "index": 380,
      "start_time": 9922.21,
      "text": " If in a year you don't remember some or most or maybe even all of the physics that we've talked about, I won't be disappointed. But we have to be human to each other. You know, we have to be human beings to each other. And if you forget to do that, then I'll be super disappointed. You never know when you're interacting with somebody. Is this somebody who five years from now"
    },
    {
      "end_time": 9981.732,
      "index": 381,
      "start_time": 9952.995,
      "text": " is going to be the right person at the right moment to play a really important role in your life, your career, your well-being. You have to treat everybody like they could potentially be super important to you. I mean, obviously, if you're getting a hundred emails a day, you can't treat every, I mean, just like a sheer amount. But to the extent that you can treat everybody with basic respect and treat people like human beings and be human to them, you should always do that because you just don't ever know"
    },
    {
      "end_time": 10006.544,
      "index": 382,
      "start_time": 9982.261,
      "text": " You know, they could always I mean, obviously, just for its own merits. I mean, people should be treated like human beings anyway, but but it's also just a good strategy because you never know if someone ultimately down the line is going to end up being important to you. When students for start here in our PhD program, one of the things I tell them is that one of the most important assets they have is their reputation. And"
    },
    {
      "end_time": 10033.2,
      "index": 383,
      "start_time": 10006.971,
      "text": " A lot of people think that the right scientific reputation to have is to be intimidating, for everyone to think you're the smartest person in the room, for everyone to be, you know, in awe of your intellect and almost afraid to talk to you, right? People think that's the kind of reputation you're supposed to develop. Not everybody does, but some people think that's what you're aspiring to. And people can often think of figures in their lives, role models in some cases that have that kind of reputation."
    },
    {
      "end_time": 10062.654,
      "index": 384,
      "start_time": 10034.326,
      "text": " I would argue that's not the right reputation that you should cultivate, that you should seek to have. I talked about treating people like human beings. Your reputation is in science, and this is for any students, researchers who want to go into science, your reputation is worth its weight in gold. The kind of reputation you want to have is someone people want to work with, someone people want to go and talk to and ask questions to."
    },
    {
      "end_time": 10087.073,
      "index": 385,
      "start_time": 10063.183,
      "text": " You want to be the kind of person who, when people come to you and ask you questions and talk with you, when they leave, they feel smarter than they did before. Because if people come to you and they leave feeling smarter, feeling happier, feeling like they can go out and do things more confidently, they're going to want to come back and work with you again, talk with you again. Yes, there are very successful people who don't have that kind of reputation, who are very intimidating, and they're successful. But they would be even more successful in my view"
    },
    {
      "end_time": 10112.551,
      "index": 386,
      "start_time": 10087.927,
      "text": " If they cultivate the kind of reputation that made people want to collaborate with them, work with them, and importantly support them. Because everybody in every walk of life at some point will need someone to come along and help them out with something. And if people see you as someone who is collaborative and helpful, and someone who builds people up, and someone who treats people like human beings, then they'll be more likely to support you when you need help."
    },
    {
      "end_time": 10134.957,
      "index": 387,
      "start_time": 10113.729,
      "text": " And that's the kind of investment in your own career and your own future that I think everybody needs to take very seriously and think very seriously about. Thank you, Jacob. I appreciate you spending seven hours. Kurt, it was a delight. It was a complete delight. And Addie and Will, it was really just a delight."
    },
    {
      "end_time": 10154.855,
      "index": 388,
      "start_time": 10138.899,
      "text": " I've received several messages, emails, and comments from professors saying that they recommend Theories of Everything to their students, and that's fantastic. If you're a professor or a lecturer and there's a particular standout episode that your students can benefit from, please do share. And as always, feel free to contact me."
    },
    {
      "end_time": 10182.568,
      "index": 389,
      "start_time": 10155.282,
      "text": " New update! I started a Substack. Writings on there are currently about language and ill-defined concepts, as well as some other mathematical details. Much more is being written there. This is content that isn't anywhere else. It's not on Theories of Everything. It's not on Patreon. Also, full transcripts will be placed there at some point in the future. Several people ask me, hey Kurt, you've spoken to so many people in the fields of theoretical physics, philosophy, and consciousness. What are your thoughts?"
    },
    {
      "end_time": 10194.548,
      "index": 390,
      "start_time": 10182.568,
      "text": " Also, thank you to our partner, The Economist."
    },
    {
      "end_time": 10219.189,
      "index": 391,
      "start_time": 10196.817,
      "text": " Firstly, thank you for watching, thank you for listening. If you haven't subscribed or clicked that like button, now is the time to do so. Why? Because each subscribe, each like helps YouTube push this content to more people like yourself, plus it helps out Kurt directly, aka me. I also found out last year that external links count plenty toward the algorithm,"
    },
    {
      "end_time": 10245.35,
      "index": 392,
      "start_time": 10219.189,
      "text": " Which means that whenever you share on Twitter, say on Facebook or even on Reddit, et cetera, it shows YouTube. Hey, people are talking about this content outside of YouTube, which in turn greatly aids the distribution on YouTube. Thirdly, you should know this podcast is on iTunes. It's on Spotify. It's on all of the audio platforms. All you have to do is type in theories of everything and you'll find it. Personally, I gained from rewatching lectures and podcasts."
    },
    {
      "end_time": 10265.247,
      "index": 393,
      "start_time": 10245.35,
      "text": " I"
    },
    {
      "end_time": 10288.78,
      "index": 394,
      "start_time": 10265.247,
      "text": " and donating with whatever you like. There's also PayPal, there's also crypto, there's also just joining on YouTube. Again, keep in mind it's support from the sponsors and you that allow me to work on TOE full time. You also get early access to ad-free episodes, whether it's audio or video: it's audio in the case of Patreon, video in the case of YouTube. For instance, this episode that you're listening to right now was released a few days earlier."
    },
    {
      "end_time": 10295.35,
      "index": 395,
      "start_time": 10288.78,
      "text": " Every dollar helps far more than you think. Either way, your viewership is generosity enough. Thank you so much."
    }
  ]
}
