192 | Nicole Yunger Halpern on Steampunk Quantum Thermodynamics

Randomness and probability are central to modern physics. In statistical mechanics this is because we don't know everything about the distribution of atoms and molecules in a fluid, so we consider a probability distribution over what they might be; in quantum mechanics it's because the theory only lets us predict measurement outcomes probabilistically. Physicist Nicole Yunger Halpern explains how we've been lagging behind at bringing these two theories together, and how recent progress is changing the landscape of how we think about the microworld.

Support Mindscape on Patreon.

Nicole Yunger Halpern received her Ph.D. in physics from Caltech. She is currently a NIST physicist and Adjunct Assistant Professor of Physics and IPST at the University of Maryland. Her Ph.D. thesis won the international Ilya Prigogine Prize for a thermodynamics dissertation. As a postdoc she received the International Quantum Technology Emerging Researcher Award. Her new book is Quantum Steampunk: The Physics of Yesterday's Tomorrow.

0:00:00.0 Sean Carroll: Hello everyone. Welcome to the Mindscape podcast. I'm your host, Sean Carroll. We've all heard of the Clockwork Universe, right? This is the idea that came into physics after Isaac Newton and Pierre-Simon Laplace: the idea that, if you knew everything that was going on in the universe, you could predict exactly what would happen next. Of course, the Clockwork Universe isn't the current paradigm for thinking about physics. Because of quantum mechanics, which came about in the early 20th century, we know that the world, as we observe it, is not deterministic. There is unpredictable quantum randomness in the world. But that quantum randomness is not the first time that randomness or probability popped up in the history of physics. There's also, famously, in the late 19th century, the advent of statistical mechanics. In the 19th century we were putting together the theory of thermodynamics, the science of heat and how different kinds of fluids and gases pushed on each other and did work and dissipated entropy and all that stuff.

0:01:01.6 SC: And then Maxwell and Boltzmann and their collaborators figured out that you could understand thermodynamics if you believed in atoms and molecules. If you believed that things you thought of as a fluid or a solid were actually many, many, many little particles. And these particles were more likely to behave in some ways than others. So probability came in. The fact that entropy increases over time is not an airtight rule in Boltzmann's way of thinking about it. It's just very, very, very likely. People didn't like this at the time, there were strong, strong objections, but now it's the accepted lore.

0:01:37.4 SC: So you might think that statistical mechanics, which involves probability, and quantum mechanics, which involves probability, would be close bedfellows, right? They would be natural partners. And you would think about quantum statistical mechanics and its implications for either pure science or for technology or what have you. Well, they are natural fellow travelers, but it's taken a long time for scientists to really dig into the essence of quantum statistical mechanics. How do you combine the idea of quantum mechanics, replacing particles and fields with wave functions and observations with probabilistic outcomes, with statistical mechanics, the idea that you don't know exactly what the system is doing, and you can only make probabilistic assertions about what's going to happen next?

0:02:25.6 SC: Today's guest, Nicole Yunger Halpern, is a leading expert, a very young person, but nevertheless leading, in the dawning science of quantum statistical mechanics and thermodynamics, and its relationship to things like information theory: how much do you know about a system? And the fact that it combines crazy radical ideas with potentially real-world experimental implications has inspired Nicole to name her own field Quantum Steampunk. And she has a new book out called Quantum Steampunk, all about this revolution that's been going on for no more than 20 or 30 years, I would say, in physics, where we're really understanding the frontier of statistical mechanics with quantum at the heart of it.

0:03:11.0 SC: And so, again, it has both implications for how we think about the fundamental nature of reality, because that's what quantum mechanics is, and maybe implications for things like how biophysics works, how DNA works, or maybe for nanotechnology, for building little machines at these tiny scales. Thermodynamics and statistical mechanics were driven, in some sense, by the urge to understand steam engines. Quantum thermodynamics is going to be building steam engines at the molecular scale. Thus, Quantum Steampunk. And also the outfits are really, really cool.

0:03:47.7 SC: So there's a fun conversation, cutting edge science going on here, but in a slightly different vein than we've had previously on the podcast. I'll give you our occasional reminder that we have a website, preposterousuniverse.com/podcast where you can find show notes, transcripts for every episode. You can search through them, etcetera. All the archival episodes are just as awesome now as they were when we recorded them. And there is a Patreon that if you wanna support Mindscape, you can go to patreon.com/seanmcarroll and you get both ad-free versions of the podcast episodes, plus it's the Patreon supporters that get to ask questions for the monthly, Ask Me Anything, episodes. So plenty of reason to join the Patreon, although I would argue that the reason to do it is less for tangible benefits than for the intangible benefit of being part of the community of Mindscape supporters.

0:04:39.5 SC: We talk on the Patreon page, there are comments, there's a back and forth, there's a whole back channel you don't know about if you're not a Patreon supporter. So join up today, patreon.com/seanmcarroll, and even if you don't, that's okay. We still love you. You can still enjoy the episode. Let's go.

[music]

0:05:12.1 SC: Nicole Yunger Halpern, welcome to the Mindscape podcast.

0:05:14.6 Nicole Yunger Halpern: Thank you. It's a delight to be here.

0:05:16.7 SC: I wanted to take advantage of having you here. You have this wonderful Quantum Steampunk label for both your research and your new book. And obviously both of those words are very evocative in very different ways. I had to laugh a little bit because when I wrote my book about quantum mechanics, about many worlds, etcetera, I went to Amazon and I looked for books with the word quantum in them, and I didn't find mine 'cause it didn't exist yet. But I did find quantum leadership, quantum yoga, quantum healing, etcetera. So you're joining a long, not so very dignified history of using the word quantum, but you're using it in a much more respectable way.

0:05:55.9 NH: That's the intent.

0:05:57.9 SC: [chuckle] But before we get to the quantum part, I wanted to think about the thermodynamic part of what you do. Actually, maybe why don't you just define what you mean by Quantum Steampunk and then we'll back up a little bit and lay some groundwork.

0:06:10.4 NH: Okay, Steampunk is a genre of literature, art and film. It's a genre that combines Victorian settings from the 1800s. So, some of the earliest factories, very smoky, foggy London. The wild, wild west, men in top hats. And puts those together with futuristic technologies like time machines, dirigibles and automata. So this is a genre of sci-fi, but I see it as really coming to life in science that's happening now at the intersection of quantum physics, information theory and thermodynamics. Thermodynamics is the study of energy and it was developed during the 1800s, motivated by those smoky factories, the steam engines that were driving them. So quantum thermodynamics, this bringing together of these three fields, is partially Victorian, but quantum information is part of the science and technology of the future. People are building quantum computers now. So I see Quantum Steampunk as really the spirit of quantum thermodynamics and as sharing its aesthetic with this genre of literature, art and film.

0:07:26.6 SC: And especially the word steam is just a wonderful fit because like you say, that was the origin of thermodynamics. Like, these days we think of thermodynamics in terms of entropy and the arrow of time and stuff like that, depending on where you come from. But it was really all about building steam engines back in the day, right?

0:07:42.3 NH: Yes, people wanted to be able to pump water out of mines in England.

0:07:47.3 SC: [laughter] And I'm told, and I've often related this, though maybe you have better insight on the history, that Sadi Carnot, of Carnot cycle fame, one of the pioneers of thermodynamics, was mostly motivated by his annoyance that British steam engines were so much better than French steam engines. Have you ever heard that story?

0:08:06.9 NH: I can imagine that. I hadn't heard that, but I can well imagine that.

0:08:11.0 SC: Yeah, there's a certain nationalistic pride there, and so he invented... So in a very French fashion, he said, well, let's invent the perfect steam engine, and then realized that there would always be some dissipation, because of the Second Law. So good, this is a perfect excuse for me to talk about thermodynamics, 'cause it's a crucially important part of science. In some ways, I think that thermodynamics is kind of not given as much respect among professional physicists as it should be. We all love the laws of thermodynamics, but it's not quite considered a sub-discipline. You have particle physicists. You have astrophysicists. There are not a lot of people, at least in the modern US culture that I'm aware of, who are specializing in thermodynamics or statistical mechanics.

0:08:57.4 NH: I discuss that a little bit in the book. Quantum thermodynamics has its roots in the 1930s, right after people discovered quantum theory, but it's really blossomed over the past 10 or so years, and a lot of the energy has come from Europe, Canada, Singapore, outside the US. So the US has taken a while to catch on, which is ironic because there was also another spurt of growth in quantum thermodynamics during the 1980s, and a lot of those developments did happen in the US. But over the past 10 years or so, there was a lot less quantum thermodynamics activity in the US. And so that's one reason why I considered going elsewhere for grad school, actually. Europe has this millennia-old wonderful tradition of philosophy, and the US has had, let's say, at least a stereotype of efficiency and productivity and applications. So the nature of the quantum science done in the US has been very different, and quantum thermodynamics has, historically, especially at the beginning of its lifetime, been more abstract, mathematical and philosophically oriented.

0:10:17.9 NH: Recently, there have been lots of connections with experiments. Some of my colleagues and I have worked to build bridges with the fields that are more developed in the US: atomic, molecular, and optical physics, condensed matter, high-energy physics. And so now there's a lot more interest and enthusiasm in the US for quantum thermodynamics and, in association with that, thermodynamics, I think.

0:10:40.4 SC: Well, it's ironic, right? Because like we just said, thermodynamics itself, minus the quantum part, classical thermodynamics from the 19th century was the paradigmatic applied science. It really was almost engineering. Can you tell us a little bit about your favorite stories of the origins of thermodynamics? How do you think about that, those developments over the 1800s, which were so influential?

0:11:04.5 NH: Some people as we discussed were interested in seeing how efficiently engines could power factories and pump water out of mines, but these practical questions can't really be considered without the fundamental questions that go along with them. For instance, steam is a relatively large classical system. You can think about it without having to think about individual particles, but some thermodynamicists thought it worthwhile to think about individual particles. Atomism had basically been established by then, but still some people didn't like atomism and some thermodynamicists said, we can describe a lot of the changes coming about in the world during the Industrial Revolution by just referring to large-scale properties of systems like energy, temperature and pressure. We have no need to talk about these invisible particles that you can't even see. It's unscientific for you to talk about them. And they sort of beat down on the thermodynamicists who wanted to think about individual particles, but fortunately, those other thermodynamicists did have their say and they ended up being right, that there are individual particles out there. And so we can even start to see the legacy of quantum theory in thermodynamics.

0:12:27.6 SC: That's true. Yeah, there definitely were the hints of atoms and particles that eventually grew into quantum theory. I'm wondering if maybe... We've all heard entropy increases over time, the second law of thermodynamics, and in a closed system, entropy goes up, but that was not the formulation that people had back in Sadi Carnot's time. He didn't know about the word entropy. He thought about cycles and engines. And so can you sort of put us back in that mindset of someone who thinks about thermodynamics in terms of cycles and engines? What was Carnot's big thing that he explained to us?

0:13:01.3 NH: There are multiple ways of casting the second law of thermodynamics. One is very closely related to Carnot's engine, which as you mentioned is an idealized engine. It is some working medium that has contact with two different large bodies, we call them heat baths. You can also call them environments, but I like the term heat baths because it makes me think of soap and loofahs and so on. So this engine can contact two different heat baths. One is hot, one is cold. And the engine undergoes a cycle, a sequence of four different steps. And at the end of the cycle, the engine ends up in the same configuration, the same state as it was at the beginning. That's why we call it a cycle, because the four steps close, the engine ends up back where it was. However, the rest of the world has changed. During the cycle, the engine performs work, useful energy, on some other system, so it might be lifting a weight or turning a paddle wheel. And meanwhile, energy has flowed between those two heat baths. So the hot bath gets a little bit cooler, and the cold bath gets a little bit warmer.

0:14:21.5 NH: So Carnot calculated the best possible efficiency achievable by this engine or by any engine that's in contact with just two different heat baths of two different temperatures. And so his... This optimal efficiency is called the Carnot bound or the Carnot limit. When we develop new engine cycles, including in quantum thermodynamics, to evaluate how well the engine performs, we'll often calculate the efficiency, and then double-check that if our engine satisfies Carnot's assumptions, it does not break the Carnot bound, because if our engine broke the Carnot bound, then we would basically be breaking the second law of thermodynamics, and certainly, we would have done something wrong.
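The Carnot bound described here comes down to one line of arithmetic. A minimal sketch of the sanity check Nicole mentions, where the bath temperatures and the proposed engine's efficiency are illustrative made-up numbers, not values from the conversation:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency of any engine running between heat baths at
    absolute temperatures t_hot and t_cold: eta = 1 - t_cold / t_hot."""
    if not (t_hot > t_cold > 0):
        raise ValueError("require t_hot > t_cold > 0 (absolute temperatures)")
    return 1.0 - t_cold / t_hot

# Example: a hot bath at 500 K exhausting into a 300 K environment.
eta_max = carnot_efficiency(500.0, 300.0)  # 1 - 300/500 = 0.4, i.e. at most 40%

# The double-check described above: an engine satisfying Carnot's
# assumptions must not beat this bound, or something has gone wrong.
proposed_efficiency = 0.35  # hypothetical engine design
assert proposed_efficiency <= eta_max, "second law violated somewhere!"
```

Anything above `eta_max` signals either a mistake or an engine that quietly steps outside Carnot's assumptions, as discussed below.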

0:15:05.5 SC: [laughter] Or really, really important. But actually, let me...

0:15:09.0 NH: Yes. With very, very low probability.

0:15:09.7 SC: Very low probability. I know. We have our priors on these things. But this is peeking ahead a little bit, but of course, we're gonna eventually talk about the quantum side of things. Are the results of Carnot, etcetera, still just as true in the quantum realm, or once we have quantum mechanics, can we do better than a Carnot efficiency?

0:15:28.5 NH: One might hope that quantum theory could get around Carnot's bound. To my knowledge, Carnot's bound has not been broken, and there are many papers with many proposals of engines, classical and quantum, and in these papers, the authors check that, yes, my engine obeys the Carnot bound. However, there are ways to bend around the Carnot bound. So you could propose an engine that on the face of it might seem like it should obey the Carnot bound, but actually accesses some slightly different resource, like a certain quantum resource, that really doesn't enable the engine to fall under the assumptions of Carnot, and so you can "break the Carnot bound," but actually, you're just bending around it.

0:16:14.8 SC: Just cheat it. Yeah. Okay.

0:16:16.4 NH: Getting a better efficiency.

0:16:16.4 SC: Alright. And so back to the classical world of the 1800s. You've used words, or there are words hanging around in the background here, that would be worth digging into, namely work and heat. These were tough for me when I was first learning thermodynamics. Clearly, we need to understand these two different forms of energy. And I think that no one really does, or at least people disagree about what the best way of thinking about these things is. Do you have a favorite way of saying what is work and what is heat?

0:16:47.3 NH: Work and heat are the two types of energy that are being transferred between two different bodies. And my way of thinking about them is partially a little loose and intuitive and then partially we can put some math to, say, the work that is performed, and then say, everything that's not covered by this math is heat. So work is energy that's being transferred that's directly useful. You can directly harness it to push a rock up a hill or charge a battery. It's, in a sense, coordinated, whereas heat is random energy. So it's uncoordinated, it's not doing something useful, although if you wanted to, then you could use it in conjunction with an engine to do something more useful to turn it into work. So heat is more the random jiggling around of molecules.

0:17:40.5 SC: I think... Yeah, I think heat makes sense. And it even makes sense how you can turn heat into useful work. That's literally what happens in an internal combustion engine. You fire some piston, the gas ignites, and it heats up and pushes the piston. But the work part is always a bit more subtle, especially because, like you just did, it's so often defined in terms of usefulness. And that seems a bit anthropocentric, right? It seems like human beings and their ideas about what is and is not useful are somehow creeping into this fundamental law of physics. But there are also the related concepts of coordination, which you just used, or organized forms of energy. Are there rigorous ways of defining the sense in which certain kinds of energy or certain energetic substances can be organized, and therefore we call that energy work rather than heat?

0:18:35.0 NH: We have in thermodynamics a set of thermodynamic variables. So entropy, temperature, pressure, volume, particle number, chemical potential. Some of these numbers are extensive. They grow with the size of the system. For instance, you can have some material that takes up some volume. If you take another copy of the material and put it right next to the first copy, then you've doubled the volume. So those are the extensive parameters. And then there are intensive parameters that don't scale in that way. For instance, if you have one material at some temperature, and you take another copy of the material and put it right next to the first copy, everything is still at the same temperature. And we do work when we change those extensive parameters. For example, we could have a gas in a box, our favorite example of steam, that gas could be kept with a piston, and if the gas is expanding against the piston so that it's increasing its volume, then the gas is performing work against the piston, even if it's not particularly useful for me personally to have the piston move upward.

0:19:48.2 SC: Got it. Okay. Good. Yeah. So it's not really as anthropocentric as it sounds. But is this question of a clear bright line between heat and work, is that a useful thing to think about? Do the experts really truly understand it once and for all, or is it kind of the thing where researchers are still quibbling over it?

0:20:08.6 NH: Researchers definitely quibble over it in quantum thermodynamics. I've heard less quibbling in classical thermodynamics, but there could be quibbling that I haven't heard.

0:20:20.2 SC: [chuckle] Good. I think that gives us the background we need. And then there are, of course, I want to get on the table the idea of refrigerators, which are somehow related to engines, but backwards.

0:20:31.6 NH: Yes, exactly. We could take, for instance, Carnot's cycle that moves heat from the hot bath to the cold bath and run it backward. We would be moving heat from the cold bath to the hot bath, to cool down the cold bath, to keep last night's leftovers cool.

0:20:52.6 SC: But this requires some energy, is the point, like it's never gonna happen spontaneously, that would violate the second law.

0:20:58.3 NH: Exactly. If we want to run Carnot's engine backward, then instead of getting work out of the cycle, we have to put work into the cycle.

0:21:05.7 SC: Yeah. Okay, good. And so the next big thing, so that's how people were thinking, like in the first half of the 1800s, 19th century, the second half, like you already alluded to, atoms came on the scene in statistical mechanics, and I think probably people are somewhat familiar with that basic idea that really gases are made of atoms, etcetera, but what really is interesting, like the profound philosophical move, is that probability and information come into the game. Can you say a little bit about that sort of conceptual shift in how thermodynamics was thought about?

0:21:40.5 NH: Thermodynamics originally was thought about in terms of these large-scale quantities that characterize systems, the ones that I just mentioned, external and internal, excuse me, extensive and intensive parameters like energy, temperature, and pressure. But you can go farther and think about some material of many, many particles as... Excuse me, some large material as consisting of many, many particles. And in principle, if you know some of the theories that we have a good handle on in Physics today, including quantum mechanics, then you could imagine describing all of those particles individually and all of those particles' interactions. And in principle, if you had enough computing power, which you won't, then you could recover the large-scale behaviors of as large a material as you please. But again, this is very difficult to do, and also it's not necessarily the most useful because when we study thermodynamic behaviors of materials, we don't really care about what all the little particles are doing, we're more interested in what the material is doing, which is well described in many cases by average behaviors of the particles. And we can also, not knowing exactly what the particles are doing, use probabilities to say the particles have some probability of being here with these momenta and some probability of being there with those momenta. We don't know, but we don't actually need to know in order to recover the large-scale behaviors of the material.

0:23:25.7 SC: So, did people in 1870 think in terms of information theory at all? They certainly began to think in terms of probability, people like Maxwell and Boltzmann, etcetera.

0:23:35.2 NH: There were certainly the beginnings of information theory there. For instance, Boltzmann's H-function is a manifestation of entropy, which played a very important role in information theory during the 20th century, as invented by Claude Shannon. Claude Shannon, the "father of information theory," explained that when he came up with a function in information theory that quantified uncertainty about something, he wasn't sure what to call it. He got advice from his friend John von Neumann, the great Hungarian-American mathematical physicist, and von Neumann said, "You should call this thing entropy, because it's already been called that in statistical mechanics. It already exists. And also, no one really understands entropy, so in a debate, you will always have the advantage."
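Shannon's function, the one von Neumann told him to call entropy, is simple to compute: H = -sum of p log2 p over the outcomes, measured in bits. A minimal sketch with illustrative example distributions:

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum p * log2(p), in bits.
    Terms with p = 0 contribute nothing (the p log p limit is 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximal uncertainty over two outcomes, exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin: we know much more about what will happen.
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
# A certain outcome: no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

The same functional form, applied to probabilities over microstates, is what ties this back to the statistical-mechanical entropy of Boltzmann and Gibbs.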

0:24:31.2 SC: [laughter] I don't know how many debates Shannon got into, but I hope that he did get the advantage there. But I don't wanna sort of gloss over the profoundness of this move. We start by talking about steam engines, and everyone thinks that there's something that's gonna happen when you light the spark in your piston, but now you've re-formulated everything in terms of information, like, entropy is a measure of how much we don't know about the microscopic state of the system. How do you reconcile in your soul how information helps pistons move?

0:25:04.9 NH: That's a good question. I suppose there are multiple ways to approach this question, and one way is that thermodynamics is an operational theory. It is about agents who run engines, who want to cool systems down or to charge batteries. They want to accomplish goals, and they can accomplish goals only insofar as they have information about where a gas is compressed, and so on. And so information and the gas's entropy should come into play when we think about thermodynamics.

0:25:43.3 SC: Okay, that makes sense. I guess, let's make it even more specific by talking about everyone's favorite thought experiment of Maxwell's demon. Maxwell's demon, in some sense, is an operationalization of this relationship between information and thermodynamics and entropy and so forth. So I'll let you tell the audience what Maxwell's demon experiment is actually supposed to tell us, what it is and what it is supposed to teach us.

0:26:06.8 NH: James Clerk Maxwell, who was one of the founders of thermodynamics, came up with a thought experiment in which he challenged the second law. He said, suppose that there is a gas in a box, our favorite system in thermodynamics. There is a partition dividing the box in half, there is a little door in the partition, and there's what he called a finite being. His colleagues named it a demon, which is probably the term that Maxwell should have used because it caught on so well. There's a demon who can open and close the door. This demon will see particles moving at a very high speed coming, let's say, from the right and will let those through, and whenever the demon sees slow-moving particles coming from the left, the demon will let those through, but if there are any fast particles coming from the left, the demon won't let those through the door, and so on. So eventually, the demon will separate out the different particles in the gas so that the quick molecules are on the left-hand side of the box and the slow molecules are on the right-hand side of the box. The speed of a set of molecules in a gas is directly related to the energy of that gas and its temperature.

0:27:24.4 NH: So the demon will have separated what used to be just one uniform gas, which is everywhere at just one temperature, into two different gases, one hot and one cold. Then the demon could take Carnot's idea and run an engine cycle, and from this temperature difference extract useful work. So the two gases, over the course of the engine cycle or after many, many cycles, will come to be at the same temperature, so the demon will have extracted work while returning the whole gas to the same uniform temperature. And then the demon could do this entire thing again and again and again, so the demon could run a perpetual motion machine and extract as much work as he wanted. This idea violates the second law of thermodynamics. So what is wrong with Maxwell's demon? Why can't this Maxwell demon really operate as it operates?
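The demon's sorting, and the memory it quietly fills up, can be caricatured in a few lines. This is a toy model, not anything from the episode: "speed" here is just a squared Gaussian sample standing in for a particle's kinetic energy, and the one-recorded-bit-per-decision tally foreshadows where the resolution of the paradox will come from:

```python
import random

random.seed(0)
# 10,000 stand-in kinetic energies (squared Gaussian samples).
speeds = [random.gauss(0, 1) ** 2 for _ in range(10_000)]
threshold = sorted(speeds)[len(speeds) // 2]  # median splits fast from slow

left, right, memory_bits = [], [], 0
for s in speeds:
    # Demon's rule: fast particles go left, slow particles go right...
    (left if s > threshold else right).append(s)
    # ...and each decision leaves one bit behind in the demon's memory.
    memory_bits += 1

t_left = sum(left) / len(left)     # "hot" side: higher mean energy
t_right = sum(right) / len(right)  # "cold" side: lower mean energy
print(t_left > t_right, memory_bits)  # True 10000
```

The sorting does create a temperature difference for free, but only at the price of a memory stuffed with 10,000 bits, which is exactly where the argument picks up below.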

0:28:22.9 SC: Good. So what's the answer? It took us a while, right? It took us... Is it...

0:28:24.5 NH: There is still a little bit of...

0:28:27.1 SC: Yeah, it took us an embarrassingly long time...

0:28:28.1 NH: There is still...

0:28:28.9 SC: To figure this out.

0:28:30.4 NH: Yes. So the answer that is the most widely accepted was put forth by Charlie Bennett near the end of the 20th century. He was building on the work of Szilard, another great Hungarian physicist, as well as Rolf Landauer. There is a little bit of debate about whether this is a really satisfactory answer, but I see quite a bit of agreement across the physics community, and I find the answer satisfying personally. So Bennett proposed that the answer is that the demon actually records information about which particles are moving quickly, and so where the hot gas ends up. Suppose that the demon has separated the gas into two different gases and has run the engine cycle so that there's just one uniform gas again. We would like to say, according to our story, that everything is reset, so that it's exactly as it used to be at the beginning.

0:29:33.5 NH: However, the demon actually has this memory which is now full of information, and in order to actually reset the full system, the demon needs to erase this information from his memory, and Landauer showed that erasing information costs thermodynamic work, so the work that the demon gains by running the engine cycle is lost in the erasure, and so the demon nets zero work after all and can't run a perpetual motion machine.
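Landauer's cost can be evaluated directly: erasing a bit at temperature T requires at least k_B T ln 2 of work. A minimal sketch with illustrative inputs (room temperature, a made-up memory size):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact under the 2019 SI)

def landauer_cost(bits: float, temp_k: float = 300.0) -> float:
    """Minimum work needed to erase `bits` of information at absolute
    temperature temp_k: W >= bits * k_B * T * ln 2 (Landauer's bound)."""
    return bits * K_B * temp_k * math.log(2)

# Erasing a single bit at room temperature:
print(landauer_cost(1))    # ~2.87e-21 joules
# Erasing a gigabyte of demon's-ledger memory, still a minuscule
# amount of energy, which is why real computers sit far above this bound:
print(landauer_cost(8e9))  # ~2.3e-11 joules
```

Per the argument above, the work the demon gains from the engine cycle is at most this erasure cost for the bits in its memory, so the books balance and the second law survives.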

0:30:05.2 SC: It's interesting. So you say that erasure costs thermodynamic work, but usually the way I think about it is that erasure increases entropy. These are probably the same thing said in different words?

0:30:16.2 NH: Yeah, basically.

0:30:17.2 SC: Okay. [chuckle] And that's roughly because irreversible things increase entropy, right? Like, if you had just the underlying Newtonian or even Schrödinger-equation laws of physics, everything is reversible, but something like erasing is irreversible, so that's where entropy has to come in.

0:30:35.9 NH: Exactly.

0:30:36.2 SC: Good. And I guess I see why that explanation from Bennett is pretty darn good but not entirely satisfying. What if the demon had a really big ledger and didn't need to erase it for a long time? I'm not completely convinced about this yet. I'm not saying that I have a better way of doing it; I think it's interesting that over 100 years later, we still haven't completely nailed this thing down.

0:31:05.5 NH: Right. And interestingly, it was just recently that experimentalists got to the point of checking the qualitative, or excuse me, the quantitative prediction by Landauer that indicated that the demon really would need to expend basically all the work that he got from the engine cycle in order to erase his memory, so yes, there's still a lot of discussion happening.

0:31:31.1 SC: And this is at least tangentially related to the fact that our laptops heat up doing computations, we can do... Sorry, we can do reversible computations, but typically we don't, typically we do irreversible computations, and according to Landauer, that's gonna heat things up. It's a fascinating connection between information and heat.

0:31:53.5 NH: Yes. We could do our computation reversibly, but we would need an idealised system, it would run very, very, very, very, very slowly. So it won't be very practical for doing your taxes, say. [laughter]

0:32:08.4 SC: And I presume that... I mean, Landauer has a bound, like for every bit that you erase, a certain entropy is created. Are real-world computers... Is the laptop that I'm using right now close to that bound, or is it more wasteful than that?

0:32:24.5 NH: Our computers are way more wasteful than that. [chuckle] But again, it was not too long ago, during the 21st century, that experimentalists started showing that they could approach the idealisation that is Landauer's limit.

0:32:42.9 SC: Is this a future frontier for computer builders to try to be more and more efficient and come closer to Landauer's bound?

0:32:51.2 NH: It is definitely a topic that is an inspiration for, say, my community, and a topic also of practical interest to, say, a colleague I have who used to be at IBM. So it's being thought about. We are still a little ways away, but energy usage is an important topic, and there are a lot of people thinking about it.

0:33:17.2 SC: I mean, it's certainly true that the computers around the world use a lot of electricity and generate a lot of heat; it's something we might imagine we would like to do better at.

0:33:26.4 NH: Yes, definitely.

0:33:27.4 SC: And then... Okay, good. Now we're... I think we're ready to move on to the quantum side of things. So, since you've explained thermodynamics to us, please explain quantum mechanics to us now. [chuckle]

0:33:37.9 NH: Of course. Very loosely speaking, quantum physics is the physics of the very small: electrons, atoms, photons, or particles of light, and so on, and these systems can act in ways that are very different from the ways in which the large systems from our everyday lives can act. Probably most people who have listened to this podcast have heard about superpositions and entanglement, very strong correlations that quantum particles can share. We can use these different behaviors of quantum systems to perform certain tasks, such as solving certain computational problems much more efficiently than we can perform those tasks with classical systems. Similarly, there are thermodynamic tasks, like the extraction of work and the charging of batteries, that are similar in spirit to information processing tasks. So we can ask, can we also use quantum resources to perform those tasks better than with classical resources?

0:34:42.0 SC: And I guess if someone asks me, "What is the most important difference between quantum mechanics and classical mechanics?" My answer would have something to do with entanglement. Do you think of it that way also, or do you think of something else as the most important thing? And what is it? [chuckle]

0:34:56.6 NH: I certainly think of entanglement as very closely bound up in the difference between classical and quantum. There are many different things that we can think of as quantum. Some of those characteristics can be mimicked by classical systems. For instance, an atom can have only certain amounts of energy, so its energy comes in just discrete values. However, there are classical systems that have at least approximately discretized energies. Entanglement is something that we don't attribute to classical systems, but there's a very high bar that you can set such that if you prove that some phenomenon meets this bar, then it is truly non-classical: that bar is contextuality. It's pretty difficult to prove that something is contextual, but it can be done. I think of contextuality as, at least in some cases, the thing behind entanglement that makes entanglement really non-classical, and the thing that in some cases provides speed-ups to quantum computers.

0:36:00.2 NH: But I think it's much easier in many cases to talk about entanglement, and it's also very useful. So entanglement is a strong relationship that quantum particles can have. I think of entanglement or quantum particles that are entangled as a whole, greater than the sum of their parts. There is some information in a system of entangled particles that is not in just one particle, it's not in just the other particle, and it's not in the sum of the two particles separately, but it's sort of between the particles that's in the whole.

0:36:35.7 SC: And what is contextuality? Can you explain that?

0:36:38.4 NH: So, we would think that if we were performing a measurement in a lab, a measurement of atoms or DNA or anything else, then the color of my shirt doesn't matter. It's not going to affect the statistics of the outcomes of that experiment if we were to run many trials. So there are properties that seem like they really should be irrelevant to the statistics of the outcomes of the measurements in an experiment. And in the classical world, indeed, the color of my shirt does not matter, but in the quantum realm, there are some properties that we would expect to be totally irrelevant to the statistics of measurement outcomes that in fact aren't irrelevant. So we could say that in quantum theory, context matters for measurement statistics: quantum theory is contextual and classical physics is non-contextual.

0:37:38.0 SC: Can you give us a concrete example of a property that we would expect not to matter classically, but would matter quantum mechanically to measurement outcomes?

0:37:45.9 NH: We can imagine having a system with three observables, three properties that we might wanna measure. Properties A and B could commute with each other, so we can measure the two of them simultaneously without any problems, and similarly, observables A and C could commute with each other, so we can measure them both simultaneously without problems. So we could think: if we measure A, then which of these other observables we measure alongside it really shouldn't matter and shouldn't affect the outcome statistics. However, observables B and C might disagree with each other, and we would say that they don't commute with each other, so they can't be measured simultaneously. You would think that that wouldn't really affect A, but it can affect the statistics of A's measurement outcomes.
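The A/B/C setup can be made concrete with two-qubit Pauli observables. This particular choice (A = Z⊗Z, B = Z⊗I, C = X⊗X) is my own illustration of the commutation structure being described, not an example from the conversation:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
I2 = np.eye(2)

# Two-qubit observables (illustrative choice):
A = np.kron(Z, Z)   # commutes with both B and C
B = np.kron(Z, I2)
C = np.kron(X, X)

def commutes(M, N):
    return np.allclose(M @ N, N @ M)

print(commutes(A, B))  # True  - A and B can be measured together
print(commutes(A, C))  # True  - A and C can be measured together
print(commutes(B, C))  # False - B and C are incompatible
```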

0:38:49.8 SC: So this is great and fascinating, but it's also a little bit abstract, so where do we run into this kind of thing? Measuring spins or qubits in a quantum computer, or some kind of experiment in your lab? I don't know.

0:39:04.8 NH: There are... People have shown that contextuality is behind at least some of the speed-ups that are achievable by quantum computers with qubits. So that is one example where contextuality is important.

0:39:19.0 SC: Okay, cool, and is it... Is it sort of a version of entanglement or is it more like the uncertainty principle, I mean, certainly the fact that different observables can't be measured at the same time, sounds Uncertainty Principle-esque.

0:39:30.5 NH: Definitely, and they're all related. They're all parts of quantum physics. Each of them is a very specific property, and they're all closely related; one can involve the other. Contextuality is just a particularly high bar for some property or some experiment to clear, such that we can really say it's really, really non-classical.

0:39:56.1 SC: Yeah, it's really quantum. Okay, good. And the other interesting thing to get on the table before diving into applications is that quantum mechanics introduces a whole new kind of entropy, or at least a new way to have entropy, entanglement brings along with it a kind of entropy that we hadn't anticipated before.

0:40:12.9 NH: Indeed, if we have two particles that share entanglements, then automatically, each one of those particles has entropy just because of that entanglement.

0:40:23.7 SC: And can we...

0:40:24.0 NH: That's... The most popular way of measuring that entanglement is with the von Neumann entropy, named after John von Neumann, who advised Claude Shannon about naming his entropy.

0:40:35.5 SC: Yeah. Did that come second? Did von Neumann invent his entropy after Shannon? I don't know who came first.

0:40:40.9 NH: I'm not sure of the years. But at least the two of them, according to this quote from Claude Shannon, they seemed not to be competitors but more helping each other out.

[laughter]

0:40:51.2 SC: And can I think of it? I've always... I've gone back and forth by myself about this, so I have two entangled systems, and like you just said, that means that either one of them, considered by itself, has some entropy, can I relate that back to information, can I say that because it's entangled, there's some information I don't have about that sub-system unless I include the whole shebang?

0:41:14.9 NH: Exactly, there's some information that's not in just one of the particles or the other, or in the sum of the two particles separately, but that information is spread out across the whole.
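That intuition can be made quantitative: for a maximally entangled pair of qubits, the pair as a whole is in a pure state, yet either half alone carries a full bit of von Neumann entropy. A minimal numpy sketch:

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>)/sqrt(2): the pair as a whole is pure
# (zero entropy), yet each half, viewed alone, is maximally mixed.
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(phi, phi.conj())       # density matrix of the pair

# Partial trace over the second qubit -> reduced state of the first
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))
print(rho_A.real)                     # [[0.5, 0], [0, 0.5]], maximally mixed

# Von Neumann entropy S = -Tr(rho log2 rho), in bits
evals = np.linalg.eigvalsh(rho_A)
evals = evals[evals > 1e-12]
S_A = -np.sum(evals * np.log2(evals))
print(S_A)  # 1.0, one full bit of missing information about the half
```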

0:41:26.8 SC: Good. Okay, good. So we have entanglement, it leads to this new kind of entropy. Let's put it to work. Let's do some quantum thermodynamics. So I have a list of different ways in which this becomes really interesting. I don't know, do you want to pick one first, what is the sales pitch if you're gonna say not just thermodynamics but quantum thermodynamics is something that's really interesting to think about for this reason?

0:41:52.7 NH: I think of quantum thermodynamics as encompassing a number of goals and ways of viewing it. One is that we know that quantum physics can assist with information processing tasks like communicating information and solving certain computational problems. There are thermodynamic tasks analogous to information processing tasks, like refrigerating and charging batteries, so we can imagine that quantum phenomena might help with some of those thermodynamic tasks. One example that I discuss in my book is an information-based engine, Szilard's engine; we mentioned Szilard just now when we were discussing the resolution of Maxwell's demon paradox.

0:42:38.7 NH: So Szilard showed that if you have information and you also have a source of heat, then you can use that information to turn this random uncoordinated heat into useful work. His initial thought experiment was about a classical gas, but you can imagine that this gas might be classical or quantum, and as we know, there are different types of quantum particles. Some are fermions, like electrons and the other constituents of the matter around us. Instead, you can imagine that the particles are bosons, so carriers of force or composites of multiple fermions, and the fermions and the bosons have different tendencies, different properties. So you could imagine running your engine through many, many cycles with each of these types of particles, and you would get out some amount of work on average from each type. In a certain setup, the amount of work that you could get from fermions would be zero, the amount you could get with classical particles would be greater, and the amount that you could get with bosons would be even greater. So what we think of as the spin statistics of bosons can help you, on average, in extracting work from an information engine.

0:44:02.4 SC: Okay, that's very interesting. So fermions are the particles that can't pile on top of each other, bosons are the ones that like to pile on top of each other, and you're saying that we can use that fact to beat the best possible classical result.

0:44:16.9 NH: In principle, and the basic idea comes exactly from what you said, since the bosons will have a tendency to pile on top of each other, they'll exert a greater pressure, and that's why they'll do more work.

0:44:27.5 SC: And is this something that is only gonna be useful at the microscopic scale, or can we imagine using this kind of technology and building it up to greater and greater size so that it's really like an engine that could push a car around?

0:44:42.5 NH: We could imagine. I don't expect to have quantum engines in our garages any time soon, but quantum control is developing, and I as a theorist, long thought that basically anything I proposed would be laughed at because theorists have a reputation for proposing experiments that are impossible for the next 20 years, but I've been very pleasantly surprised at how often experimentalists have said, "Huh. Oh, yeah, I could do that. Sure, let's collaborate." [chuckle] So I have... Again, people have recently performed the Landauer-erasure experiment that we discussed earlier on a number of different platforms. Szilard's engine is very closely related. To my knowledge, no one has realized Szilard's engine with these different types of particles, but I have discussed it with some experimentalists and they say there is at least one lab out there that can probably do the experiment now with some significant thought. So in a few years, in 10 years, experimentalists will have much better control, so I don't wanna say that they can't do something.

0:45:55.1 SC: And I wanna help out the audience really just visualize what's going on here, so a real engine like in a car, in a gas car, I put gas into it, there's some fuel and it burns the fuel, increases entropy, makes the car go, what are the analogous things for your little tiny quantum engine, what is the fuel that makes it go?

0:46:17.5 NH: The fuel you could think of as consisting of two parts. One is the heat from the environment, from the heat bath. Again, this is random uncoordinated energy, so we have to have, so to speak, some way of directing it. And we use our information in order to direct it. For instance, in the case of Szilard's engine, we have, once again, a gas in a box. And we could imagine the box being partitioned in half. And suppose that we have one bit of information about the gas; suppose, in a really simple example, the gas is just one particle, and we know that the particle is on the right-hand side of the box rather than the left-hand side of the box. Then we can hook up a weight so that, as the particle knocks against the partition and pushes it farther and farther and farther across the box, the moving partition drags the weight upward. We're using our information about which side the particle is on in order to decide where to put the weight, and that's what enables the particle to get heat energy from the bath and lift the weight.

0:47:35.9 NH: By the end of this process, the particle has pushed the partition all the way to the edge of the box so that, again, the particle's free to be absolutely anywhere in the box. We have no idea where the particle is, so we have lost our information about the particle, but in exchange we have lifted the weight. And I don't know that I could necessarily describe very well exactly how you would hook up the weight so that it is being dragged. But there's a beautiful illustration in my book that I did not draw, a wonderful steampunk artist drew it, so there's more information there.
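The payout of this one-particle engine can be worked out directly: the isothermal work from letting the partition slide from half the box to the whole box is k_B·T·ln 2, exactly matching the Landauer erasure cost discussed earlier. A quick numerical check (the box volume is an arbitrary placeholder; it drops out of the answer):

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
V = 1e-27           # box volume in m^3 (arbitrary placeholder; it cancels)

# One-particle ideal gas at temperature T: pressure P(V) = k_B * T / V.
# Work extracted as the partition slides isothermally from V/2 to V,
# via a midpoint-rule integral of P dV:
vols = np.linspace(V / 2, V, 100001)
mids = (vols[:-1] + vols[1:]) / 2
W = np.sum((k_B * T / mids) * np.diff(vols))

print(f"Extracted work: {W:.3e} J")                    # ~2.871e-21 J
print(f"k_B T ln 2:     {k_B * T * np.log(2):.3e} J")  # the same number
# The engine's payout per bit of position information exactly matches
# Landauer's cost of erasing that bit, closing the demon's loophole.
```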

0:48:10.7 SC: And so in some sense, you're being Maxwell's demon, you're using information to sort of figure out how to extract some work from these randomly fluctuating particles?

0:48:20.6 NH: You could think about it that way. And you could see that as a reason why one could say that Charlie Bennett was building on the work of Szilard and Landauer in proposing a resolution to Maxwell's demon paradox.

0:48:33.4 SC: And is this kind of thing... I guess the question I'm trying to get to is separating out the quantum-ness of it from the tiny-ness of it. Part of this whole discussion is just that we're at the level of particles and atoms and things like that. And even if the world had been classical, you would have to think a little bit differently than we do in the big macroscopic world. Another part of it is the... Well, it's not classical, it's really quantum mechanical, and there's contextuality and entanglement. So is this engine you're describing really using quantum-ness, or is it just relying on the fact that things are tiny?

0:49:06.6 NH: The engine that I just described in detail can work even if the particle is classical. That's why a group of people who came along much later than Szilard built on his idea by saying, "Suppose that we move from thinking about one particle to thinking about multiple particles, and suppose that these multiple particles are classical, or fermions, or bosons." Their results relied on the statistics of fermions and bosons. Fermions obey Pauli's exclusion principle, so as you said, two fermions can't pile on top of each other. They can't be in exactly the same quantum state, whereas bosons do like to be in the same quantum state and so can be bunched up together.

0:49:56.6 SC: Okay. Very good. And are... And since this is happening down at the level of particles and molecules, etcetera, I can imagine this is gonna be useful for nanotechnology. Is it useful in biology? Do cells use this kind of... These kinds of structures to move themselves around, or is this something only human beings have caught on to?

0:50:19.6 NH: It's something we'd like to know more about. There is a sub-field of quantum biology that is enjoying extra energy nowadays, and people approach quantum biology from many different angles. In general, it is very hard to find a way in which non-classical phenomena could be having large-scale effects in biology, because biological systems are warm, they're watery, and they tend to be large. And in these conditions, quantum phenomena such as entanglement tend to die off quickly. However, quantum theory is certainly relevant to molecules, and molecules are certainly relevant to biology. For instance, I had the privilege and pleasure of working with a chemist, David Limmer, at UC Berkeley. He studies photo-isomers, which are molecular switches. They're molecules that have one conformation most of the time, but when light hits them and they absorb a photon, they have the opportunity to switch conformations.

0:51:30.4 NH: So he was studying these molecules in many different ways, people have plenty of models for them. But these molecules are small, they are quantum, they're far from equilibrium. And so models tend to be either very, very detailed or to involve some assumptions that we might prefer not to make. So we worked together to try to make a more general model using Quantum Information Theory and derive a general bound on the probability that one of these molecular... One of these molecules would switch. You could probably recast that in terms of quantum... Well, this is a quantum thermal machine, a molecular switch. We weren't thinking of it as an engine, but one can often think about such things as quantum thermal machines. So they are out there, although their large scale effects on biology, as far as we know, are probably fairly small, but there could be plenty to be discovered.

0:52:35.2 SC: I think maybe it's worth helping the audience gain this intuition that physicists have, that warm wet things won't be quantum mechanical, right? Quantum mechanics is the world, the whole world is quantum mechanical, but classical physics is a pretty good approximation to it, and it becomes a very good approximation when things are big and radiating and so forth. Can you explain a little bit more about why that is, and therefore, why intrinsically quantum phenomena are not even more ubiquitous in biology?

0:53:08.7 NH: Sure. And we can use entanglement as a touchstone. You mentioned that you think of entanglement as particularly quantum. I really do too. I think it's a useful way to think in many cases. So suppose that we were interested in finding some biological system that makes use of entanglement. So this entanglement would need to be concentrated and controlled between the few systems that would need to be entangled. It's all too easy to entangle particles. Particles very easily entangle with stray particles, the things that bump into them, especially if the temperature is high, things are jiggling around a whole lot. So even if two molecules or one molecule started out with some controlled entanglement that it might be able to take advantage of to perform some task, thermodynamic or information processing, probably that entanglement would dissipate very quickly to the environment because of the principle of monogamy of entanglement. One particle can share only so much entanglement with any other particles.

0:54:17.3 NH: So this particle could share a whole bunch of concentrated entanglement with one other particle to do something useful, or if this particle gets distracted by other particles, then it would end up sharing only a little bit of entanglement with each of the other particles, and there would be no particle with which it shares enough entanglement that it could accomplish something useful from that entanglement.

0:54:45.6 SC: Okay. Let me try to rephrase that, because this is an interesting perspective that I haven't heard before. Given that there's sort of an upper amount of entanglement you can have when particles are... When there's lots of particles and they keep bumping into each other, they will be entangled, but the amount of entangling between one particle and any other one particle will be really, really tiny, 'cause the entanglement gets spread all over the place. And therefore, this seems like a bit of a leap, but therefore, it's almost as if there's no entanglement at all.

0:55:13.8 NH: Right. Suppose that we as humans, to make this relatively clean example, as humans who have conscious agency, wanted to send some quantum message from me to you. So we could do that in two ways. We could have a quantum channel analogous to the Internet that we're using now, but quantum, and I could send my message over that quantum channel. Instead, we could just share two particles, two qubits, two basic units of quantum information that are maximally entangled, and we could perform a certain protocol on them, such that it would be as though we had one use of a quantum channel. So to do something useful, to send a unit of quantum information, we would need at least this one maximally entangled pair. And if we had just a pair of particles that were very distracted by the other particles around them, so that your particle and my particle didn't share enough entanglement, then we couldn't accomplish a useful quantum task.
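This "entangled pair plus a protocol equals one use of a quantum channel" idea is quantum teleportation, and it can be simulated in a few lines. A minimal sketch (the three-qubit layout and the fixed measurement outcome are my illustrative choices):

```python
import numpy as np

# Single-qubit gates
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(control, target, n=3):
    # CNOT on n qubits, built from projectors on the control
    P0 = np.array([[1, 0], [0, 0]], dtype=complex)
    P1 = np.array([[0, 0], [0, 1]], dtype=complex)
    ops0 = [I2] * n; ops0[control] = P0
    ops1 = [I2] * n; ops1[control] = P1; ops1[target] = X
    return kron(*ops0) + kron(*ops1)

rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)  # random message qubit
psi /= np.linalg.norm(psi)

# Qubits: 0 = message, 1 = Alice's half, 2 = Bob's half of the Bell pair
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Alice: CNOT(0 -> 1), then Hadamard on qubit 0
state = cnot(0, 1) @ state
state = kron(H, I2, I2) @ state

# Alice measures qubits 0 and 1; condition on one outcome for the demo
# (each occurs with probability 1/4, and any choice works after correction)
m0, m1 = 1, 0
branch = state.reshape(2, 2, 2)[m0, m1, :]
branch /= np.linalg.norm(branch)

# Bob applies corrections Z^{m0} X^{m1} to his qubit
bob = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ branch

# Bob's qubit now equals |psi> up to a global phase
fidelity = abs(np.vdot(psi, bob)) ** 2
print(round(fidelity, 6))  # 1.0
```

One maximally entangled pair plus two classical bits (m0, m1) has transmitted one qubit; a pair diluted by stray entanglement with the environment could not.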

0:56:29.0 SC: It's interesting because it's always a bit of a surprise to me, even though I know it's true, that the world is so very, very quantum, and yet, we have to work really hard to manifest its quantum-ness in some places. Yeah. It's very interesting. And even before quantum mechanics comes along, in statistical mechanics there's this concept of fluctuations: if we don't have perfect knowledge of the state of all of our particles, we can say what usually happens, but there will be fluctuations around that. And maybe you could help us... Because I don't think we've ever talked about it on the podcast, but there's been a bit of a revolution in this non-equilibrium fluctuation kind of statistical mechanics over the last 20 years, first in the classical world and now, these days, in the quantum world.

0:57:21.3 NH: Yes. So there's this field of non-equilibrium thermodynamics or statistical mechanics called fluctuation relations. Fluctuation relations are... Can be thought of as stronger versions of the second law. So the second law of thermodynamics is an inequality. As you stated near the beginning of the podcast, according to the second law of thermodynamics, the entropy of a closed isolated system increases or stays constant. So it obeys an inequality over time. And the second law of thermodynamics, in most cases, doesn't tell us how much the entropy will increase. It just says, It'll increase by some amount or stay constant.

0:58:00.6 SC: Right. Can't go down.

0:58:02.4 NH: So that is... That's some information, but it's not as much information as one can want. Fluctuation relations are equalities, and they govern systems even very far from equilibrium. Equilibrium is a state in which the large-scale properties of a system, such as its temperature and pressure, remain constant, and there are no net flows of anything into or out of the system. So it's a very idealised state; most of the world that we care about is out of equilibrium. We are far from equilibrium. So fluctuation relations are these stronger versions of the second law of thermodynamics, but from them you can recover what is sometimes called the second law of thermodynamics. And they give us more information, in that if you perform the same experiment many, many times, then you might get different outcomes in different trials. For instance, suppose that you have a strand of DNA. You can trap one end of it in an optical trap using a laser, and use lasers to stretch the strand of DNA.

0:59:05.0 NH: And stretching something requires the input of energy, it requires work. So you can imagine starting the strand out in the same way in each of many, many experiments and, in each experiment, stretching the DNA strand through the same distance. In different trials, you'll need different amounts of work, because in one experiment, a water molecule kicks the molecule over here in this direction. In another trial, a water molecule kicks the DNA over there in a different direction. So different trials require different amounts of work, but you can perform many, many trials, and from the statistics, predict how much work you'll need in the next trial. From some fluctuation relations, we can learn about the probabilities governing the next trial. We can do all sorts of things with fluctuation relations. Some are practical, some are fundamental. I have to mention my institution, I feel sort of legally obliged.

1:00:05.7 SC: Please.

1:00:06.9 NH: The University of Maryland is home to my colleague and collaborator, Chris Jarzynski, who is responsible for Jarzynski's equality, which is one of these fluctuation relations, one of the ones that helped to kick off a lot of excitement about what they could say in detail about these far-from-equilibrium systems that are usually extremely difficult to describe. Except Chris is very humble, so everyone else calls it Jarzynski's equality and he calls it the non-equilibrium fluctuation relation, which I think is much less convenient to say. But these equalities are remarkable, not only for the reasons I've mentioned, but also because the world far from equilibrium tends to be very difficult to describe. It's chaotic, it's wild, and it's difficult to derive very general results that describe lots of non-equilibrium situations, but these fluctuation relations do. And they're not only theoretical constructs that have just been proved mathematically; they've been tested with DNA, with RNA, with other small particles. And extensions of them have been explored with quantum systems and more.
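Jarzynski's equality says that the average of exp(−βW) over many repetitions of a process equals exp(−βΔF), an equality that holds arbitrarily far from equilibrium. It can be checked on a toy model; the instantaneous stiffening of a harmonic trap below is my own illustrative choice, not one of the DNA experiments mentioned:

```python
import numpy as np

# Toy check of Jarzynski's equality, <exp(-beta*W)> = exp(-beta*dF),
# for an instantaneous quench of a harmonic potential.
beta = 1.0
k = 4.0  # spring constant jumps suddenly from 1 to k

rng = np.random.default_rng(42)
# Sample initial positions from equilibrium under H0(x) = x^2 / 2
x = rng.normal(0.0, 1.0 / np.sqrt(beta), size=1_000_000)

# Work done by the sudden switch in each "trial": W = H1(x) - H0(x)
W = 0.5 * (k - 1.0) * x**2

lhs = np.mean(np.exp(-beta * W))   # <exp(-beta W)> over the trials
rhs = 1.0 / np.sqrt(k)             # exp(-beta dF) = Z1/Z0 for this model

print(round(lhs, 2), round(rhs, 2))  # 0.5 0.5
```

Note that the work W fluctuates from trial to trial, just as in the DNA-stretching experiments, yet the exponential average lands exactly on the equilibrium free-energy difference.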

1:01:16.2 SC: Let me, just as an aside, it's interesting that you keep mentioning DNA, 'cause it took a while for this to get into my brain, but we think of DNA as what carries the genetic information inside our cells, but to physicists, it's like a really good spring or something like that. It's a very flexible molecule for tiny, tiny applications, right?

1:01:35.9 NH: Yes. To physicists, most things are springs.

1:01:39.6 SC: [chuckle] It has nothing to do with which base pairs are being used in the DNA, you're not using that information. That's interesting and fun. And yeah, so this... I mean, I don't know, I'm asking you open-ended questions here because I just wanna hear your big picture views, but to me, these fluctuation theorems, these non-equilibrium statements, these statements about how thermodynamics works when things are not settled down and the gas is not all equilibrated inside the box. We're beginning to put... To get a handle on how that works. And this might have enormous implications for how complex systems work in general, like you already said, it's hard to get general rules in these cases. And here's a general rule. So can we imagine bootstrapping our way up from those rules that we have to a bigger picture understanding of things beyond individual particles, things which have many moving parts, but aren't simple like a box of gas?

1:02:40.1 NH: Certainly. And I think that the field of complexity science is very exciting. I know you belong to the Santa Fe Institute, so you have a bunch of colleagues thinking about complexity theory. I've even written a paper about complexity theory in that sense recently. The regimes certainly share the importance of information, and in some cases, they share the importance of thermodynamics. In some ways, they are a little bit different. Fluctuation relations are especially useful when the system is small. If a system is large, then it will very often behave as it does on average, and fluctuations away from the average have less and less significance. So that's one reason why fluctuation relations have been tested with strands of DNA and RNA: they're small enough that the fluctuations are very important. But also, I agree, complex systems are far from equilibrium. So there can certainly be overlaps in those tool kits.

1:03:49.2 SC: Yeah, I mean, I guess the intuition I have, which is very vague and non-mathematical at this point, in my thought about it, but what makes complex systems interesting is there can be feedback between different levels, even though it's a big system. A simple system is the earth going around the sun. Even though the earth is made up of lots of particles, we can pretty accurately predict where it is. But once a system becomes complex, like a person or a volcano or a hurricane, I can imagine tiny little fluctuations at the microscopic level impacting the macroscopic behavior in interesting ways. That's why I'm wondering whether or not there are connections to be drawn there. That's all I have to say. It's very vague, but figured I'd put it on the table.

1:04:32.6 NH: There is certainly a lot of work in that direction. I've also had the privilege and pleasure of working with Jeremy England's group. He and his group have had extremely interesting ideas about fluctuation relations, complex systems, far-from-equilibrium systems, and self-organizing systems. So they do very rigorous far-from-equilibrium statistical mechanics. And in part, it was a pleasure to work with him, because he is such a rigorous thinker about statistical mechanics, but the group is inspired by thoughts of life, which is certainly a complex system. And so there are connections between the fluctuation relations that are particularly visible in small systems and larger systems such as living systems.

1:05:23.9 SC: I will mention that Jeremy was a former Mindscape guest, so you're a... So we have a good track record there. Good, but I know that was a long digression 'cause I just love the fluctuation theorems, but I do wanna hit back on a couple of more quantum points before we wrap up. One of the fascinating, crucial things about quantum mechanics is the role of measurements and observations, right. We've talked about entanglement a little bit, which I think is pretty central. I mean, as a many worlds person, I think entanglement is the only thing you need to know, and you can derive everything else. But certainly, operationally, in the real world, when you measure quantum systems, their wave functions look like they collapse. How important is that kind of measurement uncertainty in quantum mechanics, which wasn't there in classical mechanics, to the story of statistical mechanics and thermodynamics at the micro-scale? How important is wave function collapse to Quantum Steampunk thinking?

1:06:18.8 NH: It's very important. Measurement is how we learn about properties of the system, and we, as thermodynamicists, would like to know how much work we've performed on a system, how much work we've extracted from it, how much heat it's exchanged with its surroundings. But we find out those properties, typically, by measuring the system of interest. And if we measure the system of interest, then we disturb it, and we can influence our understanding of how much energy the system has absorbed or let go. So there is a problem of defining quantum heat and quantum work that has been the subject of a lot of debate. I have a folder that I keep; every time I find another paper that proposes another definition of quantum work and heat, I add it to the folder. [chuckle] But I think that that's actually a reason for some of the richness of quantum thermodynamics. I haven't proposed definitions of quantum work and heat myself, but partially, that's because there are definitions that I think are quite useful in the different settings that I consider.

1:07:35.6 NH: And I think that this richness can be interestingly contrasted with, say, particle physics; particle physicists are stereotyped as being obsessed with unification, trying to take a whole bunch of different theories and show that they're really the same theory, so that there's just one theory to rule them all. But in quantum thermodynamics, there are these many different definitions of work and heat that seem to be useful depending on how you poke the system, so how you perform work on the system, how you measure the system, whether you have any other systems around to help you in particular ways. And so my opinion, which seems to be the opinion of at least some other quantum thermodynamicists, is that there might not be any one definition of quantum work and heat to rule them all, but maybe there doesn't need to be, because there are diverse situations. We approach different systems in different ways with different tools and different skills, and different definitions and measurement strategies can be useful in different situations. So the disturbance of a quantum system by measurement is extremely fundamentally important in quantum thermodynamics.

1:08:45.0 SC: Good, I figured it was. I wanted to check, 'cause sometimes you get surprising answers in this field, that's why you need to do the research, I guess. Okay, so at the end of the podcast, I always like to allow ourselves to be a little bit more speculative than in the meat of the podcast. Imagine that rather than being a professor and podcaster, I was a venture capitalist, and I wanted to know how to spend my money on Quantum Steampunk startup companies. What are the things that we might hope even if it's not in the very near term, but what are the real world technological applications that we might wanna get out of this, is it mostly mini-engines or timekeeping, or is there even biological medicinal kind of applications for this kind of technology?

1:09:36.7 NH: I think that's a very important question, and one that I think that the field could benefit from increasingly addressing. As I mentioned, the roots of quantum thermodynamics, and a lot of the quantum thermodynamics that has been done has been relatively abstract, mathematical, and theoretical. Quantum thermodynamics has made its way into experiments and the number of experiments and the complexity of the experiments has been growing and it'll continue to grow. But thermodynamics did develop hand-in-hand with the industrial revolution, which was extremely useful.

1:10:14.8 SC: [laughter] That was useful, yeah. Also, there're some downsides of it, but mostly, yeah, it was useful.

1:10:21.4 NH: Yes. There were certainly downsides, but hopefully, we can learn from past mistakes. It would be lovely if also quantum thermodynamics could go hand-in-hand with some quantum thermodynamic technologies. Those on a practical level don't yet exist. The engines that we propose have been really useful for seeing the difference between quantum physics and classical physics through the lens of thermodynamic tasks, similarly to how we understand that nowadays, how quantum differs from classical by understanding what problems a quantum computer can solve much more efficiently than a classical computer. And the experiments have been proof of principle experiments. But now that we have done some of this fundamental lab-work and we have started on... And done a number of proof of principle experiments, I think it definitely would be useful to think about how quantum thermodynamics could help us technologically.

1:11:26.3 NH: One challenge is that quantum systems are very difficult to control, so I like to look for what I think of as quantum analogues of Southern California. Southern California is... It just happens to be very well-suited to solar panels, and so if you just stick some solar panels in Southern California, then, even though the solar panels need rather specific conditions to work well, they will work well if they're put in the right place. So I think there might be some situations that might already exist in labs in which we could just stick a quantum thermodynamic device and get some benefits. I'm working with a lab on one possible way of realizing this idea. They've asked me not to talk much about it, so I...

1:12:22.9 SC: Okay, good. [chuckle]

1:12:23.1 NH: Won't say much for now. And also, I'm not the only person thinking in this direction. Alexia Auffèves in France has also recently written some papers about the possibilities for quantum thermodynamic technologies. I think that all the ideas you mentioned are interesting to think about. I don't know if we will succeed in making many very useful quantum thermodynamic technologies. I hope so. We haven't thought about it too much yet, so I think there's a lot of opportunity.

1:12:55.9 SC: Okay, but I'm pretending to be a venture capitalist now, not a podcaster. I want you to tell me what one technology I should spend my money on.

1:13:04.9 NH: If you're a venture capitalist, I'm afraid I'm the wrong person to talk to.

[laughter]

1:13:10.4 NH: But, yeah, perhaps a quantum engine that is placed in the right way in a lab, so that it can take advantage of the quantum control that's naturally there... Well, not naturally, but already there, and use the resources around it. And I should probably, due to the request of my colleague, not say too much more at the moment.

1:13:34.2 SC: Okay, I know, that is unfair, but are there potential... Just as a sort of guidance as to what to be aware of, we're looking for as the future comes online, solar power, is that something that'll be maybe useful, these quantum techniques?

1:13:54.7 NH: There are certainly people thinking about how we can capture energy from light more efficiently, for instance my chemist colleague at UC Berkeley, and there, quantum physics, energy, and information are all relevant. So the quantum thermodynamics community has been historically a little separate from that, but I think it would be useful to create some more of those bridges.

1:14:23.3 SC: Okay, and then on the totally opposite side, not being a venture capitalist any more, final question, are there connections, and maybe the answer is no, but are there connections here to other things going on in quantum information theory, like quantum computing or even applications of quantum information to black holes and emergent spacetime?

1:14:42.9 NH: Absolutely, and that's one of the frontiers that I think is most exciting for quantum thermodynamics and Quantum Steampunk, which I think of as taking the... Not only building a theory of quantum thermodynamics, marrying quantum physics with thermodynamics, sometimes with help from information theory, but also taking the mathematical and conceptual tools of this quantum theory of thermodynamics, taking it around to different disciplines and saying, "How can we help with our toolkits? Or can we uncover any new questions, or can we take inspiration from your field to ask new questions about quantum thermodynamics?"

1:15:18.8 NH: So I've had, again, the privilege and pleasure of working with people in chemistry, condensed matter, atomic, molecular and optical physics, and high energy and black hole physics. It's definitely a lot of fun to cross these borders and very productive. I often feel very ignorant because I'm usually talking with people whose conversation I don't understand, but it's a wonderful feeling to be able to say, "You have this problem, and I think that this tool from quantum thermodynamics could help." I gave the example earlier of the chemist David Limmer who is thinking about a molecular switch, so we used a tool from quantum thermodynamics, a resource theory, a very simple mathematical model for quantum thermodynamics in order to derive the bound that he was looking for. So those connections definitely exist. Some of my colleagues and I are working on building many more.

1:16:22.7 SC: Yeah, it's a good situation to be in. It can be challenging, I think, like you allude to, because you're always feeling like you're not the expert in the room, but you have your little area of expertise that they don't have, and so you're useful to people, and that is an exciting position for everyone to be in.

1:16:37.9 NH: Yes, definitely.

1:16:39.3 SC: And I'm looking forward to more, and good luck with the book coming out. Nicole Yunger Halpern, thanks so much for being on the Mindscape podcast.

1:16:47.9 NH: Thank you, it's been a pleasure.

[music]

3 thoughts on “192 | Nicole Yunger Halpern on Steampunk Quantum Thermodynamics”


  2. I just want to say how reassuring and human it is to hear two far more accomplished people in their field discuss something, and one expresses the humility of having to work with people in fields where they are not expert, and the other one helps remind them that this is normal and it's hard to see the value of your own contributions when you're in that position. But that person's value is there.

    Just saying, I come for the science, I’m staying for the humanity.

  3. Maria Fátima Pereira

    A super, super interesting, exciting episode, in language accessible to any and all who are curious about the world of science.
    I thank Sean Carroll and Nicole Yunger Halpern for this good episode, where curiosity, knowledge, and cutting-edge science are a constant, with some fun moments added along the way in the conversation.

