Physicists often describe the fabric of the universe we inhabit as four-dimensional spacetime, comprising three dimensions of space and one of time. But whereas we spend our days passing freely through space in any direction we wish (gravity and solid obstacles permitting), time pushes us along, willingly or not, in a single predetermined direction: toward the future.

This is the arrow of time—life carries us from the past, through the present, and into the future. Back to the Future plotlines notwithstanding, no one knows how to reverse the arrow—how to move backward in time—and the logical paradoxes that would result from such a trip into the past render it a thorny proposition at best. (Thanks to a prediction of special relativity called time dilation, travel into the distant future is relatively easy: just move really, really fast.)

In his new book, From Eternity to Here (Dutton, 2010), theoretical physicist Sean Carroll of the California Institute of Technology sets out to explain why time marches along unfailingly in one direction. Expanding on the concepts in his June 2008 feature for Scientific American, "The Cosmic Origins of Time's Arrow," Carroll argues for the necessity of marrying three seemingly disparate concepts: time, entropy and cosmology.

Entropy, which in rough terms is the measure of a system's disorder, creeps up over time, as dictated by the second law of thermodynamics. To illustrate entropy's inexorable growth, Carroll takes us to the breakfast table—you can't unscramble an egg, he points out, and you can't unstir the milk out of your coffee. These systems invariably proceed to disordered, or high-entropy, arrangements. Each of these examples shows how the continual growth of entropy fills the world with irreversible processes that divide the past from the future: The making of an omelet and the mixing of milk into a cup of coffee are events that work in only one temporal direction.
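Carroll's breakfast-table examples can be mimicked with a toy model (not from the book; the setup here is an illustrative assumption, akin to the Ehrenfest urn model): particles start all on one side of a box, random moves redistribute them, and the Boltzmann entropy S = ln W, where W counts the microstates consistent with the split, almost always climbs.

```python
import math
import random

def entropy(n_left, n_total):
    """Boltzmann entropy S = ln(W), where W is the number of ways
    to place n_left of n_total particles in the left half of the box."""
    return math.log(math.comb(n_total, n_left))

def simulate(n_total=100, steps=2000, seed=0):
    """Ehrenfest-style toy dynamics: each step, a randomly chosen
    particle hops to the other half of the box."""
    rng = random.Random(seed)
    n_left = n_total  # low-entropy start: every particle on the left
    history = [entropy(n_left, n_total)]
    for _ in range(steps):
        if rng.randrange(n_total) < n_left:
            n_left -= 1  # a left-side particle hops right
        else:
            n_left += 1  # a right-side particle hops left
        history.append(entropy(n_left, n_total))
    return history

history = simulate()
print(f"initial S = {history[0]:.2f}, final S = {history[-1]:.2f}")
```

The run starts at S = 0 (only one way to have everything on the left) and drifts toward the fifty-fifty split, where W is enormous; reversing the trend would require an astronomically unlikely string of moves, which is the statistical heart of the second law.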

But why should entropy always increase? This is where Carroll turns to cosmology, which must explain why the universe began in a uniquely low-entropy state. We spoke to the physicist about his new book and the challenges of presenting cutting-edge physics to a wide audience.

[An edited transcript of the interview follows.]

What's so interesting about time? To a naive observer it's something that just passes by and that we can't do anything with; it's unchanging.
There are two things that inspired me to write this book. One is that time is something we all are familiar with. We all use it—we have no problem reading a watch. But then, when you act like a good scientist or philosopher and try to make sense of it, this puzzle arises: The fundamental laws of physics treat the past and the future [as being] exactly the same, whereas the world does not. There's a big difference—the past happened and the future is still up for grabs. So it would be nice to know how to reconcile that. That's the arrow of time problem as it's been thought about for at least a couple hundred years now.

I think that's an important and interesting problem, and it's just as good to write about as anything else. But there is something that I think makes this problem a little bit special, which is that the answer to why the past is different from the future, whatever it is going to end up being, is not just about what happens here as you and I are talking, as time goes by in our daily lives. It is intimately connected with the whole universe—with what happened at the big bang, with the special condition in our universe when it started.

A full understanding of what happens in our everyday lives needs to take into account what happened at the big bang. And not only is that intrinsically interesting and just kind of cool to think about, but it's also a mystery that is not given much attention by working scientists; it's a little bit underappreciated. We are so far from knowing what the final answer to this is that we sort of don't think about it that much. So I wanted to draw attention to this connection between the arrow of time and cosmology, both to everyday readers but also to my scientific friends. I think this is something that we really should keep in mind as one of the fundamental puzzles facing us in modern science.

As an everyday reader, I appreciated the introductory quotes to the chapters from Annie Hall, Vladimir Nabokov, Dumb and Dumber. How much of a challenge was it to try to keep this book accessible and enjoyable?
I tried my best, and I think I succeeded in some places more than others. A lot of the material was not exactly what I do research on, so I had to learn a lot and sort of think about things I had been vaguely aware of for a while. I actually think that I was better at making those sections lively and interesting and accessible than the sections that I understood the best. Because I knew I had to sit down and think very, very hard about it; I couldn't just give my conventional spiel.

The good news is that, except for a few things about quantum mechanics and the multiverse, most of the basic ideas are pretty graspable. They're not dramatically abstract; we're not working in higher dimensions or anything like that. You can see the basic ideas we're talking about working themselves out in everyday life.

I'm a big believer that science is part of a larger cultural thing. Science is not all by itself. So I definitely wanted to give the feeling that as we're thinking about the universe and space and time and experience and memory and free will and all these things that I talk about in the book, this is both science and our everyday lives and the culture in which we live, so why not sort of have fun and bring them together?

Going back to the science for a moment, how does the concept of entropy intertwine with the arrow of time?
Well, I think people have probably heard the word entropy. It goes up; that's the second law of thermodynamics. There's this famous—at least famous among scientists—episode with [English novelist and physicist] C. P. Snow, where he was trying to convince people that they should not only be literate about literature but literate about science, as well. And the one example Snow chose as something everyone should know was the second law of thermodynamics, the law that says that entropy increases.

And that's true, and I believe that's a great example, but what I think is actually underappreciated is that just about everything about the arrow of time—what we would think of as "how time works," the fact that the past is set in stone while the future can still be altered—is all because of entropy. The fact that you can remember yesterday but not tomorrow is because of entropy. The fact that you're always born young and then you grow older, and not the other way around like Benjamin Button—it's all because of entropy. So I think that entropy is underappreciated as something that has a crucial role in how we go through life.

You have a sort of candid moment in the book where you talk about a veteran, unnamed physicist who had some complaints with your theories about time and the second law.
His objection was with the idea that cosmology has something to do with it.

The following statement is very true: To understand the second law of thermodynamics, or how the arrow of time works in our everyday lives, we don't need to ever talk about cosmology. If you pick up a textbook on statistical mechanics, there will be no talk about cosmology at all. So it would be incorrect to say that we need to understand the big bang in order to use the second law of thermodynamics, to know how it works. The problem is, to understand why it exists at all requires a knowledge of cosmology and what happened at the big bang.

Once you assume that the universe had a low entropy for whatever reason, everything else follows, and that's all we ever talk about in textbooks. But we're being a little bit more ambitious than that. We want to understand why it was that way—why was it that the entropy was lower yesterday than it is today?

To understand why the entropy was lower yesterday really requires cosmology. And I think that if you sit down and think about it carefully there is absolutely no question that that is true, yet a lot of people don't quite accept it yet.

If you take this approach and look at time from a cosmological standpoint, what is this low-entropy condition in the past? What does that look like?
We're not learning anything about the early universe by making this observation. We already know what the early universe was like. It was smooth, it was expanding very rapidly, it was a dense, hot state, and there was a lot of stuff in the universe. Now, that happens to be a very low-entropy configuration that the universe could be in, and that is the puzzle. So it's not that we're learning what the early universe was like, because we already knew that—it's that in trying to explain that, in trying to come up with a theory, whether it's inflation or the cyclic universe or a big bounce, you haven't succeeded in explaining the early universe unless you've explained why it has low entropy. And I just think a tremendous number of contemporary cosmological theories fail at that requirement; they sort of sidestep their way around that question rather than addressing it head-on.

Do these various theories make predictions that we could test based on our understanding of time and entropy?
Not yet. We'd like them to. All I can say is hopefully they will. I talk about that in the epilogue of the book.

On the one hand, if these ideas don't connect with observed things, then there's no use talking about them. But that's not the same as saying that because we can't connect them right now to observable predictions, there's no use talking about them. It's part of a much bigger picture—we have to understand how quantum mechanics and gravity play together long before we can ever hope to say definitively what the right answer is to these questions.