A B S T R A C T S
International Workshop
The Conceptual Foundations of Statistical Mechanics
May 22 - 25, 2000
The Limits of Information
Jacob D. Bekenstein
Information is a central element in the world around us. What would that world be without DNA, books, inscriptions, computers, the internet, ...? Quantification of the amount of information (if not its value) in a physical system has been accomplished via the concept of entropy. The controversy over whether information entropy and thermodynamic or statistical-mechanical entropy are the same can be settled usefully. This, however, gives rise to a new question: can one place objective bounds on the information that can be held in a specific physical system? In the last two decades this has gone from a mere wish to a reality, through conceptual developments in black hole physics (the formulation of black hole thermodynamics) and particle physics (the formulation of the holographic principle). I shall review 't Hooft's 1993 holographic bound, the most famous statement on this matter, and discuss its relation to the much earlier, and more efficient, universal entropy bound. Very recent developments in this subject (a bound on the information of a large chunk of the universe) will also be mentioned.
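For orientation, the two bounds referred to here are standardly written as

$$ S \;\le\; \frac{2\pi k_B E R}{\hbar c} \quad \text{(universal entropy bound)}, \qquad S \;\le\; \frac{k_B A}{4\,\ell_P^{\,2}} \quad \text{(holographic bound)}, $$

where $E$ is the total energy of the system, $R$ the radius of a sphere enclosing it, $A$ the area of that bounding surface, and $\ell_P$ the Planck length.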
Comparing the Approach to Equilibrium in Thermodynamics and Statistical Mechanics
Harvey R. Brown
It is argued that it is wrong to think of the second law of thermodynamics as "nature's way of driving systems towards equilibrium". In thermodynamics the tendency of systems irreversibly to approach equilibrium is logically prior even to the zeroth law. Its treatment in statistical mechanics is in principle more ambitious. (This talk will be based on work done with J. Uffink.)
Against Coarse Graining
Katinka Ridderbos
The subject of this paper is the coarse graining approach to statistical mechanics, which is an attempt to get a grip on the problem that, by Liouville's theorem, an arbitrary non-equilibrium distribution cannot evolve into a fine-grained equilibrium distribution. The aim of this paper is twofold. First, I will address the criticism that the coarse graining approach introduces an element of subjectivity into statistical mechanics. I will argue that the scope of this objection is limited. Second, I will argue that criticism should instead be leveled at the coarse graining approach because it disregards the dynamical constraints on the system. In doing so the coarse graining approach undermines the very task statistical mechanics has set out to accomplish, viz. to provide a microdynamical underpinning of thermodynamics.
As to the first point, I will argue that the absence in thermodynamics of a counterpart of the coarse graining method's subjectivity does not necessarily imply that the method of coarse graining is in conflict with thermodynamics, for the following two reasons. (i) The source of the subjectivity associated with the coarse graining approach lies in the grouping together of microstates into coarse-grained cells on the basis of their indistinguishability with respect to quantities that do not figure in the thermodynamic description of the system and that, from the purely thermodynamic point of view, add an irrelevant layer of detail to the system's description.
(ii) The subjectivity associated with the coarse graining approach does not infect all steps of the coarse graining strategy, since the observer's measurement resolution only affects the perception of the instantaneous ensemble distribution. That is, the identification by an observer of the coarse-grained equilibrium distribution depends on the observer's measurement resolution, and can thus be said to be subjective.
But the time evolution of an arbitrary coarse-grained ensemble distribution is uniquely determined by the underlying dynamics. Whether or not a particular non-uniform distribution evolves to the coarse-grained equilibrium distribution is therefore independent of the observer's measurement resolution.
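To make the contrast concrete, the following minimal numerical sketch (an illustration of the general point, with Arnold's cat map standing in for the measure-preserving underlying dynamics) evolves an ensemble of points that starts concentrated in one small region. The invertible, area-preserving map only rearranges the fine-grained distribution, yet the coarse-grained entropy computed on a fixed grid of cells rises toward its equilibrium value; the grid size plays the role of the observer's measurement resolution.

```python
import numpy as np

def cat_map(x, y):
    """Arnold's cat map on the unit torus: invertible, area-preserving, mixing."""
    return (x + y) % 1.0, (x + 2.0 * y) % 1.0

def coarse_grained_entropy(x, y, cells=20):
    """Shannon entropy of the occupation frequencies of a cells x cells grid."""
    hist, _, _ = np.histogram2d(x, y, bins=cells, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
# Start far from (coarse-grained) equilibrium: all points inside one small cell.
x = rng.uniform(0.0, 0.05, 100_000)
y = rng.uniform(0.0, 0.05, 100_000)

for step in range(8):
    print(step, round(coarse_grained_entropy(x, y), 3))
    x, y = cat_map(x, y)

# The coarse-grained entropy rises toward log(20**2) ~ 5.99, while the
# fine-grained (Liouville) entropy of the exact ensemble is unchanged,
# since the map merely rearranges the distribution without compressing it.
```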
As to the second point, the coarse graining method treats empirically indistinguishable distributions as identical. I will argue that this reliance on appearances can only be carried through in a consistent manner when the empirical evidence taken into consideration is restricted to instantaneous observations. For in spin-echo type experiments observable differences can be shown to exist in the time evolution of two types of distributions (so-called 'true equilibrium distributions' and 'quasi-equilibrium distributions') that are indistinguishable with respect to instantaneous observations.
The analysis of the coarse-grained account of the spin-echo types of experiment leads to a further objection, which is centered on the notion of a coarse-grained entropy. I will argue that the required increase in the coarse-grained entropy is obtained by treating dynamically inaccessible microstates as actually accessible to the system. This means the dynamical constraints on the system are disregarded, an act that undermines the project of providing a microdynamical underpinning of thermodynamics.
The Idea of Entropy and the Possibility of Maxwellian Demons
David Z. Albert
The arguments against the possibility of Maxwellian Demons (both in the classical and in the quantum-mechanical context) are refuted by means of an explicit counter-example; and the confusion at the heart of those arguments is diagnosed.
Fluctuations, Correlations and Demon Communications
Itamar Pitowsky
We consider a gas at equilibrium whose state is represented by a probability distribution in phase space. In a previous paper (Pitowsky, I. and Shoresh, N., Found. of Phys. 26 (1996), 1231-1242) we proved that the distribution is Maxwell-Boltzmann (in other words, the gas is ideal) if and only if it satisfies a locality condition: the correlations among fluctuations at remote small regions in physical space diminish in the thermodynamic limit.
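A toy version of the kind of locality condition at stake (independent, uniformly distributed particles rather than the full setting treated in the paper): if $N$ such particles occupy a container and $n_A$, $n_B$ count those found in two small disjoint regions taking up volume fractions $p_A$, $p_B$, the counts are multinomially distributed, so

$$ \mathrm{Cov}(n_A, n_B) = -N\, p_A\, p_B, \qquad \rho(n_A, n_B) = -\sqrt{\frac{p_A\, p_B}{(1-p_A)(1-p_B)}}, $$

and the correlation between the local fluctuations vanishes as the regions become small relative to the whole.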
In the present paper I generalize this observation to distributions whose correlations among local fluctuations do not diminish in the thermodynamic limit. I derive a formula that expresses the distribution in terms of those correlations. I also show how “local observers” capable of measuring only local fluctuations (Maxwell’s Demons) can determine the correlations (and thus the entire distribution). They communicate with each other without the investment of external energy, exploiting the correlations between the fluctuations. Finally, I use Liouville’s theorem to connect the abstract discussion to the physical Gibbs distribution.
Exorcist XIV: the Wrath of Maxwell's Demon
John D. Norton
In the 1860s James Clerk Maxwell conceived a "very observant and neat-fingered" being, able to operate a door between two chambers containing a kinetic gas. He opens and shuts the door with such facility that the faster gas molecules accumulate in one chamber and the slower in the other. The overall result is that one chamber becomes hotter and the other cooler without expenditure of work, in violation of the Second Law of Thermodynamics.
Maxwell thought the moral of the Demon was merely that the Second Law holds only with overwhelming probability. In the decades following Maxwell's original analysis, the Demon was transmogrified from a benevolent agent assisting us in circumscribing the domain of validity of the Second Law to a malevolent threat to the law that had to be exorcised.
Following the work of Szilard in the 1920s, the dominant viewpoint has become that information theory provides such an exorcism and that no exorcism can succeed without it. So Bennett reported in 1987 that "The correct answer--the real reason Maxwell's demon cannot violate the second law--has been uncovered only recently. It is the unexpected result of a very different line of research: research on the energy requirements of computers."
I shall argue that information theoretic exorcisms of Maxwell's Demon provide largely illusory benefits, whether they follow Szilard in seeking compensating entropy increases in information acquisition or Landauer in seeking compensating entropy increases in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law of Thermodynamics, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon. My talk reports joint work with John Earman.
Maxwell's Demon and the Thermodynamics of Computation
Jeffrey Bub
The current orthodox position on how the Second Law of Thermodynamics is saved from Maxwell's Demon is that there is a minimal entropy cost to erasing information (Landauer's principle), and that erasure is an unavoidable stage in the Demon's information processing. The prior view located the entropy cost in the Demon's acquisition of information (Szilard's principle). In a recent paper ('Exorcist XIV: The Wrath of Maxwell's Demon. Parts I and II,' Studies in History and Philosophy of Modern Physics 29, 435-471 (1998); 30, 1-40 (1999)), Earman and Norton argue that these exorcisms of the Demon are misplaced. In particular, they take issue with the claim by Bennett and others that Szilard's principle is superseded by Landauer's principle as the correct explanation for the failure of the Demon to violate the Second Law. Either the Demon is a thermodynamic system governed by the Second Law, or not. In the first case, no supposition about the entropy cost of information acquisition and processing is needed to save the Second Law; in the second case, no such supposition can save the Second Law. Are Earman and Norton right, or is Bennett right?
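The quantitative content of the two principles at issue is standardly stated as follows: erasing one bit of information in an environment at temperature $T$ dissipates at least $k_B T \ln 2$ of heat, i.e.

$$ \Delta S_{\text{environment}} \;\ge\; k_B \ln 2 \ \text{per bit erased (Landauer)}, $$

while Szilard's one-molecule engine extracts at most $W = k_B T \ln 2$ of work per bit of information acquired, so that an entropy cost of at least $k_B \ln 2$ per bit of acquisition (Szilard) would equally balance the books.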
The Concept of Probability in the Many-worlds Interpretation of Quantum Mechanics
Lev Vaidman
The usual way of introducing the concept of probability in a deterministic theory is in the form of an IGNORANCE probability: the probability of a particular outcome of an experiment is defined for an observer who performs the experiment while being ignorant about some parameters of the experimental system. For example, this is the way probability is defined in the framework of the Bohmian interpretation of quantum mechanics, in which the observer cannot know the "Bohmian positions" which govern the results of experiments with various possible outcomes. The many-worlds interpretation is a deterministic theory, but it has no parameters in the experimental system about which the observer performing the experiment is ignorant. Moreover, how can the observer be ignorant about the result of the experiment when she knows that ALL possible results will be obtained? A suggestion to resolve this difficulty is proposed.
Objective Chance and Determinism
Barry Loewer
I review various interpretations of probability with an eye to the question of whether they can make sense of the probabilities that occur in deterministic theories, specifically statistical mechanics and Bohmian mechanics. I argue that David Lewis' "Humean" account looks like the best of the lot, though not without its difficulties.
The Interpretation of Theories: The Case of Statistical Mechanics
Lawrence Sklar
Why does statistical mechanics need an "interpretation"? What is it to interpret that theory? How is interpretation in the case of statistical mechanics like and unlike interpretation in the case of other foundational physical theories, such as classical dynamics and quantum mechanics?
Direction and Description
Yemima Ben-Menahem
This paper deals with the dependence of directionality in the course of events -- or our claims concerning such directionality -- on the modes of description we use in speaking of the events in question. I argue that criteria of similarity and individuation play a crucial role in assessments of directionality. This is an extension of Davidson's claim regarding the difference between causal and explanatory contexts.
I begin by characterizing necessity and contingency in terms of degree of sensitivity to initial conditions and interfering factors. Obviously, these notions of necessity and contingency differ from their counterparts in modal logic. More interestingly, they also differ -- and here I depart from common usage -- from the notions of causality and chance. Thus, there can be necessary connections in the proposed sense in a non-deterministic world, and, likewise, contingent connections between causally-related events.
Applying these insights to the foundational problems in statistical mechanics, I argue that it is conceivable that directionality hinges, in some cases, on the mode of representation, and as such is, to all intents and purposes, a priori once the representation is in place. This kind of directionality is perfectly compatible with both determinism and indeterminism at the microscopic level. There is no advantage to reducing such directionality to any other law or causal process. Further, this dependence on representation does not imply subjectivism, either in general, or with respect to probability in particular. At the same time, there is no reason to assume that directionality is always representation dependent; certain types of causal processes may also give rise to directionality. A pluralistic conception of directionality, as opposed to a monolithic understanding, thus seems warranted.
In the last part of the paper, I illustrate some of my claims through an examination of the Gould-Dennett dispute over directionality and natural selection.
The Open System and Its Enemies
Craig Callender
This paper is a critique of the idea that appealing to the openness of systems can solve philosophical problems in the foundations of physics.
After a brief tour of failed attempts in other areas of physics, e.g., relativity and quantum mechanics, I turn to statistical mechanics, in particular, to 'interventionism.' Interventionism holds that random, uncontrollable environmental influences ultimately explain why systems approach equilibrium. By Liouville's theorem, the Gibbsian fine-grained entropy of a closed system remains constant in time. Interventionists solve this problem by pointing out that closed systems are really only an idealisation, for systems can never be shielded in practice from all manner of environmental perturbation. When one recognises that the system is really an open one, interventionists argue, one ought to modify the relevant Hamiltonian in such a way as to make the 'paradox' of the conservation of the fine-grained entropy disappear. Open systems of course have their legitimate role to play in physics, but I don't think that appealing to openness itself can solve any philosophical problem.
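The constancy being appealed to here is the standard one: for a closed system under Hamiltonian flow, Liouville's theorem gives $d\rho/dt = 0$ along trajectories, so the fine-grained Gibbs entropy

$$ S_G[\rho] \;=\; -k_B \int \rho \,\ln \rho \; d\Gamma $$

is a constant of the motion, because the flow merely rearranges the values of $\rho$ over phase space without changing the measure they occupy.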
The typical dialectic regarding interventionism goes as follows. One objects to the claim that the approach to equilibrium of a system S can only be explained by appealing to the open system consisting of S plus its relevant environment E, by pointing out that S + E is a closed system whose fine-grained entropy remains constant. Interventionists then respond by pointing out that S + E itself has an environment E'. The argument continues indefinitely unless the universe is mercifully assumed closed, in which case the debate turns to whether the universe as a whole has an increasing entropy.
The starting point of my paper is that the above argument is not one anyone will win. There are better ways to criticise interventionism, but these involve, first and foremost, clearly distinguishing the various questions that interventionism may be an answer to; second, looking at the physics and toy models; and third, making different philosophically based objections.
Some Philosophical Remarks on Interventionism
Orly Shenker
Interventionism is often mentioned as a possible framework for understanding Statistical Mechanics, only to be quickly dismissed as a non-starter or as explanatorily irrelevant. This talk presents Interventionism as expressing a philosophical approach towards the aim and role of science, in addition to its being a specific claim of physics. This opens a way to see some features and consequences of Interventionism, normally taken to be shortcomings, as philosophically advantageous.
Quantum Decoherence and the Second Law
Meir Hemmo
Albert has recently proposed a way to recover the second law of thermodynamics, in particular the approach to equilibrium, by appealing to the quantum mechanical theory of the collapse of the wavefunction of Ghirardi, Rimini and Weber. Relying on the theory of decoherence of open systems, I show that Albert's proposal can be naturally generalised to no-collapse interpretations of quantum mechanics.
The Use and Abuse of Initial Randomness
Huw Price
Since the H-Theorem, many arguments aiming to show that entropy does not decrease in the future have relied on an assumption to the effect that the initial microscopic distribution of relevant particle motions is 'random', 'uncorrelated', or 'independent'. I examine the logical structure of such arguments, assuming initially a T-symmetric deterministic dynamics; and compare it to the structure of the time-reversed case, which relates the existence of correlations in the future to the level of entropy in the past. I argue that such appeals to initial randomness fail by ordinary scientific standards for explanation and justification, at least in this sense: no such appeal can provide an epistemically well-grounded objection to the hypothesis that entropy decreases in the (sufficiently distant) future. I argue that the same holds if we relax the dynamical assumptions of determinism and T-symmetry.
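For concreteness, the assumption in question is Boltzmann's molecular chaos hypothesis (Stosszahlansatz): the velocity distribution of pairs of particles about to collide factorizes, $f_2(\mathbf{v}_1, \mathbf{v}_2) = f(\mathbf{v}_1)\, f(\mathbf{v}_2)$. Under this assumption the Boltzmann equation yields

$$ \frac{dH}{dt} \;\le\; 0, \qquad H(t) = \int f(\mathbf{v}, t)\, \ln f(\mathbf{v}, t)\; d^3v, $$

so $-k_B H$ behaves like an entropy that cannot decrease toward the future, while imposing the analogous factorization on pairs that have just collided yields the opposite, time-reversed monotonicity.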
An Intrinsic Time Arrow Independent of Initial Conditions?
Avshalom C. Elitzur
Shahar Dolev*
In contrast to the time-symmetry of physical law, the universe is markedly time-asymmetric. Most physicists ascribe this asymmetry to the universe’s boundary conditions (the latter supposedly lying beyond scientific explanation), while a minority suspects that all microscopic interactions conceal a time-asymmetric ingredient, still unknown to physics.
Apparently unrelated to this controversy, another old question in physics is whether our world is deterministic. Here opinions are more evenly divided. About half of the prevailing interpretations of quantum mechanics invoke some hidden deterministic variables, whereas the other half hold that quantum mechanics involves true indeterminism. Other indeterministic theories argue that fundamental randomness is required by black-hole thermodynamics, general relativity and even classical physics.
Strangely, the two controversies – intrinsic vs. epiphenomenal time-asymmetry and determinism vs. indeterminism – have gone on nearly oblivious to one another. This dissociation is unfortunate, for the former issue bears, directly and strongly, on the latter.
We prove the following statement: In any closed system, the slightest failure of determinism must lead to the emergence of a time arrow that accords with that of the entire universe, regardless of the system’s boundary conditions and no matter how shielded the system is from the universe. Hence, all theories that invoke indeterminism entail the highly unorthodox claim that time’s arrow is intrinsic rather than a consequence of boundary conditions.
*Shahar Dolev, The Cohn Institute for the History and Philosophy of Science and Ideas, Tel Aviv University.
Teeth and Arrows: The Second Law of Thermodynamics and the Concept of Irreversibility
Jos Uffink
I will try to disentangle various meanings of the concept of (ir)reversibility, in particular to distinguish between time asymmetry and irrecoverability.
A common conception is that the second law of thermodynamics is relevant to both aspects of the term. However, I will show that a satisfactory formulation of the second law is possible without explicit recourse to either aspect of irreversibility.
Computational Complexity in Statistical Physics
Cristopher Moore
Many notions and measures of "complexity" have been suggested for physical systems. One of the most well-defined, and most practical, is the following: how much computation does it take to predict a physical system?
Some systems have special properties that allow us to parallelize their behavior, and predict their final state without going through their entire history. Other systems are computationally complex, e.g. P-complete, suggesting that explicit simulation is necessary.
In this talk I will discuss several systems, including sandpiles, Ising models, lattice gases, cellular automata, and diffusion-limited aggregation. I will present both recent results on the computational complexity of these systems, including work by myself, by Jonathan Machta of the University of Massachusetts, and by others, and some open problems. I will also talk about how "cultural differences" between computer science and physics need to be bridged to make these tools more relevant to real physical systems -- for instance, to address the average case rather than the worst case.
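As a small illustration of what "explicit simulation" means for one of the systems mentioned, here is a minimal (and deliberately unoptimized) relaxation routine for the Abelian sandpile model on an n x n grid; predicting the outcome of such avalanche dynamics is one of the prediction problems whose complexity is studied in this line of work. The code is a generic sketch, not taken from the talk.

```python
import numpy as np

def relax(grid):
    """Topple every site holding >= 4 grains until the sandpile is stable.

    Each toppling removes 4 grains from a site and adds one to each of its
    four neighbours; grains pushed off the edge are lost. By the Abelian
    property, the final stable configuration does not depend on the order
    in which sites are toppled.
    """
    grid = grid.copy()
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return grid
        for i, j in unstable:
            grid[i, j] -= 4
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]:
                    grid[ni, nj] += 1

# Example: drop a pile of grains at the centre and let it relax.
n = 31
pile = np.zeros((n, n), dtype=int)
pile[n // 2, n // 2] = 1024
stable = relax(pile)
print(stable.max())  # every site ends with at most 3 grains
```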
Ergodic Theory, Interpretations of Probability and the Foundations of Statistical Mechanics
Janneke van Lith
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamical observables and microcanonical probabilities. On the one hand, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time averages (albeit for a special class of systems, and up to a measure zero set of exceptions). On the other hand, it is argued that actual measurements of thermodynamic quantities in fact yield time-averaged quantities, since measurements take a long time.
The combination of these two points is held to be an explanation why calculating microcanonical phase averages is a successful recipe for predicting the values of thermodynamical observables. It is also well-known (at least to philosophers of science) that this account is problematic.
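The theorem invoked in the first step is Birkhoff's: for a flow $\Phi_t$ on phase space that preserves the microcanonical measure $\mu$ and is ergodic with respect to it,

$$ \lim_{T\to\infty} \frac{1}{T}\int_0^T f\big(\Phi_t(x)\big)\, dt \;=\; \int f \, d\mu $$

for every $\mu$-integrable $f$ and $\mu$-almost every initial point $x$; the measure zero set of exceptions mentioned above consists of exactly those initial conditions for which the time average fails to equal the phase average.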
There are also other uses of ergodic theory in the foundations of statistical mechanics; I will distinguish three of them. The purpose of my talk is to review them and to examine closely their foundational role and particularly the relevance of specific interpretations of probability.
***