Consciousness summary:
As we have seen, the central problem in consciousness is that there seems to be some sort of disconnect between a physical description of a mental state and the actual mental state. That is, even if we are able to map out all the biochemistry and neuro-physical pathways that accompany a particular mental state, all we have at the end of the day is a bunch of physical stuff. We cannot look at the neurons and biochemistry and see something like a desire to eat ice cream or feelings of grief. Desires and griefs are not physical things. They are phenomenal states of an organism. They are different from things like itches and knee jerks or cement. Hence, we should not be surprised that a complete physical description of phenomenal states still leaves us wanting as to what the phenomenal state is.
In this sense, the consciousness problem is not at all like the engine-combustion or the stomach-digestion problem. We can look in an engine and see fuel being ignited by a spark, the compression of pistons, and so on. Similarly, we can look in a stomach and see chemicals breaking down food into tiny bits and pieces. All of these processes involve only physical states. There is nothing phenomenal about either of these problems. None of the language needed to describe engine-combustion or stomach-digestion requires anything other than a purely physical stance. If you think the mind-body problem is equivalent to the stomach-digestion problem, then you should think again. Problems which involve phenomenal states of an organism (desires, wants, beliefs) are problematic on a reductionist account because what is being reduced is not physical. Colour is a physical phenomenon. But the belief that an umbrella will keep one dry is not a physical thing. If I were to look into the head of someone who believed this, I might see a particular set of neurons acting, but the neurons acting is not the belief.
Precisely what the relationship is between phenomenal and physical states is not known at present. The most that can be concluded from the various reductive schemes is that whenever an organism is in physical state P, the organism is also in mental state M. To say that a mental state is a physical state is problematic because one is equating mental and physical predicates. It is clear how to use the word ``is'' when the two things being equated are of the same kind. Sand is a grainy substance. Desire is a ... One cannot say that desire is cement or that desire is a knee jerk. This makes no sense. We can equate desire with some other mental predicate, but this does not help us understand what the physicalists have in mind. The burden of proof falls on the physicalists to tell us precisely how mental and physical predicates can be equated, not just how mental states arise from physical states. This they have not done. In quantum mechanics, invoking consciousness to explain the collapse of the wave function involves all the problems of explaining what consciousness is as well as accounting for the causal connection between the two. It is a doubly difficult problem.
Concluding Perspectives on measurement:
The central problem of the Folk approach is that it does not specify at what point the linear wave equation ceases to apply, and thus it is incomplete in the sense that it does not fully describe whether interference effects would be found in hypothetical experiments with large-scale quantum coherence.
The Copenhagen approach avoids that problem by saying that the wave function is a non-existent entity to which the linear wave equation applies exactly, in between experiences, which are real. (A friend in junior high used to refer to the unicorn as "a mythical beast found only in Africa.") The problem here is that "experience" is elevated to a central position in the physical working of the universe: it delimits the applicability of the wave equation. However, "experience" is an extremely fuzzy concept, and appears to play an ephemeral role in a universe whose physical behavior seems to be consistent over broad expanses of time and space.
The Bohm hidden-variables account escapes Bell's argument only by being explicitly non-local; what Bell showed is that no local hidden-variables account can reproduce the quantum correlations.
The "macro-realist" approaches predict that the wave function really does collapse (following a non-linear equation), under circumstances which depend on physical parameters. The theories are not yet full developed, and invoke non-QM random fields.
The standard Many Worlds picture contains only the wave function obeying the linear wave equation. It doesn't explain why the universe is found in a condition in which "measurement" occurs, but it is consistent with that description. Unfortunately, it gives the wrong probabilities for experimental outcomes.
Notice that the Many-Worlds pictures approach the experience/reality question in a way opposite to the Copenhagen interpretation. For Copenhagen, experience is taken to be the central theme, even at the cost of making the theory anthropocentric. People sound central to the process. For Many Worlds, the math is taken to be central, with the requirement that experience be correctly predicted. People are so radically peripheral to the process that most aspects of reality remain completely hidden from any individual experience.
Conclusion: The world has not been kind to local realism. The violations of local realism are just what QM predicts. However, special relativity and all the key limitations on time-travel have survived intact, no matter how beat-up our intuitions may be.
It is evident that the current state of the interpretation of QM (centered around the measurement problem) is unsatisfactory if one wishes to maintain objective reality. Whether or not some version of quantum mechanics recaptures the intuitive appeal of objective reality is unknown.
Quantum cryptography
(Irrelevant, but interesting.) One can use QM correlations and the collapse of the wave function to devise perfectly encrypted communication.
A perfect code is one in which the encryption scheme changes randomly from one character to the next. Then a spy can learn nothing at all about future messages, no matter what she knows about past ones. If the sender and recipient share a table of random numbers, they can use it to ensure privacy. This "one-time pad" is the method used by governments to send secure communications overseas. However, it requires advance (insecure) transmission of the table.
The EPR apparatus can avoid this insecurity. Consider a sequence of correlated photon pairs. Each observer (sender and receiver) sets his polarizer along the x-axis. This need not be kept secret, so they can coordinate their activities on a non-secure communication channel. This requires long-distance QM correlations. Only about 400 m has been achieved so far.
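To make the "perfect code" idea concrete, here is a minimal sketch in Python of the classical one-time pad described above, assuming the two parties already share a table of random bytes (the very thing the EPR correlations are supposed to let them generate without an insecure advance shipment). The function names and the use of byte-wise XOR are illustrative choices, not part of any particular quantum protocol.

import secrets

def make_pad(length):
    """Generate a shared table of random bytes (the 'one-time pad')."""
    return secrets.token_bytes(length)

def encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR each message byte with the corresponding pad byte.

    Because every pad byte is independent and uniformly random, the
    ciphertext by itself reveals nothing about the message to an
    eavesdropper who does not hold the pad.
    """
    assert len(pad) >= len(message), "pad must be at least as long as the message"
    return bytes(m ^ p for m, p in zip(message, pad))

# Decryption is the same XOR operation applied a second time.
decrypt = encrypt

if __name__ == "__main__":
    pad = make_pad(32)                   # shared in advance (or, in principle, via EPR pairs)
    ciphertext = encrypt(b"attack at dawn", pad)
    print(decrypt(ciphertext, pad))      # b'attack at dawn'

The security rests entirely on the pad being random, secret, and never reused, which is why distributing it safely is the whole problem.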
Probability, entropy, and irreversibility
Thermodynamics
In the broadest sense, thermodynamics is concerned with the properties of matter in so far as they are influenced by changes in temperature. This definition can be sharpened. Classical thermodynamics limits itself to a macroscopic study of the properties of matter as a function of temperature: an atomic-level description is not part of classical thermodynamics. Specific macroscopic observables of interest are the pressure, volume, internal energy, and the entropy, for example. For our future study in quantum mechanics and cosmology, we will need the concept of entropy.
The simplest way to introduce the concept of entropy is to reflect for a moment that in the natural world there seems to be a natural direction in which spontaneous events occur. For example, a car in neutral will run downhill. One in neutral at the bottom of the hill will not spontaneously move uphill; it will remain there essentially forever. Although there is a finite probability that it might all of a sudden run uphill, such an event is overwhelmingly unlikely. Also, heat flows from a hotter object to a colder object, not the other way. Engines cannot operate in a cycle without giving off exhaust; that is, the efficiency of an engine cannot be unity. All of these processes are governed by a fundamental thermodynamic principle: the maximization of the entropy by all spontaneous changes in a closed thermodynamic system. The second law of thermodynamics states that all spontaneous changes in a closed system will necessarily increase the entropy.
Physically, entropy is a measure of the disorder in a system. An equivalent restatement of the second law is that physical processes run in a direction that decreases order, or in other words increases disorder. When leaves fall off a tree, they do not accumulate in a nice clump below the tree. Rather, they are dispersed all over the place. Entropy dictates that this be the case.
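As a quick numerical check of the claim that heat flows from hotter to colder, one can borrow the standard thermodynamic relation dS = Q/T for heat Q absorbed at temperature T (a relation not derived in these notes, but consistent with the statistical definition below): if 1 J of heat leaves a reservoir at 400 K and enters one at 300 K, the total entropy change is (1 J)/(300 K) - (1 J)/(400 K), which is about +8x10^-4 J/K and positive, so this is the direction the second law allows; the reverse flow would decrease the entropy and does not happen spontaneously.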
From Thermodynamics to Statistical Mechanics
Consider a container of a gas with a fixed volume. What happens if the container is doubled. If the volume of the gas doubles, the degree to which we know the precise location of each gas particle has decreased. That is, our information content of the container has decreased. Boltzmann was the first to conjecture that microscopically the way to understand the entropy of a given thermodynamic state is through the information content. Specifically, Boltzmann conjectured that the statistical route to the entropy is obtained by considering all the possible distinct ways of distributing the particles of a system. Let W be the number of distinct ways of distributing particles in a system. Boltzmann conjectured that the microscopic prescription for computing the entropy is: S=k_Bln(W), where k_B is Boltzmann's constant=natural gas constant/Avogadro's number. Boltzmann was way ahead of his time when he made this proposal. Faced with failing health and abject rejection from the scientific community, Boltzmann committed suicide. On his tomb lies the inscription: S=kln(W). This equation is the basis for modern statistical mechanics. We now turn to the underlying principles of probability that make its computation possible.
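As a hedged illustration of Boltzmann's formula, suppose (as a deliberately crude model, not Boltzmann's actual derivation) that doubling the volume gives each of the N independent particles twice as many places to be, so that W grows by a factor of 2^N. Then S = k_B ln(W) increases by N k_B ln 2. The short Python sketch below just evaluates this; the cell-counting picture and the variable names are my own simplifications.

import math

k_B = 1.380649e-23   # Boltzmann's constant, in J/K

def entropy_increase_on_doubling(N):
    """Entropy change for N independent particles when the volume doubles.

    If each particle has twice as many available positions, the number of
    arrangements W grows by a factor of 2**N, so
    Delta S = k_B * ln(2**N) = N * k_B * ln(2).
    """
    return N * k_B * math.log(2)

N_A = 6.02214076e23                        # Avogadro's number
print(entropy_increase_on_doubling(N_A))   # about 5.76 J/K for one mole

For one mole this gives R ln 2, about 5.76 J/K, which agrees with the classical thermodynamic result for the free expansion of an ideal gas into twice its volume.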
Before we do this, consider the problem of Maxwell's Demon
Consider "Maxwell’s demon," a hypothetical entity who performs impossible feats. For example, he stands at the door between two rooms and only lets molecules through one way. Notice that this process would reduce entropy, since there's more ways to place the molecules if they can go on either side than if they're confined to one side.
Then you get high pressure on one side, low pressure on the other. You could then use that pressure difference to drive a piston. Is this possible?
(Figure: the two rooms before and after the demon has sorted the molecules onto one side.)
Within classical physics there is no account of why this Maxwell demon procedure is impossible, although it obviously wouldn't be easy. Classically, this is IN PRINCIPLE not different from trapping all the billiard balls on one side of a table. So there's a bit of a paradox about classical thermodynamics.
That paradox will be removed by quantum mechanics. We won't worry about it yet.
Probability in classical physics:
In statistical mechanics, what you predict is the probability of various outcomes. In equilibrium, that probability is just proportional to the number of microscopic arrangements that give the outcome. That's why in equilibrium you end up with high ENTROPY. The most inevitable events (e.g. the expansion of a gas when a valve is opened) and the most unreasonable ones (the spontaneous contraction of some gas into a confined bottle) are assigned probabilities, which depend on the number of micro-arrangements in the FINAL state. (These probabilities come out a LOT different.)
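As an illustration of "probability proportional to the number of micro-arrangements," take a toy model (my own choice, not anything forced on us by the notes above) of N molecules, each equally likely to be in the left or right half of a box. Counting arrangements with the binomial coefficient shows why a roughly even split is ordinary while spontaneous contraction to one side is absurdly improbable.

from math import comb

def prob_n_left(N, n):
    """Probability that exactly n of N molecules sit in the left half,
    treating each of the 2**N micro-arrangements as equally likely."""
    return comb(N, n) / 2 ** N

N = 100
print(prob_n_left(N, N // 2))  # ~0.08: an even 50/50 split is a typical outcome
print(prob_n_left(N, N))       # ~8e-31: all 100 molecules on one side, essentially never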
What does probability mean?
That is, what do we mean when we say, "The probability of rolling a 3 with a 6-sided die is 1/6."
There are at least two standard types of answers, frequentist and subjectivist.
There are a few formal definitions that are necessary. You should read Sklar, pages 92-100. When we roll a die that has six sides, the probability that any particular side ends up face up is 1/(number of sides), i.e. 1/6. Now, what if we roll the die 12 times: will we find that each side shows up exactly twice? Probably not. The probability of 1/6 corresponds to a generalization we make from a large number of die tosses. We take the ratio of the number of times a particular side ends up facing up to the total number of tosses. This frequency is the idealized probability that we associate with rolling a fair die. Of course, for any finite sample size we choose, we might never see exactly 1/6 as the frequency associated with observing any given side of the die. The law of large numbers says that as the sample size grows, the observed frequency will approach 1/6.
Now conditional probability. What if we roll two dice? What is the probability of observing a sum of 8? An eight can occur five ways. What is the probability of observing a 4 if the sum on the dice is 8? The answer is 1/5, since of the five ways to make 8, only (4,4) involves a 4. This is a question of conditional probability. The conditional probability question is: what is the probability of an event A occurring given that event B occurred? The answer is obtained by taking the ratio of the number of ways events A and B can both occur to the number of ways event B can occur. In probability, the ratios are always of the form (events of interest)/(total events).
Sometimes probabilities multiply. What is the probability of rolling a 5 and then a 6? If these two events are independent (and we have no reason to believe they are not), then we simply take the product of the individual probabilities that a 5 and a 6 occur. Hence, the joint probability is 1/36.
These are the facts associated with probability. But we still have to answer the question: what do we really mean by probability? Since we can only really talk about the ideal probability of 1/6 for a die when the sample size is large, why is it that we say that when we roll a die a 6 will occur 1/6 of the time? No experiment we have ever performed will ever give exactly this result, unless we roll dice from now until eternity. The objectivist ignores this and says that ``probabilities are those generalized attributions of frequency and proportion that appear in the posits that play a fundamental role in the structure of generalizations''. Probability is the dispositional magnitude that a particular outcome is produced. The dispositional magnitude of flipping a head on a two-sided coin is 1/2.
What does an objectivist say about a random sequence? Consider the sequence HTHTHTHTHTHT. Is this sequence random? We could easily have generated this sequence by two print commands in a 3-line computer program. A simple rule of thumb is that no 3-line computer program can generate a random sequence. Believe it or not, this criterion is not too far from the truth: a sequence is viewed as random only if the shortest computer program that can generate it is itself sufficiently complicated, though stated this way the criterion is somewhat circular. The sequence with alternating H's and T's is not random because it fits a simple rule, namely H on odd flips and T on even flips. Since this rule works, the sequence is not random. Let us assume no such simple rule can be devised. Then a sequence is random if the limiting relative frequency in its subsequences is not too far off from the ideal frequency predicted by the law of large numbers.
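The dice numbers quoted above are easy to check by brute-force enumeration. The short sketch below (the helper names are mine) lists all 36 equally likely outcomes for two dice and counts the relevant cases.

from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

sum_is_8 = [o for o in outcomes if sum(o) == 8]
print(len(sum_is_8))                             # 5 ways to make a sum of 8

# Conditional probability: P(a 4 shows | the sum is 8)
with_a_4 = [o for o in sum_is_8 if 4 in o]
print(Fraction(len(with_a_4), len(sum_is_8)))    # 1/5, since only (4,4) qualifies

# Independent events multiply: P(first roll is 5 and second roll is 6)
both = [o for o in outcomes if o == (5, 6)]
print(Fraction(len(both), len(outcomes)))        # 1/36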
Subjectivist (Bayesian) probability
Probability is defined in terms of the speaker's degree of confidence in his statement. On this account, probability is not so much about the world as about our biases and our knowledge of a particular situation. A good example of subjectivist probability at work is the Monty Hall question. I will go over this in class. The crucial piece of information is knowing what is in Monty's head.
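Since the Monty Hall question will come up in class, here is a small simulation sketch (the function name and the always-switch comparison are my own framing). It encodes the crucial fact about "what is in Monty's head": he knows where the car is and always opens a losing door the contestant did not pick, which is exactly why switching wins about 2/3 of the time.

import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one game; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty, knowing where the car is, opens a door that is
    # neither the contestant's pick nor the car.
    monty_opens = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != monty_opens)
    return pick == car

trials = 100_000
print(sum(monty_hall_trial(True) for _ in range(trials)) / trials)   # ~2/3 when switching
print(sum(monty_hall_trial(False) for _ in range(trials)) / trials)  # ~1/3 when staying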
Sometimes there's a list of possible outcomes, each assumed to be equally likely until we learn otherwise. This is called the principle of indifference.
This definition is certainly flexible enough to cover all the cases we've mentioned. Is it too flexible?
Time