Decoding Reality
VLATKO VEDRAL
The most fundamental aspect of reality is information,
and not energy or matter
Prologue
 Postulating a supernatural being does not really help explain reality since then we only displace the question of the origins of reality to explaining the existence of the supernatural being.
 We create our reality through our understanding of the Universe and our reality is what is possible based on everything we know.
 Deduction without any principles is what John Wheeler called a "law without law". If we can explain laws of physics without invoking any a priori laws of physics, then we would be in a good position to explain everything. It is this view that is the common scientific take on "creation out of nothing", creation ex nihilo.
 Information is more fundamental than matter or energy because it can be successfully applied both to macroscopic interactions, such as economic and social phenomena, and to microscopic interactions, explaining the origin and behaviour of energy and matter themselves.
 Information, in contrast to matter and energy, is the only concept that we currently have that can explain its own origin.
 As we compress and find all-encompassing principles describing our reality, it is these principles that then indicate how much more information there is in our Universe to find.
 We compress information into laws from which we construct our reality, and this reality then tells us how to further compress information.
 Information reflects the degree of uncertainty in our knowledge of a system.
Part One
 Information has to decrease as probability increases, i.e. events with smaller probability carry more information.
 The formula for information must be a function such that the information of the product of two independent probabilities is the sum of the information contained in the individual events. The information content of an event is therefore proportional to the logarithm of the inverse of its probability of occurrence.
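 The two conditions above pin down the logarithm. As a minimal sketch in Python (my own illustration, not from the book), the "surprisal" log2(1/p) adds up over independent events exactly as required:

```python
import math

def surprisal(p):
    """Information content, in bits, of an event with probability p."""
    return math.log2(1 / p)

# Two independent events: the probability of both occurring is the
# product of their probabilities, and the information contents add.
p_a, p_b = 0.5, 0.25
joint_info = surprisal(p_a * p_b)              # information of both occurring
summed_info = surprisal(p_a) + surprisal(p_b)  # 1.0 + 2.0 bits
```

Here the rarer event (probability 0.25) carries two bits against one bit for the even chance, and the joint event carries their sum.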
 We only need the presence of two conditions to be able to talk about information. One is the existence of events (something needs to be happening), and two is being able to calculate the probabilities of events happening.
 The general principle that Shannon deduced is that the less likely messages need to be encoded into longer strings and more likely messages into shorter strings of bits.
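 Shannon's principle can be sketched in a few lines of Python. The message probabilities below are illustrative numbers of my own choosing, picked so the arithmetic comes out exactly:

```python
import math

# Hypothetical message probabilities (illustrative, not from the book).
probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# Shannon's prescription: a message with probability p should get a
# codeword of roughly log2(1/p) bits, so rarer messages get longer codes.
lengths = {m: math.ceil(math.log2(1 / p)) for m, p in probs.items()}

# With these probabilities the average code length exactly matches the
# entropy of the source, which is the theoretical minimum.
avg_length = sum(p * lengths[m] for m, p in probs.items())
entropy = -sum(p * math.log2(p) for p in probs.values())
```

The likely message 'a' gets a one-bit codeword while the unlikely 'c' and 'd' get three bits each, and on average no scheme can do better.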
 The basic unit of information is the bit, a digit whose value is either zero or one.
 Why did Nature choose digital rather than analogue (non-digital) encoding? There are two reasons in favour of digital encoding: one is the reduced energy overhead of processing information, and the other is the increased stability of information processing.
 Meaningful information necessarily emerges only as an interplay between random events and deterministic selection.
 Any self-replicating entity needs to have the following components: a universal constructing machine, M, a controller, C, a copier, X, and the set of instructions required to construct these three, I.
 The Second Law of thermodynamics tells us that in physical terms, a system reaches its death when it reaches its maximum disorder (i.e. it contains as much information as it can handle).
 Entropy is a quantity that measures the disorder of a system and can be applied to any situation in which there are multiple possibilities.
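 A small Python sketch (my own, not from the book) of how Shannon entropy tracks disorder across any set of possibilities:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome has zero entropy (perfect order); a uniform
# distribution over four outcomes has the maximum possible (2 bits);
# a biased coin sits somewhere in between.
certain = entropy([1.0])
uniform = entropy([0.25, 0.25, 0.25, 0.25])
biased = entropy([0.9, 0.1])
```

The more evenly spread the possibilities, the higher the entropy, which is why maximum disorder coincides with maximum information capacity.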
 The entropy of a closed system always increases.
 The First Law says that energy cannot be created out of nothing. It can only be transformed from one form to another.
 The Second Law tells us that when we convert one form of energy into another we cannot do this with perfect efficiency (i.e. the entropy, the degree of disorder in the process, has to increase).
 Life maintains itself on low entropy through increasing the entropy of its environment.
 A computer processes information as it runs, and any information processing must lead to the wasting of heat.
 When we "delete" information all we actually do is displace this unwanted information to the environment, i.e. we create disorder in the environment.
 Information, rather than being an abstract notion, is entirely a physical quantity. In this sense it is at least on an equal footing with work and energy.
 Information gain is very large when something unlikely happens.
 There is a general law in finance that in an efficient market there is no financial gain without risk. Anything worth doing must, according to this law, have a (significant) probability of failure associated with it. If something is a sure thing, you can bet that the reward is going to be negligible.
 In order to produce some useful work, you must be prepared to waste some heat – this is the Second Law of thermodynamics.
 The Third Law of thermodynamics prohibits us from reaching absolute zero.
 The more profitable life becomes the less profitable its environment.
 As the environment increases in entropy, this makes it more and more difficult for life to propagate.
 The increase of complexity of life with time is now seen to be a direct consequence of evolution: random mutations and natural selection.
 Mutual information is the formal word used to describe the situation when two (or more) events share information about one another.
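 Formally, the mutual information between two events X and Y is I(X;Y) = H(X) + H(Y) - H(X,Y): the information counted twice when the pair is described separately. A minimal Python sketch with a made-up joint distribution (my numbers, not the book's):

```python
import math

def H(probs):
    """Shannon entropy, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint distribution of two correlated binary events X and Y:
# they agree (0,0 or 1,1) far more often than they disagree.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y taken on their own.
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

# Mutual information: how much the two events know about each other.
# It is zero exactly when X and Y are independent.
mutual_info = H(px) + H(py) - H(joint.values())
```

For this distribution the mutual information is positive: observing X tells you something about Y, which is precisely the "shared information" the note describes.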
 Globalization is the increasing interconnectedness of disparate societies.
 Phase transitions occur in a system when the information shared between the individual constituents becomes large.
 A high degree of mutual information often leads to a fundamentally different behaviour, although the individual constituents are still the same. As a group they exhibit entirely different behaviour.
 In the initial state, which was completely disordered, there is very little mutual information.
 Mutual information is maximal in a maximally segregated society.
 Wealth doesn’t just add to wealth, it multiplies. Those that have more will get proportionally more, and so the gap between the haves and have-nots increases according to a power law.
 In a more interconnected society we are more susceptible to sudden changes. Mutual information simply increases very rapidly and if we want to make good decisions we need to ensure that our own information processing keeps pace.
Part Two
 With quantum theory the notion of a deterministic Universe fails, events always occur with probabilities regardless of how much information you have.
 At the heart of quantum physics is the concept of indeterminism. Indeterminism is linked to the fact that an object can indeed be in more than one state at any one time. This is also known as quantum superposition.
 Measurements affect and change the state of the system being measured: through measurement we force the system to adopt one of the many possible states that existed prior to it. If we need to know the exact value of some property of an object (e.g. spatial location, momentum, energy), then we have to destroy the quantumness to obtain it – otherwise we can leave the quantumness intact.
 The entropy of the whole system must (classically speaking) be at least as large as the entropy of any of its parts.
 The problem with Shannon’s information is that it always tells us that there is at least as much information in a whole as there is in any of its parts. This is not true for quantum systems.
 A qubit is a quantum system that, unlike a bit, can exist in any combination of the two states, zero and one.
 Quantum physics applies to all matter in the Universe. It’s just that, for everyday macroscopic objects, its predictions are much harder to distinguish from those of conventional physics.
 Two of the most important features of quantum theory are:
 Qubits can exist in a variety of different states at the same time.
 When we measure a qubit we reduce it to a classical result, i.e. we get a definitive outcome.
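 These two features can be caricatured in ordinary Python (a classical simulation of my own devising, not from the book): a qubit is a pair of amplitudes, and measurement turns it into a plain bit with probabilities given by the squared amplitudes:

```python
import math
import random

random.seed(1)  # fixed seed so the run is reproducible

# A qubit as two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# Here: an equal superposition of the states 0 and 1.
alpha = beta = 1 / math.sqrt(2)

def measure(alpha, beta):
    """Measurement yields a definitive classical bit:
    0 with probability |alpha|^2, 1 with probability |beta|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Before measurement the qubit is in both states at once; every
# measurement returns a plain 0 or 1, splitting roughly 50/50
# over many repetitions of the experiment.
outcomes = [measure(alpha, beta) for _ in range(10_000)]
fraction_of_ones = sum(outcomes) / len(outcomes)
```

This captures only the statistics of a single qubit; what it cannot capture is interference between the amplitudes, which is where genuine quantum behaviour departs from any classical simulation like this one.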
 Quantum cryptography is one of the areas where quantum physics has demonstrated a new order of information processing. This is not just a theoretical construct; it has been successfully implemented over vast distances.
 A computer, at its most basic level, is any object that can take instructions, and perform computations based on those instructions.
 Quantum physics helps with problems because unlike a conventional computer which checks each possibility one at a time, quantum physics allows us to check multiple possibilities simultaneously. The main limitation of quantum computation geared towards solving classical problems is that we ultimately have to make a measurement in order to extract the answer, given that the question we are asking requires a definite answer. It is an intrinsically probabilistic process and there is always a finite probability that our answer may be wrong. A far more serious inefficiency is the effect of environmental noise which is, in practice, very difficult to control.
 Thinking of computation as a process that maximizes mutual information between the output and the input (i.e. the question being asked), we can think of the speed of computation as the rate of establishing mutual information, i.e. the rate of build-up of correlations between the output and the input. The fact that qubits offer a higher degree of mutual information than is possible with bits directly translates into the quantum speed-up.
 We need large systems to be in many different states at the same time in order that they demonstrate quantum behaviour. But, the larger the system, the more ways there are for the particular information about the state to leak out into the environment. The more atoms there are in a superposition, the harder it is to stop one of them decohering to the environment. The solution is redundancy.
 The lower the overall entropy of an arbitrary physical system the higher the chances that its constituent atoms may be entangled.
 There is continuing evidence that more and more natural processes must be based on quantum principles in order to function as they do.
 Living beings are like thermodynamical engines. They must battle the natural tendency to increase disorder. Life does that by absorbing highly disordered energy coming from the Sun and converting it to a more ordered and useful form.
 Let us define free will as the capacity of persons to control their actions in a manner not imposed by previous events, i.e. as containing some element of randomness as well as some element of determinism. Free will lies somewhere between randomness and determinism, which sit at opposite extremes in reality. Neither pure randomness nor pure determinism would leave any room for free will.
 Every quantum event is fundamentally random, yet we find that large objects behave deterministically. Sometimes when we combine many random things, a more predictable outcome can emerge.
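 This emergence of predictability from randomness is just the law of large numbers, easily demonstrated in Python (my own illustration):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def average_of_flips(n):
    """Average of n fair coin flips (1 = heads, 0 = tails)."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# One flip is completely unpredictable, yet the average of a million
# flips is reliably close to 0.5: deterministic behaviour emerging
# from the aggregation of many random events.
many = average_of_flips(1_000_000)
```

Each individual flip stays random; only the aggregate behaves deterministically, which mirrors how large objects built from random quantum events behave predictably.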
 One of the most fundamental and defining features of quantum theory is that even when we have all information about a system, the outcome is still probabilistic.
 The quantity that tells us by how much orderly things can be compressed to shorter programs is known as Kolmogorov’s complexity.
 A theory is only genuine if there is a way of falsifying it.
 Kolmogorov’s view of randomness: when the rule is as complicated as the outcome it needs to produce, the outcome must be seen as complex or, in other words, random.
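 Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough upper bound on it, enough to illustrate the idea in Python (my own sketch, not from the book):

```python
import os
import zlib

# An "orderly" string produced by a very short rule, versus bytes with
# no rule shorter than the data itself.
orderly = b"01" * 5000        # 10,000 bytes generated by a tiny program
patternless = os.urandom(10_000)  # 10,000 random bytes

orderly_size = len(zlib.compress(orderly))
patternless_size = len(zlib.compress(patternless))
# The rule-governed string shrinks enormously; the random one barely at all.
```

The compressed size of the orderly string is essentially the length of its generating rule, a few dozen bytes, while the patternless string cannot be described by anything shorter than itself.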
Part Three
 The Second Law already tells us that the physical entropy in the Universe is always increasing. As such, the information content of the Universe can only ever increase.
 We currently view the information content of reality in terms of quantum physics and gravity – which are our shortest programs used to describe reality.
 Gravity is quite distinct from quantum theory. Gravity dominates large bodies (e.g. planets) and becomes less influential for microscopic bodies.
 The modern view of gravity, through Einstein’s general relativity, is to see it as a curvature of space and time.
 The higher the entropy of a system the more information it carries.
 Entropy is actually proportional to the total number of atoms on the surface of an object, not to its volume.
 Quantum mutual information is a form of supercorrelation between different objects and that this supercorrelation is fundamental to the difference between quantum and classical information processing.
 Quantum mutual information is not at all a property of the molecule, it can only be referenced as a joint property between the molecule and the rest of the Universe. It is proportional to the surface area of the molecule.
 The information content of anything does not reside in the object itself, but is a relational property of the object in connection with the rest of the Universe.
 The very act of partitioning, dividing, and pigeonholing necessarily increases information, as you cut through any parts that may be correlated.
 Optical holography shows that two dimensions are sufficient to store all information about three dimensions. When you look at a hologram, you see the standard two-dimensional image, but you are also seeing light reflected back to you at slightly different times, and this is what gives you the perception of a three-dimensional image.
 Einstein’s equation in general relativity describes the effect of energy-mass on the geometrical structure of four-dimensional spacetime. His equation says that matter tells spacetime how to curve, while spacetime instructs matter how to move.
 In thermodynamics, the entropy of a system multiplied by its temperature has the units of energy; for a reversible process, the heat exchanged equals the temperature times the change in entropy.
 All quantum information is ultimately context dependent.
 Reality is created through your observations and is therefore not independent of us.
 Science is constructed more in a way that it tells us what the Universe is "not like" rather than what it is like.
 The laws of physics are the compression of reality which, when run on a universal quantum computer, produce reality.
 The anthropic principle states that the laws of the Universe are the way they are, because if they were different, we would not be here to talk about them.
 It is tempting to say that things and events have no meaning in themselves, but that only the shared (mutual) information between them is real. All properties of physical objects are only encoded in the relationships between them and hence in the information they share with other physical objects. This philosophy goes under the general name of "relationalism".
 What emptiness means in Buddhism is that "things" do not exist in themselves, but are only possible in relation to other "things".
 The whole of our reality emerges by first using the conjectures and refutations to compress observations and then from this compression we deduce what is and isn’t possible.
Epilogue
 There is no prior information required in order for information to exist. Information can be created from emptiness.
 Outside of our reality there is no additional description of the Universe that we can understand, there is just emptiness. There is no scope for the ultimate law or supernatural being – given that both of these would exist outside of our reality and in the darkness.
 The laws of Nature are information about information and outside of it there is just darkness.
These notes were taken from Vlatko's book.
See more at Wikipedia
