
Saturday, October 7, 2017

The Basis of the Universe May Not Be Energy or Matter but Information

via BigThink:

There are lots of theories on what the basis of the universe is. Some physicists say it's subatomic particles. Others believe it's energy or even space-time. One of the more radical theories suggests that information is the most basic element of the cosmos. Although this line of thinking emanates from the mid-20th century, it seems to be enjoying a bit of a renaissance among a sliver of prominent scientists today.

Consider that if we knew the exact composition of the universe and all of its properties and had enough energy and know-how to draw upon, theoretically, we could break the universe down into ones and zeroes and using that information, reconstruct it from the bottom up. It’s the information, purveyors of this view say, locked inside any singular component that allows us to manipulate matter any way we choose. Of course, it would take deity-level sophistication, a feat only achievable by a type V civilization on the Kardashev scale.

Mid-20th-century mathematician and engineer Claude Elwood Shannon is considered the creator of classical information theory. Though few know of him outside of scientific circles, he’s being hailed today as the “father of the digital age.” Shannon’s spark of genius came in 1937 at MIT, when he noticed a relationship between Boolean algebra and telephone switching circuits.

Soon after, he was hired by Bell Labs to devise the most efficient way to transfer information over wires. In 1948, he penned “A Mathematical Theory of Communication,” essentially laying the foundation for the digital age. Shannon was the first to show that mathematics could be used to design electrical systems and circuits.

Before him, it was done through expensive model-making or mere trial and error. Today, Boolean algebra is used to design communication and computer systems, hardware, software, and so much more. Basically, anything that generates, stores, or transfers information electronically is based on Shannon’s tome.
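
Shannon's observation can be made concrete with a toy example. A hallway light controlled by two switches behaves exactly like the Boolean exclusive-or; the scenario and function name here are illustrative, not drawn from Shannon's paper:

```python
# Shannon's insight: relay and switching circuits obey Boolean algebra.
# A hallway light wired to two switches implements exclusive-or (XOR):
# the light is on exactly when the two switches disagree.
def hallway_light(switch_a: bool, switch_b: bool) -> bool:
    """Boolean expression (a AND NOT b) OR (NOT a AND b), i.e. a XOR b."""
    return (switch_a and not switch_b) or (not switch_a and switch_b)

# Truth table: flipping either switch always toggles the light.
for a in (False, True):
    for b in (False, True):
        print(a, b, hallway_light(a, b))
```

Designing the physical circuit reduces to simplifying the Boolean expression on paper, which is exactly the replacement for trial-and-error model-making described above.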

That's not all. Shannon defined a unit of information: the binary digit, or bit. Strings of bits, each a 0 or a 1, are what let us store and recall information electronically. Moreover, he was the first to transform data into a commodity. Its value, he said, is proportional to how much it surprises the consumer.

In addition, he connected electronic communication to thermodynamics. What's now called “Shannon entropy” measures the disorder or randomness inherent in any communications system. The greater the entropy, the less clear the message, until it becomes unintelligible. As for information theory, he developed it during World War II, while trying to solve the problem of sending an encrypted message over a static-ridden telephone or telegraph line.
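
Shannon entropy can be computed directly from symbol frequencies. A minimal sketch (the function name is mine, not Shannon's notation):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """H = -sum(p * log2(p)): average bits of surprise per symbol."""
    counts = Counter(message)
    n = len(message)
    # "+ 0.0" normalizes the floating-point -0.0 that arises for H = 0.
    return -sum((c / n) * log2(c / n) for c in counts.values()) + 0.0

# Maximum disorder: four equally likely symbols carry 2 bits each.
print(shannon_entropy("abcd"))  # 2.0
# No surprise: a message the receiver can fully predict carries 0 bits.
print(shannon_entropy("aaaa"))  # 0.0
```

The higher the entropy of the source, the more bits each symbol carries, and the more surprising the message is to the consumer.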

To look at information theory from a quantum viewpoint, the positions of particles, their movement, how they behave, and all of their properties, give us information about them and the physical forces behind them. Every aspect of a particle can be expressed as information, and put into binary code. And so subatomic particles may be the bits that the universe is processing, as a giant supercomputer. Besides quantum mechanics, since Shannon elucidated it, information theory has been applied to music, genetics, investment, and much more.

Science writer James Gleick, author of The Information, contends that it wasn’t Shannon, but early 19th century mathematician Charles Babbage, who first called information the central component of all and everything. Babbage is credited for first conceptualizing the computer, way before anyone had the ability to even build one.

The eminent John Archibald Wheeler in his later years was a strong proponent of information theory. Another unsung paragon of science, Wheeler was a veteran of the Manhattan Project, coined the terms “black hole” and “wormhole,” helped work out the “S-matrix” with Niels Bohr, and collaborated with Einstein on a unified theory of physics.

Wheeler said the universe had three parts: First, “Everything is Particles,” second, “Everything is Fields,” and third, “Everything is Information.”

In the 1980s, he began exploring possible connections between information theory and quantum mechanics. It was during this period he coined the phrase “It from bit.” The idea is that the universe emanates from the information inherent within it. Each it or particle is a bit. It from bit.
In 1989, Wheeler delivered a paper at the Santa Fe Institute, where he announced "every it--every particle, every field of force, even the space-time continuum itself--derives its function, its meaning, its very existence entirely--even if in some contexts indirectly--from the apparatus-elicited answers to yes-or-no questions, binary choices, bits."

A team of physicists earlier this year announced research conclusions that would make Wheeler smile. We might be caught inside a giant hologram, they state. In this view, the cosmos is a projection, much like a 3D simulation. What’s weird is that the laws of physics work just as well described in a 2D quantum field as in a 3D gravitational one.

It’s important to note that most physicists believe that matter is the essential unit of the universe. And the evidence for information theory is limited. After all, how would you test for it?

If the nature of reality is in fact reducible to information itself, that implies a conscious mind on the receiving end, to interpret and comprehend it. Wheeler himself believed in a participatory universe, where consciousness holds a central role. Some scientists argue that the cosmos seems to have specific properties which allow it to create and sustain life. Perhaps what it desires most is an audience captivated in awe as it whirls in prodigious splendor.

Modern physics has hit a wall in a number of areas. Some proponents of information theory believe embracing it may help us to, say, sew up the rift between general relativity and quantum mechanics. Or perhaps it’ll aid in detecting and comprehending dark matter and dark energy, which combined are thought to make up 95% of the known universe. As it stands, we have no idea what they are.

Ironically, some hard data is required in order to elevate information theory. Until then, it remains theoretical.

To learn more about information theory as the basis of the universe, click here:


Monday, June 20, 2016

Naive realism and reality tunnels

“We think this is reality. But in philosophy, that’s called naive realism: ‘What I perceive is reality.’ And philosophers have refuted naive realism every century for the last 2,500 years, starting with Buddha and Plato, and yet most people still act on the basis of naive realism.

Now the argument is, “Well, maybe my perceptions are inaccurate, but somewhere there is accuracy, scientists have it with their instruments. That’s how we can find out what’s really real.” But relativity, quantum mechanics, have demonstrated clearly that what you find out with instruments is true relative only to the instrument you’re using, and where that instrument is located in space-time. So there is no vantage point from which real reality can be seen.

We’re all looking from the point of view of our own reality tunnels. And when we begin to realize that we’re all looking from the point of view of our own reality tunnels, we find that it is much easier to understand where other people are coming from.

All the ones who don’t have the same reality tunnel as us do not seem ignorant, or deliberately perverse, or lying, or hypnotized by some mad ideology, they just have a different reality tunnel. And every reality tunnel might tell us something interesting about our world if we’re willing to listen.
The idea that every perception is a gamble seems to me so obviously true that I am continually astonished that I could forget it so many times during the course of 24 hours. But to the extent that I remember it, I just can’t stay angry at anybody, so it’s a thing worth keeping in mind.”

~Robert Anton Wilson

Monday, October 26, 2015

The Universe Really Is Weird: A Landmark Quantum Experiment Has Finally Proved It So

via IFLScience

Only last year the world of physics celebrated the 50th anniversary of Bell’s theorem, a mathematical proof that certain predictions of quantum mechanics are incompatible with local causality. Local causality is a very natural scientific assumption and it holds in all modern scientific theories, except quantum mechanics.

Local causality is underpinned by two assumptions. The first is Albert Einstein’s principle of relativistic causality, that no causal influences travels faster than the speed of light. This is related to the “local” bit of local causality.

The second is a common-sense principle named after the philosopher Hans Reichenbach which says roughly that if you could know all the causes of a potential event, you would know everything that is relevant for predicting whether it will occur or not.

Although quantum mechanics is an immensely successful theory – it has been applied to describe the behaviour of systems from subatomic particles to neutron stars – it is still only a theory.

Thus, because local causality is such a natural hypothesis about the world, there have been decades of experiments looking for, and finding, the very particular predictions of quantum mechanics that John Bell discovered in 1964.

But none of these experiments definitively ruled out a locally causal explanation of the observations. They all had loopholes because they were not done quite in the way the theorem demanded.

No Loopholes

Now, the long wait for a loophole-free Bell test is over. In a paper published today in Nature, a consortium of European physicists has confirmed the predictions required for Bell’s theorem, with an experimental set-up without the imperfections that have marred all previous experiments.

A Bell experiment requires at least two different locations or laboratories (often personified as named fictional individuals such as Alice and Bob) where measurements are made on quantum particles. More specifically, at each location:

    a setting for the measurement is chosen randomly
    the measurement is performed with the chosen setting
    the result is recorded.

The experiment will only work if the particles in the different laboratories are in a so-called entangled state. This is a quantum state of two or more particles which is only defined for the whole system. It is simply not possible, in quantum theory, to disentangle the individual particles by ascribing each of them a state independent of the others.
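
The quantum prediction that Bell experiments probe can be sketched numerically. For two spin-1/2 particles in the entangled singlet state, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between measurements at angles a and b. Any locally causal theory caps the CHSH combination of four such correlations at 2, while quantum mechanics reaches 2√2. A minimal sketch, using the standard optimal angle settings (not the specific settings of the Hanson experiment):

```python
from math import cos, pi, sqrt

def E(a: float, b: float) -> float:
    """Quantum correlation for singlet-state spin measurements at angles a, b."""
    return -cos(a - b)

# Standard CHSH settings: Alice measures at a or a2, Bob at b or b2.
a, a2 = 0.0, pi / 2
b, b2 = pi / 4, 3 * pi / 4

# CHSH combination: bounded by 2 for any locally causal theory.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # approximately 2.828, i.e. 2*sqrt(2), above the local bound of 2
```

It is this gap between 2 and 2√2 that a loophole-free experiment must resolve convincingly.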

The two big imperfections, or loopholes, in previous experiments were the separation loophole and the efficiency loophole.

To close the first loophole, it is necessary that the laboratories be far enough apart (well separated). The experimental procedures should also be fast enough that the random choice of measurement in any one laboratory could not affect the outcome recorded in any other laboratory by any influence travelling at the speed of light or slower. This is challenging because light travels very fast.

To close the second, it is necessary that, once a setting is chosen, a result must be reported with high probability in the time allowed. This has been a problem with experiments using photons (quantum particles of light) because often a photon will not be detected at all.

The Experiment

Most previous Bell experiments have used the simplest set-up, with two laboratories, each with one photon and the two photons in an entangled state. Ronald Hanson and colleagues have succeeded in making their experiment loophole-free by using three laboratories, in a line 1.3 km long.

In the laboratories at either ends, Alice and Bob create an entangled state between a photon and an electron, keep their electron (in a diamond lattice) and send their photons to the laboratory in the middle (which I will personify as Juanita). Alice and Bob then each choose a setting and measure their electrons while Juanita performs a joint measurement on the two photons.

Alice and Bob’s measurements can be done efficiently, but Juanita’s, involving photons, is actually very inefficient. But it can be shown that this does not open a loophole, because Juanita does not make any measurement choice but rather always measures the two photons in the same way.

The experiment, performed in the Netherlands, was very technically demanding and only just managed to convincingly rule out local causality. This achievement could, in principle, be applied to enable certain very secure forms of secret key distribution. With continuing improvements in the technology, this will hopefully one day become a reality.

For the moment, though, we should savour this result for its scientific significance. It finally proves that either causal influences propagate faster than light, or a common-sense notion about what the word “cause” signifies is wrong.

One thing this experiment has not resolved is which of these options we should choose. Physicists and philosophers remain as divided as ever on that question, and what it means for the nature of reality.

Tuesday, March 3, 2015

Superdeterminism

From Wikipedia:

In the context of quantum mechanics, superdeterminism is a term that has been used to describe a hypothetical class of theories that evade Bell's theorem by virtue of being completely deterministic. Bell's theorem depends on the assumption of "free will", which does not apply to deterministic theories. It is conceivable, but arguably unlikely, that someone could exploit this loophole to construct a local hidden variable theory that reproduces the predictions of quantum mechanics. Superdeterminists do not recognize the existence of genuine chances or possibilities anywhere in the cosmos.

Bell's theorem assumes that the types of measurements performed at each detector can be chosen independently of each other and of the hidden variable being measured. In order for the argument for Bell's inequality to follow, it is necessary to be able to speak meaningfully of what the result of the experiment would have been, had different choices been made. This assumption is called counterfactual definiteness. But in a deterministic theory, the measurements the experimenters choose at each detector are predetermined by the laws of physics. It can therefore be argued that it is erroneous to speak of what would have happened had different measurements been chosen; no other measurement choices were physically possible. Since the chosen measurements can be determined in advance, the results at one detector can be affected by the type of measurement done at the other without any need for information to travel faster than the speed of light.

In the 1980s, John Bell discussed superdeterminism in a BBC interview:

    There is a way to escape the inference of superluminal speeds and spooky action at a distance. But it involves absolute determinism in the universe, the complete absence of free will. Suppose the world is super-deterministic, with not just inanimate nature running on behind-the-scenes clockwork, but with our behavior, including our belief that we are free to choose to do one experiment rather than another, absolutely predetermined, including the "decision" by the experimenter to carry out one set of measurements rather than another, the difficulty disappears. There is no need for a faster than light signal to tell particle A what measurement has been carried out on particle B, because the universe, including particle A, already "knows" what that measurement, and its outcome, will be.

Although he acknowledged the loophole, he also argued that it was implausible. Even if the measurements performed are chosen by deterministic random number generators, the choices can be assumed to be "effectively free for the purpose at hand," because the machine's choice is altered by a large number of very small effects. It is unlikely that the hidden variable would be sensitive to all of the same small influences that affect the random number generator.

Superdeterminism has also been criticized because of its implications regarding the validity of science itself. For example, Anton Zeilinger has commented:

    [W]e always implicitly assume the freedom of the experimentalist... This fundamental assumption is essential to doing science. If this were not true, then, I suggest, it would make no sense at all to ask nature questions in an experiment, since then nature could determine what our questions are, and that could guide our questions such that we arrive at a false picture of nature.

Monday, March 2, 2015

Quantum Mechanics for Dummies

This excellent series of videos by 'Looking Glass Universe' lucidly explains the ins and outs of Quantum Mechanics in simple language. Great for meatheads like myself whose brains hurt when math is involved.

Introduction to Quantum Mechanics:

The Wave Function:

Quantum Randomness:

Quantum Eraser:

Quantum Eraser Explained:

Causes after effects?

Heisenberg Uncertainty Principle:

EPR Paradox and Entanglement

Is Quantum Mechanics True? Bell's Theorem Explained

Proof of Bell's Theorem

Bohmian Mechanics: An Alternative to Quantum

A problem with Bohmian mechanics? Contextuality.

Monday, February 23, 2015

The Unreality of Time

Philosophy and physics may seem like polar opposites, but they regularly address quite similar questions. Recently, physicists have revisited a topic whose modern philosophical origins date back over a century: the unreality of time. What if the passage of time were merely an illusion? Can a world without time make sense?

While a world without the familiar passage of time may seem far-fetched, big names in physics, such as string theory pioneer Ed Witten and theorist Brian Greene, have recently embraced such an idea. A timeless reality may help reconcile differences between quantum mechanics and relativity, but how can we make sense of such a world? If physics does indeed suggest that the flow of time is illusory, then philosophy may be able to shed light on such a strange notion.

British philosopher J.M.E. McTaggart advanced this idea in 1908 in his paper titled “The Unreality of Time.” Philosophers widely consider his paper to be one of the most influential early examinations of this possibility. Looking through McTaggart’s philosophical lens, a reality without time becomes a little more intuitive and, in principle, possible.

A Tale of Two Times

McTaggart’s argument against the reality of time has a number of interpretations, but it starts with a distinction about ordering events in time. The “A” series and “B” series of time form an integral part of McTaggart’s argument, and I’ll unravel this distinction with a historical example.

On July 20, 1969, Apollo 11 became the first manned spacecraft to land on the moon. For argument’s sake, consider this to represent an event in the present. Several days in the past (July 16), then, Apollo 11 lifted off the ground. Additionally, several days in the future, all of the mission astronauts will land back on Earth, safe and sound. Classifying an event as “several days past” or “several days future” falls under the “A” series. For the moon landing, some events (e.g. Lincoln’s assassination) are in the distant past; some events are in the distant future (e.g. the inauguration of President Obama); and other events fall somewhere in between.

Under the “A” series, events flow from one classification (i.e. past, present, and future) to another. On July 16th, the moon landing would have the property of being in the future. The instant Apollo 11 landed on the moon, that event would be present. After this moment, its classification changes to the past.

The “B” series, however, doesn’t classify events on this scale ranging from the distant past to the distant future. Instead, the “B” series orders events based on their relationship to other events. Under this ordering, Lincoln’s assassination occurs before the moon landing, and Obama’s inauguration occurs after the moon landing. This relational ordering seems to capture a different way of looking at time.
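
The distinction can be phrased in programming terms: the B series is a fixed sort order, while the A series is a classification relative to a moving “now.” A toy sketch using the article’s three events (the data structure is my illustration, not McTaggart’s formalism):

```python
from datetime import date

# Three events from the article, keyed to their dates.
events = {
    "Lincoln's assassination": date(1865, 4, 14),
    "Apollo 11 moon landing": date(1969, 7, 20),
    "Obama's inauguration": date(2009, 1, 20),
}

# B series: a permanent "earlier than" ordering that never changes.
b_series = sorted(events, key=events.get)

# A series: past/present/future labels that change as "now" moves.
def a_series(now: date) -> dict:
    return {name: ("past" if d < now else "future" if d > now else "present")
            for name, d in events.items()}

print(b_series)
print(a_series(date(1969, 7, 20)))  # the moon landing is "present" here
```

Calling `a_series` with a different `now` relabels every event, which is exactly the change McTaggart says a fundamental series of time requires; `b_series` never changes at all.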

Two Times, One Contradiction

With this distinction in place, McTaggart additionally argues that a fundamental series of time requires change to take place. Under the “B” series, the way these events are ordered never changes. Obama’s inauguration, for instance, will never switch to occurring before the moon landing, nor vice versa. These relational properties simply don’t change.

But the A series does embody the change that we might expect from the flow of time. Events first have the property of being in the future, then they become present events. Afterward, they drift into the past. Under the A series, time does have an objective flow, and true change does happen. In McTaggart’s mind (and perhaps the mind of many others), this change is a necessary aspect of time.

But herein lies the contradiction. If these events do change in this sense, they will have contradictory properties. McTaggart argues that an event can’t be in the past, in the present, and in the future all at once. These properties are incompatible, so the A series leads to a contradiction. Consequently, time, which requires change, does not truly exist. Welcome to the timeless reality.

Wait a Minute…

Certainly, many philosophers and physicists still believe in the reality of time and have objected to McTaggart’s argument. There are a number of fascinating caveats and counterexamples that you can read about elsewhere. Nonetheless, McTaggart’s work has influenced a number of philosophers’ approach to time, and his work has inspired many philosophers to incorporate physics into their arguments.

For instance, when Albert Einstein introduced special relativity, he seriously disrupted our “folk” conception of the flow of time. In special relativity, there is no absolute simultaneity of events. In one reference frame, two events may appear to happen at the same time. An observer on a speeding rocket ship, however, may observe one event happening before the other. Neither observer is “right” in this situation: This is simply the weirdness that special relativity entails.
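
The loss of absolute simultaneity falls straight out of the Lorentz transformation t′ = γ(t − vx/c²). A short sketch, in units where c = 1 (the numbers are illustrative):

```python
from math import sqrt

def t_prime(t: float, x: float, v: float) -> float:
    """Lorentz-transformed time t' = gamma * (t - v*x), in units where c = 1."""
    gamma = 1.0 / sqrt(1.0 - v ** 2)
    return gamma * (t - v * x)

# Two events simultaneous in the ground frame (both at t = 0),
# separated by one light-second.
v = 0.6  # rocket speed as a fraction of the speed of light
print(t_prime(0.0, 0.0, v))  # 0.0: event A
print(t_prime(0.0, 1.0, v))  # about -0.75: for the rocket, event B came first
```

Neither frame's time ordering is privileged, which is why "in the present" has no frame-independent meaning.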

Consequently, many philosophers have used special relativity as evidence against a theory supporting the A series of time. If absolute simultaneity doesn’t exist, it doesn’t make sense to say that one event is “in the present.” There’s no absolute present that pervades the universe under special relativity.

But McTaggart’s entire argument may help us better understand strange physics at the intersection of quantum mechanics and general relativity. In an attempt to reconcile these two theories, some well-known physicists have developed theories of quantum gravity that imply the world lacks time in a fundamental way.

Brad Monton, a philosopher of physics at the University of Colorado Boulder, recently published a paper comparing McTaggart’s philosophy with prominent theories in physics, including quantum gravity. During an interview, I asked him how some of the “timeless” ideas in quantum gravity compared to McTaggart.

“They’re on par with the radicalness,” he said. “There’s a lot of radicalness.”

Monton cautioned, however, that quantum gravity does not imply the same lack of time that McTaggart may have had in mind. Physicist John Wheeler, as Monton notes, has postulated that time may not be a fundamental aspect of reality, but this only happens on extremely small distance scales.

Some of these ideas in quantum gravity may be radical, but several respected names in physics are seriously considering a reality without time at its core. If a quantum gravity theory emerges that requires a radical conception of time, McTaggart may help us prepare.

As Monton writes in his paper: “As long as McTaggart’s metaphysics is viable, then the answer to the physicists’ queries is “no” – they are free, from a philosophical perspective at least, to explore theories where time is unreal.”

Many quantum gravity theories remain speculative, but there’s a chance that timelessness may become a prominent feature in physics. If that’s the case, then hopefully philosophers of science will help us wrap our heads around the implications.

From Physics Central

Sunday, February 22, 2015

Digital Philosophy

http://www.digitalphilosophy.org/

What is Digital Philosophy?

Digital Philosophy (DP) is a new way of thinking about the fundamental workings of processes in nature. DP is an atomic theory carried to a logical extreme where all quantities in nature are finite and discrete. This means that, theoretically, any quantity can be represented exactly by an integer. Further, DP implies that nature harbors no infinities, infinitesimals, continuities, or locally determined random variables. This paper explores Digital Philosophy by examining the consequences of these premises.

At the most fundamental levels of physics, DP implies a totally discrete process called Digital Mechanics. Digital Mechanics[1] (DM) must be a substrate for Quantum Mechanics. Digital Philosophy makes sense with regard to any system if the following assumptions are true:

All the fundamental quantities that represent the state information of the system are ultimately discrete. In principle, an integer can always be an exact representation of every such quantity. For example, there is always an integral number of neutrons in a particular atom. Therefore, configurations of bits, like the binary digits in a computer, can correspond exactly to the most microscopic representation of that kind of state information.

In principle, the temporal evolution of the state information (numbers and kinds of particles) of such a system can be exactly modeled by a digital informational process similar to what goes on in a computer. Such models are straightforward in the case where we are keeping track only of the numbers and kinds of particles. For example, if an oracle announces that a neutron decayed into a proton, an electron, and a neutrino, it’s easy to see how a computer could exactly keep track of the changes to the numbers and kinds of particles in the system. Subtract 1 from the number of neutrons, and add 1 to each of the numbers of protons, electrons, and neutrinos.
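
The bookkeeping described above is easy to sketch in code, and every quantity stays an exact integer, which is precisely DP's point (the dictionary layout is my illustration):

```python
# Digital bookkeeping for the decay n -> p + e + neutrino, as announced
# by the oracle. Every quantity is an exact integer; nothing is continuous.
state = {"neutron": 5, "proton": 0, "electron": 0, "neutrino": 0}

def neutron_decay(state: dict) -> None:
    """Apply one decay event exactly: subtract 1 neutron, add 1 of each product."""
    state["neutron"] -= 1
    state["proton"] += 1
    state["electron"] += 1
    state["neutrino"] += 1

neutron_decay(state)
print(state)  # {'neutron': 4, 'proton': 1, 'electron': 1, 'neutrino': 1}
```

No rounding or approximation ever enters: the digital model tracks this kind of state information exactly, as the premise requires.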

The possibility that DP may apply to various fields of science motivates this study.

Tom Campbell: Virtual Reality: Why It's A Better Model Than String Theory and Holographic Universe



“When the original founding fathers of quantum mechanics were doing these experiments they were really excited… making statements like- ‘if quantum mechanics doesn’t blow your mind, that’s because you don’t understand quantum mechanics.’ They realized this was a really big deal philosophically, (and) scientifically… Then they tried to come up with a good explanation. They couldn’t find one… Now they just blow it off as ‘nobody will ever know… it’s just weird science.’ This My Big Toe theory though, explains it.”  -Tom Campbell

If that chopped up quote sounds vague, pseudo science-y, or confusing (especially if you’re not familiar with some of the basic ideas behind quantum mechanics) I get that. But, when you’re grappling with huge issues like the very nature of our reality and you’re trying to take a broad stroke across the top, things tend to get foggy, so bear with me.

(You should know about the infamous, hotly-debated double-slit experiment covered above for this talk.)

Actually, don’t bear with me, or take anything from me, because our guest, Tom Campbell has an impressive career in applied physics. He worked in military intelligence, reverse-engineering enemy technology, in national missile defense, even on huge engineering projects for NASA— impressive stuff.

But what makes Tom even more of an interesting and rare specimen is that he has also spent three decades researching the nature of consciousness and reality. And he’s done so by remaining open-minded about topics that many scientists and Snopes denizens would greet only with a scoff and a pompous finger wave. We’re talking about the sexy stuff: out-of-body experiences, altered states of consciousness, the statistically measurable power of intent, and a bunch of other stuff that sounds like it’s straight out of an episode of FRINGE...

Read more at disinformation:

Visit Tom's website at http://www.my-big-toe.com/

Saturday, February 21, 2015

Every Black Hole Contains a New Universe

By: Nikodem Poplawski

Our universe may exist inside a black hole. This may sound strange, but it could actually be the best explanation of how the universe began, and what we observe today. It's a theory that has been explored over the past few decades by a small group of physicists including myself.

Successful as it is, the standard big bang theory leaves notable questions unsolved. It suggests that the universe began as a seemingly impossible "singularity," an infinitely small point containing an infinitely high concentration of matter, expanding in size to what we observe today. The theory of inflation, a super-fast expansion of space proposed in recent decades, fills in many important details, such as why slight lumps in the concentration of matter in the early universe coalesced into large celestial bodies such as galaxies and clusters of galaxies.

But these theories leave major questions unresolved. For example: What started the big bang? What caused inflation to end? What is the source of the mysterious dark energy that is apparently causing the universe to speed up its expansion?

The idea that our universe is entirely contained within a black hole provides answers to these problems and many more. It eliminates the notion of physically impossible singularities in our universe. And it draws upon two central theories in physics...

The first is general relativity, the modern theory of gravity. It describes the universe at the largest scales. Any event in the universe occurs as a point in space and time, or spacetime. A massive object such as the Sun distorts or "curves" spacetime, like a bowling ball sitting on a canvas. The Sun's gravitational dent alters the motion of Earth and the other planets orbiting it. The Sun's pull on the planets appears to us as the force of gravity.

The second is quantum mechanics, which describes the universe at the smallest scales, such as the level of the atom. However, quantum mechanics and general relativity are currently separate theories; physicists have been striving to combine the two successfully into a single theory of "quantum gravity" to adequately describe important phenomena, including the behavior of subatomic particles in black holes.

A 1960s adaptation of general relativity, called the Einstein-Cartan-Sciama-Kibble theory of gravity, takes into account effects from quantum mechanics. It not only provides a step towards quantum gravity but also leads to an alternative picture of the universe. This variation of general relativity incorporates an important quantum property known as spin. Particles such as atoms and electrons possess spin, or the internal angular momentum that is analogous to a skater spinning on ice.

In this picture, spins in particles interact with spacetime and endow it with a property called "torsion." To understand torsion, imagine spacetime not as a two-dimensional canvas, but as a flexible, one-dimensional rod. Bending the rod corresponds to curving spacetime, and twisting the rod corresponds to spacetime torsion. If a rod is thin, you can bend it, but it's hard to see if it's twisted or not.

Spacetime torsion would only be significant, let alone noticeable, in the early universe or in black holes. In these extreme environments, spacetime torsion would manifest itself as a repulsive force that counters the attractive gravitational force coming from spacetime curvature. As in the standard version of general relativity, very massive stars end up collapsing into black holes: regions of space from which nothing, not even light, can escape.

Here is how torsion would play out in the beginning moments of our universe. Initially, the gravitational attraction from curved space would overcome torsion's repulsive forces, serving to collapse matter into smaller regions of space. But eventually torsion would become very strong and prevent matter from compressing into a point of infinite density; matter would reach a state of extremely large but finite density. As energy can be converted into mass, the immensely high gravitational energy in this extremely dense state would cause an intense production of particles, greatly increasing the mass inside the black hole.
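Schematically, this bounce shows up in the effective energy density driving the dynamics of an Einstein-Cartan cosmology with a spin fluid: the spin contribution enters with a negative sign and grows faster than the ordinary density during collapse. This is only a sketch; the constant \(\alpha\) stands in for convention-dependent factors of \(G\) and \(c\):

```latex
\rho_{\mathrm{eff}} = \rho - \alpha\, s^{2}, \qquad \alpha > 0,
```

where \(s\) is the spin density. Once the \(\alpha s^{2}\) term becomes comparable to \(\rho\), the repulsion halts the collapse at a very large but finite density, as described above.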

The increasing numbers of particles with spin would result in higher levels of spacetime torsion. The repulsive torsion would stop the collapse and would create a "big bounce," like a compressed beach ball that snaps outward. The rapid recoil after such a big bounce could be what has led to our expanding universe. The result of this recoil matches observations of the universe's geometry and distribution of mass.

In turn, the torsion mechanism suggests an astonishing scenario: every black hole would produce a new, baby universe inside. If that is true, then the first matter in our universe came from somewhere else. So our own universe could be the interior of a black hole existing in another universe. Just as we cannot see what is going on inside black holes in the cosmos, any observers in the parent universe could not see what is going on in ours.

The motion of matter through the black hole's boundary, called an "event horizon," would only happen in one direction, providing a direction of time that we perceive as moving forward. The arrow of time in our universe would therefore be inherited, through torsion, from the parent universe.

Torsion could also explain the observed imbalance between matter and antimatter in the universe. Because of torsion, matter would decay into familiar electrons and quarks, and antimatter would decay into "dark matter," a mysterious invisible form of matter that appears to account for a majority of matter in the universe.

Finally, torsion could be the source of "dark energy," a mysterious form of energy that permeates all of space and increases the rate of expansion of the universe. Geometry with torsion naturally produces a "cosmological constant," a sort of added-on outward force which is the simplest way to explain dark energy. Thus, the observed accelerating expansion of the universe may end up being the strongest evidence for torsion.
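For reference, the cosmological constant \(\Lambda\) enters Einstein's field equations as an extra geometric term; a torsion-induced contribution would play the same role as \(\Lambda\) in this standard form, shown here purely for illustration:

```latex
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},
```

where \(G_{\mu\nu}\) encodes spacetime curvature and \(T_{\mu\nu}\) the matter content. A positive \(\Lambda\) acts as the "added-on outward force" described above, accelerating the expansion.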

Torsion therefore provides a theoretical foundation for a scenario in which the interior of every black hole becomes a new universe. It also offers a remedy to several major problems in current theories of gravity and cosmology. Physicists still need to combine the Einstein-Cartan-Sciama-Kibble theory fully with quantum mechanics into a quantum theory of gravity. While the torsion picture resolves some major questions, it raises new ones of its own. For example, what do we know about the parent universe and the black hole inside which our own universe resides? How many layers of parent universes would there be? How can we test whether our universe lives in a black hole?

The last question can potentially be investigated: since all stars, and thus all black holes, rotate, our universe would have inherited the parent black hole's axis of rotation as a "preferred direction." There is some recently reported evidence from surveys of over 15,000 galaxies that in one hemisphere of the universe more spiral galaxies are "left-handed," or rotating clockwise, while in the other hemisphere more are "right-handed," or rotating counterclockwise. In any case, I believe that including torsion in the geometry of spacetime is a step in the right direction towards a successful theory of cosmology.

Nikodem Poplawski is a theoretical physicist at the University of New Haven in Connecticut.