The Universe: A Computer Simulation?

An unpublished paper on the arXiv claims to have formulated a suite of experiments, informed by a particular kind of computer approximation (called “lattice QCD” or L-QCD), to determine whether the universe we perceive is really just an elaborate computer simulation. It is creating a buzz (e.g. covered by the Skeptics’ Guide to the Universe, Technology Review, io9, and probably elsewhere).

I have some problems with the paper’s line of argument. But let me make it clear that I have no fundamental problem with the speculation itself. I think it is fun and interesting to ponder the possibility of living in a simulation and to try to formulate experiments to demonstrate it. It is certainly an amusing intellectual exercise and, at least in my own experience, was an occasional topic of my undergraduate years. More recently than my undergraduate years, Yale philosopher Nick Bostrom put forth his famous argument in more quasiformal terms, but the idea had been hovering there (probably with a Pink Floyd soundtrack) for a long time.

The paper is not “crackpot”, but is highly speculative. It uses a legitimate argumentation technique, if used properly (and the authors basically do), called reductio ad absurdum: reduction to the absurd. Their argument goes like this:

  1. Computer simulations of spacetime dynamics, as known to humans, always involve space and time lattices as a stage on which to perform dynamical approximations (e.g. finite-difference methods);
  2. Lattice QCD (L-QCD) is a profound example of how (mere) humans have successfully simulated, on a lattice, arguably the most complex and pure sector of the Standard Model: SU(3) color, a.k.a. quantum chromodynamics, the gauge theory that governs the strong nuclear force as experienced by quarks and gluons;
  3. L-QCD is not perfect, and is still quite crude in its absolute modern capabilities (I think most people reading these articles, given the hype imparted to L-QCD, would be shocked at how underwhelming L-QCD output actually is, given the extreme amount of computing effort and physics that goes into it). But it is, under the hood, the most physically complete of all computer simulations and should be taken as a proof-of-principle for the hypothetical possibility of bigger and better simulations — if we can do it, even at our humble scale, certainly an übersimulation should be possible with sufficient computing resources;
  4. Extrapolating (this is the reductio ad absurdum part), L-QCD for us today implies L-Reality for some other beyond-our-imagination hypercreatures: for we are not to be taken as a special case for what is possible and we got quite a late start into the game as far as this sentience thing goes.
  5. Nevertheless, nuanced flaws in the simulation that arise because of the intrinsic latticeworks required by the approximations might be experimentally detectable.

Cute.

Firstly, there is an amusing recursive metacognitive aspect to this discussion that has its own strangeness; it essentially causes the discussion to implode. It is a goddamn hall of mirrors from a hypothesis testing point of view. This was, I believe, the point Steve Novella was getting at in the SGU discussion. So, let’s set aside the question of whether a simulation could

  1. accurately reconstruct a simulation of itself and then
  2. proceed to simulate and predict its own real errors and then
  3. simulate the actual detection and accurate measurement of the unsimulated real errors.

Follow that? For the byproduct of a simulation to detect that it is part of an ongoing simulation via the artifacts of the main simulation, I think you have to have something like that. I’m not saying it’s not possible, but it is pretty unintuitive and recursive.

My main problem with the argument is this: a discrete or lattice-like character to spacetime, with all of its strange implications, is neither a necessary nor a sufficient condition to conclude we live in a simulation. What it would tell us, if it were identified experimentally, is just that spacetime has a discrete or lattice-like character. Given the remarkably creative and far-seeing imaginative spirit of the project, it seems strangely naive to use such an immature, vague “simulation = discrete” connection to form a serious hypothesis. There very well may be some way to demonstrate we live in a simulation (or, phrased more responsibly, falsify the hypothesis that we don’t live in a simulation), but identifying a lattice-like spacetime structure is not the way. What would be the difference between a simulation and the “real” thing? Basically, a simulation would make errors or have inexplicable quirks that “reality” would not contain. The “lattice approximation errors” approach is pressing along these lines, but is disappointingly shallow.

The evidence for living in a simulation would have to be much more profound and unsubtle than mere latticeworks to be convincing. Something like, somewhat in a tongue-in-cheek tone:

  1. Identifying the equivalent of commented-out lines of code or documentation. This might be a steganographic exercise where one looks for messages buried in the noise floor of fundamental constants, or perhaps in the laws of physics themselves. For example, finding patterns in π sounds like a good lead, a la Contact, but (assuming π is a normal number, which is widely believed though unproven) essentially every finite digit string appears in π an infinite number of times, so one needs another strategy, like perhaps π lacking certain statistical patterns. If the string 1111 didn’t appear in π at any point we could calculate (a toy version of such a search is sketched after this list), this would be stranger than finding “to be or not to be” from Hamlet in ASCII binary;
  2. Finding software bugs (not just approximation errors); this might appear as inconsistencies in the laws of physics at different periods of time;
  3. Finding dead pixels or places where the hardware just stopped working locally; this might look like a place where the laws of physics spontaneously changed or failed (e.g. not a black hole where there is a known mechanism for the breakdown, but something like “psychics are real”, “prayer works as advertised”, etc.);
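
As a toy illustration of the first item above (my own sketch, not anything from the paper, and the helper names are just placeholders), here is how one might check whether a given digit string shows up among the first n decimal digits of π in Mathematica:

    (* Check whether a digit string appears among the first n decimal digits of Pi.
       The helper names piString and patternAppearsQ are invented for illustration only. *)
    piString[n_] := StringJoin[ToString /@ First[RealDigits[Pi, 10, n]]];
    patternAppearsQ[pattern_String, n_] := ! StringFreeQ[piString[n], pattern];
    patternAppearsQ["1111", 100000]   (* the creepy result would be False out to enormous n *)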

I’m just making stuff up, and don’t really believe these efforts would bear fruit, but those kinds of things, if demonstrated in a convincing way, would be an indication to me that something just wasn’t right. That said, the laws of physics are remarkably robust: there are no known violations of them (or at least nothing that hasn’t eventually been incorporated into them) despite vigorous testing and active efforts to find flaws.

I would also like to set a concept straight that I heard come up in the SGU discussion: the quantum theoretical notion of the Planck length does not imply any intrinsic clumpiness or discreteness to spacetime, although it is sometimes framed this way in casual physics discussions. The Planck length is the spatial scale where quantum mechanics encounters general relativity in an unavoidable way. In some sense, current formulations of quantum theory and general relativity “predict” the breakdown of spacetime itself at this scale. But, in the usual interpretation, this is just telling us that both theories as they are currently formulated cannot be correct at that scale, something we already hypothesized decades ago; indeed, this is the point of entire research programs like M-theory, loop quantum gravity, and their derivatives.
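
For concreteness, the scale in question is built entirely out of the fundamental constants: the Planck length is ℓ_P = √(ħG/c³) ≈ 1.6 × 10⁻³⁵ meters, roughly twenty orders of magnitude smaller than a proton.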

Moreover, even working within known quantum theory and general relativity, to consider the Planck length a “clump” or “smallest unit” of spacetime is not the correct visualization. The Planck length sets a scale of uncertainty. The word “scale” in physics does not imply a hard, discrete boundary, but rather a very, very soft one. It is the opposite of a clump of spacetime. The Planck length is then interpreted as the geometric scale at which spacetime is infinitely fuzzy and statistically uncertain. It does not imply a hard little impenetrable region embedded in some abstract spacetime latticeworks. This breakdown of spacetime occurs at each continuous point in space. That is, one could zoom into any arbitrarily chosen point and observe the uncertainty emerge at the same scale. Again, no latticeworks or lumpiness is implied.

Transcendental Mathy Music

Ever wonder what π or e or other number sequences sound like when mapped into some musical sound space? I’ve written a little Mathematica notebook, downloadable here, that lets you tinker with these possibilities with a simple interface. You can then save your work to a MIDI file, which can be loaded into your favorite music software like Cubase, Logic, Pro Tools, or even GarageBand. I would offer the full notebook as a free CDF file, but Wolfram’s current CDF format does not support writing out to files yet. However, below is the basic interface you can tinker with on this web page. You will need the free Mathematica CDF plugin installed (or a copy of Mathematica 8).
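
If you just want the core idea without the notebook, here is a minimal sketch of the digit-to-pitch mapping and MIDI export (assuming Mathematica 8 or later; the file name, note length, and variable names are placeholders, not the notebook’s actual code):

    (* Map each decimal digit of Pi to a semitone above middle C
       (0 = C, 1 = C#, 2 = D, ...) and write the result to a MIDI file. *)
    digits = First[RealDigits[Pi, 10, 10]];     (* first 10 digits: 3, 1, 4, 1, 5, ... *)
    notes = SoundNote[#, 0.25] & /@ digits;     (* SoundNote pitch 0 is middle C *)
    Export["piRiff.mid", Sound[notes]]          (* save as a standard MIDI file *)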

On my latest album, Smug, I used this software to create two pieces based on the transcendental numbers e and π. Descriptions below.

Smuggy E:
The first riff uses the first 15 digits of the transcendental number e = 2.71828182845904 (0=C, 1=C#, 2=D, etc.) in 15/16 time; it then modulates so that 0=F, 1=F#, 2=G, and so on.
The interlude riff is the speed of light in vacuum, c = 299792458.0 m/s, with the same mapping (0=C, 1=C#, 2=D, etc.) in 6/4. Yes, I cheated a little by adding the “.0” on the end of c since, in m/s, c is defined as an exact integer.
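
In the same sketchy Mathematica terms as above (again my own illustration, not the notebook’s code), the first riff is roughly this, with the +5 semitone offset doing the modulation from a C root to an F root:

    (* The first 15 digits of e, rooted on C (offset 0), then on F (offset +5). *)
    eDigits = First[RealDigits[E, 10, 15]];     (* 2, 7, 1, 8, 2, 8, 1, 8, 2, 8, 4, 5, 9, 0, 4 *)
    riff[offset_] := SoundNote[# + offset, 0.25] & /@ eDigits;
    Sound[Join[riff[0], riff[5]]]               (* C-rooted phrase, then F-rooted phrase *)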

Smuggy π:
Similar to the above, this uses the first 10 digits of π and the speed of light, with the same mapping (0=C, 1=C#, 2=D, etc.).

These are just a few of the literally infinite possibilities one can create using amusing number mappings. Let me know if you create (or have created) any of your own. I’d be interested to hear!


Chick-fil-A Apparently Backtracks: An Unfortunate Development

I was reasonably optimistic about Chick-fil-A’s apparent overture to end its practice of donating to anti-LGBT organizations. But apparently its president, Dan Cathy, issued a statement, framed as a correction, actively contradicting previous reports. Cathy was quoted as saying:

“There continues to be erroneous implications in the media that Chick-fil-A changed our practices and priorities in order to obtain permission for a new restaurant in Chicago. That is incorrect. Chick-fil-A made no such concessions, and we remain true to who we are and who we have been.”

With nothing more than a vague intuition, my sense is that there may be some confusion or strife within the company itself. Other than simply outright incorrect reporting, what else can explain such opposing statements coming from the same corporation? However, since Chick-fil-A is a private company, Cathy’s public word is probably a better indicator of the company’s position than that of an arbitrator in Chicago.