Computational complexity of the landscape II—Cosmological considerations

Frederik Denef, Michael R. Douglas, Brian Greene, Claire Zukowski

Research output: Contribution to journal › Article › peer-review

26 Scopus citations

Abstract

We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of “computational” measure factors. By defining a cosmology as a space–time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of “limited computational complexity” governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of “minimal computational complexity”. Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.
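The following is a minimal toy sketch, not the paper's construction, of the idea that a "computational" measure factor weights vacua by how quickly a clock-limited computer simulating the multiverse can produce them. The landscape, the vacuum labels, the transition probabilities, and the clock cutoff below are all hypothetical choices made only for illustration.

```python
# Toy "computational measure" sketch: simulate a branching landscape and
# weight terminal vacua by whether the simulating computer reaches them
# within a fixed amount of clock time. All numbers here are illustrative.
import random
from collections import Counter

# Hypothetical toy landscape: each metastable vacuum decays with fixed
# branching probabilities; T1 and T2 are terminal vacua (e.g. vacua with
# the specified small cosmological constant).
TRANSITIONS = {
    "A": [("A", 0.50), ("B", 0.30), ("C", 0.20)],
    "B": [("B", 0.40), ("T1", 0.60)],   # T1 is typically reached quickly
    "C": [("C", 0.90), ("T2", 0.10)],   # T2 requires long histories
}
TERMINAL = {"T1", "T2"}

def evolve(vacuum, rng):
    """One step of toy time evolution: sample the next vacuum."""
    targets, weights = zip(*TRANSITIONS[vacuum])
    return rng.choices(targets, weights=weights)[0]

def computational_measure(n_histories=100_000, clock_cutoff=30, seed=0):
    """Relative weights of terminal vacua produced within `clock_cutoff`
    steps of global (clock) time; later histories are simply cut off,
    which regulates the would-be infinities of the toy model."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_histories):
        vacuum, steps = "A", 0
        while vacuum not in TERMINAL and steps < clock_cutoff:
            vacuum = evolve(vacuum, rng)
            steps += 1
        if vacuum in TERMINAL:
            counts[vacuum] += 1
    total = sum(counts.values())
    return {v: n / total for v, n in counts.items()}

if __name__ == "__main__":
    # A smaller cutoff favors vacua the simulation produces quickly (T1),
    # illustrating how a clock-time bound acts as a measure factor.
    print("cutoff = 30:", computational_measure(clock_cutoff=30))
    print("cutoff = 200:", computational_measure(clock_cutoff=200))
```

Running the sketch shows the relative weight of T2 growing as the cutoff is relaxed, which is the qualitative sense in which a bound on simulation clock time selects among vacua; the actual measure proposed in the paper is defined at the level of eternally inflating space-times, not a finite Markov chain.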

Original language: English (US)
Pages (from-to): 93-127
Number of pages: 35
Journal: Annals of Physics
Volume: 392
DOIs
State: Published - May 2018
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2018 Elsevier Inc.

Keywords

  • Computational complexity
  • Measures
  • Multiverse
  • String theory
