Channel: mkoconnor – XOR’s Hammer

The Arithmetic Hierarchy Meets the Real World


Mathematical logic has a categorization of sentences in terms of increasing complexity called the Arithmetic Hierarchy.  This hierarchy defines sets of sentences \Pi_i and \Sigma_i for all nonnegative integers i.  The definition is as follows:

  • \Pi_0 and \Sigma_0 are both equal to the set of sentences P such that a computer can determine the truth or falsity of P in finite time. Sentences in \Pi_0 = \Sigma_0 are essentially computations, things like 3^2 + 4^2 = 5^2 or “For all x, y, z, n less than a million, if n > 2 then x^n + y^n \neq z^n.”
  • \Pi_i for i > 0 is defined as follows: If, for all n, P(n) is a \Sigma_{i-1} sentence, then \forall n\, P(n) is a \Pi_i sentence.
  • Similarly, if for all n, P(n) is a \Pi_{i-1} sentence, then \exists n\, P(n) is a \Sigma_i sentence.
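A \Pi_0 = \Sigma_0 sentence really can be settled by running a program. Here is a minimal sketch checking the two example sentences above, with the bound shrunk from a million to 20 so it runs quickly (the function name and bound are illustrative, not from the post):

```python
def fermat_instance(bound):
    """For all x, y, z, n < bound: if n > 2 then x^n + y^n != z^n.
       A bounded sentence, so a finite computation settles it."""
    for n in range(3, bound):
        for x in range(1, bound):
            for y in range(1, bound):
                for z in range(1, bound):
                    if x**n + y**n == z**n:
                        return False
    return True

print(3**2 + 4**2 == 5**2)   # a single arithmetic identity: True
print(fermat_instance(20))   # True
```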

It’s not obvious from the definition, but by encoding pairs, triples, etc. of natural numbers as single natural numbers, you are allowed to include multiple instances of the same quantifier without moving up the hierarchy. For example, a sentence \forall n\forall m\,P(n,m) is \Pi_1 if P(n,m) is \Pi_1. Essentially \Pi_i contains all statements with i - 1 quantifier alternations where the outermost quantifier is a \forall and \Sigma_i contains all statements with i - 1 quantifier alternations where the outermost quantifier is an \exists.
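The encoding of pairs as single numbers can be made completely explicit with the Cantor pairing function: since it is a computable bijection between \mathbb{N}\times\mathbb{N} and \mathbb{N}, the sentence \forall n\forall m\,P(n,m) is equivalent to \forall k\,P(\mathrm{unpair}(k)). A sketch (function names are mine, not the post's):

```python
# Cantor pairing: a computable bijection N x N -> N, which is why two
# adjacent universal quantifiers collapse into one.

def pair(n, m):
    return (n + m) * (n + m + 1) // 2 + m

def unpair(k):
    # find w with T(w) <= k < T(w+1), where T(w) = w(w+1)/2
    w = 0
    while (w + 1) * (w + 2) // 2 <= k:
        w += 1
    m = k - w * (w + 1) // 2
    return w - m, m

# sanity check: unpair inverts pair on a small grid
for n in range(30):
    for m in range(30):
        assert unpair(pair(n, m)) == (n, m)
```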

Many number-theoretic and combinatorial statements are \Pi_1: Fermat’s last theorem, Goldbach’s conjecture, and the Riemann Hypothesis (which can be seen to be in \Pi_1 via a formulation due to Jeffrey Lagarias). By encoding finite groups as numbers, theorems like the classification of finite simple groups can also be seen to be \Pi_1.
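To make the Riemann Hypothesis example concrete: Lagarias showed RH is equivalent to the statement that \sigma(n) \leq H_n + e^{H_n}\ln(H_n) for all n \geq 1, where \sigma(n) is the sum of the divisors of n and H_n is the n-th harmonic number. Each instance is a finite computation, so the whole statement is \Pi_1. A sketch checking the first couple of thousand instances (the bound and names are illustrative; floating-point arithmetic is fine here since the inequality is strict for n > 1 with plenty of slack):

```python
import math

def sigma(n):
    """Sum of divisors of n, by trial division."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

def lagarias_holds(n, H):
    """The n-th instance of Lagarias's inequality:
       sigma(n) <= H_n + exp(H_n) * log(H_n)."""
    return sigma(n) <= H + math.exp(H) * math.log(H)

H = 0.0
for n in range(1, 2001):
    H += 1.0 / n          # harmonic number H_n
    assert lagarias_holds(n, H)
```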

Note that statements of the form \forall n \exists m \leq f(n) P(n,m) where f is some computable function are also in \Pi_1 since for a fixed n, \exists m \leq f(n) P(n,m) is checkable in finite time by a computer.
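This bounded-quantifier observation is exactly why Goldbach’s conjecture is \Pi_1: for a fixed even n, the sentence “\exists p \leq n such that p and n - p are both prime” is decided by a finite search. A sketch (the helper names are mine):

```python
def is_prime(m):
    if m < 2:
        return False
    return all(m % d for d in range(2, int(m**0.5) + 1))

def goldbach_instance(n):
    """For a fixed even n > 2: does some p <= n have p and n - p
       both prime?  The existential is bounded by n, so a computer
       settles each instance in finite time -- hence Goldbach is Pi_1."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n + 1))

assert all(goldbach_instance(n) for n in range(4, 1001, 2))
```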

There are many sentences which are of the form \forall n\exists m P(n,m) in \Pi_2 for which we actually know a bound on m, so we get a sentence in \Pi_1 when we use that bound. For example, the statement that there are infinitely many primes is in \Pi_2, since it can be written as “For all n, there exists an m>n such that m is prime”, but we also know a version that’s in \Pi_1, for example, we know that there’s a prime between n and 2n for any n.
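The Bertrand-style bound turns each instance into a finite search: for a fixed n, “there is a prime m with n < m \leq 2n” is checkable outright. A sketch (helper names are mine):

```python
def is_prime(m):
    if m < 2:
        return False
    return all(m % d for d in range(2, int(m**0.5) + 1))

def bertrand_instance(n):
    """For a fixed n >= 1: is there a prime m with n < m <= 2n?
       The search is bounded, so each instance is decidable, and
       'infinitely many primes' becomes a Pi_1 statement."""
    return any(is_prime(m) for m in range(n + 1, 2 * n + 1))

assert all(bertrand_instance(n) for n in range(1, 1000))
```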

There is a theorem showing that sentences higher in this hierarchy really do get harder to decide in general:

Theorem: For any i: even in a programming language augmented with the magical ability to decide the truth of \Pi_i sentences, it is not possible to write a program which decides the truth of every \Pi_{i+1} sentence.

A Scientific Hierarchy

What if we used the above idea to classify scientific statements instead of number-theoretic ones?  We would get a new hierarchy where:

  • \Pi_0 and \Sigma_0 are both equal to the set of sentences P such that a definite experiment can be performed which demonstrates the truth or falsity of P.
  • \Pi_i and \Sigma_i for i > 0 are defined just as before.

Examples of \Pi_1 statements in this classification would be things like: Any pair of objects dropped in a vacuum will fall with the same acceleration; or given any gram of liquid water at standard pressure, adding 4.18 joules of energy will raise its temperature by 1 degree Celsius.

Of course, those examples are idealized: to make them actually true, you have to add in many more caveats and tolerances. For example, you have to say that any pair of objects weighing no more than such-and-such will fall with the same acceleration up to such-and-such tolerance (and add in various other caveats). More on this later.

An example of a \Pi_2 scientific statement might be something like: if you put any two objects in thermal contact, then for any given tolerance, there will be a time at which their temperatures are equal to within that tolerance.

You might have to use a statement of complexity \Pi_3 to say something about randomness. For example, a \Pi_3 scientific statement might be: Suppose you are doing an experiment where you acquire a carbon-11 atom, wait for it to decay to boron-11 and record the time, then repeat. Then, for any tolerance \epsilon, there will be an n such that for any m \geq n, the proportion of the first m carbon-11 atoms that decayed in less than 20.334 minutes will be no more than \epsilon away from 0.5.
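A small simulation illustrates the trend this \Pi_3 statement describes, modeling decay times as exponentially distributed with the half-life from the text (the seed, sample size, and names are my choices; a finite run can only illustrate the innermost quantifiers, not the full statement):

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

HALF_LIFE = 20.334  # carbon-11 half-life in minutes, as in the text
RATE = math.log(2) / HALF_LIFE

def decay_time():
    """Sample one exponentially distributed decay time."""
    return random.expovariate(RATE)

# By definition of half-life, each atom decays before HALF_LIFE with
# probability exactly 1/2, so the running proportion tends to 0.5.
m = 100_000
below = sum(decay_time() < HALF_LIFE for _ in range(m))
proportion = below / m
assert abs(proportion - 0.5) < 0.01
```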

A Scientific Hierarchy Theorem?

In the Arithmetic Hierarchy case, we had a theorem saying that statements higher up the hierarchy were qualitatively harder to decide the truth of.  Is there a similar theorem here?

In fact, you can make this precise (although, to be honest, I’m not sure what the cleanest way to do it is).  In particular, \Pi_1 statements are learnable in the limit of acquiring all the experimental data, while \Pi_2 statements aren’t.  One way to make this rigorous is: Suppose you have a \Pi_1 statement \forall x\,P(x), where for any object x (assume that x ranges over some countable set \{x_0, x_1, x_2, \ldots\} of physical objects), P(x) can be established or refuted by a single experiment.

Now, suppose that you have a probability distribution over all possibilities for P(x_i) for objects x_i: that is, you have a probability distribution Pr on the Cantor space where the elements of the sample space are to be interpreted as full specifications of whether or not P(x_i) holds for each x_i: e.g.,  things like P(x_0)\wedge P(x_1)\wedge P(x_2)\wedge\cdots or \neg P(x_0)\wedge P(x_1) \wedge \neg P(x_2)\wedge \cdots.

Call events which are finite conjunctions of events of the form P(x_i) or \neg P(x_i) basic events.

Now say that this probability distribution is open-minded with respect to an event E if Pr(E|B) > 0 for every basic event B that is consistent with E.

Now, assuming that Pr is open-minded with respect to the event \forall x\, P(x), it’s pretty easy to show that \lim_{n\to\infty} Pr(\forall x\, P(x) | D_n) = 1, where D_n = P(x_0)\wedge P(x_1)\wedge\cdots\wedge P(x_n).  That is, if \forall x\, P(x) is actually true, it will be learned.  (Of course, if it is false, that will be learned in finite time automatically.)
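Here is a toy instance of this convergence, under one particular open-minded prior of my own choosing (not from the post): put weight 1/2 on the all-true world and 1/2 on a world where each P(x_i) is an independent fair coin. Then Pr(D_n \mid \forall) = 1 and Pr(D_n \mid \text{coin world}) = 2^{-(n+1)}, so Bayes gives Pr(\forall x\,P(x) \mid D_n) = 1/(1 + 2^{-(n+1)}) \to 1:

```python
def posterior_all_true(n):
    """Posterior probability of 'forall x P(x)' after observing
       P(x_0), ..., P(x_n) all hold, under the toy prior described
       above: weight 1/2 on the all-true world, 1/2 on a world where
       each P(x_i) is an independent fair coin."""
    # Pr(D_n | all-true) = 1;  Pr(D_n | coin world) = 2^-(n+1)
    return 1.0 / (1.0 + 2.0 ** -(n + 1))

for n in [0, 5, 10, 20]:
    print(n, posterior_all_true(n))   # climbs from 2/3 toward 1
```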

On the other hand, for general \Pi_2 statements, it’s pretty easy to see that this is not the case:  In fact, any computable procedure assigning a probability to a statement \forall x\exists y P(x,y) just based on seeing finitely many data points of the form P(x',y') or \neg P(x',y') can be tricked by some choice of P, obtained by using the definition of the computable procedure in the definition of P itself (a diagonalization).
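The flavor of the diagonalization can be shown with a toy adversary that watches a learner’s posted probability for \forall x\exists y\,P(x,y) and picks the data so the learner never converges. The learner below is a naive stand-in of my own invention, not any particular method; a real proof diagonalizes against an arbitrary computable procedure:

```python
def learner(witnessed, pending):
    """Confident naive learner: high probability iff every x seen
       so far has a revealed witness y with P(x, y)."""
    return 0.9 if not pending else 0.1

estimates = []
witnessed = set()   # x's for which a witness has been revealed
pending = set()     # x's seen so far with only negative data
x = 0
for step in range(50):
    p = learner(witnessed, pending)
    estimates.append(p)
    if p > 0.5:
        pending.add(x)          # reveal ~P(x, y): learner turns pessimistic
    else:
        pending.discard(x)      # reveal a witness P(x, y'): optimistic again
        witnessed.add(x)
        x += 1

# the learner's probability crosses 1/2 over and over, so it never settles
flips = sum(1 for a, b in zip(estimates, estimates[1:])
            if (a > 0.5) != (b > 0.5))
assert flips >= 10
```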

I would like to emphasize, though, that the above are off-the-cuff, basically-trivial observations.  There must be a cleaner, nicer framework in which to state a scientific hierarchy theorem, but I don’t know what it is.

Science is Hard

In my opinion, thinking about the complexity of scientific statements in terms of quantifiers yields some insights.

For example, the \Pi_1 case is basically the scientific method as learned in elementary school: a hypothesis is generated, and by repeated experiments we either reject it, or gradually come to correctly accept the truth of it.

That science is not so simple can be seen just by recognizing that there are scientific hypotheses that are not in \Pi_1 form.  Probably even more significant, though, is the fact that many real-world phenomena add quantifiers to our hypotheses:

  • Measurement errors. As alluded to above, to make our scientific statements true, we have to add tolerances.  If we don’t know what the tolerances should be when we make the hypothesis, we have to add at least one quantifier.
  • Not everyone can perform the experiments. Not everyone performs experiments; in fact, most people rely on scientific consensus.
    That automatically brings the hypothesis up to (at least) \Pi_2, for example:
    There exists a time t such that for all times t' > t, the scientific consensus at time t' will be in favor of global warming (or evolution, or that eggs are good for you, or whatever).

Where can I go for more information?

I feel like thinking about the complexity of scientific hypotheses in terms of quantifier alternations is so natural that it must have been studied before, but I can’t find anything by googling around.  Does anyone know where to find more information on this?

