Thursday, April 8, 2021

Self-Proof of Consistency

 

Basing the development of foundations on the existence of a monster can lead to serious errors. For example, the [Gödel 1931] conclusion that a theory that axiomatizes its own provability cannot prove its own consistency was based on the existence of the proposition I’mUnprovable (such that I’mUnprovable ⇔ ⊬I’mUnprovable). However, the [Gödel 1931] construction of I’mUnprovable fails in foundations because it violates restrictions on orders for propositions.

 

Nonexistence of Gödel’s proposition I’mUnprovable opened up the question of a theory being able to prove its own consistency. [Gödel 1931] included provability axioms for Russell’s system including the following:

·  ModusPonens: ((⊢Φ ∧ ⊢(Φ⇒Ψ)) ⇒ ⊢Ψ)

·  DoubleNegation: ⊢(¬¬Ψ ⇔ Ψ)

 

Although not included in [Gödel 1931], the incorporation of TheoremUse {((⊢Ψ)⇒Ψ)} is fundamental to foundational theories because it enables the principle that theorems can be used in subsequent proofs. As such, TheoremUse belonged in the [Gödel 1931] axiomatization of provability for Russell’s Principia Mathematica. However, its inclusion would have made [Gödel 1931] inconsistent. Not including TheoremUse made the [Gödel 1931] axiomatization of provability fundamentally incomplete.
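The inconsistency that TheoremUse would have introduced can be sketched as follows (a sketch in LaTeX, assuming the fixed-point property that [Gödel 1931] claimed for I’mUnprovable, written P below):

```latex
% Write P for I'mUnprovable. Assumed: TheoremUse and the fixed-point property.
\begin{align*}
&\text{1. } (\vdash P) \Rightarrow P      &&\text{TheoremUse, instantiated with } P\\
&\text{2. } P \Leftrightarrow (\nvdash P) &&\text{fixed-point property claimed for } P\\
&\text{3. } \nvdash P                     &&\text{from 1 and 2, since } (\vdash P) \Rightarrow (\nvdash P)\\
&\text{4. } P                             &&\text{from 2 and 3}\\
&\text{5. } \vdash P                      &&\text{step 4 is a proof of } P
\end{align*}
% Steps 3 and 5 contradict each other.
```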

 

Inconsistency of a theory (¬Consistent) can be formally defined as there being a proposition Ψ0 such that ⊢(Ψ0 ∧ ¬Ψ0).

 

Contra [Gödel 1931], a foundational theory can prove its own consistency as shown in the following theorem:

    Theorem. Consistent

        Proof. In order to obtain a contradiction, hypothesize ¬Consistent. Consequently, there is a proposition Ψ0 such that ⊢(Ψ0 ∧ ¬Ψ0). By the principle of TheoremUse {((⊢Φ)⇒Φ)} with Ψ0 ∧ ¬Ψ0 for Φ, infer Ψ0 ∧ ¬Ψ0, which is the desired contradiction.
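Written out step by step (same notation as the text, in LaTeX):

```latex
\begin{align*}
&\text{1. } \neg\mathrm{Consistent}
  &&\text{hypothesis, to be refuted}\\
&\text{2. } \vdash(\Psi_0 \wedge \neg\Psi_0)
  &&\text{from 1, for some proposition } \Psi_0\\
&\text{3. } (\vdash(\Psi_0 \wedge \neg\Psi_0)) \Rightarrow (\Psi_0 \wedge \neg\Psi_0)
  &&\text{TheoremUse with } \Psi_0 \wedge \neg\Psi_0 \text{ for } \Phi\\
&\text{4. } \Psi_0 \wedge \neg\Psi_0
  &&\text{from 2 and 3: a contradiction}\\
&\text{5. } \mathrm{Consistent}
  &&\text{discharging the hypothesis of 1}
\end{align*}
```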

 

However, the proof does not carry conviction that a contradiction cannot be derived, because the proof is valid even if the theory is inconsistent. Consistency of the mathematical theories Actors and Ordinals is instead established by proving that each theory has a unique-up-to-isomorphism model with a unique isomorphism.

 

See the following for more information:


    "Epistemology Cyberattacks"
         https://papers.ssrn.com/abstract=3603021

 

 

Tuesday, March 30, 2021

An Actor Application Can Be Hundreds of Times Faster Than a Parallel Lambda Expression

Communication between Turing Machines and between lambda expressions was crucially omitted, thereby crippling them as a foundation for Computer Science. Actors [Hewitt et al. 1973] remedied the foundations to provide scalable computation. Church’s lambda calculus crucially omitted concurrency, with the consequence that an Actor machine can be hundreds of times faster than a parallel lambda expression. In practice, an Actor can be hundreds of times faster because, in order to carry out a concurrent computation, a parallel lambda expression must in parallel gather together the results of the computation step of each core and then in parallel redistribute the communicated messages from each core to the other cores. The slowdown comes from the overhead of a lambda expression fanning out from a core to all the other cores, performing a step on each core, and then gathering and sorting the results of the steps (cf. [Strachey 2000]).

Actor Speed-Up Theorem. There are practical Actor computations of time t which in the parallel lambda calculus require time a significant multiple of t*log[n], where n is the number of cores on a machine.
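The per-step overhead described above can be illustrated with a toy cost model (a sketch in Python; the function names, the tree-structured barrier costing log n per step, and the constants are illustrative assumptions, not measurements):

```python
import math

def actor_time(steps, step_cost=1.0):
    # Actors: messages go point-to-point, so the computation pays no
    # per-step global barrier in this model.
    return steps * step_cost

def parallel_lambda_time(steps, n_cores, step_cost=1.0, comm_cost=1.0):
    # Parallel lambda expression, as described in the text: each step
    # must gather results from all cores and redistribute messages,
    # modeled here as a tree-structured barrier costing O(log n) hops.
    return steps * (step_cost + comm_cost * math.log2(n_cores))

steps, cores = 10_000, 1024
ratio = parallel_lambda_time(steps, cores, comm_cost=50.0) / actor_time(steps)
print(f"modeled slowdown on {cores} cores: {ratio:.1f}x")
# prints: modeled slowdown on 1024 cores: 501.0x
```

With inter-core communication assumed 50 times slower than a local step, the modeled slowdown on 1024 cores is 501x; the point is only that the gather/redistribute factor grows with log n per step, while the Actor computation pays no such global barrier.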

 

For more information, see the following:

    "Epistemology Cyberattacks"
         https://papers.ssrn.com/abstract=3603021

 

 

The Church/Turing Thesis is False

The Church/Turing Thesis [cf. Turing 1936] was developed to the effect that a nondeterministic Turing Machine can perform any computation. The Thesis has been the overwhelming consensus since 1936.

 

Global states are central to the Church/Turing model of computation. In this model, there is a global state with a nondeterministic transition to the next state. A state transition can be initiated by a computation step that specifies that one of a number of specified steps can be executed next. A global-state model of computation has the property of bounded nondeterminism: there is a bound, determined in advance, on the number of possibilities that a system can explore given that it must always come back with an answer.

In the development of a theory of concurrent computation, Dijkstra remained committed to a global-state model of computation. This commitment gave rise to the “unbounded nondeterminism” controversy. Digital systems can be physically indeterminate by making use of arbiters to resolve reception order in communication between systems. Because of physical indeterminacy, a digital system can have the property of unbounded nondeterminism: no bound exists, when the system is started, on the number of possibilities that an always-responding digital system can explore. Being restricted to bounded nondeterminism is one of the reasons that the Church/Turing model of computation is inadequate for digital computation. Consequently, there are digital computations that cannot be performed by a nondeterministic Turing Machine.
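The distinction between bounded and unbounded nondeterminism can be illustrated with a small sketch in Python (the counter and the random arbiter below are illustrative assumptions; randomness merely stands in for the physical indeterminacy of a hardware arbiter deciding reception order):

```python
import random

def unbounded_counter(arbiter):
    # Hypothetical model: a counter that keeps receiving 'go' messages
    # until the arbiter delivers 'stop'. The 'stop' message is always
    # delivered eventually, but no bound on when can be fixed in advance.
    n = 0
    while arbiter() == "go":
        n += 1
    return n

# With a fair random arbiter, every run halts with a finite count, yet
# for any bound b some run can return a value exceeding b -- the
# signature of unbounded nondeterminism.
rng = random.Random(0)
results = {unbounded_counter(lambda: rng.choice(["go", "stop"]))
           for _ in range(1000)}
print(min(results), max(results))
```

Every run halts and returns a finite count, yet the set of possible return values is unbounded; a global-state model that must always answer can only realize a set of outcomes bounded in advance.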

 

See the following for more information:


"Epistemology Cyberattacks"