Thursday, January 27, 2022

Current Articles and Videos (frequently updated at the link provided for each article and video)


    • Universal Intelligent Systems by 2030


    • Linux 60th Anniversary Keynote


    • Project Liftoff: Universal Intelligent Systems (UIS) by 2030

    • Technology for Project Liftoff

      • Resilience Against Cyberattacks using Strongly-typed Actors


    video: “Preventing Successful Cyberattacks Using Strongly-typed Actors” 


    • Robust Inference for Universal Intelligent Systems

    • Secure (Against Direct Cyberattacks) Backdoors for Universal Intelligent Systems  
    • Citadels: faster response time and better information integration than remote datacenters 
    • Information Security Requires Strongly-Typed Actors and Theories

    • Physical Indeterminacy in Digital Computation

    • Theory Ordinals can Replace ZFC in Computer Science

    • Academic Biography of Professor Carl Hewitt  

    Wednesday, January 26, 2022


     Universal Intelligent Systems by 2030


    Carl Hewitt with John Perry


    Apple, Facebook, and Microsoft project that by 2030, electronic glasses will have replaced smartphones. Electronic glasses will robustly integrate information under contention in real time, using abstract reasoning about information from sensors in the glasses, from the Internet, and from the system's own internal storage. Intelligent Systems using electronic glasses will advance beyond current smartphone systems as follows:

    ·        Educable in incorporating and using additional abstractions and principles, although by 2030 they will not yet be adept at formulating genuinely new abstractions.

    ·        Self-informative in understanding their own capabilities and limitations.

    ·        Able to engage in discourse with humans.

    An Apollo-scale project will be required to implement and deploy Universal Intelligent Systems by 2030, which will be even more transformative than the smartphone revolution:

    ·        Globally (military, economic, technological)

    ·        Domestically (administration, health, education, commerce, communities, and surveillance)

    All aspects of society could become completely dependent on Intelligent Systems, making orders-of-magnitude greater resilience against cyberattacks absolutely necessary. Cyberresilience requires inherently secure abstractions (beyond the inherently insecure von Neumann architecture) that can be used to prove technical specifications.

    Carl Hewitt

    Carl Hewitt is an MIT emeritus professor. Together with his colleagues and students, he is known for the Actors Abstraction, which is more general than the Church/Turing theory of computation. Unlike the von Neumann architecture, the Actors Abstraction is inherently secure. ActorsTheory (which characterizes the Actors Abstraction up to a unique isomorphism) provides foundations for proving technical specifications of practical computer systems. Practical frameworks for Actors have been developed by Apple, Erlang Solutions, Lightbend, Microsoft, and others, and very large systems have been deployed. The Actors Abstraction is foundational for Formalized Discourse, which is a crucial technology for Universal Intelligent Systems.
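    As an illustration of the Actor style (a minimal sketch in Python, not any of the frameworks named above), each actor encapsulates private state and processes messages from its mailbox one at a time:

```python
import queue
import threading

class Counter:
    """A minimal actor: private state, a mailbox, one message at a time."""

    def __init__(self):
        self._mailbox = queue.Queue()
        self._count = 0  # private state, never accessed directly from outside
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message, reply_to=None):
        """Asynchronous send: enqueue the message and return immediately."""
        self._mailbox.put((message, reply_to))

    def _run(self):
        while True:  # process messages serially, in arrival order
            message, reply_to = self._mailbox.get()
            if message == "increment":
                self._count += 1
            elif message == "get":
                reply_to.put(self._count)

counter = Counter()
for _ in range(3):
    counter.send("increment")
reply = queue.Queue()
counter.send("get", reply)
result = reply.get(timeout=5)  # 3: the sends above are processed in order
```

    Because the actor's state is reachable only through messages, there is no shared mutable memory to attack or corrupt, which is the sense in which the abstraction is inherently more secure than directly addressable storage.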

    Hewitt is also known for ActorsTheory providing greater resilience against cyberattacks on foundations:

    ·   The very existence of the [Gödel 1931] paradoxical proposition I’mUnprovable is a potential source of cyberattacks in foundational theories. ActorsTheory prevents construction of the proposition using restrictions on orders of propositions.

    ·   ActorsTheory resolves the [Church 1934] paradoxical proposition MyTheoremsAreEnumerable, which Church stated meant the end of mathematical logic. The proposition had been axiomatic in the foundations of mathematics beginning with Euclid. Using a diagonal argument, [Church 1934] showed that including the proposition in a foundational theory leads to contradiction. The solution to the paradox is to change the definition of a theorem to be a proposition whose proof can be algorithmically checked to be correct. Proof checking is algorithmically decidable although theorems are not algorithmically enumerable.

        In this way, MyStringTheoremsOfAnOrderAreEnumerable can be proved to be a logically undecidable proposition in ActorsTheory:

    o The proposition cannot be proved in ActorsTheory because of the diagonal argument in [Church 1934].

    o It cannot be disproved in ActorsTheory because it is provably true in the mathematical model of ActorsTheory using the metatheory of ActorsTheory.
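    The distinction above — proof checking is decidable even though theoremhood is not enumerable — can be illustrated with a toy checker (a sketch in Python under simplifying assumptions: formulas are strings or ("->", p, q) tuples, and the only inference rule is modus ponens):

```python
def check_proof(axioms, proof):
    """Decide whether `proof` is a correct derivation from `axioms`.

    Each line must be an axiom or follow by modus ponens from two
    earlier lines. The checker always terminates with a yes/no answer,
    so checking a given proof is algorithmically decidable -- even
    though no algorithm can enumerate exactly the set of theorems.
    """
    derived = []
    for line in proof:
        ok = line in axioms or any(
            implication == ("->", antecedent, line)
            for implication in derived
            for antecedent in derived
        )
        if not ok:
            return False
        derived.append(line)
    return True

axioms = {"A", ("->", "A", "B")}
valid = check_proof(axioms, ["A", ("->", "A", "B"), "B"])  # a correct proof of B
invalid = check_proof(axioms, ["B"])                       # B asserted with no justification
```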

    John Perry

    John Richard Perry is Henry Waldgrave Stuart Professor of Philosophy Emeritus at Stanford University and Distinguished Professor of Philosophy Emeritus at the University of California, Riverside. He has made significant contributions to philosophy in the fields of philosophy of language, metaphysics, and philosophy of mind. He is known primarily for his work on situation semantics (together with Jon Barwise), reflexivity, indexicality, personal identity, and self-knowledge.

    Perry is a founder of the Center for the Study of Language and Information (CSLI), an independent research center at Stanford University with philosophers, computer scientists, linguists, and psychologists from Stanford, SRI International, and Xerox PARC. It strives to study all forms of information and improve how humans and computers acquire and process it.

    Thursday, April 8, 2021

    Self-Proof of Consistency


    Basing the development of foundations on the existence of a monster can lead to serious errors. For example, the [Gödel 1931] conclusion that a theory that axiomatizes its own provability cannot prove its own consistency was based on the existence of the proposition I’mUnprovable (such that I’mUnprovable ⇔ ⊬I’mUnprovable). However, the [Gödel 1931] construction of I’mUnprovable fails in foundations because it violates restrictions on orders for propositions.


    Nonexistence of Gödel’s proposition I’mUnprovable opened up the question of whether a theory can prove its own consistency. [Gödel 1931] included provability axioms for Russell’s system, including the following:

    ·  ModusPonens: ((⊢(Φ⇒Ψ))∧(⊢Φ))⇒(⊢Ψ)

    ·  DoubleNegation: ⊢(¬¬Ψ⇔Ψ)


    Although not included in [Gödel 1931], the incorporation of TheoremUse {((⊢Ψ)⇒Ψ)} is fundamental to foundational theories because it enables the principle that theorems can be used in subsequent proofs. As such, TheoremUse belonged in the [Gödel 1931] axiomatization of provability for Russell’s Principia Mathematica. However, its inclusion would have made [Gödel 1931] inconsistent. Not including TheoremUse made the [Gödel 1931] axiomatization of provability fundamentally incomplete.


    Inconsistency of a theory (¬Consistent) can be formally defined as there being a proposition Ψ0 such that ⊢(Ψ0∧¬Ψ0).


    Contra [Gödel 1931], a foundational theory can prove its own consistency as shown in the following theorem:

        Theorem. Consistent

            Proof. In order to obtain a contradiction, hypothesize ¬Consistent. Consequently, there is a proposition Ψ0 such that ⊢(Ψ0∧¬Ψ0). By the principle of TheoremUse {((⊢Φ)⇒Φ)} with Ψ0∧¬Ψ0 for Φ, it follows that Ψ0∧¬Ψ0, which is the desired contradiction.
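    The shape of this argument can be sketched as a toy formalization in Lean (hypothetical names; Provable and theoremUse are assumed stand-ins for the provability notation and the TheoremUse principle, not an actual formalization of ActorsTheory):

```lean
-- Toy axiomatization (hypothetical): Provable stands in for ⊢,
-- and theoremUse for the principle {((⊢Φ)⇒Φ)}.
axiom Provable : Prop → Prop
axiom theoremUse : ∀ (p : Prop), Provable p → p

-- "Consistent": there is no proposition p with ⊢(p ∧ ¬p).
theorem consistent : ¬ ∃ p : Prop, Provable (p ∧ ¬p) :=
  fun ⟨p, h⟩ =>
    -- TheoremUse turns provability of the contradiction into the
    -- contradiction itself.
    (theoremUse _ h).2 (theoremUse _ h).1
```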


    However, the proof does not carry conviction that a contradiction cannot be derived, because the proof is valid even if the theory is inconsistent. Consistency of the mathematical theories Actors and Ordinals is instead established by proving that each theory has a model that is unique up to a unique isomorphism.


    See the following for more information:

        "Epistemology Cyberattacks"



    Tuesday, March 30, 2021

    An Actor Application Can Be Hundreds of Times Faster Than a Parallel Lambda Expression

    Communication between Turing Machines and between lambda expressions was crucially omitted, crippling them as a foundation for Computer Science. Actors [Hewitt et al. 1973] remedied the foundations to provide scalable computation. Because Church’s lambda calculus omitted concurrency, an Actors machine can be hundreds of times faster than a parallel lambda expression. The reason is that in order to carry out a concurrent computation, a parallel lambda expression must in parallel gather together the results of the computation step of each core and then in parallel redistribute the communicated messages from each core to the other cores. The slowdown comes from the overhead of a lambda expression fanning out from a core to all the other cores, performing a step on each core, and then gathering and sorting the results of the steps (cf. [Strachey 2000]).

    Actor Speed-Up Theorem. There are practical Actor computations of time t which in the parallel lambda calculus require time a significant multiple of t·log(n), where n is the number of cores on a machine.
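    The theorem can be illustrated with a toy cost model (an illustrative sketch, not a benchmark; modeling the gather/redistribute as a fan-in/fan-out tree of depth log2(n) is an assumption):

```python
import math

def actor_cost(steps: int) -> int:
    """Actors send messages point-to-point: cost proportional to steps."""
    return steps

def parallel_lambda_cost(steps: int, cores: int) -> int:
    """Each step pays a fan-out/fan-in tree over all cores (~log2 depth)
    to gather the per-core results and redistribute the messages."""
    return steps * math.ceil(math.log2(cores))

# On a hypothetical 256-core machine, the modeled per-step slowdown is
# log2(256) = 8, before any constant factors from sorting the results.
slowdown = parallel_lambda_cost(1000, 256) / actor_cost(1000)
```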


    For more information, see the following:

    Epistemology Cyberattacks



    The Church/Turing Thesis is False

    The Church/Turing Thesis [cf. Turing 1936] was developed to the effect that a nondeterministic Turing Machine can perform any computation. The Thesis has been the overwhelming consensus since 1936.


    Global states are central to the Church/Turing model of computation. In this model, there is a global state with a nondeterministic transition to the next state. A state transition can be initiated by a computation step that specifies that one of a number of specified steps can be executed next. A global-state model of computation has the property of bounded nondeterminism: there is a bound, determined in advance, on the number of possibilities that a system can explore given that it must always come back with an answer.

    In the development of a theory of concurrent computation, Dijkstra remained committed to a global-state model of computation. This commitment gave rise to the “unbounded nondeterminism” controversy. Digital systems can be physically indeterminate by making use of arbiters to resolve reception order in communication between systems. Because of physical indeterminacy, a digital system can have the property of unbounded nondeterminism: no bound exists, when the system is started, on the number of possibilities that an always-responding digital system can explore. Being restricted to bounded nondeterminism is one of the reasons that the Church/Turing model of computation is inadequate for digital computation. Consequently, there are digital computations that cannot be performed by a nondeterministic Turing Machine.
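    The distinction can be made concrete with a small sketch (Python threads standing in for hardware: the OS scheduler plays the role of the arbiter, resolving the order in which the stop signal is received relative to the increments):

```python
import threading

def count_until_stopped():
    """Always terminates, yet no bound on the result can be fixed in
    advance: the count depends on when the concurrent 'stop' signal
    wins the race against the counting loop."""
    stop = threading.Event()
    threading.Thread(target=stop.set).start()  # concurrent stop signal
    count = 0
    while not stop.is_set():
        count += 1
    return count

result = count_until_stopped()  # some nonnegative integer, unbounded a priori
```

    Every run returns an answer, but the set of possible answers is infinite, which is exactly the unbounded nondeterminism that a global-state model with bounded nondeterminism cannot exhibit.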


    See the following for more information:

    "Epistemology Cyberattacks"