Humanity's exploration of 'computation' is not a merely mechanical affair but a profound, centuries-long 'transfinite' iteration. Its narrative traces back to Leibniz's grand vision, advances through generation after generation of scientists toward the goal of mathematical axiomatization, and ultimately crystallizes into modern achievements such as the Turing Machine and Bitcoin.
I. Prologue: Leibniz's Axiomatic Dream and the Germination of Computation
The story begins in the 17th century with Gottfried Wilhelm Leibniz. He was the first to propose a revolutionary axiomatic hypothesis: everything can be determined through computation. Leibniz believed that through a 'universal language' and 'calculus ratiocinator', all human disputes, even philosophical disagreements, could be resolved through pure computation, as clearly as mathematical problems. This laid the foundation for mathematical axiomatization and sowed the seeds for the future concept of 'computation'. His thoughts foreshadowed a world where logic and reasoning could be formalized and mechanized.
II. The First Iteration: Cantor and Hilbert's Challenge of 'Infinity' and the Pursuit of Axiomatization
Leibniz's vision then settled for roughly two centuries, until at the end of the 19th century and the beginning of the 20th the mathematical community arrived at an important iteration:
Georg Cantor was the 'anomaly' of this iteration. He broke with mathematics' traditional confinement to the finite, introducing the concepts of transfinite numbers and transfinite ordinals. Cantor's study of infinity challenged Leibniz's axiom that 'everything can be computed': it revealed the layers and complexities of the infinite, while prompting mathematicians to ask how these new mathematical objects could be formalized and axiomatized. His work, taken up as early as the paper von Neumann wrote while still a schoolboy, opened the possibility of giving 'infinity' an axiomatic definition.
Following closely came David Hilbert, who stood as Cantor's staunch defender. Hilbert keenly perceived the foundational crisis confronting mathematics and strongly advocated axiomatizing the entire mathematical system. His goal was to establish a complete, consistent, and decidable mathematics that could serve as a solid foundation for any science. This was a direct inheritance and deepening of Leibniz's axiomatic dream: an attempt to harness every mathematical concept, 'infinity' included, within a rigorous axiomatic system.
III. The Second Iteration: von Neumann's Axiomatization of Set Theory and the Vision of Computing Machines
Building on Cantor and Hilbert, John von Neumann took the historical stage, pushing this exploration to new heights.
Von Neumann's academic career began with the axiomatization of set theory. His third paper, 'An Axiomatization of Set Theory', responded directly to Hilbert's call for the axiomatization of mathematics and made significant contributions to the foundations of modern set theory. Through a rigorous axiomatic system he attempted to tame the infinity Cantor had revealed, bringing it into a controllable logical framework.
However, von Neumann's vision extended further. He carried the axiomatic way of thinking into an inquiry about the essence of computation itself. His life's work has been described as running 'from axiomatic set theory to the invention of computers that mimic the human brain', and this is no coincidence: his profound grasp of formal logic and algorithms made him one of the founders of modern computer architecture. He turned abstract logical operations into concrete, executable instructions and data processing, paving the way for the birth of the universal computing machine.
IV. The Third Iteration: Gödel's Insights on Incompleteness and the Arrival of the Turing Machine Era
In the pursuit of mathematical axiomatization and formalized computation, the emergence of Kurt Gödel brought the most profound and shocking insights.
Gödel's Incompleteness Theorems, published in 1931, struck like a bolt from the blue. The first theorem shows that any consistent formal system strong enough to express Peano arithmetic is incomplete: within such a system there always exist true propositions that the system itself cannot prove. The second adds that if such a system is consistent, its consistency cannot be proven within the system itself. This discovery struck directly at Hilbert's grand plan for complete axiomatization, showing that any formal system rich enough to encompass computation carries inherent limitations and can never be both complete and consistent.
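For readers who want the precise claims, the two theorems can be stated compactly in standard textbook notation. The formulation below, for a consistent, recursively axiomatizable theory T extending Peano arithmetic, is supplied here as a convenience and is not from the original text:

```latex
% G1: there is a sentence G_T that T neither proves nor refutes,
%     yet G_T is true in the standard model of arithmetic.
% G2: T cannot prove its own consistency statement Con(T).
\begin{align*}
\textbf{G1:}\quad & \exists\, G_T:\; T \nvdash G_T
    \;\text{ and }\; T \nvdash \neg G_T,
    \;\text{ while } \mathbb{N} \models G_T \\
\textbf{G2:}\quad & T \nvdash \mathrm{Con}(T)
\end{align*}
```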
Nevertheless, Gödel held Leibniz in the highest regard. He even believed that Leibniz had glimpsed, centuries earlier, deep principles underlying theories such as von Neumann's game theory. Gödel spent years poring over Leibniz's unpublished manuscripts, hoping to find evidence for these far-sighted conjectures, a pursuit that reflected both his profound engagement with Leibniz's axiom that 'everything can be computed' and his yearning for truths hidden deep in history.
Gödel's Incompleteness Theorems, while exposing the limitations of formal systems, also indirectly pushed toward a more precise definition of 'computation'. It was in this context that Alan Turing proposed the Turing Machine, an abstract computational model, in 1936. The Turing Machine pinned down the limits of 'computability' in an extremely minimal form: by the Church–Turing thesis, any algorithm can be carried out on this simple model, which gave the 'formal systems' Gödel had analyzed a clear, operational definition.
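To make the model concrete, here is a minimal sketch of a one-tape Turing machine simulator. The transition-table encoding and the example machine (a unary incrementer) are illustrative assumptions of this sketch, not anything specified in the text above:

```python
# A minimal one-tape Turing machine simulator (sketch).
# transitions: (state, symbol) -> (new_state, symbol_to_write, head_move)
# The tape is a sparse dict; unwritten cells read as the blank '_'.

def run_turing_machine(transitions, tape, state="q0", halt="halt", max_steps=10_000):
    tape = dict(tape)
    head = 0
    for _ in range(max_steps):
        if state == halt:
            return tape
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    raise RuntimeError("machine did not halt within max_steps")

# Example machine: append one '1' to a unary number (i.e., n -> n + 1).
incrementer = {
    ("q0", "1"): ("q0", "1", +1),    # scan right across the 1s
    ("q0", "_"): ("halt", "1", +1),  # write a trailing 1, then halt
}

tape = {i: "1" for i in range(3)}  # the number 3 in unary: "111"
result = run_turing_machine(incrementer, tape)
print("".join(result[i] for i in sorted(result)))  # -> "1111"
```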
The von Neumann architecture computer is the physical embodiment of the Turing Machine in the real world. Modern computers operate on the Turing Machine's principles, achieving universal computation by storing programs and data in a common memory. With that, humanity's pursuit of computation through mathematical axiomatic assumptions truly 'landed', although Gödel reminds us that this 'landing' is not without boundaries.
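The stored-program idea itself fits in a few lines. The toy machine below, with a deliberately minimal three-instruction set (an assumption made purely for illustration), shows instructions and data sharing one memory while a fetch-decode-execute loop drives the computation:

```python
# A toy stored-program machine: code and data live in the same memory,
# the heart of the von Neumann architecture. Each instruction is an
# (opcode, operand) tuple; data cells are plain integers.

def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch and decode
        pc += 1
        if op == "LOAD":              # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":             # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "HALT":
            return acc

# The program occupies cells 0-2; its data occupies cells 3-4.
memory = [
    ("LOAD", 3),   # 0: acc = 40
    ("ADD", 4),    # 1: acc += 2
    ("HALT", 0),   # 2: stop
    40,            # 3: data
    2,             # 4: data
]
print(run(memory))  # -> 42
```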
V. The Fourth Iteration: The Oracle Turing Machine, Transfinite Craftsmanship, and the Birth of Bitcoin
With the rapid development of computer technology, humanity's understanding of 'computation' did not stop there; it kept iterating toward the 'transfinite' dimension, ultimately giving birth to the epoch-making invention of Bitcoin.
The underlying logic of Bitcoin can be seen as a further extension of the Turing Machine concept, particularly in relation to the **'Oracle Turing Machine'** and Transfinite Craftsmanship. An Oracle Turing Machine is a hypothetical Turing machine that may consult an 'oracle' which answers, in a single step, questions that no ordinary Turing machine can decide in finite time. Bitcoin itself is not an Oracle Turing Machine, but through cryptography and a distributed ledger it constructs a decentralized 'trust oracle', making global value transfer and consensus possible without any central authority.
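The notion of computing relative to an oracle is easy to sketch. In the fragment below, the halting oracle is a stand-in: no real implementation of it can exist (that is precisely Turing's point), so the mock handles only a toy language and is purely an illustrative assumption:

```python
from typing import Callable

# Oracle-relative computation (sketch): an ordinary program that may ask
# a black-box `oracle` an otherwise undecidable question, in one step.

def count_halting(programs: list[str], oracle: Callable[[str], bool]) -> int:
    """With oracle access, an undecidable census becomes a simple loop."""
    return sum(1 for p in programs if oracle(p))

# Mock oracle for a toy language in which the only diverging program is
# the literal string "loop forever" (an illustrative assumption).
def toy_halting_oracle(source: str) -> bool:
    return source != "loop forever"

print(count_halting(["print 1", "loop forever", "loop 3 times"],
                    toy_halting_oracle))  # -> 2
```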
Transfinite Craftsmanship can be understood as the intricate art of weaving and ordering information and value, in the spirit of Turing's doctoral thesis, Systems of Logic Based on Ordinals. Bitcoin's blockchain is essentially an immutable, time-ordered chain of transaction records in which each block occupies a unique 'ordinal' position. Through hash functions, proof of work, and the chain structure, Bitcoin achieves decentralized consensus and tamper resistance, constructing an unprecedented 'trust machine'.
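These three mechanisms compose naturally in code. The toy chain below links each block to its predecessor's hash and mines a nonce by proof of work; the four-zero difficulty and the block fields are drastic simplifications of Bitcoin's actual parameters, chosen only to make the sketch run quickly:

```python
import hashlib
import json

# Toy blockchain (sketch): every block stores the hash of its predecessor,
# giving it a unique ordinal position, and a nonce found by proof of work.

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def mine(prev_hash: str, height: int, data: str, difficulty: int = 4) -> dict:
    """Search for a nonce whose block hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        block = {"height": height, "prev": prev_hash, "data": data, "nonce": nonce}
        if block_hash(block).startswith("0" * difficulty):
            return block
        nonce += 1

# Build a three-block chain; altering any block breaks every later link,
# since each successor commits to its predecessor's hash.
chain = [mine("0" * 64, 0, "genesis")]
for i, tx in enumerate(["alice->bob: 1", "bob->carol: 2"], start=1):
    chain.append(mine(block_hash(chain[-1]), i, tx))

for b in chain:
    print(b["height"], block_hash(b)[:16], b["data"])
```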
The birth of Bitcoin marks another milestone in humanity's integration of mathematical axiomatization, computational theory, and practical application. It expands 'computation' from merely processing data to creating and managing digital scarcity and value, achieving 'computable trust' on a global scale through a distributed network. This construction of trust is yet another 'transfinite' extension of Leibniz's original axiomatic dream: bringing social consensus and value systems into a computable, verifiable realm.
From Leibniz's grand vision on paper, through Cantor's exploration of infinity, Hilbert's call for axiomatization, Gödel's profound insights, von Neumann's realization in machines, and Turing's theoretical foundation, to Bitcoin's practical innovation, humanity's 'transfinite exploration of computation' is a continuing evolution. Each generation of scientists builds on the thought of its predecessors, using mathematical axiomatic assumptions to push Leibniz's original vision step by step toward reality while continuously expanding its boundaries of application, and the process continues to this day. Gödel's Incompleteness Theorems remind us that even the most rigorous mathematical and computational systems harbor mysteries we cannot fully grasp, which in turn inspires humanity's eternal quest to transcend its cognitive limits. This is the transfinite iterative craftsmanship inherent in humanity itself, built on the axiom that 'everything is computable'.