James & Theory
AKA: THE PHYSICS UNDERGROUND

Research

Aaron has a Bachelor's degree in physics and a Bachelor's degree in engineering (Civil/Structural). His work focuses specifically on the unification problem, on building tools that efficiently expedite work on it, and on developing new applications from the new processes identified. Fields investigated include fundamental physical and mathematical models of Entropy, Special Relativity, 1PN and 2PN General Relativity, Gravitomagnetism, Electromagnetism, Quantum Theory, Nuclear and Particle Physics, and Astrophysical processes: Dark Matter, Dark Energy and Cosmology. The focus is on finding causal, continuous physical and associated mathematical links between these diverse fields.

Emergence Analysis

At some point in time, someone will be faced with convincing the rest of the world that they have found the unified model of physics. Their problem then becomes one of providing very convincing scientific proof that they have the correct model. Although heuristic, non-quantifiable qualities such as beauty, symmetry and simplicity are regularly claimed to identify the unified model, a much harder, clearer scientific tool was sought to generate this proof. A truly unified model would not make just a few clean predictions; it would need to make all clean predictions.

We can statistically and conservatively define the chance of a single correct independent prediction as (1/2)^1, and of n successive correct independent predictions as P(n) = (1/2)^n. Once a model achieves n >= 22 correct predictions, it satisfies the standard compliance threshold for scientific discovery (5 sigma). I have independently developed Emergence Analysis as a series of simple scientific tools specifically targeted at deconstructing and error-correcting unified models. While this sets the basic compliance metric for a unified model, the truly unified model would sail right past this metric, predicting every emergent behaviour and system in its path — which makes it very easy to recognise and prove. This statistical tool is called the Binary Hypothesis Test (BHT), which, rather than looking at a single emergent layer of behaviour, has here been expanded to test independent predictions on a whole-of-model basis. The BHT is exceedingly efficient at sorting models that comply with this minimum standard for unification from those that do not, and as such becomes a valuable tool for redirecting resources to unification approaches that are scientifically viable.

BHT — Predictive and Compliance Metrics

Binary Hypothesis Test: P(n) = (1/2)^n.

When n = 22, P(22) = (1/2)^22 ≈ 2.38 × 10^−7, below the one-sided 5 σ threshold of ≈ 2.87 × 10^−7.

For n ≥ 22, the model meets the minimum compliance standard for unification. That being the case, the seed model is highly likely to carry patterns and/or structure (seed complexity) that we can then data-mine for further unification refinements.
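As a numerical check, the threshold arithmetic above can be sketched in a few lines of Python. The function names (`bht_probability`, `sigma_to_p`) are illustrative, not part of any published BHT tooling; the 5 σ level is taken as the conventional one-sided Gaussian tail probability.

```python
import math

def bht_probability(n: int) -> float:
    """Chance of n successive correct independent binary predictions: (1/2)^n."""
    return 0.5 ** n

def sigma_to_p(sigma: float) -> float:
    """One-sided Gaussian tail probability for a given sigma level."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

five_sigma = sigma_to_p(5.0)   # ≈ 2.87e-7
p22 = bht_probability(22)      # ≈ 2.38e-7

print(f"P(22)            = {p22:.3e}")
print(f"5-sigma threshold = {five_sigma:.3e}")
print("Meets 5 sigma:", p22 <= five_sigma)
```

The smallest n clearing a given p-value is ceil(log2(1/p)); for the 5 σ tail probability that gives n = 22, matching the figure quoted above.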

Status: scaffold

Parsimony Metrics

Track total seed complexity and diagnose “fudge-factor load” vs emergence-driven economy.
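The page does not yet define these metrics concretely, so the following is only an illustrative sketch: a toy tally that counts a model's free parameters and postulates as a crude proxy for seed complexity, and reports the tuned fraction as "fudge-factor load". All names and counts are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SeedModel:
    """Toy bookkeeping for a candidate model's seed complexity (illustrative only)."""
    name: str
    free_parameters: int  # constants tuned to fit data
    postulates: int       # independent assumptions in the seed

    def seed_complexity(self) -> int:
        """Total seed complexity as a simple count."""
        return self.free_parameters + self.postulates

    def fudge_factor_load(self) -> float:
        """Fraction of the seed that is tuned rather than derived."""
        return self.free_parameters / self.seed_complexity()

# Hypothetical example counts, not measured values.
models = [
    SeedModel("Model A", free_parameters=19, postulates=4),
    SeedModel("Model B", free_parameters=2, postulates=3),
]
for m in sorted(models, key=SeedModel.seed_complexity):
    print(m.name, m.seed_complexity(), round(m.fudge_factor_load(), 2))
```

Ranking by `seed_complexity` while watching `fudge_factor_load` is one way such a diagnostic could separate parameter-heavy fits from emergence-driven economy.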

Status: scaffold

Entropy

Entropy is identified as a vehicle that minimises algorithmic complexity for system self-evolution, both physically and mathematically. Entropic system evolution eliminates the layers of abstraction required to drive system evolution, and thus eliminates the unrestrained degrees of abstraction freedom that have hampered unification. Entropy increase is an increase in disorder; when temporally reversed, entropy decreases and system state complexity decreases. This is identified as physical parsimony (with associated mathematical parsimony metrics). Parsimony metrics place constraints on seed model complexity and become another simple but powerful tool for deconstructing unification models.

Status: scaffold

Map of the Viable Unification Landscape

The combination of the BHT and parsimony metrics generates the main tools of Emergence Analysis. Together they let us quantify the relative seed complexity of long-accepted models such as General Relativity and Quantum Theory, and rank their relative positions on an ideal unification tree. This in turn allows us to map the fundamentality of these models relative to one another, with the surprising result that General Relativity is found to be a much more fundamental model than Quantum Theory. We thus have the tools to scientifically identify that attempts to quantise gravity as a subset of quantum theory have close to zero chance of success, whereas building quantum theory from General Relativity is a scientifically supported route to unification.

Placeholder for the “emergence tree / landscape map” graphic.