James & Theory
AKA: THE PHYSICS UNDERGROUND

Research

Aaron holds a Bachelor's degree in physics and a Bachelor's degree in engineering (Civil/Structural). His work focuses specifically on the unification problem, the generation of tools to efficiently expedite it, and the development of new applications from the new processes identified. Fields investigated include fundamental physical and mathematical models of Entropy, Special Relativity, 1PN and 2PN General Relativity, Gravitomagnetism, Electromagnetism, Quantum Theory, Nuclear and Particle Physics, and astrophysical processes: Dark Matter, Dark Energy and Cosmology. The focus is on finding causal, continuous physical and associated mathematical links between these diverse fields.

Emergence Analysis

At some point a unified model will be discovered, and its discoverer(s) will need to prove they have found the unified model of physics. Claims of beauty, symmetry and simplicity are subjective and unscientific; a far more rigorous scientific and mathematical approach is required. A truly unified model would not make just a few clean predictions, it would need to make all clean predictions, and such a model would be statistically highly unlikely to make many successive correct predictions by chance. We can quantify that low probability with a statistical tool called the Binary Hypothesis Test (BHT). Treat each model prediction like a coin toss: the probability of a model making a single correct independent prediction by chance is (1/2)^1, and of n successive correct independent predictions P(n) = (1/2)^n. Once a model achieves n ≥ 22 correct predictions, the probability of that occurring by chance becomes very small; indeed, we can match that low probability to the standard threshold for scientific discovery (5 σ).

I have independently developed Emergence Analysis as a series of simple scientific tools specifically targeted at deconstructing and error-correcting unified models. While the BHT sets the basic compliance metric for a unified model, the truly unified model would sail right past this metric, predicting every emergent behaviour and system in its path, which makes it very easy to recognise and prove. Rather than testing a single emergent layer of behaviour, the BHT has here been expanded to test independent predictions on a whole-of-model basis. The BHT is exceedingly efficient at testing and filtering for minimum unification compliance, and as such becomes a valuable tool for redirecting resources to scientifically and statistically viable unification models.

BHT — Predictive and Compliance Metrics

Binary Hypothesis Test: P(n) = (1/2)^n.

When n = 22, P(22) = (1/2)^22 ≈ 2.4 × 10^−7, below the one-sided 5 σ discovery threshold (≈ 2.9 × 10^−7).
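As a quick numerical sketch, the threshold can be checked by searching for the smallest n at which (1/2)^n drops below the one-sided 5 σ Gaussian tail probability:

```python
from math import erfc, sqrt

def bht_p(n: int) -> float:
    # Probability of n successive correct independent binary predictions by chance.
    return 0.5 ** n

def sigma_tail(z: float) -> float:
    # One-sided Gaussian tail probability for a z-sigma threshold.
    return 0.5 * erfc(z / sqrt(2.0))

# Find the smallest n whose chance probability falls below the 5-sigma tail.
n = 1
while bht_p(n) >= sigma_tail(5.0):
    n += 1

print(n)           # 22
print(bht_p(n))    # ~2.38e-07, below the ~2.87e-07 one-sided 5-sigma tail
```

At n = 21 the chance probability (≈ 4.8 × 10^−7) still exceeds the 5 σ tail, so n = 22 is the first crossing.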

For n ≥ 22, the model meets the minimum compliance standard as a unification model. That being the case, the seed model is highly likely to carry patterns and/or structure (seed complexity) that we can then data-mine for further unification refinements.

Status: scaffold

Parsimony Metrics

An emergent model must propagate its seed structure to generate increasing levels of diversity. Parsimony (the time-reversed analogue of entropy) provides the mathematical driver toward simpler fundamental seed complexity.

To track a model’s seed complexity, we decompose it into:

  • Pc - the set of fundamental postulates required by the model
  • Ff - the collection of arbitrary assumptions (free constants, tuning parameters, boundary conditions, etc.)

The total seed complexity is then defined as:

TSC = Pc + Ff
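The tally above can be sketched as a simple count. The postulate and assumption lists below are illustrative placeholders only, not an authoritative decomposition of any real theory:

```python
def total_seed_complexity(postulates: list[str], free_assumptions: list[str]) -> int:
    # TSC = Pc + Ff: count of fundamental postulates plus count of
    # arbitrary assumptions (free constants, tuning parameters, etc.).
    return len(postulates) + len(free_assumptions)

# Hypothetical example model with 2 postulates and 3 free assumptions.
tsc = total_seed_complexity(
    postulates=["postulate 1", "postulate 2"],
    free_assumptions=["free constant 1", "boundary condition 1", "tuning parameter 1"],
)
print(tsc)  # 5
```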

Status: scaffold

Entropy

Entropy is identified as a vehicle that minimises algorithmic complexity for system self-evolution, both physically and mathematically. Entropic system evolution eliminates all layers of abstraction required to drive system evolution, and thus eliminates the unrestrained degrees of abstraction freedom that have hampered unification. Entropy increase is an increase in disorder; when temporally reversed, entropy decreases and the system-state complexity decreases. This is identified as physical parsimony (with associated mathematical parsimony metrics). Parsimony metrics place constraints on seed-model complexity and become another simple but powerful tool for deconstructing unification models.

Status: scaffold

Map of the Viable Unification Landscape

The combination of the BHT and parsimony metrics generates the main tools of Emergence Analysis. In combination, we can quantify the relative seed complexity of various long-accepted models, such as General Relativity and Quantum Theory, and indeed rank their relative positions on an ideal unification tree. This in turn allows us to map the fundamentality of these models relative to one another, with the surprising result that General Relativity is found to be a much more fundamental model than Quantum Theory. Thus we now have the tools to scientifically identify that attempts to quantise gravity as a subset of quantum theory have close to zero chance of success, whereas building quantum theory from General Relativity is a scientifically supported route to unification.
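The ranking step can be sketched as sorting candidate models by total seed complexity. The model names and (Pc, Ff) counts below are placeholders for illustration; Emergence Analysis would supply the actual decomposition for each model:

```python
# Hypothetical candidate models mapped to (Pc, Ff) counts:
# Pc = fundamental postulates, Ff = arbitrary assumptions.
models = {
    "Model A": (2, 3),
    "Model B": (4, 10),
    "Model C": (1, 1),
}

# Lower TSC = Pc + Ff places a model closer to the hyper-parsimonious
# seed, i.e. deeper (more fundamental) on the ideal unification tree.
ranked = sorted(models, key=lambda name: sum(models[name]))
print(ranked)  # ['Model C', 'Model A', 'Model B']
```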

Mapping Ideal Emergence

Each emergent behaviour is generated in a causally continuous evolution from underlying layers. With each successive emergent layer, the complexity of the system increases. Similarly, such an ideal system must track back to a single hyper-parsimonious seed.

In ideal emergence, all emergent behaviour and systems emerge from a single hyper-parsimonious seed, with complexity increasing at each emergent level.

Where a model has multiple assumptions, constants and postulates originating at its base, the ideal emergence model infers that more fundamental parts of the model are missing and need to be deconstructed to fill in the missing underlying emergence layers.