Hard Forking Reality (Part 1)

On complexity, inequality, and ontological exits

I would like to explore how the multiple versions of reality that circulate in any society can become locked and irreconcilably divergent. Deliberation, negotiation, socialization, and most other forces that have historically caused diverse agents to revolve around some minimally shared picture of reality — these social forces now appear to be approaching zero traction, except within very narrow, local bounds. We do not yet have a good general theory of this phenomenon, one amenable to testing against empirical data from the past few decades. A good theory of reality divergence should not only explain the proliferation of alternative and irreconcilable realities; it should also explain why the remaining local clusters of shared reality persist. It should not just predict reality fragmentation; it should predict the lines along which reality fragmentation takes, and fails to take, place.

In what follows, I will try to sketch a few specific hypotheses to this effect. I have lately been stimulated by RS Bakker’s theory of Semantic Apocalypse. Bakker emphasizes the role of increasing environmental complexity in short-circuiting human cognition, which is based on heuristics evolved under very different environmental conditions. I am interested in the possibility of a more fine-grained, empirical etiology of what appears to be today’s semantic apocalypse. What are the relevant mechanisms that make particular individuals and groups set sail into divergent realities, and to different degrees in different times and places? And why exactly does perceptual fragmentation — not historically unprecedented — seem uniquely supercharged today? What exactly happened to make the centrifugal forces cross some threshold of runaway divergence, traceable in the recorded empirical timeline of postwar Western culture?

I will borrow from Bakker the notion of increasing environmental complexity as a major explanatory factor, but I will generate some more specific and testable hypotheses by stressing two additional variables: first, the timing and degree of information-technology advances; second, the way the effect of increasing environmental complexity is crucially conditional on cognitive ability. Given that the ability to process and navigate environmental complexity is unequally distributed and substantially heritable, I think we can make some predictions about how the semantic apocalypse will play out over time and space.

The intuition that alternative realities appear to be diverging among different groups — say, the left wing and the right wing — is simple enough. But judging the gravity of such an observation requires us to trace its more formal logic. Is this a superficial short-term trend, or a longer and deeper historical track? To answer such questions, we need a more precise model; and to build a more precise model, we need to borrow from a more formal discipline.

A garden of forking paths

When software developers copy the source code of some software application, their new copy of the source code is called a fork. Developers create forks in order to make some new program on the basis of an existing program, or to make improvements to be resubmitted to the original developer (these are called “pull requests”).

The picture of society inside the minds of individual human beings is like a fork of the application that governs society. As ultrasocial animals, when we move through the world, we do so on the basis of mental models that we have largely downloaded from what we believe other humans believe. But with each idiosyncratic experience, our “forked” model of reality goes slightly out of sync with the source repository. In a healthy community composed of thriving individuals, every individual fork gets resubmitted to the source repository on a regular basis (over the proverbial campfire, for instance). Everyone then votes on which individual revisions should be merged into the source repository, and which should be rejected, through a highly evolved ballot mechanism: smiles, frowns, admiration, opprobrium, and many other signals interface with our emotions to force the community of “developers” toward convergence on a consensus over time.

This process is implicit and dynamic; it only rarely registers official consensus and only rarely hits the exact bullseye of the true underlying distribution of preferences. At its most functional, however, a community of social reality developers is surprisingly good at silently and constantly updating the source code in a direction convergent toward the most important shared goals and away from the most dire of shared horrors.
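
To make the merge dynamics concrete, here is a minimal toy simulation in Python. Everything in it is an illustrative assumption rather than part of any formal model: reality is compressed to a single number, the “vote” is just distance from the median view, and the drift and tolerance parameters are arbitrary. The point is only to exhibit the convergent logic: forks drift, get resubmitted, and accepted revisions quietly update the shared source.

```python
import random

random.seed(42)  # reproducible toy run

N_AGENTS = 20
DRIFT = 0.3       # scale of idiosyncratic experience per period (arbitrary)
TOLERANCE = 0.5   # how far a revision may sit from the median view and still merge (arbitrary)

source = 0.0                  # the community's shared model of reality, as one number
forks = [source] * N_AGENTS   # every member starts in sync

for campfire in range(50):    # each iteration is one proverbial campfire
    # 1. Idiosyncratic experience pushes every fork slightly out of sync.
    forks = [f + random.gauss(0, DRIFT) for f in forks]

    # 2. Resubmission: smiles and frowns approximate a vote around the median view.
    median = sorted(forks)[N_AGENTS // 2]
    accepted = [f for f in forks if abs(f - median) <= TOLERANCE]

    # 3. Accepted revisions are merged into the source repository...
    source = sum(accepted) / len(accepted)

    # 4. ...and everyone re-syncs their fork to the updated source.
    forks = [source] * N_AGENTS

print(f"Shared reality after 50 campfires: {source:.2f}")
```

On any run, the shared value wanders, but the community stays on a single reality: each campfire absorbs the divergence accumulated since the last one.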

These idealized individual reality forks are typically soft forks. The defining characteristic of a soft fork is, for our purposes, backward-compatibility. Backward-compatibility means that while “new rules” might be created in the fork, the “old rules” are also followed, so that when the innovations on the fork are merged with the source code, all the users operating on the old source code can easily receive the new update. An example would be someone who experiments with a simple innovation in hunting method; if it’s a minor change that’s appreciably better, it will easily merge with all the previously existing source code, because it doesn’t conflict with anything fundamental in that original source code.

[Figure: Soft fork diagram]

Every now and then, one individual or subgroup in the community might propose more fundamental innovations to the community’s source code, by developing some radically novel program on a fork. Such a change, if accepted, would require all others to alter or delete portions of their legacy code. An example might be an individual who starts worshipping a new god, or a subordinate who wishes to become ruler against the wishes of the reigning ruler; in each case, someone submits to the source code new rules that would require everyone else to alter old rules deep in their own source code. These forks are not backward-compatible. They are hard forks. Everyone in the community has to choose whether to preserve their source code and carry on without the new fork’s innovations, or to accept the new fork.

[Figure: Hard fork diagram]
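
The distinction can be stated in a line of code. In this sketch (the rule names are invented for illustration), a member’s world-model is a set of rules, and a fork is backward-compatible exactly when every old rule survives in it:

```python
# A world-model as a set of rules; the names are invented for illustration.
source = {"hunt_at_dawn", "share_the_kill", "worship_sky_god", "obey_chief"}

# Soft fork: adds a new rule without touching any old one.
soft_fork = source | {"use_barbed_spear"}

# Hard fork: deletes an old rule and submits a conflicting replacement.
hard_fork = (source - {"worship_sky_god"}) | {"worship_river_god"}

def backward_compatible(fork, original):
    """A fork is backward-compatible if every old rule survives in it."""
    return original <= fork  # set inclusion: the fork still honors all old rules

print(backward_compatible(soft_fork, source))  # True  -> merges quietly
print(backward_compatible(hard_fork, source))  # False -> the community must choose
```

The soft fork merges silently because all the old rules still pass; the hard fork forces an explicit choice.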

Recall that in the ancestral human environment, when the innovator on a fork resubmits to the source repository, the decisions to accept or reject are facilitated over the proverbial campfire. This process is subject to costs, which are highly sensitive to contextual factors such as the complexity of the social environment (increasing the number of things to worry about), social capital (decreasing the number of things to worry about), and information and communication technology (which decreases the transaction costs of convergence, but also decreases the exit costs of divergence). Finally, individual heterogeneity in cognitive ability is likely a major moderator of how environmental complexity, social capital, and information technology influence social forking dynamics. A consideration of these variables, I think, will provide a compelling and parsimonious interpretation of ideological conflict in liberal societies since World War II, on a more formal footing than commentators on these phenomena typically bring to bear.
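
As a placeholder for the kind of formal model this points toward, consider a toy threshold rule. The functional forms, names, and numbers below are purely hypothetical; they encode only the signs just described: complexity raises the cost of converging, social capital and cognitive ability lower it, and information technology lowers both the cost of converging and the cost of exiting.

```python
def convergence_cost(complexity, social_capital, ict, ability):
    """Cost of merging one's fork back into the shared repository.
    Signs only: +complexity, -social_capital, -ict, -ability."""
    return complexity / (ability * (1 + social_capital) * (1 + ict))

def exit_cost(ict):
    """Cost of abandoning the shared repository for one's own hard fork.
    ICT lowers this too, which is the destabilizing part."""
    return 1 / (1 + ict)

def hard_forks(complexity, social_capital, ict, ability):
    """An agent hard-forks when exiting is cheaper than converging."""
    return exit_cost(ict) < convergence_cost(complexity, social_capital, ict, ability)

# Low complexity, high social capital, little ICT: merging wins.
print(hard_forks(complexity=1.0, social_capital=1.0, ict=0.1, ability=1.0))  # False
# High complexity, eroded social capital, cheap exit: hard-forking wins.
print(hard_forks(complexity=3.0, social_capital=0.2, ict=2.0, ability=1.0))  # True
```

Even this crude sketch yields the qualitative pattern at stake: as complexity rises, social capital falls, and exit gets cheaper, more agents cross the threshold where hard-forking beats merging, and they cross it at different points depending on ability.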
