Relevant Notes: The Metacrisis | coordination failures - multipolar traps (Moloch) | Daniel Schmachtenberger

Part 1 - The Metacrisis

Link to Podcast


Defining the Metacrisis and why it is useful

The metacrisis is not one single risk or crisis, but a framework for holistically understanding the problems of our time. To make it through, we have to prevent all of these risks. Looking at each of them separately and seeing how complex each one is can make that seem impossible.

For example: solving climate change means going up against not only the oil industry but the whole global economy. Every industry relies on energy; if we internalized energy externalities and energy prices went up dramatically, the whole market could collapse.

Looking at all of our biggest crises together, we can find the common underlying patterns contributing to them. Doing so gives us a place to start that can move us in the right direction on all of these issues.

Core questions institutions and corporations should be asking all the time (addressing generator functions)

Environmental Risks

It is not just one risk but many interlinked ones (Planetary Boundaries is a great framework here)

Even before climate change is fully catastrophic

Even before reaching “Hothouse Earth”, we’ll see more and more extreme weather events. Some are already happening in low-density population areas of Australia; once they start happening in highly dense areas such as India and Pakistan, we can expect:

  • extreme population migrations
  • resource shortages
  • political amplifiers
  • supply chain disruptions

Many of the looming environmental catastrophes are the result of the underlying structure of current systems:

  • An economic system running a linear materials economy that relies on infinite growth on a finite planet
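
A quick toy calculation (my own illustration, not from the podcast; all numbers are made-up assumptions) of why compounding material growth collides with any finite yearly budget:

```python
# Toy illustration (not from the podcast): compounding material throughput
# vs. a fixed sustainable yearly budget. All numbers are made-up assumptions.

def years_until_budget_exceeded(initial_throughput: float,
                                growth_rate: float,
                                annual_budget: float) -> int:
    """Years of compound growth until yearly throughput exceeds the budget."""
    throughput = initial_throughput
    years = 0
    while throughput <= annual_budget:
        throughput *= (1 + growth_rate)
        years += 1
    return years

# Assume we currently use half of some sustainable yearly budget and grow 3%/year.
print(years_until_budget_exceeded(0.5, 0.03, 1.0))  # -> 24
# At 3% growth, throughput doubles roughly every 23-24 years, so "infinite
# growth" overshoots any finite budget on a fixed, fairly short timescale.
```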

Technology Risks

  • AI Risks
    • AGI doing its own thing
    • AI algorithms optimizing for some variables while creating catastrophic externalities
      • Current example: social media algorithms optimizing for views, increasing polarization and breaking democracies
  • Exponential improvements of weapons
    • catastrophic weapons increasingly available

Issue: Groups/companies that develop these technologies and focus on minimizing risks tend to move slower and get outcompeted by those that do not care about minimizing risks
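
To make this race-to-the-bottom dynamic concrete, here is a minimal simulation sketch (my own illustration, not from the podcast; the parameters and numbers are invented assumptions):

```python
# Toy model of the race to the bottom described above: firms that invest in
# safety ship slower, so in a rivalrous market the fast/unsafe strategy gains
# share even as collective risk accumulates. All numbers are illustrative.

def simulate(rounds: int = 10, safety_speed_cost: float = 0.3,
             risk_per_unsafe_release: float = 0.05) -> None:
    careful_share, reckless_share = 0.5, 0.5
    cumulative_risk = 0.0
    for _ in range(rounds):
        # Reckless firms ship faster, so each round market share shifts toward
        # them in proportion to the speed advantage of skipping safety work.
        careful_speed = 1.0 - safety_speed_cost
        total = careful_share * careful_speed + reckless_share * 1.0
        careful_share = careful_share * careful_speed / total
        reckless_share = 1.0 - careful_share
        # The risk is an externality: it accrues to everyone, not to the firm.
        cumulative_risk += reckless_share * risk_per_unsafe_release
    print(f"careful firms' market share after {rounds} rounds: {careful_share:.2f}")
    print(f"accumulated collective risk: {cumulative_risk:.2f}")

simulate()  # careful firms end up with ~3% share while collective risk grows
```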

The Third Attractor

Two High-Level, High-Risk Attractors

Increasing Catastrophes & Breakdown

  • catastrophes not the result of someone trying to create them, but resulting from coordination problems and externalities
    • everyone contributing to the issue and no one individual capable of solving it
  • decentralized catastrophic weapons become more easily available to non-state actors
    • Advanced AI generating propaganda

Dystopic Control Structures

A surveillance state could emerge as a way to prevent catastrophic outcomes from cheap destructive technology or coordination problems.

With enough power, enforced control structures become dystopic.

Each catastrophe motivates us to want a solution badly enough to accept a dystopic one. Then you have multiple control systems competing in an arms race, driving other types of catastrophes.

Catastrophes and dystopias could mutually reinforce each other

Third Attractor

  • The third attractor is whatever allows us to avoid both of the other attractors

    • governance that allows us to solve/address multipolar traps globally
    • creating checks and balances that prevent enforced control structures from becoming dystopic, centralized governance
  • Governments currently “solve” (read: address) some of these coordination failures nationally (overlogging, overfishing, etc.) with rule of law and enforcement (monopoly on violence)

  • Internationally, there’s no ability to do rigorous enforcement

    • sanctions are often not enough
    • Ostrom’s framework for managing commons can work in small-scale, local applications, but it is much harder to apply on a global scale, with complex power relations and catastrophic weapons

How Tech Can Move Us Towards the Third Attractor

  • Forced transparency

    • Planet Labs - can show where mining, illegal logging, and fishing are happening
      • doing so allows us to track, monitor, and prove what’s happening in the world
      • Lack of communication and transparency makes multipolar traps worse
        • There are strategic advantages to opaque information (military classified information), but also many downsides
          • duplicate efforts across agencies/companies
          • energy wasted hiding/concealing information
          • lost possibility for coordination
    • Blockchain
  • Potential of Democratized AI for Education

    • e.g. possibilities for interacting with AI-emulated Albert Einstein, Thomas Jefferson etc…
    • Human tutors to stimulate thoughts and encourage development of uniquely human intelligence/skills
  • AI for improved Democracy

We can up-regulate human nature to move towards the third attractor

  • Examples of Buddhist and Jewish communities creating a culture of non-violence and valuing education (respectively)
  • Example of lead pollution reducing IQ and increasing propensity to violence

Part 2 - Superstructures, Culture & Technology

Link to Podcast

Technology Is Not Values Neutral

There is no metacrisis without technology

  • tech affects psychology, which in turn affects culture

  • The technology you have at hand changes your perception of the world (ways of looking) and your intentions, and it extends your capacity to act

    • a camera may orient you to look for beauty
    • a chainsaw to cutting trees
    • a spear to kill things or hit targets
  • We normally design tech for a specific, narrow purpose

    • when designing new tech we need to think about externalities
      • not only physical ones, but also psycho-social externalities
      • Instagram, with its focus on images, introduction of filters, and emphasis on likes, bred narcissism and body dysmorphia across the population

Example of the Plow

  • The plow revolutionized agriculture, increasing food production by relying on animal work (ox)
    • Caloric output using the plow increased dramatically
      • Technology becomes obligate - if a tech confers a lot of capacity, it also confers a game-theoretic advantage in a rivalrous context. Any group without that tech gets outcompeted (arms races)
    • Changed value systems around animals
      • having to beat animals forced humans to remove the sacredness from animals (societies used to be animistic before)
    • More surplus - more capacity for wealth inequality
    • Women could not operate plows - this contributed to the spread of patriarchal society

Culture & Tech

  • The dominant ideology/narrative of a culture is an apologia for the power structure of that culture
    • Crusades making kings richer and more powerful were justified by Christianity
    • Today it is techno-capital optimism, advocated for by the winners of the current system (Gates, Rosling, etc.)
    • The “move fast, break things” culture actually privatizes potential gains while socializing the losses from unexplored risks and externalities
  • Generally the values we have justify the use of our technology
  • Values could also lead to codifying laws that ban tech, or bind how it can be used

Resource: Marvin Harris’ Cultural Materialism

3 Interacting Components of a Society (Framework)

Infrastructure

  • Tech Stack to meet needs of people
    • transportation, agriculture tools, energy, waste management, clothes

Social Structure

  • Governance, Law, Economics and their Institutions

Superstructure

  • Values of the people (religion, ideologies, memes)

Each of these three affects and influences the others.

Values affect technology and vice versa

Changes in Infrastructure drive changes in superstructure

  • Examples:
    • without a printing press you couldn’t have a democracy (you needed an educated, informed population)
    • Internet decentralized access to information
    • Facebook’s algorithm feeding people’s Confirmation Bias and boosting outrage has a notable effect on people’s values

Decentralized technology can become centralizing

Tech that initially increased democratization of society can end up becoming a stronger centralizing force

  • the printing press became a tool to homogenize opinions and spread propaganda
  • A few corporations (Google, YouTube) control the algorithms that most people use to find information

Ideals to aspire to

  • People with “inclusive”/“good” values (superstructure) designing technology (infrastructure) that reinforces that superstructure
  • A culture that helps people develop superstructure

Conflict Theory vs. Mistake Theory

Article on Less Wrong

  • Is harm being created as an unintended, unforeseen externality (Mistake theory), or is it willingly (or knowingly) created (Conflict theory)?

Addressing Mistake Theory Externalities

  • Do a better job trying to predict 2nd and 3rd order effects
  • Involve all stakeholders that could be affected, and use frameworks such as social vs. physical externalities and superstructure/social structure/infrastructure to surface externalities you hadn’t thought of
  • Set up monitoring systems to check for unforeseen externalities
  • Test tech with big impact potential on a small “safe to fail” scale

Addressing Conflict Theory’s Perverse Incentives

  • There are more incentives to focus on the upside of tech than on its downsides
    • the incentive is not to look for potential harm
    • Conflict theory gets mistaken for Mistake theory
    • Possible ways to address:
      • Safety tests or rigorous risk analysis applied before tech is released to the market (as happens for drugs with the FDA)
      • Becoming aware of incentives to socialize risks while privatizing gains