Relationships are Rivers, Not Lines

David Jay
7 min read · Dec 26, 2018


Towards a mathematical understanding of dynamic relationships.

Modern graph theory has given us everything from Google’s legendary search algorithm to Facebook’s ad targeting to the artificial intelligence behind self-driving cars. The mathematical study of networks was pioneered by an oddball polymath who traveled the world crashing on friends’ couches, and it has proven incredibly broad in its application. From the proteins in our cells to the neurons in our brains to the communities that shape us, networks are everywhere, which makes tools for understanding those networks especially powerful. But those tools are missing something significant.

At its core, graph theory describes networks with little dots called nodes and little lines called edges; they look a little like this:

The Petersen graph

Its toolbox includes all kinds of ways to assign values to these lines, to color them in, and to move them around, but it doesn’t have much to say about where the lines come from. If I’m studying a network, the lines either exist or they don’t; they pop into existence or pop out, like someone friending and unfriending me on Facebook. As a result, graph theory is excellent at describing the structure of connected stuff but has little to say about the process by which that stuff becomes connected. It explores how we make friend requests, not how we make friends.

In the real world, connections never pop into existence fully formed. A new friendship evolves from a first conversation to a coffee date to a regular ritual together. Over time we learn how to spend time with the people who matter to us, and if the relationship is healthy that understanding is constantly changing. The way that we go from not knowing anyone at a party to having a community that loves and supports us mirrors the way that animals integrate with their ecosystem, or new ideas integrate with a conceptual framework. Just as graph theory gives us tools to describe static networks, it is possible to develop mathematical tools to study networks that change over time.

Introducing Relationality

I’ve chosen the word “relationality” to describe this propensity of networks to evolve and change. Webster defines relational as “the state of being constituted of relationships”, which is a little unwieldy but close to what I’m getting at. To be more precise, I’m using the term relationality to refer to the propensity of systems which randomly transmit information to develop into systems which contain stable flows of information.

When someone draws an edge in graph theory, they are almost always describing a flow of information, a way in which one entity influences another. In physics, “information” doesn’t just mean text messages and cat GIFs; it refers to anything capable of changing the state of something else. Food is information, so is money, so is a bullet. If two things exchange information in a way that we’re interested in, we can say that they are connected; if they don’t, we can say that they are disconnected.

Our task is to explore how people and proteins and plants and such go from not exchanging information to exchanging information, to see the world not as a series of dots and lines, but as a network of stable and unstable rivers of information.

Imagine a node; we’ll call it node A. If you’re a visual learner, imagine it as a little blue circle:

Node A interacts with its environment by transmitting information out into the world. Imagine node A shooting out little blue dots that hit other nodes around it. Node A has many different states, that is, many different ways it can shoot out information. Maybe it sends information only to node B, or maybe it splits its information evenly between nodes B, C, and D. To understand how A behaves, we’ll draw a probability distribution of its possible states:

This flat line means that every state of A is equally likely. Pretty chaotic! A is a hot mess, shooting its information out into the world with no rhyme or reason. This is the state of maximum entropy for A. We can mitigate this craziness by sending node A a packet of information, which will change its probability distribution like so:

Now a particular range of states has become more likely and all other states have become less likely. A is a little bit more stable, and its entropy has dropped. You can think of entropy as representing the “flatness” of one of these probability distributions, and negative entropy as representing the “spikiness.” When we say that information is negative entropy, we’re saying that the value of a packet of information is equal to the entropy reduction it creates in the entity receiving it, the extent to which it stabilizes that entity by making its probability distribution “spikier.”
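
To make the “flatness” and “spikiness” talk concrete, here’s a minimal sketch in Python. It uses Shannon entropy as the entropy measure (the essay doesn’t commit to a particular one), and the spiky distribution’s values are invented purely for illustration:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability states contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Node A with 8 possible states, before receiving any information:
flat = np.full(8, 1 / 8)
# A hypothetical distribution after a packet makes a few states more likely:
spiky = np.array([0.02, 0.05, 0.51, 0.30, 0.06, 0.03, 0.02, 0.01])

print(shannon_entropy(flat))   # 3.0 bits -- maximum entropy, the "hot mess"
print(shannon_entropy(spiky))  # ~1.9 bits -- the packet made A "spikier"
```

In this framing, the value of the packet is the drop from 3.0 bits to roughly 1.9 bits.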

So what does all of this have to do with relationality? We’re getting there. We’re going to watch a system evolve from a high-entropy state, in which it transmits information randomly, to a low-entropy state, in which it transmits information only between its own nodes.

Let’s return to our friend node A from earlier, and add another, equivalent node named node B. We’ll assume that both node A and node B are at the maximum possible level of entropy. Imagine them both floating there, randomly spitting out information into the universe.

Every once in a while, by sheer chance, a packet of information from A will hit B. This packet of information will make B more likely to transmit information in a particular way.

A and B will continue to permute through their possible states, occasionally hitting one another, wandering across their respective probability distributions until…

…eventually, A puts B in a state that makes B more likely to send information back to A. Now what was a purely random process becomes a self-reinforcing one. A sends information to B, which makes B more likely to send information back to A, which in turn makes A more likely to send information back to B, and so on. Once A and B have found this self-reinforcing state they will shift from randomly permuting to stably sending information back and forth to one another, causing the entropy of the system to steadily decrease. Relationality has arrived.
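
Here’s a toy simulation of that story, under strong simplifying assumptions added purely for illustration: each node picks among K discrete “directions” to fire, exactly one direction (TOWARD) happens to hit the other node, and a received packet multiplies the receiver’s probability of firing back by a fixed BOOST factor:

```python
import numpy as np

rng = np.random.default_rng(0)

K = 20        # possible transmit directions per node (an assumption)
TOWARD = 0    # the one direction that actually hits the other node
BOOST = 1.5   # how strongly an incoming packet sharpens the receiver

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Both nodes start at maximum entropy: no preferred direction.
pA = np.full(K, 1 / K)
pB = np.full(K, 1 / K)

for step in range(201):
    a_dir = rng.choice(K, p=pA)   # A fires a packet
    b_dir = rng.choice(K, p=pB)   # B fires a packet

    # A hit makes the receiver more likely to transmit back:
    # this is the self-reinforcing loop.
    if a_dir == TOWARD:
        pB[TOWARD] *= BOOST
        pB /= pB.sum()
    if b_dir == TOWARD:
        pA[TOWARD] *= BOOST
        pA /= pA.sum()

    if step % 50 == 0:
        print(step, round(entropy(pA) + entropy(pB), 2))
```

Run it and the printed system entropy drifts downward: the early hits are rare luck, but each one makes the next more likely, and the two distributions lock onto one another.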

Note that the entropy of the system is closely tied to the stability of its relationships. Systems with stable relationships will always have low entropy; systems with unstable relationships will always have high entropy. This gives us a handy tool for measuring relationship growth, and a formal definition of relationality:

R(N) = -dS(N)/dt

The relationality (R) of a system (N) is defined as the negative rate of change of the entropy (S) of that system. Highly relational systems will evolve quickly from randomness to a state of dynamic connection, while irrelational systems will do the opposite, destroying connection and replacing it with a chaotic mess.
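
Given a series of entropy measurements of a system over time, relationality can be estimated with a finite-difference derivative. A quick sketch, with the entropy readings invented for illustration:

```python
import numpy as np

def relationality(entropy_series, dt=1.0):
    """Estimate R(N) = -dS(N)/dt by finite differences over entropy samples."""
    S = np.asarray(entropy_series, dtype=float)
    return -np.gradient(S, dt)

# Hypothetical entropy readings as a system settles into stable flows:
S = [8.6, 8.1, 7.0, 5.2, 4.1, 3.9, 3.8]
print(relationality(S))  # positive values: relationships are forming
```

A negative reading would flag the irrelational case: connection dissolving back into noise.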

To summarize:

  1. All networks can be described in terms of information exchange.
  2. This information exchange emerges from randomness through the formation of self-reinforcing states.
  3. As these states emerge, entropy in the system will decrease.

This conceptual shift from relationships-as-lines to relationships-as-information-flows provides powerful tools for measuring relationships as they grow. By measuring the entropy of information exchange throughout a system over time, we can get a picture of where in that system relationships are forming and where they are degrading. We can see which parts of a system are more relational, and begin to research what systemic properties lead to this relationality. This research can be easily compared across domains: we could directly compare the relational dynamics of a network of proteins, a human social network, and an emerging sector of the economy to identify principles of relationality that are consistent throughout.

Anyone who has used a dating app or attended a professional conference can attest that we live in a society short on rigorous methods for establishing meaningful connection. Too often, the vital and transformative work of building relationships goes unrecognized, unsupported, and unrewarded; institutions focus on maximizing what can be measured (likes and shares, attendee numbers, employees hired) over what can’t (meaningful conversations had, new relationships formed, the integration of new employees into an organization). It is my hope that developing a richer mathematical language for describing the development of relationships can lead to a world where our institutions understand and value the work of creating connection as much as we do.


David Jay

Founder @ Relationality Lab, fascinated with the way that relationships and movements form.