The Jobs AI Isn’t Coming For
Imagine that you’re a second grade teacher. A kid in your class is struggling with the confidence to take on long division. You suspect that the lack of confidence may be tied to problems going on at home, but you don’t know what those problems are. How might you offer them support?
Here’s ChatGPT’s answer:
I understand that learning long division can be challenging and it’s okay to feel scared or unsure. But I believe in you, and I know you can do it with practice and patience. Let’s break it down into smaller steps and use some tools to make it easier. We can work together to make sure you understand each step and build your confidence. And always remember, your family loves you and wants to support you through this.
When it comes to the role of generative AI in the global economy, it’s important to understand the implications of that awkwardly overreaching last sentence. “Your family loves you and wants to support you through this” is a strange thing for a teacher to say in any circumstance and is entirely inappropriate in this one. Maybe the child is trying to process a parent dealing with substance abuse, maybe they are struggling to adapt to a new sibling, maybe their parents are considering divorce. Whatever the circumstance, we don’t know it. What kind of love their family feels for them and what kind of support that family is prepared to offer is a tender subject, and stating point blank that love and support are there when we don’t know the details is insensitive at best.
As a human, my brain can model my relationship with the child, and the child’s relationships with both their family and the long division problem, as a complex system in which I can intervene. Some interventions, like offering them a little more assurance and emotional support than I would other students, bring the system into greater balance. Others, like point-blank bringing up whether their family loves and supports them, risk disrupting that system in harmful ways. While this kind of thinking can feel effortless and intuitive for us, it’s extremely complex. In a world where insect brains can engage in complicated pattern recognition, evolution had to give us massive neocortexes to answer these sorts of questions, neocortexes that take a tremendous amount of calories to maintain and barely fit through a birth canal. Evolution gave us these expensive and awkwardly large brains for one overwhelming advantage: not the ability to recognize patterns (bugs can do that), and not the ability to use tools (crows can do that), but the ability to model and make predictions about relationships in ways that allow us to form large coordinated groups. Our ancestors who could model and navigate these groups thrived, while those who messed up their relationships tended to get kicked out and die.
What ChatGPT is doing, and what the many generative models that come after it will do, is more akin to an insect brain than a human one. They read a bunch of text written by humans (or pixels arranged by humans or sounds created by humans) and make predictions about what a human would do. These predictions are useful when “what a generic privileged human would say” or even “what a generic privileged and skilled human would do” is a sufficient standard, but they’re less valuable when what we need is a particular human whose brain has modelled a particular relational context.
Relational reasoning requires a big, complex brain for a reason. Relationships are highly complex flows of information that have all of the markers of a chaotic system, yet our brains make accurate predictions about them on a routine basis. When I say “problems going on at home,” your mind is capable of conjuring a model of what those problems might be and how they would impact something as seemingly unrelated as long division, in a way that both yields helpful insight and is profoundly difficult to derive from basic mathematical principles. Not “hard math problem” difficult, “requires multiple profound leaps in evolution” difficult. Like the jump from an insect brain to a human one, or the jump from a bottle rocket to a Saturn V, getting there from where AI sits today is a marathon of unknown problems of unknown complexity that AI itself may or may not be helpful in solving.
All of this means that when I imagine the future of AI I don’t imagine it doing relational reasoning anytime soon. We may have personalized chatbot buddies that we ask for relationship advice, but the experience will be akin to hiring an assistant to read and summarize the world’s best advice books. A human friend would model the relationship, model us, and have insight on how to move forward in a way that an AI just won’t. “How should I navigate this conflict among my team members at work?” is a question that a human brain can uniquely answer. So is “How can I get members of this community to trust and support one another?” or “How can I best communicate a difficult diagnosis to this patient?” or “My friend is having a hard time, what can I do to make them feel loved and supported?”
Our ability to answer these questions matters. If it didn’t we wouldn’t have evolved big brains to answer them. Relational reasoning keeps our communities together, our families together, our companies together and our democracies together. If relationships wind up being the one thing we humans can produce that AI doesn’t then we still have a pretty critical economic niche.
Relationships support our health. It turns out that not having friends is as harmful for our health as smoking 15 cigarettes a day. In study after study relationships are the leading driver of human happiness. Hannah Arendt famously stated that “totalitarianism is institutionalized loneliness,” highlighting the deep link between healthy relationships, the health of democracies, and the liberty and justice that they bring. A world where we are supported in forming healthy relationships with ourselves, one another, and our environment is a world that we want to live in. A world where we fail to form those relationships isn’t. And ultimately, only other humans can help us form them.
Unlike other (temporarily) economically valuable skills like coding or graphic design that require extensive retraining, relational labor is a thing that people the world over already know how to do. There are diverse lineages of cultural wisdom about how to create the conditions for relationship that stretch back to before written history, and millions of people are deeply experienced in putting that wisdom into practice. Their work already maintains the relational structures that allow our societies to function: from families to world governments, things only stay together when the work to maintain, deepen and grow their relationships takes place. Where relationships thrive, so do most other things that we care about.
If you believe that AI will rapidly and fundamentally shift our economy, it makes sense to invest in the work of building relationships. Social workers, educators, community organizers, and faith leaders all perform a service that is both extremely valuable and massively undervalued. Relationships do not appear on balance sheets. They help us sell things, but they cannot be sold. They help us accomplish things, but they are not a piece of equipment with a clear repeatable function. They heal our bodies, but they are not medical treatments that can be preassembled, administered, and billed. They inspire us, but they are not intellectual property. When we build relationships we create value, but GDP does not go up. As a result, we have developed a society that is often good at waxing poetic about the importance of investing in community but bad at actually resourcing the people who make community happen. Relational labor is often relegated to and expected of women, people of color, queer and disabled folks precisely because it is very often unacknowledged and uncompensated.
I invite you to imagine what the world might look like if this changes, and what the world might look like if it doesn’t. In a world where this changes, we develop new economic frameworks for seeing where relationships form and investing in the work of forming them. In this world, someone who wants to see change happen might invest not in a particular nonprofit but broadly in a social movement, with money flowing to the organizers who make critical relationships happen that drive change. Money slated for public health could flow to people whose work demonstrably reduces social isolation, as connectedness is strongly correlated with public health outcomes. In this economy, millions of people would begin to be paid for work that they are currently burning themselves out doing on top of their day job. We’d get better music, better scientific collaboration, more economic innovation, and more functional democracies. If we incentivize people to build healthy relationships, we’ll get AI that partners with skilled humans to achieve this goal.
If we don’t, we’ll get a dystopia. If we see professions like education, healthcare, and social work as simply the delivery of a transactional service, then we will have no problem automating them. Our schools will become harder and harder places to form and process relationships, as will our workplaces and our families. An increasingly lonely populace will be sold synthetic friends who are wholly incapable of supporting them in forming and maintaining relationships with other humans, and we will retreat into increasingly personalized isolation. Democracies will falter and authoritarianism will continue its rise. Wealth and power will become increasingly concentrated in a few individuals, and then in unaccountable AI systems themselves, which will increasingly take on the work of managing authoritarian states.
To respond to the power of generative AI our economy must evolve toward relationship. We must leverage tools in social science and complexity theory to better understand what relationships are and how to recognize the work of building them. We must look to cultures with deep wisdom about how to see, build, and invest in relationship, and invest in them as sites of critical innovation. Relational work has not been invisible for most of human history, but an economic history built on domination, homogenization and extraction has made it invisible. Now is a time to challenge ourselves to see relationship, build relationship, and invest in relationship.
David Jay is the founder of Relationality Lab, which helps movement organizations that center relationship attract resources with data.