AI creates jobs. Simplicity destroys jobs.

At least in theory

#ai

#software

These are uncertain times in the Galaxy. The mysterious, shiny new technology called Generative AI has taken everybody by surprise, supposedly creating a white-collar job bloodbath. Every day we hear of layoffs, after layoffs, after layoffs.

But something doesn’t add up in this narrative.

With AI, it’s easier than ever to create complexity. Even if some tasks take less time, there’s a counter-balancing force:

  • More lines of code mean more complexity, leading to the need for more developers.
  • Longer reports with more fluff and less substance mean more time spent reading and trying to understand data.
  • More possible ideas for a marketing campaign translate to more time spent considering and analyzing viable paths.

Most white-collar employees don’t just execute tasks. They understand tasks. The job is as much about doing the task as about maintaining a mental model of the task, the job, and the company. We could say every task a worker performs has two outputs: the actual task (for example, the code for a feature) and an update to that mental model. In LLM parlance, this mental model is “a huge context”.

The most important part of a job is not the actual making of an Excel sheet, marketing campaign, or software feature, but the understanding that went into creating and editing it. If this were not the case, companies would consist of armies of contractors alone. Instead, smart companies recognize that workers need a lot of context and experience to do a good job.

We will need LEGIONS of humans to handle the complexity that AI will generate. To understand it, process it, and manage it. Letting complexity run loose is a big mistake for any kind of organization, because when systems grow large without proper care, the chances of failure multiply exponentially.
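The compounding-failure intuition can be sketched with a toy reliability model. This is a minimal illustration, not a claim from the text: the 99% per-component figure and the independence assumption are mine.

```python
# Toy model: a system of n independent components, each working with
# probability p, only works if every single component works.
def system_reliability(p: float, n: int) -> float:
    return p ** n

# Even with components that are each 99% reliable, overall
# reliability decays exponentially as the system grows.
for n in [10, 100, 500]:
    print(n, round(system_reliability(0.99, n), 3))
# → 10 0.904
# → 100 0.366
# → 500 0.007
```

At 500 parts, a system of individually excellent components almost never works end to end, which is the sense in which unmanaged complexity multiplies the chances of failure.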

Foundation

In the book Foundation, by Isaac Asimov, a psychohistorian (a scientific futurologist of sorts) discovers that the big Galactic Empire is about to collapse under its own weight, due to a rising bureaucracy.

The fall of Empire, gentlemen, is a massive thing, however, and not easily fought. It is dictated by a rising bureaucracy, a receding initiative, a freezing of caste, a damming of curiosity – a hundred other factors. It has been going on, as I have said, for centuries, and it is too majestic and massive a movement to stop.

—Isaac Asimov, Foundation

The book is a classic worth reading, if you are into sci-fi. It was published in 1951, but the lesson remains: complex systems can grow out of control. I can see how this can play out in miniature when companies without the proper teams push too many half-baked features too fast, without care for reliability and maintainability. What used to be a fine working system stops being reliable and moves so fast it’s impossible for the team to make sense of what’s going on. The more complex the system, the more people you need to handle it.

For example, if code is complex and hard to understand, software developers will need more time to ship the same number of features. And since understanding of the codebase is low, those features might have subtle bugs that are caught only by the end user at the worst possible time. This leads not only to low reliability, but also to more time spent fixing bugs.

While AI could assist in reducing complexity, we will still need many humans, because the key to simplicity is a deep understanding of what is valuable, and machines lack that. Generative AI churns out output too fast, creating things without pausing to decide whether they are worth creating. Without humans in the loop, the slop will quickly flood the digital world. So we will need more software engineers to ship features in larger, more bloated codebases, more analysts to process the new data being created, and more marketers to compete for the little attention people have left.

Not “Just a Tool”

Generative AI is not “just a tool”. Nothing is, really. Tools have capabilities, and people will take advantage of those. There’s a reason nobody says that scissors are “just a tool” when kids are running with them in their hands.

Generative AI has the capability to create things: text, images, video, and sound. So it will create things, a lot of things, and since most things are of average quality, it will create a lot of average things.

To a man with a hammer, everything looks like a nail. To a man with generative AI, everything looks like Shrimp Jesus. AI could be used to help understand and simplify things, but its capabilities make that mode of use unlikely to be favored.

There’s another problem with the tool: it imitates human-made things. Image generation imitates photographs and paintings created by humans. LLMs imitate text and code written by humans. This imitation is sometimes helpful, but the output appears to be created by humans, so the signal that at least some amount of work went into making something is lost. And while human labor was never a good proxy for value (sorry, Karl), it is surely better to know that at least one soul took some time to consider whether something was worth making.

I’m not saying this imitation is ethically bad. I’m saying that this lost signal of the effort that went into something creates complexity. Without doing the hard mental work of figuring out if something is worth it, we create worthless digital artifacts, diminishing the signal-to-noise ratio. With less focus on real value creation, we will need more people to create the same amount of value. In simpler terms: if we don’t stop the slop, it will flood us. Take, for example, an employee who decides to generate an internal 30-page report on the US food market using AI. It takes a few seconds to generate, but it wastes multiple hours of reading time. It is a much better idea for a single person to work on a clearer, more accurate, more condensed report, and only then have other people read it.

Complexity is Good

From the point of view of an employee seeking stability, what should be feared is simplicity, not complexity. Complexity that workers are used to means job security. The problem is not when AI spits out more of the same things that we already have, but when paradigms shift.

  • Elevator operators were replaced by elevators that don’t need operators at all.
  • Swiss watchmakers were replaced by mass-produced Japanese watches.
  • Bank tellers were not replaced by ATMs, sure. But they were replaced by the iPhone.

Now, ask yourself:

  • Are spreadsheets going away?
  • Is code going away? (No, it’s not. Spec-driven development is a terrible idea.)
  • Are user interfaces going away? (I don’t think so. I will write more about this.)
  • Is written text going away?

Decluttering, simplifying, and “less is more” are popular in architecture, but not so much in the white-collar world. Only when these ideas become more popular will available jobs decrease structurally. One day, all corporate CEOs may wake up and realize that they cannot get out of the hole by digging. The day consultants look like a male Marie Kondo on steroids, axing complexity because it doesn’t spark joy. Until that day comes, there will be enough jobs to make a living. Whether we will like those jobs, with the day-to-day surrounded by so much slop, is a different question.

How to explain the layoffs then?

“Markets can remain irrational a lot longer than you and I can remain solvent.”

—A. Gary Shilling

Even when the need for employees should increase in theory, at least some part of employment is driven by culture and expectations, not actual demand for work. I won’t try to guess how big this portion is, but we know of companies with a lot of people not doing a lot of work, and companies with exhausted employees pulling all-nighters. It also varies by industry and company department. Overall, the business environment has not been favorable to maintaining a large workforce these last few years.

  • The end of ZIRP and the rise of interest rates made capital for new initiatives more expensive.
  • Elon Musk inaugurated the lean-company season with layoffs at Twitter, which other leaders soon followed.
  • Talk of AI-driven efficiency gains pressures executives into being “more agile”, while giving them an excuse for layoffs.
  • The end of same-year amortization for software development salaries under Section 174A (later restored).

Given those big effects, I would say the US job market is not doing that badly. It might just be that the world is a complex beast.