The Problem with “The Three-Body Problem”

From a recent review of Cixin Liu’s Nebula Award-nominated and Hugo Award-winning “The Three-Body Problem” on Big Think:

“Why We Should Really Stop Trying to Contact Aliens” by Robby Berman

Cixin’s writing is beyond smart — it’s brilliant — and it’s science fiction of the best kind, with mind-boggling ideas and perceptions, and characters you care about. His concept of the dark forest, though presented in a work of fiction, is chilling, and very real.

In the book, Liu uses the analogy of hunters in a dark forest, where the hunters are sentient civilizations.

The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds other life — another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod — there’s only one thing he can do: open fire and eliminate them.

The argument is that failure to do so will lead to the destruction of the pacifist civilization since:

When one civilization becomes aware of another, the most critical thing is to ascertain whether or not the newly found civilization is operating from benevolence — and thus won’t attack and destroy you — or malice. Too much further communication could take you from limited exposure in which the other civilization simply knows you exist, to the strongest: They know where to find you. And so each civilization is left to guess the other’s intent, and the stakes couldn’t be higher.

You can’t assume the other civilization is benevolent, and they can’t assume that about you, either. Nor can you be sure the other correctly comprehends your assessment of their benevolence or maliciousness.

This adaptation of the Prisoner’s Dilemma supposedly explains the Fermi Paradox, proposed by the physicist Enrico Fermi, who wanted to make sense of the disparity between the lack of evidence for alien races and the overwhelming odds that there are some — probably many, if not millions. The reason, we’re told, is that the universe is a dark forest full of silent hunters.

Frankly, as an analogy, submarines work much better. The whole premise of submarine warfare (in the modern age) is to detect the enemy without being detected. If there is near parity, as there was during the Cold War, this is not so much a strategy as an environment-specific tactic between two cultures that are not hiding. They’re very aware of each other. They just don’t want their ships to be seen so that they can maintain a competitive edge in case of open hostilities. The cloaking device of Star Trek fills the same need.

In conditions of near parity, mutually assured destruction goes a long way toward suspending open war. That isn’t just human culture. It’s game theory. In conditions of disparity, however, where one (or more) civilization is vastly stronger, you wouldn’t have hunters in a forest stalking each other. You’d have something much more like what actually happens in a forest at night: predator species stalking prey species.

But game theory still applies. In nature, prey animals typically aren’t smart enough to play the Prisoner’s Dilemma, but space-faring civilizations are, which means some of the vast numbers of “prey” societies would almost certainly band together to form a large protective hegemony — any that refused would eventually be eliminated by evolution — until conditions of parity were once again met and we’re back to a Cold War model.
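The intuition that cooperation can be evolutionarily stable is easy to demonstrate with the iterated Prisoner’s Dilemma. What follows is a toy sketch, not anything from Liu’s book: it uses the standard payoffs (mutual cooperation 3, mutual defection 1, sucker 0, temptation 5) and shows that tit-for-tat players do far better against each other than always-defect players do among themselves.

```python
# Toy iterated Prisoner's Dilemma. Payoffs are the standard ones:
# both cooperate -> 3 each; both defect -> 1 each;
# lone defector -> 5, and the cooperator he exploits gets 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strat_a, strat_b, rounds=100):
    """Run an iterated game; each strategy sees the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = "C"  # tit-for-tat opens with cooperation
    for _ in range(rounds):
        move_a, move_b = strat_a(last_b), strat_b(last_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda opponent_last: opponent_last  # mirror the opponent
always_defect = lambda opponent_last: "D"          # never cooperate

print(play(tit_for_tat, tit_for_tat))      # cooperators: 300 points each
print(play(always_defect, always_defect))  # defectors: only 100 each
```

Against a defector, tit-for-tat loses only the first round and then defends itself, which is why cooperative clusters can take hold in a population of defectors once they find each other: the protective hegemony in miniature.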

In the dark forest hypothesis, we’re to believe that there are millions of civilizations — a number supposedly derived from statistics — and that they’re all aware in some sense that the others are out there, yet communication between them is logically impossible, so negotiation is never an option. And of those millions of civilizations, across millions upon millions of years, not one has ever successfully banded together and drawn more to join by its example, because the bad guys trump all.

Of course, in a work of fiction, that works fine, and I’m not attacking Mr. Liu or his books. In fact, it’s a standard trope in sci-fi to bend the universe politically and spatiotemporally to fit the needs of your story; most writers set up a system of technology and transport that more or less mimics earth in the age of ocean-going conquest. But a trope is not “brilliant” or “very real,” and suggesting we should shut down SETI “Right. Damn. Now.” is alarmist and anti-rational.

The motive here is the supposed heat death of the universe, where effectively immortal species are already planning ahead for life in a few billion years. (It’s possible, I suppose.) But then, the heat death is either avoidable or it’s not. If it’s not, then everyone is doomed, regardless of who’s left standing. If it is, then there’s no real existential threat. In fact, it may even be worthwhile to bring as many species into the “cleanup effort” as possible.

I suspect the reason the scenario presented in “The Three-Body Problem” is so convincing is that each of the conditions it requires appears, individually, to be the most plausible. And if each condition is the most plausible, then surely the result is the most plausible, right?

Wrong. Being the single most plausible scenario, a relative measure, doesn’t make anything plausible in absolute terms. (Read up on a heuristic called Representativeness.) For example, we don’t actually know for sure that there will be a heat death of the universe. It’s very possible — and may even be likely — but for the dark forest hypothesis to work, it must be true.

Furthermore, for the heat death to be true, our current understanding of thermodynamics must be more or less complete and unimpeachable. You have to accept that what humans believe at this point in history will never be trumped by any later discovery (the way Einstein trumped Newton). What’s more, you have to accept that a host of other scientific speculations are NOT the case: the multiverse, for example, or the possibility of triggering a new big bang, or of making internal changes to the universe that allow it to live on.

To put it in numerical terms (albeit artificial), if you have a series of six necessary conditions that must be met for an outcome to obtain, and all six are individually the most likely outcome — for simplicity, let’s assign a probability of 70% to each — then the odds are:

0.7 x 0.7 x 0.7 x 0.7 x 0.7 x 0.7 ≈ 0.1176

Or about 12%, meaning there’s an 88% chance that some other scenario is the case. Even though each condition is individually the most likely, the combined outcome is still unlikely in absolute terms.
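The arithmetic is easy to check (the 70% figure is, as above, an illustrative assumption, not a measured probability):

```python
# Six independent conditions, each assigned an illustrative 70% probability.
probs = [0.7] * 6

joint = 1.0
for p in probs:
    joint *= p  # multiply, because all six conditions must hold at once

print(round(joint, 4))      # 0.1176, about a 12% chance
print(round(1 - joint, 4))  # 0.8824, an ~88% chance of some other scenario
```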

Think of it this way. Cixin Liu presents us with a Caveman’s Dilemma: that the only options available to an advanced star-faring civilization are “to club or not to club.” But that requires the “no communication” condition. Civilizations must be forced to act while blindfolded; otherwise they have more than a simple dilemma. So it is that removing any one of the necessary conditions produces an entirely different result.

I think it’s far more likely that problems are soluble, even universe-sized ones, and that even if a solution weren’t ready, advanced species might — like Arthur C. Clarke’s precursors and their monoliths — impose limits on less advanced civilizations while they tried to figure it out, then explain the problem and ask for help once a species emerged from adolescence.

Altruism appears spontaneously in nature. It can be an evolutionarily stable strategy, conferring adaptive benefits on species that adopt it. For the dark forest hypothesis to work, we have to accept that it’s always shed and/or stamped out, and that in millions of years, the effects of chance were marginal and no civilization ever took a risk that paid off. I suspect at least a few sentient races are far more robust than that.

Of course, local conditions might vary in any number of ways. It could be that humans just happen to have evolved on a planet near a psycho race, and the bad guys get to us before any of the benevolent ones do. That could happen. But that’s not what Cixin Liu writes in the book and it’s not what people are praising. In fact, it’s an argument to INCREASE our search for extraterrestrial life. We need the support!

And that raises another problem with the dark forest analogy: It’s not actually a solution to the Fermi Paradox. The forest would neither be dark nor silent. Space is a vast freezer that keeps a record of every bird call ever chirped. Mankind, for example, has been leaking radio waves for decades. Those don’t go away. They spread out, they get weaker, some get lost, but our cluster of signals will keep spreading through the void for hundreds of thousands of years (millions?) before they’re finally so dispersed that they’re completely lost to the cosmic background and not even a technologically advanced race can make sense of them.
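How weak do those leaked signals get? The free-space inverse-square law gives a rough sense of it. The numbers below are hypothetical (a 1 MW isotropic transmitter, chosen for illustration), not a measurement of Earth’s actual leakage:

```python
import math

LIGHT_YEAR_M = 9.4607e15  # meters per light-year

def flux(power_w: float, distance_ly: float) -> float:
    """Power flux (W/m^2) from an idealized isotropic source in free space.

    The radiated power spreads over a sphere, so flux falls as 1/(4*pi*r^2):
    weaker and weaker with distance, but never actually zero.
    """
    r = distance_ly * LIGHT_YEAR_M
    return power_w / (4 * math.pi * r ** 2)

# A hypothetical 1 MW broadcast, 100 light-years out: on the order of
# 1e-31 W/m^2, vanishingly faint but still present in principle.
print(flux(1e6, 100))
```

Doubling the distance cuts the flux to a quarter, which is the “they get weaker, but they keep spreading” point in quantitative form.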

Keep in mind, the earth isn’t simply spinning and orbiting the sun. The sun itself is moving, as is the entire Milky Way galaxy and the local group of galaxies of which it’s a part. We’re leaving a noisy wake fully laden with identifying information. Not only have we left a footprint, it gets bigger with time! The same would have been true of just about any other technologically advanced civilization, even one that went dark later — and especially any that were eventually eaten (since they would have been found).

The more likely scenario, as Bill Nye explains in the video that accompanies the book review, is that we’re just not looking hard enough — for example, that radio is a clunky technology and we should be looking across the full EM spectrum (which we’re not).


While I didn’t tackle the Fermi Paradox in my first novel FANTASMAGORIA, the premise of the book was an answer to the question: Why would aliens bother? Why would they cross such distances, even if it were relatively easy for them? It could only be to satisfy a need that couldn’t be satisfied at home — curiosity, for example. Or in the case of my book, culture.

I proposed that any species capable of the journey would have long since figured out how to supply all its wants and needs. They would be more like the humans in WALL-E — fat and bored and looking for something to do! (There was also supposed to be a statement in there about the carnival-esque nature of fandom and our own society generally.)

Thus, the aliens in FANTASMAGORIA transmute entire planets into Westworld-like “pleasure worlds” — giant amusement parks — based on the myths and stories of the species they encounter, which was of course a great way to plausibly populate an entire planet full of stuff from classic pulp fiction: unicorns and dinosaurs and cannibal fairies and war dragons and wereninjas and a mechanical gunslinger and a shape-shifting scoundrel and a radioactive assassin — all of which physically existed there, rather than being some complex virtual reality or robot simulation.

That’s not to say my solution is the “correct” one. Nor am I impeaching Cixin Liu or his award-winning trilogy. We can do anything in fiction. We read to escape. I happen to like stories about ninja-monks with laser swords and magical powers that begin with the words “Long, long ago in a galaxy far, far away…” But then, no one is mistaking Star Wars for hard SF.


Cover image is “The Landing” by Henry Ledesma.

