Why I Don’t Buy The Dark Forest Hypothesis
By Cool Worlds
Summary
Topics Covered
- Dark Forest Fails Hart's Paradox
- Attack Risks Mutually Assured Destruction
- Telescopes Undermine Dark Forest Game
- Life Demands Risk Beyond Mere Survival
Full Transcript
Nine months ago, I made a video explaining why I do not buy one of the most popular proposed solutions to the Fermi paradox, an idea known as grabby aliens. The idea was conceived by economist Robin Hanson, and I addressed my criticisms to him directly in the video, so check that out if you haven't already. At the end of that video, I said that if you guys were interested, I would be happy to weigh in with my thoughts on the other most popular idea out there, and that is the dark forest hypothesis. And indeed, many of you said yes, please. So, today I'm going to explain what dark forest says and why, ultimately, I don't buy this one either.

Dark forest is a theory which, simply put, places fear as the primary motivating force for the behaviors of alien civilizations. Fear of the unknown, of what might be lurking amongst the stars, is one of the oldest and most vividly articulated paradigms in humanity's discussions of alien life.
Science fiction in particular amplifies this view, often portraying alien species as hell-bent on our extermination. And we have to acknowledge that we have been somewhat culturally preconditioned to expect this outcome due to Hollywood's lust for exploding cities. In a nutshell, dark forest proposes that the cosmos is a dog-eat-dog universe. It is actually filled with civilizations, but most of them stay silent for fear of being attacked. If a species does announce itself, then it will invariably be destroyed, simply as an insurance policy against possible future competition. So, the theory accommodates the premise that life could be common, and yet it explains the great silence that we observe: it is simply a consequence of this fear-driven mentality. Although the motivating ideas behind dark forest have been around for a long time, I guess what we might call the canonical dark forest theory stems from the 2008 science fiction series The Three-Body Problem, by Chinese author Liu Cixin.
I'll be honest, I remember when I first heard that the three-body problem, a book I own, had become a bestseller, I was kind of surprised but pleased that the public had finally got into Hamiltonian operators. But of course, it was a different book. So, what we now call the dark forest theory is usually a reference to the second novel in this Liu Cixin series. But even then, this isn't the first story to introduce these ideas. It might be fair to call Fred Saberhagen's Berserker novel series the father of dark forest, a series dating back to 1963. Worth a read if you haven't before. In these stories, a civilization known as the Builders builds the Berserkers as a weapon against their mortal enemy. But the Berserkers malfunction and spread across the galaxy, sterilizing all worlds of organic life. The closest thing to a scientific basis for the Berserker hypothesis comes from physicist John von Neumann, who discussed the idea of self-reproducing probes, now often simply called von Neumann probes.
Whereas Berserkers have the primary objective of killing all life, von Neumann probes simply aim to expand and reproduce, much like a bacterial colony. But it's easy to criticize the Berserker hypothesis as being plainly incompatible with our very existence. In fact, this was first pointed out by astronomer Michael Hart and is thus known as Hart's Fact A: that genocidal aliens have not visited us. And look, our planet has been infested with organic life for at least 4.2 billion years. So long that even if Berserkers were limited to the outrageously slow speeds of our Voyager 1 spacecraft, they could still have sterilized the entire galaxy two and a half times over by now. I do have some more nuanced thoughts on this, though, and I'll direct you there for that.
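Here's a quick back-of-the-envelope check of that figure, as a small Python sketch. The Voyager 1 speed (~17 km/s) and the galactic diameter (~100,000 light years) are my assumed inputs, not stated in the transcript; only the 4.2-billion-year figure comes from the video.

```python
# Back-of-the-envelope check of the "two and a half times over" claim.
# Assumed inputs (not from the video): Voyager 1 cruises at ~17 km/s and
# the Milky Way's stellar disk spans ~100,000 light years.

KM_PER_LY = 9.461e12          # kilometers in one light year
SECONDS_PER_YEAR = 3.156e7    # seconds in one year

voyager_speed_km_s = 17.0     # km/s, assumed
galaxy_diameter_ly = 1.0e5    # light years, assumed
life_age_yr = 4.2e9           # age of life on Earth, from the transcript

crossing_time_yr = (galaxy_diameter_ly * KM_PER_LY
                    / voyager_speed_km_s / SECONDS_PER_YEAR)

print(f"Galaxy crossing time at Voyager speed: {crossing_time_yr:.2e} yr")
print(f"Crossings since life arose: {life_age_yr / crossing_time_yr:.1f}")
# ~1.8e9 years per crossing, so roughly 2.4 crossings -- consistent with
# the "two and a half times over" figure quoted above.
```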
Regardless, I suspect this is one of the factors which motivated author Liu Cixin to modify the Berserker hypothesis into the dark forest idea. The key modification is that rather than sending probes out to every world in the cosmos, the Builders, or really now the Trisolarans in this story, only send them to worlds with technological species discovered on them, saving some resources. But the motivation for dark forest goes much deeper than this, and we will explore that next. Whether space is a dark forest or not, the internet certainly is these days. And so that's where today's sponsor comes in, Incogni.
In dark forest theory, species work hard to keep their presence undetected, and I think a lot of us feel the same way about our online presence. Of course, these days, companies often demand your personal information to start new services and subscriptions, but then they often sell that information on to data brokers. Your data is their product, often being sold to the highest bidder: information like your employment history, address, telephone number, and even social security number. You can fight back with Incogni. In fact, Incogni has custom removals in the Unlimited plan and Family Unlimited plan. So if you find your personal information visible on a people search site or other website, flag it and an Incogni private agent will take care of the rest for you. Sign up and they work to make you invisible again, chasing down where your information is and then using the weight of the law to demand that information be cleaned. For me, Incogni have now successfully deleted my personal info from nearly a thousand sources. A thousand. It would take me roughly 700 hours to have done this myself, so I'm glad to have the professionals' help. So, sign up today at incogni.com/coolworlds and use the code coolworlds for an exclusive deal of 60% off. Once again, that's incogni.com/coolworlds, code coolworlds. Now, back to the video.
So, if we're going to go deep into dark forest theory, and hey, this is Cool Worlds, so of course we are going to go deep, we have to introduce some concepts from game theory. The case for dark forest is largely motivated by imagining the cosmos as a sequential game, like a game of chess, in which each player takes turns to make actions. Dark forest assumes that the only objective of this game is to survive. But frankly, that's a rather tragic state of affairs if that's all you care about. So consider that an alien species, here the Trisolarans, receive a broadcast from humanity: not a directed message, but rather some kind of radio beacon that we blurt out in all directions. They now have three possible actions. They can ignore the broadcast, they could reply to it, or they could attack us. In game theory, we can compare these possible actions by weighing their net payoffs.
So consider the first possible action the Trisolarans could take: to reply to us. In this case, humanity will gain information. We will now know that they exist, and their location, or at the very least the location of their relay stations. This means that the ball is now in our court as to what happens next. Humanity could continue to communicate in friendly dialogue with, let's call it, a probability P_C, and that would lead to some long-term benefit to the Trisolarans. If we dub this benefit F, then the payoff of this sequence of events to the Trisolarans is simply P_C multiplied by F. But maybe humanity isn't so friendly. Perhaps instead we attempt to exterminate the Trisolarans, removing the player from the game. Our message was then just bait to find their location. Of course, our current technology might not be able to sterilize a planet, but they don't know that. And plausibly, we could do it in a few centuries regardless. So, let's call the probability of us attacking them P_A. And the payoff from that, from the Trisolarans' perspective, is typically assumed to be very bad. In fact, infinitely bad. So, minus infinity.
So now the Trisolaran payoff from this sequence will be P_A multiplied by minus infinity, which of course is just minus infinity. When we add up the different payoffs, including the ignore case, the net payoff is dominated by that minus infinity, and so replying looks like a bad idea. Instead, the Trisolarans could just ignore the broadcast. But then it's only a matter of time until we detect them anyway as our technology grows, and then the attack risk would reappear. And so this branch also ends up having a minus infinity payoff. The dark forest argument is thus that aliens will never engage in the reply or ignore options because of those abysmal payoffs. And this leaves them with just one course of action: attack. Whilst this might cost them some resources, that would pale into insignificance compared to their own demise, and thus it essentially represents a neutral action and the best play available.
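As a minimal sketch of that naive decision tree, here is the payoff bookkeeping just described in Python. The variable names follow the transcript (lowercased), the numbers are placeholders, and `float("-inf")` stands in for the "infinitely bad" extinction payoff.

```python
# A minimal sketch of the naive dark forest payoff comparison described
# above. Variable names follow the transcript; numbers are placeholders.

NEG_INF = float("-inf")   # "infinitely bad": extinction of the receiver

def reply_payoff(p_c: float, f: float, p_a: float) -> float:
    """Expected payoff of replying: friendly dialogue with probability
    p_c (benefit f), or humanity attacks with probability p_a (-inf)."""
    return p_c * f + p_a * NEG_INF

def ignore_payoff(p_a: float) -> float:
    """Ignoring only delays detection, so the attack risk reappears."""
    return p_a * NEG_INF

def attack_payoff(resource_cost: float) -> float:
    """The naive model assumes attacks always succeed, so the only
    loss is a finite resource cost."""
    return -resource_cost

print(reply_payoff(p_c=0.5, f=10.0, p_a=0.1))   # -inf
print(ignore_payoff(p_a=0.1))                    # -inf
print(attack_payoff(resource_cost=1.0))          # -1.0  <- "best play"
```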
Proponents of the dark forest hypothesis cite this game-theoretic underpinning as its major strength. Combined with the fact that it explains the great silence, it seems like dark forest has everything buttoned up pretty well. But I have two major criticisms of dark forest. And the first is that this game-theoretic argument is too simplified, and frankly a little naive. So to recap, the argument is that for aliens like the Trisolarans, replying to or even ignoring a broadcast from humanity is always too risky; far safer just to destroy the sender.
Presumably, all species come to the same logical conclusion, and thus the universe ends up being very quiet. But the argument here is one of deduction. Recall that the reply and ignore strategies were found to have minus infinity payoff, and thus we concluded that the third option must be the preferred strategy without really giving it too much thought. So let's look a little bit more closely at that strategy. Now, here the Trisolarans attack humanity upon receiving our message. Now, in the novels, the Trisolarans live in the Alpha Centauri system, which just so happens to be the closest star system to us. This is an enormous contrivance for the sake of the story, to allow for fast travel. The odds that the very next-door system not only has life on it, but a technological civilization, are remote. In fact, Avi Loeb has formally estimated this to be 1 in 100 million using a Copernican principle argument.
Yet more, a neighboring civilization like this would surely already know about us, and thus our continued existence already undermines the dark forest theory. In practice, I think it's safe to assume that another civilization would be hundreds or even thousands of light years away. That's important, because it introduces significant risk to the attack branch: now an attack is by no means guaranteed to be successful. It's quite possible that in a thousand years from now, humanity's technological prowess will have exponentially increased. Even if their attack travels at the speed of light, it would still potentially take many centuries to reach us. By then, we may be able to trivially defend against the attack. Or perhaps by that point we would have spread across the solar system, or into deep space on O'Neill cylinders, or even colonized nearby star systems. If even a single vestige of humanity survives, we could plausibly retaliate. And remember, the Trisolarans have essentially no intelligence about us whatsoever beyond that initial broadcast they received. For all the Trisolarans know, humanity could be far more advanced than they are, especially a millennium from now.
So whereas before we had this chart, we're going to add some nuance by allowing for the fact that an attack is not guaranteed to be successful. So one outcome is that the attack is successful, and thus humanity dies, with a probability P_AS. But another option is that it was unsuccessful, with a probability 1 minus P_AS. Master Yoda, you survived. Now, at this point, if you really wanted, you could add another decision tree as to how humanity responds to such an attack, but simply adopting the likely response of a counterattack suffices for our purposes.
Now, it's worth noting that that attack, and indeed the other attacks on here, are also not necessarily successful either. But they all carry some finite chance of success multiplied by a minus infinity loss for the Trisolarans, and thus their payoff will always end up being minus infinity. And the counterattack doesn't even have to come from humanity directly. It could come from an alien ally, or our pet AI. The result is the same, and that is that the net payoff of all three actions is now minus infinity. So it seems like this game-theoretic approach has actually kind of broken down here. Ostensibly, all three of these actions have equally awful payoffs. But this really only happens because of the introduction of those minus infinities.
To make mathematical sense of this, we have to replace them with a finite value of existence. Let's call it E. And so this now gives us something like the model shown here. When we go through and add it all up, the attack behavior is the superior strategy to the others only when these inequalities are satisfied, which we can actually simplify if we assume that the value of dialogue is pretty negligible compared to survival. And because of the sign dependency, these really just boil down to this: the attack behavior is only worthwhile if the probability of an attack being successful exceeds the probability that civilizations don't attack first. A remarkably simple result.
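The actual inequalities appear on screen rather than in the transcript, so what follows is my hedged reconstruction of how that condition can fall out, using the quantities defined above (E, F, P_C, P_A, and an attack-success probability written P_AS); the video's exact bookkeeping may differ.

```latex
% Reconstruction, not the on-screen derivation. Quantities as defined
% in the transcript: E = finite value of existence, F = benefit of
% dialogue, P_C / P_A = probabilities humanity cooperates / attacks,
% P_{AS} = probability a Trisolaran first strike succeeds, R = its cost.
\begin{align}
  \mathbb{E}[\mathrm{reply}]  &\approx P_C F - P_A E, \\
  \mathbb{E}[\mathrm{attack}] &\approx -R - (1 - P_{AS})\,E.
\end{align}
% Attack dominates when $-R - (1 - P_{AS})E > P_C F - P_A E$.
% Neglecting the dialogue value $F$ and cost $R$ against survival $E$:
\begin{equation}
  P_{AS} > 1 - P_A,
\end{equation}
% i.e. attacking only pays if the chance of a successful strike exceeds
% the chance that the other civilization would not attack first.
```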
What the canonical dark forest argument assumes is that attacks are always successful at eliminating other players. So, for example, Project Nash comes up with a game-theoretic chart where attacks are always assumed to equate to destruction. To me, this is a mistake. What this really comes down to is that attacking risks mutually assured destruction, MAD. This, of course, is a concept familiar to anyone worried about geopolitics or modern superpowers, each with a nuclear arsenal capable of annihilating the other. Even though nations may be adversaries, it is not beneficial to go around nuking every player; the risk of retaliation is simply too high. When we factor in MAD, dark forest starts to make a lot less sense. There is a way to avoid MAD, to be a genocidal maniac without the risk of retaliation, and it's really quite simple. The safest way to avoid the risk of a counterattack is to launch your assault before the other player spawns: to sterilize their home planet before they have a chance to develop.
And this, of course, is the original Berserker hypothesis that we discussed earlier, the seed from which the dark forest took root. You know, The Three-Body Problem always kind of bothered me on this point. How could it be that there was an advanced civilization just a few light years away, one capable of interstellar travel and even folding protons, yet one that had no idea that the Earth was inhabited by a technological species? In the story, our broadcast is what triggers their attack. But surely they would have known about humanity decades or even a century earlier than this, and thus had the chance to attack us when we were far more defenseless. As an example, consider that humanity has altered the chemical composition of our planetary atmosphere over the last century. Besides triggering climate change, many of the chemicals we produce, like CFC gases, have no natural production channels and thus would immediately betray our presence. The Berserker strategy goes further than this, because even waiting for a technosignature like CFCs would be too risky. Once again, humanity could greatly advance in the years it takes to attempt a sterilization, and thus they would risk retaliation.
Berserkers neatly sidestep this risk by simply sterilizing all planets with life on them, regardless of their evolutionary development. We can make a similar game-theoretic chart when playing in Berserker mode. Let's imagine the Trisolarans evolved millions or even billions of years before us. So, after noticing that the Earth has life on it, they could just ignore it. In that case, effectively the next turn is billions of years later, when humanity develops and discovers the Trisolarans. So this looks largely the same as before, carrying the same risk of Trisolaran extinction. On the other hand, the Trisolarans could attack Earth, but now there's no impediment to their success. Even if something went wrong, they still would have billions of years to try again. Perhaps this costs them some resources, R. But if R is small compared to the value of existence, E, then it becomes clear that this is the optimal strategy, the so-called Nash equilibrium.
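In Berserker mode, the comparison collapses to something very simple. A minimal sketch, again with illustrative placeholder numbers rather than anything from the video:

```python
# Sketch of the Berserker-mode comparison above: striking before a
# rival evolves removes the failure branch entirely. Values are
# illustrative placeholders, not figures from the video.

E = 1_000.0   # finite value of continued existence (replaces -inf)
R = 1.0       # resource cost of a pre-emptive strike, R << E
p_a = 0.1     # chance a later rival attacks first if ignored

ignore_payoff = -p_a * E   # wait billions of years, risk extinction
early_strike_payoff = -R   # success effectively guaranteed pre-life

print(f"ignore: {ignore_payoff}, strike early: {early_strike_payoff}")
# With R << p_a * E, striking early dominates -- the transcript's
# candidate Nash equilibrium.
```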
The issue of resource scarcity, though, is often framed as a primary argument motivating the dark forest hypothesis over the Berserker one, because eliminating life on 100 billion worlds would add up; that would be very expensive. But I think that this misses the argument of von Neumann: that the originator of the Berserkers need not shoulder the resource cost for all probes. Rather, they simply build the first few Berserkers, which then gather materials on distant worlds and use those for self-replication.
I think the most compelling argument against the Berserker hypothesis, though, is actually different, and it hearkens back to the original stories of Fred Saberhagen. If these probes are self-replicating, capable of propagating their build code from one generation to the next, then they will be capable of Darwinian evolution and mutation. Accordingly, there is always some risk that these probes will malfunction and eventually attack their creators. In essence, we have introduced a new player into the game, one who is bent on destruction from the outset. And for me, this really nullifies the clear advantage that the build-Berserkers branch initially seemed to have.
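To put a toy number on that risk: if each replication event independently corrupts a probe's build code with some small probability, the chance that at least one rogue probe ever appears approaches certainty over galactic-scale replication counts. The mutation rate and replication counts below are illustrative assumptions, not figures from the video.

```python
# Toy model of the mutation risk: mu is an assumed per-replication
# chance of a build-code corruption; across n replications, the chance
# that at least one rogue probe appears is 1 - (1 - mu)^n.

def p_any_rogue(mu: float, n_replications: int) -> float:
    """Probability that at least one of n replications goes rogue."""
    return 1.0 - (1.0 - mu) ** n_replications

for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>13,}: P(any rogue) = {p_any_rogue(1e-6, n):.6f}")
# Even a one-in-a-million copying error becomes a near-certainty across
# the billions of replications a galaxy-wide sweep would require.
```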
Despite that, I still think there is a better case for the Berserker hypothesis than the dark forest hypothesis. Because consider that we are already putting AI into all sorts of devices right now, including spacecraft control. Eventually, it may become very difficult to distinguish between humanity and AI as we gradually merge our thought processes with machines. In such a case, the distinction of Berserkers as a third player might dissolve: as long as there are some Berserkers still out there, then in effect, the creator is still going.
To try and wrap this criticism up a little bit more neatly, I think that the whole premise of sitting around waiting for alien communication and then attacking anyone that says hello ignores the existence of telescopes. Similarly, avoiding broadcasting for fear of being discovered and thus destroyed ignores the existence of telescopes. Because look, already we have telescopes bordering on being capable of remotely detecting life on another planet. And in another century, it's not hard to imagine us building super-telescopes cataloging living worlds across the galaxy, analogous to how we are currently cataloging exoplanets. The game theory argument assumes that the only information exchange that can occur is via messaging, or essentially an attack. But telescopes are like a spy network that undermines this premise. Thus, the whole idea of waiting around for a message trigger doesn't make sense. If you are able and willing to go around the galaxy sterilizing planets, then there is no reason to wait until the last moment to do it. In fact, doing so would come with substantial risks of retaliation.
Dark forest is a theory about fear, but truly only our fear. The truth is that we have no idea if other civilizations would share such a fear, since we have no interactions with them. And any argument that casts aliens as genocidal maniacs is really a statement more about us than them, since we have zero information about their behavior. Dark forest is a mirror, like, in fact, much of SETI is. It's a mirror revealing a dark side to our own human nature: that in moments of fear, we are capable of capitulating to violence. You know, if we followed the tenets of dark forest in our own lives, it would be a pretty grim world. We would never have children, for that would introduce new players that could eventually turn around and kill us. We would never invite friends over, because they would then know our location and could then kill us. In fact, we'd never step outdoors or even interact with the outside world in any way, because all of that would carry some finite risk to our own mortality.
Certainly as individuals, but even collectively as nations, we do not operate this way. Instead, we accept risks in our lives. Every time we get in a car or a plane, it's a risk. Every time we bite into a juicy burger, it is a risk. Every time we fall in love, holy smokes, does that come with risk. There is more to life than just existing for the sake of existing. You can be alive without really living. So yes, reaching out comes with some risk. But hey, who wants to live forever anyway? Don't we want to experience a meaningful and rich existence? Some species will stay silent. They will fear the dark. But I suspect that humanity, and other species like us, will be willing to accept the risk, to roll the dice for a more fulfilling existence. So until next time, stay thoughtful and stay curious.
Thanks so much for watching, everybody, and a special thank you to everybody who has subscribed to our channel so far, because we have just crossed the 1 million subscriber threshold, which I never thought in my wildest dreams we would cross. So, to celebrate, I'm going to do a little ask-me-anything. I'm going to put a thread down below in the comment section; reply there with any of your questions. And I couldn't have done this without all of our donors, so thank you to everybody who has supported our research program. If you too want to get in on that, you can use the link up above and down below. See you next time.