Which arguments are you likely to win in this debate? Why, your arguments, of course. And, by extension, which arguments are your opponents likely to win? Their arguments. So, if you win all your arguments, and they win all their arguments, who should win the round?
Or, in more concrete terms, say you win that your plan saves a million impoverished people from dying of starvation, but they win that your plan will crash the US economy and cause a war to break out, say between the US and China. Who wins? If a million people dying of starvation is worse than the war, then the plan is a good idea and you win; if the opposite is true, they win. The victory, in other words, goes to the team that prevents the "worse" impact.
So, what makes an impact worse than another? Arguments that answer this question are called Impact Comparisons, and these arguments are critically important in competitive debate. The "You win all your arguments, they win all their arguments" scenario doesn't often play out exactly, but some version of it does happen quite a bit, for exactly the reasons in my hypothetical scenario. Impact comparisons also focus the debate on a core concept, so if you can win an impact comparison you may be able to make a large part of the rest of the debate irrelevant.
The question that is probably burning on many minds by now is "well, how many people does the US-China war kill? Is it more than a million?" This is a magnitude comparison. For countable impacts like this, magnitude is fairly straightforward: how many die? how far does global GDP fall? For some other impacts, this can be a bit trickier: Is societal sexism worse than societal racism? Is a genocide a worse atrocity than a war even if fewer people die?
Of course, magnitude is not the only thing that matters when it comes to making decisions. Say you are considering walking to the library. If you decide to go, there is a small (but real) possibility that you could get hit by a bus on your way. Getting hit by a bus has a much greater magnitude than going without the latest Harry Potter, and yet you go to the library anyway. Why? The probability of getting hit is so low that you decide it is worth the risk.
It is the same way with policymaking. Higher probability impacts are more important than lower probability impacts. Here is where a smart debater can sway a debate in their favor. You might not be able to win your link turn on the Econ Disad, but you might be able to mitigate the probability. After all, there are a lot of things that have to go wrong for a crash to lead to a war... maybe you've got evidence that the US wouldn't want to attack China. And maybe your case harms are worse than a 20% chance of war, even if fewer people die.
When comparing impacts based on magnitude and probability, it's important to note another characteristic: whether an impact is Linear or Terminal. Terminal impacts are a yes/no proposition. War either happens or it doesn't. We either get hit by an asteroid or we don't. Linear impacts have varying degrees of bad (that scale "linearly" if you were to plot them). We might be able to save all 1 million impoverished folks, or we could save 500k of them, or 100k of them, or none of them, or any amount in between.
The upshot of this is that a statement like "their impact has 50% probability" is ambiguous if I don't know much about the impact. A 50% chance of saving a million people from starvation (and probably saving 500k) looks much different than a 50% chance of a war breaking out.
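To make that ambiguity concrete, here is a minimal expected-value sketch in Python. The 1,000,000 starvation figure comes from the running example; the 1,500,000 war toll and the `expected_value` helper are illustrative assumptions, and real debates rarely reduce to arithmetic this clean.

```python
def expected_value(prob, magnitude):
    """Naive expected-value view of an impact: probability times magnitude."""
    return prob * magnitude

# Linear impact: saving up to 1,000,000 people from starvation.
# At "50%" we plausibly land somewhere in the middle: roughly 500,000 saved.
linear_expected = expected_value(0.5, 1_000_000)

# Terminal impact: a war killing 1,500,000 (a hypothetical number).
# The expected toll is 750,000, but the only real outcomes are 0 or 1,500,000.
terminal_expected = expected_value(0.5, 1_500_000)
terminal_outcomes = (0, 1_500_000)
```

The two expected values look comparable on paper, but the terminal impact never actually produces its "expected" outcome, which is exactly why a bare "50% probability" claim needs to say what kind of impact it attaches to.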
In addition to magnitude and probability, the last of the "three great comparisons" is timeframe. It's an intuitive notion that a bad thing today is worse than a bad thing 100 years from now, but it might pay to reflect on why. If a policy now would cause a war in 100 years, maybe we'll figure out how to stop the war in the meantime. If the plan costs a lot of money in 10 years, maybe we'll be able to save up.
For many affirmatives (especially on this year's topic), timeframe is an important weapon, because harms like poverty are happening now (they are "systemic," as some like to say). Of course, this can be taken too far. For example, people are infected with AIDS right now, with 100% certainty. But say we want to put more research money into finding a cure. The research has maybe a 10% chance of finding a cure (generous), and it might take 10 years. Suddenly your timeframe advantage is less definite. If your opponents know what they are doing, your advantage is only as short as the longest link in the chain.
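The "longest link in the chain" point can be sketched numerically: the chain's overall probability is the product of every link's probability, while (on the framing above) its timeframe is governed by its slowest link. Only the 10% research chance and the 10-year horizon come from the example; the other links and their numbers are hypothetical.

```python
# Hypothetical link chain for the AIDS-research advantage above.
# Each link: (probability it holds, years it takes). Numbers are illustrative,
# except the generous 10% / 10-year research link from the example.
links = [
    (0.9, 1),   # funding actually reaches researchers
    (0.1, 10),  # research finds a cure
    (0.8, 2),   # cure is manufactured and distributed
]

overall_probability = 1.0
slowest_link_years = 0
for prob, years in links:
    overall_probability *= prob  # every link must hold for the advantage to happen
    slowest_link_years = max(slowest_link_years, years)  # chain waits on its slowest link
```

The product here works out to about 7%, and the timeframe to 10 years: each extra link can only shrink the probability and stretch the timeframe, which is why a savvy opponent attacks the chain rather than the endpoint.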
The three great comparisons are powerful tools, but a few more Dos and Don'ts are in order so that you can use them most effectively.
- Comparisons aren't comparisons unless they actually Compare. Don't just tell me your war happens fast, tell me why it's fast-er than theirs.
- Most of the time, you can't win all three. It's much better to tell me why the comparisons you are winning are more important than to grasp at straws.
- Get specific. Not all wars are created equal. A "nuclear war" between the US and Korea wouldn't be fun, but it would pale in comparison to a US-Russia nuclear conflict.
- The three biggest lies in the world are: "The check is in the mail", "This might sting a little", and "Our plan results in extinction." Don't let your opponents over-claim their impacts.