Humanitarian Intervention and the West’s Failures


The West, and the United States in particular, still seems to think that the best way to produce peace is by killing people. At least, that is what its tradition of military intervention, from Libya to Iraq, would have us think. The reality, however, is that prolonged military engagements for humanitarian purposes frequently backfire for both the intended recipients of aid and the would-be humanitarians. Even when these engagements avoid such immediate consequences, they drain resources at home that could go towards strengthening healthcare and education programs. While preventing mass casualties abroad is a worthwhile cause, a United States that strictly followed this noble goal would have had to send expensive forces into countless dangerous areas, including Congo, Syria, Myanmar, Venezuela, Libya, Yemen, South Sudan, the Philippines, and other nations experiencing geopolitical unrest and humanitarian crises. Given resource constraints, such a situation is untenable. With that in mind, this article will argue that the West’s tradition of humanitarian intervention as we know it today should end.

Such a thesis would be indefensible without qualification; to oppose natural disaster relief, for example, would be monstrous. Thus, I define the West’s tradition of humanitarian intervention as the exercise of military force by Western powers under the publicly stated rationale of ameliorating a humanitarian crisis. I define a humanitarian crisis as a situation in which armed forces, whether they be an army or an insurgency, pose an imminent and grave threat to human life. These definitions exclude from my thesis all non-military operations (e.g. those conducted by the World Food Program), disaster relief, and many other peaceful aid programs facilitated by Western troops.

For further clarification, this article does not contend that the West should never resort to military intervention. But the Western tradition is one of reflexive engagement, not reluctant intervention. There are times when intervention works, like Britain’s entry into Sierra Leone. [1] But our tradition of reflexive engagement is not conducive to wise intervention. Consequently, this article argues that the West, and the United States in particular, should adopt a default principle of restraint. The upcoming section examines the theoretical underpinnings of such a principle, and the subsequent section elucidates the theoretical argument with historical examples.

President Donald Trump’s “America First” rhetoric is objectionable for its appeal to xenophobia and its dark historical roots. But that does not mean that the principle behind the rhetoric is necessarily wrong. To paraphrase Rich Lowry, editor of the National Review, if not America first, then who do we put first? [2] Perhaps “duty to country first” would be a better, albeit clunkier, slogan. As public opposition to current foreign aid spending indicates, in the abstract, most Americans would likely agree that the U.S. should prioritize Americans. [3] But if you asked them “whose life is more valuable, an American life or an Ethiopian one?” they would probably tell you, with no small amount of indignation, that the question is downright offensive. No one’s life is more valuable than anyone else’s. And yet, the U.S. spends more money on Medicare than foreign aid by a long shot, despite the fact that it costs much less money to save an Ethiopian life than an American one. [4] [5] Is this some moral failing, or the instantiation of a country’s natural tendency to prioritize its own citizens? Just because everyone has equal intrinsic worth (and they do) does not mean that the U.S. should prioritize those suffering abroad at the expense of Americans’ well-being. To do so would violate the primary duty every country has to its own citizens.

"Just because everyone has equal intrinsic worth (and they do) does not mean that the U.S. should prioritize those suffering abroad at the expense of Americans’ well-being."

John Locke famously argued that government gains its legitimacy through consent of the governed. [6] While this is up for serious dispute, I doubt many Americans would disagree with the similar notion that the American president gains whatever authority he has through his election, a process whereby the people select their trustee for international affairs. Congressmen are expected to represent the interests of their constituents first, not another district’s interests. Why shouldn’t we demand the same thing from the president on the international stage? Sometimes that requires being bystanders to suffering abroad. While this might seem coldly arbitrary, I doubt that many Westerners would want their countries to pluck funds from domestic healthcare programs and give them to people across the world. The very fact that American liberals, those who most vociferously advocate for those in need, have recently proposed high-priced domestic programs suggests that they too subscribe to something like “America first,” even if they would never admit that in a town hall. [7]

While some critics assert that it is not the United States’ place to interfere with the affairs of a sovereign nation, the primary duty to country doctrine dictates that some interventions are just. In other words, the moral issues at stake in taking control of another nation are subordinate to American interests. This is not to say that the U.S. can do whatever it wants; repressing citizens abroad for relatively small domestic benefits would be sinful, and the United States has engaged in such behavior. [8] But liberals go too far when they condemn all interventions based on past American transgressions. More convincing is their argument that intervention undermines America's credibility, thereby harming U.S. interests. [9] But the principle of primary duty to country accommodates such concerns: an intervention that severely harms American credibility, and therefore U.S. interests, violates the duty to country doctrine.

The above analysis gives Westerners reason to be suspicious of adventurous military outings that provide little benefit to their countrymen. However, they should not be so mistrustful of interventions that seem likely to either I) cost them little to nothing and save foreign lives or II) directly benefit them. Unless one of these two conditions is fulfilled, barring extreme, unforeseeable circumstances, the West should refrain from intervention.

Having determined which interventions are justified, we must address what constitutes success for an intervention. In a type I intervention, success requires low mortality and financial cost for the intervening countries and significantly reduced casualties compared to what would have occurred without the intervention. A type II intervention is trickier to articulate because it depends on what one thinks counts as a benefit to a country’s citizens. In terms of international relations, a realist might think that military action that enhances a country’s power should count as a benefit to its citizens. But this logic is not typically associated with humanitarian goals. To avoid getting tangled too deeply in the theoretical weeds, this article will only consider one of the big issues: stability. While this is far from the only issue at stake, increased stability abroad surely counts as a direct benefit to a Western country’s citizens due to associated reductions in terrorism and economic turbulence at home.

But does the West’s tradition of intervention tend towards successful type I and type II interventions? Unfortunately, as we are about to see, it does not. Type I interventions are rare, especially in recent years; military interventions are expensive by nature. Even missions advertised as “advise and assist” operations for the benefit of local troops frequently turn into costly and indefinite military presences. [10] Without many options to choose from, this article will briefly analyze type I interventions through the examples of Somalia and Libya. It will then examine the type II intervention of the Iraq War.

A good example of the typical failures of a type I intervention is Somalia. The Somalia intervention of 1992 began as an attempt to mitigate both a war-induced famine and a brutal regime but morphed into an expensive long-term aid operation. [11] After UN efforts to provide relief came under threat from warring Somali factions, U.S. President George H.W. Bush proposed that 25,000 American troops enter the region to protect UN workers. And yet, unrest continued to undermine the operation. The situation came to a head on October 3, 1993, when Somali militia fighters surrounded two crashed American helicopters in the famous “Black Hawk Down” incident. [12]

This story is not black and white. UN and American efforts saved an estimated 100,000 Somali lives. [13] But there were consequences. The United States left the still-chaotic region with 18 Americans and 2 UN soldiers in body bags. [14] Many experts label the intervention a failure. [15] Others contend that, if anything, the problem was that we failed to commit enough resources to nation-building. [16] As I have argued above, such intensive aid would likely have been unjustified given the high price tag. This view finds support from Walter Clarke and Jeffrey Herbst, who argued in 1996 that “there is no such thing as a humanitarian surgical strike.” [17] Another national embarrassment of that sort would shake the foreign policy establishment and the public, much as it did then, risking retrenchment at times when aid might actually accomplish something. Furthermore, despite the West’s reluctance to commit further military resources to Somalia, the ongoing conflict there has cost the world an estimated $55 billion, with little improvement in conditions on the ground. [18] The attempt to intervene surgically, without implementing long-term political reforms, has turned Somalia into a money pit for America.

Whichever side one comes down on regarding the Somalia intervention (too much involvement or not enough), it is clear that type I interventions can get out of hand quickly. With the limited military commitment that a type I intervention requires, one risks leaving the region little better than the intervening country found it. Too much commitment can engender political opposition to sensible aid programs (e.g. efforts to mitigate the Rwandan genocide) and political instability. I will discuss the latter issue in the context of the American airstrike campaign that began in 2011.

In an attempt to prevent an “imminent massacre” by dictator Muammar Qaddafi and usher in democracy, the U.S. began an airstrike campaign in Libya in 2011. By its own standards, it failed miserably. As Foreign Affairs bluntly put it, “Libya has not only failed to evolve into a democracy; it has devolved into a failed state...Libya now serves as a safe haven for militias affiliated with both al Qaeda and the Islamic State.” [19] But the standards for success of a type I intervention are not as stringent as the Obama administration’s: it only needs to cost the U.S. little and save lives compared to not intervening.

Depending on whom you ask, the intervention in Libya cost between $15 and $300 million per week. [20] For a government of the United States’ size, that may not seem like a lot, but that money could have gone to any number of programs, especially if the U.S. committed to restraint as a guiding principle. It would then not need to maintain such high baseline levels of military readiness, which are much more costly than the intervention itself. [21] Even if we set aside structural military changes, $100 million per week is more money than seems appropriate for a type I intervention to be called “successful,” especially considering that the campaign lasted seven months. [22]

"$100 million per week is more money than seems appropriate for an intervention to be called “successful,” especially considering that it lasted seven months."

It is also unclear whether the intervention saved lives in the long run. In the short term, it prevented Qaddafi from hunting down his people “inch by inch, house by house, home by home.” [23] However, opponents argue that the intervention failed because “violent deaths and other human rights abuses have increased severalfold” since it took place. [24] While we can never know for sure what would have happened had the U.S. committed even more to the intervention, that is irrelevant to the argument here. Such commitment would not constitute a type I intervention, and it would not benefit the U.S. substantially enough to count as a successful type II intervention.

While the Libyan intervention is murky, the burden of proof appears to be on proponents of intervention for similar cases in the future. Having demonstrated that two attempts at type I interventions are not clearly successful, this article will now turn to type II interventions.

The United States originally justified its invasion of Iraq with the accusation that Saddam Hussein possessed weapons of mass destruction (WMDs). [25] After steamrolling the Iraqi army, the U.S. failed to locate any WMDs and altered its justification for the war to include rebuilding Iraq and bringing humanitarian aid. [26] This is an attempted type II intervention because the U.S. publicly stated that humanitarian aid was a goal, and bringing stability to the region would benefit American citizens by reducing the risk of terrorism. But things didn’t go as planned.

American and other Western troops remained there until late 2011, at a cost of more than 4,000 American soldiers’ lives and $2 trillion. [27] [28] [29] This extreme cost to the United States already disqualifies Iraq from being a success, but it gets worse. Around 165,000 Iraqi civilians died as a direct result of the war. [30] Given the uncertainty surrounding how many people Saddam Hussein would have murdered during that period had the U.S. not intervened, it is unclear whether the American military represented an improvement in the death toll. But the results of the war surely do not count as a peaceful order or a humanitarian success. The Iraqi government still runs roughshod over basic human rights, mortality is high, and democracy isn’t even on the table. Many argue that the war laid the groundwork for ISIS, which publicly decapitated multiple Westerners and others, sending the world into a panic. [31] [32] Furthermore, nation-building facilitated by indefinite military occupation requires obscenely large financial commitments. This cannot be a long-term model for the West. While Iraq represents only one example, it illustrates just about everything that can and often does go wrong with the West’s so-called “humanitarian” intervention.

This article is by no means exhaustive, but it does provide a basic, accessible framework for debates about humanitarian intervention. It turns out that the framework’s requirements are demanding enough that many recent humanitarian interventions are unjustified, including the most prominent of our lifetimes.

A desire to aid others is commendable, but doing so frequently runs counter to the government’s obligation to its own citizens or backfires. This gives the West good reason to refrain from using force, however heart-wrenching that may be.

James McIntyre, Staff Writer

Edited by Economics Editor Lan Phan

Sources and Notes

Featured Image: “Apache” by Noel Reynolds — Own work. Licensed under CC BY 2.0 via Flickr Creative Commons.

