Governments spend large amounts of money fighting wars, but until recently these expenditures have not been subject to rigorous analysis. The reason is obvious: during a war, the priority is to win. No one wants to second-guess the generals on how money should be spent. After a war, the question of whether the money was well spent is of interest to historians; public attention is focused on more pressing issues, including dealing with the aftermath of the war. The Iraq and Afghanistan wars, however, were different. Unlike most wars, they were wars of choice. Iraq did not attack the United States; the US invasion of Iraq was part of a new policy termed “preemptive” war. The initial US airstrikes in Afghanistan were launched to eradicate Al-Qaeda strongholds after the attacks of September 11, 2001. But subsequently, the US made a decision to topple the Taliban government and to mount a full-scale war in Afghanistan that has continued for nearly a decade. In both Iraq and Afghanistan, the advocates of war have maintained that military action was necessary to ensure US security. However, the US has been able to determine, to a large extent, the tempo of the wars, the scale of US military intervention, the number of troops deployed, and the amount of funding devoted to these endeavors. These were also long wars—arguably the longest wars the US has ever fought.1 After a year or two, it was clear that the conflicts would continue for an extended period of time—long enough for an analysis of their benefits and costs.
Stiglitz, Joseph E., and Linda J. Bilmes. "Estimating the Costs of War: Methodological Issues, with Applications to Iraq and Afghanistan." The Oxford Handbook of the Economics of Peace and Conflict, edited by Michelle R. Garfinkel and Stergios Skaperdas, Oxford University Press, 2012.