A longer look reveals interesting patterns and may clarify what is driving a rise in suicides.
In the global war on terrorism, America’s longest war, a service member is more likely to die by their own hand than an enemy combatant’s. U.S. military suicides have been rising steadily since 2004, and no one is sure why, despite researchers parsing the numbers with the most sophisticated scientific methods in history.
At the dawn of a new decade, it is time to broaden the scope of research and use history to inform our problem-solving and the policies we develop as a result. Incorporating historical data can help scientific researchers recognize and separate chronic forces from acute factors affecting suicide rates. Instead of analyzing military suicide over the past 20, 50 or 70 years, what if we examined available records and documents from the past 200? We did just that in a recently published study.
While war and serving in the military have always been highly stressful, we found that throughout the 19th and first half of the 20th centuries, periods of war were associated with decreased suicide rates in the U.S. Army, something that flipped with the long war in Vietnam. More broadly, we found that throughout the 19th and early 20th centuries, U.S. Army suicide rates were well above today’s rates. Then, beginning with World War II, rates dropped precipitously. From 1941 until 2004, these numbers were comparable to or even below civilian rates. The military suicide rate overtook its civilian counterpart between 2007 and 2008, upending a paradigm that had stood since World War II — and no one has been able to discern why. But history may provide insight into how to tackle this tragic problem.
U.S. Army active-duty suicide rates increased over the course of the 19th century, eventually peaking in 1883 with a reported rate of 118.69 per 100,000 soldiers. Then the rate declined in three successive waves, each occurring with the end of a conflict, specifically the Spanish-American War, World War I and World War II. Notably, the U.S. Army officially reported its lowest suicide rate, of 5 people per 100,000, in 1944-45.
During World War II, total civilian mobilization and new legal and financial protections and opportunities for service members coalesced to significantly reduce the suicide rate. These included measures such as the G.I. Bill and a program that provided starter loans to service members and veterans for education and home purchases. These measures helped them secure their finances, start families and reintegrate into American society. New legal protections also helped keep them from being evicted while deployed. And service members’ retirement benefits were improved.
Following World War II, to maintain commitments abroad, the United States drafted a standing army larger than ever before. To enhance retention and keep the U.S. military competitive with the private sector, President Dwight Eisenhower championed expanded access to housing and health care for service members and their families in his 1954 State of the Union address. Improvements to both followed in the years ahead.
Nevertheless, as the Cold War set in, the suicide rate gradually rose until it stabilized at 10 to 15 per 100,000 people, less than half the pre-World War II rate, where it remained for more than 50 years, except for a brief spike following the end of the draft and the shift to an all-volunteer force. That spike coincided with the end of the Vietnam War, when U.S. Army morale plummeted and alcoholism, drug use and insubordination rose service-wide. Antiwar sentiment also rose among civilians.
The suicide spike proved to be a temporary aberration, however, as President Ronald Reagan’s inauguration brought with it a vision for the U.S. military that brimmed with patriotic ebullience. Reagan increased military spending, boosted the Defense Department’s technical prowess and celebrated service members as the best the United States had to offer. The Vietnam War had left Americans with an unsavory aftertaste; even here, Reagan offered something far more palatable: hate the war, honor the warrior. As the military enjoyed increased funding and a revitalized image, the suicide rate again fell to 10 to 15 per 100,000.
This relatively stable paradigm lasted until the beginning of the 21st century and the dawn of Operations Iraqi Freedom and Enduring Freedom, when the suicide rate increased once more, eventually spiking at 29.7 per 100,000 in 2012. By February 2007, medical cost-cutting and rising numbers of traumatic brain injuries and post-traumatic stress disorder diagnoses had overwhelmed the military.
An exposé by The Washington Post’s Dana Priest and Anne Hull on conditions at Walter Reed Army Medical Center revealed an underfunded, understaffed, overwhelmed and dilapidated medical program at the military’s once-proud standard-bearer of medicine. The quality of care had dropped off during the 1990s and 2000s as the military tried to slash costs through benefit cuts and contract staffing. Contract staffing lowered expenses but reduced direct oversight of the quality of care.
While the Army’s active-duty suicide rate has dropped from the 2012 peak, it has remained around 20 to 30 per 100,000. In the past, periods of war seem to have lowered suicide rates, but that correlation inverted, first during the decades-long conflict in Vietnam and again during the nearly two-decade wars in Afghanistan and Iraq. Correlation is not causation, but the current elevated rates appear to be indicative of a bedeviling new paradigm.
Examining historical patterns can assist policymakers and the military in addressing the factors driving this new paradigm. For example, we must now ask ourselves: What is different and unique about the war on terrorism? Why, despite the best efforts of the Defense Department, modern psychiatry and dramatically expanded mental health programs, have suicide rates risen rather than fallen? What is different about today’s force compared with yesterday’s?
One possible answer: the change in who shoulders the costs of war. Less than one percent of the nation’s population serves in the military, and for most people, the current military conflict is a distant and remote phenomenon that is easily put out of mind. Civilians are not asked to ration, pay increased taxes, endure mobilization or see their children face the prospect of being drafted, as they did in previous wars. The blood of the few now pays for the freedom of the many.
While we venerate our service members, U.S. society has elevated them to the status of remote icons and rendered war a spectator sport. Thanks in part to this remoteness, the public has willingly countenanced extended war, which has compelled the Army to reduce enlistment standards to meet recruitment goals while also seeking cuts to ballooning health-care costs that draw against funds used to maintain military readiness. That means a third of new Army recruits have preexisting mental illnesses and are simply more vulnerable at the outset of service than recruits have been in the past. As a result, the Army, and the military as a whole, are playing catch-up in mental health care.
How to tackle this problem? The Defense Department’s November 2017 instruction codified mental health policies pertaining to suicide across all services, and although it was a good start, Defense should also look to replicate the strategies that paid dividends in the past. That means going beyond the military and mental-health elements of soldiers’ lives by making educational, legal and financial experts readily available. Doing so would support their personal lives and careers while securing their financial futures.
We frequently consider service members the best the nation has to offer, and their deaths by suicide do not fit into our triumphant narrative of heroic military service. Beyond the civil-military divide stands a volunteer force that is the best the nation has ever fielded. However, with the end of the draft, the increase in long wars, enlistment shortages, lowered recruitment standards and the erosion of military benefits, death by suicide has risen. Given the recent developments in the Middle East, it appears this trend will not end anytime soon. Perhaps at the start of the new decade, knowledge of our shared past can inform our future solutions.
(Source: Jeffrey Allen Smith, Michael Doidge, Ryan Hanoa and B. Christopher Frueh, The Washington Post, Jan. 17, 2020)