Agence Global

Arnold R. Isaacs, “Moral injury and America’s endless conflicts”

December 3, 2019 - TomDispatch

When an announcement of a “Moral Injury Symposium” turned up in my email, I was a bit startled to see that it came from the U.S. Special Operations Command. That was a surprise because many military professionals have strongly resisted the term “moral injury” and rejected the suggestion that soldiers fighting America’s wars could experience moral conflict or feel morally damaged by their service.

Moral injury is not a recognized psychiatric diagnosis. It’s not on the Veterans Administration’s list of service-related disabilities. Yet in the decade since the concept began to take root among mental health specialists and others concerned with the emotional lives of active-duty soldiers and military veterans, it has come to be fairly widely regarded as “the signature wound of today’s wars,” as the editors of War and Moral Injury: A Reader, a remarkable anthology of contemporary and past writings on the subject, have noted.

For those not familiar with the tag, moral injury is related to but not the same as post-traumatic stress disorder, or PTSD, which is a recognized clinical condition. Both involve some of the same symptoms, including depression, insomnia, nightmares, and self-medication via alcohol or drugs, but they arise from different circumstances. PTSD symptoms are a psychological reaction to an experience of life-threatening physical danger or harm. Moral injury is the lasting mental and emotional result of an assault on the conscience — a memory, as one early formulation put it, of “perpetrating, failing to prevent, or bearing witness to acts that transgress deeply held moral beliefs and expectations.”

The idea remains controversial in the military world, but the wars that Americans have fought since 2001 — involving a very different experience of war fighting from that of past generations — have made it increasingly difficult for military culture to cling to its old manhood and warrior myths. Many in that military have had to recognize the invisible wounds of moral conflict that soldiers have brought home with them from those battlefields.

That shift was evident at the moral injury symposium, held in early August in a Washington, D.C., hotel. The feelings and experiences I heard about there were not necessarily representative of the climate in the wider military community. The special operations forces, which put on the event, have their own distinctive character, culture, and experiences, and a disproportionate number of the 130 or so attendees were mental-health specialists or chaplains, the two groups that have been most open and attuned to the very idea of moral injury. (A military chaplain in the Special Operations Command, in fact, first had the idea for the symposium.)

Still, the symposium emerged from the same history the rest of the military has lived through: 18 years of uninterrupted violence, of war without end in distant lands, that has killed or wounded some 60,000 Americans and a far greater number of foreign civilians, while displacing millions more and helping drive the worldwide refugee population to successive record-setting levels. Against that backdrop, those two days in Washington proved gripping and thought provoking in their own right. What follows are some of the thoughts they provoked in my mind as I listened or when I later reflected on what I heard.

Something said, something unsaid

In the sessions I attended, virtually every speaker mentioned one relevant fact about our present wars and the soldiers who fight them. But a different relevant fact on the same subject was almost completely missing.

Again and again, participants spoke about the great change in how soldiers experience war. In past generations, for the great majority of service members, war was a one-time event. In the 18 years since 9/11 and the invasion of Afghanistan, war has become a permanent part of soldiers’ lives in a continuing cycle of repeated deployments to battle zones. (And that’s not to mention the even more startling change for those who see combat remotely, sitting in front of screens and firing missiles or dropping bombs from unmanned aircraft flying over targets thousands of miles away.) As nearly all the symposium speakers pointed out, that change in the war-fighting experience has also changed the nature of combat trauma and the military culture’s understanding of and attitudes toward it.

Here’s the reality that almost nobody mentioned, though it’s closely related: the reason these wars have lasted this long and have become a permanent part of soldiers’ lives is that they have not been successful. My notes record only one presentation where that connection was even touched upon, and then only implicitly, not directly.

That single indirect mention came in a discussion group conducted by Air Force Lieutenant Colonel David Blair, the commanding officer of a Florida-based remotely piloted aircraft squadron. He mentioned that his MQ-9 Reaper drone crews increasingly have come to prefer missions in theaters other than Afghanistan. Specifically, he said, they were most positive about strikes against ISIS in Iraq and Syria where they “could see the front lines moving.” (That suggests he was referring mainly to the 2016-2017 period when those Reapers were supporting American and Iraqi ground forces recapturing territory that had been under ISIS occupation.) Those missions led to “less trauma” for his operators, he said. At another point, he added that “if it [an engagement] ends well, they look back on their lives differently.”

Other than that single remark about his crews preferring missions in other theaters, Blair never made any explicit comparison between Afghanistan and any other conflict zone. However, what he did say sounds like plain common sense. It’s logical that when a military operation is relatively successful, it’s easier for soldiers to explain to themselves and live with their own actions. It must help mitigate moral injury symptoms, at the very least, if they can tell themselves that a greater good was accomplished.

Conversely, if you did something that leaves you with doubt or regret and achieved no positive results, that would lead to more painful feelings and less defense against them. So, in one way, it seems odd that, except in those few moments, I didn’t hear anyone make the connection between the lack of victory in America’s wars and the incidence of trauma.

On the other hand, it’s not so surprising that such connections were not made more often or more clearly. They would only have reminded the participants of an uncomfortable reality: that America’s wars in the present era have, on the whole, fallen far short of producing any greater good that would help justify the moral injury so many soldiers are struggling with, not to mention all the other human damage those wars have caused.

I can’t know their inner feelings, but I can guess that it would have been painful for many symposium participants to admit that fact out loud or to let themselves think it at all. Probably it wasn’t something the organizers would have liked to hear either or remember when they face troubled soldiers in the months and years to come.

Moral clarity versus moral injury

Another moment in that same session suggested a different but related link between the nature and circumstances of a military operation and the likelihood of trauma. This one had to do with the moral perception of the operation itself.

Since his crews are not physically at risk when carrying out their missions, Lieutenant Colonel Blair pointed out, the traditional “kill or be killed” formula of the battlefield can’t help them explain their war to themselves. Instead, the drone fighter’s explanation has to be “kill or someone else will be killed.” In turn, that determines not just what they do, but who they feel they are. “Being a protector of others,” Blair said, becomes their “core identity.”

A couple of quotes in a December 2017 article on an Air Force website show how the missions against ISIS strongly validated that identity — and, indirectly, suggest why operations in other theaters have not.

The article, which I found after the symposium ended, was a feature about a remotely piloted aircraft unit (not Blair’s) that supported the ground operation to recapture Raqqa, the Syrian provincial city that ISIS designated as the capital of its so-called caliphate. One quote is from a squadron commander: “It wasn’t our aircrew just striking ISIS targets. We also were safeguarding and watching over [friendly Syrian troops] as they cleared civilians moving out of the city to safe locations.” The article also quoted a sensor operator: “My favorite part of this job is that I’m able to help civilians be safe and I’m able to help liberate whatever city we need to. There’s no better feeling than knowing you can directly impact the battlefield and other people’s lives.”

Obviously, when their screens showed them the civilians they were helping, and not just the enemies they were killing, those crewmen found moral clarity, rather than moral conflict, in their experience. From Blair’s comments, one can surmise that was true for his crews as well, presumably for similar reasons.

Sadly, it is also pretty obvious that such a sense of clarity has been the exception, not the rule, in the wars Americans have been fighting for nearly two decades. That doesn’t automatically mean those wars were not moral, but whatever their moral nature, it would only rarely have shown up on the drone operators’ screens — or in the sightlines of soldiers looking at actual battlegrounds in real space — as clearly as it did for those airmen remembering their Raqqa missions. (Not that Raqqa raised no moral questions at all. Yes, the fighting there liberated its inhabitants from an exceptionally brutal occupation. But it also destroyed most of their homes, largely in air strikes by U.S. and allied planes that, by one estimate, dropped 20,000 bombs on the city. By the time the campaign was over, Raqqa, like a number of other Syrian and Iraqi cities, was in almost complete ruins.)

A question, maybe farfetched…

I didn’t frame it this way when I was at the symposium, but this question later came to mind: Has the U.S. military as an institution, not just its individual service members, morally injured itself over the last 18 years?

This is a military force that never stops declaring it’s the best and strongest in the world, but has not successfully concluded a significant war for nearly 30 years or maybe longer. (The first Gulf War of 1990-1991 looked like a great win at the time but, in retrospect, appears anything but an unequivocally positive accomplishment.) It may sound farfetched, but is it unreasonable to wonder if that dissonance, that wide gap between goals and actual accomplishments, might leave a collective sense of sorrow, grief, regret, shame, and alienation? That’s the list of feelings that Glenn Orris, a Navy chaplain, displayed on a chart in his symposium presentation and specified as the ones that keep morally injured service members awake at night.

I’m posing this as a question, not offering it as an answer. Certainly, at various moments during the symposium, I had a sense not just of individual but of collective trauma. As an outsider in that world, I can’t and won’t venture to evaluate the emotional state of the military as a whole. Still, the question doesn’t seem ridiculous.

A new idea of what moral injury really is

The final event of the second day — an unusual closer for a professional or academic conference — was a reading of Sophocles’ play Ajax, as rewritten by Bryan Doerries. After the reading, Doerries, artistic director for Theater of War, the company that put on the performance, moderated a discussion with a panel of four recent veterans and members of the audience.

Essentially, he attempted to draw out the panelists and the audience on what the play was trying to say and how that 2,500-year-old story of a warrior’s depression, madness, and suicide might connect to their own experience. Listening to various responses, I found myself thinking that perhaps the main purpose of his, if not Sophocles’s, version was to make the audience think about what war is. What it really is, not the heroic myth humans have made of it from ancient times on. And then I thought, maybe that’s what we’d been talking about for the previous two days. Maybe that’s what moral injury is: realizing the true nature of war.

Along with that thought came another, one that first occurred to me nearly 45 years ago when, as a reporter for the Baltimore Sun, I personally witnessed the disastrous end of the Vietnam War. I’ve believed ever since that covering war from the losing side gave me a truer knowledge of its nature than I’d have gotten from that or any other war’s winning side. Maybe I should say darker, not truer, since I suppose the winners’ war is real, too. But whichever word you choose, my experience, I felt, gave me a more unobstructed view of war. I could see it more clearly for what it was precisely because there was no good result to balance against the death and loss and terror and despair. There was no excuse to explain away the human disaster I’d seen and written about for several years, no way to tell myself that the war was necessary or had served any purpose.

That bit of personal history makes me think it’s not accidental that our present consciousness of moral injury has come out of wars we didn’t win. They haven’t been lost in the same clear-cut way that the war in Vietnam was. They haven’t (yet) ended in the kind of catastrophically decisive final act I witnessed there in the spring of 1975 in the weeks that led to Saigon’s surrender. But these recent wars haven’t accomplished their goals either, or given our soldiers a worthwhile reason for what they’ve gone through, which is surely a key piece of the moral injury story.

I was a civilian journalist, not a soldier. I went to Vietnam to report, not to fight. I didn’t come home with any trauma symptoms. But I have all the feelings that Chaplain Orris listed as identifying markers for moral injury: sorrow, grief, regret, shame, and alienation. Those emotions come from what I learned about war, not from anything I did, and that makes me believe it may not be wrong to think that what we call moral injury might not be just one person’s response to particularly troubling events, but a symptom of something larger, of seeing war individually and collectively for what it truly is.

A last thought

In closing, I will turn back to the editors of War and Moral Injury. In their introduction, Douglas Pryer, a retired army intelligence officer and Afghanistan and Iraq veteran, and Robert Emmett Meagher, a classicist and professor of humanities at Hampshire College, pointed to an aspect of war that is missing in their anthology, the symposium, and in American culture more broadly:

“We must acknowledge a great gap in this text as in nearly every other on the subject of America’s wars and veterans: the deaths and wounds, physical and spiritual, inflicted on the ‘others,’ our enemies, especially our ‘civilian enemies.'”

Pryer and Meagher are right. Such an acknowledgement is almost entirely absent from the national discourse about our wars and their legacy. But without it, no moral wound, whether an individual’s or a society’s, can truly be healed.

Arnold R. Isaacs is a journalist based in Maryland who writes regularly for TomDispatch (where this article originated). He covered the final years of the Vietnam War for the Baltimore Sun. He is the author of Without Honor: Defeat in Vietnam and Cambodia, Vietnam Shadows: The War, Its Ghosts, and Its Legacy, and an online report, From Troubled Lands: Listening to Pakistani and Afghan Americans in post-9/11 America. His website is www.arnoldisaacs.net

Copyright ©2019 Arnold R. Isaacs — distributed by Agence Global

—————-
Released: 03 December 2019
Word Count: 2,554
—————-

Andrea Mazzarino, “Bearing witness to the costs of war”

November 25, 2019 - TomDispatch

There is some incongruity between my role as an editor of a book about the costs of America’s wars and my identity as a military spouse. I’m deeply disturbed at the scale of human suffering caused by those conflicts and yet I’ve unintentionally contributed to the war effort through the life I’ve chosen.

I am the co-editor with Catherine Lutz of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan, a new volume of social science research from Brown University’s Costs of War Project. At the same time, I am a practicing therapist-in-training and I specialize in working with veterans who have post-traumatic stress disorder, or PTSD. Through the scholarly research I review and the veteran clients I have seen, I am committed professionally to bearing witness to the human costs of America’s forever wars, and to alleviating suffering where I can.

I am also married to a submarine officer in the Navy. We are so fortunate in so many ways. We have two beautiful children, pets, loving friends, and extended family. We both have graduate degrees. While our finances take hits from relocations without adequate job and childcare support, we don’t face the continuous fears that many military families experience when a loved one is sent into a war zone. In many respects, my family’s life does not look like that of most American military families profiled in my book.

And yet I have misgivings.

During one of my husband’s deployments, I was relieved to hear our 2-year-old son talk about war in a way that, despite his innocence, was more nuanced than the usual tales of “sacrifice,” “honor,” and “fighting terror” that one hears routinely in the mainstream media and in local command newsletters.

It was spring 2017 and we had just seen Kim Jong-un displaying one of North Korea’s new missiles on the TV news. Our son asked me what a war is. I gave my best explanation and his reply, undoubtedly garnered from preschool discussions about conflict resolution, was: “They don’t use words? They hit?”

Sort of, I told him. I did my best to explain what a weapon was, a description I suspect that many of my liberal mom friends would balk at. In our military community, however, such imagery is all around us. Real missiles and replicas are, for instance, often used as decorations lining the streets of naval bases or as lampposts or even wall hangings in military family households.

My son did his best to take it in. Later, at the waterfront near our home, he tossed a piece of his donut into the ocean and told me it was for his father who, he insisted, was under the water “playing hide-and-seek.” Of course, he doesn’t connect the relentless training and deployments characteristic of our military life with the fighting of war itself, though our family feels the strain and implicit sense of danger in our daily lives.

In writing my recent book on the costs of this country’s post-9/11 wars, I learned about Afghan war widows who use heroin to make it morally possible to live amid grief and poverty after seeing their spouses and children killed; about NGO workers who leave their own families, facing threats of kidnapping and death, to aid refugees in the Pakistani-Afghan borderlands. And I read about the experiences of the million war-wounded, ill, or traumatized American combat veterans, the sorts of patients my therapy will someday (I hope) help, who have sought health care and social support and so often come up desperately short.

As I do this, there’s always a low buzz of guilt somewhere in my gut, even about my own voluntary, unpaid work in support of other military spouses, even after I’ve relinquished travel assignments in my work as an activist that would have compromised my husband’s security clearance, even as I abide by harsh security restrictions in my personal life. I worry, in other words, about aiding the very military that, 18 years after the 9/11 attacks, still continues to rack up war’s costs without an end in sight.

The costs of war at home

I see firsthand trends affecting all military communities in the United States. Deployments during these wars have come more frequently and often last longer than in past American wars. The specter of death by suicide hangs over all our lives, because everyone in such communities knows someone who has died that way or has threatened to do so.

In 2012, for the first time in our history, American service members began to die by suicide at higher rates than civilians. Today, they are more likely to take their own lives than to perish in combat. As anthropologist Kenneth MacLeish points out, military suicides are most prevalent among those who have deployed to our war zones just once or not at all, or who left the military involuntarily with a “bad paper” or other-than-honorable discharge. Moreover, mental illness is rampant among active-duty military service members. According to the nonprofit National Alliance on Mental Illness, in 2014 roughly one in four active-duty service members showed signs of mental illness, including mood and trauma disorders such as PTSD, depression, and anxiety (though this figure is conservative, given that the study did not include the prevalence of traumatic brain injuries among combat vets). Many soldiers seek relief from the stresses of training and combat through alcohol and other drugs and, in our military community, it’s common knowledge that seeking professional support for such problems can place you at risk of social stigma.

And don’t forget military families either. Training and fighting both take a toll on us, too. What modest figures we have on the subject make the point. For example, as anthropologists Jean Scandlyn and Sarah Hautzinger point out in our book, among service members who entered the military between 1999 and 2008, the more months spent deployed, the more likely they were to divorce, with the vast majority of such divorces occurring soon after returning from deployments.

Local reports of domestic violence in military communities suggest that the problems leading to such divorces are only growing, though documentation on the subject is unreliable. It wasn’t until 2018 that, under pressure from Congress, the military made domestic violence a crime under its own legal code. Deployments of nine months or longer or frequent redeployments leave spouses at greater risk of depression, anxiety, and sleep problems, which, in turn, often affect the mental and physical health of their children as well.

Young children with deployed parents visit the doctor more frequently for behavioral health issues than those whose parents have not been deployed. Yet, as many spouses like me have discovered, community-based physicians are often unprepared to help in such situations, tending instead to blame the behavioral and mental-health issues of children on their parents or even on the children themselves, while not making referrals to services that could help (often, sadly, because there are none in the community).

“They were as hard off as me and I was killing them”

Such collective problems are, of course, experienced individually and I’ve felt many of them in my own life. My spouse, for instance, departed for sea tours at moments when most of our family’s ducks were anything but in a row, whether it was a matter of childcare, work schedules, my health needs, or our other family obligations. Our son, meanwhile, has had trouble sleeping because he is sad and scared for his dad, given what he hears in passing about Syria, North Korea, and — from other well-meaning military spouses and our own extended family — his own father’s attempts to “keep us safe” from unnamed others who might want to harm us.

I’m edgy and uneasy, knowing that my husband’s commander, a combat vet, has been angry at our family because I refused at one point to volunteer to work with a spouses group. When our house gets broken into, mid-deployment, and I’m alone with our toddler and pregnant, I wonder briefly if payback could have been involved before I dismiss the thought.

After I have our second child, a woman from the base with no mental-health or social-work training calls me weekly to ask about my baby’s health and safety. When I request that she stop, she refuses, telling me the same commander has ordered her to check in on each new mother in his command during deployment. I receive capitalized, hysterically punctuated emails from this woman warning all spouses not to jeopardize national security by talking to anyone about the submarine’s movements or, for that matter, emailing anything to our partners that they might find “distressing,” even details about a family member’s illness. Repeatedly, I am reminded that the U.S. is fighting a war on terror and our individual problems should never get in the way of that.

Things aren’t exactly a cakewalk between deployments either. It seems that, wherever I go, I find stigma, not support. For example, shortly after giving birth, I consulted a psychiatrist for help with post-partum depression. He was the only psychiatrist within 30 miles of our town who accepted military health insurance. Upon meeting me for the first time, he asked me to sign paperwork allowing him discretion to commit me to a psychiatric hospital “because military spouses often get psychotic during deployments.” I decided to tough it out rather than see him again.

And I try to keep in mind that my problems don’t add up to much, given the true costs of war out there. As a start, it’s a stretch to draw comparisons of any sort between an educated, white millennial family here and those who directly pay war’s costs like combat vets or, above all, civilians in Afghanistan, Iraq, and other American war zones. As my co-editor Catherine Lutz and others have shown, though, combat and the home front are connected in unexpected ways.

If you spend 18 years fighting wars you grossly underestimated how to pay for, if you embark upon those wars without first considering alternatives like diplomacy, if you assume that social support for this country’s wars and those fighting them will come from military families that are patriarchal ideals from the white 1950s, and if you imagine an enemy — terrorism — that could be anywhere at all any time at all, then you’re already in a battle that’s going to prove unwinnable and morally unnerving for everyone involved.

I obviously can’t speak for how people from groups in this country more vulnerable than mine think about our never-ending wars and their costs, but my guess is that at least some of them feel connections to those in the war zones far more intimately than I do, no matter how hard I try. I will never forget a neighbor of ours, a Mexican-American Vietnam vet whom I would find smoking on our street when I completed my daily runs. One evening, when we were chatting, he told me that what haunted him most was how many of the rural, poor Vietnamese he’d shot at looked more like him than most of the American officers in his unit. “They were as hard off as me and I was killing them,” he suddenly said, tears in his eyes. Among veterans, he’s not alone in feeling an affinity for those on the other side.

On bearing witness

When Catherine Lutz, Neta Crawford, and I first founded the Costs of War Project at Brown University in 2011, we took a close look at the kinds of public assumptions we wanted to upend. As a start, we wanted to show that, contrary to the Bush administration’s stated rationales for invading Afghanistan and then Iraq, Washington had not effectively protected human rights — not the rights to safety, liberty, or, for that matter, freedom of speech — nor had it brought “democracy” to those distant lands. Instead, by then, those countries had already seen spikes in gender-based violence and the deterioration of the most basic protections that led to everything from the collapse of prenatal care to the killing of civilians to the kidnapping of journalists, aid workers, and academics.

We wanted to go beyond the Pentagon’s focus on the deaths of American soldiers and focus instead on the tens of thousands of Afghan and Iraqi military deaths that had taken place and especially the soaring death rates of civilians in those lands. And, of course, we wanted to show that our grim wars should not be described in sterile terms via the usual imagery of families embracing upon a smiling service-member’s return or the by-then-familiar photographs of neat coffins draped with flags being carried out of planes by uniformed service members as spouses (usually white, female, and non-disabled) looked on sadly.

That, we knew, was not the essence of America’s already ongoing war on terror. My colleagues and I wanted people in this country to refocus on the staggering death and injury rates that only grew as the years passed, the ever-more-crippling ways in which all sides learned to kill and injure, and the long-term mental-health effects of arduous family separations.

A therapist mentor once taught me that, when working with veterans who have PTSD, I should, as he put it, “Ask them to start their story a little before they think it began and have them keep going even after they think it’s over.” My colleagues and I wanted to do that when it came to our wars, focusing not just on the obvious newsworthy photographs that tended to appeal to the American psyche, but on the missing context in which those photographs were taken. That’s the best way I can think of to describe the purpose of our new book (and our future work). None of us should stop trying to refocus in that way, not until America’s war story is declared over — and not even then, given how long the costs of war are likely to take to play out.

One sunny afternoon in May 2011, as Catherine Lutz and I sat in her office in Brown’s Anthropology Department sifting through media images for the initial launch of the Costs of War website, we happened upon a video of a screaming young Iraqi child with open burn wounds covering his face and body, a relative clutching him in her arms as they hustled through a crowd. Gunshots and explosions were audible in the background. The before, the after, the neighborhood where the violence was taking place, the weapons used, who was even fighting whom — none of that was evident from the clip.

For years, that image and the sound of that child has haunted me. Who was he? Did he get to the hospital? Was there even a hospital for him to get to? Would he ever go to school or play again? Who was the woman and what had her life been like before the American invasion of Iraq in 2003? What was it like now? What services could she access? Was she safe?

I think of this image when I wake up at night, when I hear patients describe the screams of children in war zones, when I hear my own children scream during tantrums. It’s like a nightmarish echo that spurs me to keep working because all of us, regardless of where we are, should be bearing witness to the costs of war until somebody in power decides to end the suffering.

Andrea Mazzarino co-founded Brown University’s Costs of War Project. She is an activist and social worker interested in the health impacts of war. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of the new book War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan. This article originally appeared at TomDispatch

Copyright ©2019 Andrea Mazzarino — distributed by Agence Global

—————-
Released: 25 November 2019
Word Count: 2,572
—————-

Naomi Oreskes, “How the energy companies took us all: The greatest scam in history”

November 11, 2019 - TomDispatch

It’s a tale for all time. What might be the greatest scam in history or, at least, the one that threatens to take history down with it. Think of it as the climate-change scam that beat science, big time.

Scientists have been seriously investigating the subject of human-made climate change since the late 1950s and political leaders have been discussing it for nearly as long. In 1961, Alvin Weinberg, the director of the Oak Ridge National Laboratory, called carbon dioxide one of the “big problems” of the world “on whose solution the entire future of the human race depends.” Fast-forward nearly 30 years and, in 1992, President George H.W. Bush signed the U.N. Framework Convention on Climate Change (UNFCCC), promising “concrete action to protect the planet.”

Today, with Puerto Rico still recovering from Hurricane Maria and fires burning across California, we know that did not happen. Despite hundreds of scientific reports and assessments, tens of thousands of peer-reviewed scientific papers, and countless conferences on the issue, man-made climate change is now a living crisis on this planet. Universities, foundations, churches, and individuals have indeed divested from fossil fuel companies and, led by a 16-year-old Swedish girl, citizens across the globe have taken to the streets to express their outrage. Children have refused to go to school on Fridays to protest the potential loss of their future. And if you need a measure of how long some of us have been at this, in December, the Conference of Parties to the UNFCCC will meet for the 25th time.

Scientists working on the issue have often told me that, once upon a time, they assumed, if they did their jobs, politicians would act upon the information. That, of course, hasn’t happened. Anything but, across much of the planet. Worse yet, science failed to have the necessary impact in significant part because of disinformation promoted by the major fossil-fuel companies, which have succeeded in diverting attention from climate change and successfully blocking meaningful action.

Making climate change go away

Much focus has been put on ExxonMobil’s history of disseminating disinformation, partly because of the documented discrepancies between what that company said in public about climate change and what its officials said (and funded) in private. Recently, a trial began in New York City accusing the company of misleading its investors, while Massachusetts is prosecuting ExxonMobil for misleading consumers as well.

If only it had just been that one company, but for more than 30 years, the fossil-fuel industry and its allies have denied the truth about anthropogenic global warming. They have systematically misled the American people and so purposely contributed to endless delays in dealing with the issue by, among other things, discounting and disparaging climate science, misrepresenting scientific findings, and attempting to discredit climate scientists. These activities are documented in great detail in How Americans Were Deliberately Misled about Climate Change, a report I recently co-authored, as well as in my 2010 book and 2014 film, Merchants of Doubt.

A key aspect of the fossil-fuel industry’s disinformation campaign was the mobilization of “third-party allies”: organizations and groups with which it would collaborate and that, in some cases, it would be responsible for creating.

In the 1990s, these allied outfits included the Global Climate Coalition, the Cooler Heads Coalition, Informed Citizens for the Environment, and the Greening Earth Society. Like ExxonMobil, such groups endlessly promoted a public message of denial and doubt: that we weren’t really sure if climate change was happening; that the science wasn’t settled; that humanity could, in any case, readily adapt at a later date to any changes that did occur; and that addressing climate change directly would wreck the American economy. Two of these groups — Informed Citizens for the Environment and the Greening Earth Society — were, in fact, AstroTurf organizations, created and funded by a coal industry trade association but dressed up to look like grass-roots citizens’ action organizations.

Similar messaging was pursued by a network of think tanks promoting free market solutions to social problems, many with ties to the fossil-fuel industry. These included the George C. Marshall Institute, the Cato Institute, the Competitive Enterprise Institute, the American Enterprise Institute, and the Heartland Institute. Often their politically motivated contrarian claims were presented in formats that made them look like the scientific reports whose findings they were contradicting.

In 2009, for instance, the Cato Institute issued a report that precisely mimicked the format, layout, and structure of the government’s U.S. National Climate Assessment. Of course, it made claims thoroughly at odds with the actual report’s science. The industry also promoted disinformation through its trade associations, including the American Legislative Exchange Council, the American Petroleum Institute, the U.S. Chamber of Commerce, the National Black Chamber of Commerce, and the National Association of Manufacturers.

Both think tanks and trade organizations have been involved in personal attacks on the reputations of scientists. One of the earliest documented was on climate scientist Benjamin Santer at the Lawrence Livermore National Laboratory who showed that the observed increase in global temperatures could not be attributed to increased solar radiation. He served as the lead author of the Second Assessment Report of the U.N.’s prestigious Intergovernmental Panel on Climate Change, or IPCC, responsible for the 1995 conclusion that “the balance of evidence suggests a discernible human impact on the climate system.” Santer became the target of a vicious, arguably defamatory attack by physicists from the George C. Marshall Institute and the Global Climate Coalition, who accused him of fraud. Other climate scientists, including Michael Mann, Jonathan Overpeck, Malcolm Hughes, Ray Bradley, Katharine Hayhoe, and, I should note, myself, have been subject to harassment, investigation, hacked emails, and politically motivated freedom-of-information attacks.

How to play climate change for a fool

When it came to industry disinformation, the role of third-party allies was on full display at the House Committee on Oversight hearings on climate change in late October. As their sole witness, the Republicans on that committee invited Mandy Gunasekera, the founder and president of Energy45, a group whose purpose, in its own words, is to “support the Trump energy agenda.”

Energy45 is part of a group known, bluntly enough, as the CO2 Coalition and is a perfect example of what I’ve long thought of as zombie denialism in which older players spouting industry arguments suddenly reappear in new forms. In this case, in the 1990s and early 2000s, the George C. Marshall Institute was a leader in climate-change disinformation. From 1974 to 1999, its director, William O’Keefe, had also been the executive vice president and later CEO of the American Petroleum Institute. The Marshall Institute itself closed in 2015, only to re-emerge a few years later as the CO2 Coalition.

The comments of Republican committee members offer a sense of just how deeply the climate-change disinformation campaign is now lodged in the heart of the Trump administration and congressional Republicans as 2019 draws to an end and the planet visibly heats. Consider just six of their “facts”:

1) The misleading claim that climate change will be “mild and manageable.” There is no scientific evidence to support this. On the contrary, literally hundreds of scientific reports over the past few decades, including the U.S. National Climate Assessments, have affirmed that any warming above 2 degrees Centigrade will lead to grave and perhaps catastrophic effects on “health, livelihoods, food security, water supply, human security, and economic growth.” The U.N.’s IPCC has recently noted that avoiding the worst impacts of global warming will “require rapid and far-reaching transitions in energy… infrastructure… and industrial systems.”

Recent events surrounding Hurricanes Sandy, Michael, Harvey, Maria, and Dorian, as well as the devastating wildfire at the ironically named town of Paradise, California, in 2018 and the fires across much of that state this fall, have shown that the impacts of climate change are already part of our lives and becoming unmanageable. Or if you want another sign of where this country is at this moment, consider a new report from the Army War College indicating that “the Department of Defense (DoD) is precariously unprepared for the national security implications of climate change-induced global security challenges.” And if the Pentagon isn’t prepared to manage climate change, it’s hard to imagine any part of the U.S. government that might be.

2) The misleading claim that global prosperity is actually being driven by fossil fuels. No one denies that fossil fuels drove the Industrial Revolution and, in doing so, contributed substantively to rising living standards for hundreds of millions of people in Europe, North America, and parts of Asia. But the claim that fossil fuels are the essence of global prosperity today is, at best, a half-truth because what is at stake here isn’t the past but the future. Disruptive climate change fueled by greenhouse gas emissions from the use of oil, coal, and natural gas now threatens both the prosperity that parts of this planet have already achieved and future economic growth of just about any sort. Nicholas Stern, the former chief economist of the World Bank and one of the foremost experts on the economics of climate change, has put our situation succinctly this way: “High carbon growth self-destructs.”

3) A misleading claim that fossil fuels represent “cheap energy.” Fossil fuels are not cheap. When their external costs are included — that is, not just the price of extracting, distributing, and profiting from them, but what it will cost in all our lives once you add in the fires, extreme storms, flooding, health effects, and everything else that their carbon emissions into the atmosphere will bring about — they couldn’t be more expensive. The International Monetary Fund estimates that the cost to consumers above and beyond what we pay at the pump or in our electricity bills already comes to more than $5 trillion annually. That’s trillion, not billion. Put another way, we are all paying a massive, largely unnoticed subsidy to the oil, gas, and coal industry to destroy our civilization. Among other things, those subsidies already “damage the environment, caus[e]… premature deaths through local air pollution, [and] exacerbat[e] congestion and other adverse side effects of vehicle use.”

4) A misleading claim about poverty and fossil fuels. That fossil fuels are the solution to the energy needs of the world’s poor is a tale being heavily promoted by ExxonMobil, among others. The idea that ExxonMobil is suddenly concerned about the plight of the global poor is, of course, laughable or its executives wouldn’t be planning (as they are) for significant increases in fossil-fuel production between now and 2030, while downplaying the threat of climate change. As Pope Francis, global justice leader Mary Robinson, and former U.N. Secretary General Ban Ki-Moon — as well as countless scientists and advocates of poverty reduction and global justice — have repeatedly emphasized, climate change will, above all, hurt the poor. It is they who will first be uprooted from their homes (and homelands); it is they who will be migrating into an increasingly hostile and walled-in world; it is they who will truly feel the heat, literal and figurative, of it all. A fossil-fuel company that cared about the poor would obviously not be committed, above all else, to pursuing a business model based on oil and gas exploration and development. The cynicism of this argument is truly astonishing.

Moreover, while it’s true that the poor need affordable energy, it is not true that they need fossil fuels. More than a billion people worldwide lack access (or, at least, reliable access) to electricity, but many of them also lack access to an electricity grid, which means fossil fuels are of little use to them. For such communities, solar and wind power are the only reasonable ways to go, the only ones that could rapidly and affordably be put in place and made available.

5) Misleading assertions about the costs of renewable energy. The cheap fossil fuel narrative is regularly coupled with misleading assertions about the allegedly high costs of renewable energy. According to Bloomberg News, however, in two-thirds of the world, solar is already the cheapest form of newly installed electricity generation, cheaper than nuclear, natural gas, or coal. Improvements in energy storage are needed to maximize the penetration of renewables, particularly in developed countries, but such improvements are happening quickly. Between 2010 and 2017, the price of battery storage decreased a startling 79% and most experts believe that, in the near future, many of the storage problems can and will be solved.

6) The false claim that, under President Trump, the U.S. has actually cut greenhouse gas emissions. Republicans have claimed not only that such emissions have fallen but that the United States under President Trump has done more to reduce emissions than any other country on the planet. One environmental reporter, who has described herself as “accustomed to hearing a lot of misinformation” about climate change, characterized this statement as “brazenly false.” In fact, U.S. CO2 emissions spiked in 2018, increasing by 3.1% over 2017. Methane emissions are also on the rise and President Trump’s proposal to roll back methane standards will ensure that unhappy trend continues.

Science isn’t enough

And by the way, when it comes to the oil companies, that’s just to start down a far longer list of misinformation and false claims they’ve been peddling for years. In our 2010 book, Merchants of Doubt, Erik Conway and I showed that the strategies and tactics used by Big Energy to deny the harm of fossil-fuel use were, in many cases, remarkably similar to those long used by the tobacco industry to deny the harm of tobacco use — and this was no coincidence. Many of the same PR firms, advertising agencies, and institutions were involved in both cases.

The tobacco industry was finally prosecuted by the Department of Justice, in part because of the ways in which the individual companies coordinated with each other and with third-party allies to present false information to consumers. Through congressional hearings and legal discovery, the industry was shown to have funded a wide range of activities intended to mislead the American people. Something similar has occurred with Big Energy and the harm fossil fuels are doing to our lives, our civilization, our planet.

Still, a crucial question about the fossil-fuel industry remains to be fully explored: Which of its companies have funded the activities of the trade organizations and other third-party allies who deny the facts about climate change? In some cases, we already know the answers. In 2006, for instance, the Royal Society of the United Kingdom documented ExxonMobil’s funding of 39 organizations that promoted “inaccurate and misleading” views of climate science. The Society was able to identify $2.9 million spent to that end by that company in the year 2005 alone. That, of course, was just one year and clearly anything but the whole story.

Nearly all of these third-party allies are incorporated as 501(c)(3) institutions, which means they must be non-profit and nonpartisan. Often they claim to be involved in education (though mis-education would be the more accurate term). But they are clearly also involved in supporting an industry — Big Energy — that couldn’t be more for-profit and they have done many things to support what could only be called a partisan political agenda as well. After all, by its own admission, Energy45, to take just one example, exists to support the “Trump Energy Agenda.”

I’m an educator, not a lawyer, but as one I can say with confidence that the activities of these organizations are the opposite of educational. The Heartland Institute, for instance, has explicitly targeted schoolteachers with disinformation. In 2017, the institute sent a booklet to more than 200,000 of them, repeating the oft-cited contrarian claims that climate science is still a highly unsettled subject and that, even if climate change were occurring, it “would probably not be harmful.” Of this booklet, the director of the National Center for Science Education said, “It’s not science, but it’s dressed up to look like science. It’s clearly intended to confuse teachers.” The National Science Teaching Association has called it “propaganda” and advised teachers to place their copies in the recycling bin.

Yet, as much as we know about the activities of Heartland and other third-party allies of the fossil-fuel industry, because of loopholes in our laws we still lack basic information about who has funded and sustained them. Much of the funding at the moment still qualifies as “dark money.” Isn’t it time for citizens to demand that Congress investigate this network, as it and the Department of Justice once investigated the tobacco industry and its networks?

ExxonMobil loves to accuse me of being “an activist.” I am, in fact, a teacher and a scholar. Most of the time, I’d rather be home working on my next book, but that increasingly seems like less of an option when Big Energy’s climate-change scam is ongoing and our civilization is, quite literally, at stake. When citizens are inactive, democracy fails — and this time, if democracy fails, as burning California shows, so much else could fail as well. Science isn’t enough. The rest of us are needed. And we are needed now.

Naomi Oreskes is professor of the history of science and affiliated professor of earth and planetary sciences at Harvard University. She is coauthor, with Erik Conway, of Merchants of Doubt. Her latest book is Why Trust Science? This article originated at TomDispatch.com.

Copyright ©2019 Naomi Oreskes — distributed by Agence Global

—————-
Released: 11 November 2019
Word Count: 2,849
—————-

Danny Sjursen, “Watching my students turn into soldiers of empire”

November 7, 2019 - TomDispatch

Patches, pins, medals, and badges are the visible signs of an exclusive military culture, a silent language by which soldiers and officers judge each other’s experiences, accomplishments, and general worth. In July 2001, when I first walked through the gate of the U.S. Military Academy at West Point at the ripe young age of 17, the “combat patch” on one’s right shoulder — evidence of a deployment with a specific unit — had more resonance than colorful medals like Ranger badges reflecting specific skills. Back then, before the 9/11 attacks ushered in a series of revenge wars “on terror,” the vast majority of officers stationed at West Point didn’t boast a right shoulder patch. Those who did were mostly veterans of modest combat in the first Gulf War of 1990-1991. Nonetheless, even those officers were regarded by the likes of me as gods. After all, they’d seen “the elephant.”

We young cadets arrived then with far different expectations about Army life and our futures, ones that would prove incompatible with the realities of military service in a post-9/11 world. When my mother — as was mandatory for a 17-year-old — put her signature on my future Army career, I imagined a life of fancy uniforms; tough masculine training; and maybe, at worst, some photo opportunities during a safe, “peace-keeping” deployment in a place like Kosovo.

Sure, the U.S. was then quietly starving hundreds of thousands of children with a crippling sanctions regime against autocrat Saddam Hussein’s Iraq, occasionally lobbing cruise missiles at “terrorist” encampments here or there, and garrisoning much of the globe. Still, the life of a conventional Army officer in the late 1990s did fit pretty closely with my high-school fantasies.

You won’t be surprised to learn, however, that the world of future officers at the Academy irreparably changed when those towers collapsed in my home town of New York. By the following May, it wasn’t uncommon to overhear senior cadets on the phone with girlfriends or fiancées explaining that they were heading for war upon graduation.

As a plebe (freshman), I still had years ahead in my West Point journey during which our world changed even more. Older cadets I’d known would soon be part of the invasion of Afghanistan. Drinking excessively at a New York Irish bar on St. Patrick’s Day in 2003, I watched in wonder as, on TV, U.S. bombs and missiles rained down on Iraq as part of Secretary of Defense Donald Rumsfeld’s promised “shock-and-awe” campaign.

Soon enough, the names of former cadets I knew well were being announced over the mess hall loudspeaker at breakfast. They’d been killed in Afghanistan or, more commonly, in Iraq.

My greatest fear then, I’m embarrassed to admit, was that I’d miss the wars in Iraq and Afghanistan. It wasn’t long after my May 28, 2005, graduation that I’d serve in Baghdad. Later, I would be sent to Kandahar, Afghanistan. I buried eight young men under my direct command. Five died in combat; three took their own lives. After surviving the worst of it with my body (though not my mind) intact, I was offered a teaching position back at my alma mater. During my few years in the history department at West Point, I taught some 300 or more cadets. It was the best job I ever had.

I think about them often, the ones I’m still in touch with and the majority whom I’ll never hear from or of again. Many graduated last year and are already out there carrying water for empire. The last batch will enter the regular Army next May. Recently, my mother asked me what I thought my former students were now doing or would be doing after graduation. I was taken aback and didn’t quite know how to answer.

Wasting their time and their lives was, I suppose, what I wanted to say. But a more serious analysis, based on a survey of U.S. Army missions in 2019 and bolstered by my communications with peers still in the service, leaves me with an even more disturbing answer. A new generation of West Point-educated officers, graduating a decade and a half after me, faces potential tours of duty in… hmm, Afghanistan, Iraq, or other countries involved in the never-ending American war on terror, missions that will not make this country any safer or lead to “victory” of any sort, no matter how defined.

A new generation of cadets serving the empire abroad

West Point seniors (“first-class cadets”) choose their military specialties and their first duty-station locations in a manner reminiscent of the National Football League draft. This is unique to Academy grads and differs markedly from the more limited choices and options available to the 80% of officers commissioned through the Reserve Officers Training Corps (ROTC) or Officer Candidate School (OCS).

Throughout the 47-month academy experience, West Pointers are ranked based on a combination of academic grades, physical fitness scores, and military-training evaluations. Then, on a booze-fueled, epic night, the cadets choose jobs in their assigned order of merit. Highly ranked seniors get to pick what are considered the most desirable jobs and duty locations (helicopter pilot, Hawaii). Bottom-feeding cadets choose from the remaining scraps (field artillery, Fort Sill, Oklahoma).

In truth, though, it matters remarkably little which stateside or overseas base one first reports to. Within a year or two, most young lieutenants in today’s Army will serve in any number of diverse “contingency” deployments overseas. Some will indeed be in America’s mostly unsanctioned wars abroad, while others will straddle the line between combat and training in, say, “advise-and-assist” missions in Africa.

Now, here’s the rub: given the range of missions that my former students are sure to participate in, I can’t help but feel frustration. After all, it should be clear 18 years after the 9/11 attacks that almost none of those missions have a chance in hell of succeeding. Worse yet, the killing my beloved students might take part in (and the possibility of them being maimed or dying) won’t make America any safer or better. They are, in other words, doomed to repeat my own unfulfilling, damaging journey — in some cases, on the very same ground in Iraq and Afghanistan where I fought.

Consider just a quick survey of some of the possible missions that await them. Some will head for Iraq — my first and formative war — though it’s unclear just what they’ll be expected to do there. ISIS has been attritted to a point where indigenous security forces could presumably handle the ongoing low-intensity fight, though American troops will undoubtedly assist in that effort. What they can’t do is reform a corrupt, oppressive Shia-chauvinist sectarian government in Baghdad that guns down its own protesting people, repeating the very mistakes that fueled the rise of the Islamic State in the first place. Oh, and the Iraqi government, and a huge chunk of Iraqis as well, don’t want any more American troops in their country. But when has national sovereignty or popular demand stopped Washington before?

Others are sure to join the thousands of servicemen still in Afghanistan in the 19th year of America’s longest ever war — and that’s even if you don’t count our first Afghan War (1979-1989) in the mix. And keep in mind that most of the cadets-turned-officers I taught were born in 1998 or thereafter and so were all of three years old or younger when the Twin Towers crumbled.

The first of our wars to come from that nightmare has always been unwinnable. All the Afghan metrics — the U.S. military’s own “measures for success” — continue to trend badly, worse than ever in fact. The futility of the entire endeavor borders on the absurd. It makes me sad to think that my former officemate and fellow West Point history instructor, Mark, is once again over there. Along with just about every serving officer I’ve known, he would laugh if asked whether he could foresee — or even define — “victory” in that country. Take my word for it, after 18-plus years, whatever idealism might once have been in the Army has almost completely evaporated. Resignation is what remains among most of the officer corps. As for me, I’ll be left hoping against hope that someone I know or taught isn’t the last to die in that never-ending war from hell.

My former cadets who ended up in armor (tanks and reconnaissance) or ventured into the Special Forces might now find themselves in Syria — the war President Trump “ended” by withdrawing American troops from that country, until, of course, almost as many of them were more or less instantly sent back in. Some of the armor officers among my students might even have the pleasure of indefinitely guarding that country’s oil fields, which — if the U.S. takes some of that liquid gold for itself — might just violate international law. But hey, what else is new?

Still more — mostly intelligence officers, logisticians, and special operators — can expect to deploy to any one of the dozen or so West African or Horn of Africa countries that the U.S. military now calls home. In the name of “advising and assisting” the local security forces of often autocratic African regimes, American troops still occasionally, if quietly, die in “non-combat” missions in places like Niger or Somalia.

None of these combat operations have been approved, or even meaningfully debated, by Congress. But in the America of 2019 that doesn’t qualify as a problem. There are, however, problems of a more strategic variety. After all, it’s demonstrably clear that, since the founding of the U.S. military’s Africa Command (AFRICOM) in 2008, violence on the continent has only increased, while Islamist terror and insurgent groups have proliferated in an exponential fashion. To be fair, though, such counterproductivity has been the name of the game in the “war on terror” since it began.

Another group of new academy graduates will spend up to a year in Poland, Romania, or the Baltic states of Eastern Europe. There, they’ll ostensibly train the paltry armies of those relatively new NATO countries — added to the alliance in foolish violation of repeated American promises not to expand eastward as the Cold War ended. In reality, though, they’ll be serving as provocative “signals” to a supposedly expansionist Russia. With the Russian threat wildly exaggerated, just as it was in the Cold War era, the very presence of my Baltic-based former cadets will only heighten tensions between the two over-armed nuclear heavyweights. Such military missions are too big not to be provocative, but too small to survive a real (if essentially unimaginable) war.

The intelligence officers among my cadets might, on the other hand, get the “honor” of helping the Saudi Air Force through intelligence-sharing to doom some Yemeni targets — often civilian — to oblivion thanks to U.S.-manufactured munitions. In other words, these young officers could be made complicit in what’s already the worst humanitarian disaster in the world.

Other recent cadets of mine might even have the ignominious distinction of being part of military convoys driving along interstate highways to America’s southern border to emplace what President Trump has termed “beautiful” barbed wire there, while helping detain refugees of wars and disorder that Washington often helped to fuel.

Yet other graduates may already have found themselves in the barren deserts of Saudi Arabia, since Trump has dispatched 3,000 U.S. troops to that country in recent months. There, those young officers can expect to go full mercenary, since the president defended his deployment of those troops (plus two jet fighter squadrons and two batteries of Patriot missiles) by noting that the Saudis would “pay” for “our help.” Setting aside for the moment the fact that basing American troops near the Islamic holy cities of the Arabian Peninsula didn’t exactly end well the last time around — you undoubtedly remember a guy named bin Laden who protested that deployment so violently — the latest troop buildup in Saudi Arabia portends a disastrous future war with Iran.

None of these potential tasks awaiting my former students is even remotely linked to the oath (to “support and defend the Constitution of the United States against all enemies, foreign and domestic”) that newly commissioned officers swear on day one. They are instead all unconstitutional, ill-advised distractions that benefit mainly an entrenched national security state and the arms-makers that go with it. The tragedy is that a few of my beloved cadets with whom I once played touch football, who babysat my children, who shed tears of anxiety and fear during private lunches in my office might well sustain injuries that will last a lifetime or die in one of this country’s endless hegemonic wars.

A nightmare come true

This May, the last of the freshman cadets I once taught will graduate from the Academy. Commissioned that same afternoon as second lieutenants in the Army, they will head off to “serve” their country (and its imperial ambitions) across the wide expanse of the continental United States and a broader world peppered with American military bases. Given my own tortured path of dissent while in that military (and my relief on leaving it), knowing where they’re heading leaves me with a feeling of melancholy. In a sense, it represents the severing of my last tenuous connection with the institutions to which I dedicated my adult life.

Though I was already skeptical and antiwar, I still imagined that teaching those cadets an alternative, more progressive version of our history would represent a last service to an Army I once unconditionally loved. My romantic hope was that I’d help develop future officers imbued with critical thinking and with the integrity to oppose unjust wars. It was a fantasy that helped me get up each morning, don a uniform, and do my job with competence and enthusiasm.

Nevertheless, as my last semester as an assistant professor of history wound down, I felt a growing sense of dread. Partly it was the realization that I’d soon return to the decidedly unstimulating “real Army,” but it was more than that, too. I loved academia and “my” students, yet I also knew that I couldn’t save them. I knew that they were indeed doomed to take the same path I did.

My last day in front of a class, I skipped the planned lesson and leveled with the young men and women seated before me. We discussed my own once bright, now troubled career and my struggles with my emotional health. We talked about the complexities, horror, and macabre humor of combat and they asked me blunt questions about what they could expect in their future as graduates. Then, in my last few minutes as a teacher, I broke down. I hadn’t planned this, nor could I control it.

My greatest fear, I said, was that their budding young lives might closely track my own journey of disillusionment, emotional trauma, divorce, and moral injury. The thought that they would soon serve in the same pointless, horrifying wars, I told them, made me “want to puke in a trash bin.” The clock struck 1600 (4:00 pm), class time was up, yet not a single one of those stunned cadets — undoubtedly unsure of what to make of a superior officer’s streaming tears — moved for the door. I assured them that it was okay to leave, hugged each of them as they finally exited, and soon found myself disconcertingly alone. So I erased my chalkboards and also left.

Three years have passed. About 130 students of mine graduated in May. My last group will pin on the gold bars of brand-new Army officers in late May 2020. I’m still in touch with several former cadets and, long after I did the same, students of mine are now driving down the dusty lanes of Iraq or tramping the narrow footpaths of Afghanistan.

My nightmare has come true.

Danny Sjursen writes regularly for TomDispatch (where this article originated). He is a retired U.S. Army major and former history instructor at West Point. He served tours with reconnaissance units in Iraq and Afghanistan, and now lives in Lawrence, Kansas. He has written a memoir of the Iraq War, Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge. Follow him on Twitter at @SkepticalVet and check out his podcast “Fortress on a Hill,” co-hosted with fellow vet Chris Henriksen.

Copyright ©2019 Danny Sjursen — distributed by Agence Global

—————-
Released: 07 November 2019
Word Count: 2,633
—————-

James Carroll, “What the dismantling of the Berlin Wall means 30 years later”

October 28, 2019 - TomDispatch

Some anniversaries are less about the past than the future. So it should be with November 9, 1989. In case you’ve long forgotten, that was the day when East and West Germans began nonviolently dismantling the Berlin Wall, an entirely unpredicted, almost unimaginable ending to the long-entrenched Cold War. Think of it as the triumph of idealistic hope over everything that then passed for hard-nosed “realism.” After all, Western intelligence services, academic Kremlinologists, and the American national security establishment had always blithely assumed that the Cold War would essentially go on forever — unless the absolute malevolence of Soviet Communism led to the ultimate mayhem of nuclear Armageddon. For almost half a century, only readily dismissed peaceniks insisted that, in the nuclear age, war and endless preparations for more of it were not the answer. When the Berlin Wall came down, such idealists were proven right, even if their triumph was still ignored.

Yet war-as-the-answer reasserted itself with remarkable rapidity. Within weeks of the Wall being breached by hope — in an era that saw savage conflicts in Central America, the Philippines, and South Africa transformed by a global wave of nonviolent resolution — the United States launched Operation Just Cause, the invasion of Panama by a combat force of more than 27,000 troops. The stated purpose of that act of war was the arrest of Panama’s tinhorn dictator Manuel Noriega, who had initially come to power as a CIA asset. That invasion’s only real importance was as a demonstration that, even with global peace being hailed, the world’s last remaining superpower remained as committed as ever to the hegemony of violent force.

Who ended the Cold War?

While President George H.W. Bush rushed to claim credit for ending the Cold War, the Soviet Union’s Mikhail Gorbachev was the lynchpin of that historic conclusion. It was he who, in the dramatic autumn of 1989, repeatedly ordered Communist forces to remain in their barracks while throngs of freedom-chanters poured into the streets of multiple cities behind the Iron Curtain. Instead of blindly striking out (as the leaders of crumbling empires often had), Gorbachev allowed democratic demands to echo through the Soviet empire — ultimately even in Russia itself.

Yet the American imagination was soon overtaken by the smug fantasy that the U.S. had “won” the Cold War and that it was now a power beyond all imagining. Never mind that, in 1987, when President Ronald Reagan issued his famed demand in then still-divided Berlin, “Mister Gorbachev, tear down this wall,” the Soviet leader was already starting to do precisely that.

As the wall came down, the red-scare horrors that had disturbed American dreams for three generations seemed to dissolve overnight, leaving official Washington basking in triumphalism. The U.S. then wrapped itself in a self-aggrandizing mantle of virtue and power that effectively blinded this country’s political leadership to the ways the Cold War’s end had left them mired in an outmoded, ever more dangerous version of militarism.

After Panama, the self-styled “indispensable nation” would show itself to be hell-bent on unbridled — and profoundly self-destructive — belligerence. Deprived of an existential enemy, Pentagon budgets would decline oh-so-modestly (though without a “peace dividend” in sight) but soon return to Cold War levels. A bristling nuclear arsenal would be maintained as a “hedge” against the comeback of Soviet-style communism. Such thinking would, in the end, only empower Moscow’s hawks, smoothing the way for the future rise of an ex-KGB agent named Vladimir Putin. Such hyper-defensive anticipation would prove to be, as one wag put it, the insurance policy that started the fire.

Even as the disintegration of the once-demonized USSR was firmly underway, culminating in the final lowering of the hammer-and-sickle flag from the Kremlin on Christmas Day 1991, the United States was launching what would prove to be a never-ending and disastrous sequence of unnecessary Middle Eastern wars. They began with Operation Desert Storm, George H.W. Bush’s assault on Saddam Hussein’s Iraq in early 1991, after Iraq’s invasion of Kuwait the previous August. In American memory, that campaign, which crushed the Iraqi autocrat’s army and forced it out of Kuwait, would be a techno-war made in heaven with fewer than 200 U.S. combat deaths.

That memory, however, fits poorly with what was actually happening that year. An internationally mounted sanctions regime had already been on the verge of thwarting Hussein without the U.S.-led invasion — and, of course, what Bush the father began, Bush the son would, with his 2003 shock-and-awe recapitulation, turn into the permanent bedrock of American politics. 

As the 30th anniversary of the end of the Cold War approaches, it should be obvious that there’s been a refusal in the United States to reckon with a decades-long set of conflagrations in the Greater Middle East as the inevitable consequence of that first American war in the Gulf. Above all, Desert Storm, with its monumental victory parade in Washington, D.C., brought the Pentagon’s Cold War raison d’être back from the brink of obsolescence. That campaign and what followed in its wake guaranteed that violence would continue to occupy the heartlands of the U.S. economy, its politics, and its culture. In the process, the world-historic aspirations kindled by the miracle of the Berlin Wall’s dismantling would be thoroughly dashed. No wonder, so many years later, we hardly remember that November of hope — or the anniversary that goes with it.

Out of the memory hole

By revisiting its astonishing promise as the anniversary approaches, however, and by seeing it more fully in light of what made it so surprising, perhaps something of that vanished positive energy can still be retrieved. So let me call to mind the events of various earlier Novembers that make the point. What follows is a decade-by-decade retracing of the way the war machine trundled through recent history — and through the American psyche — until it was finally halted in a battle-scarred, divided city in the middle of Europe, stopped by an urge for peace that refused to be denied.

Let’s start with November 1939, only weeks after the German invasion of Poland that began what would become World War II. A global struggle between good and evil was just then kicking into gear. Unlike the previous Great War of 1914-1918, which was fought for mere empire, Hitler’s war was understood in distinctly Manichaean terms as both apocalyptic and transcendent. After all, the moral depravity of the Nazi project had already been laid bare when Jewish synagogues, businesses, and homes everywhere in Germany were subject to the savagery of Kristallnacht, or “the night of broken glass.” That ignition of what became an anti-Jewish genocide took place, as it happened, on November 9, 1938.

The good-versus-evil absolutism of World War II stamped the American imagination so profoundly that a self-righteous moral dualism survived not only into the Cold War but into Washington’s twenty-first-century war on terror. In such contests against enemies defined as devils, Americans could adopt the kinds of ends-justify-the-means strategies called for by “realism.” When you are fighting along what might be thought of as an axis of evil, anything goes — from deceit and torture to the routine sacrifice of civilians, whose deaths in America’s post-9/11 wars have approached a total of half a million. Through it all, we were assured of one certain thing: that God was on our side. (“God is not neutral,” as George W. Bush put it just days after the 9/11 attacks.)

From genocide to omnicide

But what if God could not protect us? That was the out-of-the-blue question posed near the start of all this — not in August 1945 when the U.S. dropped its “victory weapon” on two cities in Japan, but in August 1949 when the Soviet Union acquired an atomic bomb, too. By that November, the American people were already in the grip of an unprecedented nuclear paranoia, which prompted President Harry Truman to override leading atomic scientists and order the development of what one called a “genocidal weapon,” the even more powerful hydrogen bomb. Then came the manic build-up of the U.S. nuclear arsenal to proportions suitable less for genocide than for “omnicide.” Such weapons mushroomed (if you’ll excuse the word in a potentially mushroom-clouded world) from fewer than 200 in 1950 to nearly 20,000 a decade later. Of course, that escalation, in turn, drove Moscow forward in a desperate effort to keep up, leading to an unhinged arms race that turned the suicide of the human species into a present danger, one measured by the Doomsday Clock of the Bulletin of the Atomic Scientists, which was set at two minutes to midnight in 1953 — and then again in 2019, all these Novembers later.

Now, let’s flash forward another decade to November 1959 when the mortal danger of human self-extinction finally became openly understood, as Soviet leader Nikita Khrushchev began issuing blatant threats of nuclear war over — you guessed it — Berlin. Because part of that city, far inside Communist East Germany, was still occupied by American, French, and British forces, it amounted to a tear in what was then called the Iron Curtain, separating the Soviet empire from Western Europe. With thousands fleeing through that tear to the so-called Free World, the Soviets became increasingly intent on shutting the escape hatch, threatening to use the Red Army to drive the Allies out of Berlin. That brought the possibility of a nuclear conflict to the fore.

Ultimately, the Communists would adopt a quite different strategy when, in 1961, they built that infamous wall, a concrete curtain across the city. At the time, Berliners sometimes referred to it, with a certain irony, as the “Peace Wall” because, by blocking escape from the East, it made the dreaded war between the two Cold War superpowers unnecessary. Yet within a year the unleashed prospect of such a potentially civilization-ending conflict had hopscotched the globe to Communist Cuba. The Cuban Missile Crisis of 1962 caused the world to shudder as incipient nuclear war between Washington and Moscow suddenly loomed. That moment, just before Khrushchev and American President John F. Kennedy stepped back from doomsday, might have changed something; a relieved world’s shock of recognition, that is, might have thrown the classic wooden shoe of sabotage into the purring engine of “realism.” No such luck, however, as the malevolent power of the war state simply motored on — in the case of the United States directly into Vietnam.

By November 1969, President Richard Nixon’s cynical continuation of the Vietnam War for his own political purposes had already driven the liberal-conservative divide over that misbegotten conflict into the permanent structure of American politics. The ubiquitous “POW/MIA: You Are Not Forgotten” flag survives today as an icon of Nixon’s manipulations. Still waving over ball parks, post offices, town halls, and VFW posts across the nation, that sad black banner now flies as a symbol of red state/blue state antagonism — and as a lasting reminder of how we Americans can make prisoners of ourselves.

By 1979, with the Vietnam War in the past, President Jimmy Carter showed how irresistible November’s tide — the inexorable surge toward war — truly was. It was in November of that year that militant Iranian students overran the American embassy in Tehran, taking sixty-six Americans hostage — the event that was credited with stymying the formerly peace-minded president. In reality, though, Carter had already initiated the historic anti-Soviet arms build-up for which President Ronald Reagan would later be credited.

Then, of course, Carter would ominously foreshadow America’s future reversals in the deserts of the Levant with a failed rescue of those hostages. Most momentously, however, he would essentially license future Middle East defeats with what came to be known as the Carter Doctrine — the formally declared principle that the Persian Gulf (and its oil) were “vital interests” of this country, worthy of defense “by any means necessary, including military force.” (And of course, his CIA would lead us into America’s first Afghan War, still in a sense going on some 40 years later.)

Retrieving hope?

Decade by decade, the evidence of an unstoppable martial dynamic only seemed to accumulate. In that milestone month of November 1989, Washington’s national security “realists” were still stuck in the groove of such worst-case thinking. That they were wrong, that they would be stunned by the fall of the Berlin Wall and the subsequent implosion of the Soviet Union, should mandate thoughtful observance of this coming 30th anniversary.

During the late 1980s, a complex set of antiwar and antinuclear countercurrents seemed to come out of nowhere. Each of them should have been impossible. The ruthlessly totalitarian Soviet system should not have produced in Mikhail Gorbachev a humane statesman who sacrificed empire and his own career for the sake of peace. The most hawkish American president in history, Ronald Reagan, should not have responded to Gorbachev by working to end the arms race with him — but he did.

Pressuring those two leaders to pursue that course — indeed, forcing them to — was an international grassroots movement demanding an end to apocalyptic terror. People wanted peace so much, as President Dwight D. Eisenhower had predicted in 1959, that, miracle of all miracles, governments got out of their way and let them have it. With the breaching of the Berlin Wall that November 9th — a transformation accomplished by ordinary citizens, not soldiers — the political realm of the possible was substantially broadened, not only to include prospective future detente among warring nations, but an eventual elimination of nuclear weapons themselves.

Yet, in November 2019, all of that seems lost. A new Cold War is underway, with East-West hostilities quickening; a new arms race has begun, especially as the United States renounces Reagan-Gorbachev arms-control agreements for the sake of a trillion-plus dollar “modernization” of its nuclear arsenal. Across the globe, democracy is in retreat, driven by pressures from both populist nationalism and predatory capitalism. Even in America, democracy seems imperiled. And all of this naturally prompts the shudder-inducing question: Were the worst-case realists right all along?

This November anniversary of the dismantling of the Berlin Wall should offer an occasion to say no to that. The Wall’s demise stopped in its tracks the demonic dynamic set in motion on the very same date in 1938 by that Kristallnacht. If idealistic hope could so triumph once, it can so triumph again, no matter what the die-hard realists of our moment may believe. I’ve referred to that November in Berlin as a miracle, but that is wrong. The most dangerous face-off in history ended not because of the gods or good fortune, but because of the actions and efforts of human beings. Across two generations, countless men and women — from anonymous community activists and union organizers to unsung military officials, scientists, and even world leaders — overcame the seemingly endless escalations of nuclear-armed animus to make brave choices for peace and against a war of annihilation, for life and against death, for the future and against the doom-laden past.

It can happen again. It must.

James Carroll writes regularly for TomDispatch (where this article originated). He is a former Boston Globe columnist, the author of 20 books, most recently the novel The Cloister. His history of the Pentagon, House of War, won the PEN-Galbraith Award. His Vietnam War memoir, An American Requiem, won the National Book Award. He is a fellow of the American Academy of Arts and Sciences.

Copyright ©2019 James Carroll — distributed by Agence Global

—————-
Released: 28 October 2019
Word Count: 2,476
—————-

William J. Astore, “Killing me softly with militarism”

October 24, 2019 - TomDispatch

When Americans think of militarism, they may imagine jackbooted soldiers goose-stepping through the streets as flag-waving crowds exult; or, like our president, they may think of enormous parades featuring troops and missiles and tanks, with warplanes soaring overhead. Or nationalist dictators wearing military uniforms encrusted with medals, ribbons, and badges like so many barnacles on a sinking ship of state. (Was Donald Trump only joking recently when he said he’d like to award himself a Medal of Honor?) And what they may also think is: that’s not us. That’s not America. After all, Lady Liberty used to welcome newcomers with a torch, not an AR-15. We don’t wall ourselves in while bombing others in distant parts of the world, right?

But militarism is more than thuggish dictators, predatory weaponry, and steely-eyed troops. There are softer forms of it that are no less significant than the “hard” ones. In fact, in a self-avowed democracy like the United States, such softer forms are often more effective, and more insidious, precisely because they seem so much less dangerous. Even in the heartland of Trump’s famed base, most Americans continue to reject nakedly bellicose displays like phalanxes of tanks rolling down Pennsylvania Avenue.

But who can object to celebrating “hometown heroes” in uniform, as happens regularly at sports events of every sort in twenty-first-century America? Or polite and smiling military recruiters in schools? Or gung-ho war movies like the latest version of Midway, timed for Veterans Day weekend 2019 and marking America’s 1942 naval victory over Japan, when we were not only the good guys but the underdogs?

What do I mean by softer forms of militarism? I’m a football fan, so one recent Sunday afternoon found me watching an NFL game on CBS. People deplore violence in such games, and rightly so, given the number of injuries among the players, notably concussions that debilitate lives. But what about violent commercials during the game? In that one afternoon, I noted repetitive commercials for SEAL Team, SWAT, and FBI, all CBS shows from this quietly militarized American moment of ours. In other words, I was exposed to lots of guns, explosions, fisticuffs, and the like, but more than anything I was given glimpses of hard men (and a woman or two) in uniform who have the very answers we need and, like the Pentagon-supplied police in Ferguson, Missouri, in 2014, are armed to the teeth. (“Models with guns,” my wife calls them.)

Got a situation in Nowhere-stan? Send in the Navy SEALs. Got a murderer on the loose? Send in the SWAT team. With their superior weaponry and can-do spirit, Special Forces of every sort are sure to win the day (except, of course, when they don’t, as in America’s current series of never-ending wars in distant lands).

And it hardly ends with those three shows. Consider, for example, this century’s update of Magnum P.I., a CBS show featuring a kickass private investigator. In the original Magnum P.I. that I watched as a teenager, Tom Selleck played the character with an easy charm. Magnum’s military background in Vietnam was acknowledged but not hyped. Unsurprisingly, today’s Magnum is proudly billed as an ex-Navy SEAL.

Cop and military shows are nothing new on American TV, but never have I seen so many of them, new and old, and so well-armed. On CBS alone you can add to the mix Hawaii Five-O (yet more models with guns updated and up-armed from my youthful years), the three NCIS (Naval Criminal Investigative Service) shows, and Blue Bloods (ironically starring a more grizzled and less charming Tom Selleck) — and who knows what I haven’t noticed? While today’s cop/military shows feature far more diversity with respect to gender, ethnicity, and race compared to hoary classics like Dragnet, they also feature far more gunplay and other forms of bloody violence.

Look, as a veteran, I have nothing against realistic shows on the military. Coming from a family of first responders — I count four firefighters and two police officers in my immediate family — I loved shows like Adam-12 and Emergency! in my youth. What I’m against is the strange militarization of everything, including, for instance, the idea, distinctly of our moment, that first responders need their very own version of the American flag to mark their service. Perhaps you’ve seen those thin blue line flags, sometimes augmented with a red line for firefighters. As a military veteran, my gut tells me that there should only be one American flag and it should be good enough for all Americans. Think of the proliferation of flags as another soft type of up-armoring (this time of patriotism).

Speaking of which, whatever happened to Dragnet’s Sergeant Joe Friday, on the beat, serving his fellow citizens, and pursuing law enforcement as a calling? He didn’t need a thin blue line battle flag. And in the rare times when he wielded a gun, it was a .38 Special. Today’s version of Joe looks a lot more like G.I. Joe, decked out in body armor and carrying an assault rifle as he exits a tank-like vehicle, maybe even a surplus MRAP from America’s failed imperial wars.

Militarism in the USA

Besides TV shows, movies, and commercials, there are many signs of the increasing embrace of militarized values and attitudes in this country. The result: the acceptance of a military in places where it shouldn’t be, one that’s over-celebrated, over-hyped, and given far too much money and cultural authority, while becoming virtually immune to serious criticism.

Let me offer just nine signs of this that would have been so much less conceivable when I was a young boy watching reruns of Dragnet:

1. Roughly two-thirds of the federal government’s discretionary budget for 2020 will, unbelievably enough, be devoted to the Pentagon and related military functions, with each year’s “defense” budget coming ever closer to a trillion dollars. Such colossal sums are rarely debated in Congress; indeed, they enjoy wide bipartisan support.

2. The U.S. military remains the most trusted institution in our society, so say 74% of Americans surveyed in a Gallup poll. No other institution even comes close, certainly not the presidency (37%) or Congress (which recently rose to a monumental 25% on an impeachment high). Yet that same military has produced disasters or quagmires in Afghanistan, Iraq, Libya, Syria, Somalia, and elsewhere. Various “surges” have repeatedly failed. The Pentagon itself can’t even pass an audit. Why so much trust?

3. A state of permanent war is considered America’s new normal. Wars are now automatically treated as multi-generational with little concern for how permawar might degrade our democracy. Anti-war protesters are rare enough to be lone voices crying in the wilderness.

4. America’s generals continue to be treated, without the slightest irony, as “the adults in the room.” Sages like former Secretary of Defense James Mattis (cited glowingly in the recent debate among 12 Democratic presidential hopefuls) will save America from unskilled and tempestuous politicians like one Donald J. Trump. In the 2016 presidential race, it seemed that neither candidate could run without being endorsed by a screaming general (Michael Flynn for Trump; John Allen for Clinton).

5. The media routinely embraces retired U.S. military officers and uses them as talking heads to explain and promote military action to the American people. Simultaneously, when the military goes to war, civilian journalists are “embedded” within those forces and so are dependent on them in every way. The result tends to be a cheerleading media that supports the military in the name of patriotism — as well as higher ratings and corporate profits.

6. America’s foreign aid is increasingly military aid. Consider, for instance, the current controversy over the aid to Ukraine that President Trump blocked before his infamous phone call, which was, of course, partially about weaponry. This should serve to remind us that the United States has become the world’s foremost merchant of death, selling far more weapons globally than any other country. Again, there is no real debate here about the morality of profiting from such massive sales, whether abroad ($55.4 billion in arms sales for this fiscal year alone, says the Defense Security Cooperation Agency) or at home (a staggering 150 million new guns produced in the USA since 1986, the vast majority remaining in American hands).

7. In that context, consider the militarization of the weaponry in those very hands, from .50 caliber sniper rifles to various military-style assault rifles. Roughly 15 million AR-15s are currently owned by ordinary Americans. We’re talking about a gun designed for battlefield-style rapid shooting and maximum damage against humans. In the 1970s, when I was a teenager, the hunters in my family had bolt-action rifles for deer hunting, shotguns for birds, and pistols for home defense and plinking. No one had a military-style assault rifle because no one needed one or even wanted one. Now, worried suburbanites buy them, thinking they’re getting their “man card” back by toting such a weapon of mass destruction.

8. Paradoxically, even as Americans slaughter each other and themselves in large numbers via mass shootings and suicides (nearly 40,000 gun deaths in 2017 alone), they largely ignore Washington’s overseas wars and the continued bombing of numerous countries. But ignorance is not bliss. By tacitly giving the military a blank check, issued in the name of securing the homeland, Americans embrace that military, however loosely, and its misuse of violence across significant parts of the planet. Should it be any surprise that a country that kills so wantonly overseas over such a prolonged period would also experience mass shootings and other forms of violence at home?

9. Even as Americans “support our troops” and celebrate them as “heroes,” the military itself has taken on a new “warrior ethos” that would once — in the age of a draft army — have been contrary to this country’s citizen-soldier tradition, especially as articulated and exhibited by the “greatest generation” during World War II.

What these nine items add up to is a paradigm shift as well as a change in the zeitgeist. The U.S. military is no longer a tool that a democracy funds and uses reluctantly. It’s become an alleged force for good, a virtuous entity, a band of brothers (and sisters), America’s foremost missionaries overseas and most lovable and admired heroes at home. This embrace of the military is precisely what I would call soft militarism. Jackbooted troops may not be marching in our streets, but they increasingly seem to be marching unopposed through — and occupying — our minds.

The decay of democracy

As Americans embrace the military, less violent policy options are downplayed or disregarded. Consider the State Department, America’s diplomatic corps, now a tiny, increasingly defunded branch of the Pentagon led by Mike Pompeo (celebrated by Donald Trump as a tremendous leader because he did well at West Point). Consider President Trump as well, who’s been labeled an isolationist, and his stunning inability to truly withdraw troops or end wars. In Syria, U.S. troops were recently redeployed, not withdrawn, not from the region anyway, even as more troops are being sent to Saudi Arabia. In Afghanistan, Trump sent a few thousand more troops in 2017, his own modest version of a mini-surge, and they’re still there, even as peace negotiations with the Taliban have been abandoned. That decision, in turn, led to a new surge (a “near record high”) in U.S. bombing in that country in September, naturally in the name of advancing peace. The result: yet higher levels of civilian deaths.

How did the U.S. increasingly come to reject diplomacy and democracy for militarism and proto-autocracy? Partly, I think, because of the absence of a military draft. Precisely because military service is voluntary, it can be valorized. It can be elevated as a calling that’s uniquely heroic and sacrificial. Even though most troops are drawn from the working class and volunteer for diverse reasons, their motivations and their imperfections can be ignored as politicians praise them to the rooftops. Related to this is the Rambo-like cult of the warrior and warrior ethos, now celebrated as something desirable in America. Such an ethos fits seamlessly with America’s generational wars. Unlike conflicted draftees, warriors exist solely to wage war. They are less likely to have the questioning attitude of the citizen-soldier.

Don’t get me wrong: reviving the draft isn’t the solution; reviving democracy is. We need the active involvement of informed citizens, especially resistance to endless wars and budget-busting spending on American weapons of mass destruction. The true cost of our previously soft (now possibly hardening) militarism isn’t seen only in this country’s quickening march toward a militarized authoritarianism. It can also be measured in the dead and wounded from our wars, including the dead, wounded, and displaced in distant lands. It can be seen as well in the rise of increasingly well-armed, self-avowed nationalists domestically who promise solutions via walls and weapons and “good guys” with guns. (“Shoot them in the legs,” Trump is alleged to have said about immigrants crossing America’s southern border illegally.)

Democracy shouldn’t be about celebrating overlords in uniform. A now widely accepted belief is that America is more divided, more partisan than ever, approaching perhaps a new civil war, as echoed in the rhetoric of our current president. Small wonder that inflammatory rhetoric is thriving and the list of this country’s enemies lengthening when Americans themselves have so softly yet fervently embraced militarism.

With apologies to the great Roberta Flack, America is killing itself softly with war songs.

A retired lieutenant colonel (USAF) and history professor, Astore writes regularly for TomDispatch, where this article originated. His personal blog is Bracing Views.

Copyright ©2019 William J. Astore — distributed by Agence Global

—————-
Released: 24 October 2019
Word Count: 2,242
—————-

Arnold R. Isaacs, “Making America crueler again”

October 22, 2019 - TomDispatch

On September 26th, President Donald Trump’s White House announced that, in 2020, refugee admissions to the United States will be limited to 18,000, drastically lower than any yearly ceiling over the past 40 years. Along with that announcement, the White House released a separate executive order intended to upend many years of precedent by giving state and local authorities the power to deny refugees resettlement in their jurisdictions.

Nine days later, Trump issued another directive ordering that new immigrant visas be restricted to those who can afford unsubsidized health insurance coverage or are affluent enough to pay for their own health-care costs. Meanwhile, his administration was heading into the final days of a planned timetable to implement new restrictions that would make it harder for needy immigrants to get a green card and work legally to support themselves and their families. That plan has been thwarted, at least temporarily, by orders from judges in three different federal courts.

Those separate but related actions are the latest pages in another dark chapter in the Trump administration’s anti-immigration binge. Together, they steer the U.S. government onto an even more heartless course, setting policies that will not just harm people directly covered by the new provisions but will cause significant collateral damage.

The local option to prevent resettlement will stir up anti-immigrant groups and inflame the national immigration debate, making that issue and the country’s racial divides even more toxic than they already are. In addition to keeping many more desperate people out of this country, the refugee cutback will harm organizations that help refugees already here and destroy Washington’s ability to persuade other countries to deal with the worldwide tidal wave of refugees displaced by wars and other catastrophes.

The new green card rules, if they overcome court challenges and go into effect, will greatly expand the grounds for finding that an applicant might become a “public charge.” That will deny legal employment to many of the most vulnerable immigrants and lead others to forgo badly needed benefits to which they are legally entitled — a trend already evident before those rules even take effect. Similarly, the new requirement that immigrants be capable of paying for health insurance will not just penalize foreign nationals applying for visas, but in many cases keep family members already in the U.S., including children and spouses, from reuniting with loved ones seeking to join them.

These policies have one more thing in common: none of them has anything to do with illegal immigration.

Refugees hoping for resettlement in the United States are not only seeking to enter the country legally but doing so through the most rigorous and time-consuming of all procedures for getting a visa. Those already here who could be excluded from a state or locality under the new regulations are lawfully in the country, not part of an “invasion” (as Trump calls it) of undocumented immigrants who have crossed the border illegally. Immigrants applying for green cards or visa applicants subject to the health insurance requirement are within the law by definition.

The new refugee ban, town by town

The “local option” giving state and local governments the right to block the resettlement of newly admitted refugees in their territory has been the least noticed of these new initiatives so far. It has, however, the potential for far-reaching, troubling, even dangerous effects. If the plan survives the expected court challenges and resettlement organizations have to get written approval from state and local authorities before placing new arrivals in specific locations, that could mobilize anti-immigrant activists across the country to put pressure on local officials, intensifying the politicization of refugee issues and galvanizing ugly forces in this society.

The heads of two of the nine national organizations that administer the State Department’s refugee resettlement program have been blunt in their criticism of the local option policy. It “shocks the conscience,” the Reverend John McCullough, CEO and president of Church World Service, declared in a statement. “This proposal would embolden racist officials to deprive refugees of their rights under U.S. law. This proposal is a slippery slope that takes our country backward. The ugly history of institutionalized segregation comes to mind.”

In a similar vein, Mark Hetfield, president of the Hebrew Immigrant Aid Society (HIAS), described the order as “in effect, a state-by-state, city-by-city refugee ban, and it’s un-American and wrong. Is this the kind of America we want to live in? Where local towns can put up signs that say ‘No Refugees Allowed’ and the federal government will back that?”

Details are still vague on how the local option program would work. Trump’s order calls for the State Department and the Department of Homeland Security to develop the lineaments of such a process within 90 days, so details may be forthcoming. On the essential points, though, its wording makes the order’s intent unequivocally clear.

A key passage states that resettlement agencies will have to get written permission from state and local authorities before placing any refugees in their jurisdiction; the burden, that is, will be on the agencies to get approval, not on local or state leaders to initiate an objection. In a curious provision, the document adds that the secretary of state “shall publicly release any written consents of States and localities to resettlement of refugees.” A decision to exclude refugees, however, can remain undisclosed.

Only President Trump and his advisers know whether the primary motive for such requirements was to make resettling refugees more politically fraught and potentially a more visible issue in the coming election season. But that is sure to be the result.

Krish O’Mara Vignarajah, president and CEO of Lutheran Immigration and Refugee Services (LIRS), is troubled by the prospect that the decisions of local authorities will only be publicized if they accept refugees, not if they refuse them — a twist that may tend to “stoke xenophobia,” she pointed out in an interview, and make it harder for communities to welcome refugees.

Matthew Soerens, who directs World Relief’s efforts to mobilize evangelical churches on refugee and immigration issues, voiced a similar concern. Mandating a public announcement when a jurisdiction decides to accept refugees will draw the attention of “people who maybe don’t want their state or local government receiving them,” Soerens said in an interview. Even if 70% of the people in a community support resettlement and only 30% object, “they can make an ugly political issue,” he added, possibly increasing the difficulty of bringing refugees into a community even when the authorities are in favor of resettling them. “We don’t want refugees to come into a situation where there’s been a big political circus about their arrival,” he added. Most residents may be welcoming, but “it only takes a few to make them feel uncomfortable and unsafe.”

Church World Service, HIAS, LIRS, and World Relief are four of nine national resettlement agencies. Six of them are faith-based. All nine have strongly criticized the new refugee ceiling as cruel, contrary to religious teachings of love and compassion, and against American values. (“Trump Puts Out Lady Liberty’s Torch” was the headline over the Church World Service’s statement.)

Worldwide refugees at a record high, U.S. relief at an unprecedented low

The unanimous criticism from those resettlement agencies reflects how deeply Trump’s latest decision will cut into future refugee relief efforts. The new ceiling of 18,000 represents less than one-fifth of the 95,000 yearly cap presidents have set, on average, since the present refugee law was enacted in 1980. Actual admissions, normally somewhat lower than the maximum allowed, are now guaranteed to fall far below the average annual rate over an even longer period dating back to the 1940s.

The number of Muslim refugees, in particular, has dropped in a stunning fashion since Donald Trump entered the White House, even though, as a recent study notes, four of the world’s five largest refugee crises affect Muslim populations. That report, compiled by the Refugee Council USA, highlights how startling the change was for the most deeply troubled countries between the last two fiscal years of Obama’s presidency, 2016 and 2017, and the first two full fiscal years of Trump’s, 2018 and 2019.

The report’s country-by-country figures show that refugee admissions from Iran dropped from 6,327 in 2016 and 2017 to 104 in 2018 and 2019. Admissions from Iraq — where the waiting list still includes thousands of Iraqis who worked for the U.S. government or military after the American invasion and occupation of that country — fell from 16,766 to 308, a 98% drop. For Somalia, the number went from 15,150 to 284; for Sudan, from 2,438 to 201; and for Syria, from 19,473 to 280. Altogether, the Refugee Council found that admissions of Muslim refugees had declined by 90% in the Trump era.

Hurting refugees — and those who help them

The cut in admissions doesn’t just harm refugees waiting to come to the United States but hurts those already here and the people who help them. Because the State Department gives the resettlement agencies a fixed amount of money for each individual they resettle, the sharp drop in admissions has meant deep cuts in their budgets. That, in turn, reduces their ability to help new arrivals fit into American society after their initial government-funded 90-day refugee benefits run out.

Since 2017, according to the Refugee Council study, the nine national agencies combined have closed 51 branch offices across the country. That means they can no longer help refugees in those communities find jobs or offer them language training or legal services, or assist them in enrolling children in school or obtaining public benefits they are lawfully entitled to.

If the rollercoaster keeps going downhill, says Krish Vignarajah of LIRS, it could destroy the network her organization has created in its 80-year history: “If we lose that infrastructure built over decades by faith communities, nonprofits, and local communities, that is going to take a very long time to replace.”

World Relief’s Soerens said his agency has closed seven offices since 2017, while halting refugee resettlement in several others, losing “really gifted, committed staff who have years and decades of experience.” When possible, World Relief and similar agencies have tried to close down branches in places where other resettlement agencies are still operating, but, of course, those agencies are now stretched to the limit as well.

Trump’s policies also damage the international response to the growing global refugee crisis. In sharp contradiction to the spirit of the 1980 Refugee Act, which states that “it is the policy of the United States to encourage all nations to provide assistance and resettlement opportunities to refugees to the fullest extent possible,” American influence under Trump has moved in exactly the opposite direction. Instead of providing moral leadership for international efforts to meet the crisis, his example has encouraged governments and political forces across the world that strongly resist more generous efforts. As a result, tens or hundreds of thousands of desperate refugees will be trapped in their suffering for years longer, waiting for relief that may never come.

Raising the “public charge” barrier

Another recent Trump initiative will potentially mean new hardships for a different category of immigrants who, like resettled refugees, are in the U.S. legally: non-citizens seeking the right to legal employment who may, in some cases, be subject to deportation if they can’t work.

That group, which includes many who are related to, or share households with, U.S. citizens, will face new barriers under a revised “public charge” rule that was scheduled to take effect this month until it was delayed by judicial rulings in three federal district courts. In those orders, handed down just four days before the October 15th effective date, federal judges in New York and Washington state temporarily blocked the rule nationwide, while a more limited ruling in California stayed its implementation in that state as well as in Maine, Pennsylvania, Oregon, and the District of Columbia, which were plaintiffs in the same lawsuit.

The new rule is aimed at making it tougher for green-card applicants to show that they will not be dependent on public benefits. Its weight would fall entirely on those in the applicant pool who are already the most needy and vulnerable. Women, children, the ill, and the elderly will be disproportionately affected, as will immigrants from poorer countries (who are also more likely to belong to racial minorities). Within those already disadvantaged groups, the poorer and more vulnerable someone is, the more likely she is to suffer adverse consequences.

That non-citizens should be denied permanent residence if they are “deemed likely” to depend on government benefits is a long-standing provision in U.S. immigration law, not a Trump-era invention. For many years, though, the “public charge” label was applied only to those receiving cash assistance through welfare, Social Security disability payments, or government-funded long-term institutional care. Under the new rules, immigrants seeking a green card or temporary employment status would be penalized for using — or just being judged likely to use — a long list of other benefits including food stamps, most Medicaid services, and various housing assistance programs, which were not previously held to define the recipient as a public charge.

Limited use of one of those benefits would not automatically disqualify an applicant, but would count as a “heavily weighted negative factor.” Low income, defined as less than 125% of the federal poverty guideline, would be another “heavily weighted” negative. Health and age could also count against an applicant.

Practically speaking, someone lawfully here could be sent home not only for using public benefits but simply for being more than 61 years old or having “a medical condition that is likely to require extensive medical treatment or institutionalization or that will interfere with the alien’s ability to provide care for himself or herself, to attend school, or to work.” Presumably, this means that someone legally in the U.S. who is blind or has some other physical disability would face a greater risk of deportation. Women would be at a significant disadvantage, an analysis by the Migration Policy Institute showed, because “they are less likely to be employed than men, generally live in larger households, and have lower incomes.”

A side effect of the new rules (noticeable since a draft was released more than a year ago) is that significant numbers of immigrants are now going without assistance to which they are legally entitled. Multiple studies have documented declining enrollments even in programs not covered in the new regulations or when benefits are going to the U.S.-born children of immigrants who are unquestionably eligible for them.

For example, the Agriculture Department’s special nutrition program for women, infants, and children, known as WIC, is explicitly excluded from the list of “benefits designated for consideration in public charge inadmissibility determinations.” But a recent Kaiser Family Foundation fact sheet reports that WIC agencies in a number of states have experienced “enrollment drops that they attribute largely to fears about public charge.” Investigations by the Urban Institute, Children’s Health Watch, and other organizations have found the same pattern in other programs.

A last thought

Taken as a whole, the latest Trump administration assaults on refugees and immigrants should shock the conscience — the words the Church World Service’s John McCullough used about the new local-option resettlement policy. Legally, they are not high crimes and misdemeanors as that phrase appears in the Constitution. In moral terms, though, it would not be an exaggeration to call them high crimes and misdemeanors against humanity. By any reasonable standard they are more morally repugnant and bring more suffering to more innocent people than any presidential phone call to Ukraine.

In Trumpian terms, think of it as MACA, or Making America Crueler Again — and again, and again. Closing the country’s doors to more refugees (particularly if they’re Muslim), encouraging bigots and xenophobes to mobilize to keep refugees out of their towns, making it harder for immigrants to stay and build new lives if they are old or poor or sick, raising a barrier of fear that keeps them away from food aid and health care they and their children need and have a right to — none of these are impeachable offenses. In a fairer and more humane country, perhaps they would be.

Journalist Arnold R. Isaacs writes regularly for TomDispatch, where this article originated. Based in Maryland, he has written widely on refugee and immigration issues. He is the author of From Troubled Lands: Listening to Pakistani and Afghan Americans in post-9/11 America and two books relating to the Vietnam War. His website is www.arnoldisaacs.net.

Copyright ©2019 Arnold R. Isaacs — distributed by Agence Global

—————-
Released: 22 October 2019
Word Count: 2,708
—————-

FEATURE—Rebecca Gordon, “Extorting Ukraine is bad enough but Trump has done much worse”

October 15, 2019 - TomDispatch

Recently a friend who follows the news a bit less obsessively than I do said, “I thought George W. Bush was bad, but it seems like Donald Trump is even worse. What do you think?”

“Well,” I replied, “in terms of causing death and destruction, I suspect Bush still has the edge.” In fact, the U.S.-led forever wars begun under the Bush administration have killed hundreds of thousands of people in Iraq and Afghanistan (almost half a million by one respected estimate). And those are only directly caused, violent deaths. Several times that many have reportedly died from hunger, illness, and infrastructure collapse.

Millions more have become refugees. The U.N. refugee agency (UNHCR) says that, worldwide, “[t]here are almost 2.5 million registered refugees from Afghanistan. They comprise the largest protracted refugee population in Asia and the second largest refugee population in the world.” The numbers for Iraq are even higher. UNHCR reports that 3.3 million Iraqis were displaced by the various conflicts that followed the U.S. invasion of 2003 (though most of them remain in-country). Eleven million people, a quarter of the population, still need humanitarian aid.

Things are so bad that, since early October, Iraqis in Baghdad and some other cities have united across sectarian lines to risk death and injury in demonstrations demanding changes from the government. As Reuters explains it:

After decades of war against its neighbors, U.N. sanctions, two U.S. invasions, foreign occupation, and sectarian civil war, the defeat of the Islamic State insurgency in 2017 means Iraq is now at peace and free to trade for the first extended period since the 1970s. Oil output is at record levels. But infrastructure is decrepit and deteriorating, war-damaged cities have yet to be rebuilt, and armed groups still wield power on the streets.

So much for Operation Iraqi Freedom. In terms of creating sheer human misery, George W. definitely has The Donald beat for now. But despite Trump’s frequently voiced desire “to get out of these ridiculous Endless Wars, many of them tribal, and bring our soldiers home,” he may yet do more harm than his Republican predecessor.

At the very least, he deserves impeachment as much as Bush did.

ITMFA

Back in 2006, when Bush was president, a reader of the gay sex-advice columnist and podcaster Dan Savage suggested a campaign to “Impeach the Mother-Fucker Already.” ITMFA was the mock acronym — a play on Savage’s frequent advice to readers in bad relationships that they should DTMFA (the “D” being for “dump”). In response, Savage had a bunch of ITMFA pins and buttons made and raised about $20,000, which he split between the American Civil Liberties Union (ACLU) and two Democratic senatorial campaigns.

In 2017, Savage again took stock of the country’s situation. “I didn’t think I’d see a worse president than George W. Bush in my lifetime. But here we are,” he wrote. So he added a new line of T-shirts, hats, and mugs to the ITMFA store, and sales have allowed him to donate more than $250,000 to the ACLU, Planned Parenthood, and the International Refugee Assistance Project.

Of course, Savage wasn’t the only one already talking about impeachment in 2017. That June, Representatives Brad Sherman (D-CA) and Al Green (D-TX) actually presented an impeachment resolution on the House floor. Its single Article of Impeachment accused President Trump of using the power of his office to “hinder and cause the termination of” the Justice Department’s investigation into Russian involvement in the 2016 election by threatening and ultimately firing FBI Director James Comey. It also cited Trump’s efforts to get Comey to “curtail” an investigation into Lt. General Michael Flynn who had briefly served as the president’s national security advisor. Flynn would later plead guilty to lying to the FBI about calls he made to Russia’s ambassador to the U.S. soon after Trump’s election victory.

Since October 2017, Representative Green has repeatedly introduced a different set of Articles focused on the president’s obvious and vocal racism:

“In his capacity as President of the United States… Donald John Trump has with his statements done more than insult individuals and groups of Americans, he has harmed the society of the United States, brought shame and dishonor to the office of President of the United States, sowing discord among the people of the United States by associating the majesty and dignity of the presidency with causes rooted in white supremacy, bigotry, racism, anti-Semitism, white nationalism, or neo-Nazism on one or more of the following occasions…”

The resolution goes on to list a number of Trump’s racist interventions, including calling some of the white supremacists and neo-Nazi protestors who marched in Charlottesville, Virginia (and one of whom murdered counter-demonstrator Heather Heyer by driving his car into a crowd), “very fine people”; sharing on social media anti-Muslim videos originally posted by Britain First, a minor English far-right party; attempting to prevent Muslims from entering the U.S. by executive order; attacking professional football players for taking a knee during the national anthem; accusing Puerto Ricans of throwing the U.S. “budget out of whack” in the aftermath of Hurricane Maria; and insulting Representative Frederica Wilson, an African American congresswoman, by calling her “wacky.”

The House has repeatedly rebuffed Green’s efforts, most recently in July 2019, when it voted 332-95 to table the measure, effectively killing it.

What a difference a couple of months can make.

Impeachment fever rising

As anyone who’s been paying attention knows, even with a 54% majority in the House of Representatives, the Democratic leadership long resisted calls to impeach the president, with Speaker Nancy Pelosi doing a masterful job of restraining the party’s left wing. Whatever I thought of her position on impeachment, I had to admire her consummate parliamentary skills. She happens to represent my congressional district, so I’ve been a Pelosi-watcher ever since I worked for her opponent in her first congressional primary in 1987.

Impeachment advocates had hoped this would change with the release of Special Counsel Robert Mueller’s report on Russian interference in the 2016 election. Although the report did document numerous presidential efforts to obstruct the inquiry, the special counsel declined to speculate on the question of Trump’s guilt, arguing that Justice Department rulings prohibit the indictment of a sitting president. Nevertheless, in his first public statement, Mueller made it clear that his team’s work did not exonerate the president: “If we had had confidence that the president clearly did not commit a crime, we would have said so.”

With their eyes on the 2020 election season, however, Democratic Party centrists continued to argue that, because Trump would inevitably survive a trial in the Republican-dominated Senate, impeachment was a futile exercise. Worse, it might well stir up the president’s base and so improve his chances of reelection.

That all changed this August with a whistleblower’s revelation that the president had used a July 25 telephone call to press Ukrainian President Volodymyr Zelensky to dig up dirt on Joe Biden and his son Hunter. At the time, Trump had, without explanation, also frozen $391 million in U.S. military aid previously appropriated by Congress to help Ukraine resist separatists and their Russian allies fighting on its territory.

Under pressure, the White House released a two-page synopsis of the call, thinking that this would calm things down. It had the opposite effect. In that document, which is not quite a transcript and might not be complete, Zelensky, a comedian elected president after playing that very role in a popular TV series, told Trump that Ukraine was “almost ready to buy more Javelins [U.S. anti-tank missiles] from the United States for defense purposes.”

Trump responded, “I would like you to do us a favor though, because our country has been through a lot and Ukraine knows a lot about it. I would like you to find out what happened with this whole situation with Ukraine, they say Crowdstrike…” (The ellipsis, which may or may not represent missing material, marks the end of his sentence.) Trump was referring to a discredited conspiracy theory in which a supposedly missing Democratic National Committee computer server, hacked by Russia during election 2016 according to the Mueller investigation, ended up in Ukraine. (There is, in fact, no missing server, here or in Ukraine.)

Later, Trump asked Zelensky to look into a previous Ukrainian government’s ousting of prosecutor Viktor Shokin for corruption. Specifically, he wanted his counterpart to check out the theory that then-Vice President Joe Biden engineered Shokin’s dismissal to protect his son, Hunter, who then held a seat on the board of Burisma, a natural gas company owned by a Ukrainian oligarch that was under investigation. It seems clear that Shokin really was corrupt and that Joe Biden’s role in his ouster was unremarkable. (It seems equally clear, as Matthew Yglesias writes at Vox, that the younger Biden “had no apparent qualifications for the job,” which paid up to $50,000 a month, “except that his father was the vice president and involved in the Obama administration’s Ukraine policy.”)

Finally, Donald Trump had done something bad enough — strong-arming a foreign leader into digging up dirt on a likely Democratic Party presidential candidate — to convince the House leadership to initiate impeachment proceedings. Trump had already openly called on Russia to release 33,000 supposedly missing Hillary Clinton emails during the 2016 election. He had now invited a second country to interfere in U.S. elections and then tripled down by publicly asking China to do the same. All of this should be enough to demonstrate that the president has violated his oath of office on multiple occasions. ITMFA.

High(er) crimes and misdemeanors

Extorting political favors is bad enough, but Donald J. Trump has done so much worse, even if his true highest crimes and misdemeanors aren’t ever likely to make it into the Articles of Impeachment finally sent to the Senate. These, to my mind, would include:

• Violating U.S. responsibilities toward refugees under international humanitarian law as defined in treaties and conventions this country long ago signed and ratified: In his behavior towards asylum-seekers and other migrants at our southern border, Trump, who began his 2015 election campaign by denouncing Mexicans as “drug dealers, criminals, and rapists,” has as president turned his back on decades of international consensus on the rights of refugees. He, of course, oversaw an administration that instituted a cruel policy of family separation of undocumented immigrants, causing thousands of children to be cut off from their parents. He also allowed such children to be held for weeks in stinking, filthy cages near the U.S. border.

More recently, he has pursued “safe third country” deals with the very nations — El Salvador, Honduras, Guatemala, and Mexico — that people are fleeing, in part, because of drug cartel violence and their governments’ inability, or unwillingness, to stop it. How can Mexico, for example, be a “safe” alternative for Salvadorans fleeing gang violence when its own citizens are seeking asylum in the United States for similar reasons?

He has also slashed to 18,000 the number of refugees allowed to enter the United States annually. (The ceiling was set at 110,000 in Barack Obama’s final year as president.) He has, in other words, caused the country to turn its back on its international responsibilities, as well as on millions of human beings in desperate need of help around the world.

• Unlike other wealthy people elected president, Donald Trump refused from the outset to put his assets in a blind trust, arguing that “conflicts of interest laws simply do not apply to the president.” The purpose of such a trust is to prevent officials from knowing whether actions they take will result in personal financial benefit. Instead, Trump retained ownership of all his assets through a revocable trust, run for his sole benefit by his own children, and about which he receives regular updates.

The Constitution’s “emoluments” clause prohibits federal office-holders from accepting “any present, Emolument, Office, or Title, of any kind whatever, from any King, Prince, or foreign State.” Nevertheless, Trump has continued to benefit personally from money spent by foreign governments at his hotels (and golf clubs), especially his still relatively new Trump International Hotel a few blocks from the White House.

And it’s not only foreign diplomats, domestic lobbyists, and the like who have felt obliged to patronize such Trump properties. On a recent visit to Ireland, Vice President Mike Pence chose to stay at the president’s Doonbeg hotel and golf club, a distant 181 miles from Dublin where his meetings were being held. But there was no presidential pressure involved, as Pence’s Chief of Staff Marc Short assured reporters: “I don’t think it was a request, like a command. I think that it was a suggestion.” (It’s always possible, of course, that a presidential suggestion carries more weight than your average TripAdvisor review.) The New York Times reports that Pence’s Great America Committee PAC has spent more than $225,000 at the Trump International Hotel, among other Trump properties, since 2017.

Not to be outdone by mere elected officials, the U.S. Air Force has acknowledged that it has lodged airplane crews at Trump’s Turnberry resort in Scotland at least 40 times since 2015, most of them since he was elected, at a cost of more than $184,000.

And undoubtedly such examples just scratch the surface of what a president who happens to be an international real-estate developer can rake in when he puts his mind to it.

• He has caused this country to unilaterally violate the Joint Comprehensive Plan of Action, an agreement that successfully confined Iran’s nuclear development to serving its domestic energy needs: In May 2018, the president rashly pulled the U.S. out of the Iran nuclear deal that the Obama administration had negotiated. This move has not only induced Iran to begin violating the terms of the agreement, but has destabilized the balance of power in the Middle East, leading to tit-for-tat vessel seizures and further inflaming relations between Iran and Saudi Arabia in dangerous ways. In September, for example, the Trump administration blamed Iran for a drone and missile attack that seriously damaged two key installations where much of Saudi Arabia’s oil is processed.

• His dishonest, vicious, and racially charged rhetoric has cheapened political discourse in this country and is helping to hollow out our democracy: Free conversation about political issues, including sharp disagreements, is essential to a democratic society. But such conversations are only possible when the people involved can assume that everyone will make a good faith effort to tell the truth as they see it, to argue honestly, and to respect each other’s right to participate in the conversation. The philosopher Jürgen Habermas has called this approach “discourse ethics” and it should be at the very heart of democratic life.

Trump, of course, is a specialist in telling lies (more than 12,000 of them during his presidency so far, according to the count of the Washington Post). When the head of a democratic nation routinely treats lying as if it were a kind of truth telling in disguise, it changes the rules of political conversation. How can you argue with someone who “trumps” you not with logic, but with “alternative facts”?

Add to that the president’s constant use of insults, especially racially charged ones, to rule some participants out of the conversation altogether. He typically employs adhesive nicknames to “prove” (without evidence) claims about his opponents’ failings (“Crooked Hillary [Clinton],” “Shifty [Adam] Schiff”). Many of his ugliest insults are directed at women of color; he has, for example, called African American congresswoman Maxine Waters “crazy” and claimed she has an “extraordinarily low IQ.” Perhaps most famously, he tweeted that four progressive Democrats (and women of color) known as “The Squad” should “go back” to where they came from:

“So interesting to see ‘Progressive’ Democrat Congresswomen, who originally came from countries whose governments are a complete and total catastrophe, the worst, most corrupt and inept anywhere in the world now loudly and viciously telling the people of the United States, the greatest and most powerful Nation on earth, how our government is to be run.”

Of course, the four (New York’s Alexandria Ocasio-Cortez, Minnesota’s Ilhan Omar, Massachusetts’s Ayanna Pressley, and Michigan’s Rashida Tlaib) are, in fact, part of “our government.” They are members of Congress. And by “countries whose governments are a complete and total catastrophe” Trump must have meant the United States, because that’s where three of them were born. The fourth, Ilhan Omar, was born in Somalia and is a naturalized U.S. citizen.

Remembering Robert Drinan

Thinking about Trump’s impending impeachment reminds me of one of my heroes, Robert Drinan, a Jesuit priest and congressman from Massachusetts in the Nixon years. He was the first in Congress to call for the president’s impeachment — not for the coverup of what the White House called “a third-rate burglary” of Democratic Party headquarters at the Watergate office building in Washington, but for what he considered a much worse crime: the multi-year secret carpet-bombing of Cambodia.

That bombing campaign had begun under President Lyndon Johnson, but it expanded in a staggering way in the Nixon years. According to Yale University’s Genocide Studies Program, the U.S. flew more than 231,000 sorties over 115,000 sites, dumping “half a million or more tons of munitions” on that country. National Security Advisor Henry Kissinger memorably relayed President Nixon’s orders on the subject to General Alexander Haig: “He wants a massive bombing campaign in Cambodia. He doesn’t want to hear anything. It’s an order, it’s to be done. Anything that flies on anything that moves.”

Drinan asked his colleagues in Congress, “Can we impeach a president for concealing a burglary but not for concealing a massive bombing?” Their answer was that they could, although Nixon resigned before the House could vote on its articles of impeachment.

I’m reminded of Robert Drinan now, because once again we’re threatening to impeach a president, this time for a third-rate attempt to extort minor political gain from the government of a vulnerable country (without even the decency of a cover-up). But we’re ignoring Trump’s highest crime, worse even than the ones mentioned above.

He has promised to withdraw the United States from the Paris climate accord, the 2015 international agreement that was meant to begin a serious international response to the climate crisis now heating the planet. Meanwhile, he’s created an administration that is working in every way imaginable to ensure that yet more greenhouse gases are released into the atmosphere. He is, in other words, a threat not just to the American people, or to the rule of law, but to the whole human species.

And for that he richly deserves to be impeached and convicted.

Rebecca Gordon writes regularly for TomDispatch (where this article originated) and teaches at the University of San Francisco. She is the author of American Nuremberg: The U.S. Officials Who Should Stand Trial for Post-9/11 War Crimes. Her previous books include Mainstreaming Torture: Ethical Approaches in the Post-9/11 United States and Letters from Nicaragua.

Copyright ©2019 Rebecca Gordon — distributed by Agence Global

—————-
Released: 15 October 2019
Word Count: 3,105
—————-

Tom Engelhardt, “Mulch Trump”

September 29, 2019 - TomDispatch

Look what Greta started and what she did to me! I took part in the recent climate-strike march in New York City — one of a quarter-million people (or maybe 60,000) who turned out there, along with four million others across all seven continents. Then I came home and promptly collapsed. Which tells you one thing: I’m not 16 years old like Greta Thunberg, the Swedish teen who almost singlehandedly roused a sleeping planet and is now described as “the Joan of Arc of climate change.” Nor am I the age of just about any of the demonstrators I stopped to chat with that afternoon, however briefly, while madly scribbling down their inventive protest signs in a little notebook.

But don’t think I was out of place either. After all, the kids had called on adults to turn out that day and offer them some support. They understandably wanted to know that someone — other than themselves (and a bunch of scientists) — was truly paying attention to the global toilet down which their future was headed. I’m 75 and proud to say that I was walking that Friday with three friends, two of whom were older than me, amid vast crowds of enthusiastic, drum-beating, guitar-playing, chanting, shouting, climate-striking kids and their supporters of every age and hue. The streets of downtown Manhattan Island were so packed that sometimes, in the blazing sun of that September afternoon, we were barely inching along.

It was impressive, exuberant, and, yes, let me say it again, exhausting. And that sun, beautiful as it was, didn’t help at all. At one point, I was so warm that I even stripped down to my T-shirt. I have to admit, though, that I felt that orb was shining so brightly at the behest of those young school strikers to make a point about what planet we were now on. It was about 80 degrees Fahrenheit during that march, which fortunately was to a park on the tip of Manhattan, not to somewhere in Jacobabad, Pakistan, now possibly the hottest city on Earth (and growing hotter by the year) with a temperature that only recently hit 124 degrees Fahrenheit.

That night, back in my living room, I slumped on the sofa, pillows packed behind me, and turned on NBC Nightly News to watch anchor Lester Holt report on the breaking stories of that historic day in which climate strikers and their supporters had turned out in staggering numbers from distant Pacific islands to Africa, Europe, the Americas, and — yes — Antarctica. Even — bless them — a small group of young Afghans in that desperately embattled land was somehow still capable of thinking about the future of our planet and risked their lives to demonstrate! “I want to march because if I don’t survive this war,” said Sarah Azizi, one of those Afghans, “at least I would have done something for the next generation that they can survive.” (Where, though, were the Chinese demonstrators in a country that now releases more greenhouse gases into the atmosphere than any other, though the U.S. remains by far the largest emitter in history?)

Let me add one thing: I’m a religious viewer of Lester Holt or at least what I can take of his show (usually about 15 minutes or so). The reason? Because I feel it gives me a sense of what an aging slice of Americans take in as the “news” daily on our increasingly embattled planet. If you happen to be one of the striking school kids with a certain perspective on the adults who have gotten us into our present global fix, then you won’t be shocked to learn that those “Fridays for Future” global demonstrations proved to be the sixth story of the day on that broadcast. But hey, who can blame Lester Holt & Co? (“Tonight, several breaking headlines as we come on the air!”) After all, not far from Chicago, an SUV (“Breaking news! Shocking video!”) had busted ever so photogenically into a mall and rambled around for a while knocking things over (but hurting no one) before the driver was arrested. No comparison with millions of human beings going on strike over the heating of a planet on which life forms of every sort are in increasing jeopardy.

Then, of course, there was story number two: the “deadly tour bus crash” in Utah (“Also breaking, the highway horror!”) that killed four people near a national park. Hey, no comparison with a planet going down. Then there was the obvious crucial third story of the night, the “surprise move” of football’s New England Patriots to drop Antonio Brown, “the superstar facing sexual misconduct allegations,” from their roster. Fourth came an actual weather emergency, “the growing disaster, a new round of relentless rain on the Texas coast, the catastrophic cutting-off of communities, the death toll rising!” And staggering downpours from Tropical Depression Imelda, 40 inches worth in the Houston area, were indeed news. Of course, Lester offered not the slightest hint, despite the demonstrations that day, that there might be any connection between the seventh-wettest tropical cyclone in U.S. history and climate change. And then, of course, there was Donald Trump. (“Allegations President Trump pressured Ukraine’s leader eight times in a single phone call to investigate the son of rival Joe Biden!”) He’s everywhere and would probably have been bitter, had he noticed, to come in a rare and distant fifth that night. He was expectably shown sitting in his usual lost-boy pose (hands between legs, leaning forward), denying that this latest “whistleblower firestorm” meant anything at all. And finally, sixth and truly last, at least in the introductory line-up of stories to come, was humanity’s “firestorm” and the children who, unlike the grown-ups of NBC Nightly News, actually grasp the importance of what’s happening to this planet and so many of the species on it, themselves included. (“…And walking out of class, millions of students demanding action on climate change…”)

As I’ve written elsewhere, this sort of coverage is beginning to change as, in 2019, the climate crisis enters our world in a far more obvious way.  Still, it’s fairly typical of how the grownups of this planet have acted in these years, typical of what initially upset Greta Thunberg. Admittedly, even that day and the next, there was far better coverage to be found in the mainstream media. The Guardian, for instance, impressively streamed climate-crisis news all day and, that evening, the PBS NewsHour made it at least a decently covered second story of the day (after, well, you-know-who and that secret whistleblower). Meanwhile, a new initiative launched by the Columbia Journalism Review and the Nation magazine to heighten coverage of the subject has already drawn at least 300 outlets globally as partners. (Even Lester Holt has begun giving it a little more attention.)

And though it may not be timely enough, change is coming in polling, in the media, and elsewhere, and those children I saw marching in such profusion that day will indeed help make it happen. Opinion will continue to change in the heat of the oncoming moment, as in the end will governments, and that will matter, even if not as fast as would be either useful or advisable.

“Don’t be a fossil fool”

Let me stop now and look back on that New York demonstration, more than a week gone, where, at one point, people all around me waving hand-made blue signs visibly meant to be ocean waves were chanting, “Sea levels are rising and so are we!”

To understand what’s happening on this planet of ours from the bottom up, what our future might truly hold in a post-Trumpian world (that’s still a world), I wish you could have spent a little time, as I did that day, with those marchers. But I think there’s a way you still can. As I mentioned, I spent those hours, in part, feverishly jotting down what was written on the endless array of protest signs — some held, some pasted onto or slung over shirts, some, in fact, actual T-shirts (“No More B[oil], Leave it in the ground”). Some had clearly been professionally printed up. (Perfect for the age of Trump, for instance: “The universe is made of protons, neutrons, electrons, and morons.”) Many were, as participants told me, not original but slogans found online and turned into personal expressions of feeling, often with plenty of decoration. That would, for instance, include the mock-Trumpian “Make America Greta Again” and “There Is No Plan[et] B. Green New Deal!”

Many of the signs were, however, clearly original, some done with ultimate care, others scrawled wildly. Some were profane (“Fuck Trump, the Earth is Dying!” from a 14-year-old boy or “Clean the Earth, it’s not Uranus”); some were starkly blunt (“Act now before the show is over”); some politically oriented (“We’re not red or blue, we’re green”); some pop-culturally on target (“Winter is not coming”); some wry (“Don’t be a Fossil Fool”); some politically of the moment (“Real science, Fake president,” “Less AC, More AOC”); some critiques of capitalism (“If we can save the banks, we can save the world,” “We can’t eat money, we can’t drink oil”); some wise (“The climate is changing, why aren’t we?”); some culturally sly (“#MeToo, said Mother Earth”); or clever indeed (“This sign is reusable, STOP AND THINK”).

There were those two kids I ran into. The younger, a girl of 10, was carrying a homemade sign that said, “Dear Donald, Hate to break it to you, but climate change is real. XOXO Love, Earth”; her brother, 14, held up a two-word sign all his own that simply said, “Mulch Trump.” Touché! A college student’s sign read, “I am studying for a future that is being destroyed.” A 20-year-old woman held one that said, pungently enough in our present American universe, “Eco not Ego.”

A boy, 8, was blunt: “Save our future.” An 11-year-old girl no less blunt: “If you won’t act like adults, we will.” A 10-year-old boy had written plaintively: “I’m too old 2 die,” while another, a year older, offered this mordant message: “I don’t want to live on Mars. I want to live in Manhattan 30 years from now.” Many signs were, in their own way, upbeat, but some were deeply dystopian as in one woman’s that said: “Don’t think of this summer as the HOTTEST summer in the last 125 years. Think of it as the coolest summer of the NEXT 125 years.”

There was the woman with a sign that read “Science is not a liberal conspiracy.” When my friend congratulated her on it, she responded, “I wish I hadn’t been wearing this sign for seven years!” There was the woman carrying a sign that proclaimed, “Here for my son’s future.” Mounted on it was a photo of a bright-looking baby boy. When asked, she assured me with a smile that he was indeed her child whom she had given this line: “Mom, why didn’t you do more?”

And if you don’t think this — multiplied by millions across the planet — is hopeful, despite heatmongers like Donald Trump and Brazil’s Jair Bolsonaro now being in power, think again.

Let me assure you, I know what it feels like when a movement is ending, when you’re watching a nightmare as if in the rearview mirror, when people are ready to turn their backs on some horror and pretend it’s not happening. That was certainly what it felt like as the streets emptied of demonstrators in 2003 — and there had indeed been millions of them across the planet then, too — in the wake of the Bush administration’s invasion of Iraq. It will not, however, be as easy to turn away from climate change as it was from the Iraq War and its consequences (if, at least, you didn’t live in the Middle East).

The new climate crisis movement is, I suspect, neither a flash in the pan (since global warming will ensure that our “pan” only gets hotter in the years to come), nor a movement about to die. It’s visibly a movement being born.

“And I mean it!”

There was the 63-year-old grandmother carrying a sign that said: “I want my granddaughter to have a future! She’s due on February 1, 2020.” My heart went out to her, because the afternoons I spend with my own grandson are the joys of my life. (He was marching elsewhere that day in a self-decorated T-shirt that said, “Plant more trees.”) Yet there’s seldom one of those afternoons when, at some unexpected moment, my heart doesn’t suddenly sink as I think about the planet I’m leaving him on.

So, even at my age, that march meant something deep and true to me. Just being there mattered, with those kids, a generation that will have to grow up amid fossil-fuelized nightmares whose sponsors, ranging from Big Energy companies to figures like Donald Trump, are intent on committing the greatest crime in human history. It’s certainly strange, not to say horrific, to have so many powerful men (and they are men) intent on quite literally heating this planet to the boiling point for their own profit, political and economic, and so obviously ready to say to hell with the rest of you, to hell with the future.

So, yes, there’s always the possibility that civilization as we know it might be in the process of ending on this planet. But there’s another possibility as well, one lodged in the living hopes and dreams of all those kids across a world that is already, in a sense, beginning to burn. It’s the possibility that something else is beginning, too. And it’s never too late for something new. Increasing numbers of the young are now starting to make demands and, in the wake of that march, I have the feeling that the demanding won’t stop until they get at least some of what they want — and the rest of us so desperately need.

In the end, I’m with the eight-year-old boy who had clipped (quite literally) to the back of his T-shirt what may have been my favorite sign of the march. Begun by him but obviously partially written out by an adult at his inspiration (and then decorated by him), it said: “I’m not cleaning up my room until the grownups clean up the planet — and I mean it!!!!!!!!!!!!!!”

As well he should!!!!!!!!!!!!!

Tom Engelhardt is a co-founder of the American Empire Project and the author of a history of the Cold War, The End of Victory Culture. He runs TomDispatch.com (where this article originated) and is a fellow of the Type Media Center. His sixth and latest book is A Nation Unmade by War (Dispatch Books).

Copyright ©2019 Tom Engelhardt — distributed by Agence Global

—————-
Released: 30 September 2019
Word Count: 2,397
—————-

Danny Sjursen, “Just when I thought it couldn’t get any worse…”

September 22, 2019 - TomDispatch

Recently, on a beautiful Kansas Saturday, I fell asleep early, exhausted by the excitement and ultimate disappointment of the Army football team’s double overtime loss to highly favored Michigan. Having turned against America’s forever wars and the U.S. military as an institution while I was still in it, West Point football, I’m almost ashamed to admit, is my last guilty martial pleasure. Still, having graduated from the Academy, taught history there, and spent 18 long years in the Army, I find something faintly hopeful about a team of undersized, overmatched, non-National Football League prospects facing off against one of the biggest schools in college football.

I awoke, though, early the next morning to the distressing — if hardly surprising — news that President Trump had spiked months of seemingly promising peace talks with the Taliban, blocking any near-term hope for an end to America’s longest, most hopeless war of all. My by-now-uncomfortably-familiar response was to go even deeper into a funk, based on a vague, if overwhelming, sense that the world only manages to get worse on a near-daily basis. For this longtime skeptic of U.S. foreign policy, once also a secret dreamer and idealist, that reality drives me toward political nihilism, a feeling that nothing any of us can do will halt the spread of an increasingly self-destructive empire and the collapse of democracy at home.

Looking back, I can trace my long journey from burgeoning neoconservative believer to Iraq War opponent to war-on-terror dissenter to disenfranchised veteran nonbeliever. Thinking about this in the wake of Army’s loss and those cancelled Afghan peace talks, during a typically morose conversation with Tom, of TomDispatch, I realized that I could tell a story of escalating military heresy and disappointment simply from the three years of articles I’d written for his website. It mattered little that, at the time, I imagined them as anything but the stuff of autobiography.

If all this sounds gloomy, writing itself has been cathartic for me and may have saved me on this strange journey of mine. So, join me on a little autobiographical fast march through a world increasingly filled with improvised explosive devices, or IEDs, as seen through the eyes of one apostate military veteran. Maybe some of you will even recognize aspects of your own life journeys in what follows.

“Hope and change” in Iraq

In October 2006, when Second Lieutenant Sjursen arrived in Iraq, Baghdad was still, at least figuratively, aflame. It took only a few months of repetitious, useless “presence patrols,” a dozen IED strikes on my scout platoon, the deaths of three of my troopers and the maiming of others, as well as ubiquitous civilian deaths in marketplace bombings, to free me from a sense that the war in Iraq served any purpose whatsoever. Hearing again and again, even from long oppressed Shia Iraqis, that life under Sunni autocrat Saddam Hussein had been better, it became increasingly apparent that the U.S. invasion, launched by the Republican administration of George W. Bush and Dick Cheney on thoroughly bogus grounds in the spring of 2003, had shattered their nation and perhaps destabilized a region as well.

Just 23 years old (and, by my own estimation, immature at that), I — and a surprising number of my junior officer peers — started cautiously acting out. I grew my hair longer than regulations allowed and posted World War I-era antiwar poems by British veterans like Siegfried Sassoon and Wilfred Owen on my locker. I eventually even began “phoning in” my patrols, while attempting to avoid dicey, ambush-prone neighborhoods whenever possible.

And yet, despite a growing sense of darkness, I’d yet to lose all hope. At home, the Democrats (many of whom had once voted for the Iraq War) won back Congress in November 2006, largely thanks to a sudden burst of antiwar, anti-Bush rhetoric. In 2007, I began using my limited Internet time to ingest transcripts of every speech by or article about an upstart young African-American Democratic presidential contender, Barack Obama. Unlike anointed frontrunner Hillary Clinton, he seemed inspirational, an outsider, and — as an Illinois state senator — an early opponent of the very invasion that had landed me in my macabre predicament. I quickly decided he was my man, buying into his “hope and change” rhetoric, while dreaming of the day he’d end my war, saving countless lives, including possibly my own.

Sadly, if predictably, despite the new Democratic majority on Capitol Hill and monthly U.S. military fatalities that regularly hit triple digits, nothing could stop the Bush administration from continuing to escalate the war. I remember the moment in April 2007 when I heard that, thanks to President Bush’s announced troop “surge” in Iraq, my squadron was designated to stay three months past our scheduled year-long deployment. It felt like a gut punch. Steve, my fellow lieutenant, and I chain-smoked a pack of cigarettes in silence, while leaning against the brick wall of our Baghdad barracks. Then we faced the music and broke the news to our distraught soldiers.

In that bloodiest year of the war, my squadron would lose another half-dozen men in combat, while nearly 1,000 U.S. servicemen and women would die. Yet that famed, widely hailed surge would, of course, ultimately fail. Not that most policymakers thought so at the time. The Bush-anointed, media-savvy new commander in Iraq, Army General David Petraeus, sold a temporary drop in violence to a fawning Congress, including most of those Democrats, as a profound success. It scarcely mattered that the announced purpose of the surge — to create space and time for a political reconciliation between Iraqi sects and ethnicities — failed from the start. My long-shot dream that an “antiwar” Congress would cut off funds for the conflict remained just that.

Still, landing at my home base in Colorado that New Year’s Eve, I remained almost unnaturally hopeful about Barack Obama as a potential savior. By April 2008, promoted to captain and sent to Fort Knox, Kentucky, for advanced schooling, I found myself secretly canvassing for him across the Ohio River in Indiana, which had just gained swing-state status. If only he could best Republican candidate John McCain, I thought, he might rapidly end what he had once called “the dumb war.” Given my single-minded focus on that possibility, I managed to ignore the way candidate Obama simultaneously called for an escalation of what he termed “the good war” in Afghanistan. Never mind, Obama won in November 2008. I spent Election Day drinking blue martinis and cheering him on with fellow dissenting officers. That night, holding my newly born infant son, I cried tears of joy as the election returns poured in.

Serving empire abroad, feeling empire close to home

The next few years would be filled with disappointment, disenchantment, and disbelief as I followed America’s wars and the state of the world from a desktop computer in my new, highly immersive job with the 4th Cavalry on the squadron operations staff in Fort Riley, Kansas. I watched President Obama shed his dove credentials, unleash across the Greater Middle East exponentially more drone assassination attacks than the Bush administration, fail to close Guantanamo, and triple troop numbers — besting even Bush — in his own “surge” in Afghanistan. Meanwhile, the Pentagon would utilize a newly established U.S. military command, AFRICOM, to quietly expand deployments across another continent.

I was now in command of a company (we in the cavalry called it a “troop”) of some 100 scouts. In February 2011, Obama’s ongoing surge 2.0 diverted my unit from a potentially cushy “turn-out-the-lights” Iraq deployment to a fierce fight on the Taliban’s home turf in Kandahar, Afghanistan. That awful mission, as I told a Reuters reporter on the 10th anniversary of the 9/11 attacks — to the frustration of my colonel — seemed to me futilely unrelated to the events of September 2001. (I was chosen for the interview as a New York native.)

In that ultimately futile deployment, my troop of scouts lost three more lives and several more limbs. That May, Osama bin Laden was killed by Navy SEALs in Pakistan and my mother promptly asked if I’d now get to come home early. No such luck.

Meanwhile, the Obama administration further shattered the Greater Middle East and beyond through a string of military interventions. During my year-long deployment in Afghanistan, Washington helped turn Muammar Gaddafi’s Libya into a failed state of battling warlords and Islamists through an ill-fated regime-change operation; inched its way toward an intervention in the Syrian civil war that would, in the end, counterproductively back jihadis; and stood aside as the Saudis invaded Bahrain to crush Arab Spring protests in a little country that just happened to be home to the U.S. Navy’s Fifth Fleet.

I’d entered Afghanistan already opposed to that war and with few illusions that my own unit — or the U.S. military more generally — could alter the outcome there, let alone “win.” As that tour of duty wound down, I considered leaving the military once and for all. Still, I hedged. From remote southern Afghanistan, I had just enough fax-machine access to apply for a position teaching history at West Point, an assignment that could first get me two blissful years earning a master’s degree at a civilian university free of charge with full military pay and benefits. Surprisingly, I was accepted into that selective program and decided to stay in the Army indefinitely.

Grad school in the hippie enclave and university town of Lawrence, Kansas, in 2012 was all I’d hoped for, and more. Shedding my uniform, I felt strangely at home and thrived. I might have remained a student forever. Still, as I studied, I watched my former world continue to worsen.

During my two years at the University of Kansas, the Obama administration changed course, backing an Egyptian military coup against that country’s first democratically elected president; National Security Agency employee Edward Snowden blew the whistle on a massive illegal domestic surveillance program that was monitoring nearly all Americans; Army leaker Chelsea Manning, brought to trial by Obama’s Justice Department, was sentenced to 35 years in federal prison under the archaic World War I Espionage Act; and the newly branded Islamic State (formed in U.S. prisons in Iraq) exploded across Iraq and Syria. Soon, the president, having pulled U.S. troops out of Iraq in 2011, found himself launching a new air war in Syria as well as relaunching an old one in Iraq, and then sending troops into both countries.

All the while, the war in Afghanistan raged on without end or a hint of progress. Not yet emotionally prepared to speak, I suddenly wrote a short, angry, letter to hawkish Republican Senator Lindsey Graham, which would, over the next two years, turn into an anti-Iraq war memoir focused on the myth of the success of the surge. Predictably, hardly anyone noticed. Rather than feeling elated over having my book published, I only became more cynical about our ability — any of us — to alter the hapless path on which imperial America seemed so fully embarked.

Life goes on, however, and from 2014 to 2016, I had, I thought, the best job in the Army: teaching U.S. history to "plebes" (freshman cadets) at West Point. Despite my own heartache, my by-then crippling PTSD, and the barely suppressed mental-health crisis that went with it, I held onto one hope: that, if I could enthusiastically impart a more accurate and critical history of the nation to my students, I just might influence a new generation of more independent-thinking officers. My former cadets are now all lieutenants and though some do attest to the influence of my class, most are serving the empire as middle managers across a vast global chain of American bases.

The news only grew more distressing during my brief foray at West Point. By then, the Pentagon was supporting an ongoing Saudi war in Yemen that included regular terror bombing and a starvation blockade of the country. It would kill tens of thousands of civilians, starve perhaps 100,000 children to death, and unleash a cholera epidemic of epic proportions. Meanwhile, the president reversed a promise to remove all U.S. troops from Afghanistan by December 31, 2014, and that war went right on. In those same years, the U.S. military “footprint” across Africa expanded exponentially, as (in a pattern already seen in the Greater Middle East) did a proliferation of Islamist militias on that continent.

Then, empire — as it always does — came home, this time in the form of increasingly militarized and Pentagon-equipped policing in neighborhoods of color across the nation. Thanks to YouTube and social media, pervasive instances of police brutality and the killing of unarmed, mostly young black men streamed into public consciousness. It was all brought home to me when a black man, Eric Garner, was choked to death by a white New York City police officer on a troubled street corner in my home borough of Staten Island, for the alleged crime of selling loose cigarettes. As a student of civil rights history, an aspirant activist, and the lead instructor (oddly enough) in African-American History at West Point, I felt galvanized into action.

The result: I found myself teaching cadets by day, then changing into jeans and a hoodie and driving 90 minutes to Staten Island, protest sign in tow. There, I would attend Eric Garner rallies and shout at the police. Hours later, I would trek back to the military academy, rinse and repeat. It felt good to be out on the streets, but, of course, it changed nothing. America’s warrior cops still operate with near impunity, using U.S. military counterinsurgency tactics (sometimes with Israeli Defense Force training) in communities of color as if they were occupied enemy territory.

Off the rails, once and for all

Leaving West Point's (relatively) progressive and intellectual history department in June 2016 for Fort Leavenworth's stiflingly conventional Command and General Staff College in Kansas would prove deeply unsettling. Little did I know, though, that, as I began protesting America's forever wars (my wars, so to speak) ever more volubly, my once-promising military career would soon be over. Army doctors determined that my emotional wounds qualified me for an early medical retirement. By February 2019, I found myself writing up a little antiwar storm and experiencing in-patient PTSD treatment in Arizona. I was, in other words, on my way out the door, an ignominious — if fitting — end to a career only months longer than America's second Afghan War.

In those years, U.S. foreign policy should have gone into in-patient treatment, too. It had, in fact, spun out of control. In a through-the-looking-glass series of moves, our military continued to bomb seven countries, deployed troops to Syria, reentered Iraq, began expanding and modernizing its already vast nuclear arsenal, launched a new Cold War with Russia and China, and moved into the 18th year of its war in Afghanistan.

And did I mention that Donald Trump, corrupt real estate magnate, playboy, and reality TV star turned “populist” xenophobic hero, was elected president of the United States? He then ditched a promising Obama-era deal to deter Iran’s nuclear program, eschewed any American contribution to the global campaign against the existential threat of climate change (which he had previously called a “Chinese hoax”), and spiked the Cold War Intermediate Range Nuclear Forces treaty, leading atomic scientists to tick the “doomsday” clock a stroke closer to midnight.

He or his top officials also militarized the southern border, separated children from immigrant parents, and stuck kids in cages. He cheered on white supremacist rallies; encouraged those militarized cops to “not be too nice” to suspects and perhaps even to slam their heads into patrol car doors on their way to the station; threatened a “fire and fury” nuclear war against North Korea before falling “in love” with that country’s ruler; indicted, for the first time in American history, a publisher, Julian Assange, for posting leaked files; officially recognized the Israeli occupation of the Golan Heights, while expressing approval for Prime Minister Netanyahu’s plan to annex portions of the Palestinian West Bank outright; and… and… but I lack the energy to go on.

Which brings me back to Army’s heartbreaking (if inconsequential) loss in that football game and Trump’s recent decision to cancel ongoing peace talks with the Taliban (maybe the only hope left of getting our troops out of Afghanistan). That, of course, was the one constant of this tale of mine: that never-ending American war in Afghanistan. By September 2019, matters had so deteriorated that I was left with but one pathetic hope: that Donald Trump might, somehow, some way, sometime, be the one to end that absurd, Orwellian forever war.

And then, of course, he called off those peace talks and — a last gut punch — justified his decision by citing a Taliban attack that killed yet another American soldier. In the process, he ensured that yet more troopers like me (some of them undoubtedly born after the 9/11 attacks took place) will needlessly die in a war without end. Now, an alleged Iranian-sponsored attack on the Saudi oil industry may well scuttle any hopes for a long-shot peace deal with Tehran. War there, of course, could kill many more U.S. troops.

As for me, I have a feeling that I’ll wake up tomorrow to some new bit of bad news and begin repeating my now-endless refrain: Just when I thought it couldn’t get any worse…

Danny Sjursen writes regularly for TomDispatch (where this article originated). He is a retired U.S. Army major and former history instructor at West Point. He served tours with reconnaissance units in Iraq and Afghanistan, and now lives in Lawrence, Kansas. He has written a memoir of the Iraq War, Ghost Riders of Baghdad: Soldiers, Civilians, and the Myth of the Surge. Follow him on Twitter at @SkepticalVet and check out his podcast “Fortress on a Hill,” co-hosted with fellow vet Chris Henriksen.

Copyright ©2019 Danny Sjursen — distributed by Agence Global

—————-
Released: 23 September 2019
Word Count: 2,898
—————-

