Maimed by Trauma: The Implications of Automated Warfare on Drone Pilots
People commonly associate PTSD with the military. The public has become so desensitized to military trauma that we risk turning a blind eye to the unconventional ways modern technology inflicts trauma on soldiers. As AI reshapes every sector, creating jobs in some places and eliminating them in others, it is crucial that more resources go toward understanding how AI-powered warfare affects soldiers.
The Department of Defense (DoD) is actively moving toward greater use of automated weapons systems (AWS), which are “capable [of] select[ing] and engag[ing] with targets without further intervention from human operators.” Traditionally, soldiers faced their enemies directly on the battlefield. Now these systems increasingly allow wars to be fought from a distance, rapidly changing the landscape of modern warfare. The growing use of drones is one example.
Drone technology is nothing new: Britain and the US experimented with radio-controlled unmanned aircraft as early as WWI, experiments that later produced aircraft such as the de Havilland DH82B Queen Bee. A crucial period for drone development came during the War on Terror, when drones allowed the US military to wage a forever war by reducing direct risks to soldiers and making ongoing conflict more politically palatable. Today, AI-powered drones are taking on an increasingly important role in the military: the DoD estimates that, by 2035, optionally manned or unmanned aircraft will make up nearly 60% of the Air Force. Some believe this new form of warfare is less physically destructive and emotionally damaging, but that is far from the case. People fail to realize that AI-powered warfare harms soldiers in ways similar to what we consider “traditional warfare.” Before we address the potential emotional damage of AI in remote warfare, however, we must acknowledge what remote warfare is already doing to soldiers: it removes them from physical harm, but not from psychological harm.
Drones are making a big impact on the future of the Air Force: the pilots of remotely piloted aircraft can conduct surveillance, collect intelligence, and carry out precision strikes, essentially mirroring the tasks of an infantryman from afar. Drones appear in every branch of the military, however. Those in the Army fly Tactical Unmanned Systems, and their responsibilities include conducting air reconnaissance, surveillance, targeting, and acquisition missions, as well as calling for direct and indirect fires against the enemy. Navy drone pilots perform operations similar to those of the Army, but their unique roles include providing aerial refueling, maritime radar, and mine detection. This article, however, focuses on drone pilots in the Air Force, who occupy a unique position: physically removed from war but still emotionally present, they face their own set of emotional challenges that are not well understood.
When I think of a desk job, I think of a dead-end job: people in these jobs sit inside a boring office of uniform cubicles. These are not the kind of jobs you imagine being reserved for uniformed officers, especially those with the ability to take a civilian life. Drone pilots give a new image to uniformed, faceless executioners. They are tasked with collecting photos and video feeds and watching over U.S. soldiers on the ground. The “special” few are recruited to fly CIA assassination missions from the sky and kill “high-value targets.” It feels like a full-circle moment that the violent video games your parents tried to keep you from playing, like Call of Duty, mirror real-life careers. Killing these targets does not earn you experience points, however; the only reward is a lifetime of survivor’s guilt. There is no reset button, only endless mental playback. How is twelve months of training ever supposed to prepare you for imagery so violent you could not conjure it in your mind? Women and children incinerated by a Hellfire missile. Men with severed legs crawling across a field, bleeding out, trying to reach the nearest compound for help. Most people never see them, their tattered clothing, their severed limbs, their piercing screams: the defenseless and powerless civilians living in war zones who bear the collateral damage of war. But for drone pilots, repeated exposure to this graphic imagery is part of the job description. A drone pilot’s ability to take another human life is learned, learned from clocking twelve-hour workdays, absorbing death and destruction.
Traditionally, soldiers are seen as emblematic of bravery, but former drone pilot Brandon Bryant describes feeling like a coward. He says he is seen by his peers as a “video game warrior” from his office in Nevada. However, these “video game warriors” are still exposed to the horrors of war from their cubicle. They are physically but not emotionally safe. Bryant describes feeling haunted by “a legion of the dead,” and being in extreme mental and physical pain to the point of “being ready to eat a bullet [himself].”
This is not an isolated experience. Captain Kevin Larson, who was once heralded as one of the best drone pilots in the Air Force, having earned twenty medals for achievement, coped with his trauma by using psychedelics. When the Air Force found out about his substance use, he was charged with using and distributing illegal drugs and stripped of his flight status. Larson ended up facing a possible prison term of more than 20 years. Since he was not classified as a conventional combat veteran, there was no psychological evaluation to see what influence his career as a drone pilot had on his misconduct. During his trial, none of his 188 airstrikes or high-profile kills were brought up. After his conviction, he went on the run and eventually committed suicide.
Larson’s tragedy highlights that even though drones physically remove soldiers from war, they do not make war any less traumatic, even for the most patriotic of people like Larson, who felt military service was his life’s calling. Several of Captain Larson’s former crew members reported that trauma from their line of work has led to drinking, divorce, and in some cases mental breakdowns. Some have left the operations floor in tears. Others have attempted suicide. Drone pilots’ reports of drug use and poor mental health suggest that much of their emotional turmoil is rooted in PTSD. Yet their low recorded PTSD rates compared to those of soldiers returning from deployment underscore the ignorance and stigma surrounding how PTSD can develop, and suggest underreporting driven by drone pilots’ fear of losing their security clearances. War is inherently traumatic, and failing to acknowledge and adequately address the trauma of drone pilots is inhumane and a moral failure. This is especially true because PTSD developed without direct exposure to physical danger is nothing new; it has been reported even in jobs like Facebook content moderation, where moderators view graphic and disturbing content.
Due to a shortage of qualified drone personnel, current drone pilots log 900–1,800 hours a year, compared to the 300-hour annual maximum of a typical Air Force pilot. Because they are physically removed from war and therefore labeled noncombatants, little attention is paid to their mental health: drone pilots rarely, if ever, receive the same mental health screenings and recovery periods as combatants. Since drone pilots are watching death and destruction, being overworked is especially detrimental, as it can lead to the development of PTSD over time. The absence of an effective response to this problem feeds the seemingly widespread idea that if you are not in physical danger, logging extra hours won’t kill you. Many people fail to realize that PTSD develops not only in those who were in physical danger, but also in those who witnessed others in physical danger.
Despite their similar functions as aerial combatants, the military acknowledges the physical and emotional stress of being a fighter pilot but fails to recognize the full physical and emotional stress drone pilots experience. Drone pilots’ physical health reflects the same trends seen in white-collar jobs: sleep deprivation and long working hours leave little time for a fitness routine and, in turn, often lead to metabolic disorders. This is detrimental because mental and physical health go hand in hand. While treating drone pilots’ emotional trauma, we must not ignore how, as we move war online, soldiers’ physical health is also declining in a way that only compounds their poor mental health.
In addition to PTSD and declining mental health, drone pilots are at risk for moral injury, which the Harvard T.H. Chan School of Public Health defines as “psychological harm incurred from committing, witnessing, or being subject to actions that violate one’s moral code.” What makes being a drone pilot unique compared to being a fighter pilot is that drone pilots carefully study their targets for weeks, observing how they are as a spouse, a parent, a sibling. They form an idea of who their target is as a person before ultimately pulling the trigger on their life. Not only that, but they see the immediate aftermath of the emotional and physical damage they caused. Bennett Miller, an intelligence analyst, said that collecting intelligence often consists of little more than watching people live their everyday lives. Even so, he is not spared from the trauma of war. Miller recalled one instance in which his team was asked to watch and kill a high-level Taliban financier. Miller and his team “for a week [-] watched the man feed his animals, eat with family in his courtyard and walk to a nearby village.” After the financier was killed, they watched the aftermath and saw those same family members pick up what was left of the man to bury. Miller recalled how that day “[He] had watched [the victim’s] kids pick up the body parts. Then [he went] home and hugged [his] own kids.” While the DoD is pushing to make drones more autonomous in order to improve soldiers’ safety and the performance and efficiency of military operations, at the moment these Unmanned Aerial Vehicles are still very much manned. Drone pilots may not be on the ground like traditional combat soldiers, but moral injury is still “in the air.” The moral injury drone pilots experience may be especially suffocating because they lack work-life separation.
Deployed soldiers are fully physically and mentally present in war, whereas drone pilots, while emotionally present, are physically removed: they can clock out after a long shift and continue their civilian lives until they report for duty once again. This leaves them with a new strain of moral injury, as they have the power to derail someone else’s day-to-day life while being able to carry on with their own.
Even if we were to develop fully autonomous drones, soldiers still wouldn’t be absolved of moral injury. Kanaka Rajan, a founding faculty member of the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University, warns that as these systems become more complex, powerful, and faster-reacting, the “long chain of black box decisions” embedded in AWS will make it hard for humans to remain a meaningful part of the decision-making process. The inability to quantify the nuances of ethics and morals, along with the complexity of the decision-making process, puts service members at high risk for moral injury, as they are in a way still following orders that are not their own. Paul Scharre, a Pentagon defense expert and former U.S. Army Ranger who served in Iraq and Afghanistan, expands on how autonomous weapons can affect a soldier’s psyche in his book Army of None: Autonomous Weapons and the Future of War. He claims that making life-or-death decisions on the battlefield is the essence of the military profession and recalls how, years later, he can still feel the gravity of having someone’s life in his hands. What allowed him to come to peace with his decisions on the battlefield was feeling and understanding that weight.
Ethics and morality have many nuances that inevitably surface in morally contradictory settings such as war. Even in automated warfare, therefore, the moral dilemmas of war simply manifest in new ways that still affect a soldier’s psyche. Untreated, moral injury can be a mental life sentence: you not only experience anxiety and depression, but also guilt and shame that trap you in a never-ending cycle of self-blame. You blame yourself for all the ways you believe you could have handled the situation differently, even when what occurred was out of your control. This challenges the stigma surrounding AI-powered drones: critics argue that automated decision-making makes it easier to carry out morally conflicting decisions, claiming that automation removes the moral conflicts soldiers may face. But as more drone pilots speak out, it is becoming clear that current advances in warfare technology are not robust enough to remove the morally conflicting decisions of war or to prevent emotional injury; the traumas of war do not discriminate.
Moving forward, it is crucial that more funding and resources be allocated to studying and supporting drone pilots, since what “we lack in knowledge we can make up for in data.” Such research could show that the dropout rate has much more to do with the deep-rooted trauma drone pilots experience (similar to that of combat soldiers) than with the high-stress environment alone. It could also show that automated decision-making does not make them any less morally conflicted; they may even be at higher risk of moral injury because they cannot take a more active role in the decision to take another human life. More research would dispel the stigma of being perceived as a “video game warrior” or “cubicle warrior” that leads many Air Force drone pilots and other military personnel to dismiss their trauma. Greater support would benefit drone pilots both physically and emotionally, which could in turn lower the dropout rate, reducing the tendency to overwork drone pilots and allowing them additional recovery time.
We keep ignoring the fact that we do not fully understand AI’s capabilities and externalities, an understanding that would help us regulate it across all sectors. One negative consequence of AI-powered warfare we do know for certain is that it still leaves soldiers at risk of developing PTSD and moral injury, a consequence of all types of warfare, regardless of how high-powered military technology becomes. With all the ethical implications of AI warfare and the lack of thorough research, drone pilots’ struggles are easily dismissed and lost in the noise. Since technological advancement in warfare has yet to prove itself a complete shield against the traumas of war, it is important that resources be invested in proven therapies to treat PTSD and moral injury, including traditional methods such as talk therapy and technological therapies like virtual reality. The gaps in our knowledge of drone pilots’ PTSD and moral injury will not fill themselves until more research is done; in the meantime, however, we have proven therapies to help treat drone pilots maimed by the traumas of war. While drone pilots are physically safe, we must not ignore that they are just as emotionally vulnerable as infantrymen.