Final Project

Autonomous Weapons & The Persistent Moral Toll of War

Over the last several decades, as computing technology has advanced, there has naturally been a corresponding effort to integrate it into the modern warfighting space. One of the most prevalent effects of this has been the advent and widespread adoption of drone technologies. First, drones were used for reconnaissance, reducing soldiers' exposure to hazards; weaponization followed soon after. Drones have since been equipped with autonomous capabilities that allow them to fly from their bases to targets and orbit for hours on end.

However, when weapons are employed by these platforms, targeting and firing are human-initiated and human-operated. Since nearly every munition used by drones is a missile, minor autonomous functions do appear in the form of guidance systems. As evidenced by the above image, in recent years there has been increased interest in reducing or removing the human element from the act of killing in warfare. Some believe we would be better served by letting robots take our place in the ending of another human life. I heartily disagree. Replacing a soldier with a drone, while providing what we would see as the benefit of removing a life from harm's way, operates on the false assumption that because we (humans) are not the ones intimately performing the act that ends another life, there is no moral consequence. Evidence has shown that drone operators experience PTSD in the same way as soldiers directly involved in combat. (This is only a single example for reference.)

With considerations such as these in mind, I envisioned a concept for a “game” experience emulating what it may be like to be a drone operator of the future: one who still controls where their drone goes, but not how it decides what to shoot, for the powers that be have decided that a computer can identify who is a threat and who isn't more reliably than a human can. As is always the case, however, computers are only as smart as their programmers have made them, which can lead to serious, unintended consequences.

For more of my thoughts on the implications of autonomous weapons, see this page containing a discussion post written for another class I took this semester, AI for Gameplay: https://jdminterfaces.home.blog/2019/12/16/final-project-supplemental-post/

Concept Art

Design & Implementation

One of the key computing-centered ideas I wished to represent through my project was that machines, especially computers, do not always behave as we would expect or like them to. I aimed to implement this primarily programmatically, though the velostat sensor interface, which I created for my second project, was an obvious choice for introducing physical irregularity as well. Because of the inconsistent way velostat reacts to being pressed, as well as our own human limitations in estimating the force we apply, the readings taken from the sensors vary with every use. It may therefore take repeated attempts at pressing a sensor to get the desired reading and response. Each of the three sensors activates a distinct function within the game, which will be explained later. Along with the velostat sensors, there is also a joystick connected to the Arduino. Although it is capable of switch input by pressing down on the stick, I chose to use only the readings from its X/Y axes. Readings from the velostat sensors and joystick are sent via serial communication to the game's executable, where they are interpreted.
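To give a sense of how a noisy velostat reading can be turned into a clean "pressed" event, here is a minimal sketch of a hysteresis filter. The threshold values are illustrative assumptions (the Arduino's 10-bit ADC ranges 0–1023), not the ones used in the project:

```cpp
#include <cassert>

// Hypothetical thresholds: a reading must rise above PRESS_ON to count as a
// press, then fall back below PRESS_OFF before a new press can register.
const int PRESS_ON = 600;
const int PRESS_OFF = 400;

// Simple hysteresis filter: returns true exactly once per firm press,
// so a noisy reading hovering near a single threshold cannot re-trigger.
struct PressFilter {
    bool held = false;
    bool update(int reading) {
        if (!held && reading > PRESS_ON) { held = true; return true; }
        if (held && reading < PRESS_OFF) { held = false; }
        return false;
    }
};
```

A filter like this would trade away some of the deliberate unreliability described above, so whether to use one is a design decision as much as a technical one.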

Velostat sensors and joystick connected to the Arduino.

A screencap from the game showing that the drone has selected an individual as a target. The player must then confirm the target and give the order to fire.

“Gameplay”

I put the word gameplay in quotation marks because, as it stands, there is no true goal for the player to accomplish. Several issues arose during development that prevented me from implementing anything but the barest features. After running the executable, the player is presented with a screen displaying the collage of articles seen at the top of this post. This is followed by a short written narrative introducing the player's context as a drone operator. They then proceed to the game.

In the game scene, the player's drone starts in the center. Around the player, a crowd of people moves around the landscape. The majority of these people are civilians, though some are armed. Rocking the joystick forward or backward causes the drone to advance or reverse. Because of the nature of serial communication, there is also a delay between the player's movement of the joystick and the response of the drone, akin to the real-world delay that can occur when piloting drones remotely from thousands of miles away.

Each of the velostat sensors corresponds to a separate step in the drone's firing process. The player has no control over the direction of the drone's turret and cannot fire it indiscriminately. The first sensor directs the drone to choose a target, which is then marked by a crosshair while the turret continuously rotates to face it. Because the sensor's value is recorded near-constantly, once the player presses the sensor hard enough to produce the needed reading, the drone may cycle through a number of different targets before stopping on one. Inherent in this process is the possibility of the drone targeting a civilian rather than an armed person. It is up to the player to recognize this and direct the drone to choose another target. Pressing the second sensor causes the drone to confirm its target, after which the target cannot be changed. From there, the player's only option is to press the third sensor to fire the drone's turret. The projectile travels straight ahead until it either reaches the edge of the screen and is destroyed or collides with one of the people, killing them.
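The select → confirm → fire sequence amounts to a small state machine. A minimal sketch, with type and method names that are my own rather than taken from the project's scripts:

```cpp
#include <cassert>

// The three-step firing sequence driven by the three sensors: the target can
// be re-selected until confirmed, and firing is only possible after
// confirmation.
enum class FireState { Idle, Targeting, Confirmed, Fired };

struct FiringSequence {
    FireState state = FireState::Idle;

    void selectTarget() {  // sensor 1: re-selectable until confirmed
        if (state == FireState::Idle || state == FireState::Targeting)
            state = FireState::Targeting;
    }
    void confirmTarget() { // sensor 2: locks the target in
        if (state == FireState::Targeting) state = FireState::Confirmed;
    }
    void fire() {          // sensor 3: only valid after confirmation
        if (state == FireState::Confirmed) state = FireState::Fired;
    }
};
```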

A current bug in the project is that the turret does not actually fire its projectiles in the direction it is facing. While an unintended outcome, this behavior could be further adapted to evoke more reflection on the kinds of tragedies that can arise in warfare.

As implemented, which target the drone chooses depends entirely on a random value generated whenever the command to identify a target is given. Although the selection is weighted toward choosing an armed character, the possibility remains, as stated above, that the drone will choose a civilian.
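A weighted random pick of this kind can be sketched in a few lines. The 4:1 weighting below is an assumption for illustration; the project's actual weight (and the Unity C# implementation) may differ:

```cpp
#include <cassert>
#include <random>
#include <vector>

struct Person { bool armed; };

// Weighted target selection: armed characters are (hypothetically) four
// times as likely to be picked as civilians, but any civilian can still
// be chosen -- the weighting only shifts the odds.
int pickTarget(const std::vector<Person>& crowd, std::mt19937& rng) {
    std::vector<double> weights;
    for (const Person& p : crowd)
        weights.push_back(p.armed ? 4.0 : 1.0);
    std::discrete_distribution<int> dist(weights.begin(), weights.end());
    return dist(rng);  // index of the chosen person -- may be a civilian
}
```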

Unimplemented Ideas/Future Work

One of the key shortcomings of the project's current state is that I was unable to include any mechanic that explicitly directs the player's attention to the consequences of killing an innocent person. Ideally, this would have involved either displaying a list of photos, names, and other details for each civilian killed by the player, or showing the player a series of news articles about civilian deaths caused by drone and airstrikes over the last two to three decades.

Another attempted feature was the ability of the drone to automatically initiate the entire process of selecting, confirming, and firing upon a target without the player's input. Unfortunately, I was not able to get this to work correctly. Had I succeeded, the next step would have been to have the drone carry out any of the steps individually after the player had initiated a previous one.

The gameplay section could also use a number of UI additions. Initially I had considered showing an identification chart clearly indicating which character sprites were hostiles and which were civilians. However, I eventually decided against it, both because it worked against my goal of demonstrating how difficult it has often been to discern friend from foe in recent conflicts, and because it detracted from the focus of criticizing reliance on AI for such a task. I also had hoped to include a number of feedback-related UI elements, such as a display of the drone turret's current state (possibly an inaccurate one) and feedback upon the killing of civilians and hostiles. Lastly, for the sake of polish, I had hoped to include visual effects such as flames and smoke when the turret fires, as well as blood, to influence the affective response of the player.

Fritzing’s pressure sensors are used to represent the velostat sensors actually utilized.

The Code

The Arduino code is fairly simple: it sends a stringified version of the input data from the joystick and velostat sensors to the executable. The complete Unity project is made up of a number of interacting scripts and can be found here: https://github.com/jdm4344/Alt-Interfaces-Project3

A complete explanation of the Unity implementation would be excessive. For clarity, though, it essentially consists of systems covering the following:

  • Serial data (from the Arduino controller) interpretation & response
  • Player (the drone) movement physics
  • Non-player character (the “targets”) movement physics
  • Drone turret orientation physics
  • Projectile physics
  • Projectile collisions
  • Background & non-player character generation
  • User Interface
/*
 * Jordan Machalek
 * Adapted from the built-in Arduino SerialCallResponseASCII example
 */

// Attributes
int inByte = 0;         // incoming serial byte
// Joystick
//const int swPin = 2; // digital switch for when stick is depressed
const int swXPin = A0;
const int swYPin = A1;
// Joystick values
int swXVal = 0;
int swYVal = 0;
// Velostat
const int pressPin0 = A2;
const int pressPin1 = A3;
const int pressPin2 = A4;
// Velostat Values
int press0Value = 0; // First pressure sensor
int press1Value = 0; // Second pressure sensor
int press2Value = 0; // Third pressure sensor

void setup() {
  Serial.begin(9600);
  while(!Serial) { ; } // wait for serial connection

  //pinMode(swPin, INPUT);
  pinMode(swXPin, INPUT);
  pinMode(swYPin, INPUT);
  // Velostat Pressure Sensors
  pinMode(pressPin0, INPUT);
  pinMode(pressPin1, INPUT);
  pinMode(pressPin2, INPUT);

  establishContact(); // send initial data until Unity responds
}

void loop() {
  if(Serial.available() > 0){
    // get call byte from Unity
    inByte = Serial.read();

    // Get joystick readings
    swXVal = analogRead(swXPin);
    swYVal = analogRead(swYPin);
  
    // Get readings from Velostat pressure sensors
    press0Value = analogRead(pressPin0);
    press1Value = analogRead(pressPin1);
    press2Value = analogRead(pressPin2);
  
    // Write parsable string of data to Unity
    Serial.print(swXVal); // Joystick X
    Serial.print(",");
    Serial.print(swYVal); // Joystick Y
    Serial.print(",");
    Serial.print(press0Value); // Velostat 1
    Serial.print(",");
    Serial.print(press1Value); // Velostat 2
    Serial.print(",");
    Serial.println(press2Value); // Velostat 3    
  }
}

void establishContact() {
  while (Serial.available() <= 0) {
    Serial.println("0,0,0,0,0");   // send an initial string
    delay(300);
  }
}
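On the receiving end, the Unity scripts are written in C#, but the parsing they have to do amounts to splitting each serial line on commas and converting the fields to integers. A sketch of that logic, kept in C++ for consistency with the Arduino code above (the function name is my own):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// The Arduino sends lines like "512,498,120,45,803": joystick X, joystick Y,
// then the three velostat readings. Split on commas and convert each field.
std::vector<int> parseReadings(const std::string& line) {
    std::vector<int> values;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ','))
        values.push_back(std::stoi(field));
    return values;
}
```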
