Final Project – Supplemental Post

Discussion Prompt:

Start this discussion by reading the article “Autonomous Weapon Systems and US Military Robotics.”

As an experienced programmer you’ve seen how difficult it is to fully debug complex programs, such as some of the AI applications that we worked on in this course. What is your opinion on the desirability of any military producing autonomous weapons of war? Explain why you’re for or against this particular application of AI.

Response:

The adoption of remotely operated and automated machines is undoubtedly one of the most significant advancements in modern warfare and will remain at the forefront for years to come. I believe the greatest benefit is the opportunity unmanned drones provide to keep people out of harm’s way, such as for scouting and ordnance disposal. Technology such as UGVs that fulfill logistical functions like transporting equipment and the wounded is also a positive result of these developments. Applying AI to platforms such as these could further their life-saving ability. Concern does arise regarding the potential for accidents caused by programming bugs, but the same issue is present in devices that are not AI-driven. At the same time, I believe that giving AI-controlled machines the ability to kill people presents a moral problem.

To me, a significant difference exists between a missile-equipped drone controlled by an operator in a command center or in the field and one that, without human input, uses AI to identify and engage targets. Artificial intelligence coupled with technology designed for the express purpose of causing harm creates a twofold point of failure: not only in the device carrying out the action, but also in the decision-making process of the AI. We might argue that AI would be less prone to error than a human controller, and it is likely true that a computer could identify targets visually more reliably than a person. Even so, because computers are designed and programmed by people who make errors, there will never be a system with one-hundred-percent reliability. Furthermore, I believe it is much harder to “hack” a human than to hack a computer for nefarious purposes, but that is a complex and separate discussion, so I will not explore it here.

Attempting to absolve ourselves of the responsibility for killing by passing the act on to something with no inherent moral agency makes those of us responsible more culpable for the lack of regard such an approach shows for human life. As long as humanity as we know it continues, there will be those who seek to do evil, and conflict will follow. Therefore, I believe we should seek to resolve conflict with as little harm and loss of life as possible. Whether a soldier pulls the trigger of a gun or the trigger of a drone’s joystick, a moral decision is made to end the life of another human. We would be kidding ourselves to believe that just because a person isn’t physically pulling the trigger we become innocent of the consequences. To set loose an AI that can decide to end a human life, no matter the restrictions set, is still indiscriminate, for we have given it that ability. As far as I am concerned, no artificial intelligence or otherwise autonomous device should be given the agency to make an independent, intentional decision meant to end the life of a human.

Final Project

Autonomous Weapons & The Persistent Moral Toll of War

Over the last several decades, as computing technology has advanced, there has naturally been a corresponding effort to integrate it into the modern warfighting space. One of the most prevalent effects of this has been the advent and widespread adoption of drone technologies. First, we saw drones used for reconnaissance to decrease soldiers’ exposure to hazards. Weaponization followed soon after. Drones have since been equipped with autonomous capabilities allowing them to fly from base to target and orbit for hours on end.

However, when weapons are employed by these platforms, the process of targeting and firing is human-initiated and human-operated. Since nearly every munition used by drones is a missile, minor autonomous functions do appear in the form of guidance systems. As evidenced by the above image, in recent years there has been increased interest in reducing or removing the human element from the act of killing in warfare. Some believe we would be better served to let robots take our place in ending another human life. I heartily disagree. Replacing a soldier with a drone, while providing what we would see as the benefit of removing a life from harm’s way, operates on the false assumption that because we (humans) are not the ones intimately performing the act that ends another life, there is no moral consequence. Evidence has shown that drone operators experience PTSD in the same way as soldiers directly involved in combat. (This is only a single example for reference.)

With considerations such as these in mind, I envisioned a concept for a “game” experience emulating what it may be like for a drone operator of the future: one who still controls where their drone goes, but not how it decides what to shoot, for the powers that be have decided that a computer can identify who is and isn’t a threat more reliably than a human can. As is always the case, however, computers are only as smart as their programmers have made them, which can lead to serious, unintended consequences.

For more of my thoughts on the implications of autonomous weapons, see this page containing a discussion post written for another class I took this semester, AI for Gameplay: https://jdminterfaces.home.blog/2019/12/16/final-project-supplemental-post/

Concept Art

Design & Implementation

One of the key computing-centered ideas I wished to represent through my project was that machines, especially computers, do not always behave as we would expect or like them to. Primarily, I aimed to implement this programmatically, though the velostat sensor interface I created for my second project was an obvious choice for introducing physical irregularity. Because of the inconsistent way velostat reacts to being pressed, as well as our own human limitations in estimating the force we apply, readings taken from the sensors vary with every use. Therefore, it may take repeated attempts at pressing a sensor to get the desired reading and response. Each of the three sensors activates a distinct function within the game, which will be explained later. Along with the velostat sensors, there is also a joystick connected to the Arduino. Although it is capable of a switch input when the stick is pressed down, I chose to utilize only the readings from its X/Y axes. Readings from the velostat sensors and joystick are sent via serial communication to the game’s executable, where they are interpreted.
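For reference, each update the Arduino sends a single comma-separated line in the order joystick X, joystick Y, then the three velostat readings. The game splits that line into integers; the fragment below only illustrates that splitting step, written in Arduino-style C++ for consistency with the rest of this post rather than the C# used in the actual Unity scripts.

/*
 * Illustration only: splitting one "x,y,p0,p1,p2" line into five integers.
 * The project's real parsing is done in C# inside the Unity scripts.
 */
void parseReadings(String line, int values[]) {
  int start = 0;
  for (int i = 0; i < 5; i++) {
    int comma = line.indexOf(',', start);            // next delimiter, or -1 on the last field
    String field = (comma == -1) ? line.substring(start) : line.substring(start, comma);
    values[i] = field.toInt();                       // joystick X, joystick Y, velostat 1-3
    start = comma + 1;
  }
}

void setup() {
  Serial.begin(9600);
  int readings[5];
  parseReadings("512,498,120,0,30", readings);       // example line in the format the sketch under "The Code" sends
  for (int i = 0; i < 5; i++) {
    Serial.println(readings[i]);
  }
}

void loop() { }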

Velostat sensors and joystick connected to the Arduino.

A screencap from the game showing that the drone has selected an individual as a target. The player must then confirm the target and give the order to fire.

“Gameplay”

I put the word gameplay in quotation marks because, as it stands, there is no true goal for the player to accomplish. Several issues arose during development that prevented me from implementing anything but the barest features. After running the executable, the player is presented with a screen displaying the collage of articles seen at the top of this post. Following this is a short written narrative introducing the player’s context as a drone operator. They then proceed to the game.

In the game scene, the player’s drone starts in the center. Around the player, a crowd of people move around the landscape. The majority of these people are civilians, though some are armed. Rocking the joystick forwards or backwards will cause the drone to advance or reverse. As a result of the nature of serial communication, there is also a delay between the player’s movement of the joystick and the response of the drone, akin to the real-world delay that can occur when piloting drones remotely from thousands of miles away.

Each of the velostat sensors corresponds to a separate step in the drone’s firing process. The player has no control over the direction of the drone’s turret and cannot fire it indiscriminately. The first sensor directs the drone to choose a target, which is then marked with a crosshair, and the turret continuously rotates to face it. Because the sensor’s value is sampled near-constantly, once the player presses the sensor hard enough to get the needed reading, the drone may cycle through a number of different targets before settling on one. Inherent in this process is the possibility of the drone targeting a civilian rather than an armed person; it is up to the player to recognize this and direct the drone to choose another target. Pressing the second sensor causes the drone to confirm its target, after which the target cannot be changed. From there, the player’s only option is to press the third sensor to fire the drone’s turret. The projectile travels straight ahead until it either reaches the edge of the screen and is destroyed or collides with one of the people, killing them.
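Conceptually, this select, confirm, fire sequence is a small state machine. The sketch below is a hypothetical outline of that flow in Arduino-style C++, shown only for illustration: in the actual project the Arduino just streams raw readings and the state logic lives in the Unity scripts, and the threshold value here is a placeholder rather than one taken from the project.

/*
 * Hypothetical outline of the select -> confirm -> fire sequence.
 * In the real project the Arduino only sends raw readings; the state logic runs in Unity.
 * PRESS_THRESHOLD is a placeholder value.
 */
enum FiringState { IDLE, SELECTING, CONFIRMED };
FiringState state = IDLE;
const int PRESS_THRESHOLD = 120;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int selectVal  = analogRead(A2);   // first velostat sensor: choose/cycle targets
  int confirmVal = analogRead(A3);   // second velostat sensor: lock the current target
  int fireVal    = analogRead(A4);   // third velostat sensor: fire at the locked target

  if (state == IDLE && selectVal > PRESS_THRESHOLD) {
    state = SELECTING;               // drone begins cycling through possible targets
    Serial.println("Selecting target");
  } else if (state == SELECTING && confirmVal > PRESS_THRESHOLD) {
    state = CONFIRMED;               // target can no longer be changed
    Serial.println("Target confirmed");
  } else if (state == CONFIRMED && fireVal > PRESS_THRESHOLD) {
    Serial.println("Fire");          // the projectile itself is spawned on the Unity side
    state = IDLE;                    // sequence resets after firing
  }

  delay(50);                         // crude debounce between checks
}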

A current bug in the project is that the turret does not actually fire its projectiles in the direction it is facing. While unintended, this behavior could be further adapted to evoke more reflection on the kinds of tragedies that can arise in warfare.

As implemented, which target the drone chooses depends entirely on a random value generated whenever the command to identify a target is given. Although the choice is weighted toward armed characters, as stated above, the possibility remains that the drone will choose a civilian.
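As a rough illustration of that weighting, a selection function might look like the following. The 80/20 split is an example value only, not one taken from the project, and the real implementation lives in the Unity scripts.

/*
 * Illustration of weighted random target selection; the 80/20 split is an example only.
 * Characters are assumed to be stored with the armed ones first, then the civilians.
 */
int chooseTargetIndex(int armedCount, int civilianCount) {
  long roll = random(100);                       // 0-99
  if (roll < 80 && armedCount > 0) {
    return random(armedCount);                   // index of an armed character
  }
  return armedCount + random(civilianCount);     // index of a civilian
}

void setup() {
  Serial.begin(9600);
  randomSeed(analogRead(A5));                    // seed from an unconnected analog pin
  Serial.println(chooseTargetIndex(3, 12));      // e.g. 3 armed characters among 15 people
}

void loop() { }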

Unimplemented Ideas/Future Work

One of the key shortcomings of the project’s current state is that I was unable to include any mechanic that explicitly directs the player’s attention to the consequences of killing an innocent person. Ideally, this would have involved either displaying a list of photos, names, and other details for each civilian killed by the player, or showing the player a series of news articles concerning civilian deaths caused by drone and air strikes over the last two to three decades.

Another attempted feature was the ability of the drone to automatically initiate the entire process of selecting, confirming, and firing upon a target without the player’s input. Unfortunately, I was not able to get this working correctly. Had I succeeded, the next step would have been to have the drone initiate any of the steps individually after the player had initiated the previous one.

The gameplay section could also use a number of UI additions. Initially I had considered showing an identification chart that clearly indicated which character sprites were hostiles and which were civilians. However, I eventually decided against it, as I felt it worked against my goal of demonstrating how difficult it has often been to discern friend from foe in recent conflicts, and because it would detract from the focus of criticizing reliance on AI for such a task. I had also hoped to include a number of feedback-related UI elements, such as a display of the drone turret’s current state (possibly inaccurate) and feedback upon the killing of civilians and hostiles. Lastly, for the sake of polish, I had hoped to include visual effects such as flames and smoke when firing the turret, and blood, to potentially influence the affective response of the player.

Fritzing’s pressure sensors are used to represent the velostat sensors actually utilized.

The Code

The Arduino code is fairly simple and involves sending a stringified version of the input data from the joystick and velostat sensors to the executable. The complete Unity project is made up of a number of interacting scripts and can be found here: https://github.com/jdm4344/Alt-Interfaces-Project3

A complete explanation of the Unity implementation would be excessive. For purposes of clarity though, it essentially consists of systems that boil down to the following:

  • Serial data (from the Arduino controller) interpretation & response
  • Player (the drone) movement physics
  • Non-player character (the “targets”) movement physics
  • Drone turret orientation physics
  • Projectile physics
  • Projectile collisions
  • Background & non-player character generation
  • User Interface
/*
 * Jordan Machalek
 * Adapted from the built-in Arduino SerialCallResponseASCII example
 */

// Attributes
int inByte = 0;         // incoming serial byte
// Joystick
//const int swPin = 2; // digital switch for when stick is depressed
const int swXPin = A0;
const int swYPin = A1;
// Joystick values
int swXVal = 0;
int swYVal = 0;
// Velostat
const int pressPin0 = A2;
const int pressPin1 = A3;
const int pressPin2 = A4;
// Velostat Values
int press0Value = 0; // First pressure sensor
int press1Value = 0; // Second pressure sensor
int press2Value = 0; // Third pressure sensor

void setup() {
  Serial.begin(9600);
  while(!Serial) { ; } // wait for serial connection

  //pinMode(swPin, INPUT);
  pinMode(swXPin, INPUT);
  pinMode(swYPin, INPUT);
  // Velostat Pressure Sensors
  pinMode(pressPin0, INPUT);
  pinMode(pressPin1, INPUT);
  pinMode(pressPin2, INPUT);
}

void loop() {
  if(Serial.available() > 0){
    // get call byte from Unity
    inByte = Serial.read();

    // Get joystick readings
    swXVal = analogRead(swXPin);
    swYVal = analogRead(swYPin);
  
    // Get readings from Velostat pressure sensors
    press0Value = analogRead(pressPin0);
    press1Value = analogRead(pressPin1);
    press2Value = analogRead(pressPin2);
  
    // Write parsable string of data to Unity
    Serial.print(swXVal); // Joystick X
    Serial.print(",");
    Serial.print(swYVal); // Joystick Y
    Serial.print(",");
    Serial.print(press0Value); // Velostat 1
    Serial.print(",");
    Serial.print(press1Value); // Velostat 2
    Serial.print(",");
    Serial.println(press2Value); // Velostat 3    
  }

  
}

// Carried over from the SerialCallResponseASCII example; not called from setup() in this sketch
void establishContact() {
  while (Serial.available() <= 0) {
    Serial.println("0,0,0,0,0");   // send an initial string
    delay(300);
  }
}

References

Reading Set 2

Ambiguity as a Resource for Design

This article presents a somewhat concise perspective on how ambiguity is conceptualized and implemented within technology. The key thought I have come away with is how applicable ambiguous design is to artistic works, yet how nearly unusable it is, in my opinion, in practical endeavors. However, the physical implementations of ambiguity explored in the article are not the kind I expect to want to experiment with myself. The examples of Projected Realities and Desert Rain were both particularly interesting to me, likely because of the story elements within them. Meanwhile, The Pillow and the Home Health Monitor came across as a novelty invasion of privacy and a forced system of meaningless analogs, respectively. I also found the example of Sarah Pennington’s phone design to be quite ridiculous, essentially being a pager rather than a unique exploratory piece.

As the authors explore throughout their paper, ambiguity is particularly useful in prompting reflection and abstract thought. This is something I find has particular merit in games and greatly appreciate when I encounter it. Design through metaphor and allegory is an approach that often catches my attention, and I very much agree with their claim that:

By impelling people to interpret situations for themselves, it encourages them to start grappling conceptually with systems and their contexts, and thus to establish deeper and more personal relations with the meanings offered by those systems.

Gaver et al.

It is not often, though, that designers and engineers of systems for industry, research, consumer use, and the like create interfaces that are not easily interpreted, though we may in some instances want to represent information such as temperature with a gradient from red to blue rather than a numerical indicator.

Overall, Gaver et al.’s article has reinforced my opinion on the application of ambiguity. They give a useful overview of how to use this concept, yet fail to extend it beyond the domain of art. As far as I was able to discern, little evidence is given for their argument that “in the many emerging applications for everyday life…ambiguity is a resource that designers should neither ignore nor repress.”

Reflective Design

Compared to the first article, I found the content of Sengers et al.’s Reflective Design to be much more relevant, both in regard to practicality and in range of application. The initial discussion of HCI, biases, and their effects within the practice was quite interesting and, I think, served as an excellent example giving context to the problem the authors perceive. I wonder, however, to what extent the authors reflected upon their own assumptions when forming their arguments and consolidating motivations. In their section defining reflective design, they seem to have fully embraced ideologies of Marxism, feminism, racial and ethnic studies, the Enlightenment, etc., while assuming there is nothing to be applied to reflective design from what might otherwise be “conflicting” viewpoints. Understandably, though, an argument for any perspective must have some definite foundation to draw from and cannot incorporate ideas from every worldview, nor do I think it should.

One of the most important points the authors present is found in their principles of reflective design: that “Technologies are not inherently values-blind.” As they expand upon, everyone holds concepts that underlie how we conduct every aspect of our lives, and the design of and interaction with technology is no different. Critically, they also note that technology should not be the complete director of its user’s actions, especially when the user is under observation. Rather, it should offer avenues of interaction that respect agency and lend themselves to unintended, hopefully positive, consequences. With this I wholly agree.

At its core, the authors’ argument is for promoting the evaluation of worldviews, albeit an evaluation focused within the context of technological design, and this is something I wholly support. Indeed, it is something I attempt to do at every opportunity in my own work. In keeping with the theme of reflection, I will admit my preference for this article was likely influenced by the preexisting importance I place on reflective practices. If we do not reflect upon the things that we do, say, and create, we are doomed to drift into self-defeating patterns of action.

Project 2 – Alternative Interface Prototype

Mock Up – Surveillance Jacket

The overall concept I have designed is a jacket integrated with a sensor suite and computer to increase situational awareness and provide surveillance-related functionality. Such a garment would be intended for undercover operations ranging from low to high risk, and would therefore be constructed from Kevlar and other durable materials along with the necessary electronic components. An array of ultrasonic distance sensors would be located along the sides and back of the jacket and used to sense whether the wearer is approached from these vulnerable directions. These would be integrated with other sensors, such as an accelerometer (not included in my diagram), and programmed to ignore scans against stationary objects that are passed while the wearer is in motion. A corresponding feedback array would alert the wearer to the location and distance of incoming contacts. Vibrating motors, or even another wearable such as a smartwatch or a Google Glass-like device, could be used for this purpose, though in my diagrams I have opted for a simple set of four LEDs associated with each sensor. The jacket would also come equipped with a hidden front-facing camera and accompanying microphone. In my design above, the camera is represented by the small black box on the left side. Serving a dual purpose, the microphone could be used both for voice-activated commands affecting the jacket’s functions and for recording surrounding audio. Audio feedback to the user is also a possibility via a separate device or an integrated speaker, which I have included in the diagram.
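As a purely conceptual sketch of how that accelerometer-based filtering might work (none of this is implemented, and the thresholds and the closing-rate heuristic are my placeholders), the idea is to suppress an alert when the wearer’s own motion explains the changing reading:

/*
 * Conceptual sketch only: ignore contacts that are not closing faster than the
 * wearer's own walking pace would explain. Thresholds are placeholder values.
 */
const int ALERT_DISTANCE_IN = 72;        // alert when a contact is within 6 feet
const int CLOSING_RATE_THRESHOLD = 5;    // inches closed per reading that suggests pursuit

bool shouldAlert(int distanceInches, int previousDistanceInches, bool wearerMoving) {
  int closingRate = previousDistanceInches - distanceInches;   // positive = contact getting nearer
  if (wearerMoving && closingRate < CLOSING_RATE_THRESHOLD) {
    return false;                        // likely a stationary object being walked past
  }
  return distanceInches < ALERT_DISTANCE_IN;
}

void setup() {
  Serial.begin(9600);
  Serial.println(shouldAlert(60, 70, true) ? "alert" : "ignore");   // fast-closing contact while walking: alert
  Serial.println(shouldAlert(60, 61, true) ? "alert" : "ignore");   // slowly passing object while walking: ignore
}

void loop() { }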

Because of the jacket’s emphasis on awareness and concealment, the wearer would likely not want to be seen interacting with a secondary device when controlling the jacket’s functions. As such, primary interaction would be handled via haptic pressure sensors within the jacket’s pocket. In my diagram I have used Fritzing’s Force Sensitive Resistors to represent these; in reality they would be Velostat, or a drawing-tablet-like interface that can track both the position of interaction and varying levels of pressure. Various combinations of tapping, holding, sliding, and so on would enable, disable, or adjust the jacket’s many functions, such as activating recording, disabling proximity scanning, or modifying the range at which it alerts the wearer.

Prototype

For my prototype of the surveillance jacket, I focused primarily on the force-sensor interface. As in my concept above, I have represented the sensors in my breadboard and schematic diagrams using FSRs as stand-ins. My material of choice for this sensor was Velostat, which turned out to be particularly challenging to work with. After several iterations based on various examples, I was finally able to create a functioning pressure-sensitive circuit. Unfortunately, the readings given by the sensors were not always accurate, making it difficult to distinguish consistently between light, medium, and hard presses, as well as between tapping the sensors and holding them down.
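One common way to tame noisy analog readings like these, which I did not use in this prototype, is to average several consecutive samples before classifying the press. A minimal sketch, assuming the same A0 wiring used for the first pressure sensor in the code below:

/*
 * Sketch of a possible mitigation (not part of the prototype): average several
 * consecutive samples so a single noisy reading does not change the classification.
 */
const int pressPin = A0;   // first pressure sensor, wired as in the prototype

int smoothedReading(int pin, int samples) {
  long total = 0;
  for (int i = 0; i < samples; i++) {
    total += analogRead(pin);
    delay(2);              // brief gap between samples
  }
  return total / samples;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(smoothedReading(pressPin, 8));   // averaged pressure value
  delay(100);
}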

Initial attempts to create the sensor with two pieces of tin tape at the wire ends and from one to several sheets of Velostat bridging the gap. Several types of diodes were also included based on examples.

First accurate revision, consisting of a single square of Velostat in series with a 10k resistor.

Final version of three separate pressure sensors.

Along with the pressure sensors are an ultrasonic distance sensor, a microphone, and a speaker, plus accompanying LEDs that function as indicator lights. As implemented, the microphone only simulates the act of recording. Two LEDs are associated with the microphone: the green indicates whether it is “recording” and the red whether input is being received. A second red light responds to the ultrasonic sensor: when an object passes within six feet of the sensor, the light activates, and at two feet or less the speaker plays a series of four notes as an alarm. By tapping the Velostat sensors, the microphone, speaker, and ultrasonic sensor can each be turned on or off. Because of how the Velostat works, other functions could also be assigned to holding a sensor individually, or to tapping or holding multiple sensors simultaneously.

Main File

/*
 * Jordan Machalek
 * Project 2
 * Prototype utilizing pressure and ultrasonic sensors to create an interactive
 * proximity sensor device intended to be integrated with a jacket or other
 * outerwear as a wearable device
 * Ultrasonic sensor code adapted from: https://howtomechatronics.com/tutorials/arduino/ultrasonic-sensor-hc-sr04/
 * Microphone code adapted from: https://learn.adafruit.com/adafruit-microphone-amplifier-breakout/measuring-sound-levels
 * Velostat pressure sensor code adapted from: https://gist.github.com/sshadmand/34a2f73e48a731c3c2e86b2171d6c41e
 * Speaker code adapted from: https://www.arduino.cc/en/Tutorial/toneMelody
 */
#include <math.h>
#include "pitches.h"

int myPin = 0;  


// Digital Pins
const int speakerPin = 13;
const int ultraEchoPin = 12;
const int ultraTrigPin = 11;
const int recStatusPin = 9;
const int recInputPin = 7;
const int proxAlertPin = 6;
// Analog Pins
const int pressPin0 = A0;
const int pressPin1 = A1;
const int pressPin2 = A2;
const int micPin = A3;

// Pressure Sensor Values
int press0Value = 0; // First pressure sensor
int press1Value = 0; // Second pressure sensor
int press2Value = 0; // Third pressure sensor
int isTouching = false;
int touchCount0 = 0;
int touchCount1 = 0;
int touchCount2 = 0;

// Mic Variables
boolean isRecording = true;
int micValue = 0;
//unsigned int micValue = 0;
//const int sampleWidth = 50;

// Ultrasonic Sensor Variables
boolean isSensing = true;
long travelTime;
int travelDistance; // measured in inches

// Speaker
boolean isMuted = true;

void setup() {
  // Speaker
  pinMode(speakerPin, OUTPUT);
  // Ultrasonic Sensor
  pinMode(ultraEchoPin, INPUT);
  pinMode(ultraTrigPin, OUTPUT);
  // Microphone
  pinMode(micPin, INPUT);
  // Lights
  pinMode(recStatusPin, OUTPUT);
  pinMode(recInputPin, OUTPUT);
  pinMode(proxAlertPin, OUTPUT);
  // Velostat Pressure Sensors
  pinMode(pressPin0, INPUT);
  pinMode(pressPin1, INPUT);
  pinMode(pressPin2, INPUT);
  
  Serial.begin(9600);
}

// the loop function runs over and over again forever
void loop() {
  // Get readings from Velostat pressure sensors
  press0Value = analogRead(pressPin0);
  press1Value = analogRead(pressPin1);
  press2Value = analogRead(pressPin2);

  // Check proximity sensor
  if(isSensing){
    CheckProximity();  
  }

  if(isRecording){
    Record();
  }

  // Check for input from the velostat sensors
  PressureInputCheck(0, press0Value);
  PressureInputCheck(1, press1Value);
  PressureInputCheck(2, press2Value);
}

void CheckProximity(){
  // Clear ultraTrigPin data
  digitalWrite(ultraTrigPin, LOW);

  // Trigger the sensor with a 10-microsecond pulse
  digitalWrite(ultraTrigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(ultraTrigPin, LOW);

  // Read the ultraEchoPin to get the echo travel time in microseconds
  travelTime = pulseIn(ultraEchoPin, HIGH);

  // Convert to inches: sound travels ~0.0133 in per microsecond; halve for the round trip
  travelDistance = travelTime * 0.0133 / 2;

  // Check if something is within the alert distance (54 inches); if so, alert the user
  if(travelDistance < 54){
    digitalWrite(proxAlertPin, HIGH);

    if(travelDistance < 24 && isMuted == false){
      AudioAlert();
    }
  }
  else{
    digitalWrite(proxAlertPin, LOW);
  }
}

void PressureInputCheck(int ID, int value){
  String touchType = "";
  
  if (value > 90) {
    isTouching = true;

    // Increment count per sensor
    if(ID == 0){
      touchCount0++;
    }
    if(ID == 1) {
      touchCount1++;
    }
    if(ID == 2) {
      touchCount2++;
    }
  } else {
    isTouching = false;
    // Clear count per sensor
    if(ID == 0){
      touchCount0 = 0;
    }
    if(ID == 1) {
      touchCount1 = 0;
    }
    if(ID == 2) {
      touchCount2 = 0;
    }
  }

  if (isTouching && (touchCount0 > 3 || touchCount1 > 3 || touchCount2 > 3)) {
    touchType = "Hold";
  } else if (isTouching) { 
    touchType = "Tap";
  }

  if (value < 90) {
    //Serial.println("Not touched");
  } 
  else if (value < 120) {
    String output = String("Light " + touchType);
    Serial.println(output);
    // Perform action based on the sensor pressed
    switch(ID){
      case 0:
        if(touchType == "Tap"){
          // Toggle Recording
          isRecording = !isRecording;
          
          // Indicate that the speaker is recording
          if(isRecording){
            digitalWrite(recStatusPin, HIGH);
          }
          else {
            digitalWrite(recStatusPin, LOW);
          }
        }
        else {
          
        }
        break;
      case 1:
        if(touchType == "Tap"){
          // Toggle speaker
          isMuted = !isMuted;
        }
        else {
          
        }
        break;
      case 2:
        if(touchType == "Tap"){
          // Toggle proximity sensing
          isSensing = !isSensing;
        }
        else {
          
        }
        break;
    }
  } 
  else if (value < 160) {
    String output = String("Strong " + touchType);
    Serial.println(output);
    // Perform action based on the sensor pressed
    
    // Switch in same format as the first, assigned to different functionality
    // Removed here for length considerations
  } 
  else if (value < 190) {
    String output = String("Hard " + touchType);
    Serial.println(output);
    // Perform action based on the sensor pressed

    // Switch in same format as the first, assigned to different functionality
    // Removed here for length considerations
  }
}

// Simulates the action of recording
void Record(){  
  micValue = digitalRead(micPin);
  //Serial.println("Mic value: " + micValue);
  if(micValue == HIGH) {
    digitalWrite(recInputPin, HIGH);
  }
  else {
    digitalWrite(recInputPin, LOW);
  }
}

// Plays a tone alerting the wearer to a contact within 2 feet
void AudioAlert(){
  int alarm[] = {
    NOTE_FS6, NOTE_C6, NOTE_FS6, NOTE_C6
  };

  int noteLengths[] = {
    6, 6, 6, 6
  };

  for (int i = 0; i < 4; i++) {
    // Calculate time the note should play
    int noteLength = 1000 / noteLengths[i];
    tone(speakerPin, alarm[i], noteLength);

    // Set minimum time between notes
    int pauseTime = noteLength * 1.30;
    delay(pauseTime);
    
    // stop the tone
    noTone(speakerPin);
  }
}

pitches.h File from: https://www.arduino.cc/en/Tutorial/toneMelody

/*************************************************
 * Public Constants
 *************************************************/

#define NOTE_B0  31
#define NOTE_C1  33
#define NOTE_CS1 35
#define NOTE_D1  37
#define NOTE_DS1 39
#define NOTE_E1  41
#define NOTE_F1  44
#define NOTE_FS1 46
#define NOTE_G1  49
#define NOTE_GS1 52
#define NOTE_A1  55
#define NOTE_AS1 58
#define NOTE_B1  62
#define NOTE_C2  65
#define NOTE_CS2 69
#define NOTE_D2  73
#define NOTE_DS2 78
#define NOTE_E2  82
#define NOTE_F2  87
#define NOTE_FS2 93
#define NOTE_G2  98
#define NOTE_GS2 104
#define NOTE_A2  110
#define NOTE_AS2 117
#define NOTE_B2  123
#define NOTE_C3  131
#define NOTE_CS3 139
#define NOTE_D3  147
#define NOTE_DS3 156
#define NOTE_E3  165
#define NOTE_F3  175
#define NOTE_FS3 185
#define NOTE_G3  196
#define NOTE_GS3 208
#define NOTE_A3  220
#define NOTE_AS3 233
#define NOTE_B3  247
#define NOTE_C4  262
#define NOTE_CS4 277
#define NOTE_D4  294
#define NOTE_DS4 311
#define NOTE_E4  330
#define NOTE_F4  349
#define NOTE_FS4 370
#define NOTE_G4  392
#define NOTE_GS4 415
#define NOTE_A4  440
#define NOTE_AS4 466
#define NOTE_B4  494
#define NOTE_C5  523
#define NOTE_CS5 554
#define NOTE_D5  587
#define NOTE_DS5 622
#define NOTE_E5  659
#define NOTE_F5  698
#define NOTE_FS5 740
#define NOTE_G5  784
#define NOTE_GS5 831
#define NOTE_A5  880
#define NOTE_AS5 932
#define NOTE_B5  988
#define NOTE_C6  1047
#define NOTE_CS6 1109
#define NOTE_D6  1175
#define NOTE_DS6 1245
#define NOTE_E6  1319
#define NOTE_F6  1397
#define NOTE_FS6 1480
#define NOTE_G6  1568
#define NOTE_GS6 1661
#define NOTE_A6  1760
#define NOTE_AS6 1865
#define NOTE_B6  1976
#define NOTE_C7  2093
#define NOTE_CS7 2217
#define NOTE_D7  2349
#define NOTE_DS7 2489
#define NOTE_E7  2637
#define NOTE_F7  2794
#define NOTE_FS7 2960
#define NOTE_G7  3136
#define NOTE_GS7 3322
#define NOTE_A7  3520
#define NOTE_AS7 3729
#define NOTE_B7  3951
#define NOTE_C8  4186
#define NOTE_CS8 4435
#define NOTE_D8  4699
#define NOTE_DS8 4978

Exercise 4: Serial Communication

Part 1

Completing this section was straightforward, as both the digital and analog tutorials clearly showed how to construct the necessary circuits. Neither presented any issues, although I did experiment with reversing the value returned by the potentiometer in the analog tutorial by swapping its power and ground connections.
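The same inversion can also be done in software rather than by rewiring, by reversing the output range of map(). A small sketch, assuming the potentiometer is on A0 as in the analog tutorial:

// Software alternative to swapping the potentiometer's power and ground connections:
// map the 0-1023 reading onto a reversed 255-0 output range.
void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(A0);
  int inverted = map(raw, 0, 1023, 255, 0);   // high raw values now give low outputs
  Serial.println(inverted);
  delay(100);
}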

Part 2

For Part 2 I connected three LEDs to the Arduino on pins 11-13. Each is wired in series with a 1k resistor on the long (anode) lead, with the short (cathode) lead going directly to the ground rail on the breadboard.

For the Processing code, I adapted the Simple Write example. In doing so, I expanded the window and added two additional rectangles. Along with this, I created two more checks against the mouse’s position associated with these new rectangles. If the mouse overlaps any of the shapes, the value 1, 2, or 3, respectively, is written to the serial port.

On the Arduino side, pins 11-13 are configured as outputs for the lights. Each iteration, the serial port is checked for data and, if any is available, it is saved. If that data is equal to 1, 2, or 3, the associated light is turned on via digitalWrite().

// Jordan Machalek
// Exercise 4 Part 2
// Arduino Code

// Light Pins
const int light1 = 13;                  
const int light2 = 12;
const int light3 = 11;

char val; // Data received from the serial port

void setup() {
  pinMode(light1, OUTPUT);
  pinMode(light2, OUTPUT);
  pinMode(light3, OUTPUT);
  Serial.begin(9600);             
}

void loop() {
  // Check for serial data
  while (Serial.available()) {
    val = Serial.read();
  }

  if (val == 1) {
    digitalWrite(light1, HIGH);
  }
  else if (val == 2) {
    digitalWrite(light2, HIGH);
  }
  else if (val == 3) {
    digitalWrite(light3, HIGH);
  }
  else {
    // Turn off the lights
    digitalWrite(light1, LOW);
    digitalWrite(light2, LOW);
    digitalWrite(light3, LOW);
  }

  delay(10);
}
/*
 * Jordan Machalek
 * Exercise 4 Part 2
 * Processing Code
 * Based on Simple Write example 
 */


import processing.serial.*;

Serial myPort;  // Create object from Serial class
int val;      // Data received from the serial port



void setup() 
{
  size(600, 200);
  String portName = Serial.list()[0]; // 1 for COM3, 0 is COM1
  myPort = new Serial(this, portName, 9600);
}

void draw() {
  background(255);
  if (mouseOverRect() == 1) {     // If mouse is over the first square,
    fill(204);                    // change its color and
    myPort.write(1);              // send a 1 to indicate the mouse is over it
  } 
  else if (mouseOverRect() == 2){                        
    fill(100);                    
    myPort.write(2);
  }
  else if (mouseOverRect() == 3) {
    fill(300);                   
    myPort.write(3);  
  }
  else {
    fill(0);                     
    myPort.write(0);  
  }
  
  // Draw squares
  rect(50, 50, 100, 100);         
  rect(250, 50, 100, 100);
  rect(450, 50, 100, 100);
}

int mouseOverRect() { // Test if mouse is over square
  if ((mouseX >= 50) && (mouseX <= 150) && (mouseY >= 50) && (mouseY <= 150)){
    return 1;
  }
  else if ((mouseX >= 250) && (mouseX <= 350) && (mouseY >= 50) && (mouseY <= 150)) {
    return 2;
  }
  else if ((mouseX >= 450) && (mouseX <= 550) && (mouseY >= 50) && (mouseY <= 150)) {
    return 3;
  }
  
  return 0;
  // return ((mouseX >= 50) && (mouseX <= 150) && (mouseY >= 50) && (mouseY <= 150));
}

Part 3

To set up the Arduino, I connected three photocells to read analog input. Each was connected directly to the 5V rail of the breadboard via one pin. The other pin of each cell was flanked on one side by a 10k resistor leading back to the ground rail and by a wire leading back to analog pins A0 – A2 on the Arduino.

In code, the value of each photocell is read and then mapped from its raw 0-1024 range to 0-255. First, however, the establishContact() function is used to ensure that a serial connection has been made. Each iteration of the loop also includes a one-second delay (delay(1000)) to limit the rate of sensor readings.

In Processing, the data from the serial port is read in and stored in an array. The values in the array, ranging from 0-255, are then used to manipulate the position and color of a circle and a square drawn to a window, whose background is also changed by these values. The random() function introduces some variability in color and position. Each time three values have been recorded from the serial stream, a message is sent back to the Arduino requesting the next set.

/*
 * Jordan Machalek
 * Exercise 4 part 3
 * Arduino Code
 * Adapted from Processing Simple Read  and Arduino SerialCallResponse examples
 */

// Pin #'s
const int photo1 = A0;
const int photo2 = A1;
const int photo3 = A2;
int inByte = 0;         // incoming serial byte

// Sensor values
int photo1Val = 0;
int photo2Val = 0;
int photo3Val = 0;

void setup() {
  pinMode(photo1, INPUT);
  pinMode(photo2, INPUT);
  pinMode(photo3, INPUT);
  
  Serial.begin(9600);           
      
  establishContact(); 
}

void loop() {
  // if we get a valid byte, read analog ins:
  if (Serial.available() > 0) {
    // get incoming byte:
    inByte = Serial.read();

    // Read photocell values
    photo1Val = analogRead(photo1);
    photo2Val = analogRead(photo2);
    photo3Val = analogRead(photo3);
    
    // Map photocell values
    int redMap = map(photo1Val, 0, 1024, 0, 255);
    int greenMap = map(photo2Val, 0, 1024, 0, 255);
    int blueMap = map(photo3Val, 0, 1024, 0, 255);
    
    // Write to serial port
    Serial.write(redMap);
    Serial.write(greenMap);
    Serial.write(blueMap);
    
    delay(1000);
  }
}

void establishContact() {
  while (Serial.available() <= 0) {
    Serial.print('A');   // send a capital A
    delay(300);
  }
}
/*
 * Jordan Machalek
 * Exercise 4 part 3
 * Processing Code
 * Adapted from Processing Simple Read and Arduino SerialCallResponse examples
 */

import processing.serial.*;

// Serial data
Serial myPort;                     
int[] serialData = new int[3];   
int serialCount = 0;   
boolean firstContact = false;

// Sketch data
int bgColor;          
color fillColor;         
color fillColor2;
int xPos, yPos, xPos2, yPos2; // coordinates of shapes  
          
void setup() 
{
  size(200, 200);
  String portName = Serial.list()[0];
  myPort = new Serial(this, portName, 9600);
}

void draw() {
    background(serialData[0], serialData[1], serialData[2]);
    fill(fillColor);
    
    // Draw shapes
    ellipse(xPos, yPos, 20, 20);
    fill(fillColor2);
    square(xPos2, yPos2, 30);
}

void serialEvent(Serial myPort) {
  // read a byte from the serial port:
  int inByte = myPort.read();
  
  // if this is the first byte received, and it's an A, clear the serial
  // buffer and note that you've had first contact from the microcontroller.
  // Otherwise, add the incoming byte to the array:
  if (firstContact == false) {
    if (inByte == 'A') {
      // Clear data
      myPort.clear();       
      firstContact = true; 
      myPort.write('A'); 
    }
  }
  else {
    // Add the latest byte from the serial port to array:
    serialData[serialCount] = inByte;
    serialCount++;

    // When 3 values have been recorded, do something with them
    if (serialCount > 2 ) {
      // Use
      xPos = serialData[0];
      yPos = serialData[1];
      fillColor = color(serialData[0], serialData[1], serialData[2]);
      
      xPos2 = serialData[0] + (int)random(-10, 10);
      yPos2 = serialData[1] + (int)random(-10, 10);
      fillColor2 = color(serialData[0] + random(-50, 50), serialData[1] + random(-50, 50), serialData[2] + random(-50, 50));

      // print the values (for debugging purposes only):
      println(xPos + "\t" + yPos + "\t" + fillColor);

      // Send a capital A to request new sensor readings:
      myPort.write('A');
      // Reset serialCount:
      serialCount = 0;
    }
  }
}

Part 4

In this part, I connected both a potentiometer and a toggle switch to the Arduino to control video playback. The potentiometer’s center pin is connected to analog pin A0, with its left and right pins going to ground and 5V power. The switch is connected to digital pin 13, to ground via a 10k resistor on an end pin, and to power via its center pin.

On the Arduino, each iteration of the loop records the state of the switch and the value of the potentiometer. The potentiometer value is mapped from its raw 0-1024 range to a range of 1-100. Both this value and the switch state are written to the serial stream.

In Processing, the video library is used to load a local file and play it on loop in a window. Over serial, the value of the potentiometer and the state of the switch are retrieved and stored. Each draw() call, the switch state is checked to either play or pause the video, and the potentiometer value is used to adjust the playback speed.

/*
 * Jordan Machalek
 * Exercise 4 part 4
 */

// Pin #'s
const int switchPin = 13;
const int potentPin = A0;
int inByte = 0;

// Sensor values
int switchState = 0;
int sensorValue = 0;

void setup() {
  pinMode(switchPin, INPUT);
  pinMode(potentPin, INPUT);
  
  Serial.begin(9600);           
      
  establishContact(); 
}

void loop() {
  // if we get a valid byte, read analog ins:
  if (Serial.available() > 0) {
    // get incoming byte:
    inByte = Serial.read();

    // Get switch state
    switchState = digitalRead(switchPin);

    // Read photocell values
    sensorValue = analogRead(potentPin);

    // Map sensor value
    int  sensorMap = map(sensorValue, 0, 1024, 1, 100);
    
    // Write to serial port
    // If the switch is on send 1, if off send 0
    if(switchState == HIGH) {
      Serial.write(1);
    }
    else {
      Serial.write(0);
    }
    // Send the mapped sensor value
    Serial.write(sensorMap);
    
    delay(10);
  }
}

void establishContact() {
  while (Serial.available() <= 0) {
    Serial.print('A');   // send a capital A
    delay(300);
  }
}
/*
 * Jordan Machalek
 * Exercise 4 part 4
 * Adapted from: 
 * https://funprogramming.org/125-Simple-video-player-in-Processing.html
 */

import processing.serial.*;
import processing.video.*;

String VIDEO_PATH = "Downloads/095.MOV";
Movie mov;

// Serial data
Serial myPort;                     
int[] serialData = new int[2];   
int serialCount = 0;   
boolean firstContact = false;

// Sketch data
int movieSpeed = 5;
boolean isPlaying = true;
          
void setup() 
{
  size(640, 360);
  
  String portName = Serial.list()[0];
  
  myPort = new Serial(this, portName, 9600);
  
  frameRate(30);
  mov = new Movie(this, VIDEO_PATH);
  mov.loop();
  mov.speed(5);
  mov.volume(0);
}

void draw() {
  if (mov.available()) {
    mov.read();
  }
  
  image(mov, 0, 0, width, height);
  
  // Pause or play the video
  if(isPlaying == true){
    mov.play();
  }
  else if(isPlaying == false) {
    mov.pause();
  }
  
  // Set the speed of the video
  mov.speed(movieSpeed);
}

void serialEvent(Serial myPort) {
  // read a byte from the serial port:
  int inByte = myPort.read();
  
  // if this is the first byte received, and it's an A, clear the serial
  // buffer and note that you've had first contact from the microcontroller.
  // Otherwise, add the incoming byte to the array:
  if (firstContact == false) {
    if (inByte == 'A') {
      // Clear data
      myPort.clear();       
      firstContact = true; 
      myPort.write('A'); 
    }
  }
  else {
    // Add the latest byte from the serial port to array:
    serialData[serialCount] = inByte;
    serialCount++;

    // When 2 values have been recorded, do something with them
    if (serialCount > 1 ) {
      
      // Pause or play the video
      if(serialData[0] == 1){
        isPlaying = true;
      }
      else if(serialData[0] == 0) {
        isPlaying = false;
      }
      
      // Set the speed of the video
      movieSpeed = serialData[1];
      
      // Print data to console
      println(serialData[0] + "\t" + serialData[1] + "\t");

      // Send a capital A to request new sensor readings:
      myPort.write('A');
      // Reset serialCount:
      serialCount = 0;
    }
  }
}

Part 5

To connect the compass (an LSM303 accelerometer/magnetometer breakout) to the Arduino, power is provided via the 3.3V pin rather than 5V, with ground connected as usual. The module’s SCL pin is wired to analog pin A5 and its SDA pin to analog pin A4.

To interface with the compass I utilized Adafruit’s accelerometer example and combined it with my Part 3 code. This sends mapped X, Y, and Z values from the accelerometer via serial to Processing, which uses the data to manipulate the position and color of a circle and the sketch’s background.

/*
 * Jordan Machalek
 * Exercise 4 Part 5
 * Adapted from: https://github.com/adafruit/Adafruit_LSM303DLHC/tree/master/examples/accelsensor
 */
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_LSM303_U.h>

int inByte = 0;

// Assign a unique ID to the sensor
Adafruit_LSM303_Accel_Unified accel = Adafruit_LSM303_Accel_Unified(45621);

void setup()
{
#ifndef ESP8266
  while (!Serial);     // Pause until the serial console opens
#endif
  Serial.begin(9600);
  Serial.println("Accelerometer Test"); Serial.println("");

  /* Initialise the sensor */
  if(!accel.begin())
  {
    /* There was a problem detecting the LSM303 ... check your connections */
    Serial.println("Ooops, no LSM303 detected ... Check your wiring!");
    while(1);
  }
}

void loop()
{
  /* Get a new sensor event */
  sensors_event_t event;
  accel.getEvent(&event);

  if (Serial.available() > 0) {
    // get incoming byte:
    inByte = Serial.read();
  
    // Debug output (left commented out: these text bytes would be read by the
    // Processing sketch as sensor data and corrupt the mapped values)
    //Serial.print("X Raw: "); Serial.print(accel.raw.x); Serial.print("  ");
    //Serial.print("Y Raw: "); Serial.print(accel.raw.y); Serial.print("  ");
    //Serial.print("Z Raw: "); Serial.print(accel.raw.z); Serial.println("");

    // Map values
    int xValMap = map(accel.raw.x, -1024, 1024, 0, 255);
    int yValMap = map(accel.raw.y, -1024, 1024, 0, 255);
    int zValMap = map(accel.raw.z, -1024, 1024, 0, 255);

    // Send to serial
    Serial.write(xValMap);
    Serial.write(yValMap);
    Serial.write(zValMap);
    
    delay(1000);
  }

}

// Defined for parity with Parts 3 and 4, but not called from setup() in this sketch
void establishContact() {
  while (Serial.available() <= 0) {
    Serial.print('A');   // send a capital A
    delay(300);
  }
}
/*
 * Jordan Machalek
 * Exercise 4 Part 5
 * Processing Code
 * Adapted from Processing Simple Read and Arduino SerialCallResponse examples
 */

import processing.serial.*;

// Serial data
Serial myPort;                     
int[] serialData = new int[3];   
int serialCount = 0;   
boolean firstContact = false;

// Sketch data
int bgColor;          
color fillColor;        
int xPos, yPos;
          
void setup() 
{
  size(200, 200);
  String portName = Serial.list()[0];
  myPort = new Serial(this, portName, 9600);
}

void draw() {
    background(serialData[0], serialData[1], serialData[2]);
    fill(fillColor);
    
    // Draw shapes
    ellipse(xPos, yPos, 20, 20);
}

void serialEvent(Serial myPort) {
  // read a byte from the serial port:
  int inByte = myPort.read();
  
  // if this is the first byte received, and it's an A, clear the serial
  // buffer and note that you've had first contact from the microcontroller.
  // Otherwise, add the incoming byte to the array:
  if (firstContact == false) {
    if (inByte == 'A') {
      // Clear data
      myPort.clear();       
      firstContact = true; 
      myPort.write('A'); 
    }
  }
  else {
    // Add the latest byte from the serial port to array:
    serialData[serialCount] = inByte;
    serialCount++;

    // When 3 values have been recorded, do something with them
    if (serialCount > 2 ) {
      // Use
      xPos = serialData[0];
      yPos = serialData[1];
      fillColor = color(serialData[0], serialData[1], serialData[2]);
      
      // print the values (for debugging purposes only):
      println(xPos + "\t" + yPos + "\t" + fillColor);

      // Send a capital A to request new sensor readings:
      myPort.write('A');
      // Reset serialCount:
      serialCount = 0;
    }
  }
}

Project 1 – Switches & Analogs

Switch

The switch-based portion of my project utilizes a toggle switch, a push button, and an LED to create a device that allows a user to input a pattern that is then repeated back via the light. When the switch is toggled to the “on” (high) position, the Arduino begins to record when the button is pressed, how long it is held, and the time between its release and the next press. When the switch is changed back to the “off” (low) position, the Arduino continually repeats the pattern that was entered. The switch can be turned on again to enter a new pattern.

Both the switch and button are connected to the 5V rail on the breadboard for power, with a second wire connecting to pins 12 and 11, respectively, and a 10k resistor grounding them. The LED connects to pin 13 and is grounded with a 1k resistor.

My initial attempt to program the device worked overall but doesn’t replicate the pattern exactly. Because of how I recorded the time, the gap between when the switch is flipped and when the button is first pressed is used as the initial time for the light to be on. As a result, the entire pattern is offset from there, and the final button press is not recorded.

The original code, which is mostly accurate to the pattern entered.

/*
 * Jordan Machalek
 * Project 1 switch code
 * Version 1
 */

// Pins
const int buttonPin = 11;
const int switchPin = 12;
const int redLedPin= 13;

unsigned long previousTime = 0;
unsigned long onTimes[100];
unsigned long offTimes[100];
int delayCount = 0;

int switchState = LOW; // value of the toggle switch
int prevSwitchState = LOW; // state that the switch was in prior to a new loop() iteration
int buttonState = LOW; // value of the push button
int prevButtonState = LOW; // state that the button was in prior to a new loop iteration

void setup() {
 pinMode(buttonPin, INPUT);
 pinMode(switchPin, INPUT);
 pinMode(redLedPin, OUTPUT);

 Serial.begin(9600);
}

void loop() {
  // get the switch state
  switchState = digitalRead(switchPin);

  // record the time
  unsigned long currentTime = millis();

  // HIGH == record a pattern
  if(switchState == HIGH) {
    // if the switch was just flipped, clear data
    if(prevSwitchState == LOW) {
      // Clear the onTimes array
      for(int i = 0; i < 100; i++) {
        onTimes[i] = 0;
      }
      delayCount = 0;
      previousTime = currentTime;
      // set prevSwitchState to HIGH so this only happens once when the switch is flipped
      prevSwitchState = HIGH; 
    }
    
    // only get the button's state if it is being recorded
    buttonState = digitalRead(buttonPin);

    if(buttonState == HIGH && prevButtonState == LOW) {
      digitalWrite(redLedPin, HIGH); // Turn on the light to show the button was pressed
      // Save the elapsed time - the difference between the current time and the last time the button was pushed
      currentTime = millis();
      onTimes[delayCount] = (currentTime - previousTime);
      // set prevButtonState to HIGH so the next loop will record the time difference
      // between the button being depressed and being released
      prevButtonState = HIGH; 
    }
    else if(buttonState == LOW && prevButtonState == HIGH){
        offTimes[delayCount] = (currentTime - previousTime);
        delayCount++;
        previousTime = currentTime; // Update the previous time
        prevButtonState = LOW;
        digitalWrite(redLedPin, LOW);
    }
  } 
  else { // LOW == play the pattern back
    Serial.println("Loop start");
    for(int i = 0; i < delayCount; i++) {
      Serial.println(i);
      digitalWrite(redLedPin, HIGH);
      Serial.print("onTime: ");
      Serial.println(onTimes[i]);
      delay(onTimes[i]);
      digitalWrite(redLedPin, LOW);
      Serial.print("offTime: ");
      Serial.println(offTimes[i]);
      delay(offTimes[i]);
    }
    // Give a 4s buffer before restarting the pattern
    Serial.println("Wait 4s");
    delay(4000);
    // 
    prevSwitchState = LOW;
  }
}

A revised version that attempts to correct how the times between button presses and releases are recorded.

/*
 * Jordan Machalek
 * Project 1 switch code
 * Version 2
 */

// Pins
const int buttonPin = 11;
const int switchPin = 12;
const int redLedPin= 13;

unsigned long currentOnTime = 0;
unsigned long currentOffTime = 0;
unsigned long previousOnTime = 0;
unsigned long previousOffTime = 0;
unsigned long onTimes[100];
unsigned long offTimes[100];
int delayCount = 0;

int switchState = LOW; // value of the toggle switch
int prevSwitchState = LOW; // state that the switch was in prior to a new loop() iteration
int buttonState = LOW; // value of the push button
int prevButtonState = LOW; // state that the button was in prior to a new loop iteration

void setup() {
 pinMode(buttonPin, INPUT);
 pinMode(switchPin, INPUT);
 pinMode(redLedPin, OUTPUT);

 Serial.begin(9600);
}

void loop() {
  // get the switch state
  switchState = digitalRead(switchPin);

//  currentOnTime = millis();
//  currentOffTime = millis();
  
  // HIGH == record a pattern
  if(switchState == HIGH) {
    // if the switch was just flipped, clear data
    if(prevSwitchState == LOW) {
      // Clear the onTimes array
      for(int i = 0; i < 100; i++) {
        onTimes[i] = 0;
      }
      delayCount = 0;
      currentOnTime = 0;
      currentOffTime = 0;
      previousOnTime = 0;
      previousOffTime = 0;
      
      // set prevSwitchState to HIGH so this only happens once when the switch is flipped
      prevSwitchState = HIGH; 
    }
    
    // only get the button's state if it is being recorded
    buttonState = digitalRead(buttonPin);

    if(buttonState == HIGH && prevButtonState == LOW) { // Button was just pressed
      
      // Save the time that the button was pressed at
      // On the first iteration, the prevOnTime will be 0. So: current - prev = current
      currentOnTime = millis();
      previousOffTime = currentOffTime;

      // set prevButtonState to HIGH so the next loop will record the time difference
      // between the button being depressed and being released
      prevButtonState = HIGH; // record that the button was pressed
      digitalWrite(redLedPin, HIGH); // Turn on the light to show the button was pressed
    }
    else if(buttonState == HIGH && prevButtonState == HIGH) { // Button is being held
      // Update the time
      currentOnTime = millis();
    }
    else if(buttonState == LOW && prevButtonState == HIGH){ // Button was just released
        // Save the duration that the button was pressed
        onTimes[delayCount] = (currentOnTime - previousOnTime);
        
        previousOnTime = currentOnTime; // Update the previous on time
        
        delayCount++;
        prevButtonState = LOW; // record that the button was released
        digitalWrite(redLedPin, LOW); // Turn off the light to show the button was released
    }
    else if (buttonState == LOW && prevButtonState == LOW) { // Button has not been pressed
      // Update the time
      currentOffTime = millis();

      // Save the duration that the button was not pressed
      offTimes[delayCount] = (currentOffTime - previousOffTime);     
    }
  } 
  else { // LOW == play the pattern back
    Serial.println("Loop start");
    for(int i = 0; i < delayCount; i++) {
      Serial.println(i);
      
      // Turn on
      digitalWrite(redLedPin, HIGH);
      Serial.print("onTime: ");
      Serial.println(onTimes[i]);
      delay(onTimes[i]);
      
      // Turn off
      digitalWrite(redLedPin, LOW);
      Serial.print("offTime: ");
      Serial.println(offTimes[i]);
      delay(offTimes[i]);
    }
    
    // Give a 4s buffer before restarting the pattern
    Serial.println("Wait 4s");
    delay(4000);
    
    // Note that the switch has been turned off
    prevSwitchState = LOW;
  }
}

Analog

For my analog device, I utilized a combination of photocells and an RGB LED. These components are set up so that the values recorded from the first through third cells map to the red, green, and blue channels of the light.

The three short pins of the LED are wired into PWM pins 9-11 on the board, with a 1k resistor included in the circuit. Because the LED I am using is of the common anode type, the long pin is connected to the 5V power rail on the breadboard; if it were a common cathode LED, it would instead connect to the ground rail. As for the photocells, each is connected between the 5V rail and a rail of its own. A 10k resistor grounds each cell, while a wire runs back to analog pins A0-A2.

In the code, constants are used for each of the three digital and three analog pins. A further three variables hold the values recorded from the photocells. Each iteration of the loop, the value from each cell is first read using analogRead(). These values are then mapped from the range of 0-1024 to the range of 0-255 to correspond with the values that can be passed to each pin of the RGB LED. Finally, the mapped values are applied to the light using three calls to analogWrite() within the setLED() function.

/*
 * Jordan Machalek
 * Project 1 analog sensor code
 */

// Pins
const int redLightPin= 11;
const int greenLightPin = 10;
const int blueLightPin = 9;
const int redSensorPin = A0;
const int greenSensorPin = A1;
const int blueSensorPin = A2;

// Analog sensor values
int redSensorVal = 0;
int greenSensorVal = 0;
int blueSensorVal = 0;

void setup() {
  pinMode(redLightPin, OUTPUT);
  pinMode(greenLightPin, OUTPUT);
  pinMode(blueLightPin, OUTPUT);
}

void loop() {  
  // Get values from the photocells
  redSensorVal = analogRead(redSensorPin);
  greenSensorVal = analogRead(greenSensorPin);
  blueSensorVal = analogRead(blueSensorPin);

  // Map sensor values (0-1023) to LED values (0-255)
  int redMap = map(redSensorVal, 0, 1023, 0, 255);
  int greenMap = map(greenSensorVal, 0, 1023, 0, 255);
  int blueMap = map(blueSensorVal, 0, 1023, 0, 255);

  // Apply color to light
  setLED(redMap, greenMap, blueMap);

  // Keep the same color for 2 seconds
  delay(2000);
}

// Common anode LED: the pins sink current, so the PWM values are inverted
// (writing 0 gives full brightness, 255 turns the diode off)
void setLED(int redVal, int greenVal, int blueVal) {
  analogWrite(redLightPin, 255 - redVal);
  analogWrite(greenLightPin, 255 - greenVal);
  analogWrite(blueLightPin, 255 - blueVal);
}

Reading Set 1

From Rituals to Magic: Interactive Art and HCI of the Past, Present, and Future by Jeon et al.

Within the article I found the section on interaction and interactivity to be the most interesting, specifically the explanation of the distinction between “quasi interactivity” and “full interactivity.” Prior to this I had not considered categorizing interaction based upon whether previous communication had occurred. These different forms of interaction also made me think about how we develop AI, which can either be strictly reactive or be built to accumulate a knowledge base and draw from it in future interactions.

Questions about how much an artist’s work advances the field of art, brought up within the exploration of creativity, also held my interest. Work that does not push the boundaries of a field is likely to be overlooked. The connection between computer science and artistic practices was yet another key point. Although game and software development and artistic endeavors have led me to use the three creative thinking tools listed time and again, I had never made the connection between their use in the two areas.

Overall, I found the article to have a valuable focus on how the practices within HCI and interactive art can improve each other. However, I also think that much of what the authors had to say could have been stated more succinctly and that they were overly reliant on their sources.

Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms by Ishii and Ullmer

I find that some of what the authors present in the article has potential, if a limited one. Quite obviously, their idea of “interactive surfaces” has come to fruition in an improved way in the form of touch screens. However, the other goals of “coupling bits and atoms” and “ambient media” seem to me to be an impractical pipe dream. Much of our digital technology has been created to take the place of analogue tools because the analogue was less efficient or intuitive, while Ishii and Ullmer looked to reintroduce these elements.

The authors mention that “in the real world, when a process that was not a focus of attention catches our interest, we are often able to seamlessly integrate it into our activity.” In response, I ask how our interactions with current digital interfaces are any less part of the “real world” than the ways they suggest we will interact with Tangible User Interfaces. I struggle to see what real-world applications many of the proposed versions of tangible interfaces would have.

The concepts the article presents cause me to wonder what long-term consequences there may be for human cognitive function, specifically our ability to focus our attention, should the authors’ vision of bringing “background bits” to the “periphery of human perception” become reality. To me, Tangible User Interfaces, at least on the scale of the ambient media idea, mostly seem like a pathway to quick and substantial sensory overload.

Exercise 3 – Sensors & I/O

Part 1

Setting up the initial version of the button example was simple. The only issue I ran into was accidentally using a switch at first instead of the button because of the appearance of the button in the Fritzing diagram, which meant that the light would only turn on for an instant when the switch was toggled. After that, however, everything worked fine. To rewire with different outputs, I saved wires by running the 10k resistor directly to the ground rail from the button rail as well as putting the cathode of the LED into the ground rail. I also changed the pins to 8 and 10 and made the associated edits to the code.

const int buttonPin = 10;
const int ledPin = 8;
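
The rest of the sketch followed the standard Button example logic; a minimal version with these pin assignments, assuming the pull-down wiring described above, might look like this:

int buttonState = 0; // current reading of the pushbutton

void setup() {
  pinMode(ledPin, OUTPUT);   // LED as output
  pinMode(buttonPin, INPUT); // pushbutton as input (external 10k pull-down to ground)
}

void loop() {
  // read the pushbutton and mirror its state on the LED
  buttonState = digitalRead(buttonPin);

  if (buttonState == HIGH) {
    digitalWrite(ledPin, HIGH); // button pressed: light on
  }
  else {
    digitalWrite(ledPin, LOW);  // button released: light off
  }
}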

Part 2

For this part I added a second, green LED in addition to the red LED, attached in the same way with the cathode directly in the ground rail. As for functionality, I first added the necessary code to interact with the second LED. This time, the green LED activates whenever the button is pressed. In addition, as long as the button is held for at least two seconds, the red light will also activate and remain on for two seconds after the button is released.

const int buttonPin = 12;
const int redLedPin = 9;
const int greLedPin = 8;
int redState = LOW;
int greState = LOW;

int buttonState = 0; // variable for reading the pushbutton status

unsigned long previousTime = 0;
const int timeDelay = 2000;

void setup() {
  // initialize the LED pins as outputs:
  pinMode(redLedPin, OUTPUT);
  pinMode(greLedPin, OUTPUT);
  // initialize the pushbutton pin as an input:
  pinMode(buttonPin, INPUT);
}

void loop() {
  // read the state of the pushbutton value:
  buttonState = digitalRead(buttonPin);
  unsigned long currentTime = millis();

  // Turn on the green light while the button is pressed
  if(buttonState == HIGH){
    digitalWrite(greLedPin, HIGH);
  }
  else {
    digitalWrite(greLedPin, LOW);
  }

  if(currentTime - previousTime >= timeDelay){
    previousTime = currentTime;
    // check if the pushbutton is pressed.
    if (buttonState == HIGH) {
      digitalWrite(redLedPin, HIGH);
    }
    else {
      digitalWrite(redLedPin, LOW);
    }
  }
}

Part 3

For my modification of the analog basics example, I chose to replace the potentiometer with a photoresistor in order to retrieve analog input. Along with this, I connected a switch for digital input. Using an if() statement to check the state of the switch, I made it so that the light would only blink when the switch is turned on. With the addition of the photoresistor, the light now blinks faster the less light reaches the sensor and slower as more light reaches it.

const int switchPin = 11;
const int sensorPin = A3;
const int ledPin = 13;
int sensorValue = 0; // value of the photoresistor
int switchState = 0;

void setup() {
  pinMode(switchPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // Get the state of the switch
  switchState = digitalRead(switchPin);

  // Check if the switch is on
  if(switchState == HIGH){
    sensorValue = analogRead(sensorPin);
    digitalWrite(ledPin, HIGH);
    delay(sensorValue);
    digitalWrite(ledPin, LOW);
    delay(sensorValue);
  }
}

Part 4

In completing this section, I largely followed the provided voltage divider example. Since I did not have a force sensitive resistor, I instead used a thermistor. Once the thermistor, potentiometer, and photoresistor were connected to the analog input, I added the three LEDs via digital pins 11-13. To control the blinking rate of each light, I adapted code from the AnalogBasics example to relate the analog values to the length of delay affecting the light states.

const int knobPin = A0;
const int photoPin = A3;
const int thermalPin = A5;
int knobValue = 0;
int photoValue = 0;
int thermalValue = 0;
int firstPin = 13;
int secondPin = 12;
int thirdPin = 11;

void setup() {
  pinMode(firstPin, OUTPUT);
  pinMode(secondPin, OUTPUT);
  pinMode(thirdPin, OUTPUT);
  pinMode(knobPin, INPUT);
  pinMode(photoPin, INPUT);
  pinMode(thermalPin, INPUT);
}

void loop() {
  // read each sensor's value (0-1023)
  knobValue = analogRead(knobPin);
  photoValue = analogRead(photoPin);
  thermalValue = analogRead(thermalPin);

  // Set the potentiometer light's rate
  digitalWrite(firstPin, HIGH);
  delay(knobValue);
  digitalWrite(firstPin, LOW);
  delay(knobValue);

  // Set the photoresistor light's rate
  digitalWrite(secondPin, HIGH);
  delay(photoValue);
  digitalWrite(secondPin, LOW);
  delay(photoValue);

  // Set the thermistor light's rate
  digitalWrite(thirdPin, HIGH);
  delay(thermalValue);
  digitalWrite(thirdPin, LOW);
  delay(thermalValue);
}

Analog Output

Setting up a potentiometer to control the PWM rate was straightforward. All that was required was the addition of the control wired into the power/ground rails and an analog input pin on the board, leaving the LED as it was for the initial Fade setup. With the appropriate variables to track the analog value, the next step was to map the analog values to PWM. This took me more time than it should have, as I first attempted to implement a mapping formula by hand before thinking to check for a mapping function (map()) in the Arduino language reference. Despite attempting to follow several guides, I was unable to figure out an implementation of a low-pass filter to control the fade rate.

int brightness = 0;        // how bright the LED is
int fadeAmount = 5;        // how many points to fade the LED by each step
int fadeDirection = 1;     // 1 = fading up, -1 = fading down
const int ledPin = 11;     // LED pin, this pin must be a PWM-capable pin
const int analogPin = A0;  // pin for potentiometer
int sensorValue = 0;

void setup() {
  // declare the potentiometer pin as an input and the LED pin as an output:
  pinMode(analogPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // set the brightness of the LED pin:
  analogWrite(ledPin, brightness);

  sensorValue = analogRead(analogPin);

  // Map the sensor value (0-1023) to a fade step (1-25 chosen here)
  // so the potentiometer controls the fade rate
  fadeAmount = map(sensorValue, 0, 1023, 1, 25);

  // change the brightness for next time through the loop:
  brightness = brightness + (fadeDirection * fadeAmount);

  // reverse the direction of the fading at the ends of the fade:
  if (brightness <= 0 || brightness >= 255) {
    brightness = constrain(brightness, 0, 255);
    fadeDirection = -fadeDirection;
  }

  // wait for 30 milliseconds to see the dimming effect
  delay(30);
}
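
As a side note, one simple form of low-pass filter that could be used to smooth the potentiometer reading is an exponential moving average, where each new reading only nudges a running value. This is only a sketch of the idea rather than part of the exercise, and the 0.1 smoothing factor is an arbitrary choice:

const int analogPin = A0; // potentiometer pin
float filteredValue = 0;  // running, smoothed copy of the reading

void setup() {
  pinMode(analogPin, INPUT);
}

void loop() {
  int rawValue = analogRead(analogPin); // 0-1023

  // Exponential moving average: keep 90% of the old value and
  // blend in 10% of the new reading. Smaller factors give smoother,
  // slower-moving output.
  filteredValue = 0.9 * filteredValue + 0.1 * rawValue;

  // filteredValue could now be mapped to a fade step in place of the raw reading
  delay(30);
}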

Exercise 2 – Blink

Part 1

At first I was somewhat intimidated to get started, as the tutorial and schematic showed the use of a resistor. Despite comparing my materials to descriptions on the Electronics Club component list, I was unsure whether I had the correct resistor. After reading the comments in the code from the repository, however, I realized I could simply connect an LED directly between pin 13 and the ground pin, and I attached it that way. From there it was as simple as connecting the board via USB, selecting the COM port, and uploading the code.

Part 2

Changing the timing for the light was incredibly simple. All that was required was changing the values for the onTime and offTime variables.
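
For reference, the core of that sketch (which may differ slightly from the repository version; the timing values here are just examples) amounts to the following:

const int ledPin = 13; // LED connected to pin 13
int onTime = 1000;     // milliseconds the light stays on
int offTime = 500;     // milliseconds the light stays off

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  digitalWrite(ledPin, HIGH); // light on
  delay(onTime);
  digitalWrite(ledPin, LOW);  // light off
  delay(offTime);
}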

Part 3

Realizing that I now needed to use resistors to set up multiple lights, some further research revealed that I did in fact have the ones I needed. With this new knowledge I set about connecting my breadboard to the Arduino using cables from the 5V and ground power terminals on the Arduino to the positive and negative lines on the board. Two cables were also used to connect two digital pins on the Arduino to the long pins of each of the lights. The resistors were then used to connect the short pins to the ground/negative line on the breadboard.

As for the code, I created a second variable to reference the new pin being used for the second light. From here all that was required was adding a second instance of digitalWrite() high and low to the loop for that second light. I also removed any delay after the low write so that one of the lights would be on at all times.
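
A minimal version of that two-light loop, with illustrative pin numbers and timing, might look like this:

const int firstLedPin = 13;  // first LED (illustrative pin choice)
const int secondLedPin = 12; // second LED (illustrative pin choice)
int onTime = 1000;           // milliseconds each light stays on

void setup() {
  pinMode(firstLedPin, OUTPUT);
  pinMode(secondLedPin, OUTPUT);
}

void loop() {
  // First light on, second off
  digitalWrite(firstLedPin, HIGH);
  digitalWrite(secondLedPin, LOW);
  delay(onTime);

  // Swap the lights, with no delay after the low write
  // so that one light is always on
  digitalWrite(firstLedPin, LOW);
  digitalWrite(secondLedPin, HIGH);
  delay(onTime);
}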

Part 4

For adding a third light the process was the same as adding the second. The addition of the light, a third cable, and resistor completed the physical portion. The code also simply needed an additional variable to reference the pin, a new line to set the pinMode in setup(), and additional digital writes in the loop. Finally, reversing the sequence was as simple as swapping the value assigned to the first and third pin variables.

Part 5

The message which I chose to implement was the standard SOS pattern in Morse code, which follows the pattern “... --- ...”. To do this, I used three lights, with one light relaying each letter. The code followed the same pattern for each light, turning the light on and off with digitalWrite() three times, with a delay after each write. As the first and last letters are the same, they shared the same, shorter delay value, while the middle light used a longer delay.

I recorded a video of this but no matter what I tried WordPress refused to upload it.
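
For reference, a minimal sketch of that SOS pattern, with illustrative pin numbers and timing values and a small helper function added here for brevity, might look like the following:

const int firstLedPin = 13;  // S: three dots (illustrative pin)
const int secondLedPin = 12; // O: three dashes (illustrative pin)
const int thirdLedPin = 11;  // S: three dots (illustrative pin)
const int dotTime = 250;     // duration of a dot in milliseconds (example value)
const int dashTime = 750;    // duration of a dash in milliseconds (example value)

void setup() {
  pinMode(firstLedPin, OUTPUT);
  pinMode(secondLedPin, OUTPUT);
  pinMode(thirdLedPin, OUTPUT);
}

// Blink one light three times with the given on/off duration
void blinkLetter(int pin, int duration) {
  for (int i = 0; i < 3; i++) {
    digitalWrite(pin, HIGH);
    delay(duration);
    digitalWrite(pin, LOW);
    delay(duration);
  }
}

void loop() {
  blinkLetter(firstLedPin, dotTime);   // S
  blinkLetter(secondLedPin, dashTime); // O
  blinkLetter(thirdLedPin, dotTime);   // S
  delay(2000);                         // pause before repeating the message
}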

Part 6

  1. In order to cause the light to blink without using delay() I worked based upon the Blink Without Delay tutorial. I added two new global variables, one to keep track of an elapsed time and one to hold a max delay time. On each iteration of the loop, the current time is saved using the millis() function. The previous time is subtracted from this value and, if the difference is greater than the max delay time, the previous time is set to the current time and the state of the light is swapped to whatever it is not (high or low) using a nested if() statement. The light state is saved using another global variable. Finally, digitalWrite() is used to turn the light on or off. (A sketch of this approach follows the list.)
  2. As before, the only change required to make the light blink at a different rate was changing the delay value.
  3. The code for the second light required the addition of global variables for the pin and its state. Making the lights alternate was as simple as setting the second light’s state to LOW when the first is set to HIGH within the nested if(). Another digitalWrite() for the second light is also included.
  4. Making the three lights blink in sequence required slight modifications to the if() statements of the loop. The first check now tests whether both the first and second lights are off; if so, it turns the first on and the second and third off. The next check simply tests whether the first light is on and, if so, turns it off and the second on. Finally, an else block turns the second off and the third on.
  5. Following the same three-light setup as I did for the delay() implementation, getting them to display a message required a variable to keep track of the max number of times I wanted each light to blink and one to track which light should currently be blinking. Two more were used to hold the durations that lights should remain on for dots and dashes. From there I simply checked which light was current, compared the current time minus the previous time against either the value for a dot or a dash, and if it was greater turned on the light and incremented the blink count. Once the light had blinked the needed number of times, the light count was incremented to move on to the next light and the blink count was reset.
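
As a reference for the approach described in item 1, a minimal single-light version (pin and timing values are illustrative) might look like this:

const int ledPin = 13;               // LED pin (illustrative)
int ledState = LOW;                  // current state of the light
unsigned long previousTime = 0;      // last time the light was toggled
const unsigned long maxDelay = 1000; // toggle interval in milliseconds (example value)

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  unsigned long currentTime = millis();

  // If enough time has passed, record the new time and flip the light's state
  if (currentTime - previousTime >= maxDelay) {
    previousTime = currentTime;

    if (ledState == LOW) {
      ledState = HIGH;
    }
    else {
      ledState = LOW;
    }

    digitalWrite(ledPin, ledState);
  }
}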