[icm final] True Love Tinder Robot v1

2015-11-30 22.37.20

Here we are, at the end of ICM! How time flies.

This is the first prototype of my ICM/pcomp final project, the True Love Tinder Robot. The fabrication and wiring aren't quite done yet, but since this is a post about my ICM documentation, I'll focus here on the code, which is further along.

Here’s how it works:

You put the phone down in front of the robot hand with Tinder open. Then you put your hands on the sensors. The robot will begin talking, telling you that you have a few seconds to look carefully at each Tinder profile. As you're looking, the robot reads your heart's desire through the sensors and then swipes on your phone for you*, all the while announcing what it's doing.

*(okay the physical swiping part isn’t done, but will be next week!)

2015-12-02 15.53.48

Now, about the code:

One of the biggest coding challenges in this project for me was working with counting and timing. A number of things in the project depend on time: how long it takes to get a reading, how long you have before the robot swipes, and how long the sensors can go without a reading before the robot decides the user has left and the sequence should restart for the next user. In Arduino, the only built-in timer is millis(), which counts the milliseconds since the program started. Therefore there’s no simple way to say, “when this event happens, do something for five seconds, and then do something else for three seconds, and then start over.”

I did a few things to solve this problem. The first was that I turned my series of convoluted if statements into a finite state machine, which made it much easier to organize.

//state machine
#define waiting 1
#define checkIfNewUser 2
#define newUser 3
#define giveIntro 4
#define beginSwiping 5
#define calculate 6
#define changeHasSwipedLeft 7
#define changeHasSwipedRight 8
#define swipeLeft 9
#define swipeRight 10
#define getNoSensorTime 11
#define ending 12
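
The state machine itself is just a switch statement inside loop(): each case does its work and then sets state to the next value. Here's a simplified sketch of the pattern (not the full project code — handsOnSensors() is a hypothetical stand-in for the real sensor check):

int state = waiting;

bool handsOnSensors() {
  return analogRead(A0) > 200; //hypothetical check: is someone touching the sensors?
}

void setup() {}

void loop() {
  switch (state) {
    case waiting:
      if (handsOnSensors()) {
        state = checkIfNewUser; //hands detected, move on
      }
      break;
    case checkIfNewUser:
      //...and so on, one case per state defined above,
      //each one doing its work and then setting the next state
      break;
  }
}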

Another was saving a millis() value when something crucial happened, and then taking the difference between that and the current millis() to act as a timer.

static unsigned long ts; //the saved time (in seconds) of the last crucial event
unsigned long timePassed = (millis() / 1000) - ts; //the amount of time passed is the current time minus the last time saved
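
Then, whenever the event I care about happens, I reset ts to the current time, so timePassed starts counting up from zero again. A rough sketch of the pattern (the function names and the five-second figure are just for illustration):

void startLookingAtProfile() {
  ts = millis() / 1000; //save the moment the user starts looking at a new profile
}

void checkProfileTimer() {
  unsigned long timePassed = (millis() / 1000) - ts;
  if (timePassed >= 5) {
    //five seconds of looking are up: time to read the sensors and swipe
  }
}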

And I used boolean flags to mark when an event happened.

    case changeHasSwipedLeft:
      hasSwipedLeft = false; //clear the flag so swipeLeft runs its actions only once
      state = swipeLeft;
      break;

    case swipeLeft:
      hasSwiped = true;
      if (timePassed >= 6 && hasSwipedLeft == false) {
        emic.speak(swipeLeftWords[iLeft]); //announce the rejection
        iLeft++;
        hasSwipedLeft = true; //mark that this swipe already happened
        handServo.write(servoLeft); //move the hand to swipe left
        digitalWrite(swipeLeftRedLed, HIGH);
        digitalWrite(swipeRightGreenLed, LOW);
      }
      if (timePassed >= 9) {
        state = beginSwiping; //after a pause, go back for the next profile
      }
      break;

Another thing I thought about a lot was the personality of the robot. It was important to me that the robot speaks to you, both to explain what’s going on and to add to the experience. I was inspired by GLaDOS from the video game Portal, an AI character who is passive-aggressive, dark, and very funny. The lines I wrote for the robot were an attempt to give a similar sense of character, even in a short interaction.

A first prototype of my creepy robot


Overall, I’m happy with the way this is turning out. The physical part isn’t finished yet, but I’m glad that the code is doing what I want it to. I learned a lot during the process, not just about specific coding techniques, but also about how to find and use libraries, upload to GitHub, and generally Google around for answers to my many questions.

My code is on GitHub.

https://github.com/nicolehe/true-love-tinder-robot

[icm week 10] Final project idea: True Love Tinder Robot

INTRODUCTION

tinder_match

Two things.

Number one: it’s common these days for people to use technology to look for sex and love. Some stats show that about 1 in 4 couples have met online. Dating sites and apps like OkCupid and Tinder give us the tools to find the people we are interested in, but we still strongly feel that the choices we make in our partners are our own. It’s not a stretch to think that as the algorithms get better and the technology advances, they’ll do an even better job of matching us with potential partners.


Number two: it’s also increasingly acceptable to buy consumer devices that tell us what’s going on in our bodies. The Nike FuelBand, the Fitbit, and now the Apple Watch read our biometric data, tell us about our exercise habits, and in turn suggest how we can improve ourselves.

Maybe it’s not a stretch to think that we’ll eventually combine these two things.

My final project idea, as mentioned in previous posts, is a True Love Tinder Robot. I want to explore the idea that the computer knows us better than we know ourselves, and that it has a better authority on who we should date than we do. Before I go into the specifics of my own project, here are a few examples of similar works other people have done.

INSPIRATION

1) Social Turkers: Crowdsourced Dating (2013) by Lauren McCarthy

Screenshot 2015-11-12 10.34.51

Lauren, who is also my teacher for this class (hi Lauren!), did a project where she went on dates that were live-streamed in real time to Mechanical Turk workers, who gave her feedback as the dates went on.

2) Lonely Sculpture (2014) by Tully Arnot

Tully made a finger that just says yes to every Tinder profile it comes across. The lifelike, grotesque finger in this project is quite evocative and gave me inspiration to use a lifelike hand for my own robot.

3) Tender (2015) by Cors Brinkman, Jeroen van Oorschot, Marcello Maureira, and Matei Szabo


Tender is literally a piece of meat that swipes right.

MY PROJECT

IMG_0935

Short description:

The True Love Tinder Robot reads your true heart’s desire as you are looking at Tinder profiles, and then physically swipes on your phone for you. Literally put your love life in a robot’s hands.

Here’s the idea of how it works in its current iteration:

I’m planning on making a lifelike robotic hand. The hand will sit on a box housing LEDs in a heart shape. In front of the hand will be a stand to hold a phone. There will also be a speaker on or next to the box.

The user puts their phone in front of it, with the Tinder app open. The user then puts their hand on the sensors next to the box. When the robot is ready, the LEDs will light up and the robot will say “I’m ready to find you love” (or something like that). The robot will also explain that you have five seconds to look at each profile, and then it will read your true heart’s desire and swipe accordingly. It then begins to swipe.

Based on the biometric data from the sensors as the user looks at each profile, the robot will decide whether or not the user should date that person. When the robot thinks the user is not interested, the hand will swipe left, the LEDs will turn red, and the robot will announce that they are not a good match. If, on the other hand, the robot thinks the user should say yes to the person they are looking at, the hand will swipe right, the LEDs will turn green, and the robot will announce that it’s a match.
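
I haven’t picked the exact sensor or threshold yet, but the decision itself will probably boil down to something simple like this (a rough sketch with made-up pin and threshold values, not final code):

const int sensorPin = A0; //GSR or heart rate sensor, still to be decided
const int excitedThreshold = 600; //made-up value; the real one will need calibration

void decideSwipe() {
  int reading = analogRead(sensorPin);
  if (reading > excitedThreshold) {
    //interested: the hand swipes right, the LEDs turn green, the robot announces a match
  } else {
    //not interested: the hand swipes left, the LEDs turn red
  }
}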

Throughout the process, the robot will say things to reassure you that you can trust it.

PROGRESS

Last week I tested a camera and an emotion-reading library as a potential sensor. It was interesting, but I decided that I won’t use a screen because it’s distracting, and I probably won’t use a camera.

I haven’t decided yet which biometric sensor to use, so this week I’ll have to test the heart rate monitor, as well as try a GSR sensor.

The bulk of the ICM-related work will be, I think, working with audio and making sure the robot responds appropriately.

GOALS & QUESTIONS

I want this project to be sort of amusing, kind of creepy, and slightly embarrassing. I want users to feel a tension between the robot’s assurance that it knows best and their own uncertainty about whether to trust it. I want them to question whether or not we should let a computer make intimate decisions for us.

Some questions:

  • How can I make the robot evoke the feeling that it has authority? (Besides calling it a robot.)
  • How can I make the experience interesting for spectators as well as the user?
  • What type of things should the robot “say?”
  • How should the interaction end?

 

[pcomp week 10] Final project BOM, system diagram, and plan

 

 


We did some playtesting last week for our final projects, which was quite helpful. I got a lot of interesting feedback about my True Love Tinder Robot.

For testing purposes, I had one of the few sensors I was planning on using (semi-)working — the camera. I used my computer screen mostly to indicate what would happen with different facial expression readings, and then I manually moved my stand-in robot hand.

IMG_0866

Although I didn’t set up a perfect test, it was helpful to hear what people had to say and what other ideas they suggested. One thing I realized immediately was that having a second screen is too distracting, as users are supposed to be looking at the phone the whole time. I’ll likely replace that with audio output, so the robot speaks rather than showing something on a screen.

A big question that came up was the personality of the robot, and generally how the robot will be presented. There was a suggestion to build an entire robot with a body and a face, which sounds interesting but… might be beyond my scope for now.

I’m now thinking about how much “personality” I can convey in the material of the hand and in how the robot speaks. Lots of work to be done!

Below are my initial BOM, plan, and system diagram.

 

 

 


[pcomp week 7] Space Laser! (midterm)

 

For our Physical Computing midterm project, Yiting and I made a game called Space Laser!. (The critical exclamation point was added at the suggestion of Tom.)

HOW IT WORKS

You have 30 seconds to hit all five aliens using your laser. When you successfully hit one, their eyes will dim and they’ll make a dying sound. When you only have 10 seconds left, the eyes will start to blink so that you know you’re running out of time.

IDEA

We originally came up with an entirely different idea for a game, but after some experimentation, decided that we wanted to do something with two servos on top of each other. This formed the basis for our laser pointer. But what good is a laser if you don’t have something to shoot at?

Everyone knows that aliens are the natural enemy of lasers, so we decided to make a space-themed game where you’d shoot a laser at aliens, who would react in various ways when they got hit.

There were two main parts of the project: the laser and the target board. We prototyped both pieces before we made the final versions.

LASER

One of the first things we did was attach a laser diode to a servo on top of another servo. One servo moved in the x-axis and the other moved in the y-axis, so when combined, they allowed a wide range of movement for the laser on top. We then connected it to a toggle switch to turn it on and off, and a joystick that mapped to the axes of the servos. We put them all together in a cardboard box for prototyping, which was simple enough.
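
In code, the mapping is just two map() calls from each pot’s 0–1023 range to the servo’s 0–180 degrees — roughly like this (a simplified sketch with assumed pin numbers, not our exact midterm code):

#include <Servo.h>

Servo panServo; //bottom servo: moves the laser along the x-axis
Servo tiltServo; //top servo: moves the laser along the y-axis
const int xPin = A0; //joystick x potentiometer
const int yPin = A1; //joystick y potentiometer

void setup() {
  panServo.attach(9);
  tiltServo.attach(10);
}

void loop() {
  panServo.write(map(analogRead(xPin), 0, 1023, 0, 180));
  tiltServo.write(map(analogRead(yPin), 0, 1023, 0, 180));
}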

IMG_0635

IMG_2798

 

Here’s where it got a bit more complicated. Neither Yiting nor I had any experience with fabrication, which made this one of the biggest challenges in the project. We knew we wanted to make a more solid enclosure, but weren’t really sure how. When we found a piece of acrylic in the junk shelf, we thought, hey, why not try to make something out of this?

IMG_2805 IMG_2804

We took some measurements, and Yiting lasercut the acrylic into rectangles, as well as the holes on top for the joystick and switch. I rather haphazardly glued the pieces together with acrylic cement.

It worked for this project, but I learned that going forward with future projects, I should use a different material or buy a pre-made enclosure.

box1

box2

TARGET BOARD

As with the box, we began with a cardboard prototype of a target.

boardPrototypr

After some testing, we confirmed that a laser could reliably make a spike in the photocell’s sensor reading, so it could easily be used as a target on each alien. We figured that having two types of feedback together for successfully hitting the target — audio and visual — would create a nice effect, so we used LEDs for the eyes, and decided to add a speaker as well. The eyes would dim and the speaker would make the “dying” sound once you hit an alien.
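
The hit detection itself is just a threshold check: when the laser lands on a photocell, its reading spikes well above ambient light. Here’s a rough sketch of the idea for a single alien (pins and threshold are made up and would need tuning):

const int targetPin = A0; //photocell in the alien's mouth (wired as a voltage divider)
const int eyesPin = 3; //PWM pin driving the alien's LED eyes
const int speakerPin = 8; //speaker for the dying sound
const int hitThreshold = 850; //made-up value; calibrate against the room's light
bool alienAlive = true;

void setup() {
  analogWrite(eyesPin, 255); //eyes start at full brightness
}

void loop() {
  if (alienAlive && analogRead(targetPin) > hitThreshold) {
    alienAlive = false;
    analogWrite(eyesPin, 0); //dim the eyes...
    tone(speakerPin, 200, 500); //...and play a quick dying tone
  }
}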

I found a large black foam board in the junk shelf in the back. (Who knew that it could be such a treasure trove?) This became the backdrop of our space fight.

We found some free assets online, and Yiting used her amazing Illustrator skills to mock up a design. We printed the images out and cut them up.

Then the tedious part began!

IMG_0662IMG_0663

It’s funny how the unexpected things can take up the most time. For example, it took us an excruciatingly long time to successfully poke the legs of the photocells and LEDs through a piece of paper, a layer of cardboard, and a foamboard. We had 30 legs in total to poke through, and often each one took multiple attempts, even with the help of a sewing needle.

IMG_0666

In the end, we successfully gave each alien two LED eyes and a photocell target in its mouth.

Now, on to the wiring.

This next part was theoretically straightforward — each LED would need a resistor and a digital output, and each photocell would need a resistor and an analog input. We taped an Arduino and a breadboard on the back, and after some testing with alligator clips, we began to solder.

IMG_0688 IMG_0691

This also took a very long time even with the two of us soldering together.

Happily, we wired it all up correctly and it all worked, even if it’s not pretty. In the future, I’ll make an effort to tidy up and organize wires better.

IMG_0759

The next part was both the easiest and probably the most rewarding. We thought it would be fun to add obstacles to the board to make the game harder, so we added three planets that moved back and forth automatically in front of the aliens, each controlled by a servo. This was simple but made the game immensely more dynamic and fun. It really confirmed for me what Tom said in class a few weeks ago, which was that servos really give us a lot of “bang for [our] buck.”
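
Each planet just sweeps its servo back and forth on its own. For one planet, the pattern is roughly this (the pin and speed are placeholders):

#include <Servo.h>

Servo planetServo;
int angle = 0;
int sweepStep = 1; //degrees per loop; a bigger step means a faster planet

void setup() {
  planetServo.attach(6);
}

void loop() {
  planetServo.write(angle);
  angle += sweepStep;
  if (angle <= 0 || angle >= 180) {
    sweepStep = -sweepStep; //reverse direction at either end of the sweep
  }
  delay(15);
}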

IMG_0704

The last component was sound. We attached a speaker to the board and designed three different sound effects: a “game start” sound, an “alien died” sound, and a “game over” sound. These sound effects also added a lot to how dynamic and interactive the game felt.

END RESULT

IMG_0716

Yiting and I were both really pleased with how this little game turned out. It was a good lesson in making something from start to finish, and in collaboration.

Lessons learned for me:

  • Organize your wiring!
  • Prototyping before making a final version is necessary.
  • Learn more about fabricating enclosures, or just buy them.
  • Working with another person is extremely helpful.
  • The soldering irons in the ITP shop suck.
  • Giving a user more than one type of feedback makes the experience feel much more interactive.
  • Never try to poke the legs of photocells through foam board or cardboard, it is a terrible experience.

We used three Arduinos for this project, and the code for all three parts is below.

Continue reading “[pcomp week 7] Space Laser! (midterm)”

[pcomp week 8] Final project idea: Tinder Robot

IMG_0802

This idea came to me as I was falling asleep a few weeks ago, which probably means that it’s either really good or really bad. I hope to find out which one it is soon!

My idea for my final project in pcomp is based around the popular dating app, Tinder. The way it works is that it shows you profiles of other users, and you can “swipe left” to say no, and “swipe right” to say yes. I think it would be interesting to have a computer automate the process for you by reading your body to determine whether or not you truly want to say yes or no to a potential match. Reading your “heart’s desire,” if you will.

The user, connected to some kind of biometric sensor (heart rate monitor? GSR sensor? both?), would look at a Tinder profile on the screen, and the computer would read whether or not they were excited by the profile. If not, a robot hand controlled by a motor or a servo of some kind would swipe left. If so, it would swipe right.

For demoing purposes, I’d set up some kind of dummy Tinder account that anyone can try. But I think this experiment would be even more interesting if users were willing to put their own phones and Tinder accounts on the line, so there are actually some (small) stakes for them.

There are a few questions, of course, that I need to answer moving forward:

  • What kind of sensors should be used?
  • How should the data be read to interpret interest or lack of interest?
  • What potential obstacles are there in the Tinder UI? (For example, I think if you and another user both swipe right, a different screen pops up. How would we deal with that?)
  • Should we use another display to show what the sensor readings are? (I drew one in the sketch above just in case.)

[pcomp week 6] Allergic to Love, Joystick Version (serial communication)

2015-10-14 00.00.59

I ran into a lot of problems this week!

While doing this week’s labs on serial communication, lots of weird things kept happening — it would work sometimes and not other times, my computer crashed, Arduino couldn’t find the right port…

A few hours in, I realized that 95% of my problems were because I kept forgetting to turn on and off the serial monitor in P5. Even when I realized this I would forget to do it, or do it in the wrong order. This was very irritating.

Eventually, I got it sorted out. In the labs, it was suggested that we wire up one button and two potentiometers. Luckily for me, the joystick that I got for this assignment was literally those three things in one, so once I figured out the code on the Arduino side, I could easily move it into my p5 sketch.

IMG_0500

The sketch I decided to add a sensor to was a game I made the other week in ICM, Allergic to Love. In the original version, you use your mouse to move your character along the x-axis and click to shoot your laser. Using a joystick to replace the mouse was fairly straightforward: the pot in the joystick that controls x-axis movement would replace mouseX, and the button on the joystick would replace mousePressed.

(This joystick also allows you to control movement on the y-axis, and even though this particular game doesn’t require it, I decided to keep that code in just in case in the future I decide to add some kind of movement up and down.)
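
On the Arduino side, the sketch just reads those three inputs and prints them as one comma-separated line per loop, which is what the serialEvent() function below parses. It looks something like this (pin numbers are assumptions; my actual code is linked at the bottom of the post):

const int xPin = A0; //joystick x potentiometer
const int yPin = A1; //joystick y potentiometer
const int buttonPin = 2; //joystick push button (assumes a pull-down resistor, as in the lab)

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT);
}

void loop() {
  Serial.print(analogRead(xPin));
  Serial.print(",");
  Serial.print(analogRead(yPin));
  Serial.print(",");
  Serial.println(digitalRead(buttonPin)); //println ends the line with \r\n for readStringUntil()
  delay(10);
}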

Like in the lab, I made an array of data in P5 for the 3 sensor inputs: x, y, and button press. The P5 code using the sensor data ended up looking like this:

function serialEvent() {
  // read a string from the serial port
  // until you get carriage return and newline:
  var inString = serial.readStringUntil('\r\n');
 
  //check to see that there's actually a string there:
  if (inString.length > 0 ) {
    var sensors = split(inString, ',');            // split the string on the commas
    if (sensors.length > 2) {                      // if there are three elements
      if (sensors[0] == 539) { 
          cannon.dir = 0
      } else if (sensors[0] < 200) {
          cannon.dir = -1
      } else if (sensors[0] > 900) {
          cannon.dir = 1
      }
      buttonPress = sensors[2];      // element 2 is the button
    }
  }
}

You can see how I made sensors[0], the x-axis, change direction depending on input, and how I set sensors[2], the button, to a simple variable.

It’s pretty fun to play the game with a joystick. It definitely makes it feel more arcade-y. Even though I had to make it a bit easier to accommodate the joystick, it’s still pretty hard!

My P5 code and my Arduino code are below.

Continue reading “[pcomp week 6] Allergic to Love, Joystick Version (serial communication)”

[pcomp week 4] The Lonely But Tender Ghost v.2, now with sound! (analog outputs)

IMG_0272

It was fun playing with speakers and servo motors this week. After doing the labs, I focused mostly on doing things with sound, but in the future I’d like to spend more time experimenting with servos…

In the tone lab, I had some fun with the pitches library, and found it pretty easy to change the song played to something else:

I wanted to continue building on my lonely ghost project from last week. When I last left it, I had hand-sewn an FSR that caused an RGB LED to change colors depending on how hard you squeezed it. It was supposed to express “feeling” with the colors — green was good, red was bad. At that point, color was the only output.

I added a speaker to my breadboard and worked on adding sound in addition to color as feedback for the toy’s feelings.

IMG_0270

The first thing I did was add the tone() function to the code so that when the variable "force" was 3 — that is, pushed the hardest — the speaker would make a noise in addition to having the LED turn red.

I thought the ghost could be made to be a bit needier. What if it got lonely if you didn’t pay attention to it for a period of time?

I used the millis() function to save the time whenever the ghost was squeezed. I then set a variable called lonelyTime, which was the amount of time that could pass before the ghost got lonely. When the difference between the current millisecond count and the last time it was squeezed exceeded lonelyTime, I had the speaker make a tone. It would stop when you squeezed the ghost again.

(I used the same method to make the LED blink when you weren’t squeezing the FSR, which I thought was a more natural neutral state than having the light just be white.)
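
The timing pattern for the loneliness check looks roughly like this (a simplified sketch with made-up pins and thresholds):

unsigned long lastSqueezed = 0; //millis() value at the last squeeze
unsigned long lonelyTime = 10000; //how many milliseconds of neglect before the ghost gets lonely

void setup() {}

void loop() {
  int fsrReading = analogRead(A0); //FSR on A0 (assumed)
  if (fsrReading > 50) { //made-up threshold for "being squeezed"
    lastSqueezed = millis(); //reset the loneliness timer
    noTone(8);
  } else if (millis() - lastSqueezed > lonelyTime) {
    tone(8, 440); //lonely: play a tone on the speaker (pin 8) until squeezed again
  }
}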

This was nice, but all of the tones sounded pretty boring and static. That’s when I realized I could use the pitches library, like in the tone lab, to compose custom sounds for each state. I ended up making three:

  • in pain
  • happy
  • lonely

I was a bit surprised by how much more effective the custom sounds were at expressing feeling compared to the basic speaker tones.
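
Each sound is just an array of notes and durations stepped through with tone(), the same way the tone-lab melody works. For example, a tiny made-up “happy” tune (not my actual notes):

#include "pitches.h" //note frequency constants from the Arduino tone lab

int happyNotes[] = { NOTE_C5, NOTE_E5, NOTE_G5, NOTE_C6 };
int happyDurations[] = { 8, 8, 8, 4 }; //4 = quarter note, 8 = eighth note

void playHappy() {
  for (int i = 0; i < 4; i++) {
    int noteDuration = 1000 / happyDurations[i];
    tone(8, happyNotes[i], noteDuration); //speaker on pin 8 (assumed)
    delay(noteDuration * 1.3); //short pause between notes
    noTone(8);
  }
}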

Now, the ghost feels much more like a pet or a needy toy. When he’s lonely, the light will turn yellow and he’ll make the lonely sound until you squeeze him. If you squeeze him gently, the light turns green and he makes the happy sound. If you squeeze him too hard, he’ll make a distressing sound and the light will turn red. The blink effect makes it feel more alive as well.

Check out the video (with sound) here:

My Arduino code is below.

Continue reading “[pcomp week 4] The Lonely But Tender Ghost v.2, now with sound! (analog outputs)”

[pcomp week 3] The Lonely But Tender Ghost (digital and analog inputs, digital outputs)

IMG_0156

This week we learned how to program the Arduino to take inputs from our sensors and program them to make stuff happen.

I went to the Soft Lab workshop on Friday, where I learned how to sew a simple button, so I used that in the first example of alternating LEDs with a switch:

The fun part was using analog sensors to change the brightness of LEDs — I wired up a force sensor and a photocell to control two different LEDs on the breadboard.

I had a ton of ideas for our assignment to do something creative with these sensors this week, many of which sounded great in my mind but in reality were all varying degrees of unfeasible for the time being. One thing that stuck with me — newly inspired by the Soft Lab — was the idea of doing something with a doll or plushie. My goal was to make a plushie that gave you the sense that it had feelings.

I decided to go with a force sensitive resistor. The idea was that I’d make a plushie with LED eyes that would change color depending on how hard you squeezed it.

Here’s the circuit I built on the breadboard:

The map() function was really helpful for me to turn the input from the sensor into three different states, which I could then turn into colors. I learned how to use an RGB LED with the help of this example from Adafruit, and I ended up using the setColor() function written in that sketch in my final code.
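
In code, that boils down to squashing the 0–1023 reading into three buckets and picking a color for each. A sketch of the approach (pins are assumptions, and setColor() is along the lines of the one in the Adafruit example):

const int fsrPin = A0;
const int redPin = 9, greenPin = 10, bluePin = 11; //RGB LED legs on PWM pins (assumed)

void setup() {}

void loop() {
  int reading = analogRead(fsrPin);
  int force = map(reading, 0, 1023, 0, 3); //squash the reading into rough buckets
  force = constrain(force, 0, 2); //0 = light, 1 = medium, 2 = hard

  if (force == 0) {
    setColor(255, 255, 255); //barely touched: neutral white
  } else if (force == 1) {
    setColor(0, 255, 0); //a tender squeeze: green
  } else {
    setColor(255, 0, 0); //squeezed too hard: red
  }
}

void setColor(int red, int green, int blue) {
  analogWrite(redPin, red);
  analogWrite(greenPin, green);
  analogWrite(bluePin, blue);
}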

IMG_0163

The next step was to make my plushie!

 

IMG_0151 IMG_0153

I realized that my original plan to sew two RGB LEDs into fabric as eyes was actually extraordinarily complicated, so I just made the light separate from the plushie and went with the next best thing: googly eyes.

I built my own force sensitive resistor with some conductive material and Velostat, and sewed it all up in some felt to make my little ghost plushie. I noticed that the input values I got from the commercial FSR went pretty accurately from 0 – 1023, but my homemade FSR pretty much started at 750 or so rather than 0. I adjusted my variable in my code to accommodate it and it worked perfectly well.

I decided to call him the Lonely But Tender Ghost. In his normal state, the light is white. When you squeeze him tenderly, the light turns green. If you squeeze him too hard the light turns red. 😦

This is just a basic first project, but hopefully later on I can further explore building an object that makes you feel like it’s expressing human feelings, perhaps creating sympathy or empathy in you, the user.

My full Arduino code is below.

Continue reading “[pcomp week 3] The Lonely But Tender Ghost (digital and analog inputs, digital outputs)”

[pcomp week 3] Design meets disability reading & observation

from the Alternative Limb Project
from the Alternative Limb Project

This week’s reading was Graham Pullin’s Design Meets Disability, which discusses both objects explicitly used to counteract a disability, like prosthetics, and objects used by people of all abilities that have varying levels of inclusiveness. Glasses are cited as an example of successful design for disability, to the point that people don’t consider poor eyesight a disability because glasses have transitioned from being medical devices to fashion accessories. This reminds me of Norman’s phrase, “Attractive things work better.”

I appreciate this perspective in the context of physical computing. If we’re designing for the human body, it’s important to take into consideration the ways in which people’s bodies and abilities are different, and to not take any particular ability for granted. I think it’s neat to see examples of things designed specifically for, say, wheelchair users, but also to see products that keep different preferences of usage in mind (a clicking sound and sensation, for example.)

(A small note on the examples: it was fun to see Nick’s Bricolo because we used to work together at my old job before ITP!)

——-

For my observation assignment, I decided to watch people use the entrance to the subway. More specifically, I watched them use the Metrocard slider that collects their fare.

A person swipes a MetroCard at a New York subway station. (Photo by Bilgin S. Sasmaz/Anadolu Agency/Getty Images)

According to the MTA, people swipe in about 1.7 billion times a year. That’s a lot! I’ve probably done it a thousand times myself.

That said, it’s certainly not perfect. My assumption is that people who are accustomed to the system — who understand which way to swipe and the specific speed at which to swipe — can get through within 3 seconds or so with no problem. But I predict that tourists, anyone who has insufficient fare or some other Metrocard problem, and people who move too slowly will have trouble with the machine.

I watched people use the machines at Union Square because there’s a lot of activity there, from locals and tourists alike.

I noticed that the people using the machines generally fell into three groups:

  1. Confident and experienced users who got through with no problem
  2. Confused users who had problems, likely tourists
  3. Confident users who had a problem with their card

The first group was the majority of users who moved through the system quickly. The second group usually approached the machines slowly and often in groups, and would often swipe too slowly or too quickly, receiving the “Swipe card again at this turnstile” message. They would try again and again until it worked. This usually would take something more like 10 or 15 seconds.

The third group actually ran into the most trouble. People who were experienced and confident moved forward with the speed of someone who would get through in a couple of seconds, but were halted abruptly by the machine if the card didn’t work. Sometimes they would almost run into the turnstile because their momentum was carrying them forward. Other times they nearly collided with the people behind them, especially if they had to turn back to refill their card.

In the case of insufficient fare, people had to go back to the machines to refill them, which could take up to a few minutes.

Developing the correct motion to swipe in a way that the machine understands is a skill that improves with practice. This is probably one reason why most other (and more modern) subway systems around the world use a tapping system, which seems to be easier for anyone using the machine, even if they’ve never done it before.

The insufficient-fare problem seems harder to solve. It’s not an issue of failing to inform riders how much fare is left (since it’s on the display when you swipe in); rather, people forget that they need to refill even if they knew they were running out at the end of their last ride. It seems to be an issue of when riders are notified that they need to refill, which should ideally be when they walk into the station and not when they’re already at the turnstile.

A shorter term solution might be to design the space around the turnstiles in such a way that people can quickly exit the turnstile area if they need to, so it’s not a crowded bottleneck.

 

[pcomp week 2] Electronics labs, switches, and CHOP TO THE TOP!!!

IMG_0067

This week in pcomp we got our hands dirty with circuits and components.

IMG_0020 IMG_0022

I spent a good amount of time getting to know the multimeter, learning how to measure resistance, voltage and current. One of the hardest parts about it was holding the tiny components, measuring them with the multimeter and trying to take pictures at the same time!

I also started building basic circuits with switches and LEDs.

To do the exercise with the potentiometer, I realized I had to solder wires on to the sensor, so I also learned to solder for the first time. It definitely wasn’t the prettiest solder-job in the world, but it worked.

Applying what we learned in the labs, our assignment was to make a switch. We were encouraged to be creative — “A switch is nothing more than a mechanism to bring two pieces of conductive material together and separate them.”

I decided to make a game.

I wanted to do something that tested people’s ability to use chopsticks. I decided to make something with two conductive parts separated by a wide gap. The parts would be connected when people stacked other conductive things in between them so that the circuit was completed, at which point a light would turn on.

In order to encourage people to use chopsticks to pick things up rather than push things around, I had to make the space between the two conductive points vertical rather than horizontal on the table. Aluminum foil—cheap and abundant—was my material of choice for conducting electricity. So I made a bunch of balls of aluminum foil of different sizes as the things people had to pick up.

The tubes I used were originally pen holders that I had cut the bottoms out of, with foil glued at the bottom and the top. I had to test whether a bunch of foil balls stacked on top of each other would be conductive enough to connect the bottom to the top, so I first used the multimeter and then wired up a circuit that made an LED light up.

IMG_0044IMG_0045

In order to make this game a race between two teams, I had two tubes and wired up two switches and LEDs of different colors in parallel on my breadboard. So now there’s a green team and a yellow team.

IMG_0070 IMG_0072

I’m not 100% sure the schematic I drew is accurate, mostly because I’m not sure if the line in the middle is correct. But I think it is?

It started to come together, and I wrote the rules:

Light your color first to win. Turn on your light by filling your tube to the top with the balls. You may only touch the balls with chopsticks.

(2 players: each player holds a pair of chopsticks)

(4 players: divide in 2 teams, each player holds 1 chopstick)

A silly game deserves a silly name, and thus I call it CHOP TO THE TOP!!!

IMG_0054

IMG_0055

And voila. I had Aaron (yellow) and Sam (green) play the game to demo. The LEDs are tiny so they’re a little hard to see, but it works!