[pcomp week 10] Final project BOM, system diagram, and plan

We did some playtesting last week for our final projects, which was quite helpful. I got a lot of interesting feedback about my true love Tinder robot.

For testing purposes, I had one of the sensors I was planning to use (semi-)working — the camera. I mostly used my computer screen to indicate what would happen with different facial expression readings, and I manually moved my stand-in robot hand.

Although I didn’t set up a perfect test, it was helpful to hear what people had to say and what other ideas they suggested. One thing I realized immediately was that having a second screen is too distracting, since users are supposed to be looking at the phone the whole time. I’ll likely replace it with audio output, so the robot speaks rather than shows something on a screen.

A big question that came up was the personality of the robot, and more generally how the robot will be presented. There was a suggestion to build an entire robot with a body and a face, which sounds interesting but… might be beyond my scope for now.

I’m now thinking about how much “personality” I can convey in the material of the hand and in how the robot speaks. Lots of work to be done!

Below are my initial BOM, plan, and system diagram.

(Images: BOM, plan, and system diagram.)

[pcomp week 7] Space Laser! (midterm)

For our Physical Computing midterm project, Yiting and I made a game called Space Laser!. (The critical exclamation point was added at the suggestion of Tom.)

HOW IT WORKS

You have 30 seconds to hit all five aliens using your laser. When you successfully hit one, their eyes will dim and they’ll make a dying sound. When you only have 10 seconds left, the eyes will start to blink so that you know you’re running out of time.

IDEA

We originally came up with an entirely different idea for a game, but after some experimentation, decided that we wanted to do something with two servos on top of each other. This formed the basis for our laser pointer. But what good is a laser if you don’t have something to shoot at?

Everyone knows that aliens are the natural enemy of lasers, so we decided to make a space-themed game where you’d shoot a laser at aliens, who would react in various ways when they got hit.

There were two main parts of the project: the laser and the target board. We prototyped both pieces before we made the final versions.

LASER

One of the first things we did was attach a laser diode to a servo on top of another servo. One servo moved in the x-axis and the other moved in the y-axis, so when combined, they allowed a wide range of movement for the laser on top. We then connected it to a toggle switch to turn it on and off, and a joystick that mapped to the axes of the servos. We put them all together in a cardboard box for prototyping, which was simple enough.
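In sketch form, the pan/tilt mapping is just two analogRead() calls mapped onto servo angles. Here’s a minimal version of the idea; the pin numbers and pullup wiring are my assumptions, not necessarily what we built:

#include <Servo.h>

Servo panServo;   // bottom servo: x-axis
Servo tiltServo;  // top servo: y-axis

const int joyXPin = A0;   // joystick x pot (pin choices are assumptions)
const int joyYPin = A1;   // joystick y pot
const int togglePin = 2;  // toggle switch
const int laserPin = 3;   // laser diode

void setup() {
  panServo.attach(9);
  tiltServo.attach(10);
  pinMode(togglePin, INPUT_PULLUP);
  pinMode(laserPin, OUTPUT);
}

void loop() {
  // map each pot's 0-1023 range onto the servo's 0-180 degrees
  panServo.write(map(analogRead(joyXPin), 0, 1023, 0, 180));
  tiltServo.write(map(analogRead(joyYPin), 0, 1023, 0, 180));

  // the toggle switch turns the laser on and off (LOW = switched on with a pullup)
  digitalWrite(laserPin, digitalRead(togglePin) == LOW ? HIGH : LOW);

  delay(15);  // give the servos time to move
}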

Here’s where it got a bit more complicated. Neither Yiting nor I have any experience with fabrication, which was one of the biggest challenges in this project. We knew we wanted to make a more solid enclosure, but weren’t really sure how. When we found a piece of acrylic in the junk shelf, we thought hey, why not try to make something out of this?

We took some measurements, and Yiting lasercut the acrylic into rectangles, as well as the holes on top for the joystick and switch. I rather haphazardly glued the pieces together with acrylic cement.

It worked for this project, but I learned that for future projects I should use a different material or buy a pre-made enclosure.

TARGET BOARD

As with the box, we began with a cardboard prototype of a target.

After some testing, we confirmed that a laser could reliably make a spike in the photocell’s sensor reading, so it could easily be used as a target on each alien. We figured that having two types of feedback together for successfully hitting the target — audio and visual — would create a nice effect, so we used LEDs for the eyes, and decided to add a speaker as well. The eyes would dim and the speaker would make the “dying” sound once you hit an alien.
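Per alien, the hit logic is a threshold check on the photocell, with both feedback channels firing on a hit. A minimal sketch for a single target; the pins and the threshold value are placeholder assumptions:

const int photocellPin = A0;   // photocell in the alien's mouth (assumed pin)
const int eyesPin = 9;         // both LED eyes driven from one PWM pin (assumed)
const int speakerPin = 8;
const int hitThreshold = 800;  // a laser hit spikes the reading past this (found by testing)

bool alive = true;

void setup() {
  pinMode(eyesPin, OUTPUT);
  pinMode(speakerPin, OUTPUT);
  analogWrite(eyesPin, 255);   // eyes at full brightness while alive
}

void loop() {
  if (alive && analogRead(photocellPin) > hitThreshold) {
    alive = false;
    analogWrite(eyesPin, 20);    // visual feedback: dim the eyes
    tone(speakerPin, 150, 400);  // audio feedback: a low "dying" tone
  }
}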

I found a large black foam board in the junk shelf in the back. (Who knew that it could be such a treasure trove?) This became the backdrop of our space fight.

We found some free assets online, and Yiting used her amazing Illustrator skills to mock up a design. We printed the images out and cut them up.

Then the tedious part began!

It’s funny how the unexpected things can take up the most time. For example, it took us an excruciatingly long time to successfully poke the legs of the photocells and LEDs through a piece of paper, a layer of cardboard, and a foamboard. We had 30 legs in total to poke through, and often each one took multiple attempts, even with the help of a sewing needle.

In the end, we successfully gave each alien two LED eyes and a photocell target in its mouth.

Now, on to the wiring.

This next part was theoretically straightforward — each LED needed a resistor and a digital output pin, and each photocell needed a resistor and an analog input pin. We taped an Arduino and a breadboard on the back, and after some testing with alligator clips, we began to solder.
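With five aliens, arrays keep the pin bookkeeping manageable. Roughly, if all five lived on one board (ours was actually split across Arduinos, and the pin numbers here are placeholders):

const int numAliens = 5;
const int photocellPins[numAliens] = {A0, A1, A2, A3, A4};  // analog inputs
const int eyePins[numAliens] = {3, 5, 6, 9, 10};            // each alien's eyes share a pin

void setup() {
  for (int i = 0; i < numAliens; i++) {
    pinMode(eyePins[i], OUTPUT);
    digitalWrite(eyePins[i], HIGH);  // all eyes on at the start of a game
  }
}

void loop() {
  for (int i = 0; i < numAliens; i++) {
    if (analogRead(photocellPins[i]) > 800) {  // laser spike on alien i
      digitalWrite(eyePins[i], LOW);           // its eyes go dark
    }
  }
}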

This also took a very long time even with the two of us soldering together.

Happily, we wired everything up correctly and it all worked, even if it isn’t pretty. In the future, I’ll make an effort to tidy up and organize my wires better.

The next part was both the easiest and probably the most rewarding. We thought it would be fun to add obstacles to the board to make the game harder, so we added three planets that moved back and forth automatically in front of the aliens, each controlled by a servo. This was simple but made the game immensely more dynamic and fun. It confirmed for me what Tom said in class a few weeks ago: servos give us a lot of “bang for [our] buck.”
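Each planet’s motion is just a servo sweeping between two endpoints and reversing. A stripped-down sketch of that pattern for one planet, with an arbitrary pin and speed:

#include <Servo.h>

Servo planetServo;
int angle = 90;
int delta = 2;  // degrees per update; sets the planet's speed

void setup() {
  planetServo.attach(9);  // assumed pin
}

void loop() {
  angle += delta;
  if (angle <= 0 || angle >= 180) {
    delta = -delta;  // reverse direction at each end of the arc
  }
  planetServo.write(angle);
  delay(20);
}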

The last component was sound. We attached a speaker to the board and designed three different sounds: a “game start” sound, an “alien died” sound, and a “game over” sound. These audio effects added a lot to how dynamic and interactive the game felt.
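Each sound was a short melody played note by note with tone(). Here’s the general pattern, using the pitches.h constants from the Arduino tone tutorial; the notes themselves are invented stand-ins, not our actual compositions:

#include "pitches.h"  // NOTE_C4 etc., from the Arduino tone tutorial

const int speakerPin = 8;

// a stand-in melody; each duration is a note type (4 = quarter note, 8 = eighth)
int startNotes[] = {NOTE_C4, NOTE_E4, NOTE_G4, NOTE_C5};
int startDurations[] = {8, 8, 8, 4};

void playMelody(int notes[], int durations[], int count) {
  for (int i = 0; i < count; i++) {
    int noteDuration = 1000 / durations[i];
    tone(speakerPin, notes[i], noteDuration);
    delay(noteDuration * 1.3);  // a pause so the notes don't run together
    noTone(speakerPin);
  }
}

void setup() {
  playMelody(startNotes, startDurations, 4);  // the "game start" sound
}

void loop() {}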

END RESULT

Yiting and I were both really pleased with how this little game turned out. It was a good lesson in making something from start to finish, and in collaboration.

Lessons learned for me:

  • Organize your wiring!
  • Prototyping before making a final version is necessary.
  • Learn more about fabricating enclosures, or just buy them.
  • Working with another person is extremely helpful.
  • The soldering irons in the ITP shop suck.
  • Giving a user more than one type of feedback makes the experience feel much more interactive.
  • Never try to poke the legs of photocells through foam board or cardboard; it is a terrible experience.

We used three Arduinos for this project, and the code for all three parts is below.

[pcomp week 8] Final project idea: Tinder Robot

This idea came to me as I was falling asleep a few weeks ago, which probably means that it’s either really good or really bad. I hope to find out which one it is soon!

My idea for my final project in pcomp is based around the popular dating app, Tinder. The way it works is that it shows you profiles of other users, and you can “swipe left” to say no, and “swipe right” to say yes. I think it would be interesting to have a computer automate the process for you by reading your body to determine whether or not you truly want to say yes or no to a potential match. Reading your “heart’s desire,” if you will.

The user, connected to some kind of biometric sensor (heart rate monitor? GSR sensor? both?), would look at a Tinder profile on the screen, and the computer would read whether or not they were excited by the profile. If not, a robot hand controlled by a motor or servo of some kind would swipe left. If so, it would swipe right.

For demoing purposes, I’d set up some kind of dummy Tinder account that anyone can try. But I think this experiment would be even more interesting if users were willing to put their own phones and Tinder accounts on the line, so there are actually some (small) stakes for them.

There are a few questions, of course, that I need to answer moving forward:

  • What kind of sensors should be used?
  • How should the data be read to interpret interest or lack of interest?
  • What potential obstacles are there in the Tinder UI? (For example, I think if you and another user both swipe right, a different screen pops up. How would we deal with that?)
  • Should we use another display to show what the sensor readings are? (I drew one in the sketch above just in case.)

[pcomp week 6] Allergic to Love, Joystick Version (serial communication)

I ran into a lot of problems this week!

While doing this week’s labs on serial communication, lots of weird things kept happening — it would work sometimes and not other times, my computer crashed, Arduino couldn’t find the right port…

A few hours in, I realized that 95% of my problems arose because I kept forgetting to turn the serial monitor in P5 on and off. Even after I realized this, I would forget to do it, or do it in the wrong order. This was very irritating.

Eventually, I got it sorted out. In the labs, it was suggested that we wire up one button and two potentiometers. Luckily for me, the joystick I got for this assignment was literally those three things in one, so once I figured out the code on the Arduino side, I could easily move it into my p5 sketch.

The sketch I decided to add a sensor to was a game I made the other week in ICM, Allergic to Love. In the original version, you use your mouse to move your character along the x-axis and click to shoot your laser. Using a joystick to replace the mouse was fairly straightforward: the pot in the joystick that controls x-axis movement would replace mouseX, and the button on the joystick would replace mousePressed.

(This joystick also allows you to control movement on the y-axis, and even though this particular game doesn’t require it, I decided to keep that code in just in case in the future I decide to add some kind of movement up and down.)

Like in the lab, I made an array of data in P5 for the 3 sensor inputs: x, y, and button press. The P5 code using the sensor data ended up looking like this:

function serialEvent() {
  // read a string from the serial port
  // until you get carriage return and newline:
  var inString = serial.readStringUntil('\r\n');

  // check to see that there's actually a string there:
  if (inString.length > 0) {
    var sensors = split(inString, ',');   // split the string on the commas
    if (sensors.length > 2) {             // if there are three elements
      if (sensors[0] == 539) {            // 539 is this joystick's resting x value
        cannon.dir = 0;
      } else if (sensors[0] < 200) {
        cannon.dir = -1;
      } else if (sensors[0] > 900) {
        cannon.dir = 1;
      }
      buttonPress = sensors[2];           // element 2 is the button
    }
  }
}

You can see how I made sensors[0], the x-axis, change direction depending on input, and how I set sensors[2], the button, to a simple variable.
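For reference, the Arduino side just prints the three readings as one comma-separated line, ending with the carriage return and newline that serialEvent() waits for. A minimal sketch of what that could look like (the pin choices are assumptions):

const int xPin = A0;      // joystick x pot (assumed pins)
const int yPin = A1;      // joystick y pot
const int buttonPin = 2;  // joystick push button

void setup() {
  Serial.begin(9600);
  pinMode(buttonPin, INPUT_PULLUP);
}

void loop() {
  // send "x,y,button" — println() appends the '\r\n' that p5 splits on
  Serial.print(analogRead(xPin));
  Serial.print(',');
  Serial.print(analogRead(yPin));
  Serial.print(',');
  Serial.println(digitalRead(buttonPin));
  delay(10);
}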

It’s pretty fun to play the game with a joystick. It definitely makes it feel more arcade-y. Even though I had to make it a bit easier to accommodate the joystick, it’s still pretty hard!

My P5 code and my Arduino code are below.

[pcomp week 4] The Lonely But Tender Ghost v.2, now with sound! (analog outputs)

It was fun playing with speakers and servo motors this week. After doing the labs, I focused mostly on doing things with sound, but in the future I’d like to spend more time experimenting with servos…

In the tone lab, I had some fun with the pitches library, and found it pretty easy to change the song being played to something else:

I wanted to continue building on my lonely ghost project from last week. When I last left it, I had hand-sewn an FSR that caused an RGB LED to change colors depending on how hard you squeezed it. It was supposed to express “feeling” with the colors — green was good, red was bad. At that point, color was the only output.

I added a speaker to my breadboard and worked on adding sound in addition to color as feedback for the toy’s feelings.

The first thing I did was add the tone() function to the code so that when the variable “force” was 3 — that is, when the ghost was squeezed the hardest — the speaker would make a noise in addition to the LED turning red.

I thought the ghost could be made to be a bit needier. What if it got lonely if you didn’t pay attention to it for a period of time?

I used the millis() function to record the time whenever the ghost was squeezed. I then set a variable called lonelyTime, the amount of time that could pass before the ghost got lonely. When the current millisecond count minus the time of the last squeeze exceeded lonelyTime, I had the speaker play a tone. It would stop when you squeezed the ghost again.
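The timing logic looks something like this; the pins, squeeze threshold, and lonelyTime value here are stand-ins:

const int fsrPin = A0;
const int speakerPin = 8;
unsigned long lastSqueezed = 0;          // millis() reading at the last squeeze
const unsigned long lonelyTime = 10000;  // ms of neglect before loneliness sets in

void setup() {
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  if (analogRead(fsrPin) > 50) {  // any squeeze resets the loneliness clock
    lastSqueezed = millis();
    noTone(speakerPin);
  } else if (millis() - lastSqueezed > lonelyTime) {
    tone(speakerPin, 330);        // lonely tone until the next squeeze
  }
}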

(I used the same method to make the LED blink when you weren’t squeezing the FSR, which I thought was a more natural neutral state than having the light just be white.)

This was nice, but all of the tones sounded pretty boring and static. That’s when I realized I could use the pitches library, like in the tone lab, to compose custom sounds for each state. I ended up making three:

(Audio clips: “in pain,” “happy,” and “lonely.”)

I was a bit surprised by how much more effective the custom sounds were at expressing feeling compared to the basic speaker tones.

Now, the ghost feels much more like a pet or a needy toy. When he’s lonely, the light will turn yellow and he’ll make the lonely sound until you squeeze him. If you squeeze him gently, the light turns green and he makes the happy sound. If you squeeze him too hard, he’ll make a distressing sound and the light will turn red. The blink effect makes it feel more alive as well.

Check out the video (with sound) here:

My Arduino code is below.

[pcomp week 3] The Lonely But Tender Ghost (digital and analog inputs, digital outputs)

This week we learned how to program the Arduino to take inputs from our sensors and program them to make stuff happen.

I went to the Soft Lab workshop on Friday, where I learned how to sew a simple button, so I used that in the first example of alternating LEDs with a switch:

The fun part was using analog sensors to change the brightness of LEDs — I wired up a force sensor and a photocell to control two different LEDs on the breadboard.

I had a ton of ideas for this week’s assignment to do something creative with these sensors, many of which sounded great in my mind but in reality were varying degrees of unfeasible for the time being. One thing that stuck with me — newly inspired by the Soft Lab — was the idea of doing something with a doll or plushie. My goal was to make a plushie that gave you the sense that it had feelings.

I decided to go with a force sensitive resistor. The idea was that I’d make a plushie with LED eyes that would change color depending on how hard you squeezed it.

Here’s the circuit I built on the breadboard:

The map() function was really helpful for turning the input from the sensor into three different states, which I could then turn into colors. I learned how to use an RGB LED with the help of this example from Adafruit, and I ended up using the setColor() function written in that sketch in my final code.
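The gist: map() squashes the 0–1023 reading into a handful of states, and setColor() writes one PWM value per color channel. A sketch with illustrative pins and thresholds, not my exact code:

const int fsrPin = A0;
const int redPin = 9, greenPin = 10, bluePin = 11;  // RGB LED legs on PWM pins (assumed)

// adapted from the Adafruit RGB LED example: one PWM value per color channel
void setColor(int red, int green, int blue) {
  analogWrite(redPin, red);
  analogWrite(greenPin, green);
  analogWrite(bluePin, blue);
}

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  // bucket the reading into states 1-3: 1 = resting, 2 = gentle squeeze, 3 = hard squeeze
  int force = constrain(map(analogRead(fsrPin), 0, 1023, 1, 4), 1, 3);
  if (force == 1) setColor(255, 255, 255);  // white: normal
  else if (force == 2) setColor(0, 255, 0); // green: squeezed tenderly
  else setColor(255, 0, 0);                 // red: squeezed too hard
}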

The next step was to make my plushie!

I realized that my original plan to sew two RGB LEDs into fabric as eyes was actually extraordinarily complicated, so I just made the light separate from the plushie and went with the next best thing: googly eyes.

I built my own force sensitive resistor with some conductive material and Velostat, and sewed it all up in some felt to make my little ghost plushie. I noticed that the input values from the commercial FSR spanned 0–1023 pretty accurately, but my homemade FSR started at around 750 rather than 0. I adjusted the variable in my code to accommodate it, and it worked perfectly well.
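Accommodating that offset just means shifting the lower bound handed to map(), along the lines of this hypothetical helper:

// the homemade FSR rests around 750 instead of 0, so map() starts there
int readForce() {
  int raw = analogRead(A0);  // the homemade FSR on A0 (assumed)
  return constrain(map(raw, 750, 1023, 1, 4), 1, 3);
}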

I decided to call him the Lonely But Tender Ghost. In his normal state, the light is white. When you squeeze him tenderly, the light turns green. If you squeeze him too hard the light turns red. 😦

This is just a basic first project, but hopefully later on I can further explore building an object that makes you feel like it’s expressing human feelings, perhaps creating sympathy or empathy in you, the user.

My full Arduino code is below.

[pcomp week 3] Design meets disability reading & observation

from the Alternative Limb Project

This week’s reading was Graham Pullin’s Design Meets Disability, which discusses both objects explicitly used to counteract a disability, like prosthetics, and objects used by people of all abilities with varying levels of inclusiveness. Glasses are cited as an example of successful design for disability, to the point that people don’t consider poor eyesight a disability, because glasses have transitioned from medical devices to fashion accessories. This reminds me of Norman’s phrase, “Attractive things work better.”

I appreciate this perspective in the context of physical computing. If we’re designing for the human body, it’s important to take into consideration the ways in which people’s bodies and abilities differ, and not to take any particular ability for granted. I think it’s neat to see examples of things designed specifically for, say, wheelchair users, but also to see products that keep different preferences of usage in mind (a clicking sound and sensation, for example).

(A small note on the examples: it was fun to see Nick’s Bricolo because we used to work together at my old job before ITP!)

——-

For my observation assignment, I decided to watch people use the entrance to the subway. More specifically, I watched them use the MetroCard swipe reader that collects their fare.

A person swipes a MetroCard at a New York subway station. (Photo: Bilgin S. Sasmaz/Anadolu Agency/Getty Images)

According to the MTA, people swipe in about 1.7 billion times a year. That’s a lot! I’ve probably done it a thousand times myself.

That said, it’s certainly not perfect. My assumption was that people accustomed to the system — who understand which way to swipe and the specific speed at which to swipe — can move through quickly, within 3 seconds or so, with no problem. But I predicted that tourists, anyone with insufficient fare or some other MetroCard problem, and people who move too slowly would have trouble with the machine.

I watched people use the machines at Union Square because there’s a lot of activity there, from locals and tourists alike.

I noticed that the people using the machines generally fell into three groups:

  1. Confident and experienced users who got through with no problem
  2. Confused users who had problems, likely tourists
  3. Confident users who had a problem with their card

The first group, the majority of users, moved through the system quickly. The second group usually approached the machines slowly, often in groups, and would swipe too slowly or too quickly, receiving the “Swipe card again at this turnstile” message. They would try again and again until it worked, which usually took more like 10 or 15 seconds.

The third group actually ran into the most trouble. People who were experienced and confident moved forward with the speed of someone who would get through in a couple seconds, but were halted by the machine abruptly if the card didn’t work. Sometimes they would almost run into the turnstile because the momentum was carrying them forward. Other times there were almost collisions with people behind them, especially if they had to turn back to refill their card.

In the case of insufficient fare, people had to go back to the machines to refill them, which could take up to a few minutes.

Developing the correct motion to swipe in a way that the machine understands is a skill that improves with practice. This is probably one reason why most other (and more modern) subway systems around the world use a tapping system, which seems to be easier for anyone using the machine, even if they’ve never done it before.

The insufficient-fare problem seems harder to solve. It’s not an issue of failing to inform riders of how much fare is left (that’s on the display when you swipe in); rather, people forget that they need to refill even if they knew they ran out during their last ride. It seems to be an issue of when riders are notified that they need to refill, which should ideally be when they walk into the station, not when they’re already at the turnstile.

A shorter term solution might be to design the space around the turnstiles in such a way that people can quickly exit the turnstile area if they need to, so it’s not a crowded bottleneck.

[pcomp week 2] Electronics labs, switches, and CHOP TO THE TOP!!!

This week in pcomp we got our hands dirty with circuits and components.

I spent a good amount of time getting to know the multimeter, learning how to measure resistance, voltage and current. One of the hardest parts about it was holding the tiny components, measuring them with the multimeter and trying to take pictures at the same time!

I also started building basic circuits with switches and LEDs.

To do the exercise with the potentiometer, I realized I had to solder wires onto the sensor, so I also learned to solder for the first time. It definitely wasn’t the prettiest solder job in the world, but it worked.

Applying what we learned in the labs, our assignment was to make a switch. We were encouraged to be creative — “A switch is nothing more than a mechanism to bring two pieces of conductive material together and separate them.”

I decided to make a game.

I wanted to do something that tested people’s ability to use chopsticks. I decided to make something with two conductive parts separated by a wide gap. The circuit would be completed when players moved other conductive things into the space between them, at which point a light would turn on.

In order to encourage people to use chopsticks to pick things up rather than push things around, I had to make the space between the two conductive points vertical rather than horizontal on the table. Aluminum foil—cheap and abundant—was my material of choice for conducting electricity. So I made a bunch of balls of aluminum foil of different sizes as the things people had to pick up.

The tubes I used were originally pen holders; I cut out their bottoms and glued foil at the bottom and the top. I had to test whether a stack of foil balls would be conductive enough to connect the bottom to the top, so I first checked with the multimeter, and then wired up a circuit that made an LED light up.

In order to make this game a race between two teams, I had two tubes and wired up two switches and LEDs of different colors in parallel on my breadboard. So now there’s a green team and a yellow team.

I’m not 100% sure the schematic I drew is accurate, mostly because I’m not sure if the line in the middle is correct. But I think it is?

It started to come together, and I wrote the rules:

Light your color first to win. Turn on your light by filling your tube to the top with the balls. You may only touch the balls with chopsticks.

(2 players: each player holds a pair of chopsticks)

(4 players: divide in 2 teams, each player holds 1 chopstick)

A silly game deserves a silly name, and thus I call it CHOP TO THE TOP!!!

And voila. I had Aaron (yellow) and Sam (green) play the game to demo. The LEDs are tiny so they’re a little hard to see, but it works!

[pcomp week 2] The design of everyday objects & physical computing’s greatest hits

(Image: the ticket-ordering machine at a ramen restaurant in Japan.)

This week we read two works by Donald A. Norman: the first chapter of his book, The Design of Everyday Things, and his essay, Emotional Design: Attractive Things Work Better. The first rails against everyday objects that are poorly designed, by which he mostly means difficult to understand and confusing to use. He cites numerous examples, like doors that don’t make it clear whether you should pull or push, the thermostat in his refrigerator, and now-almost-obsolete landline telephones.

Scissors, Norman says, are an example of a well-designed everyday object because their affordances, constraints and mappings allow you to easily form a conceptual model in your mind of how they should be used, even if you’ve never done so before.

His essay is a response to criticism that his book makes it seem as though he values usability over all else in design—beauty, in particular. He clarifies that that’s not what he was trying to say, and that designing with people’s emotions in mind is equally important.

These readings make me wonder about the cultural influences on what makes something easy to use, or beautiful. I was recently in Japan, a land well known for the design and usability of its everyday objects. Even as a non-Japanese speaker, I found some things easy to understand: a basket under your restaurant chair for your purse, for example.

Others were not. Many ramen restaurants have you order via a machine rather than telling the waitstaff (pictured above). The idea is great, but I unfortunately lacked the cultural knowledge and reading ability to figure parts of it out—like how you have to put your money in before pushing the buttons for your order, and that you have to hit the change button to get any change at all at the end.

You only have to give any modern 3-year-old an iPad to see how much culture determines whether or not something is easy to use, so I wonder what kind of cultural assumptions are in the background when a person understands how to use something as seemingly straightforward as Norman’s scissors.

The final reading this week was Tom’s blog post, Physical Computing’s Greatest Hits (and misses). It’s intimidating and inspiring at the same time to see all the types of projects that can be made with physical computing. What I like in particular is a sense of playfulness about most of them. We don’t necessarily have to create world peace with our designs—making someone smile can be a good enough reason to make something.

[pcomp week 1] What is physical interaction?

After reading the first two chapters of Chris Crawford’s The Art of Interactive Design and Bret Victor’s A Brief Rant on the Future of Interaction Design, the question “what is physical interaction?” reminds me of another question I’ve been trying to answer a lot recently, which is “what is ITP?” With both, it seems that the more you think about it and the more you try to come up with a solid answer, the more inadequate your definition feels.

Crawford addresses this subjectivity, but nonetheless puts forth a definition of interactivity as a conversation “in which two actors alternately listen, think, and speak.” He describes interactivity as something that exists on a continuum rather than in absolutes, and defines it also by what it is not: reaction and participation, for example. Upon first thought, a conversation makes sense to me as a starting point for thinking about how to define interaction. A conversation isn’t static or predictable; it’ll change and adapt according to what each participant says in each turn. Sounds interactive to me!

But does it still count as interactive if there are no humans in the conversation? The video above, showing two AI chatbots talking to each other, is certainly a conversation (as well as a pretty cool piece of digital technology), but I wouldn’t classify it as interactive because people are not a part of the actual interaction. At least not until we consider robots people, which, as far as I know, hasn’t happened yet.

Victor’s rant, similarly, encourages us to consider people when designing for interactivity. This is where the physical part kicks in. His blog post rages against the prevailing vision of the future that’s entirely screen-based, or as he calls it, “Pictures Under Glass.”

“We live in a three-dimensional world,” he writes. “I believe that our hands are the future.”

Physical interaction necessarily involves the body. As Victor argues, hands are under-considered as tools in design, but we should also think about the ways we can use other parts of the body to create physical interaction. And considering this in terms of the senses: what else can we use besides touch? It would be interesting to design for interaction by sound, sight, smell, and taste too.

What makes for good physical interaction? Maybe it’s what McLuhan considers to be “cool media,” or something that requires more active participation on behalf of the person or user to get something out of it. Or maybe it’s the other way around—something that gives you a wider array of output depending on how you interact with it. Like the way a light switch that turns the lights on and off is less of an interactive experience than a dial that lets you change your lights to all the colors of the rainbow.

But does more interaction mean good interaction? Does it make it a better interaction if you end up with stronger feelings about the experience? Or does that just make it better art? Maybe, the best physical interaction is one where the output is an experience tailored completely uniquely to your input, like a conversation. (Between humans.)