[live web week 10] final project idea: kitchencam


For my final project, I’m interested in creating something that allows random people on the internet to control one particular thing in real life. Before I talk specifically about my project idea, I’ll talk a bit about my references.

REFERENCES

The Telegarden


The Telegarden is an art installation that allows web users to view and interact with a remote garden filled with living plants. Members can plant, water, and monitor the progress of seedlings via the tender movements of an industrial robot arm.

Twitch Plays Pokemon


Twitch Plays Pokemon was an experiment that allowed internet users to collaboratively play a game of Pokemon Red by controlling what happened in the game via chat.

Jennicam


Jennifer Ringley was one of the first “lifecasters,” using a webcam to take a photo of her bedroom every 15 minutes for years. It was a way for anyone on the internet to see what was happening in one person’s daily life.

MY PROJECT

What I want to do is set up a webcam in my kitchen. I’ll build a webpage where anyone can see what’s currently happening in my kitchen, but they’ll have to click a button that takes a picture first.

I think this is an interesting idea because it’s a little different from typical “lifecasting” or livestreaming. It’s easy to be voyeuristic on the internet, just by stumbling across anyone’s personal livestream. But because the user has to click a button to capture a photo and see what’s happening, the user is a little more complicit in the voyeurism.

Personally, I’m not interested in being particularly exhibitionistic (in the bedroom style of Jennicam), but I figure that getting to look into someone’s kitchen at any point in the day is somewhat intimate. You can learn a lot by seeing what someone does in the kitchen – seeing how they make their coffee, whether they’re cooking or microwaving leftover takeout, why they’re eating cereal at midnight or drinking at noon.

TECH

I’m not 100% sure how I’ll build this out, but I imagine I’ll use a Raspberry Pi or something similar for the webcam and use socket servers to connect to a webpage. We’ll see…!

[arcade week 8] final cabinet ideas

For our final arcade cabinet, I’m working with Jesse, Paula, and Ian.

We aren’t entirely sure what our game will be, but we’ve agreed upon a few things:

  • Probably prom themed
  • Lighthearted and positive in tone
  • Possibly exploring feelings of awkwardness
  • Maybe dancing related?
  • The cabinet will uniquely reflect the game in some way

We are inspired by the Realistic Kissing Simulator game. Hopefully we’ll narrow it down soon!

[icm final] True Love Tinder Robot v1


Here we are, at the end of ICM! How time flies.

This is the first prototype of my ICM/pcomp final project, the True Love Tinder Robot. The fabrication and wiring aren’t quite done yet, but since this is a post for my ICM documentation, I’ll focus on the code, which is more complete.

Here’s how it works:

You put the phone down in front of the robot hand with Tinder open. Then you put your hands on the sensors. The robot will begin talking, telling you that you have a few seconds to look carefully at each Tinder profile. As you’re looking, the robot will read your heart’s desire through the sensors, and then swipe on your phone for you*, all the while announcing what it’s doing.

*(okay the physical swiping part isn’t done, but will be next week!)


Now, about the code:

One of the biggest coding challenges in this project for me was working with counting and timing. A number of things in the project depend on time: how long it takes to get a reading, how long you have before the robot swipes, and how much time can pass with no sensor reading before deciding that the user has left and the next user should start from the beginning of the sequence. In Arduino, the main built-in timekeeping function is millis(), which counts the milliseconds since the program launched. So there’s no simple way to say, “when this event happens, do something for five seconds, and then do something else for three seconds, and then start over.”

I did a few things to solve this problem. The first was that I turned my series of convoluted if statements into a finite state machine, which made it much easier to organize.

//state machine
#define waiting 1
#define checkIfNewUser 2
#define newUser 3
#define giveIntro 4
#define beginSwiping 5
#define calculate 6
#define changeHasSwipedLeft 7
#define changeHasSwipedRight 8
#define swipeLeft 9
#define swipeRight 10
#define getNoSensorTime 11
#define ending 12

Another was saving a millis() value when something crucial happened, and then taking the difference between that and the current millis() to act as a timer.

static unsigned long ts;               // set to millis() / 1000 when the crucial event happens
timePassed = (millis() / 1000) - ts;   // the amount of time passed is the current time minus the last time saved

And I used boolean flags to mark when an event happened.

    case changeHasSwipedLeft:
      hasSwipedLeft = false;                   // reset the flag so the swipe only happens once
      state = swipeLeft;
      break;
    case swipeLeft:
      hasSwiped = true;
      if (timePassed >= 6 && hasSwipedLeft == false) {
        emic.speak(swipeLeftWords[iLeft]);     // announce the rejection out loud
        iLeft++;
        hasSwipedLeft = true;
        handServo.write(servoLeft);            // physically swipe left
        digitalWrite(swipeLeftRedLed, HIGH);   // red light on...
        digitalWrite(swipeRightGreenLed, LOW); // ...green light off
      }
      if (timePassed >= 9) {
        state = beginSwiping;                  // move on to the next profile
      }
      break;
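
To make the pattern clearer, here’s a stripped-down sketch (not my full project code; the sensor check, pin, and threshold are stand-ins) of how the state variable, the switch in loop(), and a saved timestamp work together to handle timing without blocking:

// Simplified sketch of the pattern: a state variable, a switch in loop(),
// and a timestamp that gets saved whenever the state changes.
#define waiting      1
#define beginSwiping 5

int state = waiting;
unsigned long ts = 0;          // time (in seconds) at the last state change
unsigned long timePassed = 0;  // seconds since the last state change

bool handsOnSensors() {
  return analogRead(A0) > 100; // stand-in check on a made-up sensor pin and threshold
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  timePassed = (millis() / 1000) - ts;  // update the timer on every pass through loop()

  switch (state) {
    case waiting:
      if (handsOnSensors()) {
        ts = millis() / 1000;           // restart the timer when the state changes
        state = beginSwiping;
      }
      break;

    case beginSwiping:
      if (timePassed >= 5) {            // give the user five seconds to look at the profile...
        // ...then read the sensors, pick swipeLeft or swipeRight,
        // save ts again, and move to that state.
      }
      break;
  }
}

The real code has many more states (the full list of #defines is above), but each one follows the same shape: check the timer, do its thing once, and hand off to the next state.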

Another thing I thought about a lot was the personality of the robot. It was important to me that the robot speaks to you, both to explain what’s going on and to add to the experience. I was inspired by the character GLaDOS from the video game Portal, an AI who is passive-aggressive, dark, and very funny. The lines I wrote for the robot were an attempt to give a similar sense of character, even for a short interaction.
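
Concretely, the lines live in arrays of strings that get cycled through as the robot swipes. Here’s a tiny, self-contained sketch of that idea (the phrases are just examples I made up for this post, and Serial.println stands in for the Emic 2 text-to-speech call in the real code):

// Example phrases only, made up for this sketch to show the structure.
const char* swipeLeftWords[] = {
  "Not this one. Trust me.",
  "You can do better. I know you can.",
  "No. Next."
};
const int numLeftWords = 3;
int iLeft = 0;

void announceLeftSwipe() {
  Serial.println(swipeLeftWords[iLeft]);  // in the real project this is the emic.speak() call
  iLeft = (iLeft + 1) % numLeftWords;     // wrap around so the robot never runs out of lines
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  announceLeftSwipe();
  delay(9000);                            // pause between lines; real timing comes from the state machine
}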

[Instagram video: “A first prototype of my creepy robot,” posted by Nicole He (@nicolehe)]

Overall, I’m happy with the way this is turning out. The physical part is not finished yet, but I’m glad that the code is doing what I want it to. I learned a lot during the process, not just about specific coding techniques, but also about how to find and use libraries, upload to GitHub, and generally Google around for answers to my many questions.

My code is on GitHub.

https://github.com/nicolehe/true-love-tinder-robot

[icm week 10] Final project idea: True Love Tinder Robot

INTRODUCTION


Two things.

Number one: It’s common these days for people to use technology to look for sex and love. Some stats show that 1 in 4 couples have met online. Dating sites and apps like OkCupid and Tinder give us the tools to find the people we’re interested in, but we still strongly feel that the choices we make in our partners are our own. It’s not a stretch to think that as the algorithms get better and the technology advances, they’ll do an even better job of matching us with potential partners.


Number two: it’s also increasingly acceptable to buy consumer devices that tell us what’s going on in our bodies. The Nike FuelBand, the Fitbit, and now the Apple Watch read our biometric data, tell us about our exercise habits, and in turn suggest how we can improve ourselves.

Maybe it’s not a stretch to think that we’ll eventually combine these two things.

My final project idea, as mentioned in previous posts, is a True Love Tinder Robot. I want to explore the idea that the computer knows us better than we know ourselves, and that it has a better authority on who we should date than we do. Before I go into the specifics of my own project, here are a few examples of similar works other people have done.

INSPIRATION

1) Social Turkers: Crowdsourced Dating (2013) by Lauren McCarthy


Lauren, who is also my teacher for this class (hi Lauren!), did a project where she went on dates that were live-streamed in real time to Mechanical Turk workers, who gave her feedback as the date went on.

2) Lonely Sculpture (2014) by Tully Arnot

Tully made a finger that just says yes to every Tinder profile it comes across. The lifelike, grotesque finger in this project is quite evocative and gave me inspiration to use a lifelike hand for my own robot.

3) Tender (2015) by Cors Brinkman, Jeroen van Oorschot, Marcello Maureira, and Matei Szabo


Tender is literally a piece of meat that swipes right.

MY PROJECT


Short description:

The True Love Tinder Robot reads your true heart’s desire as you are looking at Tinder profiles, and then physically swipes on your phone for you. Literally put your love life in a robot’s hands.

Here’s the idea of how it works in its current iteration:

I’m planning on making a lifelike, robotic hand. The hand will sit on a box housing LEDs in a heart shape. In front of the hand will be a stand to hold a phone. There will also be a speaker on or next to the box.

The user puts their phone in front of it, with the Tinder app open. The user then puts their hand on the sensors next to the box. When the robot is ready, the LEDs will light up and the robot will say “I’m ready to find you love” (or something like that). The robot will also explain that you have five seconds to look at each profile, and then it will read your true heart’s desire and swipe accordingly. It then begins to swipe.

Based on the biometric data from the sensors as the user is looking at each profile, the robot will decide whether or not the user should date that person. When the robot thinks the user is not interested, the hand will swipe left, the LEDs will turn red, and the robot will announce that they are not a good match. On the other hand, if the robot thinks the user should say yes to the person they are looking at, the hand will swipe right, the LEDs will turn green, and the robot will announce that it’s a match.

Throughout the process, the robot will say things to reassure you to trust it.
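
To make that concrete before I build it, here’s a rough, deliberately simplified sketch of how one reading could turn into a swipe. The GSR sensor choice, pins, threshold, servo angles, and blocking delays are all placeholders, not the final design:

#include <Servo.h>

// Every pin, threshold, and angle below is a placeholder, not the final design.
const int gsrPin = A0;                 // analog pin for a GSR (galvanic skin response) sensor
const int swipeLeftRedLed = 7;
const int swipeRightGreenLed = 8;
const int servoLeft = 30;              // servo angle for a left swipe
const int servoRight = 150;            // servo angle for a right swipe
const int threshold = 300;             // divides "excited" from "not interested"

Servo handServo;

void setup() {
  handServo.attach(9);
  pinMode(swipeLeftRedLed, OUTPUT);
  pinMode(swipeRightGreenLed, OUTPUT);
}

// Average the sensor over the five-second viewing window
int readExcitement() {
  long total = 0;
  for (int i = 0; i < 50; i++) {
    total += analogRead(gsrPin);
    delay(100);                        // 50 samples x 100 ms = about 5 seconds
  }
  return total / 50;
}

void loop() {
  if (readExcitement() > threshold) {
    // excited: swipe right, green heart
    handServo.write(servoRight);
    digitalWrite(swipeRightGreenLed, HIGH);
    digitalWrite(swipeLeftRedLed, LOW);
  } else {
    // calm: swipe left, red heart
    handServo.write(servoLeft);
    digitalWrite(swipeLeftRedLed, HIGH);
    digitalWrite(swipeRightGreenLed, LOW);
  }
  delay(2000);                         // pause before the next profile
}

The real version also needs the voice and something smarter than a single threshold, but the basic loop is the same: look, measure, decide, swipe.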

PROGRESS

Last week I tested a camera and an emotion-reading library as a potential sensor. It was interesting, but I decided that I won’t use a screen because it’s distracting, and I probably won’t use a camera.

I haven’t decided yet which biometric sensor to use, so this week I’ll have to test the heart rate monitor, as well as try a GSR (galvanic skin response) sensor.

The bulk of the ICM-related work will be, I think, working with audio and making sure the robot responds appropriately.

GOALS & QUESTIONS

I want this project to be sort of amusing, kind of creepy, and slightly embarrassing. I want the user to feel a tension between the robot’s assurances that it knows best and their own uncertainty about whether to trust it. I want the user to question whether or not we should let a computer make intimate decisions for us.

Some questions:

  • How can I make the robot evoke the feeling that it has authority? (Besides calling it a robot.)
  • How can I make the experience interesting for spectators as well as the user?
  • What type of things should the robot “say?”
  • How should the interaction end?

 

[pcomp week 10] Final project BOM, system diagram, and plan

We did some playtesting last week for our final projects, which was quite helpful. I got a lot of interesting feedback about my true love Tinder robot.

For testing purposes, I had one of the few sensors I was planning on using (semi-)working: the camera. I mostly used my computer screen to indicate what would happen with different facial expression readings, and then I manually moved my stand-in robot hand.


Although I didn’t set up a perfect test, it was helpful to hear what people had to say and what other ideas they suggested. One thing I realized immediately was that having a second screen is too distracting, since users are supposed to be looking at the phone the whole time. I’ll likely replace the screen with audio output, so the robot speaks rather than showing something on a screen.

A big question that came up was the personality of the robot, and generally how the robot will be presented. There was a suggestion to build an entire robot with a body and a face, which sounds interesting but… might be beyond my scope for now.

I’m now thinking about how much “personality” I can convey in the material of the hand and in how the robot speaks. Lots of work to be done!

Below are my initial BOM, plan, and system diagram.

[Images: BOM spreadsheet, project plan, and system diagram]

[video & sound week 7] How To Fly A Kite (final video)

Aaron, Melody and I had a great time filming and editing our little video about flying a kite. We went to Prospect Park on a lovely beginning-of-fall day a couple weeks ago with a camera, tripod, kite, and a heavy dose of optimism about wind conditions.

We set up the camera, and then decided that the two “characters” from our storyboard should be consistent throughout. It ended up being me as the main kite flyer, Aaron as the kite thrower, and Melody as the main person behind the camera.

The park wasn’t too crowded and it was a beautiful day, so filming conditions were pretty good. We found that it wasn’t too hard to get the kite in the air, but the wind wasn’t strong enough to keep it up for more than a couple minutes. For our movie-making purposes, however, this was fine.

One of the challenges was that as the kite flyer, I generally had to run around to keep the kite up, which made it difficult for Melody to capture all the action in a smooth way on film. In order to get shots of the kite from below, Melody had to run around the park with me, all while craning her head and the camera up to get the footage.

After a successful afternoon of shooting, we rewarded ourselves with some ice cream floats.


When it came time to edit, we structured the video as we originally envisioned it: in three main steps, from set-up, to getting the kite in the air, to keeping the kite in the air. We figured we would add either voice over or more titles to explain some of the in between steps, like testing the wind, or throwing the kite.

After getting feedback on our rough cut, we decided that the video didn’t really need any text-based instruction because visually, the steps were pretty straightforward. We found two tracks we liked from freemusicarchive.org by the artist Podington Bear that matched well with the kind of cute and lighthearted feeling we were going for.


Matching the cuts to the beats brought it all together.

Overall, I found it to be a really helpful experience to make a video from start to finish. Watch it above!