[icm final] True Love Tinder Robot v1


Here we are, at the end of ICM! How time flies.

This is the first prototype of my ICM/pcomp final project, the True Love Tinder Robot. The fabrication and wiring aren't quite done yet, but since this is a post about my ICM documentation, the code is more complete, and that's the part I'll focus on here.

Here’s how it works:

You put the phone down in front of the robot hand with Tinder open. Then you put your hands on the sensors. The robot will begin talking, telling you that you have a few seconds to look carefully at each Tinder profile. As you're looking, the robot will read your heart's desire through the sensors and then swipe on your phone for you*, all the while announcing what it's doing.

*(okay the physical swiping part isn’t done, but will be next week!)


Now, about the code:

One of the biggest coding challenges in this project for me was working with counting and timing. A number of things happen in the project that depend on time: how long it takes to get a reading, how long you have before the robot swipes, how much time can pass with no sensor reading to determine that the user has left and that the next user should start from the beginning of the sequence. In Arduino, the only built-in timer is millis(), which counts the milliseconds from when the program launched. Therefore there’s no simple way to say, “when this event happens, do something for five seconds, and then do something else for three seconds, and then start over.”

I did a few things to solve this problem. The first was that I turned my series of convoluted if statements into a finite state machine, which made it much easier to organize.

//state machine
#define waiting 1
#define checkIfNewUser 2
#define newUser 3
#define giveIntro 4
#define beginSwiping 5
#define calculate 6
#define changeHasSwipedLeft 7
#define changeHasSwipedRight 8
#define swipeLeft 9
#define swipeRight 10
#define getNoSensorTime 11
#define ending 12
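
With the states defined, the whole loop() is basically one big switch statement on the current state. Here's a stripped-down skeleton of that structure (not the actual project code — the sensor pin and threshold are made up, just to show the shape of it):

int state = waiting;

void setup() {
  //sensor, servo, and speaker setup would go here
}

void loop() {
  switch (state) {
    case waiting:
      //stay here until someone puts their hands on the sensors
      if (analogRead(A0) > 300) { //made-up pin and threshold, for illustration only
        state = giveIntro;
      }
      break;
    case giveIntro:
      //play the intro speech, then move on to swiping
      state = beginSwiping;
      break;
    //...one case per state, each one deciding when to hand off to the next
  }
}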

Another was saving a millis() value when something crucial happened, and then taking the difference between that and the current millis() to act as a timer.

static unsigned long ts; //saved as millis() / 1000 whenever something crucial happens
timePassed = (millis() / 1000) - ts; //the amount of time passed is the current time minus the last time saved
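
That's enough to do the kind of thing I couldn't do directly: "do something for five seconds, then something else for three seconds, then start over." Here's a bare-bones sketch of just that pattern on its own (separate from my project code):

unsigned long ts = 0; //the time (in seconds) when the current phase started
int phase = 0;        //0 = the first thing, 1 = the second thing

void setup() {
  ts = millis() / 1000;
}

void loop() {
  unsigned long timePassed = (millis() / 1000) - ts;
  if (phase == 0 && timePassed >= 5) {
    //five seconds of the first thing are up: switch to the second thing
    phase = 1;
    ts = millis() / 1000; //restart the timer for the new phase
  } else if (phase == 1 && timePassed >= 3) {
    //three seconds of the second thing are up: start over
    phase = 0;
    ts = millis() / 1000;
  }
}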

And I used boolean flags to mark when an event happened.

    case changeHasSwipedLeft:
      hasSwipedLeft = false; //reset the flag so the swipe below only fires once
      state = swipeLeft;
      break;
    case swipeLeft:
      hasSwiped = true;
      if (timePassed >= 6 && hasSwipedLeft == false) {
        emic.speak(swipeLeftWords[iLeft]); //say one of the swipe-left lines out loud
        iLeft++;
        hasSwipedLeft = true; //mark that this swipe already happened
        handServo.write(servoLeft); //physically swipe left
        digitalWrite(swipeLeftRedLed, HIGH); //red light for a left swipe
        digitalWrite(swipeRightGreenLed, LOW);
      }
      if (timePassed >= 9) {
        state = beginSwiping; //move on to the next profile
      }
      break;

Another thing I thought about a lot was the personality of the robot. It was important to me that the robot speaks to you, both to explain what's going on and to add to the experience. I was inspired by the character GLaDOS from the video game Portal, an AI that's passive-aggressive, dark, and very funny. The lines I wrote for the robot were an attempt to give a similar sense of character, even in a short interaction.

A first prototype of my creepy robot (video posted to Instagram by Nicole He, @nicolehe)

Overall, I’m happy with the way this is turning out. The physical part isn’t finished yet, but I’m glad that the code is doing what I want it to. I learned a lot during the process, not just about specific coding techniques, but also about how to find and use libraries, upload to GitHub, and generally Google around for answers to my many questions.

My code is on GitHub:

https://github.com/nicolehe/true-love-tinder-robot

[icm week 10] Final project idea: True Love Tinder Robot

INTRODUCTION


Two things.

Number one: It’s common these days for people to use technology to look for sex and love. Some stats show that 1 in 4 couples have met online. Dating sites and apps like OkCupid and Tinder give us the tools to find the people we’re interested in, but we still strongly feel that the choices we make in our partners are our own. It’s not a stretch to think that as the algorithms get better and the technology advances, they’ll do an even better job of matching us with potential partners.


Number two: it’s also increasingly acceptable to buy consumer devices that tell us about what’s going on in our bodies. The Nike Fuelband, the Fitbit, and now the Apple Watch read our biometric data and tell us about our exercise habits and in turn suggest how we can improve ourselves.

Maybe it’s not a stretch to think that we’ll eventually combine these two things.

My final project idea, as mentioned in previous posts, is a True Love Tinder Robot. I want to explore the idea that the computer knows us better than we know ourselves, and that it has a better authority on who we should date than we do. Before I go into the specifics of my own project, here are a few examples of similar works other people have done.

INSPIRATION

1) Social Turkers: Crowdsourced Dating (2013) by Lauren McCarthy


Lauren, who is also my teacher for this class (hi Lauren!), did a project where she went on dates that were live-streamed in real time to Mechanical Turk workers, who gave her feedback as each date went on.

2) Lonely Sculpture (2014) by Tully Arnot

Tully made a finger that just says yes to every Tinder profile it comes across. The lifelike, grotesque finger in this project is quite evocative and gave me inspiration to use a lifelike hand for my own robot.

3) Tender (2015) by Cors Brinkman, Jeroen van Oorschot, Marcello Maureira, and Matei Szabo


Tender is literally a piece of meat that swipes right.

MY PROJECT


Short description:

The True Love Tinder Robot reads your true heart’s desire as you are looking at Tinder profiles, and then physically swipes on your phone for you. Literally put your love life in a robot’s hands.

Here’s the idea of how it works in its current iteration:

I’m planning on making a lifelike, robotic hand. The hand will sit on a box housing LEDs in a heart shape. In front of the hand will be a stand to hold the phone. There will also be a speaker on or next to the box.

The user puts their phone in front of it, with the Tinder app open. The user then puts their hand on the sensors next to the box. When the robot is ready, the LEDs will light up and the robot will say “I’m ready to find you love” (or something like that). The robot will also explain that you have five seconds to look at each profile, and then it will read your true heart’s desire and swipe accordingly. It then begins to swipe.

Based on the biometric data from the sensors as the user looks at each profile, the robot will decide whether or not the user should date that person. When the robot thinks the user is not interested, the hand will swipe left, the LEDs will turn red, and the robot will announce that they are not a good match. If the robot thinks the user should say yes to the person they’re looking at, the hand will swipe right, the LEDs will turn green, and the robot will announce that it’s a match.
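
In code, that decision will probably come down to a simple threshold check on the sensor reading. Here's a rough sketch of the idea — the sensor, pin numbers, servo angles, and threshold below are all placeholders, since I haven't picked the actual sensor yet (and the robot would also announce each decision out loud):

#include <Servo.h>

Servo handServo;
const int sensorPin = A0;          //placeholder: the biometric sensor, still undecided
const int swipeLeftRedLed = 7;     //placeholder pin numbers
const int swipeRightGreenLed = 8;
const int servoLeft = 30;          //placeholder servo angles for each swipe
const int servoRight = 150;
const int excitedThreshold = 500;  //placeholder: what counts as "excited"

void setup() {
  handServo.attach(9);
  pinMode(swipeLeftRedLed, OUTPUT);
  pinMode(swipeRightGreenLed, OUTPUT);
}

void loop() {
  int reading = analogRead(sensorPin); //read while the user looks at a profile
  if (reading > excitedThreshold) {
    handServo.write(servoRight);       //swipe right: green heart
    digitalWrite(swipeRightGreenLed, HIGH);
    digitalWrite(swipeLeftRedLed, LOW);
  } else {
    handServo.write(servoLeft);        //swipe left: red heart
    digitalWrite(swipeLeftRedLed, HIGH);
    digitalWrite(swipeRightGreenLed, LOW);
  }
  delay(5000);                         //one decision per profile, every five seconds
}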

Throughout the process, the robot will say things to reassure you to trust it.

PROGRESS

Last week I tested a camera and an emotion-reading library as a potential sensor. It was interesting, but I decided that I won’t use a screen because it’s distracting, and I probably won’t use a camera.

I haven’t decided yet what biometric sensor to use, so this week I’ll have to test the heart rate monitor, as well as try a GSR.

The bulk of the ICM-related work will be, I think, working with audio and making sure the robot responds appropriately.

GOALS & QUESTIONS

I want this project to be sort of amusing, kind of creepy, and slightly embarrassing. I want the user to feel a tension between the robot’s assurance that it knows best and their own uncertainty about whether to trust it. I want the user to question whether we should let a computer make intimate decisions for us.

Some questions:

  • How can I make the robot evoke the feeling that it has authority? (Besides calling it a robot.)
  • How can I make the experience interesting for spectators as well as the user?
  • What type of things should the robot “say”?
  • How should the interaction end?

 

[pcomp week 10] Final project BOM, system diagram, and plan


We did some playtesting last week for our final projects, which was quite helpful. I got a lot of interesting feedback about my true love Tinder robot.

For testing purposes, I had one of the few sensors I was planning on using (semi-)working: the camera. I used my computer screen mostly to indicate what would happen with different facial expression readings, and then I manually moved my stand-in robot hand.


Although I didn’t set up a perfect test, it was helpful to hear what people had to say and what other ideas they suggested. One thing I realized immediately was that having a second screen is too distracting, since users are supposed to be looking at the phone the whole time. I’ll likely replace it with audio output, so the robot speaks rather than showing something on a screen.

A big question that came up was the personality of the robot, and generally how the robot will be presented. There was a suggestion to build an entire robot with a body and a face, which sounds interesting but… might be beyond my scope for now.

I’m now thinking about how much “personality” I can convey in the material of the hand and in how the robot speaks. Lots of work to be done!

Below are my initial BOM, plan, and system diagram.


[icm week 9] Swipe (capture & CLM tracker)


(TLDR: See my Swipe sketch from this week here.)

I did something pretty simple this week. It’s just a part I decided to add to my Tinder Robot project, which I will likely also make my ICM final. The purpose of this sketch was mostly to test if I could use facial emotion reading as one of my sensor inputs for my robot.

I built this sketch in P5 off of the emotion-reading CLM tracker example, whose libraries are quite dauntingly complex…


…but I guess that’s why we use libraries rather than make everything from scratch.

Keeping with the idea of my Tinder robot, I wanted to use facial tracking and emotion reading as one of the sensor inputs that would determine whether the robot swipes left or right.

What this meant for this sketch was that if the user was making an expressive face (indicating strong emotions), it would swipe right. If the user was not making an expressive face (indicating ambivalence), it would swipe left.

The way I determined whether or not the user was making an expressive face was by adding up all the values the tracker gave for each emotion it detected. The higher the value, the more of that emotion you’re expressing. This should mean that a high total value indicates a strong emotional reaction.
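
The gist is just a sum and a threshold — sketched here in the same Arduino-style C as the rest of these posts rather than the actual p5.js, and with an arbitrary cutoff value:

const int numEmotions = 6;             //placeholder: however many emotions the tracker reports
float emotionValues[numEmotions];      //filled in from the tracker every frame
const float expressiveThreshold = 1.5; //arbitrary cutoff for "expressive enough"

bool isExpressive() {
  float total = 0;
  for (int i = 0; i < numEmotions; i++) {
    total += emotionValues[i]; //each value is how strongly one emotion is detected
  }
  return total > expressiveThreshold; //a big total means a strong reaction: swipe right
}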


I’m satisfied that the general concept of the sketch works, but it’s really just the bare minimum of what I have to do, so I’m considering this one just as an initial test.

Check it out here. My sketch code is below.


[pcomp week 8] Final project idea: Tinder Robot


This idea came to me as I was falling asleep a few weeks ago, which probably means that it’s either really good or really bad. I hope to find out which one it is soon!

My idea for my final project in pcomp is based around the popular dating app, Tinder. The way it works is that it shows you profiles of other users, and you can “swipe left” to say no, and “swipe right” to say yes. I think it would be interesting to have a computer automate the process for you by reading your body to determine whether or not you truly want to say yes or no to a potential match. Reading your “heart’s desire,” if you will.

The user, connected to some kind of biometric sensor (heart rate monitor? GSR sensor? both?), would look at a Tinder profile on the screen, and the computer would read whether or not they were excited by the profile. If not, a robot hand controlled by a motor or a servo of some kind would swipe left. If so, it would swipe right.

For demoing purposes, I’d set up some kind of dummy Tinder account that anyone can try. But I think this experiment would be even more interesting if users were willing to put their own phones and Tinder accounts on the line, so there are actually some (small) stakes for them.

There are a few questions, of course, that I need to answer moving forward:

  • What kind of sensors should be used?
  • How should the data be read to interpret interest or lack of interest?
  • What potential obstacles are there in the Tinder UI? (For example, I think if you and another user both swipe right, a different screen pops up. How would we deal with that?)
  • Should we use another display to show what the sensor readings are? (I drew one in the sketch above just in case.)