We did some playtesting last week for our final projects, which was quite helpful. I got a lot of interesting feedback about my true love Tinder robot.
For testing purposes, I had one of the few sensors I was planning on using (semi)working — the camera. I mostly used my computer screen to indicate what would happen with different facial-expression readings, and then I manually moved my stand-in robot hand.
Although I didn’t set up a perfect test, it was helpful to hear what people had to say and what other ideas they suggested. One thing I realized immediately was that having a second screen is too distracting, since users are supposed to be looking at the phone the whole time. I’ll likely replace that with audio output, so the robot speaks rather than showing something on a screen.
A big question that came up was the personality of the robot, and more generally how the robot will be presented. There was a suggestion to build an entire robot with a body and a face, which sounds interesting but… might be beyond my scope for now.
I’m now thinking about how much “personality” I can convey through the material of the hand and through how the robot speaks. Lots of work to be done!
Below are my initial BOM, plan, and system diagram.