
Wearable Remote Control (3)

A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Though our last prototype with the wrist band worked, we wanted to make the one with the sock work. The shape seemed to fit the function better.

We sewed foam onto the sock to prevent the plastic from collapsing like last time. We also tried to isolate the circuit as much as we could.


Even so, the results were different each time. They varied depending on whether we were using the Arduino Uno or the Fio, and on whether it was connected to the computer through USB or WiFly…


…and on whether or not we were touching the board. That led us to a problem discussed in the Capacitive Sensor tutorial: the board needs to be grounded. The page also explains many of the problems we had, such as the laptop acting as a sensor too when connected to the board.
After that, we gave up on the Fio/WiFly and decided to work with the regular Uno for this prototype.
For the software part, we added calibration and “tap detection.” Now we finally have it controlling a video!
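The calibration and tap-detection logic can be sketched roughly like this — a minimal, hypothetical C++ sketch of the idea (names and thresholds are mine, not our actual Arduino code): calibration averages the first readings into a baseline, and a "tap" is a brief rise above that baseline that ends quickly.

```cpp
// Hypothetical sketch of the calibration + tap-detection logic.
// A tap is a reading that rises above the calibrated baseline by more
// than `threshold` and falls back within `maxTapSamples` updates.
struct TapDetector {
    long baseline = 0;
    int calibCount = 0;
    static const int kCalibSamples = 50;  // samples averaged for the baseline
    long threshold;
    int maxTapSamples;
    int aboveCount = 0;

    TapDetector(long thr, int maxLen) : threshold(thr), maxTapSamples(maxLen) {}

    // Feed one raw capacitive reading; returns true when a tap just ended.
    bool update(long reading) {
        if (calibCount < kCalibSamples) {           // calibration phase
            baseline += reading;
            if (++calibCount == kCalibSamples) baseline /= kCalibSamples;
            return false;
        }
        if (reading - baseline > threshold) {       // finger present
            ++aboveCount;
            return false;
        }
        // Finger released: a short touch counts as a tap.
        bool tap = aboveCount > 0 && aboveCount <= maxTapSamples;
        aboveCount = 0;
        return tap;
    }
};
```

A long press exceeds `maxTapSamples` and is ignored, which is what lets a tap be distinguished from simply resting a finger on the pad.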

Wearable Remote Control (2)

A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Software Development
a) 1st Prototype

After building a relatively stable device to measure conductivity along two axes (see previous post), we started working on the software. Though our purpose was to build a simple remote control, we started to test with a sort of trackpad — big mistake, maybe?
For this prototype, we used Processing and serial communication.
We first tried to assign an absolute position to the ball, based on the finger’s position on the trackpad. That proved to be impossible, because the charge readings varied too much from person to person.
So we made the charge give the ball a direction, like a joystick: the touch’s offset from the pad’s center is translated into a new direction for the ball.
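The mapping above can be sketched like this — a hypothetical C++ version of the idea in our Processing sketch (the names, dead-zone value, and `speed` parameter are assumptions): the offset from the pad’s center becomes a velocity added to the ball every frame, rather than an absolute position.

```cpp
#include <cmath>

// Hypothetical sketch of the joystick-style mapping: the touch's offset
// from the pad's center becomes a velocity for the ball, not a position.
struct Ball {
    float x, y;
    // padX/padY are normalized touch coordinates in [0,1]; 0.5 is the
    // pad's center. `speed` scales the offset into movement per frame.
    void steer(float padX, float padY, float speed) {
        float dx = padX - 0.5f;   // direction from center, x axis
        float dy = padY - 0.5f;   // direction from center, y axis
        const float dead = 0.05f; // dead zone so noise doesn't drift the ball
        if (std::fabs(dx) > dead) x += dx * speed;
        if (std::fabs(dy) > dead) y += dy * speed;
    }
};
```

The dead zone is the detail that makes this usable: without it, the noisy readings near the center keep nudging the ball even when nobody is touching the pad.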

b) 2nd Prototype
After that, we translated the Processing sketch to JavaScript and changed the functions to control a video in the browser.

Hardware Development
a) 5th (?) Prototype

Meanwhile, we replicated the hardware circuit in a non-rigid device, to make it wearable. We sewed the conductive plastic on felt…


…and then on a sock:

Though it looked great as a superhero-like accessory, the plastic collapsed and lost most of its conductivity:

Hardware Development
b) 6th Prototype

A much simpler and more stable solution was achieved when we simply put the plastic on an EVA foam wristband:

Wearable Remote Control (1)

A wearable remote control. Basic simple functions, like rewind, fast-forward, volume up and down, and play/pause.
The concept plays with the idea that we’re always losing our remote controls, so the best place to keep them would be on our bodies.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Hardware Development
a) 1st Prototype
The project started with a different product in mind. We wanted to build a keyboard embedded in our pants. Maybe using Engelbart’s Chorded Keyboard to reduce the number of keys needed.
We began by experimenting with conductive ink and paper.

That didn’t work out. Maybe because the ink is not that conductive, maybe because it was a complete mess.
But we also started to rethink the concept from a user’s perspective. What device would make sense as a wearable remote? A five-finger keyboard probably wouldn’t. That’s how we got to the remote control.

b) 2nd Prototype
We started to experiment with a prototype that Ayo had previously developed. It uses aluminum foil and conductive fabric to create a sort of resistive sensor.


We made some tests using conductive fabric too, but it all seemed too unstable and poorly conductive.

c) 3rd Prototype
We changed the material to conductive plastic and it worked better. This prototype uses the Arduino Capacitive Sensing Library, and the circuit is mounted pretty much like in the library’s tutorial. However, we added two wires, one on each side of the strip. By doing so, we could measure the approximate distance from the finger to each wire.
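The position estimate from those two wires can be sketched like this — a hypothetical C++ sketch, not our actual code: each wire yields a capacitive reading that grows as the finger gets closer to it, so the normalized share of one wire’s reading approximates where along the strip the finger is.

```cpp
// Hypothetical sketch of the two-wire position estimate: each wire gives
// a capacitive reading that grows as the finger gets closer to it, so the
// normalized difference approximates the finger's position along the
// strip (0.0 = at wire A, 1.0 = at wire B).
float positionAlongStrip(long readingA, long readingB) {
    long total = readingA + readingB;
    if (total == 0) return 0.5f;           // no touch: report the center
    return (float)readingB / (float)total; // closer to B => closer to 1
}
```

Using the ratio rather than either raw reading is what makes the estimate somewhat tolerant of how strongly a particular person charges the pad.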

d) 4th Prototype
With the basic functionality solved, we added two more wires to get readings along both axes (x and y). We also tried to solve some isolation and conductivity problems by using cardboard, copper tape, conductive plastic and alligator clips.

Next steps
Software development: serial communication and data filtering.

VR Inspirations (2)

Some references from the last class reminded me of other not-so-optimistic VR examples.
First, Wim Wenders’ movie Until the End of the World.

The plot is confusing and includes a pre-apocalyptic event. What I remember from 1999, though, is that everybody ends up addicted to this new technology that can record your dreams — and play them back for you.

Second, an episode of the TV series Black Mirror called “The Entire History of You.” In the episode, people continuously record their lives through a device — directly connected to their brains, it seems. As in the previous example, the virtual* ends up taking over reality, with characters getting addicted to past experiences.
*Though I’m not sure if the recorded scenes would be considered “virtual reality.”

VR Inspirations

I came across these two Oculus projects through this post on the Rhizome website.
It’s interesting that they are based on the same input — a bicycle — but produce radically different outputs.

PaperDude VR is a remake of an 8-bit game from 1984. Besides the bicycle input, it tracks the user’s gestures with a Kinect. It seems pretty accurate, judging from the video below.

Citytrip goes for an experimental approach. There doesn’t seem to be a goal besides exploring the 3D scenery. I’m curious about the results of such a surreal VR experience.

The Status Box – Final

1. Idea
A box that displays the user’s status — busy, available, etc. Useful for people working in semi-public spaces. One could tell by looking at the box whether a person is focused on work or just checking Facebook, for instance.

2. Development
2.1. Technology
The box doesn’t have any internet connection. The status is changed by physically turning it. Also, it should be as inexpensive as possible. Instead of detecting the angle with an accelerometer, it uses four tilt switches — two for each axis, x and y. It works as in the following sketch.
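The decoding logic could look something like this — a hypothetical C++ sketch (the status names and the mapping of faces to statuses are my assumptions, not the actual firmware): each pair of tilt switches tells which way the box leans on one axis, and the combination of the four readings picks the status.

```cpp
// Hypothetical sketch of reading orientation from 4 tilt switches
// (2 per axis) instead of an accelerometer. Each pair reports which way
// the box leans on that axis; the combination selects the status face.
enum Status { FLAT, BUSY, AVAILABLE, AWAY, DO_NOT_DISTURB };

Status decode(bool xPos, bool xNeg, bool yPos, bool yNeg) {
    if (xPos && !xNeg) return BUSY;            // tilted toward the +x face
    if (xNeg && !xPos) return AVAILABLE;       // tilted toward the -x face
    if (yPos && !yNeg) return AWAY;            // tilted toward the +y face
    if (yNeg && !yPos) return DO_NOT_DISTURB;  // tilted toward the -y face
    return FLAT;                               // resting on its base
}
```

Four switches give five usable resting states (four sides plus the base), which is why two per axis are enough to replace the accelerometer for this job.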

2.2. 1st Prototype
I’ve programmed all functions and assembled the circuit on the breadboard before putting anything inside the enclosure:

2.3. Enclosure
The box is made out of plexiglass and assembled with bolts and nuts only. I drew its plan based on the model found here.
Once again, a huge thank you to Brendan Byrne for the tip!
I had no experience with laser-cutting plexiglass, so I ended up melting it. Anyway, it was useful for checking whether the box plan was right.

2.4. 2nd Prototype
With everything working on the breadboard, I simply stuck it into my enclosure — along with the battery and the Arduino board.
The switches are a little unstable, which made the colors flicker while I moved the box. But the main function seemed to work fine.
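One way to tame that flicker is debouncing — here is a minimal, hypothetical C++ sketch of the idea (names and sample counts are assumptions): a new switch state is accepted only after it has been read consistently for several updates in a row.

```cpp
// Hypothetical debounce sketch for flickering tilt switches: a new state
// is accepted only after it has been read consistently for
// `stableSamples` consecutive updates.
struct Debouncer {
    int state;           // last accepted switch state
    int candidate = -1;  // state waiting to be confirmed
    int count = 0;
    int stableSamples;

    Debouncer(int initial, int needed) : state(initial), stableSamples(needed) {}

    // Feed one raw reading; returns the debounced (accepted) state.
    int update(int reading) {
        if (reading == state) {            // matches current state: reset
            candidate = -1;
            count = 0;
            return state;
        }
        if (reading != candidate) {        // a new candidate state appears
            candidate = reading;
            count = 1;
        } else if (++count >= stableSamples) {
            state = candidate;             // held long enough: accept it
            candidate = -1;
            count = 0;
        }
        return state;
    }
};
```

Counting stable samples instead of using a fixed delay keeps the loop responsive: brief bounces are absorbed, but a real tilt still registers after just a few reads.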

2.5. Final Circuit and Board Assembling
Once again, this was the hardest part. I used solder AND hot glue, because working with the tape in the previous project was a pain. It didn’t make things easier, though. I built an x and y axis with toothpicks for the switches.
Also, I had a little less space. Because of that, I ended up gluing the battery to my Arduino.

The Little Prince – Final

Idea
An interactive app based on the book The Little Prince. The app covers the chapters in which the prince travels through seven planets before arriving on Earth.

The project draws inspiration from interactive children’s books, particularly the ones by Bruno Munari, which explore interaction with paper in non-traditional forms.
The purpose, then, was to make something analogous using a laptop. How can a user interact with it besides using the mouse and keyboard?

See previous posts for more precedents and process.

The project uses three different inputs: camera, light sensor and speaker. Each planet responds to only one of them — except the first, which is supposed to be a tutorial for the other parts.

This was my last prototype before the final version. All interactions and navigation are working. The drawings are still in sketch form — though the final is a sort of sketch, too!


Original post: http://codeforart.cc/fall-2013/the-little-prince-final/

The final code is here.

Download the app here.