Tag Archives: interactive

Wearable Remote Control (3)

A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Though our last prototype with the wrist band worked, we wanted to make the one with the sock work. The shape seemed to fit the function better.

We sewed foam onto the sock to prevent the plastic from collapsing like last time. We also tried to insulate the circuit as much as we could.


Even so, the results were different each time. They varied depending on whether we were using the Arduino Uno or the Fio, and whether it was connected to the computer through USB or the WiFly…


…and on whether or not we were touching the board. That led us to a problem discussed in the Capacitive Sensing tutorial: the board needs to be grounded. The page also explains a lot of the problems we had, like the laptop acting as a sensor too when connected to the board.
After that, we gave up on the Fio/WiFly and decided to work with the regular Uno for this prototype.
For the software part, we added calibration and “tap detection.” Now we finally have it controlling a video!
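A minimal sketch of what that calibration and tap-detection logic can look like (function names, thresholds, and window sizes are illustrative, not our actual code):

```javascript
// Calibration: average the first readings to learn the sensor's resting level.
// Tap detection: a short spike above the baseline that ends quickly.

const CALIBRATION_SAMPLES = 50;  // frames used to learn the resting level
const TAP_THRESHOLD = 1.5;       // reading must exceed baseline * threshold
const TAP_MAX_FRAMES = 8;        // a "tap" must end within this many frames

function createTapDetector() {
  const samples = [];
  let baseline = null;
  let framesAboveThreshold = 0;

  return function update(reading) {
    // Phase 1: calibration.
    if (baseline === null) {
      samples.push(reading);
      if (samples.length >= CALIBRATION_SAMPLES) {
        baseline = samples.reduce((a, b) => a + b, 0) / samples.length;
      }
      return { calibrated: baseline !== null, tap: false };
    }
    // Phase 2: count how long the reading stays above the threshold.
    if (reading > baseline * TAP_THRESHOLD) {
      framesAboveThreshold++;
      return { calibrated: true, tap: false };
    }
    // Reading dropped back: it was a tap only if the spike was short.
    const wasTap = framesAboveThreshold > 0 && framesAboveThreshold <= TAP_MAX_FRAMES;
    framesAboveThreshold = 0;
    return { calibrated: true, tap: wasTap };
  };
}
```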


Wearable Remote Control (2)

A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Software Development
a) 1st Prototype

After building a relatively stable device to measure conductivity along two axes (see previous post), we started working on the software. Though our purpose was to build a simple remote control, we started testing with a sort of trackpad — big mistake, maybe?
For this prototype, we used Processing and serial communication.
We first tried to assign an absolute position to the ball based on the finger’s position on the trackpad. That proved impossible, because people’s charge on the pad varied a lot.
So we made the charge give the ball a direction, like a joystick — the offset from the pad’s center is translated into a new direction for the ball.
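That joystick-style mapping can be sketched like this (names and constants are illustrative, not our actual sketch):

```javascript
// Instead of treating the pad reading as an absolute position, the offset
// from the pad's center becomes a velocity applied to the ball each frame.

const PAD_CENTER = { x: 0.5, y: 0.5 }; // normalized pad coordinates
const DEAD_ZONE = 0.05;                // ignore tiny offsets near the center
const SPEED = 8;                       // pixels per frame at full deflection

function joystickVelocity(padX, padY) {
  const dx = padX - PAD_CENTER.x;
  const dy = padY - PAD_CENTER.y;
  if (Math.hypot(dx, dy) < DEAD_ZONE) return { vx: 0, vy: 0 }; // finger near center
  // Direction from the center, scaled by how far the finger is from it.
  return { vx: dx * SPEED, vy: dy * SPEED };
}

function moveBall(ball, padX, padY) {
  const { vx, vy } = joystickVelocity(padX, padY);
  ball.x += vx;
  ball.y += vy;
  return ball;
}
```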

b) 2nd Prototype
After that, we translated the Processing sketch to JavaScript and changed the functions to control a video in the browser.
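A sketch of what that browser side can look like: gesture strings coming from the sensor code are mapped to calls on an HTML5 `<video>` element (the gesture names and step sizes here are illustrative, not our actual code):

```javascript
// Map a detected gesture to the corresponding <video> element call.
function applyGesture(video, gesture) {
  switch (gesture) {
    case 'tap':            // play/pause toggle
      if (video.paused) video.play(); else video.pause();
      break;
    case 'swipe-right':    // skip ahead 10 seconds
      video.currentTime += 10;
      break;
    case 'swipe-left':     // skip back 10 seconds, never below 0
      video.currentTime = Math.max(0, video.currentTime - 10);
      break;
    case 'swipe-up':       // volume up, clamped to 1
      video.volume = Math.min(1, video.volume + 0.1);
      break;
    case 'swipe-down':     // volume down, clamped to 0
      video.volume = Math.max(0, video.volume - 0.1);
      break;
  }
  return video;
}
```

In the browser, `video` would be the element returned by `document.querySelector('video')`.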

Hardware Development
a) 5th (?) Prototype

Meanwhile, we replicated the hardware circuit in a non-rigid device, to make it wearable. We sewed the conductive plastic on felt…


…and then on a sock:

Though it looked great as a superhero-like thing, the plastic collapsed and became barely conductive:

b) 6th Prototype

A much simpler and more stable solution was achieved when we simply put the plastic on an EVA foam wrist band:

Wearable Remote Control (1)

A wearable remote control with basic functions: rewind, fast-forward, volume up and down, and play/pause.
The concept plays with the idea that we’re always losing our remote controls, so the best place to keep them would be on our bodies.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Hardware Development
a) 1st Prototype
The project started with a different product in mind. We wanted to build a keyboard embedded in our pants, maybe using Engelbart’s chorded keyboard to reduce the number of keys needed.
We began by experimenting with conductive ink and paper.

That didn’t work out. Maybe the ink is not that conductive; maybe it was just a complete mess.
But we also started to rethink the concept from a user’s perspective. What device would make sense as a wearable remote? A five-finger keyboard probably wouldn’t. That’s how we got to the remote control.

b) 2nd Prototype
We started to experiment with a prototype that Ayo had previously developed. It uses aluminum foil and conductive fabric to create a sort of resistive sensor.


We also ran some tests using conductive fabric, but it all seemed too unstable and poorly conductive.

c) 3rd Prototype
We changed the material to conductive plastic, and it worked better. This prototype uses the Arduino Capacitive Sensing Library, and the circuit is mounted pretty much as in the library’s tutorial. However, we attached two wires, one on each side of the strip. By doing so, we could measure an approximate distance from the finger to each wire.
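The idea can be sketched as follows (illustrative, not our actual code): each wire’s reading grows as the finger gets closer to it, so the ratio between the two readings gives an approximate position along the strip.

```javascript
// Estimate where the finger is on the strip from the two wire readings.
// Returns 0 at the left wire, 1 at the right wire, null when nothing is touched.
function estimatePosition(readingLeft, readingRight) {
  const total = readingLeft + readingRight;
  if (total === 0) return null; // no touch detected
  return readingRight / total;  // fraction of "signal" closest to the right wire
}
```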

d) 4th Prototype
With the basic functionality solved, we added two more wires to get readings along both axes (x and y). We also tried to solve some insulation and conductivity problems using cardboard, copper tape, conductive plastic and alligator clips.

Next steps
Software development: serial communication and data filtering.
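For the filtering step, a simple moving average is one option — a sketch (the window size is illustrative): raw capacitive readings are noisy, so each value is replaced by the mean of the last few readings.

```javascript
// Returns a function that smooths a stream of readings with a moving average.
function createSmoother(windowSize = 5) {
  const window = [];
  return function smooth(reading) {
    window.push(reading);
    if (window.length > windowSize) window.shift(); // keep only the last N readings
    return window.reduce((a, b) => a + b, 0) / window.length;
  };
}
```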

Defrag-style Data Visualization

A visualization of your computer’s files in the style of Windows 98 defrag:

This project was the first assignment for Jamie Kosoy’s Dynamic Interfaces class. He asked us to design something that evoked 1998. Made in collaboration with Apon Palanuwech.

Each square represents a file, and each color represents a unique file type — js, jpeg, png, etc. Hovering over a square displays the file name and type.
The script is currently set to go up three levels from the folder where it is located. In my case, it reads all files from my user folder.

Code available on GitHub (you must run the app with Node.js).

The Little Prince – Final

Idea
An interactive app based on the book The Little Prince. The app covers the chapters in which the prince travels through seven planets before arriving on Earth.

The project draws inspiration from interactive children’s books, particularly those by Bruno Munari, which explore interaction with paper in non-traditional forms.
The purpose then was to make something analogous using a laptop. How can a user interact with it besides using mouse and keyboard?

See previous posts for more precedents and process.

The project uses three different inputs: camera, light sensor, and microphone. Each planet responds to only one of them — except for the first, which is meant to be a tutorial for the other parts.
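The per-planet routing can be sketched like this (the planet list, input names, and handler shape are illustrative): each scene declares the one input it listens to, and readings from the other inputs are ignored — except on the tutorial planet, which accepts all three.

```javascript
// Each scene lists which inputs it responds to.
const PLANETS = [
  { name: 'planet-1', inputs: ['camera', 'light', 'sound'] }, // tutorial: all three
  { name: 'planet-2', inputs: ['camera'] },
  { name: 'planet-3', inputs: ['light'] },
  { name: 'planet-4', inputs: ['sound'] },
];

// Forward a reading to the scene's handler only if the scene listens to it.
function routeInput(planet, inputType, value, handlers) {
  if (!planet.inputs.includes(inputType)) return false; // scene ignores this input
  handlers[inputType](value);
  return true;
}
```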

This was my last prototype before the final version, with all interactions and navigation working. The drawings are still in sketch form — though the final is a sort of sketch, too!


Original post: http://codeforart.cc/fall-2013/the-little-prince-final/

The final code is here.

Download the app here.

Interactive animation – 2nd Prototype

Christoph Niemann’s app “Petting Zoo” is pretty close to the interactions and animations I’m planning. In each scene, the user’s interaction triggers different animations:

Picture by Christoph Niemann

As for the drawing style, it will look more like this John Porcellino gif:

Main Interaction

This is my 2nd prototype for the Code for Art finals. I’ve changed the idea a little: the user will no longer play with all three inputs (sound, light and camera) in the same scene. Instead, I’ll have seven different planets from The Little Prince book. In each one, the user will play with a single input, as in the table below:

The first scene/planet will respond to all 3 inputs because it will also work as an introductory tutorial.

I decided to keep the navigation as a separate input, using keyboard. I think that it needs to be something more stable and less exploratory — and also clearly separate from the animation interactions.
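That separation can be sketched like this (the key bindings are illustrative): the keyboard only switches scenes and never touches the animations, which are driven by the sensor inputs.

```javascript
// Returns a key handler that steps through the scenes, clamped to the ends.
function createNavigator(sceneCount) {
  let current = 0;
  return function onKey(key) {
    if (key === 'ArrowRight') current = Math.min(sceneCount - 1, current + 1);
    else if (key === 'ArrowLeft') current = Math.max(0, current - 1);
    return current; // index of the scene to show
  };
}
```

In the browser, this handler would be wired to a `keydown` listener, while the sensor code updates the current scene independently.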

These are my sketches for the scenes so far. The next step is to integrate them with the coding from the previous prototype.

Original post: http://codeforart.cc/fall-2013/interactive-animation-2nd-prototype/

Wildcat Thaumatrope

Make a thaumatrope using an Arduino and a DC motor.

a) I began by checking whether the rotation speed was high enough to make it possible:

b) Then I tested a 3-faced version, but it didn’t work.

c) I decided to test one that would animate the image instead of just mixing it. It didn’t work out as I planned, though:

d) So I stuck to the idea of mixing two images. I drew this cat on an iPad, transferred it to paper, and then to the final boards.

Original post: http://makingtoys.net/2013/11/21/wildcat-thaumatrope/