
Dynamic Interfaces Midterm – Playtest

In collaboration with Apon Palanuwech.

Fortunately, Playtech was happening this weekend, so we had a chance to playtest the game with more people. Things worked out better than we planned: we managed to have 18 people playing at once. The mechanics seem to work nicely, too!
[Photos from the playtest at Playtech]


Dynamic Interfaces Midterm – Final

In collaboration with Apon Palanuwech.

[Screenshot of the final game]

We changed the game mechanics to work like a sort of soccer game. The first prototypes still used the old controls, which were not very responsive. Also, allowing users to drag and rotate the bars on their phones overloaded the websockets communication.

We playtested with some friends and got to a simpler solution: use a simple swipe to move the objects. That way, the data would be sent to the server only when the users released their fingers.
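On the phone side, the swipe handling boils down to something like the sketch below. This is illustrative only, not our exact code; the socket.io client and the "swipe" event name are assumptions.

var socket = io(); // socket.io client, assumed to be loaded on the page
var start = null;

document.addEventListener('touchstart', function (e) {
  var t = e.changedTouches[0];
  start = { x: t.clientX, y: t.clientY };
});

document.addEventListener('touchend', function (e) {
  if (!start) return;
  var t = e.changedTouches[0];
  // One message per gesture, instead of a stream of drag/rotate updates,
  // keeps the websockets traffic low.
  socket.emit('swipe', { dx: t.clientX - start.x, dy: t.clientY - start.y });
  start = null;
});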

To play the game, go to www.apon.io/live/pelada on your computer. Then access the same address on your phone.

Dynamic Interfaces Midterm – Prototype

In collaboration with Apon Palanuwech.

Research
We spent weeks wondering about our proposals for this project and taking a look at different technologies that could help us — timbre.js, d3.js and SVG paths.
Then we decided to go for the Lemmings idea and build it in one day, in a sort of hackathon.

Development
We split the team into server and client side. Apon built the whole websockets communication, and I started to build the game engine and visuals using a physics engine called matter.js.
[Screenshot of the first prototype]
It was hard to mix a physics game with Lemmings, though. The bars were terribly hard to control in a gravity-based simulation. We experimented with lots of different game mechanics — catching the balls with the bars, hitting the balls, etc. — and ended up removing gravity from the game. Even so, the controls were not responsive enough.
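The stripped-down setup looked roughly like the sketch below. This is illustrative only: the matter.js API names may differ between versions, and the sizes, counts and forces are made up.

var Engine = Matter.Engine,
    World  = Matter.World,
    Bodies = Matter.Bodies;

var engine = Engine.create();
engine.world.gravity.y = 0; // gravity made the bars impossible to control, so it's off

// The "tribe": a bunch of rolling balls.
var balls = [];
for (var i = 0; i < 10; i++) {
  balls.push(Bodies.circle(100 + i * 30, 200, 10, { restitution: 0.8 }));
}

// One player-controlled bar per connected phone (sizes are made up).
var bar = Bodies.rectangle(400, 300, 120, 15, { frictionAir: 0.1 });

World.add(engine.world, balls.concat([bar]));

// Step the simulation; a swipe coming in over websockets would translate into
// Matter.Body.applyForce(bar, bar.position, { x: dx * k, y: dy * k }).
setInterval(function () {
  Engine.update(engine, 1000 / 60);
}, 1000 / 60);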

Dynamic Interfaces Midterm – Proposals

In collaboration with Apon Palanuwech.

a) Worm Tunes
A visual music synthesizer, inspired by the amazing Patatap. Users can draw their musical “worms” on their phones. Shapes and curves would be translated into sound and then plugged into the last segment of the collective worm. The final result would be a collective piece of music and drawing.
[Worm Tunes sketch]
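To give a sense of the “shapes into sound” part, the mapping could be as simple as the sketch below. It uses the plain Web Audio API rather than timbre.js, and the pitch mapping is an arbitrary assumption, not a worked-out design.

var ctx = new (window.AudioContext || window.webkitAudioContext)();

function playWorm(points) { // points: [{x, y}, ...] captured from the phone's canvas
  var now = ctx.currentTime;
  points.forEach(function (p, i) {
    var osc = ctx.createOscillator();
    var gain = ctx.createGain();
    // Higher on the screen = higher pitch (an arbitrary mapping for this sketch).
    osc.frequency.value = 220 + (1 - p.y / window.innerHeight) * 660;
    gain.gain.value = 0.2;
    osc.connect(gain);
    gain.connect(ctx.destination);
    osc.start(now + i * 0.1);
    osc.stop(now + i * 0.1 + 0.1);
  });
}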

b) Help the Tribe
A Lemmings-like game. Users control virtual bars from their phones and should help the “tribe” — a bunch of rolling balls or polygons — overcome the obstacles and reach the final goal.
[Help the Tribe sketch]

Wearable Remote Control (3)

Idea
A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Development
Though our last prototype with the wristband worked, we wanted to make the one with the sock work. The shape seemed to fit the function better.

We sewed foam to the sock to prevent the plastic from collapsing like last time. We also tried to isolate the circuit as much as we could.


Even so, the results were different each time. They varied depending on whether we were using the Arduino Uno or the Fio, whether it was connected to the computer through USB or the WiFly…


…and whether or not we were touching the board. That led us to a problem discussed in the Capacitive Sensor tutorial: the board needs to be grounded. The page also explains a lot of the problems we had, like the laptop acting as a sensor too when connected to the board.
After that, we gave up on the Fio/WiFly and decided to work with the regular Uno for this prototype.
For the software part, we added calibration and “tap detection.” Now we finally have it controlling a video!
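The calibration and “tap detection” boil down to something like the sketch below. It is illustrative only: the threshold, sample count and timing values are made up, and the function names are ours for illustration.

var baseline = 0, samples = 0, CALIBRATION_SAMPLES = 100;
var touching = false, touchStart = 0, TAP_MS = 250, THRESHOLD = 1.5;

function onTap() { /* e.g. toggle play/pause on the video */ }

function onReading(value) {            // called for every value coming from the Arduino
  if (samples < CALIBRATION_SAMPLES) { // the first readings establish the untouched baseline
    baseline += value / CALIBRATION_SAMPLES;
    samples++;
    return;
  }
  var isTouch = value > baseline * THRESHOLD;
  if (isTouch && !touching) {          // finger just landed
    touching = true;
    touchStart = Date.now();
  } else if (!isTouch && touching) {   // finger just left
    touching = false;
    if (Date.now() - touchStart < TAP_MS) onTap(); // a short touch counts as a tap
  }
}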

Wearable Remote Control (2)

Idea
A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Software Development
a) 1st Prototype

After building a relatively stable device to measure conductivity along two axes (see previous post), we started working on the software. Though our purpose was to build a simple remote control, we started to test with a sort of trackpad — big mistake, maybe?
For this prototype, we used Processing and serial communication.
We first tried to assign an absolute position to the ball, based on the finger position on the trackpad. That proved to be impossible, because each person’s charge on the pad varied a lot.
So we made the charge give the ball a direction, like a joystick — the offset from the pad’s center is translated into a new direction for the ball.
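The mapping is roughly the sketch below, rewritten in JavaScript for illustration (the Processing prototype differed in detail); the center value and speed factor are assumptions.

var CENTER = { x: 512, y: 512 }; // assumed mid-point of the sensor range
var SPEED = 0.01;
var ball = { x: 200, y: 200 };

function updateBall(reading) { // reading: { x, y } values from the pad
  // The offset from the pad's center becomes a velocity, not an absolute position.
  ball.x += (reading.x - CENTER.x) * SPEED;
  ball.y += (reading.y - CENTER.y) * SPEED;
}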

b) 2nd Prototype
After that, we translated the Processing sketch to JavaScript and changed the functions to control a video in the browser.
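The video control itself is just the standard HTML5 video API; roughly the sketch below, with the function names chosen here for illustration.

var video = document.querySelector('video');

function togglePlay()  { video.paused ? video.play() : video.pause(); }
function fastForward() { video.currentTime += 10; } // seconds
function rewind()      { video.currentTime -= 10; }
function volumeUp()    { video.volume = Math.min(1, video.volume + 0.1); }
function volumeDown()  { video.volume = Math.max(0, video.volume - 0.1); }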

Hardware Development
a) 5th (?) Prototype

Meanwhile, we replicated the hardware circuit in a non-rigid device to make it wearable. We sewed the conductive plastic onto felt…


…and then onto a sock.

Though it looked great, like something a superhero would wear, the plastic collapsed and became barely conductive.

b) 6th Prototype

A much simpler and more stable solution was achieved when we simply put the plastic on an EVA foam wristband.

Wearable Remote Control (1)

Idea
A wearable remote control. Basic functions only, like rewind, fast-forward, volume up and down, and play/pause.
The concept plays with the idea that we’re always losing our remote controls, so the best place to keep them would be on our bodies.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Hardware Development
a) 1st Prototype
The project started with a different product in mind. We wanted to build a keyboard embedded in our pants, maybe using Engelbart’s chorded keyboard to reduce the number of keys needed.
We began by experimenting with conductive ink and paper.

That didn’t work out. Maybe because the ink is not that conductive, maybe because it was a complete mess.
But we also started to rethink the concept from a user’s perspective. What device would make sense as a wearable remote? A five-finger keyboard probably wouldn’t. That’s how we got to the remote control.

b) 2nd Prototype
We started to experiment with a prototype that Ayo had previously developed. It uses aluminum foil and conductive fabric to create a sort of resistive sensor.


We made some tests using conductive fabric too, but it all seemed too unstable and not conductive enough.

c) 3rd Prototype
We changed the material to conductive plastic and it worked better. This prototype uses the Arduino Capacitive Sensing Library, and the circuit is mounted pretty much like in the library’s tutorial. However, we attached two wires, one on each side of the strip. By doing so, we could measure an approximate distance from the finger to each wire.
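Roughly, the two readings can be combined into a position along the strip, something like the sketch below. This is an illustrative, idealized version; the real readings are noisy and need calibration first.

function estimatePosition(readingLeft, readingRight) {
  var total = readingLeft + readingRight;
  if (total === 0) return null; // no touch detected
  // 0 = finger right next to the left wire, 1 = right next to the right wire.
  return readingRight / total;
}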

d) 4th Prototype
With the basic functionality solved, we added two more wires to get readings from the two axes (x and y). We also tried to solve some isolation and conductivity problems by using cardboard, copper tape, conductive plastic and alligator clips.

Next steps
Software development: serial communication and filtering the data.
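For the filtering, something as simple as an exponential moving average would probably do; a sketch of the idea (not implemented yet, and the smoothing factor is a guess):

var ALPHA = 0.2;     // smoothing factor: lower = smoother but laggier
var smoothed = null;

function filterReading(raw) { // called with each raw value arriving over serial
  smoothed = (smoothed === null) ? raw : ALPHA * raw + (1 - ALPHA) * smoothed;
  return smoothed;
}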