1. Smart Bop Bag
An interactive souvenir: something to keep on a shelf and play with kids, though not really a toy.
This bop bag would behave like a regular one, with a twist: it would bob up and down by itself, sensing the user's presence.
c) Look and feel
Like a regular bop bag, with a smiling face and minimal visuals.
2. Mood Box
A box that displays the user's mood and status (busy, available, etc.). Particularly useful for work and D12-like environments. One could tell at a glance whether a person is focused on work or just checking Facebook, for instance.
Three tilt sensors detect changes along the x, y, and z axes, which makes it possible to track which face is currently up. Each face triggers a different color animation on an RGB LED inside the box.
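The face-tracking logic can be sketched as follows. This is a hedged illustration, not the actual sketch: the sensor reads and LED writes would be Arduino calls in the real project, and the color assignments are assumptions.

```cpp
#include <string>

// Combine three tilt-sensor readings (one per axis) into a single
// face index. Each axis contributes one bit, giving 8 possible
// states; 6 of them correspond to the cube resting on a face.
int faceIndex(bool x, bool y, bool z) {
    return (x ? 1 : 0) | (y ? 2 : 0) | (z ? 4 : 0);
}

// Look up the color animation for a face. The palette here is
// purely illustrative.
std::string colorForFace(int face) {
    static const std::string colors[8] = {
        "red", "green", "blue", "yellow",
        "cyan", "magenta", "white", "off"};
    return colors[face & 7];
}
```

In the real box, the main loop would poll the three sensors, compute the index, and fade the RGB LED toward the new color whenever the index changes.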
c) Look and Feel
A regular cube, made out of semi-opaque acrylic.
3. Battleship Game
A battleship game for one user.
Two potentiometers control the x and y position of the target, and a push button shoots. There are three different feedback signals: a red explosion (hit); green waves (water); a yellow line (miss, but there is a target in the same row or column).
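The three feedback rules can be expressed as a small decision function. A hedged sketch, assuming the ship positions are stored as a list of coordinates; the names and types here are illustrative, not the project's actual code.

```cpp
#include <utility>
#include <vector>

// The three possible outcomes of a shot, matching the three LED feedbacks.
enum class Feedback { Hit, Near, Water };  // red, yellow, green

// Judge a shot at (x, y) against the hidden ship positions.
Feedback judgeShot(const std::vector<std::pair<int, int>>& ships,
                   int x, int y) {
    bool sameLine = false;
    for (const auto& s : ships) {
        if (s.first == x && s.second == y)
            return Feedback::Hit;          // red explosion
        if (s.first == x || s.second == y)
            sameLine = true;               // remember for the yellow line
    }
    return sameLine ? Feedback::Near : Feedback::Water;  // yellow or green
}
```

The potentiometer readings would be quantized to grid coordinates before being passed in, and the push button would trigger the call.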
c) Look and Feel
Like a classic wooden board game, except for the LEDs and knobs.
4. Wacky Building
An interactive souvenir. Seemingly static, it surprisingly responds to the user's presence!
Three sensors can trigger the interaction: sound, light, and tilt. When the ambient lights are turned off, the window LEDs start to turn on. Shaking the building triggers an alert sound and animation. Loud sounds trigger a song (played through the buzzer) and an antenna animation (servo).
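The sensor-to-reaction mapping amounts to a simple priority dispatch. A minimal sketch, assuming illustrative threshold values; the real sketch would tune these against the actual sensors.

```cpp
#include <string>

// Decide which reaction the building should play, given the current
// sensor readings. Thresholds (600, 200) are assumptions for
// illustration only.
std::string react(int lightLevel, int soundLevel, bool shaken) {
    if (shaken) return "alert";              // tilt: alert sound + animation
    if (soundLevel > 600) return "song";     // loud sound: buzzer + servo antenna
    if (lightLevel < 200) return "windows";  // lights off: window LEDs on
    return "idle";
}
```

Giving the tilt sensor top priority means shaking always interrupts the quieter behaviors, which matches the "surprise" character of the toy.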
c) Look and Feel
Like an old wooden toy, with minimal visuals and shapes.
Original post: http://makingtoys.net/2013/12/01/final-project-proposals-2/
Make a thaumatrope using Arduino and a DC motor.
a) I began by checking whether the rotation speed was fast enough to make it possible:
b) Then I tested a 3-faced version. But it didn’t work.
c) I decided to test a version that would animate the image instead of just mixing it. It didn't work out as I planned, though:
d) So I stuck with the idea of mixing two images. I drew this cat on an iPad, transferred it to paper, and then to the final boards.
Original post: http://makingtoys.net/2013/11/21/wildcat-thaumatrope/
A physical display for weather data.
How it works
The push buttons send a number from 0 to 3 to a Node.js application. The app connects to the Weather Underground API, requesting data for one of four different cities, depending on the button pressed.
The data is sent to an HTML page and also back to the Arduino. The temperature is then mapped to an angle, which is assigned to the servo motor.
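The temperature-to-angle step is a linear rescale, like Arduino's built-in `map()`. A sketch of that mapping; the input range (10–35 °C) is an assumption chosen to illustrate the short scale, not the project's actual values.

```cpp
// Arduino-style map(): linearly rescale x from [inMin, inMax]
// to [outMin, outMax] using integer arithmetic.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Spread an assumed 10-35 degC temperature range across the servo's
// 0-180 degree sweep.
long temperatureToAngle(long celsius) {
    return mapRange(celsius, 10, 35, 0, 180);
}
```

A deliberately narrow input range like this makes small temperature differences produce large needle movements, which is exactly the effect described below.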
The scale has a very short range on purpose. The intention was to highlight the difference between the Brazilian cities and New York.
Original post: http://makingtoys.net/2013/11/21/physical-display-for-weather-data/
An interactive animation that responds to non-conventional user input.
Talking about his Face Tracker in an interview, Kyle McDonald pointed out that:
“…as far as a practical applications, I could imagine it augmenting the way the computer understands us. I’ve been thinking a lot about this recently. Your computer has a microphone to listen to you, an accelerometer to know when you drop it, a camera to watch you, an ambient light sensor to know how bright the screen should be. I have to wonder if it makes sense to respond to our pose and facial expressions.”
Though I’m not using face tracking, this paragraph sums up my inspiration for the project.
My aim is to explore those different inputs as much as possible, using an interactive animation as a basis for that.
I will use three different inputs from hardware available on my MacBook: the ambient light sensor, mic, and camera.
See previous proposal for more details.
The circle position changes according to the average optical flow inside the central grid.
The circle brightness responds to the ambient light; the size, to the sound input volume.
I want the inputs to have an intuitive connection to the elements:
- the brightness changes the daylight (sun/moon);
- the sound has a wind effect;
- the user movement rotates the planet.
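The three mappings above can be sketched as one update function. This is a hedged illustration: the normalized input ranges and the 90°/s rotation factor are assumptions, and the real project does this inside an openFrameworks draw loop.

```cpp
#include <algorithm>

// The three animated quantities driven by the laptop's sensors.
struct WorldState {
    float daylight;  // 0 = night, 1 = day (ambient light sensor)
    float wind;      // cloud push strength (mic volume)
    float rotation;  // planet rotation speed, deg/s (optical flow x)
};

// Map normalized inputs to the world. light and volume are assumed
// in [0, 1], flowX (average optical flow) in [-1, 1].
WorldState applyInputs(float light, float volume, float flowX) {
    WorldState w;
    w.daylight = std::clamp(light, 0.0f, 1.0f);            // sun/moon blend
    w.wind     = std::clamp(volume, 0.0f, 1.0f);           // wind on clouds
    w.rotation = std::clamp(flowX, -1.0f, 1.0f) * 90.0f;   // head movement
    return w;
}
```

Clamping each input keeps a sudden spike (a shout, a camera glitch) from throwing the animation into an extreme state.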
Now that the technology is working, I'll add more elements to the animation. The interaction between elements will trigger different events: two touching clouds may cause lightning, clouds over roses may bring rain, etc.
Code for the input checking example here.
Code for the first prototype here.
You’ll also need two add-ons to run the code: ofxOpenCV (already included in the oF add-ons folder) and ofxOpticalFlowBarneback.
Original post: http://codeforart.cc/fall-2013/interactive-animation-1st-prototype/
A “Little Prince”-like world:
The user can interact with it in different ways.
1 – Moving the head rotates the planet (Facetracker?);
2 – Blowing moves the clouds and other floating elements (sound capture?);
3 – Changing the light level (by covering the ambient light sensor, using a flashlight, or turning off the room lights) switches from day to night.
Original post: http://codeforart.cc/fall-2013/interactive-animation/
A Simon Says-like game, but with a knob instead of buttons. The user has to repeat a sequence of angles, much like opening a combination lock.
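The core check for such a game compares the player's angles against the generated sequence, with some tolerance since a knob can't be positioned exactly. A minimal sketch under assumed names and a ±10° tolerance, not the toy's actual code.

```cpp
#include <cstdlib>
#include <vector>

// Return true if the player reproduced the target angle sequence,
// allowing each angle to be off by up to `tolerance` degrees.
bool matchesSequence(const std::vector<int>& target,
                     const std::vector<int>& played,
                     int tolerance = 10) {
    if (target.size() != played.size()) return false;
    for (std::size_t i = 0; i < target.size(); ++i) {
        if (std::abs(target[i] - played[i]) > tolerance)
            return false;
    }
    return true;
}
```

Each round, the game would append a new random angle to the target, play the sequence back, then read the potentiometer as the player dials each position.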
Thanks Laura Salaberry, Renata Miwa and Alessandra Kalko!
Soundtrack: instrumental version of “Amor de Chocolate“, by Naldo.
Design process here.
Source code here.
P.S.: “Simão” is the Portuguese name for Simon. The name of the toy in Brazil is actually “Genius”.