Tag Archives: input

The Little Prince – Final

Idea
An interactive app based on the book The Little Prince. It covers the chapters in which the prince travels through seven planets before arriving on Earth.

Precedents
The project draws inspiration from interactive children’s books, particularly those by Bruno Munari, which explore interaction with paper in non-traditional forms.
The purpose, then, was to make something analogous with a laptop: how can a user interact with it beyond the mouse and keyboard?

See previous posts for more precedents and process.

Technology
The project uses three different inputs: camera, light sensor and microphone. Each planet responds to only one of them, except the first, which serves as a tutorial for the other parts.
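As a rough idea of how that routing can work in openFrameworks, here is a minimal sketch; the Planet struct and all the names in it are illustrative, not the project’s actual code (linked below).

```cpp
// Minimal sketch of per-planet input routing. All names are illustrative.
enum InputType { INPUT_CAMERA, INPUT_LIGHT, INPUT_SOUND, INPUT_ALL };

struct Planet {
    InputType input;   // which hardware input drives this planet's animation
};

// The first planet is the tutorial, so it listens to every input.
bool usesInput(const Planet & planet, InputType type) {
    return planet.input == type || planet.input == INPUT_ALL;
}

void updatePlanet(Planet & planet, float cameraMotion, float lightLevel, float soundVolume) {
    if (usesInput(planet, INPUT_CAMERA)) { /* rotate elements with cameraMotion       */ }
    if (usesInput(planet, INPUT_LIGHT))  { /* fade day/night with lightLevel          */ }
    if (usesInput(planet, INPUT_SOUND))  { /* push elements around with soundVolume   */ }
}
```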

Prototypes
This was my last prototype before the final version. All interactions and navigation are working; the drawings are still in sketch form, though the final is a sort of sketch, too!

Final

Original post: http://codeforart.cc/fall-2013/the-little-prince-final/

The final code is here.

Download the app here.

Interactive Animation – 2nd Prototype

Precedents
Christoph Niemann’s app “Petting Zoo” is pretty close to the interactions and animations I’m planning. In each scene, the user’s interaction triggers different animations:

Picture by Christoph Niemann

As for the drawing style, it will look more like this John Porcellino GIF.


Main Interaction

This is my 2nd prototype for the Code for Art finals. I’ve changed the idea a little bit: the user will no longer play with all three inputs (sound, light and camera) in the same scene. Instead, I’ll have seven different planets from the Little Prince book, and in each one the user will play with a single input.

The first scene/planet will respond to all three inputs because it will also work as an introductory tutorial.

Navigation
I decided to keep the navigation as a separate input, using the keyboard. I think it needs to be something more stable and less exploratory, and clearly separate from the animation interactions.
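For reference, a minimal openFrameworks version of that navigation could look like the sketch below; currentPlanet is an assumed member variable, not the project’s actual name.

```cpp
// Sketch: the arrow keys step through the planets, wrapping around at the ends.
// currentPlanet is an assumed int member of ofApp.
void ofApp::keyPressed(int key) {
    const int NUM_PLANETS = 7;
    if (key == OF_KEY_RIGHT) {
        currentPlanet = (currentPlanet + 1) % NUM_PLANETS;
    } else if (key == OF_KEY_LEFT) {
        currentPlanet = (currentPlanet - 1 + NUM_PLANETS) % NUM_PLANETS;
    }
}
```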

Scenes
These are my sketches for the scenes so far. The next step is to integrate them with the code from the previous prototype.

Original post: http://codeforart.cc/fall-2013/interactive-animation-2nd-prototype/

Interactive Animation – 1st Prototype

Idea
An interactive animation that responds to non-conventional user input.
Talking about his Face Tracker in an interview, Kyle McDonald pointed out:
“…as far as a practical applications, I could imagine it augmenting the way the computer understands us. I’ve been thinking a lot about this recently. Your computer has a microphone to listen to you, an accelerometer to know when you drop it, a camera to watch you, an ambient light sensor to know how bright the screen should be. I have to wonder if it makes sense to respond to our pose and facial expressions.”
Though I’m not using face tracking, this paragraph sums up my inspiration for the project.
My aim is to explore those different inputs as much as possible, using an interactive animation as a basis for that.

Technology
I will use three different inputs from hardware available on my MacBook: the ambient light sensor, the mic and the camera.
See previous proposal for more details.

Input Checking

The circle position changes according to the average optical flow inside the central grid.
The circle brightness responds to the ambient light; the size, to the sound input volume.
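For the sound part, this is roughly how the volume can be measured in openFrameworks; the sketch assumes the oF 0.8-era audioIn callback, and soundStream and smoothedVolume are assumed members of ofApp rather than the project’s actual names.

```cpp
// Sketch: measure mic volume as a smoothed RMS value; this value then drives
// the circle's size. soundStream and smoothedVolume are assumed ofApp members.
void ofApp::setup() {
    soundStream.setup(this, 0, 1, 44100, 256, 4);   // 0 output channels, 1 input channel
    smoothedVolume = 0;
}

void ofApp::audioIn(float * input, int bufferSize, int nChannels) {
    float rms = 0;
    for (int i = 0; i < bufferSize; i++) {
        rms += input[i] * input[i];
    }
    rms = sqrt(rms / bufferSize);
    smoothedVolume = ofLerp(smoothedVolume, rms, 0.1);   // low-pass so the circle doesn't jitter
}
```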

1st Prototype

I want the inputs to have an intuitive connection to the elements:
- the brightness changes the daylight (sun/moon);
- the sound has a wind effect;
- the user’s movement rotates the planet.
Now that the technology is working, I’ll add more elements to the animation. Interactions between elements will trigger different events: two touching clouds may cause lightning, a cloud over the roses may make it rain, and so on.
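A sketch of how those mappings can be wired up in the oF update/draw loop is below; lightLevel, smoothedVolume and averageFlow stand for the sensor values from the input-checking code, and the ranges are guesses, not the project’s actual numbers.

```cpp
// Sketch: map the three inputs onto the world. Variable names and ranges are
// illustrative only; averageFlow would come from the optical flow add-on.
void ofApp::update() {
    daylight        = ofMap(lightLevel, 0.0, 1.0, 0.0, 255.0, true);    // ambient light -> sun/moon brightness
    windOffset     += ofMap(smoothedVolume, 0.0, 0.2, 0.0, 5.0, true);  // blowing on the mic pushes the clouds
    planetRotation += averageFlow.x * 0.5;                              // camera motion spins the planet
}

void ofApp::draw() {
    ofBackground(daylight * 0.3, daylight * 0.5, daylight);   // darker at night, bluer by day
    ofPushMatrix();
    ofTranslate(ofGetWidth() / 2, ofGetHeight() / 2);
    ofRotateZ(planetRotation);
    // drawPlanet(); drawClouds(windOffset); ...
    ofPopMatrix();
}
```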

Code for the input checking example here.
Code for the first prototype here.
You’ll also need two add-ons to run the code: ofxOpenCv (already included in the oF addons folder) and ofxOpticalFlowBarneback.
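If you build with the openFrameworks makefiles or the project generator, the add-ons usually also need to be listed one per line in the project’s addons.make file, something like:

```
ofxOpenCv
ofxOpticalFlowBarneback
```

This assumes the add-on folder inside openFrameworks/addons is named exactly ofxOpticalFlowBarneback.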

Original post: http://codeforart.cc/fall-2013/interactive-animation-1st-prototype/

Interactive Animation – Refined Proposal

A “Little Prince”-like world:
 The user can interact with it in different ways.
1 – Moving the head rotates the planet (FaceTracker?);
2 – Blowing moves the clouds and other floating elements (sound capture?);

3 – Changing the light level (by covering the ambient light sensor, using a flashlight or turning off the room lights) switches from day to night.

Original post: http://codeforart.cc/fall-2013/interactive-animation/

Interactive Story – Proposal

Idea
In my final project I’d like to build a minimalist short story with simple interaction. My goal is to explore alternative ways of navigating through a digital book/story/app.

Technical Specifications
I’m planning to use the MacBook’s ambient light sensor.
I may use the motion sensor as well, though my retina MacBook doesn’t have one.
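For reference, the light sensor on pre-retina MacBooks can be read through the private AppleLMUController IOKit service. This is an undocumented interface, it isn’t exposed on every model, and the sketch below (which needs to be linked against the IOKit framework) is only one possible approach, not necessarily what the project will end up using.

```cpp
// Sketch: read the ambient light sensor through the private AppleLMUController
// IOKit service (available on many pre-retina MacBooks only). Link with IOKit.
#include <IOKit/IOKitLib.h>
#include <mach/mach.h>

// Returns an averaged raw reading from the two sensors, or -1 on failure.
long readAmbientLight() {
    io_service_t service = IOServiceGetMatchingService(kIOMasterPortDefault,
                               IOServiceMatching("AppleLMUController"));
    if (!service) return -1;

    io_connect_t port = 0;
    kern_return_t kr = IOServiceOpen(service, mach_task_self(), 0, &port);
    IOObjectRelease(service);
    if (kr != KERN_SUCCESS) return -1;

    uint64_t values[2] = {0, 0};
    uint32_t count = 2;
    kr = IOConnectCallMethod(port, 0,            // selector 0 reads the sensors
                             NULL, 0, NULL, 0,   // no inputs
                             values, &count, NULL, 0);
    IOServiceClose(port);
    if (kr != KERN_SUCCESS) return -1;

    return (long)((values[0] + values[1]) / 2);  // average the left/right sensors
}
```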

Precedents
The children’s books designed by Bruno Munari are my main inspiration for this project. His experiments with different papers, transparencies and paper cuts pushed the boundaries of interaction with print media.
Munari’s “In the Dark of the Night” (“Nella notte buia”). Photo by Planeta Zorp.

Also, I drew inspiration from AATOAA‘s interactive animation Bla Bla. In this movie without words, the story is told mainly through user interactions.

Original post: http://codeforart.cc/fall-2013/2968/