Tag Archives: processing

All the f***ing lines from “The Wolf of Wall Street”

Third iteration of the movie mashup project. This one searches for specific words in a movie, again based on the subtitles file.
I am not counting the words in Processing itself, so I copied and pasted the subtitles from “The Wolf of Wall Street” into Textalyser. The top words in the ranking, by far, are “you” and “what”, which are not very meaningful though.

So I used “fucking,” the third word in the ranking, with 211 occurrences. Looking at the full list, variations like “fuck” also appear many times. Because they are variations of the same word, I searched for every time any of them is said: 431 times in total. Those occur in 351 subtitles, so sometimes more than once in a single line.
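For reference, the search itself is simple. A minimal sketch of the counting step, assuming the subtitles are in a plain-text file in the sketch’s data folder (the file name is made up):

String[] lines = loadStrings("wolf.srt");
int total = 0;
for (String line : lines) {
  // matchAll finds every occurrence in the line; "fuck\w*" also catches variations
  String[][] m = matchAll(line, "(?i)fuck\\w*");
  if (m != null) total += m.length;
}
println(total + " occurrences");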

Code on Github.

Repeated Movie Lines

Another iteration of the movie mashup project. This one searches for repeated lines in a movie, also based on the subtitles file.
“Groundhog Day” was an obvious choice because of its repeating plot. It is interesting to notice that some of the repeated scenes were certainly shot in the same session, because Bill Murray’s hair looks exactly the same.
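The core of the search can be sketched with a hash map that counts each dialogue line and keeps the ones appearing more than once. A minimal version, with a made-up file name, that skips the index and time-code lines of the .srt:

import java.util.HashMap;

String[] lines = loadStrings("groundhog.srt");
HashMap<String, Integer> counts = new HashMap<String, Integer>();
for (String line : lines) {
  String t = line.trim().toLowerCase();
  // skip blank lines, subtitle index numbers and time codes
  if (t.length() == 0 || t.matches("\\d+") || t.contains("-->")) continue;
  counts.put(t, counts.containsKey(t) ? counts.get(t) + 1 : 1);
}
for (String t : counts.keySet()) {
  if (counts.get(t) > 1) println(counts.get(t) + "x  " + t);
}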

Code on Github.

Movie Lines Sorted Alphabetically

This is the first attempt towards my proposal for a generative movie mashup. The code is based on the same Processing sketch I used before to make a mashup book.
The subtitles’ time codes (start and end) are used to play the movie, jumping from one position to another.
The lines are sorted alphabetically, and the video editing is automated based on that order.
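In outline, the playback logic might look like the sketch below, using Processing’s video library. The parsing and sorting steps are omitted and the file name is made up; the real code is on Github.

import processing.video.*;

Movie movie;
ArrayList<float[]> cues = new ArrayList<float[]>();  // {start, end} in seconds, already sorted by line text
int current = 0;

void setup() {
  size(640, 360);
  movie = new Movie(this, "movie.mp4");
  // ... parse the .srt here: collect (start, end, text),
  // sort the entries alphabetically by text, fill cues ...
  movie.play();
  if (cues.size() > 0) movie.jump(cues.get(0)[0]);
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(movie, 0, 0, width, height);
  // when the current line ends, jump to the next one in alphabetical order
  if (current < cues.size() && movie.time() >= cues.get(current)[1]) {
    current++;
    if (current < cues.size()) movie.jump(cues.get(current)[0]);
  }
}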

Code on Github.

Delicious Bookmarks

Idea
I’ve been using delicious since 2004. It has been my one and only bookmarking tool since then. I love that it is a tag-based system, which makes much more sense than a folder-based one.
So for Dataviz’s API assignment, I tried to do something with that. The idea was to visualize how my own interests may have changed from 2004 to now.

Development
a) My first iteration with the data was simply displaying its full content. Because 1374 links is a lot to display on a single screen, I made it a static PDF poster:
[PDF poster: delicious_allTags_v1]

There’s not much to see in it besides the list itself. 2006 seems to have the largest number of links, but that’s probably because I imported a lot of bookmarks from my browser (IE?) when I began using delicious.
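For all of these iterations the data has to be loaded first. A sketch of that step, assuming delicious’ XML export format (a posts root with post children carrying href, tag and time attributes; the file name is made up):

XML xml = loadXML("delicious.xml");
XML[] posts = xml.getChildren("post");
println(posts.length + " bookmarks");
for (XML post : posts) {
  String[] tags = split(post.getString("tag"), ' ');    // tags are space-separated
  String year = post.getString("time").substring(0, 4); // e.g. "2004"
  // ... count tags per year or month here ...
}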

b) So I tried to make a force-directed network graph, based on code I found by Karsten Schmidt.
[Screenshots: Karsten’s original app (left) and the version with my data (right)]

The left image is Karsten’s original app. The code didn’t work out for my data because I had too many nodes: the visualization just keeps moving, driven by the repulsion and attraction forces. If I had more time, I would try to fix that.
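To show what “just keeps moving” means: in a force-directed layout every pair of nodes repels while linked nodes attract, and with many nodes those forces can fail to settle. A toy version of that loop (not Karsten’s code; the chain of links and all constants are made up):

int N = 100;
PVector[] pos = new PVector[N];
PVector[] vel = new PVector[N];

void setup() {
  size(600, 600);
  for (int i = 0; i < N; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = new PVector();
  }
}

void draw() {
  background(255);
  for (int i = 0; i < N; i++) {
    PVector force = new PVector();
    for (int j = 0; j < N; j++) {
      if (i == j) continue;
      PVector d = PVector.sub(pos[i], pos[j]);
      float dist = max(d.mag(), 1);
      d.normalize();
      d.mult(500 / (dist * dist));   // inverse-square repulsion between all pairs
      force.add(d);
      if (j == (i + 1) % N || i == (j + 1) % N) {
        PVector spring = PVector.sub(pos[j], pos[i]);
        spring.mult(0.01);           // spring-like attraction along the links
        force.add(spring);
      }
    }
    vel[i].add(force);
    vel[i].mult(0.9);                // damping
    pos[i].add(vel[i]);
    ellipse(pos[i].x, pos[i].y, 4, 4);
  }
}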

c) Because my goal was to see some patterns in my interests, I started to work with the tags instead of the links. These iterations are simple attempts to create a sort of tag cloud.
[Screenshots: tag cloud iterations]

That starts to show something: art is probably there because it is such a broad term that it applies to almost everything I tag. However, this is not much different from delicious’ own visualization. And it doesn’t show any time component, which was the interesting part for me.

d) I started to code a timeline showing tag usage over time. Each column in the bubble chart below displays the number of times a tag was used in a given month.
[Screenshots of the timeline; final output: delicious_timeline PDF poster]
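The drawing itself can be sketched like this, assuming a counts[tag][month] table already filled from the bookmark data (dimensions and constants are made up):

int months = 120;   // roughly 2004 to 2014
int numTags = 40;
int[][] counts = new int[numTags][months];

void setup() {
  size(1200, 800);
  colorMode(HSB, 255);
  noStroke();
  // ... fill counts[tag][month] from the bookmarks here ...
  for (int t = 0; t < numTags; t++) {
    fill(map(t, 0, numTags, 0, 255), 180, 220);  // one hue per tag
    for (int m = 0; m < months; m++) {
      float x = map(m, 0, months, 40, width - 40);
      float y = map(t, 0, numTags, 40, height - 40);
      float d = sqrt(counts[t][m]) * 4;  // bubble area proportional to count
      ellipse(x, y, d, d);
    }
  }
}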

It might not be the best way to display the data, but it is certainly a simple and quick one. Once again, the image is too big for the screen, so the final output is another PDF poster.
Zooming into this image was interesting to me. Some things I found out:
– The giant bubble in the corner is the tag imported, a default delicious tag for bookmarks imported from a browser.
– Art, illustration and design are the tags that appear most frequently (the red and orange bubbles at the top). Maybe that’s because they’re broad terms, but it may also reflect things I am still interested in.
– Technology-related tags are mostly in the blue-purple spectrum. They reflect more recent interests of mine: JavaScript, pComp etc.
– Some tags are clearly redundant: data and visualization are two separate tags, though they always appear together. The same goes for physical and computing.

Wearable Remote Control (3)

Idea
A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Development
Though our last prototype with the wrist band worked, we wanted to make the one with the sock work. The shape seemed to fit the function better.

We sewed foam to the sock to prevent the plastic from collapsing like last time. We also tried to insulate the circuit as much as we could.

Even so, the results were different each time. They varied depending on whether we were using the Arduino Uno or the Fio, and whether it was connected to the computer through USB or WiFly…

…and whether or not we were touching the board. That led us to a problem discussed in the Capacitive Sensor tutorial: the board needs to be grounded. The page also explains a lot of the problems we had, like the laptop acting as a sensor too when connected to the board.
After that, we gave up on the Fio/WiFly for this prototype and decided to work with the regular Uno.
For the software part, we added calibration and “tap detection.” Now we finally have it controlling a video!
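To give an idea of what that logic looks like, here is a sketch of the calibration and tap detection, assuming the Arduino prints one reading per line over serial (the thresholds, sample count and port index are made up; the real code is on Github):

import processing.serial.*;

Serial port;
float baseline = 0;
int samples = 0;
boolean touching = false;
int touchStart = 0;

void setup() {
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  float v = float(trim(p.readString()));
  if (Float.isNaN(v)) return;
  if (samples < 100) {                 // calibration: average the first readings
    baseline += v;
    if (++samples == 100) baseline /= 100;
    return;
  }
  boolean above = v > baseline * 1.5;  // touch threshold relative to the baseline
  if (above && !touching) {
    touching = true;
    touchStart = millis();
  } else if (!above && touching) {
    touching = false;
    if (millis() - touchStart < 200) println("tap!");  // a short touch counts as a tap
  }
}

void draw() { }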

Wearable Remote Control (2)

Idea
A wearable remote control.
Project developed for the Dynamic Interfaces Class, in collaboration with Apon Palanuwech and Ayodamola Okunseinde.

Software Development
a) 1st Prototype

After building a relatively stable device to measure conductivity along two axes (see previous post), we started working on the software. Though our purpose was to build a simple remote control, we started by testing a sort of trackpad. Big mistake, maybe?
For this prototype, we used Processing and serial communication.
We first tried to assign an absolute position to the ball based on the finger’s position on the trackpad. That proved impossible, because each person’s charge on the pad varied a lot.
So we made the charge give the ball a direction instead, like a joystick: the offset from the pad’s center is translated into a new direction for the ball.
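A minimal version of that joystick mapping, using the mouse as a stand-in for the two capacitance readings (the speed and ranges are arbitrary):

float ballX, ballY;

void setup() {
  size(400, 400);
  ballX = width / 2;
  ballY = height / 2;
}

void draw() {
  background(255);
  float padX = mouseX;  // stand-ins for the two sensor readings
  float padY = mouseY;
  // the offset from the pad's center becomes a velocity, not a position
  ballX += map(padX, 0, width, -1, 1) * 3;
  ballY += map(padY, 0, height, -1, 1) * 3;
  ballX = constrain(ballX, 0, width);
  ballY = constrain(ballY, 0, height);
  ellipse(ballX, ballY, 20, 20);
}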

b) 2nd Prototype
After that, we translated the Processing sketch to JavaScript and changed the functions to control a video in the browser.

Hardware Development
a) 5th (?) Prototype

Meanwhile, we replicated the hardware circuit in a non-rigid device to make it wearable. We sewed the conductive plastic onto felt…

…and then onto a sock.

Though it looked great as a superhero-like accessory, the plastic collapsed and became barely conductive.

b) 6th Prototype

A much simpler and more stable solution was achieved when we simply put the plastic on an E.V.A. wrist band.