Generative Soundscape Concept

While studying interactive technologies at NYU's Interactive Telecommunications Program, I worked towards an interactive installation that used a bowling gesture to trigger scattered half-spheres and generate a musical experience. It is an evolved, collaborative idea that grew out of the Generative Sculptural Synth. The core concept is an interactive synthesizer made up of replicated sound-generating modules, triggered by a sphere that sets off a chain reaction throughout the installation's configuration.


Methodology: Iterative Concept and Prototyping

Tools: Arduino and littleBits

Deliverables: Prototype of modules that communicate and trigger one another through motion and sound range

It started out as a re-configurable soundscape and evolved into an interactive, bocce-like generative instrument. Here's an inside scoop of the brainstorming session where my teammate and I sought common ground (1. Roy's ideal pursuit, 2. my ideal pursuit, 3. the converged ideal).

Audio Input Instructable


littleBits: whatever works

After the slam-dunk failure of the DIY audio input, I realized the (limited) convenience of prototyping with littleBits: I could concentrate on the trigger event rather than getting stuck sketching circuits. I programmed a simple timer for a module to "hear" (a boolean tripped by the microphone) and a timer for it to "speak" (a boolean that generates a tone). What I learned about the limitations of the littleBits sensors is twofold. There is a Sound Trigger bit and a conventional Microphone bit, and both come with their circuitry already solved, which turned out to be convenient but limiting. The Sound Trigger has an adjustable gain, an embedded (and uncontrollable) 2-second timer, and a pseudo-boolean output signal; so even though you can adjust its sensitivity, you can't actually work with its raw values in the Arduino IDE. The Microphone bit outputs an offset signal (centered around a serial value of ~515), but its gain was rather insensitive.
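A minimal sketch of that hear/speak cycle, assuming illustrative pin numbers, thresholds and window lengths rather than the exact ones from the prototype:

// Hear/speak cycle for one module (hypothetical pins and timings).
const int MIC_PIN  = A0;   // littleBits Sound Trigger -> pseudo-boolean input
const int TONE_PIN = 9;    // littleBits speaker bit on a PWM pin

const unsigned long HEAR_WINDOW  = 2000;  // ms spent listening
const unsigned long SPEAK_WINDOW = 1000;  // ms spent sounding

void setup() {
  pinMode(TONE_PIN, OUTPUT);
}

void loop() {
  // "Hear": poll the trigger for a window of time.
  bool heard = false;
  unsigned long start = millis();
  while (millis() - start < HEAR_WINDOW) {
    if (analogRead(MIC_PIN) > 512) {  // the Sound Trigger reads high when tripped
      heard = true;
      break;
    }
  }

  // "Speak": if triggered, generate a tone for the speaking window.
  if (heard) {
    tone(TONE_PIN, 440);   // fixed pitch; the bit's own gain shapes loudness
    delay(SPEAK_WINDOW);
    noTone(TONE_PIN);
  }
}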

This is why, when using the Sound Triggers, trigger distance ends up proportional to pitch: the modules have to be closer to fire on lower pitches, and farther works for higher ones. And since the Sound Trigger is pseudo-boolean, no frequency analysis is possible.

Mind the Needle Iteration 0.1

First Iteration of Mind the Needle, an exploration of emergent interfaces.

Mind the Needle is a project exploring the commercially emergent user interfaces of EEG devices. After establishing the goal (popping a balloon with your mind, by mapping the attention signal to a servo arm holding a needle), the project focused on better understanding how people approach these new interfaces and how we can start building better practices around BCIs (Brain-Computer Interfaces). Mind the Needle came to fruition after considering different scenarios, and it centers on finding the best way of communicating progression through the attention signal. In the end we decided to portray only forward movement, even though the attention signal varies constantly: the amount of attention affects the speed of the arm, never its position. Because the arm can only move forward, progression stays legible in such an intangible, rather ambiguous interaction as brain-wave signals, which ultimately mitigates frustration.

The first layout we chose was two arcs of the same size, splitting the screen in two. The arc on the left is the user's attention feedback; the other is a digital representation of the arm.

After the first draft, and a few rounds of feedback from people experimenting with just the graphical user interface, the need for the full physical setup was clear. The first tryouts with the servo also surfaced important insights about the GUI: even though the visual language (perceptual aesthetic) did convey progression and forwardness, the meaning behind it remained unclear, and people still expected the servo to move in step with the attention signal. This is why, in the final GUI, the attention signal resembles a speedometer.

UI Alternatives

Physical Prototype

Sidenote: to ensure a successful popping strike, the servo makes a quick slash at the end (once θ reaches 180°): pull back to θ = 170°, delay 10 ms, then snap to θ = 178°.
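Putting the forward-only mapping and the final slash together, here is a minimal Arduino sketch; the servo pin, the speed scaling and the assumption that attention arrives over serial as a 0-100 byte are all illustrative:

#include <Servo.h>

Servo arm;
float theta = 0;       // arm angle in degrees; only ever increases
bool popped = false;

void setup() {
  Serial.begin(9600);
  arm.attach(9);       // hypothetical servo pin
}

void loop() {
  if (popped || !Serial.available()) return;
  int attention = Serial.read();   // assumed 0-100 attention value from the EEG bridge
  theta += attention * 0.005;      // attention sets the arm's speed, never its position
  if (theta >= 180) {
    arm.write(170);                // quick slash: pull back...
    delay(10);
    arm.write(178);                // ...then snap forward to pop the balloon
    popped = true;
  } else {
    arm.write((int)theta);
  }
}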

Interactive Dream Box

For Children's Month, we created a giant box to build a stronger bond between children and their parents. I was the Interactive Lead on this project, making sure the hardware and software ran smoothly for a month and a half.

Methodology: Iterative Development

Tools: Arduino, OpenFrameworks

Deliverables: An interactive experience, triggered by levers and buttons, that took children and parents through a journey

Concept

Through a collaborative experience, people embarked on a journey into the world of dreams and imagination. To evoke children's boundless imagination and their appropriation of everyday objects, we built a giant cardboard box as the ship, with two control panels whose knobs and buttons are made out of plastic bottles and other everyday objects.

Technologies

Along with two interaction designers, I coded the project's software in openFrameworks and the hardware in Arduino. To keep the experience collaborative, both panels were built wide enough that they could only be operated by at least two people. The first panel holds two starting knobs and two launching/landing levers. The second panel, just as wide, has four buttons that light up in a sequence; the lit buttons have to be pressed at the same time to defeat the threat encountered in the journey.
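A condensed sketch of that mechanic, with two lit buttons standing in for the four and hypothetical pin assignments:

// Two of the four lit buttons; the lit ones must be held down together.
const int BTN[] = {2, 3};   // hypothetical button input pins
const int LED[] = {8, 9};   // matching button lights

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 2; i++) {
    pinMode(BTN[i], INPUT_PULLUP);
    pinMode(LED[i], OUTPUT);
    digitalWrite(LED[i], HIGH);  // light the buttons of the current sequence step
  }
}

void loop() {
  // The threat is only defeated when every lit button is pressed at once.
  bool allPressed = true;
  for (int i = 0; i < 2; i++) {
    if (digitalRead(BTN[i]) == HIGH) allPressed = false;  // pull-up: HIGH = released
  }
  if (allPressed) {
    Serial.println("hit");  // notify the openFrameworks app over serial
  }
  delay(20);
}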

Systema Solar Live Act

Brief

The Colombian band Systema Solar commissioned us to build an interactive live show. With a team of three creative technologists, we developed different real-time visual effects. I was in charge of coding the puppetry controls and the audio-reactive silhouette patches, and of figuring out the best UX practices. We created a full VJ deck, from the physical rack to the digital patches.

Overview

To better understand the puppetry possibilities of the Kinect, we first figured out how Animata worked. After that first glimpse, I built the patch from scratch in the live-visuals software VVVV. Even though I had no previous experience with Kinect or VVVV, the project came together through persistent work, resourcefulness, and a bit of luck. By the end, we had three crafted puppets of Systema Solar's crew: Johnpri (lead singer), Walter (lead performer and singer) and Corpas (DJ/scratcher).
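The core of the puppetry idea boils down to hanging each puppet limb between two skeleton joints tracked by the Kinect. Since a VVVV node graph doesn't translate to text, here is that idea sketched as plain C++, with every name hypothetical:

// Conceptual core of the puppetry patch, expressed as C++ rather than
// as the VVVV node graph it actually was.
#include <cmath>

struct Vec2 { float x, y; };

struct Limb {
  int jointA, jointB;  // Kinect skeleton joints the limb hangs between
};

// Hypothetical render call that places a limb's artwork on screen.
void drawSprite(const Limb& limb, Vec2 pos, float angle, float length);

// Each frame: anchor the limb's artwork at joint A and rotate/stretch it
// toward joint B, so the puppet follows the performer's skeleton.
void updateLimb(const Limb& limb, const Vec2 joints[]) {
  Vec2 a = joints[limb.jointA];
  Vec2 b = joints[limb.jointB];
  float angle  = std::atan2(b.y - a.y, b.x - a.x);   // bone orientation
  float length = std::hypot(b.x - a.x, b.y - a.y);   // bone length
  drawSprite(limb, a, angle, length);                // scale the artwork to the bone
}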

The VJ Deck

The rack is composed of one Kinect, three GoPro cameras, seven signal converters, one MIDI pad, one Mac Mini and one four-channel mixer; the mixer's four channels are the inputs to the VJ's laptop.

Rehearsal

Video Documentation

Interactive Table – "Planta Interactiva"

Overview

This project blends the interactive-tabletop framework reacTIVision with an engaging way of explaining biodiesel production. I was the full-stack designer, creating the UX, storyboards and 3D motion graphics, and doing the creative coding for elements like animated buttons and knobs.

The objective behind this exploration is to engage prospective engineering students in an experience that helps them understand what each of the engineering programs offered by the faculty involves. Each of the four programs at Universidad de la Sabana has its own narrative within the playful experience of producing biodiesel.
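reacTIVision streams its fiducial-marker events over the TUIO protocol, so the tabletop's buttons and knobs can be driven by a listener along these lines; this is a sketch against the TUIO 1.1 C++ reference client, and the event handling shown is illustrative, not the project's actual code:

#include "TuioListener.h"
#include "TuioClient.h"
#include <iostream>

using namespace TUIO;

// Reacts to fiducial markers placed, turned and lifted on the table.
class PlantaListener : public TuioListener {
public:
  void addTuioObject(TuioObject* obj) {
    std::cout << "marker " << obj->getSymbolID() << " placed\n";
  }
  void updateTuioObject(TuioObject* obj) {
    // A marker rotated in place behaves like a knob.
    std::cout << "knob " << obj->getSymbolID()
              << " = " << obj->getAngle() << " rad\n";
  }
  void removeTuioObject(TuioObject* obj) {
    std::cout << "marker " << obj->getSymbolID() << " lifted\n";
  }
  // Cursor and blob events are unused on this table.
  void addTuioCursor(TuioCursor*) {}
  void updateTuioCursor(TuioCursor*) {}
  void removeTuioCursor(TuioCursor*) {}
  void addTuioBlob(TuioBlob*) {}
  void updateTuioBlob(TuioBlob*) {}
  void removeTuioBlob(TuioBlob*) {}
  void refresh(TuioTime) {}
};

int main() {
  PlantaListener listener;
  TuioClient client(3333);           // reacTIVision's default TUIO/UDP port
  client.addTuioListener(&listener);
  client.connect(true);              // blocking receive loop
  return 0;
}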

Storyboard

Tryout