Palindrome Hour Web-Clock

This project celebrates hours that read the same from left to right and right to left, like palindrome text –flee to me, remote elf–. A concept of living symmetry overlaid with pleasing coincidence, small chunks of daily serendipity.

I designed and coded this project in JavaScript with the creative toolkit p5.js. Hop in, and catch the palindrome hours! Link To Project Here
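The check behind the clock is tiny: take the current hours and minutes, drop the colon, and compare the string against its reverse. The project itself is written in p5.js, but the logic is language-agnostic; here's a minimal sketch of it in C++ (the function names are mine, not the project's):

```cpp
#include <cstdio>
#include <string>

// True if a time such as 12:21 reads the same in both directions
// once the colon is dropped ("1221" reversed is "1221").
bool isPalindromeTime(int hours, int minutes) {
    char buf[8];
    std::snprintf(buf, sizeof(buf), "%02d%02d", hours, minutes);
    std::string s(buf);
    return s == std::string(s.rbegin(), s.rend());
}

int main() {
    // Scan a whole day and print every palindrome hour.
    for (int h = 0; h < 24; ++h)
        for (int m = 0; m < 60; ++m)
            if (isPalindromeTime(h, m))
                std::printf("%02d:%02d\n", h, m);
}
```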

Previous Iteration

UI Drafts

Generative Soundscape 0.1.2

 

Concept

This installation pursues playful collaboration. By arranging the modules in arbitrary configurations, people can collaboratively create endless layouts that generate perceivable chain reactions. The installation is triggered through a playful gesture similar to bocce: rolled spheres can ignite the layout anywhere in the installation.

 

Prototyping

After an apparent success –context-specific– and a consequent failure –in an altered context– the project turned to a functional alternative. The following process illustrates it.

 

These images show the initial circuit we thought out, which included sound successfully triggered by a –static– threshold. We also experimented with Adafruit's Trinket, aiming towards circuit simplification, cost-effectiveness and miniaturization. This shrunken microcontroller is built around an ATtiny85, nicely breadboarded and bootloaded. Early on we were able to upload digital output sequences to drive the speaker and LED included in the circuit design. The main blocker, which we managed to overcome in the end, was reading the analog input signal from the microphone. The last image illustrates the incorporation of an amplifier to improve the speaker's output.

 

These two videos show an initial proof of concept of a module's ability to detect sound and amplify it:

1. The functional prototype, which includes a hearing window –if the microphone senses a value greater than the set threshold, stop hearing for a determined time; see the sketch below–.

2. The difference between a normal speaker output signal and an amplified speaker output signal.

After the first full-on tryout, it was clear that a dynamic threshold was needed –one where the value that sets the trigger adapts to its ambient–. The microphone, however, broke one day before the deadline, so we never got to try this tentative solution –even though there's an initial version of the code; the sketch below shows the idea–.
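A minimal sketch of both behaviors –the hearing window from the videos above and the untested dynamic threshold– assuming an analog microphone on A0 and a speaker on pin 8; pins, timings and constants are illustrative, not the original circuit's values:

```cpp
const int micPin = A0;
const int speakerPin = 8;
const unsigned long deafTime = 2000;  // ms to stop hearing after a trigger
const int margin = 80;                // trigger margin above the ambient level

float ambient = 512;                  // slow-moving estimate of the room's level
unsigned long triggeredAt = 0;
bool listening = true;

void setup() {
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int level = analogRead(micPin);
  ambient += 0.001 * (level - ambient);  // dynamic: adapt slowly to the ambient

  if (!listening && millis() - triggeredAt > deafTime) {
    listening = true;                    // the hearing window reopens
  }
  // A static threshold would compare against a constant (e.g. level > 600);
  // the dynamic version floats a margin above the ambient estimate instead.
  if (listening && level > ambient + margin) {
    listening = false;                   // stop hearing for a determined time
    triggeredAt = millis();
    tone(speakerPin, 440, 250);          // respond with a short tone
  }
}
```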

 

Next Iteration –Prototyping Function & Form–

Plan B: use the Call-To-InterAction event itself instead. In other words, use the collision and the vibration it generates to trigger the modules through a piezo. Here's the code.
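The linked code aside, the gist fits in a few lines: read the piezo as an analog input and fire above a vibration threshold. A sketch of the idea, assuming a piezo disc on A0 –with the usual high-value resistor across it– and a buzzer on pin 8:

```cpp
const int piezoPin = A0;
const int buzzerPin = 8;
const int knockThreshold = 100;        // vibration level that counts as a hit
const unsigned long quietTime = 1500;  // ms to ignore input after a hit

void setup() {
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  if (analogRead(piezoPin) > knockThreshold) {  // a collision shook the module
    tone(buzzerPin, 523, 300);                  // answer with a tone
    delay(quietTime);                           // ignore our own ring-down
  }
}
```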

A couple of videos illustrate the key colliding moments that trigger the beginning of a thrilling pursuit.

 
 

And because sometimes plan B also glitches... Special thanks to Catherine, Suprit and Rubin for playtesting.

 
 
 

Translated Code –Processing to OF–

This is a book on Generative Design, and the examples I've selected are oriented towards data visualization. The main limitation of the overall pursuit is the underlying library –Generative Design– which doesn't exist in OF yet.

The Processing example used libraries that can be found in OF's addons, which draws attention to the limitations of pursuing an entire translation of the examples. There are other examples that use the Geomerative library and the Generative Design library, which are only available to Processing –or Java-based IDEs–. Anyhow, this particular example used PDF converter and Calendar libraries to export the application's canvas to timestamped images. In the failed attempt I was able to include a calendar addon that I didn't end up using in the working one.

Even though there's a Project Generator that will include whichever addon is needed, it doesn't work every time. Since this was one of those times, I ended up creating the failed attempt in the same folder as the ofxICalendar addon. To try to solve one of the primitive drawing elements I sought out another addon called ofxVectorGraphics, but I could never get it working on an already created project.

There are primitive drawing functions in OF similar to Processing's; the arc, however, is not one of them. Instead, there are two ways around this: the addon mentioned before, or an object called ofPath that contains an arc function. After a lot of trial and error I finally got an arc drawn in an isolated project. As in any OF project, you declare the variables and objects in the *.h file and then work with them in the *.cpp file. What I came to know, after figuring out the specifics of not filling, outlining, setting the resolution and –to an extent– not closing arcs, was that you have to invoke a draw call on the object to actually render it. Coming from my previous programming experience, this was completely counterintuitive.
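Roughly, the isolated arc test boils down to the following –a minimal reconstruction of those findings, not the exact project code (plus the standard main.cpp that OF generates)–:

```cpp
// ofApp.h – declare the ofPath once, like any other OF member object.
#pragma once
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    void setup();
    void draw();
    ofPath arcPath;
};

// ofApp.cpp
#include "ofApp.h"

void ofApp::setup() {
    arcPath.setFilled(false);           // outline only, no fill
    arcPath.setStrokeColor(ofColor::black);
    arcPath.setStrokeWidth(2);
    arcPath.setCircleResolution(72);    // resolution of the curve
    // Arc around the window center: radii of 100, from 0º to 120º.
    arcPath.arc(ofPoint(512, 384), 100, 100, 0, 120);
}

void ofApp::draw() {
    ofBackground(255);
    arcPath.draw();  // the counterintuitive part: nothing shows up
                     // until you explicitly draw the ofPath itself
}
```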

After Kyle McDonald's introductory OF workshop, I learned that the project could be simplified significantly into a single *.cpp file. This meant, however, that I wouldn't be able to include the feature of exporting an image with a timestamp. Currently this is the working translated project. I'd also like to thank AV –Sehyun Kim– for helping me figure out how to –again– draw the arcs.

Labs –DC Motor, External Power Source & H-Bridge–

USING A TRANSISTOR TO CONTROL HIGH CURRENT LOADS

DC MOTOR CONTROL USING AN H-BRIDGE

Generative Soundscape Diagram, Time Table & BOM

System Diagram

This is the basic behavior of the system, where sound is the medium of communication. The Trigger is the element that initiates the chain reaction: it translates its rolling motion into the sound that sets off the modules lying on the ground.

Time Table

This is our initial Time Table. We've divided the overall project's development into two main blocks: the trigger element –[T] in the timeline– and the module –[M]–. Follow this link for a detailed description of each of the activities involved in our Time Table.

Bill of Materials

 

This Bill of Materials is planned for an initial prototype of one Trigger and two Modules. There's still a lot to figure out, but so far this is how it looks. You can follow this link for future reference.

Generative Gestaltung OF Translation

Taking some Processing code examples from this book and translating them into openFrameworks. The project is aimed at anyone interested in learning OF who has previous knowledge of Processing. Not only is there Generative Design involved, but also Data Visualization and a new programming language/framework.

This is a book on Generative Design, and the examples I've selected are oriented towards data visualization. The main limitation of the overall pursuit is the underlying library –Generative Design– which doesn't exist in OF yet.

Generative Soundscape Concept

While studying interactive technologies at NYU's Interactive Telecommunications Program, I worked towards an interactive installation that used a bowling gesture to trigger scattered half-spheres and generate a musical experience. This is an evolved, collaborative idea that grew out of the Generative Sculptural Synth. The ideal concept is an interactive synthesizer made up of replicated sound-generating modules, triggered by a sphere that creates a chain reaction throughout the installation's configuration.

 

Methodology: Iterative Concept and Prototyping

Tools: Arduino and littleBits

Deliverables: Prototype of modules that communicate and trigger one another through motion and sound range

It started out as a re-configurable soundscape and evolved into an interactive –bocce-like– generative instrument. Here's an inside look at the brainstorming session where we –my teammate and I– sought common ground. (1. Roy's ideal pursuit 2. My ideal pursuit 3. Converged ideal)

Audio Input Instructable


Littlebits –whatever works–

After the slam-dunk failure of the DIY audio input, I realized the –limited– convenience of prototyping with littleBits. This way, I could start concentrating on the trigger event rather than getting stuck at circuit sketching. I was able to program a simple timer for the module to "hear" –a boolean triggered by the microphone– and a timer for the module to "speak" –a boolean that generates a tone–. What I learned about the limitations of the littleBits sensors is twofold. They have a Sound Trigger and a conventional Microphone. Both bits come with their circuitry already solved, which turned out to be useful but limiting. The Sound Trigger has an adjustable gain, an embedded –uncontrollable– 2-second timer and a pseudo-boolean output signal; so even though you can adjust its sensitivity, you can't actually work with its raw values in the Arduino IDE. The Microphone bit's output is offset (centered around a serial value of ±515), but its gain was rather insensitive.
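A minimal sketch of that hear/speak cycle, assuming the Sound Trigger's pseudo-boolean output read on digital pin 2 and a speaker driven from pin 9 –pins and timings are illustrative–:

```cpp
// Two timed states: hear (wait for the trigger bit) and speak
// (hold a tone, then stay quiet before listening again).
const int triggerPin = 2;              // Sound Trigger's pseudo-boolean output
const int speakerPin = 9;
const unsigned long speakTime = 500;   // ms of tone
const unsigned long restTime = 1500;   // ms of silence before re-arming

void setup() {
  pinMode(triggerPin, INPUT);
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  if (digitalRead(triggerPin) == HIGH) {  // the "hear" boolean fired
    tone(speakerPin, 330, speakTime);     // "speak"
    delay(speakTime + restTime);          // stay deaf while sounding off
  }
}
```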

This is why, when conveniently using the Sound Triggers, the triggering pitch is proportional to the distance: modules are triggered from closer when lower pitches are sensed, and vice versa. However, since these bits –the Sound Triggers– are pseudo-boolean, there can't be any frequency analysis.

Generative Synthesizer Prototype

This is a follow-up to the Generative Propagation concept. With these exercises I intended to answer two questions:

  1.  How can the trigger threshold be physically controlled? (How can the mic’s sensitivity be manipulated?)
  2. How can the tempo be established? (How often should each module emit a sound?)

The trigger threshold can be manipulated by manually controlling the microphone's gain, or the amount of voltage transferred to the amplifier –potentiometer to IC–.

By manipulating this potentiometer, the sensitivity of the microphone can be controlled.

The tempo can be established by timing the trigger's availability. By setting a timer that re-enables a listening flag, the rate at which the entire installation reproduces sounds can be set –see the sketch below–.
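Putting both answers together –a potentiometer setting the threshold, a timer setting the tempo– could look like this; pins and ranges are assumptions:

```cpp
const int micPin = A0;
const int potPin = A1;      // potentiometer: the physical threshold control
const int speakerPin = 8;
const unsigned long listenAgainAfter = 1000;  // tempo: ms before re-arming
unsigned long lastTrigger = 0;

void setup() {
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Turning the pot tunes the mic's effective sensitivity.
  int threshold = map(analogRead(potPin), 0, 1023, 100, 900);
  bool armed = millis() - lastTrigger > listenAgainAfter;
  if (armed && analogRead(micPin) > threshold) {
    tone(speakerPin, 262, 200);
    lastTrigger = millis();  // the re-listen timer sets the installation's tempo
  }
}
```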

Generative Synthesizer Concept

Can unpredictable melodies be created out of Constellaction’s concept?

Composition

Modules will bridge through consecutive emissions and receptions of sound. In the end, the purpose is to create a cyclic chain that sets the stage for a greater pursuit: creating a generative audio experience –like a tangible tone matrix–. In this exercise I will explore simple initial attributes such as trigger threshold and tempo.

Concept

How can sound modules resemble basslines through replication? For the first phase of this project, I will explore ways of creating a module that, triggered by a sound, generates audible chain reactions.

Tone Matrix


Context

The general idea is to create different behaviors with these modules to the extent that they become generative. In this particular exercise –Mid-Term– the idea is to create looped compositions that resemble basslines. By scaling these modules, emergent and unpredictable scenarios can appear.

 
 

BOM (Bill Of Materials)

  • Sound receiver (9× microphone)
  • Sound emitter (9× piezo buzzer)
  • Arduino
  • ATtiny
  • Battery (coin cell)
  • Controller (potentiometer/switch?)
  • 3× trigger threshold
  • 3× tempo
  • 3× PCB

Mind the Needle Iteration 0.1

First Iteration of Mind the Needle, an exploration of emergent interfaces.

Mind the Needle is a project exploring the commercially emergent user interfaces of EEG devices. After establishing the goal as popping a balloon with your mind –mapping the attention signal to a servo with an arm that holds a needle– the project focused on better understanding how people approach these new interfaces and how we can start building better practices around BCIs –Brain-Computer Interfaces–. Mind the Needle came to fruition after considering different scenarios. It focuses on finding the best way to communicate progression through the attention signal. In the end we decided to portray only forward movement, even though the attention signal varies constantly. In other words, the amount of attention affects only the speed of the moving arm, not its actual position. Again, this is why the arm can only move forward: to better communicate progression in such intangible, rather ambiguous interactions –such as brain-wave signals– which in the end mitigates frustration.
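That mapping –attention drives speed, never position– boils down to a few lines. A sketch of the idea, assuming the headset streams an attention value of 0-100 over serial; the scaling is illustrative:

```cpp
#include <Servo.h>

Servo arm;
float angle = 0;                // the arm's position only ever grows

void setup() {
  arm.attach(9);
  Serial.begin(9600);           // attention values stream in over serial
}

void loop() {
  if (Serial.available()) {
    int attention = Serial.read();          // 0-100 from the headset
    float speed = attention / 100.0 * 0.5;  // attention -> degrees per step
    angle = min(angle + speed, 180.0f);     // forward only, never backward
    arm.write((int)angle);
  }
  delay(20);
}
```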

The first chosen layout was two arcs of the same size, splitting the screen in two. The arc on the left is the user's attention feedback; the other is the digital representation of the arm.

After the first draft, and a few rounds of feedback from people experimenting with just the graphical user interface, the need for the entire setup was clear. The first tryouts with the servo also yielded really important insights about the GUI. Even though the visual language used did convey progression and forwardness, the signs behind it remained unclear: people were still expecting the servo to move in step with the attention signal. This is why in the final GUI this signal resembles a speedometer.

UI Alternatives

Physical Prototype

Sidenote: to ensure a successful popping strike at the end, the servo should make a quick final slash –once θ reaches 180º: pull back to 170º, delay(10), then jump to 178º–.
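In runnable form, the sidenote could look like this –a sketch; the angles and delay come straight from the note above–:

```cpp
#include <Servo.h>

// Final popping strike: once the arm has crawled to its limit,
// pull back slightly and snap forward fast enough to pop the balloon.
void poppingStrike(Servo &arm, float angle) {
  if (angle >= 180) {
    arm.write(170);  // pull back slightly
    delay(10);       // brief pause before the snap
    arm.write(178);  // strike
  }
}
```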

UI Draft #2 BCI & Processing

This is the interactive wireframe so far for my BCI interactive installation. Basically I'm trying out ways to better communicate what's going on when using the Mindwave, and how we can translate its signal into a more structured task. The code for this UI wireframe can be found in this Github Repo.

Morse Code Translator

Inspired by the "Hi Juno" project, I sought an easier way to use Morse code. This is why I've created the Morse Code Translator, a program that translates your text input into "morsed" physical pulses. One idea to explore further: how could words be expressed as physically perceivable pulses (sound, light, taste?, color?, temperature?).

So far I've got the serial communication and the Arduino side working. In other words, the idea works up to the Arduino's onboard LED (pin 13). This is how "HI" looks translated into light.
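The Arduino side is essentially a lookup table plus timed blinks. A minimal sketch of the idea –pin 13 as in the project; everything else is illustrative rather than the repo's exact code–:

```cpp
// Blink incoming serial characters as Morse code on the onboard LED.
const int ledPin = 13;
const int unit = 200;  // ms per Morse time unit

// A-Z, dots and dashes as characters.
const char* MORSE[26] = {
  ".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..",
  ".---", "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.",
  "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.."
};

void pulse(int units) {
  digitalWrite(ledPin, HIGH);
  delay(units * unit);
  digitalWrite(ledPin, LOW);
  delay(unit);                       // gap between elements
}

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available()) {
    char c = toupper(Serial.read());
    if (c >= 'A' && c <= 'Z') {
      for (const char* p = MORSE[c - 'A']; *p; ++p)
        pulse(*p == '-' ? 3 : 1);    // dash = 3 units, dot = 1
      delay(2 * unit);               // letter gap (3 units in total)
    }
  }
}
```

Swapping the LED writes for a transistor-driven solenoid on another pin is essentially the follow-up mentioned below.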


Follow-up: making the solenoid work through Morse-coded pulses. You can find the Processing and Arduino code in this Github Repo.

Tangible Retail Display

Images taken from their Blog Post


After a lot of searching and looking around, I stumbled upon a company that creates interactive products for commercial scenarios that go beyond tactile interfaces into tangible ones. The interactive display is triggered by lifting one of the products sold in the store, exposing an album of first-person stories around the brand's diverse products. Even though it makes for an innovative consumer experience, after half an hour of waiting for someone to engage with it, I finally decided to take it for a spin myself. The product is a sealed black box with what I imagine are a projector, a computer and a camera. The main idea behind it is to transform any surface into a tangible, interactive user interface. Basically this is a usable interactive experience with catchy stories behind it, built on a tracking framework.

The fact that this product interfaces with real, tangible artifacts does open up an entire realm of possibilities, even though here it was only used to trigger a strictly tactile command interface. That tactile 2D interface had the proper affordances to easily manipulate the experience: the results could easily be noticed when navigating and selecting different features, and because it was built on top of the tactile-interface paradigm, it was really easy to learn how to use. However, it lacked the first principle of interaction design: it wasn't perceivable as an interactive display at first sight. I'm not really sure why, but its call to action –or its lack of one– left clients adrift. Even though the product had a blinking text prompt –more or less 1/10 of the display's height– inviting people to interact –"Please lift to read the stories"–, the overall idea of how to start the interactive experience wasn't overly persuasive. Maybe because it resembled the kind of light-display installation you're not supposed to touch, but I'm not 100% certain. Overall, the 5-minute experience was entertaining.

The hypothesis I had before approaching the product was that this interface should aim for what Norman calls the affective approach: considering the context and goal are retail, it is not a scenario that requires a serious, concentrated effort to reach its goal. In that order of ideas, the product balances beauty and usability fairly well, conveying easy-going use and contemplation.

 

NUI BCI Study #1 "Mindwave"

 

Through this first exploration of interfacing with Neurosky's Mindwave, I've learned a couple of things about EEG and Processing. The library I'm currently working with is called ThinkGear, which allows you to read the different signals (low and high values for alpha, beta and gamma, plus the delta and theta signals and a blink signal). Besides the annoying Bluetooth pairing, this consumer interface is still in the making, and Processing's latency doesn't make user feedback any easier. I'm sure there are better ways of interfacing with it to optimize user feedback –other software–, and there should be better consumer EEG devices out there. Nonetheless, it has been a thrilling experience to better understand the sine and cosine functions, arrays and libraries. Here's the second draft I've crafted with this curious natural brain-computer interface. The code for this UI draft can be found in this Github Repo.

Servo Tinker Application

 

After tinkering with a conventional servo to read its position data, I'm still figuring out how to apply this feedback reading to an aesthetic application. Even though I'm unsure how to implement it in the former concept, it certainly sparks interesting interactive possibilities. The code can be found in this Github Repo.

I also started a sketch around a servo triggered by a digital input. When triggered, the servo moves across a 30º range, back and forth. The idea to explore further is to modulate its speed with an analog input, and maybe add a noise (Perlin, most likely) effect.
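A sketch of that back-and-forth behavior with the analog input already mapped to speed –pins and the placement of the 30º range are assumptions–:

```cpp
#include <Servo.h>

Servo servo;
const int buttonPin = 2;   // digital trigger
const int speedPin = A0;   // analog input modulating the sweep speed
float angle = 75;          // sweep between 75º and 105º: a 30º range
int direction = 1;

void setup() {
  servo.attach(9);
  pinMode(buttonPin, INPUT);
}

void loop() {
  if (digitalRead(buttonPin) == HIGH) {
    float speed = map(analogRead(speedPin), 0, 1023, 1, 10) * 0.1;
    angle += direction * speed;
    if (angle >= 105 || angle <= 75) direction = -direction;  // bounce back
    servo.write((int)angle);
  }
  delay(15);
}
```

A Perlin-noise effect could then perturb the speed or the angle on each pass.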

Servo Lab Tinkering Actuator

I've integrated an exercise from Automata into the second lab exercises (Digital and Analog Input). Here, I've hacked a servo (connected a cable to its potentiometer) to retrieve its spinning position. The code excerpted from Automata's class allows you to record its movement and reproduce it.
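The record-and-replay logic is roughly this –a condensed sketch of the idea, not Adafruit's exact code; the feedback wire is read on A0 and the mapped values need calibrating per servo–:

```cpp
#include <Servo.h>

Servo servo;
const int feedbackPin = A0;  // wire soldered to the servo's potentiometer
const int buttonPin = 2;     // hold to record, release to replay
int samples[300];            // ~6 seconds at 20 ms per sample
int count = 0;

void setup() {
  pinMode(buttonPin, INPUT);
}

void loop() {
  if (digitalRead(buttonPin) == HIGH) {   // record: keep the servo detached
    servo.detach();                       // so it can be moved by hand
    if (count < 300) samples[count++] = analogRead(feedbackPin);
    delay(20);
  } else if (count > 0) {                 // replay what was just recorded
    servo.attach(9);
    for (int i = 0; i < count; i++) {
      // The feedback pot spans roughly 130-610 over the servo's travel.
      servo.write(map(samples[i], 130, 610, 0, 180));
      delay(20);
    }
    count = 0;
  }
}
```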

The entire project, with instructions, can be found in Adafruit's tutorial.

Circuit diagram, excerpted from Adafruit's tutorial


IxD Principles

 
 

“The answers should be given by the design, without any need for words or symbols, certainly without any need for trial and error.” –Don Norman

The answers Don Norman addresses are PERCEIVED through affordances. As he describes them, these affordances are “primarily those fundamental properties that determine just how the thing could possibly be used [, and] provide strong clues to the operations of things”. Thus, affordances allow the transition from the first principle to the second (Perceivability to PREDICTABILITY). It's thanks to these visible assets in products –affordances– that people are able to interact (operate and manipulate). Thanks to FEEDBACK (the third principle), people can understand and know how to overcome errors (the machine's) and mistakes (people's). Through repeated interaction, people get to LEARN how to use a product and, thanks to CONSISTENT standard practices among similar products, transfer its usage from one type of product to another.

Besides usage standards and best practices, Don Norman addresses the importance of affect in the design process. He points out the nuances between negative and positive affect, and draws attention to the importance of good human-centered design when addressing stressful situations. In the end, he emphasizes that “[t]rue beauty in a product has to be more than skin deep, more than a façade. To be truly beautiful, wondrous, and pleasurable, the product has to fulfill a useful function, work well, and be usable and understandable.”

Rippled Installation Concept Development

Installation piece that has 2 behaviours:

    Step and Modular ripple effect

1. Autonomous

2. Interactive

—Map a physical phenomenon in space that draws the work (installation/piece) out of its equilibrium

—Make an invisible disturbance visible

 

Form – Codex Seraphinianus: modular

Mechanisms

Oval Regulated System

Oval Regulated System

Hexagonal Wave by Bees & Bombs (Tumblr)

Hexagonal Wave by Bees & Bombs (Tumblr)

Idea 

A rippling hexagonal module that reacts to an invisible phenomenon

Paper Prototype

 

UI Draft #1

With the open-source Java toolkit Processing, I started exploring user interfaces, time representation and hover timing. Hover timing might bring interesting possibilities for natural user interfaces such as the Kinect or Leap Motion, where different affordances come into play with simple tasks like selecting an element. The code for this draft can be found in this Github Repo.
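The hover-timing idea reduces to a small dwell timer: selection fires only once the cursor has stayed over an element long enough, which gives NUIs a usable substitute for clicking. The draft itself is in Processing, but the logic is language-agnostic; a sketch in C++ (the names are mine, not the repo's):

```cpp
#include <cstdio>

// Dwell-to-select: accumulate hover time, fire once a threshold is reached.
struct HoverTimer {
    float dwell;      // seconds of hovering required to select
    float held = 0;   // seconds the cursor has been hovering so far

    explicit HoverTimer(float dwellSeconds) : dwell(dwellSeconds) {}

    // Call once per frame; returns true on the frame selection fires.
    bool update(bool hovering, float dt) {
        held = hovering ? held + dt : 0;  // reset the moment hover breaks
        if (held >= dwell) {
            held = 0;
            return true;
        }
        return false;
    }

    // 0..1 progress, handy for drawing a filling ring around the element.
    float progress() const { return held / dwell; }
};

int main() {
    HoverTimer timer(1.0f);  // one-second dwell
    for (int frame = 0; frame < 70; ++frame)  // simulate hovering at ~60 fps
        if (timer.update(true, 1.0f / 60))
            std::printf("selected at frame %d\n", frame);
}
```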