Mind the Needle Iteration 0.1

First Iteration of Mind the Needle, an exploration of emergent interfaces.

Mind the Needle is a project exploring the commercially emergent user interfaces of EEG devices. After establishing the goal as popping a balloon with your mind (mapping the attention signal to a servo arm that holds a needle), the project focused on better understanding how people approach these new interfaces and how we can start creating better practices around BCIs (Brain-Computer Interfaces). Mind the Needle came to fruition after considering different scenarios, and it focuses on finding the best way of communicating progression through the attention signal. In the end we decided to portray only forward movement, even though the attention signal varies constantly. In other words, the amount of Attention only affects the speed of the arm's movement, not its actual position. This is why the arm can only move forward: to better communicate progression in such intangible, rather ambiguous interactions (such as brain wave signals), which in the end mitigates frustration.
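To make that decision concrete, here is a minimal Processing-style sketch of the mapping (the variable names and ranges are illustrative, not the project's actual code): the attention reading only sets how fast the needle angle advances, and the angle itself never decreases.

// Illustrative sketch: attention drives speed, never position.
// `attention` stands in for the headset's 0–100 attention value;
// the mouse fakes it here so the sketch runs on its own.
float armAngle = 0;   // servo angle in degrees, only ever grows
float attention = 0;

void setup() {
  size(400, 400);
}

void draw() {
  background(255);
  attention = map(mouseX, 0, width, 0, 100);

  // Low attention just means slower progress, never backwards movement.
  float speed = map(attention, 0, 100, 0, 2.0);   // degrees per frame
  armAngle = constrain(armAngle + speed, 0, 180);

  // Draw the arm (the real setup would also send armAngle to the servo over serial).
  translate(width/2, height);
  rotate(-radians(armAngle));
  line(0, 0, width * 0.4, 0);
}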

The first layout chosen was two arcs of the same size, splitting the screen in two. The arc on the left is the user's Attention feedback, and the other arc is the digital representation of the arm.

After the first draft, and some feedback from people experimenting with just the Graphical User Interface, the need for the entire setup became clear. Moreover, the first tryouts with the servo produced some really important insights about the GUI. Even though the visual language (the perceptual aesthetic) did convey progression and forwardness, the meaning behind it remained unclear: people were still expecting the servo to move in step with the Attention signal. This is why, in the final GUI, this signal resembles a speedometer.
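A speedometer-style gauge reads naturally as "how fast", which is exactly what the attention signal controls here. A rough Processing sketch of that gauge (illustrative only; the mouse stands in for the headset reading):

// Speedometer-style attention gauge (illustrative, not the final GUI code).
float attention = 0;   // stand-in for the headset's 0–100 attention value

void setup() {
  size(400, 300);
  strokeWeight(3);
}

void draw() {
  background(255);
  attention = map(mouseX, 0, width, 0, 100);   // mouse fakes the signal

  // Dial: the top half of a circle, resting near the bottom of the sketch
  noFill();
  stroke(0);
  arc(width/2, height * 0.8, 200, 200, PI, TWO_PI);

  // Needle: free to swing back and forth, because it shows speed, not progress
  float needle = map(attention, 0, 100, PI, TWO_PI);
  line(width/2, height * 0.8, width/2 + 100 * cos(needle), height * 0.8 + 100 * sin(needle));
}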

UI Alternatives

Physical Prototype

Sidenote: to ensure a successful popping strike at the end, the servo should make a quick slash as θ approaches its end of travel, e.g. {θ = 170º; delay(10); θ = 178º;}

UI Draft #2 BCI & Processing

This is the Interactive Wireframe so far for my BCI Interactive Installation. Basically, I'm trying out ways to better communicate what's going on when using the Mindwave, and how we can translate its signal into a more structured task. The code for this UI Wireframe can be found in this Github Repo.

Morse Code Translator

Inspired by the "Hi Juno" project, I sought an easier way to use Morse Code. This is why I've created the Morse Code Translator, a program that translates your text input into "morsed" physical pulses. One idea to explore further could be thinking how would words express physically perceivable (sound, light, taste?, color?, Tº)

So far I've successfully set up the serial communication and the Arduino's functionality. In other words, the idea works up to the Arduino's embedded LED (pin 13). This is how a HI looks when translated into light.
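At its core the translator is just a lookup from characters to dot/dash patterns plus a timing convention (a dash lasts three dots). A minimal Processing-style sketch of that core is below; the durations and the tiny two-letter table are my own placeholders, and the real repo covers the full alphabet and writes the pulses out over serial instead of printing them.

// Minimal sketch of the text-to-Morse core (only the letters needed for "HI").
HashMap<Character, String> morse = new HashMap<Character, String>();
int dot = 200;   // assumed base unit in milliseconds

void setup() {
  morse.put('H', "....");
  morse.put('I', "..");
  println(toPulses("HI"));   // prints the ON durations: six dots of 200 ms each
  exit();
}

// Turns a word into a list of ON durations (ms); the Arduino side would hold
// the LED (or solenoid) high for each value, with one-dot gaps between symbols.
ArrayList<Integer> toPulses(String word) {
  ArrayList<Integer> pulses = new ArrayList<Integer>();
  for (char c : word.toUpperCase().toCharArray()) {
    String code = morse.get(c);
    if (code == null) continue;   // skip characters not in the table
    for (char symbol : code.toCharArray()) {
      pulses.add(symbol == '.' ? dot : dot * 3);
    }
  }
  return pulses;
}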


Follow-up: making the solenoid work through Morse-coded pulses. You can find the Processing and Arduino code in this Github Repo.

Tangible Retail Display

Images taken from their Blog Post

After a lot of searching and looking around, I stumbled upon a company that creates interactive products for commercial scenarios, moving beyond tactile interfaces into tangible ones. The interactive product is triggered by lifting one of the products sold in the store, exposing an album of first-person stories around the brand's diverse products. Even though it sets up an innovative consumer experience, after half an hour of waiting for someone else to try it, I finally decided to take it for a spin. The product is a sealed black box with what I imagine is a projector, a computer and a camera. The main idea behind it is to transform any surface into an interactive tangible user interface. Basically, this is a usable interactive experience with catchy stories built on top of a tracking framework.

The fact that this product interfaces with real, tangible artifacts does open up an entire realm of possibilities, even though it was only used for triggering a strictly tactile command interface. This tactile 2D interface had the proper affordances to easily manipulate the experience. Its results could easily be noticed when navigating and selecting different features, and because it was built on top of the tactile interface paradigm, it was really easy to learn how to use. However, it lacked the first principle of interaction design: it wasn't perceivable as an interactive display at first sight. I'm not really sure why, but its call to action (or lack thereof) left clients adrift. Even though the product had a blinking text prompt, roughly a tenth of the display's height, inviting people to interact ("Please lift to read the stories"), the overall idea of how to start the interactive experience wasn't overly persuasive. Maybe because it resembled the kind of light display installation you're not supposed to touch, but I'm not 100% certain. Overall, the five-minute experience was entertaining.

The hypothesis I had before approaching the product was that this interface should aim for what Norman calls the affective approach: considering the context and goal are retail-oriented, it is not a scenario that requires a serious, concentrated effort to reach its goal. Along these lines, the product balances beauty and usability fairly well, conveying easy-going use and contemplation.

 

NUI BCI Study #1 "Mindwave"

 

Through this first exploration of interfacing with Neurosky's Mindwave, I've learned a couple of things about EEG and Processing. The current library I'm working with is called ThinkGear, which allows you to read different signals (low and high values for alpha, beta and gamma, the delta and theta signals, plus a blink signal). Besides the annoying Bluetooth pairing, this consumer interface is still in the making, and Processing's latency doesn't make user feedback any easier. I'm sure there are better ways of interfacing with it to optimize user feedback (other software), and there should be better consumer EEG devices out there. Nonetheless, it has been a thrilling experience that helped me better understand the sine and cosine functions, arrays and libraries. Here's the second draft I've crafted with this curious natural Brain-Computer Interface. The code for this UI Draft can be found in this Github Repo.
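The draft itself is mostly sine/cosine placement of points whose radius follows the signal. Here's a stripped-down, illustrative version; I'm not quoting the ThinkGear library's exact API, so a slow sine wave stands in for the attention value the headset would normally supply.

// Radial feedback driven by a stand-in signal (a slow sine wave instead of
// the headset callback, so the sketch runs without the Mindwave attached).
float attention = 0;

void setup() {
  size(400, 400);
  noFill();
  stroke(255);
}

void draw() {
  background(0);
  attention = 50 + 50 * sin(frameCount * 0.02);   // fake 0–100 signal

  // Points placed with sine/cosine around a circle whose radius follows the signal
  float radius = map(attention, 0, 100, 20, width/2 - 20);
  beginShape();
  for (float a = 0; a < TWO_PI; a += 0.2) {
    vertex(width/2 + radius * cos(a), height/2 + radius * sin(a));
  }
  endShape(CLOSE);
}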

Servo Tinker Application

 

After tinkering with a conventional servo to read its position data, I'm still figuring out a way to apply this feedback reading in an aesthetic application. Even though I'm unsure how I could implement this into the former concept, it certainly sparks interesting interactive possibilities. The code can be found in this Github Repo.
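One possible direction, sketched below, is simply plotting the servo's position feedback as a live line in Processing. This assumes the Arduino prints one reading (the raw 0–1023 potentiometer value) per line at 9600 baud and shows up as the first serial port; none of this is taken from the repo.

// Live plot of the hacked servo's position feedback (assumes one 0–1023 value
// per line over serial at 9600 baud, on the first available port).
import processing.serial.*;

Serial port;
float[] history = new float[400];

void setup() {
  size(400, 200);
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String value = trim(p.readStringUntil('\n'));
  if (value == null) return;
  // shift the history left and append the newest reading
  arrayCopy(history, 1, history, 0, history.length - 1);
  history[history.length - 1] = float(value);
}

void draw() {
  background(255);
  stroke(0);
  for (int i = 1; i < history.length; i++) {
    line(i - 1, map(history[i - 1], 0, 1023, height, 0),
         i,     map(history[i], 0, 1023, height, 0));
  }
}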

I also started a sketch around a servo triggered by a digital input. When triggered, the servo moves across a 30º range, back and forth. The idea to explore further is to modulate its speed with an analog input, and maybe add a noisy (most likely Perlin) effect; a rough simulation of that motion is sketched below.
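This is a Processing-side simulation of that intended motion, just to see how it might feel before wiring it up; the sweep range, speeds and noise amounts are all placeholder numbers, and the mouse stands in for the analog input.

// Simulation of the planned servo behaviour: a 30º sweep, back and forth,
// with speed set by an analog input (faked with the mouse) plus Perlin jitter.
float angle = 0;      // position within the 0–30º sweep
int direction = 1;

void setup() {
  size(400, 200);
}

void draw() {
  background(255);
  float speed = map(mouseX, 0, width, 0.1, 2.0);   // mouse fakes the analog input
  angle += direction * speed;
  if (angle > 30 || angle < 0) direction *= -1;    // bounce at the sweep limits

  // Perlin-noise jitter layered on top of the sweep
  float jitter = map(noise(frameCount * 0.05), 0, 1, -3, 3);
  float servoAngle = constrain(angle + jitter, 0, 30);

  // Draw the arm so the motion is visible
  stroke(0);
  translate(width/2, height);
  rotate(-radians(75 + servoAngle));
  line(0, 0, 150, 0);
}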

Servo Lab Tinkering Actuator

I've integrated an exercise from Automata into the second lab exercises (Digital and Analog Input). Here, I've hacked a servo (connecting a cable to its potentiometer) to retrieve its spinning position. The code excerpted from the Automata class allows recording its movement and reproducing it.

The entire project with instructions can be found in Adafruit's tutorial.

Circuit diagram, excerpted from Adafruit's tutorial

IxD Principles

 
 

"The answers should be given by the design, without any need for words or symbols, certainly without any need for trial and error." – Don Norman

The answers Don Norman refers to are PERCEIVED through affordances. As he describes them, these affordances are "primarily those fundamental properties that determine just how the thing could possibly be used [, and] provide strong clues to the operations of things". Thus, affordances allow the transition from the first principle to the second (Perceivability to PREDICTABILITY). It's thanks to these visible assets in products (affordances) that people are able to interact (operate and manipulate). Thanks to FEEDBACK (the third principle), people can understand and know how to overcome errors (the machine's) and mistakes (people's). Through repeated interaction, people get to LEARN how to use a product, and thanks to CONSISTENT standard practices among similar products, they can transfer that usage from one type of product to another.

Besides the use of standards and best practices, Don Norman addresses the importance of affect in the design process. He points out the nuances between negative and positive affect, and underlines the importance of creating good human-centered design whenever addressing stressful situations. In the end, he emphasizes that "[t]rue beauty in a product has to be more than skin deep, more than a façade. To be truly beautiful, wondrous, and pleasurable, the product has to fulfill a useful function, work well, and be usable and understandable."

Rippled Installation Concept Development

Installation piece that has two behaviours:

    Step and Modular ripple effect

1. Autonomous

2. Interactive

—Map a physical phenomenon in space that draws the work (installation/piece) out of its equilibrium

—Make an invisible disturbance visible

 

Form – Codex Seraphinianus: modular

Mechanisms

Oval Regulated System

Hexagonal Wave by Bees & Bombs (Tumblr)

Idea 

A rippling hexagonal module that reacts to an invisible phenomenon
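To get a feel for the ripple before building anything, here is a quick Processing sketch in the spirit of the Bees & Bombs hexagonal wave; the grid geometry, timing and the mouse-as-disturbance are all placeholder choices of mine.

// Rippling hex grid: each hexagon bobs with a sine wave whose phase depends
// on its distance from the "disturbance" (here, the mouse position).
float spacing = 30;

void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  background(255);
  fill(0);
  for (int row = 0; row < height / spacing + 1; row++) {
    for (int col = 0; col < width / spacing + 1; col++) {
      // offset every other row to approximate hexagonal packing
      float x = col * spacing + (row % 2 == 0 ? 0 : spacing / 2);
      float y = row * spacing * 0.87;
      float d = dist(x, y, mouseX, mouseY);
      // the ripple: a phase that travels outward from the disturbance over time
      float lift = sin(d * 0.05 - frameCount * 0.1) * 6;
      hexagon(x, y + lift, spacing * 0.4);
    }
  }
}

void hexagon(float cx, float cy, float r) {
  beginShape();
  for (int i = 0; i < 6; i++) {
    float a = PI / 3 * i + PI / 6;
    vertex(cx + r * cos(a), cy + r * sin(a));
  }
  endShape(CLOSE);
}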

Paper Prototype

 

UI Draft #1

With the open-source Java toolkit Processing, I started exploring User Interfaces, time representation and hover timing. Hover timing might bring interesting possibilities for Natural User Interfaces such as the Kinect or Leap Motion, where different affordances come into play with simple tasks like selecting an element. The code for this draft can be found in this Github Repo.
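The pattern behind hover timing is dwell-to-select: an element counts as selected only after the cursor (or hand) has stayed over it for a set amount of time, with visible progress so people know the wait is doing something. A minimal example of that pattern (the one-second dwell and the visuals are arbitrary choices of mine, not the draft's actual code):

// Dwell-to-select: hover over the circle for one second to select it.
int dwellTime = 1000;   // ms of continuous hovering needed to select
int hoverStart = -1;    // when the current hover began; -1 means not hovering
boolean selected = false;

void setup() {
  size(400, 400);
}

void draw() {
  background(255);
  boolean over = dist(mouseX, mouseY, width/2, height/2) < 60;

  if (over && hoverStart < 0) hoverStart = millis();   // hover just started
  if (!over) { hoverStart = -1; selected = false; }    // hover broken, reset

  float progress = over ? constrain((millis() - hoverStart) / (float) dwellTime, 0, 1) : 0;
  if (progress >= 1) selected = true;

  // The target, plus a growing arc showing how close the hover is to a selection
  noStroke();
  if (selected) fill(0, 180, 0); else fill(200);
  ellipse(width/2, height/2, 120, 120);
  noFill();
  stroke(0);
  arc(width/2, height/2, 150, 150, -HALF_PI, -HALF_PI + progress * TWO_PI);
}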

What is Interaction?

I like to think of Interaction Design as the work towards creating models and experiences that attempt to closely represent people's imagination or conceptual models. Chris Crawford's metaphor of conversation is the most concise and enlightening explanation I've read so far. Luminaries within the Interaction Design realm such as Bill Moggridge or Gillian Crampton Smith have wonderful explanations, yet Crawford's self-contained metaphor gives IxD's explanation an elegant simplicity with just one word. From the implications of conversation that Crawford describes, there are a couple of concepts to highlight: the cyclic nature of the conversation between actors, and fun as a key qualitative factor for highly interactive designs. To guarantee this cycle, he addresses the importance of three equally necessary factors (listening, thinking and speaking) for a conversation to be considered good. This is certainly an entertaining challenge when designing interactive works.

Crawford goes on to pinpoint the revealing differences between IxD and other similar disciplines such as Interface Design. This difference lies specifically in the in-between factor of a conversation: thinking. Interaction Design differs from Interface Design by addressing how the work will behave, through algorithms. He assembles an articulate comparison that sets the stage for an afterthought analogy: Interaction Design is to Interface Design as Industrial Design is to Graphic Design. He describes that "[...] the user interface designer considers form only and does not intrude into function, but the interactivity designer considers both form and function in creating a unified design." It is a systemic approach that never gets easy, yet is enormously fulfilling whenever "people identify more closely with it [the interactive work] because they are emotionally right in the middle of it." In other words, interaction design is amazing thanks to the engaging, earnest experience it provokes.

In the end, Crawford finishes with a cautious call to action, encouraging the reader to "exploit interactivity to its fullest and not dilute it with secondary business." This is exactly what the prodigious creator and visionary Bret Victor denounces about today's consumer tech panorama. He is alarmed by the status quo's acceptance of a narrow vision of interaction's future trapped behind a flat surface. Victor advocates for a tool that "addresses human needs by amplifying human capabilities". It's through everyday objects' properties that Interaction Design feedback should be crafted. He wittily highlights haptic feedback and explains a typology of grips (power, precision and hook grips). These premises will allow Interaction Design to craft more intuitive works where hopefully people can seamlessly converse with (fingers crossed) other people, and seamlessly experience works and devices. Victor wraps it up with an encouraging suggestion to "be inspired by the untapped potential of human capabilities" and asks, as Interaction Designers, "[w]ith an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?"

Even though gestural Natural Interfaces, such as Disney Research's lovely AIREAL concept, cast an interesting future for interaction, there is still fine-tuning to do within the Beneficial Aesthetic realm.

Aireal: Interactive Tactile Experiences in Free Air. (n.d.). Retrieved September 9, 2014.

Crawford, C. (2002). The Art of Interactive Design: A Euphonious and Illuminating Guide to Building Successful Software. San Francisco: No Starch Press.

Victor, B. (2011, November 8). A Brief Rant on the Future of Interaction Design. Retrieved September 8, 2014.

Elements

"Perfection is achieved not when there is nothing left to add, but when there is nothing left to take away" Antoine de Saint-Exupéry

"Adding the meaningful and subtracting the obvious" John Maeda

Not long ago I stumbled across an article called 7 Design Principles, Inspired By Zen Wisdom. In it, the authors describe the state mastered through composing with these principles as Shibumi, and even though it has no direct translation, they explain that its meaning "is reserved for objects and experiences that exhibit in paradox and all at once the very best of everything and nothing: Elegant simplicity. Effortless effectiveness. Understated excellence. Beautiful imperfection." This is the beginning of my pursuit of the Shibumi Interactive Experience.

Following the homework's brief, I began composing with the first two principles, Austerity and Simplicity. When deciding how to compose the portrait, I wondered what the fewest elements necessary to perceive a face would be. Later on, I looked into adding depth, and that's how the overall size composition and the hands came about.

Another trait I explored throughout the exercise was crafting the composition with dynamic dimensions; in other words, making the composition keep consistent proportions regardless of the device it's displayed on. In the end, I noticed that whenever you're trying to place a coordinate in space, it's more effective (as code-crafting goes) to work with ratios expressed as floating-point numbers than with fixed arithmetic offsets. This is why everything is created from the width and height variables.
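In practice that just means every coordinate and size is a fraction of width or height. A tiny illustrative example (the face proportions here are made-up numbers, not the ones from my portrait):

// Everything placed as a ratio of width/height, so the composition keeps its
// proportions at any sketch size. Change size() and the layout scales with it.
void setup() {
  size(600, 600);
}

void draw() {
  background(255);
  noFill();
  // head: centered horizontally, a bit above the vertical center
  ellipse(width * 0.5, height * 0.45, width * 0.5, height * 0.6);
  // eyes: placed by ratios rather than pixel offsets
  fill(0);
  ellipse(width * 0.4, height * 0.4, width * 0.04, width * 0.04);
  ellipse(width * 0.6, height * 0.4, width * 0.04, width * 0.04);
  // mouth
  noFill();
  arc(width * 0.5, height * 0.55, width * 0.2, height * 0.1, 0, PI);
}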

I've also started trying another code-editing environment called Sublime Text 2. I find the functions auto-suggested whenever a character is typed appealing, but what has really stood out from the conventional environment is the auto-filling of function parameters.