Sunday, January 20, 2013

Trinkets For Command-Line Performance

Take a peek at this MIDI controller:

Twelve switches and two expression pedals, all of which send MIDI. Most people use it to control loop-triggering in Ableton Live.

But it can do more. Write code which takes MIDI input, sends MIDI output, and retains the full power of a Turing machine for processing the MIDI in between, and the twelve switches on this box no longer have to give you a total of just twelve loops you can trigger.
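The MIDI side of that code is simple: switches typically speak in three-byte note-on/note-off messages, and expression pedals in control-change messages. Here's a minimal sketch in Python of decoding those messages; the specific notes and channels any given box sends are its own business, so the examples below are hypothetical.

```python
def parse_midi_message(msg):
    """Decode a 3-byte MIDI channel voice message into a readable tuple."""
    status, data1, data2 = msg
    message_type = status & 0xF0   # high nibble: message type
    channel = status & 0x0F        # low nibble: channel 0-15
    if message_type == 0x90 and data2 > 0:
        return ("note_on", channel, data1, data2)
    if message_type == 0x80 or (message_type == 0x90 and data2 == 0):
        # note-on with velocity 0 is conventionally a note-off
        return ("note_off", channel, data1, data2)
    if message_type == 0xB0:       # control change: how expression pedals speak
        return ("control_change", channel, data1, data2)
    return ("other", channel, data1, data2)

print(parse_midi_message([0x90, 60, 100]))  # a switch pressed
print(parse_midi_message([0xB0, 11, 64]))   # an expression pedal at mid-travel
```

Everything interesting happens in the Turing-complete gap between parsing these bytes and emitting new ones.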

Consider the chorded keyboard:

A keyset or chorded keyboard (also called a chorded keyset, chord keyboard or chording keyboard) is a computer input device that allows the user to enter characters or commands formed by pressing several keys together, like playing a "chord" on a piano. The large number of combinations available from a small number of keys allows text or commands to be entered with one hand, leaving the other hand free. A secondary advantage is that it can be built into a device (such as a pocket-sized computer or a bicycle handlebar) that is too small to contain a normal-sized keyboard.
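The arithmetic behind "the large number of combinations" is what makes chording interesting for the twelve-switch box above: pressed one at a time, twelve switches give twelve commands, but pressed in combination they give 2^12 − 1 = 4095 distinct chords. A sketch, treating held switches as bits in a bitmask (the note numbers are hypothetical):

```python
NUM_SWITCHES = 12

def chord_bitmask(held_notes, base_note=60):
    """Map a set of currently-held MIDI note numbers to one chord id."""
    mask = 0
    for note in held_notes:
        mask |= 1 << (note - base_note)  # each switch is one bit
    return mask

# Twelve switches singly: 12 commands. Chorded: 2**12 - 1 distinct chords.
print(2 ** NUM_SWITCHES - 1)        # 4095
print(chord_bitmask({60, 62}))      # switches 0 and 2 held -> 0b101 == 5
```

A dispatch table keyed on that bitmask turns the box into a 4095-command instrument instead of a 12-command one.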

You can control emacs or vim from a chorded keyboard. I have no plans for that, though. I mention it because you pilot either emacs or vim with a combination of muscle memory and brief commands, but that combination permits extraordinary fluidity. Using bash works the same way, and bash is Turing-complete.

Much current UX thinking revolves around gestural interfaces, but I find that to be a shallow idea. You can communicate much more sophisticated instructions to a machine with a programming language than you can with gestures. For example, most people conduct business among one another using words, rather than poking each other, except in the case of soldiers and porn stars. I can think of one exception: gesture recognition becomes much more sophisticated when the person gesturing knows American Sign Language, but even there, the power comes from the language, not from the gestures. I think the world of music software needs more domain-specific languages and fewer gestural interfaces.

I also really like this:

This box receives USB input and sends output in the DMX protocol, which controls theatrical, club, and concert lighting. You could, in theory, program this box via CoffeeScript, Ruby, Perl, or any of several other languages, or indeed design a chord-based 10-key language for it. You can, in practice, plug the other end into a huge variety of lasers, spotlights, and smoke machines.
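The data the box ultimately puts on the wire is pleasantly dumb: a DMX512 frame is a start code of 0x00 followed by up to 512 channel slots, each a byte from 0 to 255. A minimal sketch of building one frame; the channel assignments here (1 = dimmer, 2-4 = RGB) are hypothetical, since every fixture documents its own channel map, and the break/mark timing that precedes the frame on the wire is the USB box's job:

```python
def dmx_frame(channels):
    """Build the slot bytes for one DMX512 frame from a {channel: level} dict."""
    if any(not (1 <= ch <= 512) for ch in channels):
        raise ValueError("DMX channels are numbered 1-512")
    if any(not (0 <= level <= 255) for level in channels.values()):
        raise ValueError("DMX levels are bytes, 0-255")
    frame = bytearray(513)          # slot 0 is the start code, already 0x00
    for channel, level in channels.items():
        frame[channel] = level
    return bytes(frame)

# Hypothetical fixture: full dimmer on channel 1, magenta-ish RGB on 2-4.
frame = dmx_frame({1: 255, 2: 255, 3: 0, 4: 128})
print(len(frame))   # 513 slots: start code plus 512 channels
print(frame[0])     # 0, the start code for ordinary dimmer data
```

Map the 4095 chords from the footswitch onto functions that emit frames like this and you have a lighting rig you play like an instrument.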

I have both of these widgets, but I haven't gotten them to work yet. I'm hoping to fix that in the coming year via my side project, Teaching The Robots To Sing, a loosely-defined ongoing video series which I started last year.