Wednesday, December 30, 2015

Coping With What Apple's Become

A few years ago, I got annoyed with my iPhone for some reason. It might have been that the center button died, or it might have been the design catastrophe known as iOS 7, but either way, I'd had enough. I switched to a 1990s-style flip-phone which I bought at Best Buy for maybe $10. People kept giving me funny looks, so last year I bought an iPod Touch to see if I could tolerate iOS. The good news: I can tolerate it, on a $200 device. The bad news: I wouldn't pay a penny more. The 90s flip-phone is less irritating, to me. No tragically broken design, and no iTunes. The flip-phone's design is crap too, of course, but it's not tragic crap from a company that should know better; it's just cheap crap. My phone cost me less than a visit to my local comic book store usually does, so I don't mind.

Also, I use the iPod Touch as a social media quarantine device; its main purpose is to isolate social media onto just one machine, so that, if I want to concentrate, I can put that machine away, or go somewhere else and leave it at home. I still occasionally use Twitter on my laptop, but only because I've got a new project where I'm tweeting images I've created in Cinema 4D once every day. I've ordered a Lightning SD card reader for my iPod Touch; when I get it, my plan is to use an SD card to transfer files off the laptop onto the iPod Touch, move the Cinema 4D project off of Twitter and onto Instagram, and permanently hosts-ban Twitter on my laptop.
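A hosts-file ban, for what it's worth, is simple enough to sketch. The function below is purely illustrative (the `hosts_ban` name and the domain list are mine); on OS X you'd apply it to /etc/hosts with root privileges, at your own risk.

```python
# Minimal sketch of a hosts-file ban: point the offending domains at
# localhost so the browser can't resolve them.
BLOCKED = ["twitter.com", "www.twitter.com", "mobile.twitter.com"]

def hosts_ban(hosts_text, domains):
    """Return the hosts-file text with each domain mapped to 127.0.0.1.
    Safe to run repeatedly; existing entries aren't duplicated."""
    lines = hosts_text.rstrip("\n").split("\n")
    existing = set(lines)
    for domain in domains:
        entry = "127.0.0.1\t" + domain
        if entry not in existing:
            lines.append(entry)
    return "\n".join(lines) + "\n"

# In practice: read /etc/hosts, pass it through hosts_ban, write it back
# with sudo. "Permanent" in exactly the sense that undoing it takes effort.
```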

I prefer this nano-sneakernet approach to Dropbox; I won't use Dropbox because of their privacy policies, their irritating interface, and their connection to the prominent war criminal Condoleezza Rice. And I don't use iCloud, either, not just because of all the horror stories on Twitter and elsewhere of it destroying people's music collections, but also because that happened to somebody I know personally.

Recently my external hard drive for iTunes died. As a result, my music ownership fractured across several devices. I'm scared to sync anything to or from the old iPhone, because it means I'd risk iTunes doing something stupid to my music in the absence of the expected drive. (In fact, that's the real problem; Apple software's gotten so aggressively stupid that I just don't trust it any more, not even with utterly unremarkable, basic tasks.) So I'll probably have to write some software which manually extracts the audio files. That software will have to rename the files as well, since Apple obfuscates the names, but that's not hard; I've solved that problem before. Meanwhile, though, everything I buy on Beatport is on my main laptop; everything I buy on iTunes is on my iPod Touch, or at least one of my two iPads. (Both iPads run iOS 6, btw, because I just haven't been able to get over the awfulness of iOS 7 and up.)
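The renaming part really is the easy bit. Here's a rough sketch; the `rescue` and `safe_name` names are mine, and tag-reading is left as a callback (`read_tags`) because you'd plug in whatever tag library you trust (mutagen, say) rather than Apple's tooling.

```python
import re
import shutil
from pathlib import Path

ILLEGAL = re.compile(r'[\\/:*?"<>|]')  # characters most filesystems hate

def safe_name(artist, title, ext, taken):
    """Build a readable, filesystem-safe, collision-free filename."""
    base = ILLEGAL.sub("_", f"{artist} - {title}").strip()
    candidate, n = base + ext, 2
    while candidate in taken:
        candidate = f"{base} ({n}){ext}"
        n += 1
    taken.add(candidate)
    return candidate

def rescue(music_dir, dest_dir, read_tags):
    """Copy every audio file out of an iTunes library under a human name.
    read_tags(path) -> (artist, title), however you choose to read tags."""
    taken = set()
    for path in Path(music_dir).rglob("*"):
        if path.suffix.lower() in (".mp3", ".m4a", ".aac"):
            artist, title = read_tags(path)
            dest = Path(dest_dir) / safe_name(artist, title, path.suffix, taken)
            shutil.copy2(path, dest)
```

It copies rather than moves, deliberately, so a bug can't eat the originals.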

Consequently, I'm just about guaranteed to pick up a ten-year-old iPod on eBay and switch to Swinsian.

About a year ago, I discovered that a guy I know in Los Angeles had switched from Apple to Linux — I think Ubuntu — and taken his whole company with him. They have just one person using OS X now, and that only for QA reasons. I worry that sooner or later, I'll have to make the same move. But I can't — not really. It would only solve my OS problems in dev work and social media. I make music using Ableton Live, I do 3D modeling and animation in Cinema 4D, I make videos in After Effects, I make images in Photoshop and Illustrator, and I make books in InDesign. Moving away from Apple isn't feasible for most of these areas.

I'm really not sure what to do about this, and it's very frustrating.

Update: I have an old MacBook Air running Snow Leopard. I keep it on Snow Leopard, because it's got an old 32-bit app that isn't good enough to upgrade, but is good enough to keep. Just needed that box today, for the first time in months; the recent App Store certificate expiration fiasco broke the app that I needed. The new version doesn't support Snow Leopard, of course, because it's ancient. It also carries a price of $30, and what I get for that $30 is the restoration of functionality which Apple improperly disabled. If Apple robs me of this money, I'll carry on living, but I can hardly call it a consensual exchange.

Monday, December 28, 2015

Soon To Celebrate My Tenth Year Of Not Being An Apple Developer

Been thinking about making this for a while. Finally sat down and drew it.
Some years, I went through this cycle several times. Maybe even monthly, when the iPad was brand new.

The things that give me pause are hopefully pretty obvious:
The things that make my eyes glow while little hearts and stars flutter around me are probably all equally obvious too. With Swift becoming open source, it's not inconceivable that the cycle might end. But it's been a stable pattern for almost a decade now. In fact, Swift becoming open source seems more like the kind of thing which would extend the cycle's lifespan than the kind of thing which would bring it to a happy conclusion. I'm pretty sure I've got another ten years of this ahead of me.

Tuesday, December 15, 2015

Melodics: Bad

Melodics, despite the name, is an app solely about rhythm, which aims to be the Rocksmith of finger drumming. I love Rocksmith, and I like finger drumming. I also play the drums a little. So I wanted to love Melodics, but it has several terrible flaws.

First, it's an OS X app, but its UI has no menus, so you do everything by pointing and clicking on suspiciously link-like fragments of text, which results in suspiciously HTML5-like animations. It seems to be a web app embedded inside an OS X container, with no knowledge at all of the principles that are supposed to be the foundation of UX and UI in OS X apps. To be fair, Rocksmith makes similar mistakes.

Second, the controller mapping in Melodics is a mess. It's a finger-drumming app, so you'll want to connect a MIDI controller. Mine was on their list of supported controllers, but for some reason, I still had to set up its mapping manually. And you can't do that without reading the instructions on their support site. The UI is baffling and there is no help text.

If you get it wrong, which I can almost guarantee you will, you have to blow out the ~/Library/Application Support/Melodics/ directory (and the first subdirectory of that is ~/Library/Application Support/Melodics/Melodics/, which is just pitiful). There's no way to edit an existing configuration, and uninstalling the app doesn't help. Clicking on the bad configuration will give you nothing, right-clicking will give you nothing, and as far as I was able to tell, the app has literally no concept of editing or deleting a configuration at all. When you give up in desperation, there's no uninstall process either. You just delete the app manually, reinstall it, discover that your controller config is still fucked, and then go hunting for directories that should have been deleted when you uninstalled, but weren't.
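If you'd rather script the cleanup than hunt through Finder, a sketch (the `reset_melodics` name is mine; the path comes straight from the paragraph above, so double-check it against your own machine):

```python
import shutil
from pathlib import Path

def reset_melodics(home=None):
    """Delete Melodics's leftover config so the app starts fresh.
    Returns the paths it removed, for a sanity check before you trust it."""
    home = Path(home) if home else Path.home()
    target = home / "Library" / "Application Support" / "Melodics"
    if target.exists():
        shutil.rmtree(target)
        return [str(target)]
    return []
```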

Third, you don't train on finger drumming against normal tracks, like you train in Rocksmith on guitar against songs you know. Licensing songs is expensive, so I expected that Melodics probably wasn't going to use incredibly famous multiplatinum global hit songs. But it doesn't use songs at all. It barely even uses beats. The exercises default to metronomes. You're literally just playing against a click most of the time. It's godawful, especially with syncopated beats, since you have no musical context to get you into the groove, just a metronome and some boxes on a screen. They don't even show you the BPM.

Because there's no musical context, the Melodics way to learn finger drumming doesn't feel like you're learning finger drumming at all. It feels like you're watching blocks on a screen and you've got to hit different buttons when you see different blocks. It's very difficult to even perceive it as being a musical experience, in any sense of the term. It's more like being a lab rat subjected to the tortures of some dickhead scientist who wants to find out how much tedious repetition a rat can take before it loses the will to live.

Fourth, and worst of all, is the pedagogy. There are very good reasons to doubt that the Melodics developers know even one single thing about how people learn musical skills, which would have been useful information to uncover prior to building an app that teaches them.

Where Rocksmith gives you a ton of different modalities, both in terms of feedback and in terms of what you can work on — chords, individual notes, specific techniques like hammer-ons and palm muting — Melodics only allows you to hit drum pads when it asks you to. And its feedback is limited to four options: you hit the drum pad at the right time, too early, too late, or you just "miss" a note completely — and playing extra notes counts as "missed" notes. So if you get into the vibe and bang out a few extra notes because you're having fun, Melodics counts those as errors. This is a music game which literally punishes you for enjoying the music.

Not that enjoying the music happens very often. You progress to a new level by getting everything perfect on the existing level. So, most of the time, you're playing against a metronome, and your entire run through the exercise becomes worthless if you hit one note imperfectly. So you're constantly restarting the exercises and getting frustrated every time you make minor errors.

Maybe it's harsh to compare Melodics to Rocksmith, but Rocksmith's the high water mark in this area, and if you haven't designed your app to compete with it, you haven't done your due diligence. Also, Rocksmith's very well designed. It sets you up so that you do deliberate practice, but you don't even notice, because you're busy having fun. By contrast, Melodics emphasizes the least interesting aspects of practicing a musical instrument, and demands perfection. That's not how you get good at playing an instrument; that's the way a bad music teacher makes a kid give up on music forever.

Also, consider the fact that Rocksmith has a mode with very similar characteristics to Melodics exercises. In Rocksmith's Score Attack mode, you play notes and it tells you if the notes are on time, late, early, or missed. It'll even kick you out if you miss too many notes. But where Melodics subtracts points for any note which isn't perfect, Rocksmith gives you points for every note you play, and just gives you more points when your timing improves. This motivates you to improve your timing. It gives you a sense of accomplishment. It's fun.
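To make the difference concrete, here are the two incentive structures as toy scoring functions. Every point value is invented; only the shapes match what the two games do.

```python
def melodics_style(notes):
    """Punitive: any note that isn't perfect subtracts from your score."""
    return sum(10 if n == "perfect" else -5 for n in notes)

def rocksmith_style(notes):
    """Additive: every note you play scores; better timing just scores more."""
    values = {"perfect": 10, "early": 5, "late": 5, "miss": 0}
    return sum(values[n] for n in notes)
```

Under the additive rule, playing more and improving can only raise your score; under the punitive rule, one imperfect note erases earlier progress, which is exactly the frustration at issue here.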

Returning to my enumeration of Melodics's many serious flaws, fifth, you can't control the tempo of your exercise. The most valuable part of Rocksmith is the Riff Repeater, a mode where you can slow down a piece of music to half speed or even slower, and isolate specific sections of the song to focus on the exact parts which are challenging for you. This is a beautiful little deliberate practice machine.

Melodics is very obviously a very early MVP release from some daft little startup somewhere, so I could understand if they hadn't built a fully-fledged Riff Repeater equivalent. But the Riff Repeater's simplest form is just a goddamn user-definable tempo, which is not that difficult to build, especially when the backing track is a fucking metronome. And any good music teacher will tell you that if you're just starting with a piece of music, you should play it slowly and carefully, to get it right, and thereby lock in the habit of getting it right. Melodics pushes you to play unfamiliar music quickly, without taking the time to learn it first, and thereby encourages you to lock in bad habits. It's therefore actively destructive to its stated goal of teaching you a musical skill.

On the rare occasions when you manage to succeed with Melodics, despite all the ways it sets you up to fail, you are "rewarded" with horrifically conceited and obnoxious writing. For instance, where other apps label buttons "OK," Melodics uses "Nice," "Sweet," or "Yessssss" instead. Puzzlejuice did it better, back in 2012, and even then, when it was new, original, better-written, and happening in the context of a more engaging game, it was still only mildly amusing, and only for a little while.

With Melodics, it's just awful. If you get a perfect score on an exercise, Melodics tells you that you are a pad animal, even a "padimal." "Padimal" appears in all upper case, e.g. "PADIMAL," and if you get a perfect score on fifty different exercises, you see this same terrible attempt at humor all fifty times. There isn't any way to turn it off, and they didn't bother to come up with even two distinct jokes for that situation. It's just that one joke, over and over and over again.

Don't get me wrong, I don't want them to write any more of those jokes, because even one of these jokes is too many, but what makes their choice there especially senseless is that they came up with multiple jokes for another part of their app, a part which never needed jokes at all. They have this obnoxious little "joke" area, which you see when the app starts up, where they put garbage like "did we learn nothing from the mistakes of disco?" But these jokes aren't funny, and even if they were, they would serve no useful purpose. The developers spent time coming up with multiple bits of copy for an area which never needed copy in the first place, while the copy you actually interact with, as a core element of the game, is always the same.

Also, this is a tangent, but disco never really made any mistakes. Disco suffered a backlash first because desperate music companies tried to milk it dry without ever really connecting with the actual disco culture, and second because it came from a predominantly black and gay subculture, and consequently met with enormous hostility from racists and homophobes when these companies pushed it on the mainstream. Neither of these causes for the disco backlash had anything to do with disco itself doing anything wrong at all, except perhaps not hiding enough from the mainstream music industry of its day. If you think disco itself made mistakes, you just don't know anything about the history of dance music.

More to the point, the UI of your app does not exist so that you can have a space to display your ignorance about bygone eras and obscure musicology. Your users will be perfectly fine if you don't inform them of that ignorance, and frankly, everything about this app's user interface is so incredibly bad that they're going to spot the much more relevant forms of ignorance anyway. It's painfully obvious that the makers of Melodics don't know a goddamn thing about how people learn music, or how user interfaces work. In a situation like that, you don't want to give your users specific examples of additional things that you don't know anything about either. Instead, you should just leave that shit blank, because that writing serves no purpose in the first place. Good user experience does not start by treating your app's UI like a bathroom wall at a crappy high school and scribbling there the first goddamn thing that comes into your head while you're taking a shit.

Dear Melodics developers, your product is utterly fucking terrible. I wanted to love it. I was willing to settle for liking it. Instead, I hate it beyond words. Please stop everything you're doing and read these books instead:
Your app will be better if you learn how people get good at making music, and if you take your UI more seriously, and stop flooding it with self-indulgent bullshit.

Also, there are several places on the Melodics web site where two distinct pieces of text overlap, making an unreadable mess, and, if you've got an OS X dock on the right-hand side of your screen, Melodics is basically unusable outside of full-screen mode.

The minute I saw the Melodics web site, I thought I was going to use Melodics literally every single day for the next year. Rocksmith is great but I don't really care all that much about rock music. It's all about dance music and hip-hop for me, so I was thrilled to discover Melodics. But this app was so awful, the only way in hell I'll ever give it another shot is if three years go by, and at least thirty different people tell me that the latest version is amazing. Because everything about Melodics sends the message that the user's time is of no value, and that the user's musical aspirations are not worth taking seriously. Interaction design is a form of communication, and what the Melodics team communicated to me was that they don't give a fuck about me and they think my desire to learn finger drumming is a joke. So I'll probably hate them, or at least their app, forever.

Melodics is basically the strongest argument I've ever seen against underestimating the importance of the word "viable" in the phrase "minimum viable product." I'll give the Melodics team credit for this much, they sure as fuck figured out what the "minimum" part means. But if you get "minimum" right and you get "viable" wrong, then you don't have a product. You have a "product" instead.

To put it another way:

Monday, September 14, 2015

How I Accidentally Falsified The History Of Ruby On Rails

Once upon a time, I decided to write a book:

I'd been building Rails apps for seven years at that point. The book did well. I talked about the many insane and poorly-conceived quirks in Rails, and the reasons Rails succeeded despite those quirks. Newbie Rails devs liked it because they wanted to avoid the framework's pitfalls, and understand the real mechanisms behind the so-called "magic." People with deeper experience and perspective liked it because Rails embodies a fascinating balance of inane bullshit and inspired brilliance, and it's a challenge to uncover the unifying themes, or to understand how these two polar opposites combine.

But I left out something important.

Who do you think is the second most important person in the history of Rails? Obviously, the first is its creator, David Heinemeier Hansson. And there's a long tail of open source contributors and bloggers who did the important work of developing and documenting Rails. But who would be second in line?

Would the second most important person in the history of Rails be Yehuda Katz, the architect of the Rails/Merb merge? No. Mr. Katz's work would never have mattered unless Rails had already gotten off the ground. So it's someone who came along earlier. Someone whom Mr. Hansson, when blogging, frequently mentioned and quoted. That's a hint. But it wasn't the founder of 37signals (now Basecamp), Jason Fried.

Here's another hint, to make it easy. I'll quote an excerpt from my book which describes the effect this person had on Rails:
Rails "luxuries" are in actuality not luxuries at all, but massive, almost godly productivity boosts. Seriously. The code is a mess in places, and even some of the core ideas are bizarrely twisted, but the design is just genius, especially in terms of the priorities it establishes...

This, I think, is the most important thing you can learn from Rails: Make Steve Jobs your role model when you design your APIs, and the world will be your oyster.
I'm not going to say Steve Jobs was the second most important person in the history of Rails, because it wasn't actually Steve Jobs who inspired this stuff.

Kathy Sierra inspired this stuff, and not only that, but I knew it at the time, because when Rails first came out, DHH quoted Sierra and linked to her blog all the time. I had a habit of diving deeper into the stuff DHH talked about — for instance, the fantastic book Code Generation in Action — and so I read Sierra's magnificent blog of that era avidly. Every word was genius.

And every word was the specific type of genius I talked about in my own book, many years later. All of the underappreciated "luxuries" of Rails development — Rake tasks, migrations, ActiveSupport, the design of ActiveRecord's API, and even scaffolding (which was briefly an awesome thing, in the very early days, when Rails was brand spanking new) — stem from Kathy Sierra's gospel of ease-of-use and user empowerment.

These techniques live on beyond Rails, in countless other frameworks and libraries, as a testament to their usefulness. They made developing Rails apps fun and exciting, and they made every Rails developer quicker, more fluid, and more capable than most had ever been in their lives. The result: staggering popularity, feverish excitement, and an overabundance of eager evangelists. (A few people would claim they'd been equally productive in Scheme or Smalltalk, but even they were still quite excited about Rails.)

Ms. Sierra's writing predicted that if you focus on making your users incredibly effective, and you focus on getting them from newbie status to productive status quickly and gracefully, you'll produce that popularity, that excitement, and those evangelists. And Mr. Hansson obviously read her work, because he blogged about how great it was. And Ms. Sierra's design philosophy did for Rails exactly what she said it would.

I noticed all this recently, because I was reading Kathy Sierra's new book. It's excellent, and it's the missing link in understanding Rails. I had somehow forgotten about this, and in the intervening years, I'd written a book of my own where I reduced all of Kathy Sierra's brilliant insights to a brief reference to Steve Jobs instead.

There's a word for that. It's called erasure.

Here's a simple example: the current issue of Future Music, a print magazine about making electronic music, features a woman named Emika. She has a degree in music tech and did sound design for two of the best manufacturers of music gear. The interviewer asks her if she feels it's unusual and non-traditional for her to be an electronic music producer, because she's a woman. In fact, one of the greatest pioneers of electronic music was a woman named Delia Derbyshire. But the modern producer's unaware of her historical predecessor, and so is the interviewer. And this is not unusual; the achievements of women often vanish beneath a sea of understatement and undeserved dismissal. Emika has plenty of contemporaries; the Future Music story doesn't mention them.

I did the same thing to Kathy Sierra. It's important to recognize that, at the time I did so, I would have called myself a feminist in general, and a fan of Ms. Sierra in particular. I had bought a bunch of her Head First books, and based my entire presentation style on them, to wild acclaim. But I not only failed to tell people where I got the style, I honestly forgot, and went on to write blog posts about my presentation style, and to sell a video explaining how it works (which, again, met with wild acclaim, and paid my bills for a while).

This is something which was hard for me to understand, and I think it's hard for a lot of other guys as well. It's not just that I'm guilty of a sexist revisionist history; I'm also guilty of a sexist forgetting. The guilt is accidental. But it's still guilt. What I did was not the right thing.

I probably got that understanding from a woman too, and I wish I could remember for sure.

Anyway, Ms. Sierra's new book is fantastic, and you should read it. And at some point, when I get the time, I hope to revise my own.

Saturday, July 18, 2015


enra are a dance troupe from Japan, and also, I guess, a live motion graphics performance team. They don't quite fit any existing categories, so they describe themselves as "an entertainment unit which presents the ultimate fusion of images and live performance." Their work is brilliant and unique. Although it's innovative work, it rests on a bedrock of serious study in classical traditions — both Eastern and Western — as well as experience in newer art forms like hip-hop dance and VJing.

Wednesday, July 1, 2015

Actually, No, Stuff Used To Work

I see this sentiment on Twitter all the time:

The link's to a story about the surprising incompetence of Apple's new music streaming service.

But the first computational device, the abacus, was invented around 2400 BC. And we've been storing programs on hardware since 1948. So either software is not still in its infancy, or it's been in that infancy for a very long time. If anything, software seems to get more infantile with every passing generation.

In fact, even as recently as a few decades ago, software companies used to have things called "QA departments" whose whole reason for existing was to make sure that everything worked all the time.

Software is not in its infancy. Software is in a period of decadence, characterized both by unprecedented power and wealth, and by staggeringly low standards. In its past, software put a man on the moon. The most magnificent computers of that time were weaker than the computers in an actual toaster today. It wasn't the hardware, it wasn't the complexity of the software. It was the QA department, and the seriousness of the mission.

Friday, June 19, 2015

Let The Other 95% Of Reality In

Here's a dramatic reading of a Paul Graham blog post.

Thanks for watching! I hope you enjoyed it.

If you're interested in a more serious analysis, here's a gigantic, detailed breakdown.
American technology companies want the government to make immigration easier because they say they can't find enough programmers in the US. Anti-immigration people say that instead of letting foreigners take these jobs, we should train more Americans to be programmers. Who's right?
First of all, this is a false dichotomy. I don't think I've ever heard "anti-immigration people" say anything about training more Americans to be programmers. The only times I've ever heard "anti-immigration people" express any opinions in American politics, those opinions have consistently been racist opinions about Latinos in general and Mexicans in particular.

But Mr. Graham opened up his post with a cliffhanger. "Who's right?" he asked.
The technology companies are right.
I bet you didn't see that one coming.
What the anti-immigration people don't understand is that there is a huge variation in ability between competent programmers and exceptional ones...
Mr. Graham never bothers to identify "the anti-immigration people," and I'm not totally convinced that they exist at all. I mean I know anti-immigration people exist in general, but I think the "anti-immigration people" Paul Graham wants to "debate" here are fictional. Likewise, I might be willing to assume that people who spend most of their time making racist remarks about Latinos might also be unaware of the varying levels of programming talent. But it's not necessarily true, and I've never really seen anything which proves it to be the case. I've even seen counter-examples.

If we're dealing in unsubstantiated arguments, I would feel more comfortable with an argument like "the set of people who make racist remarks about Latinos is unrelated to the set of people who have opinions about the distribution of programming talent." Although I cannot prove it, I have more faith in it. It also doesn't require you to assume that all these "anti-immigration people" don't know what the word "exceptional" means.

But let's keep following Mr. Graham into the hole he's digging for himself:
...and while you can train people to be competent, you can't train them to be exceptional.
This is simply false. There's a book which explains how to train people to be exceptional. It was written after the author visited specific, unusual schools all over the world, each of which reliably churns out exceptional students, including a tiny Russian tennis school which has produced more top-20 women players than the entire United States, and an American music school which covers a year's worth of music training in seven weeks. (Alumni include Yo-Yo Ma and Itzhak Perlman.)

Exceptional programmers have an aptitude for and interest in programming that is not merely the product of training.
Again, Mr. Graham seems to think the "anti-immigration people" have never heard the word "exceptional" before. This is only just barely better than quoting from the dictionary.

Then we make a pretty wild jump.
The US has less than 5% of the world's population. Which means if the qualities that make someone a great programmer are evenly distributed, 95% of great programmers are born outside the US.
That would be the most gigantic "if" that anybody ever built an essay on, if Mr. Graham were being real with us when he refers to his blog posts as essays, but he's not. They're blog posts. Even if you wanted to humor Mr. Graham's pretensions, this "essay" would be a polemic at best, since it fails his own basic rule that "a real essay doesn't take a position and then defend it."

Nonetheless, this "if" is still a pretty huge "if" to build a blog post on, when your blog is as widely-read as Mr. Graham's.

Does Paul Graham even know who this is?

Mr. Graham doesn't give a moment's consideration to this "if." He just graduates it to the status of fact without ever doing anything to justify that leap. So let's dive in for a second. Are the qualities which make someone a great programmer evenly distributed throughout the entire world?

Some of these qualities probably are evenly distributed throughout the entire world:
  • Intelligence
  • Curiosity
  • A good memory
  • Eagerness to build, analyze, and experiment
While some other relevant qualities probably are not evenly distributed throughout the entire world:
  • Basic familiarity with machines which use electricity
  • The ability to read and write
  • The ability to read and write English
  • The ability to read and write specific machine-oriented dialects of English
  • A general understanding of Boolean logic
  • A precise understanding of the Unix family of operating systems
  • A lifetime of owning expensive machines, and feeling free to break them in order to satisfy some casual curiosity
Remember, the world is actually a very big place, and it's full of people who don't give a fuck about the Internet. There are still many places in Europe where you can hardly even book a hotel online. To be fair, Vietnamese programmer education sounds absolutely fantastic. And programming languages are not entirely a subset of English. But I think it's safe to say that Mr. Graham's 95% number is flat-out ridiculous, and not worth taking seriously for even a moment.
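The arithmetic behind the 95% figure, and its fragility, fits in a few lines. The access numbers in the example are invented; the point is only that the conclusion swings wildly with the very assumption Mr. Graham never examines.

```python
US_POP_SHARE = 0.044  # roughly 320 million of 7.3 billion people, circa 2015

def foreign_share(access_us, access_elsewhere):
    """Share of great programmers born outside the US, if raw talent is
    uniform but access to the prerequisites above is not. Each access
    argument is the odds that talent in that place also gets the
    prerequisites (purely hypothetical numbers)."""
    us = US_POP_SHARE * access_us
    elsewhere = (1 - US_POP_SHARE) * access_elsewhere
    return elsewhere / (us + elsewhere)

# Even access reproduces Graham's number: foreign_share(1.0, 1.0) ~= 0.956.
# Skew access even modestly and it drops: foreign_share(0.8, 0.2) ~= 0.84.
```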

Access to electricity is not evenly distributed, but maybe a truly great programmer will still find a way.

Mr. Graham didn't bother with this question, however. Instead, he said:
The anti-immigration people have to invent some explanation to account for all the effort technology companies have expended trying to make immigration easier. So they claim it's because they want to drive down salaries.
But this "explanation" did not actually have to be "invented," because it's already a proven fact. Google, Apple, and dozens of other tech companies have been caught illegally colluding to drive salaries down.

The illegal wage-fixing scandal Mr. Graham somehow doesn't seem to know about is absurdly well-documented.

If you don't bother to find things out before blogging about them, then you might think that this illegal conspiracy had nothing to do with salaries and was only about being afraid of Steve Jobs.

You would be wrong.

Google's CEO chastises Google recruiters for driving up salaries.

Mr. Graham apparently somehow missed this incredibly well-documented scandal that everybody was talking about. So, unperturbed by facts, our misguided hero rabbits on:
But if you talk to startups, you find practically every one over a certain size has gone through legal contortions to get programmers into the US, where they then paid them the same as they'd have paid an American.
I'm sure this is true of startups, but it's false of Silicon Valley in general. And, while Mr. Graham's saying that if you want to understand immigration, you should talk to startups, I believe that if you want to understand immigration, you should talk to immigrants.

I worked with a guy one time who worked harder than me, for longer hours, while making less money, because he knew that if our employer got pissed off, they could put him on a plane back to his war-torn home country. He didn't want to go back and get killed. He wanted to get his family into the United States along with him.

I have absolute sympathy for this guy. First, he was a great guy. Plenty of people could have been mad or resentful in a situation like that. His approach to life was to help as many people as he possibly could. He was able to help me, so I was grateful to him, and he had this awesome attitude, so I admired him. Plus he was funny.

My own parents came to the US from another country, and although it was not a war-torn country, I'm a first-generation American, so even in a festival of straw man arguments like Mr. Graham's blog post, you'd have a hard time lumping me in with the "anti-immigration people." I'm very glad that my friend got to come to this country, and I feel like, for him, accepting a slightly lower wage than I got at the same company was not such a bad deal in context — even though he worked much harder than I did, and was much more important to the success of my team, and I feel kinda guilty about that.

However, Mr. Graham's argument, that great programmers get paid the same at startups whether they're American or not, is completely irrelevant in the case of my friend, a very good programmer who was underpaid at a company which was only just barely too large and established to qualify as a startup.

Most imported programmers in Silicon Valley look more like my friend than the startup programmers Mr. Graham is talking about. And even then, the overwhelming majority of immigrant programmers are worse at programming than either my friend or Mr. Graham's examples.

I've certainly encountered exceptionally talented programmers from other countries, but they're rare for the same reason exceptionally talented programmers from America are rare: exceptional talent is rare, by definition. The silliest thing about Mr. Graham's argument is that he wants national policy to be optimized for exceptional cases.

It's odd that I have to say this about a venture capitalist, but Mr. Graham should stop asking for government handouts. He's already rich, and it's not the government's job to make it easy to beat the averages.

Back on the subject of startup founders paying top dollar for foreign programmers, Mr. Graham continues:
Why would they go to extra trouble to get programmers for the same price? The only explanation is that they're telling the truth: there are just not enough great programmers to go around.
That is not the only explanation at all. Here's a simpler explanation: they fucked up. Have you ever met a startup founder? Quite a few of them are complete fucking morons. Even some of the successful ones seem batshit insane.

Plenty of startup founders are quite smart, of course. And I'm not saying that bringing in a great programmer from another country can't be a good idea. But Mr. Graham suffers really badly throughout his blog post from a total blindness to the many, many facts and events which undermine his argument, and to say "the only explanation" in that context just begs for a reality check.

People who run startups make mistakes all the time. They're people.

One way startup founders fuck up: failing to recruit the best programmers they can, irrespective of where they're located. Maybe there are not enough great programmers who are willing to live in San Francisco to go around. But startups which hire remote have a much larger talent pool to draw from. If you adopt a modern working style, you might have fewer hoops to jump through.

One remote worker had this to say on the topic:
I just finished a remote job search. I'm in Los Angeles, and most of the companies I talked to were in SF. I had previously worked remotely for a company in New York (Etsy), and I was honestly really surprised how hostile SF companies were to the idea.

I have been on both sides of remote work, so I totally get that it's not a slam dunk. It works a lot better for companies that have a strong culture established in a home base, and it works a lot better for experienced folks than green college hires. I have shut down the interview process myself when I felt like working remotely wouldn't work out at particular companies.

Even given all of that, I was pretty amazed how quickly some of the conversations got shut down. No remotes, we don't care who you are or what your experience is. I didn't talk to any companies outside of SF that were that quick to say no.

FWIW I landed at [the YC company] Stripe, which in fairness is probably one of the companies pg had in mind when writing his original article.
As a former San Francisco resident, I honestly think this error is inevitable in San Francisco. One of the few ideas which is utterly unacceptable in any discussion among people in the Bay Area – who are normally very open-minded people – is the idea that the Bay might not be the only place in the universe worth living in.

Mind. Blown.

I liked San Francisco, but it's not the only thing out there. London's better for music. Portland's better for open source. Los Angeles is better for street food, attractive single inhabitants, espresso, weather, parking signs, and in some cases even public transportation. (Bay Area public transportation is better if you want to go a long way, but Los Angeles completely wins at simple, local use.) As for New York, don't even get me started.

Plus, as beautiful as San Francisco is, it's packed with people like this:

Imagine you've already got a hangover and you have to deal with some asshole using an iPad app to control his girlfriend's mind.

Back to Mr. Graham:
I asked the CEO of a startup with about 70 programmers how many more he'd hire if he could get all the great programmers he wanted. He said "We'd hire 30 tomorrow morning." And this is one of the hot startups that always win recruiting battles. It's the same all over Silicon Valley. Startups are that constrained for talent.
But not so constrained for talent that they're willing to explore remote work.

Matt Mullenweg pointed out how absurd this was, and I myself took Mr. Graham to task for this on Twitter:

He responded that remote work is only good for building certain types of companies, but I can't find the remark. I think he deleted it, but it's also possible that Twitter search failed me. I won't get into it, beyond saying that I highly doubt anybody's done any actual research which could grant any credibility to that 20th-century attitude at all.

Mr. Graham continues:
It would be great if more Americans were trained as programmers, but no amount of training can flip a ratio as overwhelming as 95 to 5. Especially since programmers are being trained in other countries too. Barring some cataclysm, it will always be true that most great programmers are born outside the US. It will always be true that most people who are great at anything are born outside the US.
Again, this 95% number is ridiculous, and Mr. Graham is starting to get carried away with this whole "born outside the US" thing. He hasn't even substantiated the claim that most great programmers are born outside the US today, and obviously most people who are great at American football are born inside the US. But rather than slow down, Mr. Graham gets stupid nutty:
Exceptional performance implies immigration. A country with only a few percent of the world's population will be exceptional in some field only if there are a lot of immigrants working in it.
Brazil's population is smaller than that of the United States. The Brazilian national football (soccer) team is the most successful in the world, and has been so for more than fifty years. There are not a lot of immigrants working in the Brazilian national football team.

Given enough time, I'm sure I could find about 50 more counter-examples, at least. The Swiss watch-making industry is probably made up mostly of Swiss nationals. Enzo Ferrari and Ferruccio Lamborghini were both native Italians. Immigration probably does not explain the French advantage in manufacturing cheese, or the Japanese superiority at making anime, or the exceptional quality of British bondage gear. Thailand completely dominates the field of elephant feces tourism, and the source of their advantage is probably not imported talent.

The coup de grace comes next. Mr. Graham, not content with his extra large bullshit enchilada, demands a colossal side order of what the fuck are you even talking about:
But this whole discussion has taken something for granted: that if we let more great programmers into the US, they'll want to come. That's true now, and we don't realize how lucky we are that it is. If we want to keep this option open, the best way to do it is to take advantage of it: the more of the world's great programmers are here, the more the rest will want to come here.

And if we don't, the US could be seriously fucked. I realize that's strong language, but the people dithering about this don't seem to realize the power of the forces at work here. Technology gives the best programmers huge leverage. The world market in programmers seems to be becoming dramatically more liquid. And since good people like good colleagues, that means the best programmers could collect in just a few hubs. Maybe mostly in one hub.

What if most of the great programmers collected in one hub, and it wasn't here? That scenario may seem unlikely now, but it won't be if things change as much in the next 50 years as they did in the last 50.

We have the potential to ensure that the US remains a technology superpower just by letting in a few thousand great programmers a year. What a colossal mistake it would be to let that opportunity slip. It could easily be the defining mistake this generation of American politicians later become famous for. And unlike other potential mistakes on that scale, it costs nothing to fix.

So please, get on with it.
As far as I can tell, his argument is that we have to take action immediately, because conditions could change over the next 50 years. And it would cost politicians nothing to abandon anti-immigration policies, somehow. And nobody will remember Ferguson, or the wars in the Middle East, or the bank bailouts, or Edward Snowden, or climate change, but we'll curse the name of every senator who ever voted against H-1B visas.

And somehow, we can find "a few thousand great programmers a year." No, we can't. Mr. Graham flatters himself when he refers to his blog posts as essays, but back in the day, he actually did write essays. One of his best essays argued that identifying great programmers is incredibly difficult, even for other programmers.
It's hard to tell great hackers when you meet them... You also can't tell from their resumes... the only way to judge a hacker is to work with him [or her] on something...

Hackers themselves can't tell how good they are...

If there is a Michael Jordan of hacking, nobody knows, including him [or her].
So we need to import a few thousand exceptional programmers, even though we have no way to identify them in the first place. And language barriers will not make identifying them any harder. And somehow we're going to do this on a yearly basis.

Meanwhile, in reality, Mr. Graham's worst fear came true a long time ago. Most of the best programmers have already collected in one hub, and it's not here, or there, or anywhere. It's called GitHub. It was built as a distributed company around a distributed version control system, and — because Conway's Law works in both directions — it is training every programmer in the world to do distributed work.

Silicon Valley was once the center of the universe for programmers. In a sense, it mostly still is, kinda. But the real center of our universe is no longer a physical location. The real center of our universe is a virtual location which is training us all, every single day, to work remote.

And even GitHub is only barely central. There's Bitbucket and GitLab, and any communication through GitHub is typically supplemented with additional communication through IRC, email, videoconferencing, Slack, Twitter, SMS, and more. The best-case scenario for face-to-face communication in 2015 is that it's how you augment your other channels.

Do we seriously need to explain to a guy with three Ivy League degrees that a decentralized networking technology will have a decentralizing effect on every industry that uses it? Can't we just send him to the Wikipedia page for economics or something?

Some of Mr. Graham's other writing is much better than the nonsense I've dug into here. You can also create a great Paul Graham essay if you take one of his classics, The Python Paradox, load it up in Vim, and run a quick s/Python/remote/g. But if you ever took his arguments on immigration seriously, even for a fraction of a second, then you owe it to yourself to do the following simple exercise.

Make a list of all the programmers who fit these criteria:
  • They created a technology which completely redefined its category.
  • You can't find them living in the Bay Area.
  • You can find them working on GitHub.
Your list will be big, and it will include Linus Torvalds (who lives in Portland), John Carmack (a small town in Texas), David Heinemeier Hansson (Chicago and/or Barcelona), John Resig, Jeremy Ashkenas, Rich Hickey (all in New York), and many, many others.

Or, for a more amusing list, count the number of YC companies which hire remote, despite Mr. Graham telling me on Twitter that hiring remote isn't appropriate for the type of work YC companies do. I know of at least seven such companies, and I haven't even tried putting it in a search engine.

The alleged Bay Area "talent drought" is only half real. Programming skill is certainly both valuable and rare (although well-managed tech companies are probably rarer, and that might not be a coincidence). But the other half is self-inflicted, because an industry organized around innovation and disruption is clinging to a bafflingly archaic work culture. VCs and founders are shooting themselves in the foot, while the execs at established companies form illegal conspiracies to keep programmer salaries low.

("Low" is a relative term, of course, but housing costs are relative too.)

No thanks.

Let The Other 95% Of Great Programmers In is the worst thing Mr. Graham has ever put online. That's why I portrayed him as intoxicated in the dramatic reading. I couldn't think of any other way to reconcile his obvious intelligence with the fact that he wrote that thing.

(Likewise, I used the uniform of a Silicon Valley middle manager as my costume because I don't think Mr. Graham's given one iota of thought to what these words might mean coming from Silicon Valley in general. It's entirely possible he constructed his wretched pile of illogic with some honest mistake at its foundation. But if that "essay" made any real progress in national politics, it would be promoted for incredibly dishonest purposes, by incredibly dishonest people.)

In conclusion:

Dat Footnote Tho

Mr. Graham's post contains a few footnotes. I'm not going to bother with them, except to point out that one footnote includes the hilarious phrase "it should be easy to write legislation." And to quote this bit:
it is dishonest of the anti-immigration people to claim that companies like Google and Facebook are driven by [a desire to drive down salaries]
He says this despite the fact that Google was sued for illegally agreeing not to hire people away from numerous specific companies — which suppressed salaries — in a well-documented case which Google has not contested. The judge rejected the defendants' $324.5M settlement offer because it was too low.

Again, quoting Eric Schmidt, Google's CEO at the time:
Google is the talk of the valley because we are driving up salaries across the board... We need to get this fixed.
If your CEO tells you that driving salaries up is a problem, what direction might you imagine he prefers?

Mr. Graham's argument equates Google with startups, but calling Google a startup today is like calling Microsoft a startup in 1998. It verges on delusion and it leads to completely ridiculous claims. Graham says "the anti-immigration people" are being "dishonest." But I'm a first-generation American. My parents are immigrants. And I'm pointing out something Google actually DID.

If your argument implies that it is dishonest for people to remind you of facts, your argument is probably flawed.

Speaking as a first-generation American, I'm not against immigration. I'm against indentured servitude. I'm against corporations using fraud and collusion to evade both labor laws and the real market price of programming ability. And I think most American programmers are uncomfortable with H-1B visas for the same reasons.

By the way: at the company I work for, Panda Strike, we've hired programmers from France, Argentina, Germany, and India. Some of them live in the US, some of them live in their home countries, some of them have wandered through Australia and Thailand while working for us.

We're certainly aware that programming talent comes from all over the world. And I'm not saying American immigration law is perfect. There are foreign-born programmers we would hire in a heartbeat if their visas didn't make them captives of their current employers.

But we use modern technology and modern development practices, and we recommend you do the same. One easy way to start: let the other 95% of American programmers in. If you're building a company, and you want access to the best possible talent pool, you need to know how to manage remote developers. If you want to learn about that, check out our blog.

My opinions are my own, of course. That was just a shout-out.

Also, a quick note: there are some uncredited photographs in the video. Lost track of the links when I was making it. If they belong to you, sincere apologies, and feel free to email me:

Monday, June 15, 2015

The Future Is Full Of Broken Machines

John McCarthy created Lisp in 1958. My hope is that Clojure has finally mainstreamed it, although it might be too soon to say. If Clojure fades out, and if you look at the history of Scheme, Common Lisp, and companies like Naughty Dog, it might be more accurate to say that Lisp periodically surfaces outside of academia, but never really achieves escape velocity. Only time will tell.

But let's assume for the sake of argument that Clojure is indeed mainstreaming Lisp. Even if it's not true, JavaScript is mainstreaming functional programming, which is pretty close.

If that's the case, it's kind of horrifying: the optimistic interpretation is that Lisp took 49 years to reach the mainstream (because Clojure was released in 2007).

Brendan Eich came to Netscape because they told him he could put Scheme in the web browser. So we could stretch the definition and use JavaScript's 1995 release date instead. But we'd be seriously fudging the numbers, and we'd still have Lisp making a 37-year voyage from new paradigm to mainstream acceptance.

One of the classic books of the tech business is Geoffrey Moore's Crossing The Chasm, which is all about how to transition a product from the early adopters to the mainstream. If you buy the narrative that all programmers are eager explorers of the future, it's pretty ironic that a programming language should have such a hard time making this transition.

But let's be honest here. Most programmers are very conservative and tribal about their toolsets. And with good reason: programming any particular language requires a lot of specialized knowledge and experience. Once you get good enough at something intricate and challenging that you can charge a lot of money for it, you usually want to stick with it for a while. If you dive into a Clojure code base after years of writing C, it might be uncomfortable, awkward, and extremely unprofitable.

There's also something paradoxically both intimate and mechanistic about the way that wrapping your head around a programming language can change the way you think, and thus, to some extent, who you are. Learning your second programming language thoroughly and well is a lot harder than your fifth or your sixth. Programmers risk a phenomenon of paradigm freeze, similar to the phenomenon that psychologists have identified as "taste freeze" in music:
From around the age of 15 years old, music tastes begin to mature and expand as listeners increase the diversity of the music on their playlists.

Tastes appear to change most quickly through the teenage years until the age of about 25 when this sense of discovery slows and people move away from mainstream artists.
I even saw a Douglas Crockford keynote where he said that the only way you can advance a new programming paradigm is by waiting for the entire current generation of programmers to retire or die.

Let's pretend that we have all the cynicism and despair that anyone would get from working at Yahoo for a long time, and agree with Mr. Crockford, for the sake of argument.

It would stand to reason that there must be new programming paradigms today that have not yet crossed the chasm.

I believe that this is probably true, and I have two specific examples. The irony is that both of these paradigms are embedded in technologies we all use every single day. Yet I would not be surprised at all if they remained widely misunderstood for the next 50 years, just like Lisp did.

One of them is git.

You Probably Don't Understand Git (For Large Values Of You)

git's a completely decentralized technology, which requires no master branch whatsoever (master is just a name).

But people typically treat git as a completely centralized technology which depends absolutely on its center, GitHub.

You've probably heard some variant of this story before:
Panda Strike's CEO, Dan Yoder, told me a story about a startup where he served as CTO. GitHub went down, and the CEO came into his office, saying, "we have to move everything off GitHub!" He was upset because no programmer could do any work that day. Except, of course, they could. You choose one person's repo as the temporary canonical master, fire up sshd, and replace GitHub with your laptop until their servers come back online.
One time, I worked at a company which was nearly all remote, with a small office in one city but plenty of programmers in other cities. Soon after I joined, we all spent a few days hacking and hanging out in a cabin in the woods. Our WiFi was very unreliable, so we were unable to reach GitHub. We just used gitjour, which wraps Bonjour, Apple's Zeroconf networking tech, to host and advertise git servers over small local wireless networks. In other words, one person said "I'm the canonical repo now," and we all connected to their computer.

The point is, git doesn't depend on GitHub. GitHub adds value to git. But to most people, the major difference between git and Subversion is that there's a web site attached to git.
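The scenario above, with one laptop standing in for GitHub, is easy to demonstrate. Here's a runnable sketch in Python: two temp directories stand in for two machines, and one clones directly from the other, no central server anywhere. It assumes the git binary is on your PATH; in real life the clone URL would be an ssh address pointing at a colleague's laptop, but everything else is identical.

```python
# Two "machines" are just two temp directories here; in real life the
# second one would be a colleague's laptop reached over ssh.
import subprocess, tempfile, os

def git(cwd, *args):
    """Run a git command in the given directory and return its stdout."""
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

alice = tempfile.mkdtemp()          # stands in for a teammate's laptop
bob = tempfile.mkdtemp()            # stands in for yours

git(alice, "init", "-q")
git(alice, "config", "user.email", "alice@example.com")
git(alice, "config", "user.name", "Alice")
with open(os.path.join(alice, "app.py"), "w") as f:
    f.write("print('hello')\n")
git(alice, "add", "app.py")
git(alice, "commit", "-q", "-m", "initial commit")

# Bob clones straight from Alice -- no GitHub, no central server.
git(bob, "clone", "-q", alice, "work")
log = git(os.path.join(bob, "work"), "log", "--oneline")
print(log)   # shows Alice's commit, fetched peer-to-peer
```

Swap the local path for `ssh://alice-laptop/home/alice/project` (a hypothetical hostname) and you've reproduced the "GitHub is down, use my laptop" recovery exactly.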

On Panda Strike's blog, I went into detail about this:
GitHub is also hierarchical even though git is flat. If GitHub only added a center to git, it would have commits flow to the center of a web of repos. But that's not how it works. On GitHub, pull requests have to flow upwards to attain lasting impact.

It's this:

Not this:

Using GitHub adds a center and a hierarchy to git. These are usually beneficial. But, as I explore in that blog post, the added-on center and hierarchy become a problem when you have a project with a thriving community but a disinterested creator.

And the real downside of this tradeoff isn't that edge case. The real downside of treating git as if it were centralized is that lots of people assume that it is centralized. To a lot of people, this entirely new paradigm of distributed version control is basically just Subversion with a web site and a smiling cartoon octopus/cat beast from the island of Dr. Moreau.

You Probably Don't Understand HTTP Either

HTTP has this problem too. Not only that, HTTP's had this problem for more than twenty years. People who don't understand HTTP are constantly reinventing features that the protocol already has, and moving those features from the protocol layer to the application layer in the process.

There are many examples, but the biggest and most egregious are media types, the POST method, and, of course, REST.

Media types matter because HTTP has a type system. It's based around the fundamental, important, and seemingly forgotten idea of hypermedia. This idea is so powerful that it basically puts an everything-is-an-object system like Smalltalk or HyperCard around the entire planet; but it's so frequently under-exploited that it's almost just a footnote. (But a footnote which can make your API traffic incredibly fast.)
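To make the media-type machinery concrete, here's a toy version of server-side content negotiation: the server picks a representation based on the client's Accept header. This is a deliberate simplification of what the HTTP spec defines (quality values only, no wildcarded subtypes or media-type parameters), so treat it as a sketch rather than a spec-complete parser.

```python
# Toy content negotiation: choose the best representation the client
# accepts, honoring q= quality values. Simplified relative to RFC 7231.
def negotiate(accept_header, available):
    """Pick the best media type the client accepts, or None."""
    prefs = []
    for part in accept_header.split(","):
        media, _, q = part.strip().partition(";q=")
        prefs.append((float(q or 1.0), media.strip()))
    # Highest quality value wins; '*/*' matches anything we can serve.
    for _, media in sorted(prefs, reverse=True):
        if media == "*/*" and available:
            return available[0]
        if media in available:
            return media
    return None

print(negotiate("text/html;q=0.8, application/json",
                ["text/html", "application/json"]))  # application/json
```

The point is that the protocol already carries the client's preferences; the application doesn't have to invent a `?format=json` convention on top.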

With POST, the situation's improving, but for decades, POST was the go-to HTTP verb for nearly everything a Web app ever did. Ironically, however, POST was intended as a footnote. HTTP's basically a giant, globe-spanning key/value store, but just in case, the spec included POST for any kind of random operations that weren't covered in the primary use cases of GET, PUT, and DELETE.
The core operations for a key-value store are get, put, and delete. As you'd expect, each of these corresponds to a well-defined HTTP verb. And by well-defined, I mean that they're more than just window-dressing to indicate intent. For example, a client might cache the response to a GET request exactly because it's defined to allow that.

But HTTP includes a fourth verb, POST, which provides for cases where strict key-value store semantics don't suffice. Rather than take the pedantic tack of insisting that everything fit into a single abstraction, HTTP gives you POST as a fallback.

Unfortunately, for historical reasons, this led developers to misunderstand and overuse POST, which, in turn, contributed heavily to the confusion that surrounds HTTP to this day.
In practice, most web developers have looked at POST as the mechanism which enables RPC on the web. "If I want to prompt the server to perform an action of any kind, I use POST." This meant that a huge number of HTTP requests over the past twenty-plus years could have used HTTP verbs to identify their purposes and intents, but instead had the application layer figure it out.

Even systems like Rails, whose developers realized that they could use HTTP verbs for this purpose, lost track of the basic idea that HTTP was a big key/value store. Instead of recognizing that PUT maps exactly to the act of putting a new key in a hashtable, they chose, with no obvious rationale, to consider PUT equivalent to the "update" in CRUD, and POST equivalent to CRUD's "create."
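The key/value reading is easy to demonstrate. Here's a toy HTTP key/value store in pure standard-library Python: PUT really is put, GET really is get, DELETE really is delete, with no application-layer routing logic deciding what each request "means." A real service would add Content-Type, ETags, and auth; this sketch only shows the mapping.

```python
# A toy HTTP key/value store: the path is the key, the body is the value.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

STORE = {}

class KV(BaseHTTPRequestHandler):
    def do_PUT(self):                       # put(key, value)
        length = int(self.headers.get("Content-Length", 0))
        STORE[self.path] = self.rfile.read(length)
        self.send_response(204); self.end_headers()

    def do_GET(self):                       # get(key)
        if self.path in STORE:
            self.send_response(200); self.end_headers()
            self.wfile.write(STORE[self.path])
        else:
            self.send_response(404); self.end_headers()

    def do_DELETE(self):                    # delete(key)
        STORE.pop(self.path, None)
        self.send_response(204); self.end_headers()

    def log_message(self, *args):           # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), KV)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

With the server running, `PUT /greeting` stores a value under that key and `GET /greeting` reads it back; the verb alone tells you (and every cache and proxy in between) what the request does.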

Using the application layer to handle protocol-level information makes web apps slower to run and more expensive to build and maintain. If we could total up the dollar value of this misplaced effort, it would be quite a lot of money. The same goes for rebuilding Basic Auth by hand on every site and app since day one.
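The Basic Auth case is a good illustration: the protocol already specifies how credentials travel, so the simple case needs no hand-rolled login form or session machinery at all. Here's a minimal sketch of both sides using only the standard library; the header format is the one HTTP defines (RFC 7617).

```python
# Basic Auth, as the protocol already defines it: credentials travel in
# the Authorization header as base64-encoded "user:password".
import base64

def basic_auth_header(user, password):
    """Client side: build the Authorization header value."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def check_basic_auth(header_value, users):
    """Server side: decode the header and check against known users."""
    scheme, _, token = header_value.partition(" ")
    if scheme != "Basic":
        return False
    user, _, password = base64.b64decode(token).decode().partition(":")
    return users.get(user) == password

header = basic_auth_header("ada", "s3cret")
print(header)                                       # Basic YWRhOnMzY3JldA==
print(check_basic_auth(header, {"ada": "s3cret"}))  # True
```

(Plain Basic Auth over TLS, with real password hashing on the server, covers a surprising number of the login forms people rebuild from scratch.)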

As for REST, it's a huge topic. For now, just understand that this mountain of errors we're looking at is really just the tip of an iceberg.

Superset The Dominant Paradigm

To paraphrase William Gibson, the future is already here, it's just not widely recognized. People in general find it a lot easier to put a new, unfamiliar thing in a familiar category than to wrap their heads around a new idea, and that's true even when the new idea doesn't really fit in the category they choose for it. Designers even do this on purpose; for instance, it's not an accident that getting on an airplane feels a lot like getting on a train, and the reason isn't that trains are necessarily great models for organizing transit. They're good, but that's not the reason. The reason is that when flight first became a widespread technology, it scared the shit out of people. Designers made it look familiar so it would feel safe.

In 2008, GitHub basically did the same thing. git's fundamentally a functional data structure, but that sales pitch will only work for a few very unusual people. "Imagine if Subversion could handle many more branches at a time" is a much easier sell. Likewise, treating hypermedia like a bunch of remote procedure calls was just easier for a lot of people.
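The "functional data structure" claim is worth unpacking, since it's the part of the pitch that gets lost. A toy illustration: git objects are immutable and named by a hash of their content, so history is a persistent DAG, and any two repos can compare or merge histories without a central coordinator. Real git hashes a binary object format with SHA-1; this sketch just hashes strings.

```python
# A toy content-addressed object store, in the spirit of git's object
# database: objects are immutable, and the key IS the hash of the value.
import hashlib

OBJECTS = {}

def store(content):
    """Content-addressed put: returns the hash naming this object."""
    sha = hashlib.sha1(content.encode()).hexdigest()
    OBJECTS[sha] = content
    return sha

def commit(tree, message, parent=None):
    """A commit is just another immutable object pointing at its parent."""
    return store(f"tree {tree}\nparent {parent}\nmessage {message}")

tree = store("README: hello world")
c1 = commit(tree, "initial commit")
c2 = commit(tree, "second commit", parent=c1)

# Changing anything changes every downstream hash, which is exactly why
# branches can diverge and merge safely with no central coordinator.
print(c2 != c1)            # True
print(c1 in OBJECTS[c2])   # the child literally contains its parent's name
```

Because identical content always hashes to the same name, two laptops that have never spoken to the same server can still agree, byte for byte, on their shared history.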

But here's where I disagree with Mr. Crockford: I believe that the idea that everybody has to understand a paradigm, for that paradigm to matter, is itself an outdated paradigm. After all, both HTTP and git have been wildly successful despite consistent and incredibly widespread misuse.

Maybe the key is just to superset some existing paradigm, so that the late adopters can use their old paradigms, while early adopters use their new paradigms, all within the same technology. This approach certainly worked for JavaScript, and it might even be the secret sauce behind git and HTTP's success stories too.

Sunday, June 14, 2015

Mad Max vs. The Babadook: Australian Feminist (And "Feminist") Movies

Two movies. Both from Australia. Both with feminist themes.

Mad Max has a male director, and The Babadook has a female one.

Mad Max has a tumblr about how feminist it is. But in terms of story mechanics, only one of the female characters takes actions or makes decisions which push the story in any direction at all. And that female character sports a masculine look.

I knew a screenwriter once who said that not every actor gets to play a character. What an actor calls a character only counts as a character in screenwriting terms if they play a role in shaping the story. Some actors just play window dressing, situations, punchlines, or MacGuffins. He was talking about movies like Mad Max.

This "feminist" movie is packed to the gills with women who act as props. They take a backseat for almost the entire film. I'm not even speaking figuratively. There is a backseat, and they sit in it.

Story is what makes people give a shit. For the first hour of Mad Max, I thought I would be coming back the next day to see it three or four times in the theaters. After the first hour, I was bored and ready to leave - especially since the second hour is basically just a repeat of the first hour, but with motivations that make much, much less sense.

I heard a snort of contempt from someone else in the audience when Imperator Furiosa did her weird little "we're not white people" tribal gestures with her long-lost fam. A dude somewhere behind me barked an aggressive, approving laugh when Max told Furiosa that she was the only woman who could get in the truck. You lose the audience, you have nothing.


The Babadook centers on a woman, and it puts that woman in conflict with her sister, her sister's (female) friend, her niece, a little old lady who lives next door, and a female bureaucrat. These female characters, in actor terms, are female characters in screenwriting terms as well. They make the story happen.

It's a horror movie, and it's scary as fuck. It's also emotionally moving; because there's so much story, there's also a ton of giving a shit.

If you're a female actor and you got a role in Mad Max, it's not much different than working on the average rap video. You're going to wear revealing clothing. You'll mostly just be standing around looking pretty the whole time - but this time, you're doing it for feminism.

The kind of feminism where the only woman who actually matters in the entire movie dresses like a man. Backed with the kind of story that would never persuade anyone who didn't already agree.

Meanwhile, if you're a female actor and you got a role in The Babadook, you're going to have to work, because your role affects the story.

And the worst misogynist could watch this thing, feel sympathy for this woman, and want her to succeed.

I know a lot of people are starved for feminist semiotics in cinema, yet for some reason they won't satiate that hunger by seeing any movies except for obvious Hollywood stuff. They're happy about Mad Max, and I'm glad they got at least a taste of what they wanted. But if you want the real thing, you don't have to look very hard.

If you're hungry for feminism in movies, you should know that The Babadook is a tomato, and Mad Max is the kind of ketchup they make out of corn starch and red food coloring.

One of these movies is an awe-inspiring chase scene, punctuated by terrible dialog, and followed up by a crappy remake of its own first hour. As a chase scene, it's a work of genius. If it ended after the first hour, it would be the best short film ever made.

But the other one is an actual movie.

Wednesday, June 10, 2015

Browser MIDI Is About Hardware As Much As Music

You can use MIDI to make music in your web browser, and that's going to open up a lot of new possibilities.

But it isn't just about the music. There's such a huge range of MIDI controllers available that the web browser just became a perfectly viable platform for very basic embedded systems. One major, important trend in music software is external controllers which completely replace the computer as a user interface.

With the Traktor S2, released about four years ago, you didn't need to look at your computer very much:

With the Traktor S8, you don't need to look at your computer at all:

By contrast, here's somebody using MIDI controllers to make music in the browser:

They're looking at their computer while they do it. But they won't have to for very much longer. Web companies will discover the same thing that musical instrument manufacturers have learned. And computers are shrinking rapidly.

Monday, June 8, 2015

Building Laser Guns In Your Garage Is A Thing Now

The same guy also enjoys attaching laser guns to robots for fun.

Sunday, May 24, 2015

Mysterious Project Involves Circuit Spider

I'm working on a project. A circuit spider is involved.

Tuesday, May 19, 2015

Update re Live Electronic Music Performance Design

Last year, I wrote about an idea I had — a way to bring back some elements of classic raves and see those elements survive better. Over the years, rave music has done incredibly well, while rave events have almost been crushed by governments. The difference is, at a concert, the artists are special and the audience is there to see them, while at a classic rave, things were set up for the dancers first and the DJs second.

My design was a hybrid of a concert, a rave, a hippie drum circle, and a video game arcade. A performer plays drums in the center, while audience members can play along in little drum pods scattered in a circle around the performer. All the drums are electronic. They play sounds like a normal drum, but they also trigger video software. Thus the "audience" plays a part in creating the event.

Contrast this with the amazing setup the Glitch Mob uses to perform live:

I love this and hate it too. It's excellent work. It's beautiful. It's badass. It incorporates, updates, and adapts Japanese taiko drums while still remaining respectful of the original source material — a balance which artists often get wrong. But it's still 100% about the artists being special and the audience being there to see them, which to me seems much more like a rock concert than a club or a rave.

Also, at one point in the video, one of the Glitch Mob is shocked that a Hollywood set designer can draw, which is kind of ridiculous, because that's the job. And as a programmer, I find some moments silly too: they wrote custom software, but all it seems to do is function as a bus for controller input. In the age of Overtone and Quil, that's kind of a letdown, especially given the stuff other artists are doing with custom software.

Anyway, back to my own design: in addition to these little 3D sketches, I also wrote a basic version of the software I had in mind. My drumming in this video is terrible, and so is the software, really, but it illustrates the basic idea. Hitting the drums triggers color changes in computer-generated visuals.

I've been outdone in this category as well. This is a promo video for the Critter & Guitari Rhythm Scope, an analog video synthesizer which responds to sound:

The interesting thing about this, to me, is that it's analog rather than digital.

Speaking of which, the performer in my design had a strobe light attached to their drum set. It's the little black box with a grey mesh:

I bought a strobe light and attempted to integrate it with my electronic drum kit using a protocol called DMX. Got absolutely nowhere, although I found some existing solutions in Ruby and Node (using both CoffeeScript and JavaScript).

But I've discovered that a small company in Italy makes a Eurorack solution for this, which links DMX to CV instead of MIDI. (Rolling your mouse over the bottom of the video brings up an audio control, although this rather stupidly assumes you're on a computer, not a phone or a tablet.)

More news as events warrant.

Monday, May 18, 2015

Underrated Synth: The Korg Wavestation

Never buy music gear without looking it up on YouTube and Vintage Synth Explorer first (or Gearslutz, or ModularGrid). And never pay for music gear without looking up the price history on eBay for the last three months. It takes five seconds and it'll save you a lot of money.

I recently found a Korg Wavestation SR on eBay for less than $200. Although they sometimes go for less, they usually go for more. I've had my eye out for a Wavestation for a long time, and seeing one in Richie Hawtin's live setup didn't hurt. But to be sure, I checked YouTube, where I found this demo:

It's painfully 90s, and it sounds as if it was made on the same machine that the entire X-Files theme song was made on, but that's because it was. It's also kind of awesome, in a painfully 90s way.

Wednesday, May 13, 2015

Strong Parameters Are A Weak Schema

Ruby on Rails went off the rails a long time ago.

I don't work with Rails today. But, like so many other developers, I kept working with Rails for many years after the Merb merge. Because I loved Ruby, and because the Rails developer experience remains a thing of beauty, even today.

I stuck around for Rails 4, and one of the changes it made was silly.

Rails has always had a nice way of sanitizing user input coming from ubiquitous forms. Up until Rails 3, the solution was to list accessible fields right in your models. Then Rails 4 came along and introduced a different solution - strong_parameters, allowing you to take greater control over the sanitizing process.

As is often the case with Rails, the real problem here is that the core team failed to recognize a classic problem of computer science, because they underestimated the importance of API-centric web development and perceived the problem purely in terms of showing a web page to a user.

What Rails Was Thinking

Before I get into that, I just want to summarize the problem from the Rails perspective: you've got input coming in from users, who are filling out web forms. They might be up to mischief, and they might use your web form to cause trouble. So you have to secure your web forms.

The classic Rails solution for securing a web form: attr_accessible. Since models are the only way Rails puts anything into a database, you can recast "securing a web form" as validating an object. It makes perfect sense to say that code which secures an object's validity belongs in that object. So far, so good.

attr_accessible was a whitelisting mechanism which allowed you to specify which model attributes could be mass-assigned. The default way of updating an object in Rails, update_attributes, would otherwise allow a user to update any attribute of a model, including (for example) their authorization privileges.
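
In practice it looked something like this (a minimal sketch; the model and attribute names are my own, not from any real app):

```ruby
# Rails 3-era model: only whitelisted attributes can be mass-assigned.
class User < ActiveRecord::Base
  attr_accessible :name, :email
  # :admin is deliberately absent from the whitelist, so a forged form
  # field like user[admin]=true gets silently dropped by update_attributes.
end
```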

But this whitelisting was disabled by default. You had to kick it into gear by calling attr_accessible at least once, in your model. People forgot to do this, including people at GitHub, a very high-profile company with great developers, which got very visibly hacked as a result. People responded by writing initializers:
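
(The original snippet didn't survive here; it was typically something along these lines - a reconstruction, not the exact code:)

```ruby
# config/initializers/attr_accessible.rb (filename is illustrative)
# Whitelist nothing at the base class, which flips the default: every
# model must now declare its own attr_accessible list before mass
# assignment works at all.
ActiveRecord::Base.send(:attr_accessible, nil)
```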


(Obviously, a better way to do that would be to wrap it in a method called enable_whitelist or something, but that's a moot issue now.)

People also responded by writing plugins, and in Rails 4, one of these plugins moved into Rails core.

So this is what changed:
  • attr_accessible had an inverse, attr_protected, which allowed you to use a blacklist instead of a whitelist. strong_parameters only permits a whitelist.
  • The whitelisting default changed from off to on.
  • The code moved from the model to the controller.
David Heinemeier Hansson wrote up the official rationale. I've added commas for clarity:
The whole point of the controller is to control the flow between user and application, including authentication, authorization, and, as part of that, access control. We should never have put mass-assignment protection into the model, and many people stopped doing so long ago ...

An Alternative Approach

Let's look at this from a different perspective now.

Say you're building a web app with Node.js, and you want to support an API as well as a web site. We can even imagine that your mobile app powers much more of your user base, and your web traffic, than your actual web site does. So you need to protect against malicious actors exploiting your web forms, as web apps always have. But you also need to protect against malicious actors exploiting your API traffic.

At this point, it's very easy to disagree with Mr. Hansson's claim that "we should never have put mass-assignment protection into the model." These two "protect against malicious actors" problems are very nearly identical. You might have different controllers for your API and your web site, and putting mass-assignment protection into those controllers could mean implementing the same code twice. Centralizing that code in the relevant models might make more sense.

Rails solves this by quasi-centralizing the strong_parameters in a private method, typically at the bottom of the controller file. Here's the example from the official announcement:
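
(The embedded example didn't survive here; reconstructed from memory of the strong_parameters announcement and README, it looked roughly like this, so treat the details as approximate:)

```ruby
class PeopleController < ActionController::Base
  # Mass-assigning straight from raw params now raises
  # ActiveModel::ForbiddenAttributesError; you go through the
  # whitelisting method instead.
  def create
    Person.create(person_params)
  end

  private

  # The quasi-centralized whitelist: a private method at the
  # bottom of the controller.
  def person_params
    params.require(:person).permit(:name, :age)
  end
end
```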

But you could also just use JSON Schema. All your web traffic's probably using JSON anyway, all your code's in JavaScript already, and if you write up a schema, you can just stipulate that all incoming data matches a particular format before it gets anywhere near your application code. You can put all that code in one place, just as you could with models, but you move the process of filtering incoming input right up into the process of receiving input in the first place. So when you do receive invalid input, your process wastes fewer resources on it.

(This is kind of like what Rails did, except you can put it in the server, which in Rails terms would be more like putting it in a Rack middleware than in a controller.)

The funny thing is, writing a schema is basically what Rails developers do already, with strong_parameters. They just write their schemas in Ruby, instead of JSON.

Here's a less cluttered example:
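
(This snippet didn't survive either; it presumably wrapped the key line in a private controller method, something like this - the method name is my guess:)

```ruby
# Hypothetical private controller method whitelisting signup parameters.
def signup_params
  params.require(:email).permit(:first_name, :last_name, :shoe_size)
end
```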

Note especially the very schema-like language in this line:

params.require(:email).permit(:first_name, :last_name, :shoe_size)

All you're doing here is permitting some attributes and requiring others. That's a schema. That's literally what a schema is. But, of course, it lacks some of the features that a standard like JSON Schema includes. For instance, specifying the type of an attribute, so mischievous Web gremlins can't fuck up your shit by telling you that the number of widgets they want to purchase is `drop table users`. (Rails has other protections in place for that, of course, but the point is that this is a feature any schema format should provide.)
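
To make that concrete, here's a minimal sketch in plain Ruby of what type-aware validation adds. This isn't a real JSON Schema library - `SIGNUP_SCHEMA` and `schema_valid?` are names I've made up, and a production app would use an actual JSON Schema validator - but it shows the shape of the idea:

```ruby
# Toy, hand-rolled stand-in for JSON Schema validation (illustrative only).
SIGNUP_SCHEMA = {
  required:   [:email],
  properties: {
    email:      { type: String },
    first_name: { type: String },
    last_name:  { type: String },
    shoe_size:  { type: Integer }
  }
}

def schema_valid?(schema, params)
  # Every required key must be present.
  return false unless schema[:required].all? { |key| params.key?(key) }
  # Every submitted key must be whitelisted AND correctly typed.
  params.all? do |key, value|
    spec = schema[:properties][key]
    spec && value.is_a?(spec[:type])
  end
end

schema_valid?(SIGNUP_SCHEMA, { email: "a@b.com", shoe_size: 11 })
# => true
schema_valid?(SIGNUP_SCHEMA, { email: "a@b.com", shoe_size: "drop table users" })
# => false, because shoe_size is the wrong type
```

The type check is the part `params.permit` can't give you.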

Rails developers are writing half-assed schemas in Ruby. If/when they choose to farm out parts of their system to microservices written in other languages, they'll have to re-write these schemas in other languages. For instance, they might at that point choose to use a standard, like JSON Schema. But if you're building with the standard from the start, you only have to define that schema once, using one format.

In fact, Rails developers typically re-write their schemas whether they refactor to microservices or not. Many Rails developers prefer to handle API output using active_model_serializers, which gives you a completely different Ruby-based schema format for your JSON output.

Here's an example from the README:
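
(The README snippet didn't survive; from memory it was along these lines, so treat the specifics as illustrative:)

```ruby
class CommentSerializer < ActiveModel::Serializer
  # Serialize the name and body attributes, include the foreign key,
  # and add a hypermedia-style URL for the resource.
  attributes :name, :body, :post_id, :url

  def url
    comment_url(object)
  end
end
```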

This code says "when you build the output JSON, serialize the name and body attributes, include post_id, and add some hypermedia-style URLs as well." It infers a lot from the database, and it's nicer to read than JSON Schema. But you can't infer a lot from a database without some tight coupling, and this syntax loses some of its appeal when you put it side-by-side with your other implicit Ruby schema format, and you have to remember random tiny distinctions between the two. It's kind of absurd to write the same schema two or three different times, especially when you consider that Ruby JSON parsing is so easy that your JSON Schema objects can be pure Ruby too if you want.

strong_parameters really only makes sense if you haven't noticed basic things about the Web, like the fact that HTTP has a type system built in.