Thursday, November 29, 2007

Put This In Your ApplicationController And Smoke It

def instance(conditions = :all)
  instance_variable_set "@#{model_name.pluralize}",
                        model_name.camelize.constantize.find(conditions)
end

before_filter :instance


The idea comes from Dan Yoder.


Update: I am a crackhead. That should use ivar_set, not _get.
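
One more note: model_name isn't something Rails defines on controllers, so the snippet assumes you've written it yourself. A minimal sketch of one way it might look, derived from controller_name - the name and the derivation are just my assumption:

def model_name
  # e.g. WidgetsController => "widgets" => "widget"
  controller_name.singularize
end

With that in place, a hypothetical WidgetsController ends up with @widgets populated before every action.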

And Now, A Unicorn Chaser

My last post was pretty horrific, so:

Censoring Violent Images: A Weird Pattern

The video game Manhunt 2 uses very violent imagery and themes and had to be edited extensively in order to be released in the United States. Selling the game at all is illegal in the United Kingdom. There's a censorship controversy around Manhunt 2 in the video game world, which, thankfully, did not spill over the way the controversy around Grand Theft Auto 3 (from the same game studio) did.



Another small censorship controversy recently failed to spill over into the mainstream as well. Hollywood director Brian De Palma made a "docudrama" about Iraq called Redacted. The final moments of Redacted feature a montage of photographs of dead Iraqis. These are photographs you cannot see in the news, due to the American military's control over the American press. The military restricts the movement and presence of journalists in Iraq by "embedding" them among military units - a practice pioneered by Joseph Goebbels, of the Nazi regime - and has shot at journalists who were not "embedded", including the British journalist Terry Lloyd, whom the American military killed. (The British Government conducted an investigation and found the killing unlawful.)



On that level, these are photographs which are already heavily censored. However, there is a further level of censorship. I've read conflicting reports. Some state that Redacted's distributor, Magnolia Pictures, placed black bars over the eyes of the deceased Iraqis; others say that the images were removed entirely. I've seen several of these conflicting reports so my belief is that both things happened; Magnolia altered some pictures and removed others entirely.



The first thing to realize is that neither Manhunt 2 nor Redacted has gotten great reviews, as far as their merits as a video game or as a movie are concerned. This might be the reason the controversies didn't spill over into the mainstream; the Grand Theft Auto controversy, after all, revolved around an utterly fantastic game. The second thing, just as a disclaimer: I've personally always found it weird that American movies haven't censored violence much in the past, given how much sex is censored in American movies and TV. However, the violence removed from Redacted is nothing compared to what you'll see in Hostel or movies like it, and in fact one of the sequences removed from Manhunt 2 to enable its release was very, very similar to a scene in Hostel 2, which was R-rated.

But those are just caveats. Here's the interesting question. Lots of people say violent movies encourage violent behavior; but violent crime in the US as a whole has diminished in parallel with the rise in violent subject matter in entertainment. (For example, Los Angeles is on track to finish 2007 with the lowest murder rate in four decades.) Likewise, the violent subject matter in Redacted was explicitly and deliberately compiled and presented for the purpose of halting violence. Before Redacted's release, Brian De Palma was quoted in several places saying it would be the images in his movie - the same images which Magnolia removed - which would finally end the war. (Everybody knows it was similar images on national TV which ended Vietnam.)

I took out a really horrifying picture here.

Obviously, these are provocative issues, which is one reason I'm so glad I banned comments. Every question raised or implied here is interesting, but the one thing I want to draw attention to here is the possibility that maybe showing people violent images actually discourages violence. I'm not sure if it's true or not. But all the video game press touches on Grand Theft Auto, and I used to play GTA a lot. Me and my brother would spend hours on this game, and a lot of the time we wouldn't even do the missions. Nine times out of ten we'd just kill innocent people and blow up cars at random until the cops showed up, and then kill as many cops as we could - ideally with the flamethrower or the rocket launcher - until they sent in the FBI. Then the game was to see if you could outlast the FBI in a car chase, and then, if you won that "level", they brought in the National Guard. Then the game was to see how long you could stay alive with the National Guard after you.

I'm sure this sounds like an anarchist's fantasy playground, but that's the point - that's exactly what it was. It was a fantasy playground. All these video game censorship monkeys need to remember that video games aren't real. And what really makes this all relevant is that I used to drive really fast, and these days I don't. Every time I'd leave my brother's house after playing Grand Theft Auto, I'd notice myself driving the way I did in the game. These days I drive slowly and carefully and I always get guys in sportscars who aren't old enough to drink legally going insane with frustration behind me. I don't even break the speed limit.

Just something to think about.

Wednesday, November 28, 2007

My Mindstorms-Fu Is Rusty

Below, a robot which can drive around a table without falling off. Took me hours.

Tuesday, November 27, 2007

You Must Fight The Power

Lego Mindstorms Robotics On OS X: Hello World

Here's the dead-stupid overview of how to get your Mindstorms kit running on OS X.

Getting Stuff

First, build a robot with wheels powered by motors on B and C, and with the ultrasonic sensor plugged into sensor port 4.

Then download and install NXTBrowser.

Next download and install NBC/NXC. Give the NBC compiler exec permissions:

chmod a+x nbc

Coding

Then copy the code near the middle of this nice walkthrough, where it says "NXC Sample Code." Put it in a file in the same directory as your NXC install, and call it wander.nxc. Compile it into something the Mindstorms brain can handle with this command:

nbc -O=wander.rxe wander.nxc

Transmission

Turn on your Mindstorms brain; go to Bluetooth and choose On. You should see the words "Turn On?". When you do, hit the orange button. Boom.

Now go to your Bluetooth settings in System Preferences. A "wizard" will walk you through the steps of connecting to the Mindstorms brain. You may have to input a simple passkey on the Mindstorms brain. You do this just by using the arrow keys and orange button. (It may be possible to skip this step; I don't know.)

Now open NXTBrowser.



Choose "Software." There's an "Upload" button at the bottom of the UI; click it and you'll get a file browser. Choose the NBC file you created, wander.rxe, and NXTBrowser will put it on your Mindstorms brain. Go to the Mindstorms brain and choose "Software Files" - you'll see "wander", which is the program you just added. Hit Run and place robot on floor; you're done!

Alternatives

I'm using NXC; it's a C derivative. There's a Java derivative which looks pretty great, but for me personally the frustration of attempting to build it on OS X was too much. A few people have been successful with it, however. There's also a Ruby library, ruby-nxt; although I'm big into Ruby, and I definitely enjoy it more than C, the Ruby version is based on a remote-control model, as opposed to actually putting a program in the robot's brain. If you're building arm robots I would say go with ruby-nxt instead, but for robots that run around on the kitchen floor, NXC is my weapon of choice.

Dice? Physical Dice?

In 2007?!

The Job I Want Next

For a very long time I've been easy to hire and hard to keep. This was a deliberate thing. I chose it for particular reasons, to accomplish specific goals. It did that. But I've decided to change my system entirely, and become hard to hire, but easy to keep.

As a programmer, I've worked for around 30 different companies, almost always in a short-term capacity, as a consultant or a contractor. If you include my time as a graphic design freelancer and my work as an office temp during my late teens and early twenties - yes, I was working full-time and living in my own apartment at 19 - the number of companies is probably more like 60. There are disadvantages to this, which is why I'm changing my approach, but there are advantages as well.

The primary advantage for me was that short-term work always gives you periods of downtime in-between. I used this downtime to write screenplays and novels, to make music, to read very extensively on a very wide range of subjects, and so on and so forth. A secondary advantage is that practical immersion in a very large number of working environments over the course of several years gives you a very good feel for corporate culture. I can spot signs of trouble in a company the way old sea dogs can smell a storm in the air three days before it happens.



Greek sailor on Flickr

However, during the dot-com downturn, I had a lot more downtime. I had been doing some work I really didn't enjoy - maintaining legacy Perl for an investment bank. I was extremely bored there. I had contacted a recruiter to see if she could get me a better job, and she told me I should be grateful I had a job at all - 200,000 people had just been fired in the city of San Francisco alone. Faced with the realization that I would have to either stay with the job I had or be unemployed, I chose unemployment. I packed up my things and went far away.



A camper I used to live in; my dog and my dad's dog

I figured the downturn would blow over and I'd be able to find a new job easily enough. I was right; I just massively misjudged the scale of time on which this rhythm would operate. The downturn did blow over and finding jobs did become effortless again, but it didn't happen half as quickly as I thought it would. So I had a lot more downtime than I had anticipated.

During this downtime, I was able to spend a lot more time on writing and art, although there were also significant downsides - I ate less food, bought fewer clothes, and owned fewer cool electronic toys. Eventually I came back to technology, partly because I enjoy it, and partly because I enjoy owning toys and eating food. But I definitely came back with a different perspective. Learning to draw is a good example.



I painted this in 2003 or 2004

Firstly, learning to draw changed my perspective on programming because learning to draw will change your perspective on anything. If we're talking about realistic drawing, depicting actual objects, the primary skill - really the only skill - is accurate, detailed observation. That's really all there is to it. When you draw, you make tons of observations, and you only keep the accurate ones. If you practice this mode of thinking, and then apply it to code, you're going to find your code getting more succinct - more powerful, and easier to understand.

Secondly, I took several art classes simultaneously for half a year, struggling with it, then locked myself in my place for a week over winter break with a copy of Drawing On The Right Side Of The Brain and nailed it. One lesson here is that Drawing On The Right Side Of The Brain is a hell of a book, but the most relevant lesson is that working hard at something for half a year and then obsessing over it for a week is probably a lot more productive than working a short-term job for half a year and then obsessing over something else for a week.

If you need a practical experience to cement this lesson for you, ask yourself what's more productive: a marathon coding session before you work for half a year on the problem? Or a marathon coding session after you work for half a year on the problem?

When I look back on that experience today, I realize, not only had I been shortchanging myself in my artistic interests, by sandwiching them in between short-term work, I had been shortchanging myself at work as well, because none of my marathon coding sessions ever took place after half a year's work on the problem. Actually, I say none - in fact, one company, my favorite job, I stuck around for about a year - a whole year - and only left after they managed to convince me to stay a few extra months. I left, get this madness, because I was bored. But during the time at that company, I wrote the code that I'm still proudest of, to this day. I spent a long time maintaining a legacy system, and then I managed to convince my primary customer to let me rewrite it, and the rewrite was such an improvement that people started calling me a Perl guru. (It was years before I encountered the work of real Perl gurus, so for a time I even believed it about myself, but that's another story.)

To produce genuinely excellent work requires consistent, dedicated effort over time. So my theory - that alternating short-term work with artistic "vacations" would give me the best of both worlds - actually just made me a jack of all trades and master of none.

Consequently, the job I want next is not some highly lucrative, short-term consulting, nor a permanent, full-time job. I want a permanent, part-time job as a senior Rubyist. Alternating between work and art was a mistake. I want to address my artistic interests in a consistent, dedicated way, and that includes programming.

I've been looking for this in a tentative, maybe-ish way for a while now, but I've only recently figured out why it really is a good idea. If anybody out there's been wondering, gosh, how do I hire somebody like Giles Bowkett? - now's your chance. Or more accurately, up until now, every moment was your chance, because I was easy to hire but hard to keep, but this period now might be your last chance for several years. If you're a programmer with other interests, like music, art, etc., maybe you should look for the same thing. One very excellent programmer I know gets "fuck programming" phases where all he wants to do is make drum & bass. Maybe he should. Interesting things happen when programmers make music.

Anyway, I'm not going to tell other people they should do what I want to do, especially when I haven't even proven that it's possible yet, but that's the job I want next, and I will say that I'm probably not the only programmer in the world who would enjoy that kind of thing. Why would an excellent programmer get "fuck programming" phases? It's not because they don't love programming. You really can't be an excellent programmer without loving it. It's because they love other things too, and it's good to do the things you love - all of them.

Update: I got an e-mail asking, "Why not just start your own business?" Other people have asked me that as well. So, I guess it's a FAQ. My answer, from the mail:

The thing is, I'm serving two masters either way. If I do programming and art (or actually acting), I've got to think about both programming and acting. If I start my own business, I've got to think about both programming and business. In either case I'm thinking about two things. I think I'll do better if I think about two things I'm interested in, rather than one thing I'm interested in and one thing I'm not interested in.

(I am actually interested in some aspects of business, but more the management stuff than the business stuff - i.e., more healthy development practices than accounting and planning and sales.)

Monday, November 26, 2007

Psychological Neoteny: Prolonged Immaturity As An Evolutionary Adaptation In Science (And Technology)

I have a tattoo of the Decepticon logo.



I spent at least $400 on Legos this year. And there are Lego guys who put me to shame.



The mid-twentieth century saw the rise of the boy-genius, probably because a personality type characterized by prolonged youthfulness is advantageous both in science and modern life generally. This is the evolution of ‘psychological-neoteny’, in which ever-more people retain for ever-longer the characteristic behaviours and attitudes of earlier developmental stages. Whereas traditional societies are characterized by initiation ceremonies marking the advent of adulthood, these have now dwindled and disappeared. In a psychological sense, some contemporary individuals never actually become adults.

Fascinating analysis from a UK psychiatrist.

Seaside/Smalltalk Job In Vancouver

For Deepcove Labs. Posted on ruby-talk by Boris Popov.

The Long Tail (Backwards) In Blog Readership



As usual, the spike is a result of posting something that I figured nobody would ever find interesting.

I guess the moral of the story is post a lot, even if it seems random.

A Threat Decoded (Hollywood Tangent)

(This one has no high-tech content at all - sorry.)

Here's the first in a series of videos featuring speechless actors supporting the writers' strike:



The secret decoder ring, for people outside entertainment: the actors' union has an upcoming contract negotiation, similar to the one which caused the writers' strike. The actors' union has also had a presence at many picket lines during the writers' strike, and a very strong presence at the more high-profile events (which is kind of how it works with actors).

What this really means is that if the writers' strike doesn't end before the negotiation date for the contract with the actors' union, the actors' union will go on strike too.

It's kind of interesting, because this video appears to be a fuzzy warm "we like you guys" thing, but it's actually a multi-billion-dollar threat.

Already, the networks are hearing from angry ad-spot customers who want their money back. Companies who buy ad time from networks tend to pay in millions. Returning millions of dollars to a customer is something companies like to avoid.

Get This Book

Have you ever noticed how, instead of computers becoming smarter and smarter, like in Star Trek, more and more people have become programmers? I don't mean professionally - I mean in the sense that you can program a toaster, or a microwave, or "program" the timer on an old-school VCR.

The explanation is in here:



Also the most interesting observations on screenwriting I've ever read, and the only cogent explanation of video games. If you've got a friend who doesn't understand why you spend hours playing video games, get them this book in the upcoming holidays. It's fantastic.

Sunday, November 25, 2007

How To Customize Safari Beyond All Reason

Here's a screenshot of Safari plus my desktop:



If you use Safari, either on Tiger or Leopard, you'll notice some differences. If you're reading this in Safari right now, you'll definitely see some differences. I had to go into Safari's innards to fix an interface annoyance, and while I was there, I decided to have some fun.

To change your windows so they don't have that "oh look it's high-tech" brushed-steel texture thing going on, do this:

sudo open /Applications/Safari.app/Contents/Resources/English.lproj/Browser.nib/

And then go to Tools > Show Inspector and do this:





Voila - the "cool" brushed silver is gone and the "awesome" drop shadow is gone too.

You can take it further than that. Do this:

ls -laF /Applications/Safari.app/Contents/Resources/*.tif*

...to get a list of all the TIFF-format images used in Safari, the app. If you change all of these images and keep them at the correct size and color depth, you can probably create an entirely new visual theme for Safari if you want. I say "probably" because these edits don't always work. One thing I absolutely hate about OS X is the Help menus, so I tried to remove Safari's Help menu, and unfortunately this destroyed something important, somehow, because after that Safari refused to boot. (Luckily Interface Builder backs up your interface files when you edit them.)

However, this does bring us to the other thing you need if you're going to edit Safari: a backup Web browser.



I like Shiira the best. Geoff Grosenbach used it in a Peepcode screencast, and it's got a happy whale. The UI is also pretty cool, and smart, and comfortable, and there's even a Dashboard Widget version, but honestly, it's all about the whale. Or maybe the fish. Looking at the tail, I think it's actually a fish. Either way, it's definitely happy.

Camino is also good. I use Firefox for dev, of course, but its cursor behavior truly annoys me, so I avoid it for anything but dev. Before I figured out how to fix the interface annoyance that started this all off, I thought I was going to have to give up on Safari entirely, so I checked out some other browsers as well - Opera, OmniWeb, and Sunrise. Sunrise has the most original thinking and the least stability. I didn't really dig Opera. OmniWeb's marketing annoyed me so much that I never really spent much time on it.

Saturday, November 24, 2007

How To Fix Safari's Silly New Textarea Alert

Safari 3 improves over Safari 2 in many ways, but it totally sucks ASS in one way: if you fill out a text area field on a form and then navigate away from the page or close the window, it pops up an aggravating little alert to make sure you really wanted to do that. Apple added this because they were apparently unaware that adults use OS X too.

Just kidding. I don't know why they did it. I do know it makes using nifty auto-submitting Ajax apps a pain in the ass in Safari, where before life was good. Luckily this is really easy to fix.

First you do this:

sudo mate /Applications/Safari.app/Contents/Resources/Defaults.plist

(Assuming you use TextMate.)

And then you do this:



And then, if Safari works like any sane person would expect it to, you have me to thank.

(However, if your computer suddenly explodes, it was all your fault. I am not an Apple developer, or a lawyer. Or a tuna casserole, but hopefully you knew that.)

Update: Jack Nutting told me that there's a pretty good chance you can do it an easier way:

defaults write com.apple.Safari DebugConfirmTossingUnsubmittedFormText false

and adds that "the advantage of using the defaults command (if it works) is that the setting will stick in your user account even when new versions of Safari are released." As far as I can tell, it worked, although I guess we'll have to wait til the next Safari upgrade to be sure.

Facebook Apps: The Facebook Trap

Facebook would like you to believe they are giving you a new platform. Anybody who's been on the Web for a while knows that Facebook is selling old wine in new bottles. And yet building Facebook apps can still make sense.

People like to talk about leveraging social networks, but here's what they leave out. Joining a social network takes five minutes (if that). Staying active in a social network is a full-time job. I know exactly what that's like, because I used to organize and promote nightclubs and raves. Going to a club takes five dollars. Being a clubber or a raver takes time. In fact it eats up your life (but in a good way).



Social networking Web sites aren't platforms. They're not circles of friends, either. They're nightclubs. Some people spend all their time at these things. Some people never go once. A new social network becomes hot, everybody who's everybody is there, and one day, suddenly, without warning, everyone is somewhere else. At most of them all you have to do to get in is show your ID.



There's a lot of money in nightclubs. You go to the Winter Music Conference in South Beach, FL, you can easily spend a hundred dollars just to get into one particular club one particular night. And it won't be the only club you go to that night. There's a lot of money in club music, club clothes, expensive vodka, designer drugs, laser lights, and really big speakers. The people who work that system successfully do very well by it.



But if you're building a Facebook app, you're building a sound system you can never take out of the club. Spending money on something which won't work anywhere else only makes sense if the payoff is immediate. It's not really an investment, because assuming any given social network will persist for any given amount of time defies history. These things have been growing first hip and then stale on a cyclical basis for five years. A New York Times article blamed various executives when Friendster went from hot to not, but fashion features a great deal of randomness. Criticizing upper management for being unable to predict effectively random phenomena is like criticizing them for being unable to defy gravity. It might make a lot of sense to leverage social networking applications for business purposes, but it definitely doesn't make a lot of sense to do so in a way that locks you into any one particular social network.

Some companies think building customized social networks is the answer. Some developers think OpenID is the answer. Personally, I think the smartest thing is to do what nightclub and rave promoters do. They go to other people's raves and other people's nightclubs and they pass out flyers for their own events. If you're building a Facebook app, you want to make sure it directs people to your own site, because one day Facebook won't be fashionable and nobody will be there. Like Myspace - nobody goes there, because it's too crowded.

Friday, November 23, 2007

First They Came For The Moths

And I did not object, because I was not a moth. THEN THEY FED MY BRAINS TO A FUCKING ROBOT!

Wow. Fuck That.

Don't be evil? If this is true, then good luck.

Why I Program In Ruby (And Maybe Why You Shouldn't)

Neal Ford recorded an interesting podcast recently, and it's definitely worth a listen, but there's one point I want to raise - it could be a point of actual disagreement, or "violent agreement", but whatever kind of point it is, I think it matters.

Neal brings up the fact that the same programs can be written in any language which is Turing-complete, so the choice then becomes not "which language can I write this program in?" - because you can write it in any language - but "which language gives me the best options in terms of power and efficiency?"



This question overlooks one of the fundamental design principles of Ruby. First and foremost, Ruby was designed to be enjoyable to program in. In one of many Web interviews, Matz said:

Because of the Turing completeness theory, everything one Turing-complete language can do can theoretically be done by another Turing-complete language, but at a different cost. You can do everything in assembler, but no one wants to program in assembler anymore. From the viewpoint of what you can do, therefore, languages do differ—but the differences are limited. For example, Python and Ruby provide almost the same power to the programmer.

Instead of emphasizing the what, I want to emphasize the how part: how we feel while programming. That's Ruby's main difference from other language designs. I emphasize the feeling, in particular, how I feel using Ruby. I didn't work hard to make Ruby perfect for everyone, because you feel differently from me. No language can be perfect for everyone. I tried to make Ruby perfect for me, but maybe it's not perfect for you. The perfect language for Guido van Rossum is probably Python.


The question of "which language gives me the best options in terms of power and efficiency?" is deemed less important in this set of design principles than the question of "which language gives me the best feeling while I program?"



In Japan, the distinction between craft and art is blurred at most, possibly nonexistent. Japanese theories of art (and craft) often involve principles of harmony and balance. We might debate in America whether programming is an art or a craft, but if you've got one word for the same thing, you're asking whether programming is an artcraft or an artcraft, and the answer, obviously, is "Yes, you idiot." So to say harmony and balance matter in programming might seem esoteric or self-indulgent in America, but it's as obvious in Japan as saying that the sky is blue.

Harmony and balance make you feel good. American Rubyists frequently take up all the points of Ruby's power, expressiveness, and efficiency, but they don't seem to register the point that Ruby was designed to make you feel good. Even Rubyists who want to explain why Ruby makes them feel good often fail to mention that it was expressly designed for that exact purpose. Neal does this in his podcast.

Neal's podcast is mainly about JRuby. JRuby is a first-generation American - a child born here of one foreign parent, Ruby itself. I'm a first-generation American too, and even though I have two human, English parents, rather than one Japanese parent made of code, I think I feel JRuby's pain here. So I'm just going to tell you - every first-generation American sees this happen all the time. Some idea from another country or culture disappears like mist scattered by winds unless Americans already have a synonym for it. If they don't have a word for it, they don't have a box to put it in, and the idea just falls through the cracks.



It's not even an American thing. Everyone who really learns a new spoken or written language - a regular human language - discovers new ideas they didn't previously have words for. Conversely, anyone who tries to tell somebody about an idea, when that person has no word for it, faces an uphill struggle at best.



The idea that a language should be designed from the ground up for the purpose of providing you with an experience of harmony, calm, and enjoyment is fundamentally alien to American programming culture, and when American programmers discuss why they prefer Ruby, the fact that it was designed for their enjoyment vanishes from their own vision even when it's right in front of their eyes. The closest equivalent we have in this country is an idea that the language was designed to be fun - which is similar, but not the same thing. It implies toys, rather than Zen gardens.



David Heinemeier Hansson is one programmer who understands this aspect of Ruby. Maybe it's a Danish thing. In Rails' early days, he made an effort to publicize it, although that's fast becoming obscured in history as Rails turns into the thing everybody uses.



The thing that everybody uses is always a hot topic in technology. People like arguing about "why you should program in Language X." But it's not really about why you should program in X. It's about why I'm going to program in X, and why I think you might enjoy it too. If the number one principle behind designing Ruby is enjoyment, then the number one principle behind using Ruby should be enjoyment too.

Don't program in Ruby because you want power or efficiency. Don't program in Ruby because you think you "should", either. Program in Ruby because you like it. And if you don't like it, don't program in it. Religious wars over programming languages are just silly. The messianic zeal of Christianity's shameful Crusades a thousand years ago still lingers on in Western culture, and one glaring example is the ludicrous idea that there should be one true language or one true editor, or one ring to rule them all. It's much better when programmers can work in multiple languages, multiple editors, and multiple environments. Diversity is healthy for ecologies. This is a point Neal makes in his podcast - he calls it polyglot programming, which is to say multilingual programming. He calls it a positive trend, and I agree.

Wednesday, November 21, 2007

Gmail: Now Even More Beta! (It's True, I'm Grumpy Again)



Update: This link has proven a lot more effective than complaining:

http://mail.google.com/mail/?ui=1

Rails Is A Honda, Not An iPod

Apple embraces open-source software, and yet - bizarrely - the company takes a notoriously hostile attitude to users who want to modify Apple products they buy. It took a class-action lawsuit and general outrage to get Steve Jobs to open up the iPhone to developers. If you want to change the battery on your iPod, you're out of luck. It's Steve's way or the highway.



By contrast, the Honda enthusiast community has come up with more options, add-ons, and hacks - for lack of a better word - than anyone besides Ramanujan could ever count.



Based on my own completely unscientific survey of the Rails community, I'm basically certain that more of us own iPhones than heavily customized Hondas. But we ultimately have more in common with the Honda custom community than we do with the community Apple thought they were selling the iPhone to. The disparity between the community Apple thought they were selling to and the community they really were selling to is part of the set of misperceptions which led Apple to think shutting developers and users out of their iPhones wouldn't be a big deal. An interesting topic - but the main thing here is what Rails users have in common with people who want their cars to glow in the dark.



Exhibit A is jRails, a Rails plugin which allows you to swap out all your Prototype and Scriptaculous JavaScript and replace it seamlessly with jQuery JavaScript instead. All your Rails JavaScript helpers remain intact, which means you can migrate from Prototype and Scriptaculous without changing a single line of Ruby code. All you have to change is your custom JavaScript - and that's the whole point of the exercise, because some people prefer to write their custom JavaScript using jQuery.

Exhibit B is auto_migrations, a fantastic Rails plugin which hijacks the entire migrations system and, while keeping it intact, replaces the developer's interface to it - a potentially sprawling mess of individual migration files - with one single, infinitely more elegant, self-migrating schema file. Exhibit C is the discussion in Pat Maddox's blog about how to make ActiveRecord's implementation of Active Record the pattern more consistent with good OOP principles.

The most important piece of evidence is Rails' plugin architecture itself. The whole thing is designed so you can rebuild any piece of it any way you want to. Because he missed this fundamental idea, one Microsoft programmer wrote a post dissing Rails which seemed to almost dip into sheer insanity:

The Rails Team needs to accept that they are now a VENDOR, not radical mavericks.

I responded that Rails core isn't a vendor because nobody's giving them vendor money, and that if you want to change the framework, you've got a flexible plugin architecture and a powerful dynamic language which each make it very easy to do. He replied:

The core point of what you’re addressing here is that "just because people use it, doesn’t mean Rails has any responsibility to people" - I say that’s not so. But we could wave our arms all day about this, and really it comes down to what you’re willing to tolerate as a software provider.

Just think on it for a bit. Your livelihood (if you’re a Rails guy) is resting on this platform... you cool with that?


My opinion, of course, is that my livelihood rests on me, and always will, irrespective of Rails' future, but that's another question. I bring this up because what really caused the disagreement here was a fundamental difference of opinion on what the word "platform" means. Specifically, what kind of responsibility do you have to your users when you provide a platform? Is the way the user uses the platform your decision, or theirs? If you're Apple, you believe that by providing a platform, you take responsibility for pretty much every aspect of the user's experience on that platform. If you're Honda, you take the attitude that if some wacko wants to make their car look like a spaceship, who are you to stop them?



Rails is a platform which is designed to be easy for newbies to get started with and easy for experts to transform, adapt, and bend to their wills. This is something it shares with Ruby - if Ruby lacks a language feature you want, it's often (though not always) pretty easy to add it on somewhere. It's almost even possible to write Ruby in Python. If you want Rails to do something that Rails doesn't do, you have two options: use something else, or make like Tim Allen back when he was funny and rewire that sucker. See what you can make it do.

Tuesday, November 20, 2007

Saturday, November 17, 2007

1951; 2007











Friday, November 16, 2007

validates_presence_of: Ironic Little Rails Gotcha

validates_presence_of doesn't validate itself. If for some lunatic reason you're up and writing code at 8:26 AM, and you make a standard brain-lacks-coffee error like asking a model to validate the presence of some attribute it doesn't actually even have, the model will go ahead and validate against that missing attribute - preventing you from saving anything.
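
In other words, something like this - the model and the typo are made up, but it's the shape of the mistake:

class Widget < ActiveRecord::Base
  # The real column is "name". validates_presence_of never checks that
  # :nmae exists, so the model just validates against the missing
  # attribute - and nothing can ever be saved.
  validates_presence_of :nmae
end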

Crazy Science Shit

Molecular 'amplifier' boosts DNA computing

DNA-based computing just got a big boost. A method of amplifying weak chemical signals in a way that can be tailored to specific molecules has brought DNA-based circuits closer to practical applications.

Machine made of electricity and microbes

MECs are modified versions of microbial fuel cells, which are used to harvest electrons produced by metabolising microbes as they feed to generate electricity. The electrochemical reactions are balanced when the used electrons are combined with oxygen and hydrogen ions also released by the microbes to form water.

Logan's MECs are like microbial fuel cells in reverse. Instead of charge being drawn out, it is pumped in, and the hydrogen ions combine with electrons alone to form hydrogen gas. Applying roughly 0.5 volts provides enough energy to drive thermodynamically unlikely chemical reactions that break down the dead-end products that limited previous attempts to ferment hydrogen.


Water-gathering machine based on spider webs

A portable dew-harvesting kit inspired by a spider's web is being developed by Israeli architects for use in areas where clean and safe water is scarce.

Reading mind of paralyzed guy

Electrodes have been implanted in the brain of Eric Ramsay, who has been "locked in" - conscious but paralysed - since a car crash eight years ago.

These have been recording pulses in areas of the brain involved in speech.

Now, New Scientist magazine reports, they are to use the signals he generates to drive speech software.


Scientists designing reality wormholes

[They] came up with the idea by building on the mathematical theory that gave us the invisibility cloak - a device that was realized for microwaves last year. Whereas in an invisibility cloak rays of light are guided around a cylindrical or spherical volume like water flowing around a stone, a wormhole would have light guided around a more elaborate, tubular shape.

From I&T

Modern-Day Labor-Saving Goo!



Who Needs Designers?

Thursday, November 15, 2007

What Is Threequals (===)?

There was a post on ruby-talk recently, demanding to know why these statements don't display the symmetry you would normally expect of equality:

(1..5) === 3 # => true
3 === (1..5) # => false


The answer is simple: ===, aka threequals, isn't an equality operator - it's a method, and it's used for weird corner cases connected to but not exactly corresponding to equality. Module#=== tests whether an object is an instance of self or any of self's subclasses. Object#=== is a synonym for Object#==. Likewise String#=== is a synonym for String#==, while Regexp#=== is a synonym for Regexp#=~.
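
A couple of quick IRB checks make those synonyms concrete:

>> String === "foo"   # Module#===: instance-of check
=> true
>> /fo+/ === "foo"    # Regexp#===: does the pattern match?
=> true
>> /fo+/ == "foo"     # plain == does no such thing
=> false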

Range#=== tests for inclusion; range === some_object returns true when that object falls within the range. For example:

>> ('a'..'c') == 'b'
=> false
>> ('a'..'c') === 'b'
=> true


So (1..5) === 3 is true because 3 is in that range, but 3 isn't equal to that range, so 3 === (1..5) doesn't return true. Logical enough. If you're wondering how you're supposed to remember all this, you aren't. You don't really even need to know it. The === method exists for controlling how a case/when block will evaluate an object. It should never be used by humans. It is code written to be consumed by other code, specifically, by case and when.
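
To see what case and when actually do with it, here's the mental rewrite - each when clause calls === on its value, passing in the thing you're switching on:

case 3
when 1..5    then "in the range"   # really (1..5) === 3
when Integer then "some integer"   # really Integer === 3
end
# => "in the range"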

Delving into the inner workings of Ruby is lots of fun if you know what you're doing and a recipe for trouble if you don't. (If you sort of know what you're doing, but not totally, then it's a recipe for fun trouble.)

Just to illustrate what can go wrong, here's one of many ways to drive IRB insane:

>> def ==(object)
>> puts "asdf"
>> end
asdf
asdf
asdf
asdf
=> nil
asdf
>> true == false
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
asdf
=> nil

California Hustling

As regular readers of my blog know, I'm very interested in entertainment, as well as programming. I've definitely met some skepticism from people who tell me you can't serve two masters, but I think having only one career is as foolish as having only one girlfriend: it misses the whole point.



Anyway - the funny thing about the entertainment industry is that anybody who writes a book on the business or talks to a camera about it will tell you that the creative people and the business people differ immensely, especially when it comes to integrity. This much appears to be true. But they go on to tell you that this makes entertainment different from any other business, and that, sadly, is patently false. If you're sniffing around the fringes of the entertainment industry, and you have your home in technology, the dynamic of creative "superstars" brokered to large corporations by sleazy, unprofessional hustlers is unbelievably familiar. It feels just like home.

I do a lot of contracts. They give me the flexibility to take time off between projects and write novels, or screenplays, or whatever. This puts me in contact with a lot of technical recruiters. You would never believe some of the bullshit technical recruiters will pull. Indian guys with fake "American" names are normal, and nowhere near an extreme example. As a tangent, I have to say, if you're an Indian dude wondering if you should do that, DON'T. For two reasons. One, I correct Americans on the pronunciation of my name several times a day, every day. I don't want you giving them any excuses. Two, and I wish this was obvious, it's a lot easier to respect somebody when that person tells you their real name.



One time I got a call from a guy with an incredibly strong Indian accent, so much so I could barely understand a word he said. After several attempts I realized he was trying to tell me his name was Sheldon. This was hard to believe, but the two of us had expended so much effort to get that sentence from him to me in an understandable form that I decided to go with it anyway. I said, "How are you, Sheldon?" I even said it with a straight face. The conversation continued, he decided I looked like a good fit for whatever requirement he was trying to fill, and pretty soon he put me on the line with this other dude who had an equally strong, equally incomprehensible Brooklyn accent, who referred to the Indian fake-namer as "my man Sheldon," which you really had to hear to believe.

Here's an attempt at recreating the sound of it. If anything, he was more cartoonish in real life.

These guys ultimately failed to set me up with a gig, and given how shady they were, I have to say I'm glad.

The funny thing is, all this actually really reinforces the idea that programming is an art, and programmers are working artists. It's circumstantial evidence, but it's very interesting circumstantial evidence. With Hollywood hustlers and tech recruiter hustlers, we basically have a case of parallel evolution. The term comes from biology, of course; the dolphin and the shark look similar not because they're related, but because they've adapted in similar ways to the same environment.



For both a technical recruiter and a Hollywood producer, the only real professional requirement is that you know somebody who has talent and skill. Meeting people is a job skill with a very low barrier to entry, which is why Hollywood hustlers are legendary and the tech industry overflows with similar characters every time there's a boom. It's pretty easy to do, especially since, in practical terms, all you really have to do is know somebody who seems to have talent and skill.



I should end all this by pointing out that every indication I have of the "unprofessional" nature of Hollywood comes from Hollywood itself, which means at the very least that you should take it with a grain of salt. Certainly there are people in Hollywood who say the most important things are integrity and good manners. I'm not trying to propagate stereotypes. (And for the record I've met tech recruiters who were totally cool people also.) I just think the parallel evolution thing is very interesting.

Wednesday, November 14, 2007

Functional Programming In Rails Templates

The use case: you've got a Widget which has_many Things. You want a partial which allows you to include form fragments for several Things the user might want to attach to the Widget. You may or may not already have existing Things. If you do, you want to display them; either way, you want to display three blank form fragments, because even a user who's already got Things attached to their Widget might want to add on a few more. You also need unique div ids on the fragments, so that you can add simple JavaScript to display any necessary new form fragments instantly - without an Ajax round-trip to the server and back.

Vividly picturing the PHP or Java to fulfill that use case could result in an image so frightening it disturbs your sleep for weeks. The Rails way is gorgeous:
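
Something along these lines - the partial and field names are placeholders, but the shape is the point: build the collection functionally (existing Things plus three blanks), then iterate over it with an index:

<%# _thing_form.html.erb -- placeholder names throughout %>
<% things = @widget.things + Array.new(3) { Thing.new } %>
<% things.each_with_index do |thing, i| %>
  <div id="thing_form_<%= i %>">
    <%= text_field_tag "widget[things][][name]", thing.name %>
  </div>
<% end %>

Each fragment gets its own div id, so the JavaScript that reveals extra blank fragments never has to round-trip to the server for markup.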

Further map.resources Skepticism

I think map.resources is the devil. This:

<%= link_to "New widget's thing", new_widget_thing_path(@widget) %>

is nifty. Explicit hrefs are the modern goto, after all. But it's not nifty enough to justify this:

def new
end

def create
end


which bothers me so much I can't even stand it. (map.resources won't let you post to a new.)

YouTube vs. Hollywood

During the last writers' strike, the corporations owned the media, but not as much as they do today. You look in the mainstream media, you find very little representation of what's actually going on. 69% of people in Los Angeles support the writers; the media, however, misrepresents them, and consequently support outside of LA isn't as strong. (Update: I was wrong about that last part.)

But the writers have YouTube, which they didn't have 20 years ago.



The corporations derive their control over content from their control over distribution. But the Internet creates entirely new methods of distribution; the only remaining resource for corporate control is therefore copyright on legacy content, and the (questionable) usefulness of existing content creation systems. During the last writers' strike, the media corporations lost ground to cable. If they lose ground to the Internet, they may lose that ground forever.

UI Designers: Time To Rethink Some Assumptions

Because dual monitors are becoming common. It might be a traditional no-brainer that when the user resizes the window and moves it somewhere else, you should open all subsequent windows in that location at that size. But not if the user has two monitors, and especially not if each monitor is a different size or is set to different resolutions.

Tuesday, November 13, 2007

Folklore

A story about magic.

The Writer's Strike Is About The Internet

Rails Newbies: Never Even Bother With Resources

REST is interesting, cool, and in some ways very useful, but if you're new to Rails, don't even waste a second on its resource-oriented paradigm. It's got absolutely none of Rails' usual ease of use and elegance, and it'll just confuse you.

If you use the resource-oriented paradigm at all, use it when you've built several Rails apps in the pre-resource, simpler way.

Monday, November 12, 2007

Unix Power User Overkill And IRB

Here's a line of actual shell code:

! !! | g? ?

The first ! is a custom shell prompt, defined in my profile. The !! is a Unix shell special variable, which means "last command." The | is a pipe, which means "pipe output of preceding into following." Essentially it's Unix-ese for an arrow. The g? is my alias for grep. (Soon to be replaced on my box with ack.) The ? is the only literal character in the whole thing.

What I'm doing here is I've just run svn stat, and gotten back too many files - I have lots of changes to check in. All I want to see is new files I haven't added yet, so I can add them. Those appear with a ?, so I'm saying "do that svn stat again, but grep it for question marks."

What's cool about this is its incredible brevity. What's not cool about it is the minimal legibility. Unix ninjitsu is so terse it's wonderful, but it sometimes makes Perl regexes look like "see Spot run" by comparison.

I've almost got this much terseness in IRB, but not quite. I don't currently have an IRB equivalent to !!. I have one for !$, which means "the last term in the last command," in _, which is enabled if you just do

IRB.conf[:EVAL_HISTORY] = 1000

...(or any number) in your .irbrc. I also have Ben Bleything's utterly essential history code, and an equivalent to this Unix trick:

!523

...which means "repeat command number 523," in the history code's h! method.

Ben's code is already freely available. I think I can hack an IRB !! pretty quickly if I find the time for it. I'm working on releasing all this stuff as a gem, but between work, this blog, my new podcast, my acting class, following the writers' strike, and a new workout regimen which essentially crippled me after only two days, it's still not quite ready for prime time.

Similarly, there's no real equivalent to | g? yet in IRB. If you have a line of code which returns a string, it's very easy to do _.grep(/some*regex/). However, a very common idiom in my own Unix shell usage is h?, which I guess I've elevated from an idiom to a term, but which is essentially just history | grep. That doesn't exist in my IRB yet, but I think adding that will also be pretty easy. It's effortless to get the history, thanks (again!) to Ben's code, but I think (iirc) Ben's code writes to $stdout with puts, so I'll need to write another method to actually grep the history. The cool part is, I will be able to keep my shell term for it, h?, so that part at least will be transparent for me.
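
Here's roughly what I have in mind - a minimal sketch that leans on Readline's history rather than Ben's code, with the method name and every detail subject to change:

# in .irbrc -- grep IRB's readline history; sketch only
require 'readline'

def h?(pattern)
  pattern = Regexp.new(pattern.to_s) unless pattern.is_a?(Regexp)
  Readline::HISTORY.to_a.grep(pattern).each { |line| puts line }
  nil
end

Then h? "svn" in a session prints every history line mentioning svn.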

If I could just figure out how to enable Unix shell tab-completion from within IRB system() calls and backticks, I'd never use bash again.

Update: Ben corrected me - it turns out h! actually already gives you !! if you call it with no arg. That's pretty awesome. I think IRB might be the biggest strength of Ruby for me personally. It leverages Unix. If you're used to thinking in Unix, IRB makes a lot of sense, and being able to do !! in Ruby kicks much ass.

The Turing Test vs. Scammers: Round 2

I've blogged before that I believe the Turing Test was beaten in the 70s, but nobody is willing to admit it because the guy who did it did it as a joke. (I sometimes think academics believe having a sense of humor is cheating.)

Another example of effective Turing Test victories that will probably not be acknowledged by academia: poker bots.

Debugger Flame War, Finally Demystified

I craftily tricked somebody this morning. We were supposed to go through a strenuous workout; instead we walked to a cafe, bought a latte, and walked back. That's a strenuous workout by programmer standards, but my outraged buddy, in his desperate thirst for revenge, resorted to the unthinkable: he brought up that stupid debugger flame war. As a result, I've been thinking about that crap all day.

I think I figured out why it happened:



It is patently ridiculous that my blog be the top search result for these particular terms - unless you assume that many programmers are more interested in wasting time before lunch than they are in working. This is not such an unreasonable assumption, given the law of averages - certainly it's one of the only explanations possible for my blog in the first place - so what I really saw here last month was basically a media backlash.

(This may have been obvious to observers, but it took me by surprise.)

Sunday, November 11, 2007

The Programming Landscape

The RailsConf t-shirts, keynotes, and conference materials put a lot of attention on Rails' dramatically steep acceptance curve. I've seen this before. It's a lot like the early years of the dot-com boom.

There's a pattern to it. Maybe a lifecycle, although it may not be cyclical. Definitely a story.

The Secret Valley

The technology is secret and magical; anyone who discovers it finds an excuse to return to it, because they love it and they think it's wonderful.

Boomtown

The streets are filled with newcomers, and a few newly wealthy original inhabitants.

Suburbia

The technology becomes the standard, and everyone associated with it has two kids and a garage.

I get out when Suburbia hits, so I don't know what happens afterwards, but after a period of time you get something which reappears on my radar:

The Ghost Town

Where tumbleweeds roll down once-thronging streets.

What It Means

The Secret Valley is heaven for the curious. The Boom Town is heaven for the trendy. Suburbia is hell, and should be wiped off the face of the earth. The Ghost Town is heaven for historians and archaeologists.

Actually, the Secret Valley is heaven for the curious who look forward in time, and the Ghost Town is heaven for the curious who look backward in time.

Boomtown and the Ghost Town feature the best programming income. Boomtown favors young hotshots and the Ghost Town draws aging wizards out of obscurity, but both feature more demand than supply. The Secret Valley has a lot of interesting work, but not much money. Suburbia is its precise inverse, in that respect.

Exceptions

HTTP started in the Secret Valley, and was a Ghost Town by the time the rest of the Web was Suburbia. REST found HTTP and claimed it as a Secret Valley, which then turned into a Boomtown for a second time.

Similarly, Seaside is a Secret Valley inside Smalltalk, which appears to otherwise be a Ghost Town. Actually, when you think of it that way, the relation of Ruby and Rails is kind of similar. Ruby is really a giant Secret Valley, full of nooks and crannies, which contains a bustling Boomtown. The residents of the Boomtown are so busy with commerce, many of them have no idea of the treasures lurking in the wilderness nearby.

Rails: Cascading Deletes

Just some deliberate Google bait. If you're wondering how to do a cascading delete in Rails, it's very easy:

class Widget < ActiveRecord::Base
  has_many :whatevers, :dependent => :destroy
end

Deleting a Widget now deletes that Widget's Whatevers as well.

(Josh Susser posted almost exactly this example on a not-Google-able mailing list, I'm just paraphrasing to make it easy for newbs to find.)
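
One gotcha worth knowing: the cascade rides on ActiveRecord callbacks, so it only fires on destroy, not on delete:

widget = Widget.find(1)   # any Widget; the id is just for illustration
widget.destroy            # also destroys widget.whatevers
Widget.delete(1)          # skips callbacks - the whatevers would survive, orphaned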

What A Find!

Which would you rather type?
$ grep pattern $(find . -name '*.pl' -or -name '*.pm' -or -name '*.pod' | grep -v .svn)
$ ack --perl pattern

Saturday, November 10, 2007

Podcast: Evan Phoenix On Rubinius

Aqui

Lisp Programmers And Anarchists

I'm originally from Chicago. In Chicago, we say "The cops are the biggest gang." We say it like other people say "What goes up must come down." The interesting thing about travelling outside of Chicago is that you really don't hear this sentence every day anywhere else in the country. Tourists in Chicago always think this sentence is intended as a philosophical statement, or a metaphor. If they stay in town long enough, however, they learn sooner or later that there's no metaphor involved. The cops simply are the biggest, best-funded, best-organized gang in the city of Chicago.


Chicago

The gangsters who currently run Chicago, of course, had to fight some other gangsters to take it over first, and those other gangsters were pretty motivated and pretty capable. The history of Chicago is basically a history of gang wars.


Al Capone

I've always loved techno-anarchism, but in a "wouldn't that be great" / "isn't that cute" kind of way. Taking it seriously became impossible for me; growing up in Chicago made me immune to it. When I was younger I hadn't developed my immunity yet, and I was in England. My uncle told me about regions in Belfast where the police never went, because of hostility between the English and the Irish. Having just read a bunch of techno-anarchist stuff, and being pretty naive, I said something along the lines of, "Oh wow, no government - do they police themselves?" My uncle basically said, "Well you could say that, in a sense. The areas are run by gangsters."


Belfast

Belfast isn't the only city with urban areas ruled by gangsters because the police don't go there. Anybody who's ever listened to gangster rap or seen "Boyz N The Hood" is familiar with this phenomenon. The anarchist ideal of taking down social structures and systems doesn't really shine that brightly when you consider that we already have places without law throughout the world, and they're all horrible places to live. There's a relevant Russian saying, which basically goes, "Don't kill the king unless you know who's next in line for the throne."

Many rebellious people like to say that politicians are just gangsters. It's not true the way it's true of Chicago cops, but in many cases it holds a serious kernel of truth. It's certainly relevant when you're talking about the Bush family. But toppling the existing system, just because its rulers are criminals, doesn't necessarily make sense, because the fallout could bring on worse criminals. Don't do away with the cops unless you know who will take up patrolling the streets once they're gone. If you get rid of the boys in blue, you may just end up with new young men, also wearing blue.


Crips

Gangsters are the reality of life that anarchists consistently fail to recognize. They are an emergent phenomenon. If you have a society, groups of men in that society will organize among themselves to dominate others by violence and intimidation. It always happens. The anarchist idea, that tearing down society will do away with oppressive regimes, misses the fact that you can only ever do away with a specific oppressive regime.

Lisp programmers are to limiting programming language features as anarchists are to limiting social structures. Anarchists think that life will be better if you get rid of the politicians and the cops, never realizing that if you do, new ones will arise to take their place who could be substantially worse. Lisp programmers think that life will be better if you get rid of limiting programming language features and simply program in Lisp, never realizing that if you do, you will create your own limiting programming language features which might be worse.



The operative word here is might. Sometimes the Lisp programmers are right - sometimes if you operate in a freer context you write better programs. But consider Reddit's famous rewrite from Lisp to Python, undertaken because of the lack of available libraries. They wanted to avoid re-inventing the wheel. That wasn't just because re-inventing the wheel is time-consuming. It's also because re-inventing the wheel is error-prone. A healthy open source project has countless programmers fixing its bugs and working on its design. There's less debugging to do and less risk of painting yourself into a corner.

One of Paul Graham's Lisp books shows you how to create a system and syntax for object-oriented programming in about 20 lines of code. Building your own OO framework has got to be an incredibly educational experience - but using an untested OO framework still in development is only fun if you're working with really good people. And the incredible freedom of Lisp is probably the reason it doesn't appear to have the community traction or library support that even its fans want from it. Everybody knows the cliche joke "Anarchists unite!". It's got to be difficult getting programmers to agree on conventions and idioms when their language can do anything.

Lisp programmers like to say that every language is a subset of Lisp. But that's the whole point. Lisp gives you so much power it's difficult to rally a community around any one way of using that power. Just as anarchists say everybody has a problem with some social constraint, therefore let's do away with all of them, Lispers say everybody has a problem with some language constraint, therefore let's do away with all of them. But in either case, if you do that, you end up with very little, and sooner or later somebody has to establish constraints just to get anything done.


Black Rock City

There are anarchist environments, temporary autonomous zones where gangsters and politicians have little or no influence. It is deeply healthy to visit these environments and spend time there. However, living there year-round would spoil the magic. Likewise, the righteous anarchist who brings down a corrupt, evil king sometimes does the world a huge favor - but you can't rise up in revolution against every king, because you'd never get anything done.

If you're ever wondering why there aren't more jobs writing Lisp, this is why.

Developing In Silos Burns Cash And Costs You People

This is for forwarding to managers who are shooting themselves in the foot.

Recently I interviewed at a 100-year-old company. Four developers told me what they did, and it was evident that two of them worked on one area of their project, and the other two worked on a separate area. I asked them if they were worried about collective code ownership or re-inventing each other's wheels due to the isolation of their thinking, and they replied that they'd had a meeting about that very issue that morning. It was very much on their minds.

I mentioned my visit to this company to a friend of mine, a technical manager, and his first thought was, roughly, "You have to be careful with a nontechnical company with technical sections in it, because the company already has a system which makes certain people primary in terms of importance, and that system will dominate the technical sections of the company, even though it may be detrimental or dysfunctional for technical projects." To some extent, that's exactly what was happening: the developers told me that the reason they're operating in silos is that nontechnical people need single points of contact for particular projects, and they're mapping those single points of contact to "Bob works on Project X, Joe works on Project Y." As intuitively obvious as that is, it's also guaranteed to cost you money.

Programming is intricate and systematic. If two programmers at the same company do the same thing twice, it's virtually guaranteed that they'll do it in different ways. If those two programmers see each other's code, they'll recognize the repetition and look to eliminate it. The more systematic and less task-specific your code is, the easier it is to work with. Also, if two programmers see two ways to handle the same task, they'll want to systematize it, so one way of handling the task will lose and the other will win. Usually the best way to determine which of two approaches wins is to show both of them to as many programmers as possible. That's essentially why open source works so much better than closed systems.

This is also why programmers have evolved the practices of pair programming and code review. Pair programming and code review are very effective techniques which prevent the accumulation of technical debt. Programmers hate technical debt - if anything, more than managers do. Some kinds of programming work are fun and some kinds of programming work are horrible. Developing new stuff is fun. Debugging a badly-built system is horrible. Managers should avoid building bad systems because a badly-built system is essentially technical debt; anything you've built badly, you'll have to rebuild correctly at some point in the future. The time you save now you'll just have to spend further down the line, and just like financial debt, technical debt snowballs. But programmers avoid building bad systems because when that time comes, and management inevitably realizes it has to pay off its technical debt, programmers are going to have to debug a badly-built system, and that's horrible. Programmers like to avoid building stuff which will have to be debugged because they'll be the ones doing the debugging.

(This is also why building stuff with contractors, and then hiring permanent staff to keep it running, is a much riskier strategy than many people suspect. The normal incentive to avoid doing things which you'll have to fix later is that you'll be the one doing the fixing. When you aren't the one doing the fixing, and somebody else is, human nature means something slips through the cracks.)

I've seen technical debt created this way in teams as small as only three developers, but once you scale past six or seven, that's when the technical debt really starts piling up. That's when you end up with a crucial component everything else depends on, written by just one person, and you realize that if this person gets hired away or hit by a bus, your whole system falls apart overnight. That's the difference between agile and fragile. Although I don't think this company was in quite that much danger - it's not even necessarily a hard and fast rule - the point for companies generally is that this is the kind of thing you have to address, and the sooner you address it, the better.

The idea of "Bob works on X" is so intuitive and natural that the only way to prevent it from ossifying in that dangerous way is to nip it in the bud. I believe the solution is to implement pair programming as early as possible - ideally before the first line of code is ever written. The association of person to task comes from normal human behavior, not an org chart, and a natural, emergent dysfunction cannot be fixed by an org chart, even though, at some point in time, someone will try. It's like preventing heart disease. Nobody's found a dependable cure for heart disease, but preventing it is pretty simple: exercise and avoid cheeseburgers. The earlier you start that, the better it works. This fundamental approach to preventing toxic buildup works in organizations too. Make sure there's no hook for this natural human behavior to hang on, and some other, healthier natural human behavior will emerge.

Friday, November 9, 2007

Heroku

I'm a bit late with this, having been scooped by Ruby Inside even though I knew about it, but the founders of Bitscribe have a new company, Heroku, which looks to be quite nifty. Essentially it's a Web-based IDE and deployment system for Rails apps which incorporates an editor, log files, and the console, and could very possibly at some point support a debugger as well (since ubercoder Adam Wiggins, who wrote Gyre, is a key member of the startup).

Obviously this picks up a thread from Seaside's thinking and pursues it; I think both Seaside and Heroku are operating from the perspective that deployment is a logical thing to include in an IDE, and if the Web is truly the new development platform, then an IDE should run on the Web. The app's still in its early phases, and private beta is still on its way, according to Ruby Inside. However, I got to play with it because I know these guys, and it's pretty cool.

I don't think I can say too much about this next bit, but there's another company out there pursuing a similar line of thinking from a different angle. Heroku is really onto something. There was a popular idea in the late 90s that the Web should replace the desktop; destroying this movement was Internet Explorer's primary goal and raison d'etre. The fact that some people on the Web today might not even remember or recognize that idea shows that IE made some considerable headway there, but projects like Seaside and Heroku show you how it can be true. Heroku's cool now, but fast-forward a couple years and assume Rubinius performance catches up to Ruby performance; a Web-based Rails IDE starts to look very promising.

Wednesday, November 7, 2007

New Versions Of Prototype & Scriptaculous, And The Book

Lots of releases in JavaScript land.

Live Console: IRB Over TCP

Alpha software from Pete Elmore. A huge security risk and a really exciting possibility, all wrapped up in some version 0.1.0 goodness.
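
To get a feel for why it's both things at once, here's a toy sketch of the general idea - not Pete's actual code or API, just the concept of eval over a socket:

require 'socket'

# Toy illustration only: a server that evals whatever a TCP client sends it.
# Connect with telnet localhost 4000 and type Ruby; the danger is obvious.
server = TCPServer.new('127.0.0.1', 4000)
loop do
  client = server.accept
  while line = client.gets
    begin
      client.puts eval(line).inspect
    rescue Exception => e
      client.puts "#{e.class}: #{e.message}"
    end
  end
  client.close
end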

Translate Rails 2.0 Schemas Into 1.2.3

Here's some code which allows you to translate Rails 2 preview release schemas into Rails 1.2.3 schemas.

Obviously 2.0 is more exciting than 1.2.3 (or 1.2.5, which is really what you should be on). However, if you've got an app you want to build fast, you want to be on the stable version of the platform.

The code works for my situation, but if you use it, you'll need to accommodate the possibility of single quotes rather than double quotes, and maybe some similar things. Caveat emptor, don't try this at home, etc.
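
For the curious, the heart of the job is roughly this kind of line-by-line regex translation. This is my own rough sketch, not the linked code, and it shares the double-quotes-only caveat above:

# Rough sketch: turn a Rails 2.0 "sexy" schema line like
#   t.string "name", :limit => 40
# into a 1.2.3-style line like
#   t.column "name", :string, :limit => 40
def translate_line(line)
  return line if line =~ /t\.column/    # already old-style
  line.sub(/t\.(\w+)\s+("[^"]+")(.*)/) { "t.column #{$2}, :#{$1}#{$3}" }
end

translate_line('    t.string "name", :limit => 40')
# => '    t.column "name", :string, :limit => 40'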

As a side note, IRB and Ruby are so much better for developing regular expressions than the Perl workflow I'm used to that it almost boggles the mind. Go into IRB and do this:

/my regex/.match("my string")

If it doesn't work, arrow up, edit the command, and start again. Lather, rinse, repeat. If you're using Ben Bleything's history code, and my variation which sends the history buffer to vi, you're really set, because every time you come up with a complex regex that matches a complex string, you can call the history_to_vi method and save off the one-liner as a test case.

(I'll release my variations on Ben's code soon.)
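
For example, a throwaway session might look something like this (the log line and patterns are just made-up illustrations):

# in IRB: guess, check the result, arrow up, refine
line = "2007-11-07 12:34:56 widget exploded"

/\d+-\d+/.match(line)[0]
# => "2007-11"    -- not quite what I wanted; arrow up and tighten it

/(\d{4}-\d{2}-\d{2}) (\d{2}:\d{2}:\d{2})/.match(line).captures
# => ["2007-11-07", "12:34:56"]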

Murder Will Out

Tuesday, November 6, 2007

Screenwriter's Essay On Remix Culture

Like Larry Lessig says, what we have today with copyright and downloads is essentially the same thing as the Prohibition era, where virtually everyone violated the law regularly and systematically. Pretty much everybody knows the law on downloads is bullshit - but a powerful minority is keeping those laws in place, and pushing even crazier ones through Congress. We're all living as criminals, because a small number of very wealthy, ruthless, dishonest people have inordinate control over the law-making process. This creates a polarized situation, an "us vs. them" mentality.

I blogged recently about tech bloggers imagining screenwriters on the "them" side of that divide, instead of doing some research and finding out what's what. The truth is, many screenwriters understand and embrace remix culture.

Huh?



This happens once every five months or so. I'm never quite sure why. Rails is great and all, but that is just weird. How did it even find its way into my .irbrc in the first place?