Friday, March 30, 2007

Why I'm Not A Graphic Designer



A graphic design blog cites the "if it ain't broken, don't fix it" rule and identifies this "update" to the Dairy Queen logo as taking the crown for "the least broken with the worst fixin'." Unfortunately, they are so right. The new logo is less legible, less distinctive, less clearly identifiable from a distance, utterly disregards the simplicity and cohesiveness of the original, and contains colors which make no sense and operate solely as visual noise.

And the real problem is that these are technical terms. These will all directly correlate to weaker performance. Every one of these flaws means that people will be less certain what logo they're looking at. In that drive-by split-second where you might have seen the logo out of the corner of your eye and wanted some kind of ice-cream-like processed sweet goo, legibility, consistency, distinctiveness, all these things translate into sales.

I love doing graphic design, but I hate talking to graphic design clients, and the reason is that they always assume they know as much about graphic design as I do. This always makes me wonder why they don't just do it themselves. In Europe graphic design is a respected field, but here in America, the idea that a graphic designer has technical constraints to work within is completely foreign. The idea that a graphic designer is doing work with any technical dimension at all is totally unheard of, and this is why American graphic design is so incredibly bad.



Graphic designers are generally thought of as people who make stuff pretty, and their job is to make executives happy. But if you have a background in graphic design, if you've read things like Robert Bringhurst's Elements of Typographic Style, you know that graphic design is simply a technical and artistic field which American culture mishandles for some unknown reason. Good graphic design, in this country, is seen by many companies as an indulgent luxury, rather than something with genuine usefulness. (I myself am guilty of falling into this trap; take a look at my portfolio site and you'll see that I love design, and yet you'd never guess it from reading my blog.)

A little while ago I posted about how both working from home and deprecating Internet Explorer appear to be luxuries but are in fact powerful competitive advantages. With good graphic design, we have again a thing which companies generally regard as a luxury. But maybe there's a pattern here. Could it in fact be a powerful competitive advantage?



Of course it is. And this makes me wonder why I'm not a graphic designer. For years I've believed that the reason I'm not a graphic designer is because programming and graphic design are equally creative fields, but programmers are given the respect they deserve and designers aren't. As long as your code meets or exceeds requirements, people will be happy with it; whereas your graphic design can meet or exceed requirements, and you're still expected to jump through hoops. But making your design worse just to satisfy the client will in fact cost the client money in lost sales. (Don't believe me? Ask Zune. User experience is design too.)

I think the real reason I'm not a graphic designer is because I never figured out how to communicate this to my clients.

(The other reason, of course, is that it's hard to be two things at the same time, but that's another post.)

Thursday, March 29, 2007

Kathy Sierra Isn't Being Stalked

There's a fascinating and disturbing alternate take in Violet Blue's column for the SF Gate.

If I understand it correctly, this interpretation is that Kathy's not being stalked by an individual sexually repressed lunatic, but harassed by a group of men, and further, that these men are not expressing sexual intentions in a violent way, but expressing violent intentions in a sexual way.

To quote from the column, "we're not talking about a lack of social skills, we're talking about a desire to destroy." It's persecution, online mob violence, with the purpose of keeping a class system intact. It's as evil as evil gets.



Violet compares Kathy's harassment not to Jodie Foster's experience with the demented stalker who tried to kill Ronald Reagan, as I did the other day, but to a "sex-murder" of a transgendered teenager. I don't know what "sex-murder" exactly means, and I don't want to know, but even though the details are fuzzy, the picture is clear. Her point is that the sexual nature of the attacks isn't an expression of actual sexuality, but of gender politics. It's as much a deliberate intimidation tactic discouraging other women as it is any kind of thing involving Kathy specifically. Since high-tech is generally seen as a masculine thing, her attackers are enforcing some kind of gender barrier they feel she's violating. Like "How dare a woman write a blog that is better than anything I'll ever do in my life? Doesn't she know her place?" That kind of thing.

That doesn't jibe naturally and intuitively with my experience, but as a white male with an excellent education, if that jibed naturally and intuitively with my experience, that would be pretty weird. It does, unfortunately, resonate absolutely and incontrovertibly with what I understand of psychology, sociology, fundamental tribal behavior built into every human, and especially class structures. What I'm saying is, it feels weird, but I'm almost certain she's right.



There's actually some good news in this. Individual stalkers tend to be very single-minded and unrealistic, owing to the fact that psychosis and realism are such polar opposites. However, if this is a group of harassers hell-bent on intimidating women and keeping women out of high-tech, there's a much smaller chance of them attacking Kathy personally and physically. In fact if they're harassers rather than stalkers, attacking her physically would be less productive for them than continuing the online harassment, since the online harassment is more public, and more intimidating to women in general. Online harassment also takes much less effort, which makes it more likely that other women will generalize from it and see themselves as potential targets too.

Obviously, however, this "good news" is a pretty fucking mixed blessing. I think everybody should be glad that Violet's analysis indicates Kathy will probably be safe, physically, but Kathy's comments that she might never post again take on an entirely different tone in this context. It means that even though Kathy's safe, the evil bastards are winning anyway.



It really hammers home the need for a solution, and I think the solution is to de-geekify technology. Kathy's books defy the perception of what teaching high-tech is about, and yet they're more effective at teaching technical topics than books matching that perception. The people doing this to Kathy are obviously invested in that perception, if Violet's argument holds water -- and I believe it does -- so if we destroy that perception, we draw fire away from Kathy, and from women in general, at the same time as we reduce the attackers' power.

So the good news is we have a solution, and the bad news is it requires transforming an entire worldwide culture. So again we have a very mixed blessing on our hands.



A more immediate solution is to find these assholes, put them in jail, and make sure none of them ever gets hired doing any god damn thing even remotely related to technology. And to make sure Kathy keeps posting. If I was O'Reilly, I'd offer her kung-fu bodyguards at the next conference.



Seriously. Show those motherfuckers we aren't tolerating that shit. The picture's about right, but there's something missing.



There we go.

There's one other thing in that Violet Blue link, by the way. Violet says this situation is a situation confronting every woman out there who has a blog, and I absolutely believe her. She also says that neither she nor a friend of hers, both of whom have had experiences similar to Kathy's, ever made a big deal about it in their blogs or posted about being terrified. The weird thing is, she almost sounds macho about it:

The question is, Do we women need to portray ourselves as victims to garner support when men threaten to defile our corpses if we gain notoriety?

I'm not a woman, so I'm way out of my depth here, but for what it's worth, that sounds to me like a false dichotomy. There's portraying yourself as a victim, and there's not saying a single word, and these things are miles apart from each other. I don't know what the middle ground is, but I'm sure it exists. Tolerating that kind of thing in silence, accepting it as normal, that's not good for anyone.

Finally, although this is totally self-aggrandizing, there is something you can do about this. Link to my blog! Send lots of people here. The more people realize what this is really about, the better the chance that women will shrug it off like Violet, but tell people about it like Kathy.


Brand New You're Retro



Lisp and Smalltalk are sometimes thought of as the languages of the gods. They are strange. They are esoteric. They are pure.

They are JavaScript and Ruby, respectively.

Ruby's object structure and "metaprogramming" emphasis are so thoroughly Smalltalk that many Smalltalkers consider Ruby a Unix dialect of Smalltalk. (That's actually downplaying the Perl influence, but it's still a very relevant view.) And JavaScript is so Lispy that you can run through The Little Schemer in it and even, if you're crazy enough, implement a Scheme interpreter in it too.
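The Smalltalk inheritance is easy to demonstrate: in Ruby, classes are live objects you can inspect and reopen while the program runs. A quick sketch (the class and method names here are made up for illustration):

```ruby
# Classes are objects: Greeting is itself an instance of Class,
# and you can extend it at runtime.
class Greeting; end

puts Greeting.class  # prints "Class"

Greeting.class_eval do
  define_method(:hello) { |name| "hello, #{name}" }
end

puts Greeting.new.hello("world")  # prints "hello, world"
```

That's the Smalltalk worldview in nine lines: there's no compile-time wall between defining a class and using one.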

I'm resisting the temptation to say that "X is the new Y" as strenuously as I can, but it's a powerful temptation. Lisp and Smalltalk gurus are as famous for grudgingly refusing to ever admit defeat as they are for incredible feats of wizardry. But the reality is, if you look at the Web development landscape in 2007, Lisp and Smalltalk appear to be winning. Is this another bubble, tinier and yet stranger than the first one? The Lisp and Smalltalk in disguise bubble? Or is it something more familiar?



In the movie Aliens, there's a scene where the Marines enter the space colony, and there's one particular shot in that scene which is just a great shot of them moving down a hallway with some flashlights. Mind-numbingly simple, but you can't watch a science fiction movie made after 1986 without seeing a rehash of that moment. Even today, in 2007, there was a version of it in a recent episode of the new Battlestar Galactica.

People repeat stuff that works.

If you're working for a hip, Web 2.0 startup building stuff in Ruby on Rails with Ajax, you're very nearly working entirely in Lisp and Smalltalk. The all-in-one OS-ness of Smalltalk failed, and the fingernail-clippings-everywhere syntax of Lisp failed, but in every important respect, those languages are absolutely alive and well and living in Philadelphia.

Wednesday, March 28, 2007

Java Is Plato's Republic

Plato's Republic is heavily influenced by geometry, the big intellectual innovation of the time. Of course the major weirdness of geometry is that you can design a real object like a building, and base that design on an unreal thing, like a square or a circle or a triangle, and it'll work. So the big disconnect is that the logic of squares and circles and triangles is powerful and effective and works in a pragmatic sense, and yet no true perfect squares, circles, or triangles exist in real life.



So the solution to this is to posit that outside the universe there dwells a pure realm of math, where the real squares and triangles et cetera exist, and to say that our own world is just a shadowy reflection of that pure greatness.

And this philosophy has all kinds of consequences, some of them markedly non-beneficial, but that's another subject. The reason I bring all that up is that's Java.

Anybody who comes to Ruby from Java will sooner or later run into confusion about assigning instance variables to classes. You would think those instance variables would then become available to instances of the class, as class variables; in fact, from the point of view of an instance of a given class, an instance variable assigned to a class just disappears into a cloud of mystery.

The solution is that an instance variable assigned to a class isn't assigned to some abstract definition; it's assigned to the class object itself, which is an instance of Class. Ruby's object model is very closely based on Smalltalk's, where classes themselves are objects in the system, available for real-time modification by the programmer. This means that if you assign an instance variable to a class, it's available to methods within the class itself, but of course not to instances of the class, because in Ruby, a class isn't an abstract Platonic definition that sits outside the world itself. It's right there in the world with everything else, and an instance of it is just something which implements the definition it provides.
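A minimal illustration of the point (the class name is invented):

```ruby
class Counter
  @total = 0  # set on the Counter class object itself

  def self.total
    @total  # here self is Counter, so this sees the variable
  end

  def total
    @total  # here self is a Counter instance, a different object: nil
  end
end

Counter.total      # => 0
Counter.new.total  # => nil
```

Same `@total` syntax, two different objects. The variable lives on whichever object was `self` when it was assigned.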

See the thread on the ruby-talk mailing list about instance variables and ActiveRecord::Base for a more detailed discussion (and a nice examination of the Rails source code).

Tuesday, March 27, 2007

Seriously

What should tech bloggers do? I was looking forward to Kathy Sierra's ETech presentation, and I wasn't even going to ETech. I just figured I'd download the podcast. There's a Wild Wild West element to the Internet, but that's no excuse. Is this just an inevitable side effect of being a blogebrity? Is it just a more sinister version of the comment I received the other day about my haircut?

Hollywood celebrities are pretty silly. Are bloggers doing the same thing? Are we headed the same way? Is it all just an inevitable side effect of holding the attention of a large group of people, that some members of that group will be people whose attention it's better not to have?

What can we do about it? How do we know a horror movie isn't happening right now?

One solution, ironically, might be more exposure. Imagine if Kathy filled her home with webcams and her friends and fans were watching those webcams. They'd see if anything bad happened. Crazy stalker comes in through the window, 911 gets flooded with calls. Is the solution to a few creepy people watching you a large number of friendly people watching you? This is the idea of The Transparent Society. It's a fascinating book, but it's kind of scary having such a compelling reason to think about the issues it raises.



I think it'd be very useful to have semi-public webcams. This is sci-fi, but imagine a social networking site where you had webcam feeds from all over your house, shared only with people you trust. Virtual shared space. Not just for women terrorized by evil stalkers; it'd also be useful for parents who want to keep an eye on their children, or senior citizens who want to live independently but not without backup. Useful for anyone, really, with fundamental human tribal instincts.

I doubt that'll make Kathy feel better, but there has to be some kind of solution.

Celibacy Considered Harmful

Kathy Sierra is massively freaked out. My initial response was, oh, it's just some jackoff posting bullshit, but having re-read the article and given it some thought, if I were her, I'd have a friend bring me a gun, and I'd get a few nice big dogs. Like Rottweilers, or German shepherds, or maybe just a trained bear. That's some scary fucking shit.



I've said obnoxious things on the Web -- to be provocative, to get attention, to piss people off, to deliberately push well past the boundaries of good taste. Usually I was either kidding or randomly grumpy because I hadn't had my morning coffee yet. Many people have done the same thing. That's part of the Internet. What's going on at Kathy's blog is way past that. The stuff posted is weird enough that you could definitely consider it utterly sick in the head and devoid of any sense of humor, and it looks as if it's a consistent pattern rather than a random outburst.

South by Southwest recently had a panel on "the rise of the blogebrity," the neologism of course meaning celebrity blogger, or blog celebrity, or whatever. If there's anything which validates that notion, it's the first blogger targeted by some weird-ass stalker.



Stalking is strange and terrifying. It appears to be a totally modern phenomenon, but if there's one thing I learned in high school, it's that any time you encounter what appears to be an entirely modern phenomenon, there usually exists some parallel in the literature of the Roman Empire or Ancient Greece. (If there are only two things I learned in high school, number one is that whole historical parallels thing, and number two is how to make a bong out of a plastic bottle and some tinfoil. I had a very broad education. Let's move on.)

What's weird to me is that there don't appear to be historical parallels, which makes me think that what's really different is that this society protects women from stalkers, whereas previous societies tolerated it.

At the risk of sounding totally insane, and because I am Giles Bowkett, and this is how I think, I'd like to suggest a solution to stalking. I believe, in fact, that I could eliminate this problem once and for all, throughout the entire world, forever.

Step one, legalize prostitution. Step two, make paying for sex not only legal -- but required by law. Make it illegal to go longer than a certain length of time without getting laid. (Have medical professionals determine what that length of time should be.)



Why will this end stalking once and forever? Simple. Stalking is usually driven by sexual obsession. Most stalkers are male; most victims of stalking are female. One of my favorite ideas is the idea that any pattern of dangerous behavior can be viewed as a public health problem. All men become more aggressive when they don't get laid. That's biological. But the difference between a man who turns into a stalker due to sexual repression and the man who just deals with it, that's a purely psychological component which must be extraordinarily difficult to predict. The long and the short of it is that sexual repression is a serious public health problem, one that leads to deranged and dangerous behavior, and the easiest way to cure sexual repression is to have sex a lot. So my solution is really just common sense. If going without sex for longer than X amount of time was illegal -- X being whichever minimum health professionals conclude is necessary -- then the courts could just sentence law-breaking celibates to mandatory court-ordered fucking at the hands of a professional, and the sexual frustration, repression, and obsession which would have made those men dangerous would never even happen. The world would be a safer place.

Your tax dollars, hard at work.

Of course, as usual, the opponents of this plan would be the same pack of lunatic assholes who stand in the way of anybody who wants to make the world a safer place. I refer, of course, to the Religious Right. But screw those guys. They're all snorting meth and boning gay hookers anyway. Seriously, it's time to make the world a better place. Legalize prostitution and criminalize celibacy. It's the only way to end stalking forever.

Monday, March 26, 2007

A Conversation With Reg Braithwaite

I had an interesting conversation with Raganwald.

He sent me an e-mail with the subject "Bram's Law explains the FizzBuzz phenomenon." It pointed me to a blog post explaining Bram's Law:

The easier a piece of software is to write, the worse it's implemented in practice.

I replied that this has always been my fear for Rails; that in five to ten years, the worst jobs you could get will be Rails jobs where you're maintaining stuff built by non-programmers who figured Rails made programming so easy that they didn't really need to understand what they were doing.

The conversation then moved to Avi Bryant's post on onions, and the questions it raises -- especially the question of to what extent Rails is simply doing onions better than any other onions framework out there. And then I kind of went off on a tangent:

We had a pretty interesting workshop here the other day. I was supposed to just give my "HREF Considered Harmful" presentation, the one from the screencast, but the projector was all screwed up and I ended up having to just make stuff up and write it on a whiteboard. Some of the stuff I made up didn't really make sense, so I got challenged on it. One person was like, couldn't you just emulate Seaside's componentized nature by replacing Rails controllers and views with Builder templates containing objects with built-in render methods which call other Builder templates? And the thing was, it'd work, in fact you'd basically get everything but continuations, but the question was whether it'd be worth it if you didn't get Seaside's session management, and whether the session stuff would or wouldn't be a nightmare in any language but Smalltalk.

The idea was that Rails' template system is a big fat smelly onion. We ended up with a design that would probably work much better than Rails, in fact, for Seaside-style development, and still have all the advantages of Ruby over Squeak -- easier DB/Unix integration, more developers, etc. Although the flipside to Bram's law is that any time a really great language is used to solve a moronic problem you suddenly get to choose from ridiculously qualified developers to solve that problem. The Python paradox.

But the flipside to the onions thing is that he also mentioned meaningful params as an onion. Some people have told me they think Avi Bryant doesn't understand the Web, and I kind of suspect it was because when you throw away meaningful params, you destroy interoperability. That's why the Rails REST stuff is all about cooking APIs directly into everything you code. Everybody loves mashups, and certainly I was using a Google Maps mashup to predict traffic on the LA freeways before Google itself added that feature. It was a win for Google -- free R&D if it implements the idea -- and a win for every person in LA who ever saw it either way. On the other hand, I heard one of the Flickr people explaining that the downside of their APIs is that they have to be able to accommodate really bad programmers pinging their servers a thousand times a day simply because those programmers don't know any better.

Now I have to admit, this "framework" which would be better than Rails for Seaside-style development, obviously that's a bold statement, and obviously this thing is vaporware in the extreme. So take it with a grain of salt. Please take it with a grain of salt, because the last thing in the world I want is an army of angry recovering PHP junkies descending on me and explaining why Rails is the best thing EVAR. I know. Rails is awesome. It's beautiful.

And forgive my hubris; if it's any consolation, I stole the basic idea from a Rails programmer who was dissing Seaside (ironically enough) in a comment that was posted on Ramon Leon's blog.

However, there's one place where it's very easy to see the benefits of Seaside's componentized, templateless style. Picture a complex enterprise application which a large corporation uses to manage logistics like shipping, billing and accounting. Say this company wants to present, to its clients, a subset of the information available throughout this huge sprawling app. You can do it with things like render :partial and render :template -- but components are more elegant for this sort of thing. And this isn't an academic example; it was a task we'd faced and for which components would have been nice to have.

One interesting and terribly tempting misuse of reflection might be to write objects which had to_html methods, which consisted of render :template calls generated dynamically from the names of the object and action called. This would move code out of the templates and back into objects, even when that code couldn't be contained in any one model, kind of like Jay Fields' Presenter layer. But is it worth doing? The answer depends on whether Rails' template system really is a big fat smelly onion or not. Are there advantages to a template architecture that I'm unaware of? There might be. I don't know.

Saturday, March 24, 2007

Why I've Got Google Ads

I set up Google Ads on my blog. Not because I expect to make any money off it, but because I want some kind of metric for how many readers I have, and I want that metric to be extremely easy to use. The theory is flawed, however, because it doesn't measure RSS/Atom readers.

Everything Old Is New Again



Rails is the new HTML.
Merb is the new Camping.
DHH is the new Marc Andreessen.
Seaside is the new Rails.
The Newton is the new Amiga.
Terrorists are the new drug dealers.
(Drug dealers were the new Russians.)
Arnold Schwarzenegger is the new Ronald Reagan.
George Bush is the new George Bush.
Iraq is the new Vietnam.
Ecstasy is the new acid.
Google is the new Microsoft.
Microsoft is the new IBM.
Haskell is the new Lisp.
lambda {|a|a+1} is the new print "Hello World";.
BitTorrent is the new Napster.
Mexicans are the new Irish.
(The Irish were the new Italians.)
Detroit is the new Mexico.
New Mexico is the new Nevada.
Vancouver is the new Amsterdam.
(Amsterdam was the new Tijuana.)
Sean Parker is the new Jim Clark. (Not really.)
Rap is the new country.
LEDs are the new Lava Lamps.
boingboing.net is the new Wired.
ActiveRecord is the new garbage collection.
Myspace is the new clubbing.
John Woo is the new Raymond Chandler.
Parkour is the new Hong Kong.
Katamari Damacy is the new REM (pre-fame).
Digg is the new Slashdot.
Lightning In A Bottle is the new Xara Dulzura.

Thursday, March 22, 2007

Can Your Rails App Write Itself?

First off, this post is for wizards who like mischief. You shouldn't even consider implementing the ideas in this post unless you can name twenty or thirty really good reasons not to. You have to be this tall to ride.



That being said, there's a great old post from last year which I happened across the other day. It reminded me of some slightly wizardly and slightly mischievous code I wrote around Novemberish. In the post, Jake Howerton creates a CRUDController which your controllers can inherit from; when they do, they get all the standard Rails CRUD actions for free, the same ones that are autogenerated by script/generate controller, the same ones you see over and over again in many different files in every single Rails app on the planet. Is that DRY? Of course not. And if you use this CRUDController, you get to move all that stuff to one place, and you never have to type def save or def edit in a Rails controller ever again.

However, Howerton gets this to happen through a combination of the usual Rails "convention over configuration" thinking and some instance_variable_get() cleverness. There's nothing wrong with that; in fact David Black says that making a distinction between "metaprogramming" and plain ol' programming is not only unnecessary in Ruby, it's more trouble than it's worth. But even though there's nothing wrong with it, it's still the type of thing that makes certain people very nervous.
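To make the mechanism concrete, here's a framework-free toy reconstruction of the trick -- this is my own sketch, not Howerton's actual code, and every name in it (CrudBase, Store, PostsController) is invented. The base class derives a model name from its own class name, then stashes the records in a conventionally named instance variable, the way hand-written Rails actions do:

```ruby
# Toy in-memory "model" layer standing in for ActiveRecord.
class Store
  RECORDS = Hash.new { |hash, key| hash[key] = [] }
  def self.all(kind); RECORDS[kind]; end
end

class CrudBase
  # "PostsController" -> "Post", by convention over configuration.
  def self.model_name
    name.sub(/Controller$/, "").sub(/s$/, "")
  end

  # One shared index action, instead of the same def index in every file.
  def index
    model = self.class.model_name
    instance_variable_set("@#{model.downcase}s", Store.all(model))
  end
end

class PostsController < CrudBase; end

Store.all("Post") << "hello"
controller = PostsController.new
controller.index
controller.instance_variable_get("@posts")  # => ["hello"]
```

Subclasses are empty; the class name alone tells the base class which records to fetch and what to call the variable.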



The first argument usually raised against this type of programming is that it's unmaintainable, because nobody but a wizard will ever even be able to figure out what it does. Anybody who's ever looked at the source for Seaside or read Paul Graham's praise for Lisp macros knows the answer to that one. You isolate the code which a novice couldn't maintain somewhere a novice would never find it -- somewhere a novice would only ever even go if they were deliberately looking for something which would challenge them and push them past their novice status. It's pretty easy to maintain wizardly code. Just make it interesting or lucrative enough that a wizard will care, and then slap a big "Do Not Enter" sign on it. This will keep the cowardly away and draw wizards like a magnet.

The second argument raised against this style is that it's harder to debug. That one's harder to tackle, and I don't want to write a novel here. Just realize that all the usual tropes of unadventurous programming apply here -- all the stuff that Java programmers are so afraid to do that it makes you wonder if every Java programmer thinks every other Java programmer is stupid. Some of these concerns are foolish and some of them have merit, but none of them have ever really convinced me, because none of them have ever been strong enough to counteract the two really powerful arguments for coding this way:

1. It's fun.

2. Typing the same thing twice makes you feel like a monkey.



So I'm definitely very much in favor of this CRUDController thing. As for my own code, the code I mentioned earlier, which similarly revolves around instance_variable_get(): it's a controller with a view which allows you to create a Ferret search method for any particular attribute on a model, just by creating a partial named after the attribute in the views dir named after the model.
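The original code isn't reproduced here, so here's a rough, Rails-free sketch of the shape of the mechanism, with the Ferret call stubbed out. Every name below (ProductSearchController, search_by_name, the view path) is invented for illustration:

```ruby
require "fileutils"
require "tmpdir"

# Stand-in for a Rails view directory named after the model.
VIEW_DIR = File.join(Dir.mktmpdir, "products")
FileUtils.mkdir_p(VIEW_DIR)
File.write(File.join(VIEW_DIR, "_name.html.erb"), "")

class ProductSearchController
  # Define one search method per partial found in the view directory;
  # the partial's filename names the searchable attribute.
  def self.install_search_actions(view_dir)
    Dir.glob(File.join(view_dir, "_*.html.erb")).each do |partial|
      attribute = File.basename(partial, ".html.erb").sub(/\A_/, "")
      define_method("search_by_#{attribute}") do |query|
        # The real version would hit the Ferret index here; this stub
        # just records the query so the mechanism is visible.
        instance_variable_set("@#{attribute}_query", query)
      end
    end
  end
end

ProductSearchController.install_search_actions(VIEW_DIR)
controller = ProductSearchController.new
controller.search_by_name("chocolates")
controller.instance_variable_get("@name_query")  # => "chocolates"
```

Drop a new partial in the directory, and the corresponding search method exists the next time the actions are installed; nobody touches the controller.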

The interesting thing about this code is that I wrote it for a nonwizardly client who's more of a business guy than a coder. However, he got the idea of it very quickly, and it turned out to be very handy during a deadline-oriented sort of sprint of work -- the type of last-minute late-night sleepless rush which agile software development abhors and ad agencies embrace wholeheartedly.

It turned out very handy because I was able to set up a different type of functionality in the app in the same dynamic way, essentially creating an API for my client which he could use just by creating view files. Ruby and Rails have got to be the only combination I've ever seen where you can get a programmer coding an API and a business guy using it all in the space of a few minutes. Even the Lispers and the Smalltalkers can't claim that, because you'll never find a business guy coding Lisp or Smalltalk.

Although my client ended up switching from Ferret to HyperEstraier due to issues with Ferret's indexing, this code was very, very useful for this particular situation, and additionally is one of the cleanest, most concise, most elegant things I've ever written. I posted about it on the ruby-talk mailing list, saying something like "I love this code so much that sometimes I buy it chocolates and sing to it." I was exaggerating, but honestly, not by much.

I'm posting it here, but if you don't like it, don't tell me. It is a thing of beauty. But the point is not my Irish temper or my prima donna attitude. The point is not even the conciseness of this lovely code. The point is that the number one argument against writing code this way is that it allegedly creates obstacles for less skilled programmers; and yet the single most useful experience I've had with this style of coding was when it proved immensely helpful for a client programmer who was really just a client -- a business guy with the brains to write code, but neither the personality type nor the training of a dyed-in-the-wool hacker. This client should have been the poster child for not coding this way, if the arguments against this style have any merit, but in reality it was very useful for him.

So go for it. Use CRUDController. Use eval() and instance_variable_set(), and not even because it's a good idea. Do it because you can. The big open secret of software development is that good programmers aren't engineers at all -- we're artists, and artists who aren't living on the edge aren't living at all.



Don't go crazy, though.

Saturday, March 17, 2007

Quicksilver Trick

Hopefully everybody knows that you can combine Quicksilver with AppleScript to do nearly anything. What's less obvious, but potentially more useful, is that you can combine Quicksilver with shell scripting really, really easily. All you do is put your shell script in your Applications directory, and you can access it from Quicksilver.

For some reason, with the shell script I learned this on, I have to hit Cmd-Space, "a" for "Applications," the right arrow to see the list, and then "j" (the first letter in the script's name). I think that's because I didn't actually put the script in Applications; I put it in /opt/local/bin and put a symlink in Applications. So Quicksilver doesn't get this exactly right -- I can't just type the script's name -- but it's still four keystrokes, where before I had to open the Terminal.

And why should you have to open the Terminal just to run a shell script? With Quicksilver, you don't. This particular shell script, it's written in Ruby. It's for my personal journal, on my laptop. It creates a filename based on the date and then opens the file with TextMate, using TextMate's mate shell command. So if I've written in my journal earlier that day, I start right where I left off, and if I haven't, the file is automatically created. So in a sense, it isn't really even a shell script; it's using Quicksilver and Unix for TextMate automation. Even though the functionality is enabled by a shell script, it's really got nothing to do with the shell itself at all.
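A sketch of the script -- the journal directory and the filename format here are stand-ins, not necessarily the real details:

```ruby
#!/usr/bin/env ruby
# Journal launcher: one file per day, opened in TextMate.
# The directory and filename format are assumptions for illustration.
require 'fileutils'

def journal_entry(dir = "~/journal")
  File.join(File.expand_path(dir), Time.now.strftime("%Y-%m-%d") + ".txt")
end

entry = journal_entry
FileUtils.mkdir_p(File.dirname(entry))   # make sure the journal dir exists
FileUtils.touch(entry)                   # create today's file if it's new
system("mate", entry)                    # open it via TextMate's shell command
```

Drop that (or a symlink to it) in Applications and Quicksilver picks it up like any other app.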

This is why I switched back to the Mac -- and wish I'd done it years ago. Being able to integrate custom Unix programming in the classic sense with the kickass software available for OS X is an unbelievably excellent combination.

Thursday, March 15, 2007

The Business Case For Firefox



The incredibly good software development book Peopleware documents something extraordinary: hard numerical evidence that normal office environments have a powerful negative effect on productivity, both as measured by lines of code written and by number of defects. Cubicles considered harmful. While the anecdotal evidence which the open source movement provides in favor of this conclusion is obvious, Peopleware takes it to the next level by providing actual proof. It forces you to realize, this thing which conventional thinking would label a self-indulgent luxury is no luxury at all: it's a powerful competitive advantage.

At my job, we're fortunate enough to have the luxury of requiring our clients, in some cases, to use Firefox, and refusing to support Internet Explorer at all. But the reality is, it isn't a luxury either. This, also, is a powerful competitive advantage -- and more so for our clients than for us.

Here's why. Many of our clients need us to develop enterprise software for them; software they use to run their businesses. It's not at all unrealistic to imagine this software still running a decade from now. So on the one hand, you have a standards-compliant browser which will remain backwards-compatible for as long as Web standards themselves do. Ten years probably fits within that time frame. And on the other hand, you have a notoriously quirky browser which may some day become standards-compliant, and which will remain backwards-compatible for only as long as Microsoft feels like it. And remember, Microsoft has the single worst track record for security and effortless upgrade paths of any software company still in business.



We charge our clients extra for Internet Explorer compatibility. This makes it obvious in the short term that there's a cost associated with Internet Explorer. But as awesome as it is, there's a downside, and the downside is, charging extra for Explorer compatibility is easily dismissed as developer caprice -- as a quirk, as a necessary evil or irritating side effect when you're dealing with a bunch of strange people who care more about science fiction than business reality. (Remember, that's often what Joe Suit sees when he looks at a software developer.)

The way to translate things for Joe Suit is to get him to do the math. Your initial expense to build the software is going to be X; your ongoing expense to add new features falls in the range of Y. However, if you're asking us to develop Microsoft-specific code, your upfront costs will increase by alpha; and if you're maintaining that code, your ongoing costs will increase by beta. Be honest: make alpha substantial but not outrageous, and beta is going to be huge. First, because any developer in their right mind hates Explorer, and should be compensated simply for going near that toilet in the first place; second, because bad design costs money. Maintaining for Firefox is simply less work than maintaining for Explorer. Less work means less cost.
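To make this concrete, here's the math with entirely hypothetical numbers -- every figure below is invented for illustration:

```ruby
# Every number here is hypothetical -- plug in your own estimates.
x     = 100_000   # X: initial build cost
y     = 20_000    # Y: ongoing yearly cost of new features
alpha = 15_000    # upfront premium for Explorer-specific code
beta  = 10_000    # yearly premium for maintaining Explorer code
years = 10        # enterprise software sticks around

firefox_total  = x + years * y
explorer_total = (x + alpha) + years * (y + beta)

puts explorer_total - firefox_total   # the Explorer premium: 115000
```

Even with a modest beta, the yearly premium dominates over a decade -- which is exactly the point you want Joe Suit to see.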

Also, be sure to indicate that your estimate is much more uncertain with Explorer than it is with Firefox, because Microsoft has total control over Explorer and can do any crazy thing they want at any given moment. They have a very consistent track record, when it comes to Explorer version upgrades, of breaking sites that worked perfectly well under previous versions. Under those circumstances, it's hard to say how much more expensive maintaining for Explorer will be; all you know for sure is that it'll cost more. Possibly a lot more.

Joe Suit is used to the idea that Microsoft is a no-brainer over the long term, but it's not a no-brainer in this case. Even though the words are almost identical, there's a huge difference between a decision which is a no-brainer and a decision which is brainless. Developing for a platform Microsoft controls made sense in the world of Windows, but in the case of Explorer, it's nothing but madness, and I say this not only from a developer perspective, but from a business perspective as well. Your long-term costs go through the roof and the security of your investment disappears.



It's that simple. If your clients don't know that, well, that's why they hire consultants in the first place. Educate them, show them how much money they'll save by switching to Firefox, and they'll say, "wow, you saved us a ton of money already, and this is just the sales meeting."

Tuesday, March 13, 2007

Is SQL Manual Memory Management?



One of the huge advantages to programming in languages like Ruby, Perl, Lisp, Python, Smalltalk, Java, and JavaScript, among many others, is that you never need to do manual memory management. You don't reference and dereference pointers; you really don't even need any understanding of how memory works internally at all.

This is a stark contrast to the experience of programmers from the C/C++ generation, as well as (obviously) anybody who works with those languages today. I've never learned a thing about pointers in my life, except during a brief exploration of C and C++ out of curiosity, inspired by friends who worked in video games. It's never been necessary knowledge. Garbage collection is so pervasive today that many, many programmers have never written a line of code without it.

One of the most awesome things about Rails is ActiveRecord, and the near-seamless experience of writing all your migrations in Ruby, so that you can easily code tons of different complete, fully-fledged applications without ever even seeing a single line of SQL. This prompts the question, is SQL essentially the same thing as manual memory management?



Anybody who's ever wrestled with SQL -- especially anybody who had to do it by hand in Perl in the mid-90s, before Perl even had its DBI library -- is going to find the idea that auto-generating SQL is the same thing as garbage collection very appealing and intuitive, because the obvious implication is that SQL is, in some sense, garbage. The counterargument is that there are still plenty of cases, even in Rails apps, where you need all sorts of direct access to the database, because there are so many advanced database features that Rails doesn't even make the slightest effort to address. But this counterargument just gives credibility to the main idea -- I'm sure that when C reigned supreme, there were a lot of limitations to using garbage-collected languages, and that these languages had disadvantages which forced even their biggest fans to use C for certain things, until the garbage collection got good enough that it was no longer necessary.

This is kind of programming language sci-fi, but maybe one day people will look back on Rails not as the best Web framework ever, but as the first SQL-collected language. Maybe one day the idea that database management should be part of the language will seem as obvious and basic as the idea that programmers shouldn't have to do memory management by hand. Maybe a language which doesn't handle SQL for you is like an operating system with no sockets model. Maybe an object which can't write itself to the database is a primitive thing. (For that matter, maybe an object which can't generate a URL to itself is a backward thing too.)

I don't know, but it's an interesting idea.





Update: maybe I have used pointers after all.

Sunday, March 11, 2007

This Blog Is A Non-FizzBuzz Zone

It made me too grumpy.

Unit Testing FizzBuzz? Are You Out Of Your Fucking Mind?



One blogger cooked up an implementation of FizzBuzz in Java which gives you the option of changing everything -- the rules, the terms displayed in the text, the language, everything.

My FizzBuzz was intended to be sarcastic, but this should have been the real target of my sarcasm. My solution is something like fifteen lines of code. This solution comes in a .zip file with documentation, DEFAULT_XYZ variables, and unit tests. The comments are more than fifteen lines!

This solution does not have any significant advantage in flexibility over my own quick joke version, and its absurd, painfully detailed documentation is infinitely less clear. Not only that, the "one little flaw" that the author writes off as inconsequential is that you can't change the numbers FizzBuzz alters its output for. That is to say, if you want to do a FizzBuzz that puts out new strings on multiples of 7 or 2, it just can't be done without dismantling the entire system. Except the whole point of FizzBuzz, in reality, is to teach children to recognize numeric patterns in sequences. The numbers involved are the only interesting thing. And a system this overblown takes a hell of a long time to dismantle.

Honestly, you have to wonder. This FizzBuzz is so nightmarishly awful, such a monument to useless busywork, and the blogger's reasoning so flat-out bad, I'm tempted to think the whole thing's a fake -- a troll from some Rails hacker who wants to revive the "Ruby vs. Java" blog war from last year.

I really shouldn't post before breakfast. I'm always grumpy before breakfast. Last time I posted before breakfast, I got myself kicked off a mailing list. I'm sure when I go back and look at this in a less fiendish frame of mind, I'll see advantages to this approach. On the other hand, I posted a little while back that hiring somebody who's willing to code FizzBuzz is as bad an idea as hiring somebody who's unable to code FizzBuzz, and this is pretty much exactly what I was talking about.

Life is short, and this is as true for business models as it is for human beings. Don't believe me? Ask Microsoft. Or better yet, wait five years, and then ask Microsoft, and watch how quickly those five years went. Ask Microsoft what they spent those five years doing while their business model died its death.



You need to be flexible and adaptable in business. This Java solution packs layers of bureaucracy onto what might be the simplest programming problem in the world. Layers of bureaucracy have the functional result of impeding change -- consider how with my solution you can change anything, whereas the Java one supports some changes and not others, and the developer's assumptions about what changes may or may not be necessary shape the eventual range of what changes are possible.

But the sociological result of layers of bureaucracy is the real danger here. Sociology is much more important to software development than people generally realize. The sociological effect of Java, in this case, is encouraging conformity and limiting imagination. The reason is simple: conformity and limited imagination are necessary for the kind of person willing to go through all that effort just to solve FizzBuzz, and cover every reasonable variation. And the key there is not the thoroughness -- thoroughness is admirable. The key is the term "reasonable." The term "reasonable" is an utterly dangerous term. It is by nature a value judgement, and a conservative value judgement. "Reasonable" and "innovative" are not synonyms. Think of all the unreasonable features on the iPod. You can't even change the battery! No reasonable person would ever have designed that thing, which is why one of the most famously unreasonable people in the world is taking over the music industry.



Anybody who's ever seen a software project fail or even falter knows that if you cover every reasonable variation, you're going to build a whole lot of options that your client or users will never ever use or even see, wasting time and money in the process, while at the same time building something which can bend at every join except the one they need to flex. You shouldn't cover reasonable variations. You shouldn't cover any variations. You should code the quickest, simplest solution you possibly can, even if it looks stupid as fuck, and then if your client or your users need some change, you should choose the way to do it which requires the absolute minimum of work.

Everybody pretends they already know this, but if you really mean it, you have to ultimately realize that spinning your wheels writing unit tests for FizzBuzz is not good training for the process of writing great software. Everything you do shapes your mind. I have professional training in hypnosis and I assure you, literally everything you do shapes your mind to some extent. If you're going out there and practicing something as reasonable, safe, and useless as writing tests for FizzBuzz, the next time you sit in front of a computer, you'll do something reasonable, safe, and useless. The powerful pattern-recognition machine that is your subconscious mind will say, "a-ha! I'm in front of a computer. That means it's time to write unit tests for FizzBuzz!"

Don't do it! It's bad!

The point of unit testing isn't covering your ass. It's making code succinct. Unit testing is a design tool. Unit testing FizzBuzz is like drawing up blueprints when you need to make a sandwich.



Saturday, March 10, 2007

Shave That Yak



If you want to indulge your weakness for yak-shaving, you could start here. Wirble provides kick-ass enhancements to irb, including history, in the sense of the shell command history -- but it doesn't actually include a history command. It's utterly effortless to add one, however.

Here it is as a one-liner:

pp Readline::HISTORY.to_a

All you have to do to have a history command in irb is add that to your .irbrc file, located in your home directory, as a method called history. Unfortunately this yak is still kind of rough in patches; you'll get a bunch of quote marks and some additional noise in the output. Also, pp breaks Wirble's syntax coloring, and that's harder to fix than I'd like; pp is actually implemented directly with a stream, and getting to the string inside is harder than it should be.
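So your .irbrc ends up with something like this -- a minimal version, which doesn't clean up the output noise I just mentioned (the requires are redundant inside irb, where both libraries are already loaded):

```ruby
# ~/.irbrc -- a minimal irb history command
require 'readline'
require 'pp'

def history
  pp Readline::HISTORY.to_a
end
```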



More importantly, I haven't yet been able to define shell variables in irb, like !! and !$ in the shell, which give you the last command and the last argument in the last command, respectively. I'm so used to these in the shell that lacking them on the irb command line seriously throws me for a loop, and rather than getting used to it, I'd like to add them to irb (and thereby, of course, to script/console as well, which has to be the best, most underrated bit of brilliance in all of Rails).

The first problem is you can't actually name the methods !! or !$; you have to settle for h! and h?, but that's not a big deal. The trivial implementation is effortless; you use the code above for history and wrap that code in an eval(). Unfortunately, this simple-minded approach will kill your irb completely if you try to do it twice in a row, since the second time, it'll be eval()ing an eval(), and then you're trapped until you hit control-C.
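For the record, here's the naive version -- a sketch only, with the double-eval() trap I just described built right in:

```ruby
# ~/.irbrc -- naive "repeat last statement." Sketch only.
# HISTORY[-1] is the h! call itself, so we eval the line before it.
# Run h! twice in a row and HISTORY[-2] is "h!" itself, so it
# recurses on itself until you hit control-C.
require 'readline'

def h!
  eval Readline::HISTORY.to_a[-2]
end
```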

The other thing is that I've been working on slightly the wrong problem. The last argument from your last statement in script/console, that's useful; but the functionality provided in the shell by backticks, which gives you the output of any given command, is very very handy in combination with !!. It would be better to be able to use that. I often use `!!` to mean "the thing I got back a second ago," and that kind of shorthand -- simply "the output you just returned" -- would be very handy in Rails console. It ought to be easy to get to, but I might have to patch irb itself in my ~/.irbrc to do it. I'm very much not sure if this is correct, but it appears that irb, even though it provides the return value of every operation (unless you tell it not to), does not actually capture that return value.

Anyway, this is my progress in the world of shaving that irb and script/console yak -- hopefully I'll have further progress to report at some point.

Friday, March 9, 2007

Sarcastic FizzBuzz



I caved in, despite my instincts, and coded an implementation of FizzBuzz.

It's kind of sarcastic. Here's the code as a text file.

I hate to admit it, but coding this was kind of fun. I got the idea from a comment I made. I was kidding in the comment, but then I thought, it'd be so easy, why not.

It's kind of intended as a joke, the implementation being worthy of the problem in an ironic sense. I think it'd probably be entertaining to have a contest to cook up the least advisable, most foolish, most overblown, most heinously wrong implementation of FizzBuzz possible. This would be a good entry, although not necessarily a winning entry. Altering Fixnum can mess with your maintainability in scary ways, but it could probably be even worse if it were done in Java by executing Unix processes and capturing the output. Especially if the Unix processes used python -c as a calculator, and used Python's explicit relative pathname on a particular box.

Thursday, March 8, 2007

College Considered Harmful



I heard a rumor that universities are now putting FizzBuzz on their mid-term exams.

Anybody who wants to be a programmer, you should probably avoid studying programming in college for the same reason models avoid cheeseburgers (as Paul Graham said).

In fact, you might as well just go the whole hog and avoid college, period.

Emergent Sociology ~= Chaos Math



Books like the amazing Peopleware often put forth the idea that to manage software development, you need to understand sociology. This often produces resistance in developers, which is ironic, because anybody who ever writes any piece of software must manage their own software development. But sociology is a "soft science," a touchy-feely discipline where bullshit artists make stuff up and cook up phony studies to back up their personal generalizations -- this, at least, is the stereotype which often lurks at the back of a developer's mind. It's seen as being as arbitrary, speculative, and disingenuous as psychology, just operating on a group scale rather than an individual one.

An interesting possibility, however, is that certain elements of sociology are not actually derived from psychology at all -- but from mathematics. My own suspicion is that certain elements of sociology, which is to say, certain elements of the patterns of behavior of people in groups, are simply intrinsic not to the psychology of humans as they relate to one another, but to the mathematics of systems of independent agents operating in concert.

My belief in this comes from the fact that certain unanticipated sociological phenomena, such as patterns of etiquette, emerge when robotics researchers design robots to operate in groups. Unfortunately I saw this years ago and have lost track of the link. Google wasn't any help. But this is interesting because robots do not have psychology in any meaningful sense, nor can you trace back their group dynamics to primitive tribal behavior the way you often can with human behavior. Yet groups of robots develop protocols of etiquette anyway. They also develop more sophisticated behavior, such as patterns of deceit and misdirection.

If sociological behavior such as etiquette, and consistent patterns of groups misleading other groups, may indeed derive to some extent in an emergent fashion from the interaction of groups of mechanical agents which have neither primitive tribal urges nor any psychology to speak of, there is a very good chance that certain elements of sociology are in fact entirely mathematical in nature. This does not, however, mean that you automatically get the ability to cook up the math for a good team dynamic. Emergent behaviors are closely linked to chaos math, and chaotic systems, though formally deterministic, are so sensitive to initial conditions that they're unpredictable in practice. If sociological phenomena have any foundation in math, it is almost certainly math of that unpredictable kind.

Still, this is a very, very interesting line of research, and I expect further findings in this field will continue to be very interesting.

Tuesday, March 6, 2007

Voodoo Considered Harmful



The handiness of rake test:* is undeniable, but be warned; there's weirdness thereabouts. Today I ran into errors which couldn't be explained easily, but which disappeared magically if you ran rake db:test:prepare beforehand. There's some kind of magic in there, and you should watch out for it.

What Superhero Is Your Programming Language?

Stupid question, I know, but I'm beginning to very strongly suspect that Ruby is Spiderman. Why? Because with great power comes great responsibility.



There are all kinds of things you can do in Ruby which probably aren't a good idea. Do you want to override all numbers to return the string "puppy dog sandwich"? You can. It probably isn't a good idea, but you can if you want to. Want to override plus so it means minus? Go for it.
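For the record, here's how little it takes. (One caveat: the class was Fixnum in 2007-era Ruby; in modern Ruby it's Integer.)

```ruby
# Emphatically not a good idea: plus now means minus.
# (Open classes: Fixnum in 2007-era Ruby, Integer today.)
class Integer
  alias_method :original_plus, :+
  def +(other)
    original_plus(-other)   # addition quietly becomes subtraction
  end
end

puts 2 + 2   # prints 0
```

Ruby doesn't blink. Whether your coworkers blink is another matter.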

Rubyists who also love Lisp and Perl will of course point out that Ruby isn't the only language to give you this kind of power; it's just the only one you can also actually read. Pythonistas and Smalltalkers, however, will be quick to point out how easy it is to remove your own foot ballistically, and yet legibly, in their favorite languages too.

The flipside of all this, of course, is languages like Java, where the entire design is structured around preventing bad programmers from programming badly. I don't think it works. I think bad programmers, and good programmers who are still just learning for that matter, can mess up in Java just as badly as in other languages; it just takes longer to figure out, because there are so many extra words involved. In a way messing up is pretty crucial to learning, and using your powers irresponsibly is probably the only way to learn what responsibility even is.

Still, the next time you find yourself using #class_eval(), ask yourself, am I going to regret this when Uncle Ben is gone?

Friday, March 2, 2007

FizzBuzters




Not too long ago Imran gave the world FizzBuzz. FizzBuzz is an incredibly easy programming problem whose sole purpose is to screen software development job applicants who are so bad that they can't write code at all. It has no other purpose. It was literally designed so that it requires programming skill, but if you have any programming skill at all, you can solve it in seconds.
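For reference, the whole problem is: print the numbers 1 through 100, except print "Fizz" for multiples of three, "Buzz" for multiples of five, and "FizzBuzz" for multiples of both. A perfectly unremarkable Ruby solution:

```ruby
# FizzBuzz, in all its glory.
def fizzbuzz(n)
  if    n % 15 == 0 then "FizzBuzz"
  elsif n % 3  == 0 then "Fizz"
  elsif n % 5  == 0 then "Buzz"
  else  n.to_s
  end
end

(1..100).each { |n| puts fizzbuzz(n) }
```

That's it. That's the whole thing.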

You might expect the blog world to respond, if it responded at all, with a re-evaluation of its hiring priorities, or a re-examination of the ways in which software developers are hired, or even some new ideas about how to hire. Alternatively, if you were absurdly cynical, you might expect programmers to start coding FizzBuzz in every language from Haskell to SQL, and even to start holding FizzBuzz coding contests.

The great thing about being cynical is, it's rarely disappointing. Anybody who makes a prediction that large numbers of humans are going to continue acting like idiots will always get to enjoy the satisfaction of seeing their prediction come to pass. Because this is what people in fact do.

Anyone hoping to stop the madness is facing the usual frustration experienced by those who attempt to get more than three people to use their common sense in the same place at the same time. There was recently a post about a FizzBuzz contest on the Ruby-Talk mailing list which actually linked to Raganwald's Don't Overthink FizzBuzz for a description of the problem. (The FizzBuzz golf contest site appears to be down, but here's the link anyway.)

There is probably nothing more useless than coding a solution to FizzBuzz. It is not a profitable problem; it is not a difficult problem. You will learn nothing, you will not make a cent. The sole purpose of this programming problem is to weed out coders so incompetent that they have no hope of ever learning to program anything. It is only useful to people who hire programmers. Yet programmers are coding it in droves, all over the Web. It's even being called the programmer's "Stairway to Heaven," except that's completely wrong, because playing "Stairway to Heaven" does require some practice and skill.

Obviously I'm just complaining about people being stupid, which is of course at least as unproductive as the actual stupidity itself. But the funny thing is, I think you can use FizzBuzz as a hiring screen in another way. Don't ever hire anybody who can't code FizzBuzz; and don't ever hire anybody who does code FizzBuzz. Because the question is, why has anybody ever coded a FizzBuzz implementation? You get no money, you learn nothing, you don't get laid, you don't get a cookie, you don't even get to wear a shiny paper hat. (Although I suppose nobody's stopping you.) What on earth is the point?

People code FizzBuzz because it's safe. Every programmer out there who has wasted a second of their time on a FizzBuzz implementation has had something better to do, some dream, some goal, some ambition that they passed up to write FizzBuzz instead -- just like everybody out there who's ever sat on the couch and watched Friends instead of doing something worthwhile with their lives. They read about it on the Internet, it involves some form of effort -- insofar as lifting your fingers and then dropping them on a keyboard must burn some fractional quantity of a calorie -- and it's a lot less frightening than taking that dream program you've always wanted to code and actually coding it. (Or worse yet, making a first pass at it, realizing it sucks, throwing it away, and starting again. Which is what you have to do if you want to get anywhere.)

The thing is, if you've got some person who really thinks that coding FizzBuzz is worth the time it takes out of their short life, the few inches it brings them closer to the grave, they're going to be that wasteful with your time too. Even after you've bought that time from them. Hiring somebody who spends the time to write a FizzBuzz means that the thing you hire them to write will turn into a FizzBuzz. Those FizzBuzzers will FizzBuzz it up, until it's so damn FizzBuzzed that nobody but an utter FizzBuzz would be willing to work on it in the first place.

Keep in mind, FizzBuzz is basically an idiot detector. Anybody who bothers to code FizzBuzz is saying, ooh, an idiot detector. I'd better aim this at my own head and see what happens. Then they manage to code FizzBuzz and they go, "yay! I'm not an idiot!"

At the risk of raining on their parades, if you aim an idiot detector at your own head, it really doesn't matter what the readout is.

Thursday, March 1, 2007

w00t

I have a tumblelog. Yay. Expect the shorter posts to mosey over there and the longer ones to stay put like happy moo-cows.