Monday, April 30, 2007

How To Build A Scheme

Tutorial (in Haskell).

Weird Quicksilver / Finder Boot Bug

I just saw, on Quicksilver's autolaunch, not the usual "QS" but instead a version of the Finder logo which was done in shades of red instead of blue. Haven't been able to find any reports of this via Google, though.

Sunday, April 29, 2007

Rough Conference Schedules

Attention stalkers! Want to follow me around? Here's your chance!

Approximate RailsConf Schedule
Approximate OSCON Schedule


Rails On The Desktop

When I realized how Mac OS X Dashboard Widgets work, my first question was "does this mean you could build a desktop app in Rails?"

The idea at the time was met with skepticism, but since then there have been three indicators that this is in fact a very genuine future possibility:

Joyent Slingshot
Evan Weaver's upcoming RailsConf presentation
Clairify (thanks to Apollo and RubyAMF)

And of course at least one Rails application which, though technically a Web app, seems to be generally run as a desktop app by most of its users:


Way back in the 80s, when computer manufacturers started putting copy protection on their floppy drives, this was seen as a betrayal by geeks, many of whom believed the whole point was being able to customize software to your needs and take it apart for your edification. It could be that their ideal is going to be expressed anyway. If writing desktop apps is easy, people will do it.

Curious to see how this develops. (And, honestly, if it develops.)

By the way, speaking of OS X Dashboard widgets, there's one for RDoc which you can use as-is or point directly at the Rails documentation.

Saturday, April 28, 2007

Friday, April 27, 2007

Speaking at OSCON!

Very excited. Looks like I'll have to put some work in!

Come check it out!

My presentation will be based on my "HREF Considered Harmful" screencast about Seaside and Rails. I'm hoping to talk a little also about Web frameworks inspired by Seaside, such as Jifty and Phaux.

Thursday, April 26, 2007

Caveat Emptor

Thousands of Japanese have been swindled in a scam in which they were sold Australian and British sheep and told they were poodles.


The scam was uncovered when Japanese movie star Maiko Kawamaki went on a talk-show and wondered why her new pet would not bark or eat dog food.

She was crestfallen when told it was a sheep.

MindMeister And Choosing Software Projects

MindMeister is a mind-mapping Web 2.0 app in private beta. It's really an astounding example of what's possible today in a Web 2.0 GUI with Ajax, Scriptaculous, Prototype, and JavaScript in general. It's easily competitive with all but the flashiest mind-mapping software for Java or the desktop.

However, although mind-mapping has some great proponents, I personally don't get a ton of use out of it, so my first thought when I saw this was that this is the type of project which loses tons of money but is great for the programmer's career. It doesn't seem particularly useful, yet the actual code is extremely cutting-edge. The company might tank but your resume will look great. My attitude with this kind of project, in the past, was generally "take the money and run."

Unfortunately, that attitude is a pretty bad attitude. The dot-com downturn was pretty much caused by a "take the money and run" mentality, and provided a very stark demonstration that you can only run so far. So my next thought, upon seeing this, was that it was good code, but it might be bad karma.

Far from it. It turns out MindMeister is a project of a services/consulting business called Codemart. If you're in a services business, demonstrating that you can do work of this level is a great marketing move. This is basically "calling card" software - a fantastic way to get new customers.

I'm a fan of 37 Signals. The company began as a Web design firm, which is to say, a services business. I think there's a very strong possibility that their Web apps began as calling card software before becoming self-sustaining. This is one of the overlooked reasons that the Getting Real approach to Web apps is such a good idea: your best-case scenario is a self-sustaining business, and your worst-case scenario is a great calling card.

The old-school venture capital model gave you a best-case scenario of a gazillion dollars, and a worst-case scenario of total unemployment. I know that rollercoaster is fun, but only at the top; the bottom is pretty lame. But more importantly, what's really going on here is the difference between possible outcomes. With VC, you either win big or lose big; with Getting Real, you either win big or win small. But either way, you win.

Last year a few people were saying that another boom was on the way, and a lot of people were saying that companies like 37 Signals and Adaptive Path were the real future - companies with high standards of quality and great business sense, that could do excellent, innovative work, even during the bust. A few high-profile VC cashouts later, everything's different.

Business and technology are as subject to fashion as anything else, and sometimes I feel the lure of startups too, but I think it's pretty important to remember that the core difference between the Getting Real model and the startup model is that possible outcomes thing. A strategy where you might win big, but you probably won't, is inferior to a strategy where you're guaranteed to win and the only question is how much.

On the other hand, the inferior strategy is sometimes more fun.

Tuesday, April 24, 2007

When The Beast Was Born

I realized something interesting today. It's very specific to my own circumstances, but it might be useful to consider anyway.

I'm a programmer, and I'm in acting classes. Other programmers wonder if I'm trying to do some weird career change, and some even seem to think I'm "up to something" - social skills being seriously undervalued in programming, almost to the point of suspicion. The people in my acting classes ask me if I work at Urban Outfitters, and while this wouldn't be a flattering question among geeks, I can't be insulted, because there are very talented people in my acting classes who do work at Urban Outfitters, and no disrespect is ever intended.

The result of this is a minor identity crisis. Am I a programmer who studies acting because it makes giving presentations easy? Or am I an aspiring actor who somehow developed an entire career as a computer programmer (of all things!) before I discovered my true calling?

The uncertainty is amplified because one of the very few commonalities these very different fields share is that passion is necessary. Not just because you have to be passionate about either one of these things to be good at them. Also because both job markets have absurd dips and peaks. Nobody sane is doing either of these things just for the money.

Everybody knows this is true of acting; however, people are used to the idea that programming is a steady job. Don't count on it. Go back in time to 2002 and try telling that to people. Anybody who got into programming for the job security needs to seriously reconsider that strategy. The dot-com downturn sent legions of programmers back to their moms' basements. It's true that there's going to be more and more programming work in the future, as every last little thing from your refrigerator to your shoe gets an IP address, but that doesn't necessarily mean the work will pay well at all.

The popularity of this idea that programming leads to riches comes and goes with booms and busts, and that cyclical rhythm in and of itself should be a kind of wake-up call. Sometimes your services are in fashion and sometimes they aren't. The image of the struggling actor waiting tables is so burned into our culture I can't even guess when it first originated, but if the image of the struggling programmer waiting tables ever becomes an established cliche, I can tell you with confidence, it'll have started in 1997 and gained serious credibility in 2001.

So here I am. Am I an aspiring actor? Am I a programmer? What's the deal? I realized the answer today. The answer's actually real simple.

I'm a science fiction screenwriter who does way too much research.

Seriously. I spent so much time researching the Internet that I actually forgot the point was research in the first place - but that is in fact what got me started. What's the best way to find out what changes are really going to happen? Simple. Bet your ability to eat, for several years running, on which technologies you learn. After a while you get a pretty good intuition for that kind of thing. And likewise - the best way to write a screenplay? Learn to act!

But that still doesn't explain it all, so I should probably add that I'm also an egomaniac who enjoys math for its own sake. OK - maybe this is too much about me to be useful for anybody else. Maybe my psychology is kind of odd. But there's a story here. Stick with me.

I got into programming because of zines - specifically Mondo 2000, Fringeware Review, Wired, bOING bOING (in a very different, earlier incarnation), and of course the weird, incredible, wonderful Schwa (which I almost sorta wrote for). These zines chronicled the emerging "cyberculture," this new sci-fi movement that was going to change the world. At the time, Wired was a very different thing than it's since become. Wired didn't publish e-mail addresses in its first issue, and it didn't feature business leaders in its first year. If you were reading Wired back when it was new enough that the rawer, more street-level Fringeware could do a parody called Weird - and mean it sincerely, as the compliment it was at the time - then you know, the first issues of Wired put science fiction novelists on its covers. They got to do huge articles about the future, too. Business leaders were in the mix, but initially, they were not the focus.

A few years later, all the other zines were gone, and Wired had gotten as boring as fuck. You never saw anything but business guys and Rollerblade ads in Wired. I had moved to San Francisco to change the world. Instead I was working for a bank. Admittedly, I was working for an investment bank, in a situation with responsibility and where my programming skills were highly valued, but still, ultimately, I was working for a bank. I was making good money but my soul was dead. I might as well have been a zombie.

When the dot-com downturn hit, I was thrilled. I knew I'd be fired. And one day my manager called me into his office.

"We're going to have to start firing all the contractors," he said.

"Ah," I said. "Well, that's regrettable, but in the current economic climate, it's no shock."

(And it wasn't. Two hundred thousand jobs were lost in January 2001 in the city of San Francisco alone.)

"That's right," my manager said. "So we want you to come on full-time."

I told him I'd think about it, and pretty soon I was driving off to live in a forest in New Mexico and learn how to draw.

It's pretty fucking hard to draw a human hand, by the way. Just so you know. Seriously. Try it sometime. And don't complain. I warned you I was an egomaniac.

But enough about that.

Why is this story allegedly useful?

It's useful first because I've met programmers working on interesting things who clearly think I should be more excited about the projects they're working on than I am - especially when these are also projects which potential clients are offering me. If you're one of those programmers, cheer up. It doesn't mean your project's not fascinating. There is just nothing happening on the Web today that is anywhere near as exciting as the experience of telling people that there's this thing called the World-Wide Web and it's awesome and you should totally get into it. Nothing happening today as fun as arguing with staid business types that advertising will one day become a big part of the Web. Nothing as high-stakes as moving across the country to a strange city on nothing but a few hundred dollars and your absolute, total faith that you've spotted something that will soon grow huge.

It's useful second because sometimes living in a forest and drawing is awesome.

And if that's true, you have to wonder: is there a way in which working at Urban Outfitters could be awesome too? There is. I've seen it. I saw an actress today doing incredibly fearless work. To pay the bills, she works as a waitress. She's awesome.

It's useful third because programming is looking more and more like acting or screenwriting every day. The entry costs are approaching zero and the value of education is only partial. And certainly the show biz adage that "you're only as good as your latest hit" is true in programming as well. Ask anybody who lost a job to a younger programmer with less brains but trendier skills. So if you're a programmer and you're making money, save some of that money. And take time to keep your skills absurdly sharp. And study business, too, because it's only the great businesspeople who really make money as programmers.

It's useful fourth, and most of all, because if you're choosing between projects, you might think it's because you're brilliant, but it's not. In a very small number of cases it's because you're brilliant, and the economy's doing well; in most cases, it's because the economy is doing well. And when you're choosing between projects, you should absolutely choose the projects which excite you. The reason is simple. Hard work, risk-taking, and talent are the ingredients for a successful career in both acting and programming, and passion's necessary in either case.

But at the same time, just as an actor who wants to succeed should balance artsy "cinema" with popcorn flicks, every programmer should balance programming for money with programming for programming.

Economics 101 for Web 2.0+

I've been holding back this outburst since the late 90s. But today I read something that made me lose my patience. Jeff Atwood has a roundup of blogs addressing the question, where are all the open-source billionaires?

Jeff's post is a good post, and he provides a much calmer and nicer answer than I'm about to, but I don't have his vast reserves of patience. That is the stupidest question in the world. Where are all the air billionaires? Where are all the gravity billionaires? What about the light billionaires? Huh? Huh? Where are they? Tell me! If light is so damn useful, why hasn't anyone been able to license the sun? It must not be worth anything!

Wow. Good point there, Einstein.

The motivator for open source contributions isn't usually economic. It could be, in the case of somebody who wants to raise their profile by making open source contributions, but most open source doesn't come from such a mercenary point of view. People contribute to open source projects because they like working with good software. Open source contributions make software better, and keep good software alive.

The economic effect of this is the same effect any continuous trend of innovation and improvement has: it makes excellence cheaper, and thereby more commonplace. That's a paradox, but there can be enormous power in paradox. Look at the iPod; there's great money in making excellence commonplace. But the iPod isn't open source. The money's in the service you provide or the way you leverage existing resources.

This is why you should never, ever accept equity from a company that doesn't have a firm economic reason for its anticipated success. Amazon undercut the operating costs of Borders and Barnes & Noble; Google introduced a superior pricing model and superior performance metrics to advertising; eBay leveraged the economic value of existing assets that was being blocked by normal sales channels and enabled prices to become more fluid in the process. Web success stories that don't have firm economic underpinnings are gambling winnings. Web stories like Amazon, Google, and eBay can all easily be explained afterwards by an economist.

The same is true for open source. Imagine if real estate suddenly became free. You wouldn't have to spend as much time apartment-hunting; a huge chunk of everybody's budget would suddenly be freed. What would you do with all that extra money and all that extra time? How different would moving to a new city be? How different would it be to open a new restaurant? Would anyone ever pay for parking again in their lives? That's open source: a massive economic disruption which transforms industries permanently.

Open source, sooner or later, is going to spell doom for venture capital. Your infrastructure costs, your development costs, all these things shrink to near zero. Building a Web app, these days, is like writing or drawing. All you really need to write something brilliant is pencil and paper. All you really need to draw something beautiful is a pencil and paper too. You can't build Web apps with a pencil and paper, but open source has made the startup costs almost that low. If you can get on a computer connected to the Internet, you can build a Web app.

With Amazon, Google, and eBay, you're looking at companies which made a lot of money by freeing up money which had been trapped in the market inefficiencies of old systems. That's the key; these systems dramatically improved the efficiency of existing industries. In the case of travel, the inefficiencies were so strong they supported an entire sub-industry - travel agents - which the Internet has almost completely destroyed. It's usually not about new goods or services; it's usually about optimizing infrastructure.

Open source is a huge infrastructure optimization. It frees up money also, but it doesn't make money in the process. That's because there's a huge difference between generating new wealth and generating new concentrations of wealth. Asking where the open source billionaires are is like pointing to the French Revolution and saying, "If democracy is such a good idea, how come France doesn't have any more kings?" Because the kings were the problem.

What the mighty king Bill Gates contributed to the economy was a series of traps, dead-ends, and strategic victories that leveraged market inefficiencies in distribution and manufacturing into a virtual monopoly. Microsoft was always pushing inferior software, even in their heyday. The Reagan soundbite "A rising tide lifts all ships" only actually applies when that rising tide hasn't already been trapped and channelled before it even rose by a massive complex of dams and reservoirs. If you can funnel the tide to one ship, and your walls are high enough, the tide can keep rising forever with only one ship being lifted. Open source erodes those walls.

And that isn't the only reason that "Where are all the open source billionaires?" is a stupid question. The other reason is, as Jeff points out, they're at Google. And Amazon.

How To Spot An Idiot

You: "What's the most interesting change on the Web since 2000?"

Idiot: "Ajax."

Smart Person: "Bayes nets."

The next change that we will see with machine learning is the ability to classify people.
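For the curious, the simplest degenerate case of a Bayes net is a naive Bayes classifier, in which every feature is assumed conditionally independent given the class. A minimal sketch in Ruby - the spam/ham framing and all the training data here are invented purely for illustration:

```ruby
# A tiny naive Bayes classifier -- the simplest degenerate case of a
# Bayes net, where every word is treated as conditionally independent
# given the class. All data below is invented for illustration.
class NaiveBayes
  def initialize
    @word_counts  = Hash.new { |h, k| h[k] = Hash.new(0) }
    @class_counts = Hash.new(0)
  end

  def train(category, text)
    @class_counts[category] += 1
    text.downcase.scan(/\w+/).each { |w| @word_counts[category][w] += 1 }
  end

  def classify(text)
    words = text.downcase.scan(/\w+/)
    total = @class_counts.values.sum.to_f
    @class_counts.keys.max_by do |cat|
      # log P(class) plus the sum of log P(word | class),
      # with a crude add-one smoothing so unseen words don't zero out
      denom = @word_counts[cat].values.sum + @word_counts[cat].size + 1
      score = Math.log(@class_counts[cat] / total)
      words.each do |w|
        score += Math.log((@word_counts[cat][w] + 1).to_f / denom)
      end
      score
    end
  end
end

nb = NaiveBayes.new
nb.train(:spam, "buy cheap pills now")
nb.train(:spam, "cheap offer buy now")
nb.train(:ham,  "lunch meeting tomorrow")
nb.train(:ham,  "notes from the meeting")

puts nb.classify("cheap pills offer")   # :spam
```

A full Bayes net generalizes this by letting the variables depend on each other in arbitrary ways instead of all hanging independently off the class.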

American Obesity And Subsidized Agriculture

For the answer, you need look no farther than the farm bill. This resolutely unglamorous and head-hurtingly complicated piece of legislation, which comes around roughly every five years and is about to do so again, sets the rules for the American food system — indeed, to a considerable extent, for the world’s food system. Among other things, it determines which crops will be subsidized and which will not, and in the case of the carrot and the Twinkie, the farm bill as currently written offers a lot more support to the cake than to the root. Like most processed foods, the Twinkie is basically a clever arrangement of carbohydrates and fats teased out of corn, soybeans and wheat — three of the five commodity crops that the farm bill supports, to the tune of some $25 billion a year.

Monday, April 23, 2007

Might And Glory

The Cave

Wait a minute guys. I've been here before. When the monster got me. And I barely survived. We're in the monster's cave!

Seriously, this is one reason I'm glad I live in Los Angeles. This might be a boom, as opposed to a bubble, but either way, the first time around, I was in San Francisco, and everyone was part of it. In LA, there are a lot of people who don't care about the Internet one way or the other, so this time around, I have the benefit of constant reality checks.

Saturday, April 21, 2007

Friday, April 20, 2007

Revenge Of The Nerds

There's never been a better time for the guy with no muscle definition, Coke-bottle glasses, and a laptop to influence the value of a professional athlete. In the past three years, at least a dozen baseball teams have hired the type of young statisticians you'd more commonly find working in risk arbitrage at Bear Stearns.


The great thing about the Terminator movies, when it comes to actually understanding robotics and artificial intelligence, is that they deliver two extremes on a spectrum of misunderstanding. At the one end is Skynet, a computer which takes over the world, and at the other end is the Terminator, a machine which looks human.

The same thing happens in the Matrix movies, where you've got armies of giant robots hunting down humanity at the same time as you've got whole legions of human beings spending their entire lives trapped inside virtual reality.

The idea is that either computers will dominate humanity, or humanity will dominate computers. It's a classic dichotomy and it's as false as you could wish any dichotomy to be. The reality will be very different - both much more optimistic and much more creepy.

The reality is that machines will become like mitochondria. The mitochondrion is a part of human cells. Originally, in the days before complex cells existed, the mitochondrion was a tiny independent organism. As cells evolved and came into being, mitochondria were absorbed into cells and became part of the system.

This will happen - in fact, it's already happening. This robot resembles a leech, or a caterpillar, or an eel. It inches across the surface of a human heart, "injecting drugs or attaching medical devices." It makes it possible for doctors to perform surgical procedures on hearts far less invasively than such surgeries are usually performed. It could be a marvellous improvement in safety for very important surgery. Yet it is very unnerving to witness.

I first figured out this idea in the early 90s. It was edited out of an article I wrote for Wired in late 1994, which Wired edited heavily and published in 1995. Being published in Wired was a thrill at the time - I was pretty young, and what was going on with Wired was still very new and exciting - but the editing is the main reason I never really got into magazine writing.

But, for what it's worth, click the links. Read the article and watch the video. Something I predicted in the mid 90s is really starting to happen.

And the Hollywood dichotomy is still actually relevant. It's just that the question should be more sophisticated. The question should really be, are the machines going to be the mitochondria? Or will we?

Wednesday, April 18, 2007

Why Is Twitter Using A Database In The First Place?

A very good question.

Mind-Bogglingly Terrifying Car Accident

So I'm driving down the 101, downhill into Hollywood. I change lanes to the left, but there's somebody behind me and they honk. I'm already halfway into the lane when they honk so I'm like oh shit, blind spot, better change back, so I swerve back in, but I swerve too hard, and my car's gunning at an angle towards some other car, so I swerve to angle back into the lane, but you can't swerve like that headed downhill on a curvy hill, so now my car's at a right angle to all the other traffic, still moving down the hill, with its front end in the leftmost lane and its rear on the lefthand shoulder. And now I'm moving backwards with five lanes of 80mph traffic headed towards me, and the only way to keep from flying right back into all that traffic is to keep swerving, so I've got my steering wheel turned as hard as it can turn, and there's this huge fucking CRASH behind me as I hit the wall, and blue smoke everywhere from my brakes, I've had my foot on the brakes slammed to the floor for a while, and my car's not moving, but I drive a 91 Toyota Supra, and that turbo engine is still gunning forward with all its mighty ferocious heart, so I keep my foot on the brakes, but it starts lurching forwards, so I drop it in park and it's still driving forwards, grinding like a motherfucker, so I turn off the ignition and the car stops. So I'm like, oh shit, I killed my car, will it ever start again? And I'm trying to figure out whether or not I'm in the carpool lane facing against traffic, or just on the shoulder, and if I'm in the carpool lane, how will I survive, but then I see for sure I'm on the shoulder. And this lady pulls up and she's like are you OK? And all the traffic's behind her. And I tell her what happened and she's like that was me! And I'm like HOLY SHIT! and I realize my calves are twitching uncontrollably. 
So the lady pulls into the shoulder and calls CHP but an LAPD car sees us and tells her since nobody got hit and nobody got hurt I should take her license to be safe but she's basically free to go. So off she goes, like ok bye have a nice day, and then CHP shows up, two cars, and they stop the traffic on the highway so I can turn around, because my car's like parked perfectly on the shoulder, exactly parallel to the center divider, but pointed in the wrong direction, so they do this, they put me back on the road, and MY CAR IS ABSOLUTELY FINE! Body damage on the right but it runs perfect. So I'm like doot dee doot dee doo and fucking PERFECT no problems at all. So I hit the wall going backwards at least 70 miles per hour against the flow of LA traffic and CHP tells me I don't even need to file a report and they're like you can file it with your insurance if you want to, where are you headed? And I'm like, you see that restaurant over there? And they're like, there? That cafe? And I'm like, yeah, I wanted to get a cup of coffee, you know, little bit of energy to start the day, and they're like, well, I guess you're here, but you got your energy already, and I'm like, yeah, I guess I did, and they're like, well, have a nice day sir, and I'm like HOLY FUCKING SHIT THAT WAS THE SCARIEST THING THAT EVER HAPPENED TO ME AND I'VE BEEN HIT BY A CAR AND I FELL DOWN A WATERFALL ONCE AND EVERYBODY THOUGHT I WAS GOING TO DIE BUT THIS WAS SO MUCH MORE TERRIFYING! And they're like, ok, have a nice day, so I went to the restaurant and now I'm having a latte and a pesto crepe. 
And this cafe is filled with all these like special FX nerds with English accents and hot blonde chicks and emo monkeys with tattoos and sexy Latinas and people writing screenplays and posturing on their cellphones and I've had the most terrifying experience of MY LIFE ever and emerged totally unscathed and they've got the Pet Shop Boys on the radio and the food is delicious and LA cops are so much smarter than New Mexico cops and so much more law-abiding than Chicago cops and the sun is shining and I nearly died but barely even broke a sweat in the process and I'm like I love LA!

By the way I obviously have an awesome car. Takes a licking and keeps on ticking! Needs a bit of body work now though.

Anyway, the moral of the story, obviously, is if an LA driver cuts you off on the freeway, don't honk at them, just let them in, otherwise they might freak out and nearly kill twenty different people, including themselves.

Update: although this post is absurdly manic, as an exercise in writing style, it is pretty successful, because I have been talking pretty much exactly like that all day. Slamming into the center divider and bouncing down the hillside backwards at absurd speeds can affect your speech patterns for hours afterwards. Also, I have to say, this is just what it feels like to have a scary but ultimately painless car accident. It's sufficiently powerful for me to realize how totally I can't imagine what actual shellshock is like (or, to use the jargon of the day, "post-traumatic stress disorder").

Looking For A Good Argument Against REST

My gut reaction is skeptical. I really don't know why. But it seems somehow wrong. Web services, APIs, Flickr, all that stuff is good -- that's it.

I know why I'm skeptical. Just because it's good in the right context doesn't mean every single thing on earth needs to support it. It's like putting mustard on ice cream just because it tasted good in a sandwich once. There's nobody out there saying "some things need REST, some things don't, and here's how you make the judgement call." There's just people saying "REST? What's that?" and people saying "REST! Hallelujah!"

It's just the typical frenzy. The religion of the hammer. Where everything looks like a nail, and everybody looks like either a believer or an infidel.

Let me correct my headline: looking for a good argument about REST. I would love to see an intelligent, rational blog post going over where REST is bad and where it's good. All praise the mighty hammer, sure, Amen, whatever, but what the grownups among us really need is a clear way to decide when to use it and when not to bother.

Obviously the whole value of REST is that it makes URLs into messages passed within an incredibly large virtual machine. Servers run on Unix but the Web itself looks more and more like Smalltalk every day that the REST frenzy grows. But the idea that every last thing in every last corner of the Web should in every single case be a URL-accessible resource is just insane. It's like, either you have resources calling URLs on each other (objects passing messages to each other), or you get infinitely fine-grained access. It's absolutely one or the other.
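That "URLs as messages" idea can be made concrete with a toy dispatcher - a purely hypothetical sketch, not any real framework's routing code - in which the URL names the receiver and the HTTP verb names the message:

```ruby
# Toy illustration of "URLs as messages": an HTTP verb plus a path is
# dispatched as a method call on a resource object. Hypothetical sketch;
# the resource, paths, and data are all invented.
class ArticlesResource
  def initialize
    @store = { 1 => "Hello, world" }
  end

  def get(id)
    @store[id]
  end

  def delete(id)
    @store.delete(id)
  end
end

# "GET /articles/1" becomes ArticlesResource#get(1) --
# the URL is the receiver, the verb is the message.
def dispatch(resources, verb, path)
  _, name, id = path.split("/")
  resources[name].public_send(verb.downcase, id.to_i)
end

resources = { "articles" => ArticlesResource.new }
puts dispatch(resources, "GET", "/articles/1")   # Hello, world
```

Squint and that's Smalltalk message sends over HTTP, which is exactly why the analogy is seductive - and exactly why "everything must be a resource" is a much stronger claim than "resources are useful."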

Tuesday, April 17, 2007

Screencasts Hosed My Bandwidth

If you came here for my screencasts, sorry - they kind of destroyed my bandwidth. I think my site's going to be down for the rest of the month now. Oops.

Anyway, they're coming back soon - but not on my own site. I'll post new links when I've got them.

Make New Friends, But Keep The Old

I recently coded something very trivial in Perl. What surprised me was how long it took me.

It was a classic Perl situation; rows of text in one format that needed to turn into rows of text in another format. The classic Perl solution here is a foreach, a regex, and a sprintf() call. I've been coding Ruby so much that my first step in coding this was to try to build objects, and then an iterator.

Objects in Perl exist, in a sense, but they're nothing like Ruby objects. The brilliant book Higher-Order Perl actually shows you how to build an iterator in Perl, but that's for unusual circumstances. You don't write Perl that way. It's backwards; it disregards the relevant idioms and adds extra steps. It makes a molehill into a mountain. My solution was all wrong because I was trying to write Ruby in Perl.
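For what it's worth, that classic shape - a loop, a regex, and a format call - translates directly into Ruby too, no objects or custom iterators required. A sketch with an invented row format (the data and format here are hypothetical, not the actual task):

```ruby
# Hypothetical example: turn "Lastname, Firstname, phone" rows into
# "Firstname Lastname <phone>" rows. The idiomatic solution is a loop,
# a regex, and a format call -- the same shape as the classic Perl
# foreach/regex/sprintf approach.
input = <<~ROWS
  Kent, Clark, 555-0100
  Lane, Lois, 555-0199
ROWS

output = input.each_line.map do |line|
  last, first, phone = line.strip.split(/,\s*/)
  format("%s %s <%s>", first, last, phone)
end

puts output   # the reformatted rows, one per line
```

No classes, no iterator objects - just mapping a transformation over lines, which is the molehill the problem actually is.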

It's kind of like the difference between being fluent in a foreign language and simply going native. On the one hand it's great that Ruby's style has become, to some extent, a matter of instinct for me. But I was surprised that it took me as long as it did to mentally switch gears.

When I first started doing Ruby, getting away from Java seemed almost heavenly to me. But now I almost want to code something in Java again to see how it feels different. When you learn new languages, you want to expand your skill set, not replace it.

Monday, April 16, 2007

Very Good Post

Jeff Atwood, here.

If you read that post, that's my main concern with Seaside, and the reason I got bored with Lisp almost before I even got started. Long story short: languages are not just languages, they are also platforms, and a great language isn't necessarily a great platform.

Another Tiny Seaside Screencast

This one's on how to kill the logo in Damien Cassou's Web dev image, and a little miscellaneous stuff about getting started.

Target audience: people getting started with Squeak. All this actually does is give you a quick overview and show you how to remove the little logo Damien's image comes preloaded with. (I'm hoping/planning to do something a little more in-depth and useful soonish, but for now that's entirely vaporware.)

If you're looking for something to get you interested in developing with Seaside, check out these screencasts - one long, one short. For an example of actually building something, check out this one (15 minutes). If you're looking for more stuff about how to get started, total basics, check this one out.

Why Geeks Should Study Acting

Chad Fowler says the best thing you can do for your career is destroy the geek culture.

He's right. And it's not just good for your career; it's good for your mental health as well.

Everybody knows that in the world of open source, community involvement is at least as important as the underlying technology. The landscape is littered with failed projects where great technology lost to great marketing. The success story of the moment is absolutely a marketing success and a social success as well as a technological success. Yet one of the worst ideas of the geek culture still persists - the idea that social skills are unnecessary, so much so that having a mild form of autism could even be considered a good thing.

The headline for the Wired article in the autism link is "The Geek Syndrome." The tagline? "Autism - and its milder cousin Asperger's syndrome - is surging among the children of Silicon Valley. Are math-and-tech genes to blame?" My emphasis.

That's the real reason geeks need to study acting. The idea of math and tech genes. It's disgusting.

If you're a geek, you probably come from one of a small number of specific ethnicities, compared to the larger total number of ethnicities available. If you're a geek, you probably don't work with many women, and if you do, those women are mostly receptionists, management, or marketing - in other words, if you're a geek, you're unusual if you work with women at all, and very unusual if you compete with those women intellectually.

And the idea of "math and tech genes" could sound very reasonable to you.

This is why you need to study acting.

If you study acting, you'll learn the social skills you need, but you'll also get something infinitely more valuable. Despite what you may think, acting is actually intensely competitive. People think acting is like just standing around talking; in reality, it's more like boxing.

If you study acting, sometimes you'll win; sometimes you won't. You'll compete with women, and sometimes you'll lose. You'll compete with people from outside the usual narrow range of ethnicities, and again, sometimes you'll lose. And if you're honest with yourself - if you approach acting the same way you approach anything you genuinely want to do well - you'll realize that some of the times you lost, you lost because you were outsmarted; and some of the times you won, you won even though you were outsmarted.

The average run-of-the-mill geek is outsmarted by women in the workplace so infrequently that sexism is the great ugly underbelly of the tech industry. The average run-of-the-mill geek comes from a narrow range of specific ethnicities, and is outsmarted by people who fall outside that range so infrequently that the idea of "math and tech genes" seems plausible. To anybody outside the geek culture, "math and tech genes" sounds like the kind of sick bullshit you'd hear from a Nazi eugenicist, but within the geek culture, it's taken to be so obviously beyond even debating that Wired can put it in a tagline without irony or fear of backlash.

Guess what?

Math and tech genes don't exist.

Your environment shapes you, and if you're a programmer, you're in an environment where you're consistently identified as being smart, and where the people you share this distinction with all share a variety of irrelevant genetic commonalities. If all those women who are smarter than you never come into your environment, you're not going to realize they exist. If all those scary people from outside the "normal" ethnic range who can outsmart you never come into your environment either, you won't realize they exist either.

Now this sounds like an argument for peace and universal harmony; but it sounds this way because if you're reading this, you're probably a geek, and if you're a geek, you're probably ignorant and you're probably isolated, and since ignorance and isolation lead to arrogance, that means that if you're a geek, you're probably arrogant. This is not an argument for peace and love; it's an argument for self-preservation. If you're a geek, your probable ignorance and probable isolation are probably a problem for you. One time I lost a great job due to the corporate/political machinations of a black woman, and the thing is, she wasn't even being particularly subtle. She could afford to be obvious; she was operating in the shelter of a blind spot. If she had been a Chinese dude, I would have seen it coming.

So I kind of lied. This is an argument for peace and love and all that hippie stuff. But I lied to make a point. Really, like every argument for peace and love, this argument for peace and love can also be seen as an argument for self-preservation. Because an argument for peace and love is always an argument for self-preservation.

The point of destroying the geek culture isn't just that we need to change society and save the world. The point is also to destroy the geek culture within yourself, so that you're not one of the people losing their jobs because they bought into the geek culture's bullshit and neglected their social skills. Be instead a good programmer and a good marketer - because you are in marketing. Everybody's in marketing; the only difference is between people who know they're in marketing and people who haven't figured it out.

Update - the Asperger's thing is total bullshit, by the way. Amateur office psychologists have diagnosed me with both Asperger's and ADHD. Both diagnoses were wrong. In the Asperger's diagnosis, I was merely concentrating; in the ADHD diagnosis, I was merely having fun. A personality is not a medical condition.

The Myth of the Blogosphere

Screenwriting bloggers don't talk about the blogosphere. They talk about the scribosphere. (It's the screenwriting sub-blogosphere.) Kathy Sierra's harassment hasn't been all over the blogosphere. It's been all over a blogosphere. Everybody in the Web tech blogosphere knows about it. Nobody in the scribosphere does. Nobody who blogs on Myspace about their dates and their clothes knows about it either.

Neither the blogosphere nor the scribosphere is a sphere at all. They're networks. The metaphor is very precisely wrong. A sphere is a three-dimensional surface with all points equidistant from the center. Equidistance from the center is a very, very inappropriate feature for a metaphor for blog networks. They look more like nerve clusters.

Each specific "blogosphere" clusters around specific high-profile bloggers, and closeness or distance to these central bloggers makes for relevance or irrelevance within that particular alleged "sphere". Just as every Rails blogger knows who DHH is, every screenwriting blogger knows who Unk is. Every Myspace blogger into dance music in the state of New Mexico knows who John S. and Grant are. But to the average reader of this particular blog, all those names, except DHH, probably mean nothing.

We tend to think of the Web tech blogosphere as "the" blogosphere because Web technology people are nowhere near as deep and insightful as we like to think we are. We were here first, so our blogosphere is the blogosphere. Right? Myspace bloggers aren't real bloggers. Screenwriting bloggers aren't real bloggers. I Can Has Cheezburger? isn't a real blog.


There isn't one blogosphere. There are very many. And none of them are spherical.

The inappropriateness of the spherical metaphor intensifies when you consider the fractal, recursive nature of blogospheres. The Web technology blogosphere includes the Rails blogosphere, the Java blogosphere, the PHP blogosphere; large parts of the Perl blogosphere, the Python blogosphere, and the Smalltalk blogosphere; as well as markup language blogospheres and the Flash blogosphere; and large areas of the design, usability, marketing, management, entrepreneurialism, investment, and advertising blogospheres as well.

It's pretty difficult to model this recursivity accurately with spheres. It's very easy with nerve clusters. Just picture the human brain.

Sunday, April 15, 2007

Back To The Future

Battles I fought with clients are now jokes that nobody takes seriously:

A very long time ago, way back in the past… like 10 years ago, people started getting on this crazy thing called the World Wide Web. Somewhere in a dark room filled with crabby old men, it was decided that advertising online would never make sense… "nobody is really going to spend a lot of time in front of the computer. They have books & magazines to read, movies & television to watch, music to listen too…" (I guess they never thought we’d end up doing that ON the computer.)

Not only did they never think it, they made fun of me for telling them they were wrong. They made fun of me for telling them they should read Neuromancer, too.

Wait a minute. I guess this means I got the last laugh. Um. Hey.


Seaside Screencast: Set Up Squeak

Extremely basic, how to get started stuff. (123mb)

I have to admit, this does seem pretty trivial, but when I was first playing with Seaside, I had a number of false starts due to downloading images loaded with out-of-date versions of Seaside. I got Ramon Leon's image (which he's since updated) and that worked well, except it tried to load various Windows fonts (on my Mac) on every boot.

Of course the reason all this is possible is that Smalltalk gives you a lot of options for customization. You could even say Smalltalk is a large set of options for customization. Due to the whole "virtual machine" thing, it's almost more like using Parallels than like using Dr. Scheme. Even Eclipse doesn't come close, which is ironic given Eclipse's strong historical link to Smalltalk. (Eclipse grew out of IBM's VisualAge, which was written in Smalltalk.)

But all those options can be a bit confusing first time out. So, remember, if you're a Seaside newbie, this is the Squeak image you want to use.

Friday, April 13, 2007

Yes And No

I wish I was dumb enough to believe this:

Bloggers have a responsibility to the people. And in a world where major news outlets have reneged on that responsibility, it is vital that we uphold it.

Bloggers are like anybody else. Some are corrupt; some are honest. You can't get all bloggers to do the same thing. Even trying is just insane.

The mainstream media has a responsibility to the people too. So what? Everybody has a responsibility to be honest. That's what makes civilization liveable. But Amanda Congdon's dishonesty isn't something which invalidates bloggers and blogging; it merely invalidates Amanda Congdon. It isn't about a gigantic crusade. Just call a liar a liar and move on.

Thursday, April 12, 2007


X vs. Y!! IT'S WAR!!1

Not really. It makes for good copy, but DHH and Avi Bryant are not at war, and I'm not the trickster who led them into discord.

For the record, sentences like this one make me nervous:

Despite his innovative ideas, Giles Bowkett is a real Rails aficionado.

I think that's more like:

Because he likes other people's innovative ideas, Giles Bowkett is a real Rails aficionado.

Hackers And Haters

Discovered something very interesting via Boing Boing today:

The interesting part is this:

It used to be that programs were easy to copy and change. But manufacturers began to lose money as many people made copies of software and gave them to their friends. Now, many manufacturers have figured out how to 'copy-protect' discs. A copy-protected disc—like a cartridge—can’t be copied or changed. To our mind this is a disaster: Most people learn programming by changing programs to fit their own needs. This capability of customization is what makes computers so attractive. New ways of copy protection will probably be found soon. Until then, a computer owner may have to put up with being 'locked out' of his own machine.

Does that sound familiar at all? It will if you're a Boing Boing reader, or if you've paid any attention to DRM, the Creative Commons, Larry Lessig, etc. It's the same thing driving me nuts right now. I moved from New Mexico to Los Angeles, and left behind a hard drive full of music. No problem - all the stuff I really listen to is on my iPod anyway. Except I can't transfer it from my iPod to my laptop. No dice. Why? DRM crippleware. They made the device less powerful to appease the music industry.

There are huge holes in the music industry's arguments for DRM and copy protection in general, but it's pretty easy to see where early-80s computer manufacturers were coming from. Except free open-source software almost always trumps commercial software for quality. If you want quality product, you're better off with the free shit anyway.

And the thing is, if you're in the business of selling computers, an abundance of great software is a strong selling point. What would the world look like if computer manufacturers had been a little more reckless? Would we have had fewer software companies but more computer owners? Would the open source world have arrived sooner and stronger?

Would the entire hulking, stinking mass of Microsoft have ever even formed? You have to wonder. Where would Microsoft's monopoly power have come from in a world without copy-protected software? How much better would everybody's life have been?

It could be that what's going on with the copyfight isn't even about "pirates" vs. the Establishment. Maybe it's just a colossal paradigm shift. Go back a few hundred years and wealth wasn't about entrepreneurs; it was about knights and kings. In the past, wealth meant either you were good with a sword, or you had an ancestor who was good with a sword. But today, it means you're smart about making money, or you had an ancestor who was smart about making money. It could be that this whole thing with copyright law is just an old obsession with control that is useless and totally irrelevant in today's society.

If that's the case, it means the copyfight belongs to the Creative Commons people, and it's only a matter of time. Just as Public Enemy's legal battles over sampling were eventually validated by an entire genre based on one sample, this article on PCs from the 80s shows that a bad decision in favor of copyright on the part of early PC manufacturers was later invalidated by the rise of the open source movement.

Wednesday, April 11, 2007

The Problem With The Turing Test

Two very intelligent individuals named Mitch Kapor and Ray Kurzweil have a long-running $10,000 bet about whether or not, by 2029, the Turing Test will be beaten.

However, both these individuals have missed something painfully obvious.

The Turing Test has already been beaten - and the world learned virtually nothing during the process.

It was beaten in the 70s.

In the 1960s there was a famous AI program called Eliza. Eliza simulated a Rogerian psychotherapist. Rogerian psychotherapy consisted entirely of repeating whatever the patient said, and/or asking them for more detail. During the 60s, it was very popular among a certain class of people.

Eliza, of course, did exactly the same thing - it repeated back whatever you said to it, and occasionally asked you questions. Eliza is now famous as an AI program, but Eliza was not actually constructed to simulate human intelligence. Eliza was intended as satire. The researcher who built Eliza basically wanted to demonstrate that what Rogerian psychotherapists were doing was so brainless a machine could do it.
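The trick really is that brainless. Here's a sketch of it in Ruby - the pronoun table and phrasing are invented for illustration, not the original script:

```ruby
# A Rogerian mirror in miniature: swap pronouns in the user's statement,
# then hand it back as a question. The word table below is made up.
REFLECTIONS = {
  "i" => "you", "me" => "you", "my" => "your",
  "am" => "are", "you" => "i", "your" => "my"
}

def reflect(statement)
  statement.downcase.delete(".!?").split
           .map { |word| REFLECTIONS.fetch(word, word) }
           .join(" ")
end

def eliza_reply(statement)
  "Why do you say #{reflect(statement)}?"
end

puts eliza_reply("I am unhappy with my job")
# => Why do you say you are unhappy with your job?
```

A real Eliza script adds keyword-ranked response templates on top, but the core move - mirror and deflect - is all there.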

Because Eliza was satire, everybody was in on the gag, so nobody ever confused Eliza with a real human. But about ten years later, a less famous researcher - I've even forgotten his name, and the name of his program, although I remember where I read about it - created a version of Eliza with a twist. Instead of repeating whatever you said and peppering it with gentle questions, this version repeated whatever you said and peppered it with paranoid accusations. It also always phrased its questions aggressively, and with hostility and suspicion. This version of Eliza successfully deceived several psychologists, all of whom diagnosed it with various forms of acute paranoia.

The problem with the Turing Test is that it is a lot easier to beat by exploiting incomplete knowledge of human psychology than it is by designing a mega-neutron brain. All the test really does is expose how little we know about what makes people people. The fact that it's easy to fool people into thinking a computer is human doesn't actually teach you anything about the difference between computers and humans; all it does is teach you that it's easy to fool people.

Technically, the Wikipedia entry on the Turing Test says that this type of thing is

not the same as a Turing Test. Most obviously, the human party in the conversation has no reason to suspect they are talking to anything other than a human, whereas in a real Turing test the questioner is actively trying to determine the nature of the entity they are chatting with.

Honestly, however, I think that's splitting hairs. Neither academic nor corporate research pursues the Turing Test any longer. It could be that the test is considered too vague and ambitious for serious researchers, but I think the real reason is that it's too easy. The core of the problem was solved only 10 or 20 years after the Test itself was suggested in 1950. It's basically a solved problem with a ton of little implementation details still dangling off of it.

If the Turing Test isn't officially beaten by 2029, it won't be because it wasn't beaten. It'll be because it wasn't officially beaten. Not because the test was so hard. Because officials were too wrapped-up in being official to acknowledge that a prankster beat the Test back in the 70s as a joke.

What's especially funny, and yet especially bittersweet, is that the joke started as an attack on a pretentious field of study with no real record of results. I mean, another pretentious field of study with no real record of results. Some of the best moments in AI have come from a similarly insouciant point of view.

By the way, if it seems I'm being unfair to psychotherapists here, I'm a certified hypnotherapist, and I know for a fact that hypnotherapists can do very easily things which the psychology establishment says are impossible. Many, many psychotherapists have gotten people to "just accept" a huge legion of problems which a hypnotherapist could have simply solved. I'm definitely kind of scornful of psychotherapists, but it's not because I don't understand the good they do - it's because I know how much more good they could do if they took their work seriously.

And if it seems I'm being unfair to AI researchers, well, come on. If there's anybody whose understanding of human consciousness needs a reality check - anybody whose sacred cows need a little cow-tipping - it's AI researchers. The current apex of achievement in artificial intelligence is a robot vacuum cleaner.

Tuesday, April 10, 2007


Monday, April 9, 2007

Fear And Loathing In The Blogosphere

Yes, Kathy Sierra should be able to post without being harassed. Yes, this is about more than one individual being stalked - it's about preventing vicious, illegal gender harassment.

No, Tim O'Reilly's proposed blogger code of ethics does not make a damn bit of sense.

I am not taking responsibility for the comments people leave in my blogs. I will allow anonymous comments. And I think the whole idea is silly and totally inappropriate.

The appropriate response is to go after Kathy's harassers and subject them to the criminal penalties they have already earned themselves by breaking the law. We already have cops to take care of this thing. That's what they're for.

I don't know if Tim O'Reilly's on some kind of power trip, or freaking out due to panic, or what, but his proposed code is just twenty different kinds of wrong. First, blogging as a community is too diverse for the code ever to reach critical mass. Most bloggers will never even hear about it. Second, what Kathy's harassers did is already in violation of the law, and when systems exist to handle problems, creating new systems to handle the same problem is redundant -- and additionally useless, when you consider that a voluntary code of conduct is a toothless system compared to a system which subjects harassers to legal penalties, including the possibility of jail time. What Tim's proposing is not even a failover system, it's just a totally meaningless add-on. It's like adding lines of code which increment a variable to a program that already works, and then not even doing anything with the incremented variable. It's busywork.

You can't fault him for wanting to help one of his writers, and you can't fault anyone for wanting to be sure things like that never happen, but the proposed solution is so utterly unrealistic that it shouldn't be considered anything but a display of emotion. Can't fault the emotion itself, but when he calms down a little, I think even Tim will see how ridiculous his suggestion is.

In-Memory DBs

This post is more about things that I wonder than things that I know.

The whole idea behind the Google Filesystem is that hard drive access was Google's major architectural bottleneck when developing their search functionality and optimizing it for speed. So they said, well, we can make it a lot faster if we put the whole thing in RAM. And then they did.

The GFS is actually a much more powerful competitive advantage than either Google search or Google ads. That's why Google spends so much on research. The GFS is the next operating system, and the applications that will make the best use of it haven't even been written yet.

One simplistic way to look at the Google Filesystem is as a gigantic in-memory database. This prompts an obvious question. Will databases adopt this architecture?

Performance is key for databases. Hard drive access is a bottleneck. Database-backed apps are much, much more widespread than they were in the days when the current, conventional database architecture was initially developed.
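To make that bottleneck concrete, here's a toy comparison in Ruby - everything in it (row format, sizes, names) is invented, and real databases are enormously more subtle:

```ruby
# A "database in RAM" at its simplest: an index loaded into a Hash,
# so a query is one lookup instead of a scan through rows on disk.
require "tempfile"
require "benchmark"

rows = (1..50_000).map { |i| "#{i},user#{i}\n" }

# Disk-style access: scan the file until the row turns up.
file = Tempfile.new("toy_db")
rows.each { |r| file.write(r) }
file.flush

# RAM-style access: pay the load cost once, then every query is a hash lookup.
index = {}
rows.each { |r| id, name = r.chomp.split(","); index[id] = name }

scan_time = Benchmark.realtime do
  file.rewind
  file.each_line { |line| break if line.start_with?("49999,") }
end
lookup_time = Benchmark.realtime { index["49999"] }

puts "scan: #{scan_time}s, hash lookup: #{lookup_time}s"
```

The scan cost grows with the data; the lookup cost doesn't - which is the whole trade Google made, at an absurdly larger scale.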

I think the answer is almost guaranteed to be yes.

Sunday, April 8, 2007

The Casino Or The Bureaucracy

One very unusual thing about my career is that I didn't start out as a programmer, but as a graphic designer. There are tons of people who take this route into coding HTML - that's a very common thing - but not nearly as many who go on to build enterprise apps and learn Smalltalk.

So this kind of gives me an unusual perspective - especially when it comes to marketing. The culture of programming often contains a great deal of contempt for marketing, but the culture of graphic design has more respect for it. Consequently, one very unusual thing about my own reading is that I seem to read many more books on marketing than the average programmer.

Sometimes I'm embarrassed to admit it. There are a lot of programmers who scorn the whole idea of marketing. But there are other programmers who don't, and it seems to work for them.

One thing I'm sure of: geeks need marketing. Especially in a world where everybody swears blind that a great programmer's work is 10 to 1000 times better than an average programmer's, and some people even have the research to back it up - but the only way for a great programmer to make 10 to 1000 times more money than an average programmer is for that great programmer to start their own company and become a great businessperson as well (or instead).

In any other field, work that's 10 to 1000 times better would mean 10 to 1000 times more money. Picture what the world would be like if extras made as much as movie stars. There are plenty of programmers out there, in larger, more bureaucratic organizations, whose main contribution is essentially to walk on stage and stand in the background for a while. There are also dedicated, passionate programmers who carry the weight of entire companies. Sometimes the star gets paid more than the extra. Sometimes not. But never by an appropriate multiplying factor.

It's a serious flaw in the culture of technology. It's probably also the main reason venture capitalists are still around. Last year's SXSW podcasts had a definite theme: with free open source software and cheap commodity hardware, venture capitalists are no longer either necessary or useful, and in some cases they're actively destructive. But the VC world did not disappear overnight; in fact, nine out of ten Rails jobs seem to be startup jobs. There's a certain degree of irony there.

The connection is that people think the only way for a programmer to make a lot of money is to join a startup and get lucky. Alternatively, you can conform to a corporate mold, and be stuck with a predictable series of pay scales. You get this insane 1950s theory that two programmers with X years of Y, where both X and Y are equal, are therefore equal cogs for any given machine. This is of course stupid, so many smart programmers opt instead for the VC system. But the VC system was developed from the 1950s to the 1980s to support the massive infrastructure and complexity required to start technology companies in those eras, and that just isn't even relevant in 2007. That world is gone forever.

What many programmers have to face down is this very toxic false dichotomy: either VCs, or a corporate job - which is to say, either the casino or the bureaucracy. People who say you should skip the casino and the bureaucracy, and opt instead for good, solid, sustainable businesses, have remained a vocal minority.

The future possibility, however, is that VCs are becoming dinosaurs, and small teams building great software on the side is the real shape of the future. 37 Signals, Facebook, and Digg - initially built by an Elance PHP developer for $11 an hour - all provide pretty strong examples of what that future is starting to look like. Likewise, the extraordinary power and flexibility of open source technologies makes the conservatism and inflexibility of large bureaucratic systems more of a weakness than ever before.

This doesn't mean that either the casino or the bureaucracy will be destroyed. It means a third alternative has already emerged. This third alternative is already attracting geeks with above-average marketing savvy. It's already creating its own Internet celebrities.

Did Anybody Read Getting Real?

Sometimes I really wonder. A year or two ago all the news about venture capital was that Kevin Rose didn't take his VCs seriously and 37 Signals thought VCs were stupid, period. But it seems as if the vast majority of Rails work out there is with startups. One startup I talked to told me they had a high-profile serial entrepreneur coming on as founder. I'm like, he's coming on as founder? And they're like, founder is a job title.

That particular group of people had degrees from great schools like Harvard and Stanford and they had no idea how pointless it is to brag about your degree to a college dropout who's instructed guys with master's degrees and was one of the people who invented and defined the category you got your degree in. What are these people thinking? Why are they going to VCs for venture capital to write social networking sites in Ruby on Rails? The whole point of Ruby on Rails is it gives you the ability to write applications without VC funding! That's what it was invented for! Getting Real is the car; Ruby on Rails is the wheels. Using Ruby on Rails to build a VC-backed startup is like putting wheels on a horse. You have to wonder what these VC startup people are even thinking. It's not about the wheels. It's about the engine.

My most recent foray into the startup world, I met somebody with experience at Overture, obviously a kickass startup that did very well - a ball that Yahoo dropped, but quite a ball until Yahoo dropped it - and they were basically like, we'd like to give you tons of money to work constantly. And I'm like, well, how about a reasonable amount of money to work a reasonable amount of the time? And they're like, no. And then I was like, well, how about a tiny amount of money to work hardly ever? And they're like, no.

And the funny thing is, they had initially been worried about whether or not they would be able to afford me. I'm like, look, this is how you get me at a discount. But they weren't into that. In the end we had to settle on no money at all for no work at all, which, I have to tell you, was a much better deal for me than it was for them.

The crazy part? Check this out. This wasn't a startup in the sense of possessing an innovative technology. They had a brand and they were building a social network. They were in essence a social networking site built around an entertainment brand. And a VC had gone "Wow!" because their entertainment brand was good, and then passed them X amount of money and said "Now you need to work constantly!"

But the reality is, they didn't actually need to work constantly. They just thought they did, because that's the culture of startups. But it's the culture of startups because when startups are based around a technological discovery or innovation, time is of the essence. But an entertainment brand's social network isn't based around discovery or innovation. It's based around the network. And social networks aren't about technology, they're about people. The big name in VC-backed social networking startups is Facebook, and that was built in somebody's spare time. The VCs came calling as soon as they heard about it, but they heard about it because it was a huge success. So if the biggest success in VC-backed social networking was built in somebody's spare time, without any VC help, maybe the whole idea of working constantly is totally unnecessary in this context.

Maybe it's all just cargo cult management strategy.

I mean you have to wonder. Anybody who builds Rails applications but thinks they know something that 37 Signals doesn't, well, maybe.

Maybe not.

The thing is, a lot of what went wrong in the dot-com boom was people rushing things. They thought the most important thing would be establishing a brand in a given space. But those were the companies that went down so painfully. It wasn't companies like Google and Amazon that wasted people's time and money in the bubble, and ruined them when it burst; it was and all the rest of that madness. Google and Amazon had technologies and business plans. Amazon's founder Jeff Bezos was previously a "quant geek," which is a business analyst specializing in mathematics and logistics. These companies had real business goals, which is why they're still around. But was all about building a brand.

The brand-building idea is all about casino thinking. The theory is, the minute you have a brand, you can cash out. This casino attitude is good for VCs but it's ruinous for programmers, and it's ruinous for normal investors as well. (And it isn't even really that good for VCs.)

Startups are a great place to meet smart people and a great place to work hard, but the reason so many startups fail is because nine out of ten startups are ridiculously bad businesses with no real foundation at all. The alternative is Getting Real, and if you haven't memorized every word of that book already, you should. Working constantly like that doesn't add up to success. More often it adds up to burnout.

Saturday, April 7, 2007

A Little Birdie Told Me

Watch for a very interesting Avi Bryant interview coming soon!

I don't think I can say where...

Interesting Tangent In Wired Article

From this article:

Think about how Google works. When you type in a term, the search engine puts the site with the most links pointing toward it at the top of the list. That means bloggers and discussion boards are extremely powerful in influencing Google's search results, because bloggers and discussion-board posters are promiscuous linkers, constantly pointing to things they love or hate.
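The picture the article paints — rank by inbound links — can be sketched in a few lines. This is a toy link-counter, not Google's actual algorithm (real PageRank weights links recursively by the rank of the linking page); all the page names here are invented:

```ruby
# Toy sketch of the simplified model from the article: pages ranked
# purely by inbound-link count. Hypothetical link graph, mapping each
# page to the pages it links to.
links = {
  "blog-a"  => ["hot-meme", "old-classic"],
  "blog-b"  => ["hot-meme"],
  "forum-c" => ["hot-meme"],
}

# Tally inbound links per target page.
inbound = Hash.new(0)
links.each_value { |targets| targets.each { |t| inbound[t] += 1 } }

# Sort pages by inbound count, most-linked first.
ranked = inbound.sort_by { |_page, count| -count }.map(&:first)
puts ranked.first  # => "hot-meme" -- promiscuous linkers dominate
```

The point falls straight out of the sketch: whatever bloggers link to most this week floats to the top.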

Doesn't this necessarily mean that Google is constrained to the zeitgeist of the moment? I've been wondering about this for a while and this kind of nails it. If blogging and twittering eventually dominate Internet traffic, how valuable will Google remain? In some ways the Internet is like a global nervous system; are we building something which thinks and reflects, or which twitches reflexively?

(I'm feeling very deep today.)

Friday, April 6, 2007

Awesome DVD

If you're interested in the general topic of mastery and excellence, of how people who do much, much better than everybody else actually do it, you should definitely check out this DVD.

(Photo by JP Thompson.)

The guy who put it together is an aikido master who's written several books on mastery. I've only read one, but the one I read was excellent. The video features interviews with BB King, Carlos Santana, Linus Torvalds, and Ivana Chubbuck (the acting teacher who taught Brad Pitt, Charlize Theron, Halle Berry, Elisabeth Shue, Jim Carrey, and numerous others).

Where else do you get to see BB King and Linus Torvalds talking about the same thing?

It's awesome.

Use ActiveRecord To Migrate Legacy DBs

I'm pretty sure you can find this in Rails Recipes, but for what it's worth, here's the short version. Say for the sake of argument that you have a legacy app and you need to convert the tables into a format Rails will work with easily and nicely. You can in fact wire ActiveRecord up in such a way that it'll work with any set of tables and indices, but again, just for the sake of argument.

What you do is write a small Ruby app which uses ActiveRecord, and set up two database connections within that app. One's a standard connection; the other is set up to use the legacy database (and I know for a fact that this part is in Rails Recipes). Then all you do is create a bunch of objects from the crufty legacy DB, and a bunch of shiny new Rails-y models, and you populate the models with data from the crufty tables. Then you flip the switch, and test all the new models in the new DB to ensure they match your expectations (which are of course derived from the legacy DB).

It's easy, it's clean, it's quick, and the difference between the two relational models is essentially documented automatically with this code. Very highly recommended technique.

Thursday, April 5, 2007

Attention Lifehackers

Anybody got a good way to avoid this problem?

I need info in my e-mail. I go into my e-mail, Gmail defaults to showing me the inbox, so I see new stuff and respond to it, or read it, and get distracted enough that I forget what I even logged in to do, until I log out, think, "What was I doing again?" and realize that I needed to check my mail, and log back in.

At which point, in instances of true stupidity, the whole cycle begins again.

I think maybe I should turn off Google Notifier and instead have all new e-mail go into a folder called "New But Not Necessarily Worthy Of Inbox Status." Or probably some shorter thing. The real problem, though, isn't "inbox status," it's the fact that Gmail defaults to an inbox view instead of a search view.

It's funny, actually, because they got simple UI so right on the main site, but they screwed it up so badly on Gmail. Gmail has amazing strengths, but the user interface is so absurdly overkill that the system I use to search my mail often distracts me from searching it. Can we say "lost productivity"? Jesus.

Why You Have To Work As A Consultant

Pat Maddox says you have to work for a startup. He's wrong.

Actually, he's right. You do have to work for a startup, at least once. But he's also wrong. You have to do other things as well. One of those things is working for a consulting firm, or going into business as a consultant yourself, or both.

You will learn how vastly different beautiful code is from useful software.

You will learn how much more useful listening to your users is than visualizing something cool.

You will discover the compelling business reasons for agile software development, and you will do it, at least once, by having to dig yourself out of a pit you created by failing to understand what agile really meant.

You will come to see business processes as programs run not within computers but by groups of people.

You will have time off on the weekends and you'll be able to have a life.

"Great Hackers" And Great Actors

Recently I've been taking acting classes, both for reasons totally unrelated to programming and for reasons related to it as well. I've also spent some time talking to actors about how they get work, and the answers seem utterly crazy. Apparently what you do is you make a resume, put your picture on it, mail it to a billion advertised openings, and hope you get lucky. Then you get an agent, which is basically a recruiter, hope that your agent hears about openings you didn't hear about, and hope further that they bother to tell you, or indeed even remember your name, and if they do, you mail your picture and your resume to a new set of unadvertised openings, and again hope that you get lucky.

Now this is, of course, madness. One side effect of this madness is that, since the process depends heavily on resumes, tons and tons of short films are created every year which nobody outside Los Angeles will ever even hear about. These things cost thousands of dollars and with the exception of a very small number that make it to film festivals or to Europe -- where short films still actually get seen by audiences from time to time -- they are not intended ever to be seen. They are made so that directors and actors with nothing on their resume can now have something on their resume. There are legions of people in the city of Los Angeles doing nothing but padding their resumes, and spending thousands of dollars to do it!

You may be wondering what this has to do with my usual topic of programming. Well, as a programming blogger, I am legally obligated to meet my quota of Paul Graham references each month, so here we go. There's a great essay called Great Hackers which has a great deal to say about this kind of thing. One of its most interesting ideas is that great hackers avoid frustrating work because it blunts their edge:

The distinguishing feature of nasty little problems is that you don't learn anything from them. Writing a compiler is interesting because it teaches you what a compiler is. But writing an interface to a buggy piece of software doesn't teach you anything, because the bugs are random. So it's not just fastidiousness that makes good hackers avoid nasty little problems. It's more a question of self-preservation. Working on nasty little problems makes you stupid. Good hackers avoid it for the same reason models avoid cheeseburgers.

You have to wonder, if these words are true, wouldn't they apply in other contexts as well? For instance, Paul Graham wrote a book called Hackers And Painters. The idea there is that being a programmer and an artist, as Paul Graham is (and I am), is not such an odd combination. The implication is that programming is to some degree an art rather than a science, or at least, that programming is more about being creative than about being correct. The logic and accuracy of a programmer are like the good looks of an actor. Although they are necessary to success, they are not actually what the work is about, and if that were your only asset, you'd have a limited career.

But if working on bad code wears down your programming skill, what does playing bad roles do to your acting skill? If Paul Graham's onto something here, then surely all these people out there playing little roles in little "movies" are not actually doing themselves any favors. Because the thing is, aspiring actors work on these things, and aspiring directors work on them too, but aspiring screenwriters ignore them. In fact, I never even knew that these things existed until I started acting classes in LA! I wrote my first feature-length script over ten years ago (it was awful, but I wrote it anyway). I've been reading books on screenwriting for at least a decade. I had never even heard of short films. And if aspiring screenwriters aren't working on these things, you could have great directors and great actors and still have shit movies, because the characters won't be there and the plot won't be there either.

Of course the first question is not, "what else would it mean if Paul Graham's words were true?" The first question is, "are the words true or not?" And I would actually say that the words are false. I have worked for hip, exciting startups, and I have worked for dull, lifeless corporations, and I can tell you that the problems I was working on had no effect on my programming skill at all.

What has made the difference, consistently, every time, is whether or not I was programming at home after work because I wanted to. That's what it's really about. If you're doing it because you want to do it, you'll do it well. If you're not, you won't. Your job can get in the way of programming well or it can facilitate programming well, but in neither case does it have any real power in the matter.

That probably applies to these short films as well.

Tuesday, April 3, 2007

Tiny APIs

I've blogged before about building mini-APIs into Rails. I'm not the only one -- Jamis Buck wrote a post praising Marcel Molina, Jr. for his talent at doing the exact same thing. I think this is a useful technique which Ruby programmers need to use more often.

I think the reason you don't see people using mini-APIs as often as they should is that people don't really understand why they're a good idea, or where to use them.

Mini-APIs come midway between DSLs and refactoring. Refactoring is often used as a synonym for debugging, but refactoring is actually the opposite of debugging. Instead of changing the code to get it to work properly, you're changing the code without changing the functionality at all. Any time you find something repetitive or inelegant, you tidy it up a bit. Just a tiny, tiny bit.

The end result is a cleaner design, but refactoring isn't about redesign. It's about tidying up the code and getting a cleaner design as a side effect. It's actually TDD in reverse. With both refactoring and TDD, your goal is to produce a clean design, but the way you get there is not by designing anything, but rather doing lots of small things which, in the long run, if done consistently, will result in a good design without any explicit designing ever actually taking place. It's a very Zen idea.

DSLs, of course, are all about creating new mini-languages which you or your client programmers can use to write code not in Ruby or in Rails but in the mini-language. Your goal is to represent the problem space so succinctly that a business user can code in terms of business rules without ever needing to learn a fully-fledged language. Rails is a great example of this, and yet all the DSL features of Rails emerged organically from DHH's desire to eliminate repetition -- which is to say, the DSL aspect of Rails is a result not of deliberate design but of consistent and diligent refactoring.

(By the way, I'm sorry I don't have a link to back this up, but I'm absolutely certain of it. I've read a gazillion blogs and listened to every podcast DHH was ever on, and the proof of this statement is definitely out there somewhere. I just don't have the time to track it down at the moment.)

Anyway, if DSLs are really just the product of very extensive refactoring, and mini-APIs sit midway between refactoring and DSLs, what I'm saying is that mini-APIs emerge from streamlining repetitive code. And that's it. That's exactly what happens. If you want to get to a DSL, the way to do it is not to cook up a DSL right away. You start with messy code and you chip away at it until it takes the shape of a DSL -- like a sculptor shaping marble. You'll know you're halfway there when you've got a mini-API.
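Here's a hypothetical sketch of that progression — every name in it is invented for illustration. You start with the same guard logic pasted into every attribute writer, chip away at the repetition, and what emerges is a tiny class-level macro: a mini-API, halfway to a DSL.

```ruby
# Before: the same boilerplate pasted into every writer --
#
#   def price=(v)
#     raise ArgumentError, "price must be positive" unless v.to_i > 0
#     @price = v.to_i
#   end
#   def quantity=(v)  # ...identical guard, different name...
#
# After: the repetition chipped away into a one-line macro.
module PositiveAttrs
  def positive_attr(*names)
    names.each do |name|
      # Generate the guarded writer that used to be copy-pasted.
      define_method("#{name}=") do |v|
        raise ArgumentError, "#{name} must be positive" unless v.to_i > 0
        instance_variable_set("@#{name}", v.to_i)
      end
      attr_reader name
    end
  end
end

class LineItem
  extend PositiveAttrs
  positive_attr :price, :quantity  # reads almost like a DSL already
end

item = LineItem.new
item.price = 5
item.quantity = 2
puts item.price * item.quantity  # => 10
```

Nothing here was designed as a DSL up front; the declarative `positive_attr` line is just what the code looks like once the duplication is gone — which is the halfway point the paragraph above describes.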

I think this is why you don't see people using mini-APIs enough. You often see programmers who want to write a big fancy DSL, because that's the hip new thing, and you often see programmers who'll write the same code twice because they're in a hurry. What's rare is the programmer who'll go back and rewrite repetitive code to make it slim and elegant, and what's very, very rare is the programmer who's going to rewrite the same piece of code until it's as elegant as it can possibly be.

Unfortunately this is exactly the type of programming which results in powerful frameworks like Rails! It is very literally exactly the programming method which gave the world Rails in the first place.

DHH has tons of fans. If you're a DHH fan, think about what I'm about to say, and act on it. Nearly every one of DHH's fans is writing code like the code DHH wrote. Very, very few of DHH's fans are writing code the way DHH writes it. But the difference between doing something just like something somebody else did and doing something the way somebody else did it is the difference between being an imitator and being a student.

Don't imitate. Study.