Sunday, February 21, 2016

How Computers Feel Different Now

I learned how to program a computer on a TRS-80, in BASIC. I was six years old. At the time, "computers" meant things like the TRS-80. Today, your phone is a computer, your TV's a computer, your car's made of computers, and, if you want, your frying pan can be a computer.

But it's not just that everything's a computer now; it's also that everything's on a network. Software isn't eating the world just because of Moore's Law, but also because of Metcalfe's Law. In practice, "software is eating the world" means software is transforming the world. It might seem natural to assume that software, as it transforms the world, must be making the world more organized in the process.

But if Moore's Law is Athena, a pure force of reason, Metcalfe's Law is Eris, a pure force of chaos. Firstly, consider the fallacies of distributed computing:
  • The network is reliable.
  • Latency is zero.
  • Bandwidth is infinite.
  • The network is secure.
  • Topology doesn't change.
  • There is one administrator.
  • Transport cost is zero.
  • The network is homogeneous.
The first and the last — "The network is reliable" and "The network is homogeneous" — are basically equivalent to saying "chaos reigns supreme." No part of the network is like any other part, because the network is not homogeneous (and the topology is ever-changing), and things don't always happen the way they happened before, because the network isn't always there. So chaos reigns over both space (the non-homogeneous network) and time (the ever-changing network which is only sometimes there).
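
To make that concrete, here's a minimal Python sketch of what it costs to stop believing "the network is reliable." The URL and retry counts are made up for illustration; the point is just that the same call, with the same arguments, can succeed, time out, or fail outright, and you have to write code for all three outcomes.

import socket
import urllib.error
import urllib.request

def fetch_with_retries(url, attempts=3, timeout=2.0):
    # Fetch a URL while assuming the network might not be there.
    # Which outcome you get -- data, a timeout, or an error -- depends on
    # a network whose topology and reliability you don't control.
    last_error = None
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                return response.read()
        except (urllib.error.URLError, socket.timeout) as error:
            last_error = error  # slow, absent, or just different from last time
    raise RuntimeError("gave up after %d attempts" % attempts) from last_error

# Hypothetical usage; the outcome depends on the network, not the code:
# page = fetch_with_retries("https://example.com/some-resource")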

Chaos also reigns in a social sense: the network isn't secure, and there are many administrators. So if Moore's Law makes everything it touches more automatic and organized, Metcalfe's Law makes everything it touches less reliable and more unpredictable. An unspoken assumption you see everywhere is that "software is eating the world" means the world is becoming more organized along the way. But since networking is now implicit in the definition of software, every time software makes the world more organized, it brings networking along with it, and networking makes everything more chaotic.

Everything that software eats becomes newly organized and newly chaotic. Because a new form of organization replaces an old form of organization while a new form of chaos replaces an old form of chaos, it's impossible to really determine whether software, when it eats the world, makes it more organized or more chaotic. The net effect is impossible to measure. You might as well assume that they balance perfectly, and that Moore's Law and Metcalfe's Law are yin and yang.

But the thing is, when personal computers were a new idea, they emanated order. You typed explicit commands; if you got the command perfectly right, you got what you wanted, and it was the same thing, every time. They didn't have the delays that you get when you communicate with a database, let alone another computer on an unreliable and sometimes absent network. They didn't even have the conceptual ambiguity that comes with exploring a GUI for the first time.

Even the video games back then were mostly deterministic. That determinism is why big design up front looks so insane to developers today, but made sense to smart people at the time. During WWII, the cryptographers who developed computing itself were mathematicians who based everything about computing on rock-solid, Newtonian certainties. You did big design up front because everything was made of logic, and logic is eternal.

This is no longer the case, and this will never be the case again. And this is what feels different about computers in 2016. A few decades ago, "non-deterministic computer" was a contradiction in terms. Today, "non-deterministic computer" is a perfect definition for your iPhone. Everything it does depends on the network — which may or may not exist, at any given time — and you can only use it by figuring out a graphical, haptic interface which might be completely different tomorrow.

Every Netflix client I have operates like a non-deterministic computer. Here's a very "old man yells at cloud" rant. This happened. I go on Netflix, and I start watching a show. There's some weird network glitch or something, and my Apple TV restarts. I go on Netflix a second time, and I go to "previously watched," but the Apple TV didn't tell the network in time, so Netflix doesn't know I was watching this show. So I manually search for it, and when I hit the button to watch it, Netflix offers me the option of resuming playback where I was before. So it knows I was watching it, now.

Basically, the computer that cached the list of previously watched shows didn't know what the computer that cached the playback positions knew.
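
Here's a toy Python sketch of that situation. Every name in it is invented, and it only models the shape of the problem: two caches fed by the same viewing session, with no guarantee that both hear about it.

# Toy model: two caches learn about the same viewing session at different
# times. Everything here is invented for illustration; the real system is
# obviously far more complicated.

recently_watched = set()     # feeds the "previously watched" row
playback_positions = {}      # feeds the "resume playback?" prompt

def report_playback(show, seconds, recently_watched_reachable=True):
    # One event, two caches, and no guarantee that both get the update.
    playback_positions[show] = seconds
    if recently_watched_reachable:
        recently_watched.add(show)

# The Apple TV restarts before the "previously watched" update gets through:
report_playback("Some Show", seconds=1325, recently_watched_reachable=False)

print("Some Show" in recently_watched)       # False: "you never watched this"
print(playback_positions.get("Some Show"))   # 1325: "resume where you left off?"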

A few decades ago, it was impossible for a computer to have this problem, where the right hand doesn't know what the left hand is doing. Today, it's inherent to computers. And this has long-term consequences which are subtle but deep. Kids who see chaos as an intrinsic element of computing from the moment they're old enough to watch cartoons on Netflix are not going to build the same utopian fantasies that you get from Baby Boomers like Ray Kurzweil. My opinion of transhumanists is that they formed an unbreakable association between order and computers back when networks weren't part of the picture, and they never updated their worldview to integrate the fundamental anarchy of networks.

I don't want to "old man yells at cloud" too much here. That's where you get these annoying rants where people think the iPad is going to prevent kids from ever discovering programming, as if Minecraft were not programming. And I'm already telling you that the kids see the world a different way, like I'm Morley Winograd, millennial expert. But there's a deep and fundamental generation gap here. Software used to mean order, and now it just means a specific blend of order and chaos.