When my wife started a little garden in our urban backyard, all I could think about were the worms. Also the bugs, and the dirt, which is of course filled with worms and bugs and composted corn cobs. But she was happy. She introduced me to many bees and enthused about borage, which is a flowering herb that bees like. We started to eat our own lettuce.
You're supposed to love nature, so I kept my mouth shut. But I find the whole idea of it genuinely horrifying. Part of the privilege of being a nerd is that you're able to forget you have a body: You cruise around cyberspace, get a beverage out of the fridge, cruise some more. In the natural world, bodies are inescapable. Everything keeps growing, and the growth feels like rot. There is hair everywhere. I did the math, and in the past 16.38 seconds humankind collectively added a mile of fingernails. That's how I see nature. I don't like dirt. I like devices.
But over time, you know, you get curious. You want to know what things are made of. It's the same urge that makes you send your saliva to some random company in order to learn that, after an entire lifetime of being told you're Irish, you're Irish. It's also why skeletons are cool. We like to look inside the thing.
So I learned some assembly language. Assembly is a method of programming that peels back almost all the layers of abstraction and gets you close to a computer's CPU. Instead of speaking in long, detailed Python (for example) statements, you're issuing tons of curt instructions: Move this bit over there. I have a broad definition of fun, but I found assembly to be none at all; it felt like using an angry calculator. To add two numbers, you have to tell the computer to reserve two places for the numbers, put them there, add them, and put the result somewhere else.
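The four-step dance described above can be sketched as a toy register machine in Python — a hypothetical mini-instruction set for illustration, not any real CPU's assembly:

```python
def run(program):
    """Execute a list of (op, dst, src) instructions against named registers."""
    regs = {}
    for op, dst, src in program:
        if op == "MOV":
            # MOV copies a literal number, or another register's value, into dst
            regs[dst] = regs[src] if isinstance(src, str) else src
        elif op == "ADD":
            # ADD folds src into dst: dst = dst + src
            regs[dst] = regs[dst] + regs[src]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return regs

# Adding 2 and 3 takes four curt instructions:
program = [
    ("MOV", "r1", 2),     # reserve a place, put the first number there
    ("MOV", "r2", 3),     # reserve another place for the second number
    ("ADD", "r1", "r2"),  # add them; the result lands in r1
    ("MOV", "r0", "r1"),  # put the result somewhere else
]
print(run(program)["r0"])  # -> 5
```

The point of the sketch is the tedium: nothing happens implicitly, and every value has to be explicitly moved into place before the machine will touch it.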
But as I read more about the physics of chips, I started to have a kind of acceptance of assembly language. I stopped seeing it as an annoying, unfinished abstraction—a bad programming language—and started seeing it for what it is: an interface to the physical world.
Billions of years ago, I learned, an evil witch, or perhaps God Themself, cursed the class of materials known as silicates, which are abundant on this planet, and made them neither insulators nor conductors but rather an eldritch horror known as semiconductors. Eventually, scientists realized that the dual nature of these materials could be exploited to turn them into tiny switches, visible only through a microscope. Put these little switches all together in a sequence, add a clock, and away you go. You know, something like that.
As I dug in further, I saw that beneath the orderly tower of abstraction there's just an arbitrary, multilayered mess of worms and corn cobs. Each microchip has its own history, its own way of mixing up physics, chemistry, math, and manufacturing. And once I started to internalize and accept that mess—to accept that the computer is a weird hack of reality—it all became kind of fun. This is how we turn dirt into apps that trade Bitcoin.
I've been trying, without much success, to accept climate science. I don't mean that I dispute it, any more than I dispute semiconductor physics. I have no problem believing that we've screwed up the world. I was raised in a chemical-manufacturing part of Pennsylvania, and sometimes people in moon suits would come to the door at 3 am and ask us to please drive somewhere upwind for a while. This meant we'd go to Denny's and have pancakes.
The problem I have is that “climate change” involves a large number of unbelievably boring things—all the pain of physics and chemistry, some biology to make it worse, statistics on top of that. Not enough fun? Add in economics. And there aren't so many nice abstractions. No animated paper clip pops up and says, “Looks like you're trying to incentivize wind turbines!” It's literally as interesting as watching ice melt, because climatologists do watch ice melt. (If the ice has bubbles, they study the gases inside. That's how they determine the paleoclimate.)
But one feels an ethical responsibility to try to understand the planetary CPU. My dumb magpie brain can't comprehend much of it, but I'm learning about ice bubbles, normal distributions, pluvial flooding (vs. fluvial), and, of course, wet-bulb temperature. This turns out to be a world of fun facts: One of the reasons sea level rises is that warm water is bigger. Scientists know how old dead trees are because they know how carbon isotopes decay. Thousands of hacks like that make up a discipline. And after a while you realize that science itself is just an API to nature, a bunch of kludges and observations that work well enough to get the job done. The job being measuring reality and predicting what will come next.
There's a very large piece of public art embedded in the tiles at the Bryant Park subway station in Manhattan. It's a granite-and-glass portrait of root systems and animal burrows by the artist Samm Kunce. Above it are these words, by the psychologist Carl Jung: “Nature must not win the game, but she cannot lose.” I went and looked up the full quote. It continues: “And whenever the conscious mind clings to hard and fast concepts and gets caught in its own rules and regulations—as is unavoidable and of the essence of civilized consciousness—nature pops up with her inescapable demands.”
Little rainstorms come many nights in the summer, more often than they used to. The cucumbers swell in the raised beds. The worms burrow up to the surface. My phone buzzes in my pocket, calling me to a place where the rusty lawn chair I'm sitting in doesn't exist and fingernails don't grow. The garden is indifferent to a lot of the abstractions I hold dear, but I'm learning to accept it. Pluvial flooding is flash floods; fluvial is when the river rises.
This article appears in the September issue.