The Media Monsters in the National Dialog

Probably the most pressing question in the world right now—that nobody, much to their detriment, is asking—is this: How do you spell dialog?

There are two ways—just like that, or with the silent, French-derived -ue termination. British writers, being posh, favour the extra flavour of dialogue. The blunter dialog, meanwhile, is more common among heretical, phonetical Americans. WIRED’s style is dialog.

But there’s more to it. Many grammarians say the distinction is less regional than situational: One may have a dialogue with fellow human beings, but if you’re talking to a machine, or machines are talking amongst themselves, what’s happening is merely a dialog. Seriously.

A quirk of recent technological history, the shortened spelling coincides with the rise, in the 1980s, of personal computers, which ask questions of you—like whether you want to save the changes in this document (always yes)—in pop-up windows called, of course, dialog boxes. Programmers, with their penchant for optimization and elegance, must have enforced the truncation. As time went on, dialog came to be used to refer to machine-mediated data exchanges of all kinds. You still see the stray dialogue box now and again, particularly in British English, but dialog outnumbers dialogue in technical contexts by at least an order of magnitude. Dialoging with computers, it’s safe to say, is like dialoging with Americans: simple, direct, de-Frenched.
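For readers who have only ever clicked through one, here is a minimal sketch, using Python's standard tkinter toolkit (one arbitrary choice among many), of the save-changes prompt that cemented the term dialog box; the wording of the prompt is invented for illustration.

import tkinter as tk
from tkinter import messagebox

# Hide the main application window; we only want the pop-up dialog itself.
root = tk.Tk()
root.withdraw()

# askyesnocancel returns True (save), False (don't save), or None (cancel).
answer = messagebox.askyesnocancel(
    title="Save changes?",
    message="Do you want to save the changes in this document?",
)
print("Always yes:", answer)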

That’s the other issue with the word: It is now very much in vogue—which not even a coder would spell vog—to force dialog into the shape of a verb. This person dialoged with, that person dialoged about, and so on. Here we may assign guilt not to programmers but to corporate Americans, who’ve never met an ugly word they can’t further uglify. In a 2009 survey, the usage panel of the American Heritage Dictionary reviewed the following sentence: The department was remiss in not trying to dialogue with representatives of the community before hiring new officers. The construction nauseated four in five panelists, some of whom “felt moved,” the dictionary reported, “to comment on the ugliness or awkwardness.” In conclusion, it’s not easy to dialog about dialog.

Or about anything at all, for that matter, here on the edge of history. The chief crisis of the age, as everyone from lexicographers to tech luminaries will tell you, is that we don’t know how to talk to one another anymore. People can’t have conversations about the weather, much less the daily news cycle, without one or both parties self-combusting in a fit of apoplexy and/or trying to get the other fired. As my mother, a lifelong banker, likes to put it: “We’re afraid to dialog.”

That’s not her fault, or the fault of her fellow corporate Americans. Nor is it the fault of computer programmers. No, those on whom we place the blame for the mass outbreak of dialogophobia in the modern era are, so much of the time, the professional logorrheics themselves: journalists.

When people speak of—dialog about—the media these days, the tone tends to turn conspiratorial. There’s an implied big M, “the Media,” like the way Morpheus says “the Matrix”: with reverence, swallowed by revulsion. By way of proof, consider two statements. First statement: I love the media. Said no one, ever. Second statement: I hate the media. Said you, just yesterday.

But what is the media? It may help to picture it in your mind. Not your tiny piece of it, the pinky toe of the monstrosity. Let the entire entity, the thing itself, take shape. Because that’s what the term points to, when it’s used: a thing, out there, looming. Storm-cloud huge, to be sure, and just as unpindownable, but a thing nonetheless.

So what’s it look like? An army of ragebots, screaming down the face of Mt. Twitter? A late-stage hydra of several million talking heads? (To think: Hannity and Maddow sharing a body.) A mass of pale, quivering flesh, as if a scaled-up version of the many-breasted chicken blobs of Atwood’s Oryx and Crake escaped the lab to terrorize coastal metropolises? The media, to mangle Marshall McLuhan, is the monster.

“We’re all media critics now,” as Jon Baskin, a magazine editor turned media critic, recently said. It’s true, but at a time when one person’s objectivity is another person’s oppression, each of us sees their own media monster. President Trump’s must be particularly terrible, though he attempts to neuter it, casting his white whale as more of a white fish. CNN “sucks,” The New York Times is “failing,” you know the drill. He’s never been alone in this effort, and his latest shipmates are venture capitalists. One of them, Balaji S. Srinivasan, wants his quarter of a million Twitter followers to “ghost” the Times for reasons that echo Trump’s: The mainstream media militates against the free exchange of ideas. Better to “go direct,” Srinivasan said, “if you have something to say.” Tweet more, in other words. (Or, if you’re rich, join him on Clubhouse, the invite-only app for open dialog.)

Of course, the most prolific tweeters in America, per Pew, skew young, liberal, and female, and many of them are journalists. To impugn the media is, in many ways, to impugn this demographic. Sure enough, one of the main targets of Srinivasan’s ire is Taylor Lorenz, a woman who reports, and tweets about, internet trends for the Times. Perhaps what Srinivasan and his kind see when they picture the media is a millennial Medusa figure, all hissing snakes and distaff rage. Don’t look her in the eye—or take her requests for comment.

People have glimpsed monsters in the media since the beginning, but what distinguishes so many of the modern manifestations is their relative youth. Contra McLuhan’s successor, Neil Postman, the digital revolution didn’t obliterate youth culture; it entrenched and empowered it. “For the first time in history,” writes media theorist Kate Eichhorn, “children and adolescents have widespread access to the technologies needed to represent their lives, circulate these representations, and forge networks with each other, often with little or no adult supervision.” Some of them go on to join the ranks of journalism, where, even at the lowest of those ranks, they can remake the media in their image. They decide how to promote stories on every social platform. They choose which pieces get blasted out in email newsletters. They design, produce, and populate the web pages. That job—of communicating the relative importance of stories, of telling readers and listeners what to prioritize—used to be done by the most experienced journalists in the newsroom. It’s still done by them, but mostly in the context of the print editions and nightly broadcasts. Digital is for early-career youngs; the olds prefer the “prestige” of analog.

Analog. Now there’s a funny word. In fact, you might well be wondering: How do you spell it?

Analog can be spelled one of two ways—just like that, or with the silent, French-derived -ue termination. As with dialogue, analogue predominates outside the US, while Americans are more inclined to lop off the excess vowels. But there’s more to it.

At WIRED, as at many other publications, analogue is the noun, typically meaning the thing that is analogous to something else. When a high-up editor at a newspaper decides which story to run above the fold on tomorrow’s front page, that’s a historical analogue to a social media manager pinning a tweet to that news outlet’s timeline. (Some analogues are more metaphorical, like our media Medusa.) Analog, meanwhile, is the adjective, typically meaning the opposite of digital. A newspaper might be called the analog form of its website. This means that, yes, historical analogues are frequently analog.

Those pesky techies again. The shorter spelling began to take off in the 1940s, as a way to distinguish between an old era of electronics and a new one just emerging. “The analog devices use some sort of analogue or analogy,” wrote John Mauchly, as careful a wordsmith as he was a physicist, in 1941; a few years later, he and his friends would unveil the ENIAC, the first general-purpose digital computer and the next phase of evolution. Today, you’re most likely to encounter analog technology—as a phrase roughly three times more common than analogue technology—in the homes of older folks and hipster-adjacent retromaniacs.

But even there you won’t find an analog computer. The concept itself may strike some as something of an oxymoron. How could a thing so synonymous with the digital revolution, the computer, ever have been analog? Yet it was, for longer than it’s been digital. Before transistors began multiplying by the Moore-ish million millions, analog computers were making their best guesses based on noisy, changeable, physical properties—hydraulic, electrical, whatever. Nothing was certain; everything was close enough. Sounds messy, but it’s far more sophisticated than we moderns, safe and secure in our silicon, think.

In his new book, Analogia, the idiosyncratic tech historian George Dyson finds analogues everywhere he looks. In the early 1700s, he writes, Leibniz observed that black and white marbles running along multiple tracks “could both encode and logically manipulate concepts of arbitrary complexity in unambiguous terms,” in essence envisioning digital computing two-plus centuries before the first binary digits became bits. There’s nothing novel or special or Darwinian about digital, in other words. If anything, it simplifies analog, strips and dumbs it down, reduces complexity to on/off switches. Another analogue by Dyson: Digital is to speech what analog is to telepathy.

Now look at nature. DNA may be a kind of digital coding, but our brains don’t process discrete signals. They process everything at once, as an analog computer would. “Incorporating both the countable and the uncountable,” Dyson writes, “nature uses digital computing for generation-to-generation information storage, combinatorics, and error correction but relies on analog computing for real-time intelligence and control.”

Which kind of computing describes the world right now? A world in which the default mode is error correction, to the exclusion of real-time intelligence? Where conversations in the media and beyond have collapsed to yes or no, black or white, 1s and 0s—are in every sense machine-readable? Where the discrete mathematics of things said overpower the continuous mathematics of lives lived? The ultimate analogue for digital computing is us. When we are quick to judge, scared of mistakes, loath to deviate, we speak as machines do, in simplified dialogs. Without silence, without vowels, without French.

Dialogues are the analog analogues of dialogs.

Dyson doesn’t say any of this, but he does say analog computing will be reborn. It’s only natural. Life can’t survive on simplicity, on either/or. “The ghosts of the continuum will soon return,” as he puts it. Machines, like their putative masters, must be free to mess shit up.

Now let us, by way of an epilogue to this monologue, play the pedagogue with a three-part catalog: (1) Have fewer dialogs and more dialogues. (2) Let the analog of the past serve as an analogue for our future. (3) We live in a time of plague, but no human would ever spell it plag.
