A Coder Considers the Waning Days of the Craft
Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it.
By James Somers
November 13, 2023
Artificial intelligence still can’t beat a human when it comes to programming. But it’s only a matter of time. Illustration by Dev Valladares
I have always taken it for granted that, just as my parents made sure that I could read and write, I would make sure that my kids could program computers. It is among the newer arts but also among the most essential, and ever more so by the day, encompassing everything from filmmaking to physics. Fluency with code would round out my children’s literacy—and keep them employable. But as I write this my wife is pregnant with our first child, due in about three weeks. I code professionally, but, by the time that child can type, coding as a valuable skill might have faded from the world.
I first began to believe this on a Friday morning this past summer, while working on a small hobby project. A few months back, my friend Ben and I had resolved to create a Times-style crossword puzzle entirely by computer. In 2018, we’d made a Saturday puzzle with the help of software and were surprised by how little we contributed—just applying our taste here and there. Now we would attempt to build a crossword-making program that didn’t require a human touch.
When we’ve taken on projects like this in the past, they’ve had both a hardware component and a software component, with Ben’s strengths running toward the former. We once made a neon sign that would glow when the subway was approaching the stop near our apartments. Ben bent the glass and wired up the transformer’s circuit board. I wrote code to process the transit data. Ben has some professional coding experience of his own, but it was brief, shallow, and now about twenty years out of date; the serious coding was left to me. For the new crossword project, though, Ben had introduced a third party. He’d signed up for a ChatGPT Plus subscription and was using GPT-4 as a coding assistant.
Something strange started happening. Ben and I would talk about a bit of software we wanted for the project. Then, a shockingly short time later, Ben would deliver it himself. At one point, we wanted a command that would print a hundred random lines from a dictionary file. I thought about the problem for a few minutes, and, when thinking failed, tried Googling. I made some false starts using what I could gather, and while I did my thing—programming—Ben told GPT-4 what he wanted and got code that ran perfectly.
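The code GPT-4 actually produced isn’t shown here, but the task itself is small. A minimal sketch in Python, assuming the dictionary is a plain text file with one entry per line, might look something like this:

import random
import sys

def random_lines(path, n=100):
    # Read the dictionary file (one entry per line), skipping blank lines.
    with open(path) as f:
        lines = [line.rstrip("\n") for line in f if line.strip()]
    # Sample without replacement; if the file has fewer than n lines, return them all.
    return random.sample(lines, min(n, len(lines)))

if __name__ == "__main__":
    # Example usage (the path is illustrative): python random_lines.py /usr/share/dict/words
    for line in random_lines(sys.argv[1]):
        print(line)

A one-line shell command could do the same job; the point is less the code than how quickly it appeared.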
Fine: commands like those are notoriously fussy, and everybody looks them up anyway. It’s not real programming. A few days later, Ben talked about how it would be nice to have an iPhone app to rate words from the dictionary. But he had no idea what a pain it is to make an iPhone app. I’d tried a few times and never got beyond something that half worked. I found Apple’s programming environment forbidding. You had to learn not just a new language but a new program for editing and running code; you had to learn a zoo of “U.I. components” and all the complicated ways of stitching them together; and, finally, you had to figure out how to package the app. The mountain of new things to learn never seemed worth it. The next morning, I woke up to an app in my in-box that did exactly what Ben had said he wanted. It worked perfectly, and even had a cute design. Ben said that he’d made it in a few hours. GPT-4 had done most of the heavy lifting.
By now, most people have had experiences with A.I. Not everyone has been impressed. Ben recently said, “I didn’t start really respecting it until I started having it write code for me.” I suspect that non-programmers who are skeptical by nature, and who have seen ChatGPT turn out wooden prose or bogus facts, are still underestimating what’s happening.
Bodies of knowledge and skills that have traditionally taken lifetimes to master are being swallowed at a gulp. Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it. I keep thinking of Lee Sedol. Sedol was one of the world’s best Go players, and a national hero in South Korea, but is now best known for losing, in 2016, to a computer program called AlphaGo. Sedol had walked into the competition believing that he would easily defeat the A.I. By the end of the days-long match, he was proud of having eked out a single game. As it became clear that he was going to lose, Sedol said, in a press conference, “I want to apologize for being so powerless.” He retired three years later. Sedol seemed weighed down by a question that has started to feel familiar, and urgent: What will become of this thing I’ve given so much of my life to?
My first enchantment with computers came when I was about six years old, in Montreal in the early nineties, playing Mortal Kombat with my oldest brother. He told me about some “fatalities”—gruesome, witty ways of killing your opponent. Neither of us knew how to inflict them. He dialled up an FTP server (where files were stored) in an MS-DOS terminal and typed obscure commands. Soon, he had printed out a page of codes—instructions for every fatality in the game. We went back to the basement and exploded each other’s heads.
I thought that my brother was a hacker. Like many programmers, I dreamed of breaking into and controlling remote systems. The point wasn’t to cause mayhem—it was to find hidden places and learn hidden things. “My crime is that of curiosity,” goes “The Hacker’s Manifesto,” written in 1986 by Loyd Blankenship. My favorite scene from the 1995 movie “Hackers” is when Dade Murphy, a newcomer, proves himself at an underground club. Someone starts pulling a rainbow of computer books out of a backpack, and Dade recognizes each one from the cover: the green book on international Unix environments; the red one on N.S.A.-trusted networks; the one with the pink-shirted guy on I.B.M. PCs. Dade puts his expertise to use when he turns on the sprinkler system at school, and helps right the ballast of an oil tanker—all by tap-tapping away at a keyboard. The lesson was that knowledge is power.
But how do you actually learn to hack? My family had settled in New Jersey by the time I was in fifth grade, and when I was in high school I went to the Borders bookstore in the Short Hills mall and bought “Beginning Visual C++,” by Ivor Horton. It ran to twelve hundred pages—my first grimoire. Like many tutorials, it was easy at first and then, suddenly, it wasn’t. Medieval students called the moment at which casual learners fail the pons asinorum, or “bridge of asses.” The term was inspired by Proposition 5 of Euclid’s Elements I, the first truly difficult idea in the book. Those who crossed the bridge would go on to master geometry; those who didn’t would remain dabblers. Section 4.3 of “Beginning Visual C++,” on “Dynamic Memory Allocation,” was my bridge of asses. I did not cross.
But neither did I drop the subject. I remember the moment things began to turn. I was on a long-haul flight, and I’d brought along a boxy black laptop and a CD-rom with the Borland C++ compiler. A compiler translates code you write into code that the machine can run; I had been struggling for days to get this one to work. By convention, every coder’s first program does nothing but generate the words “Hello, world.” When I tried to run my version, I just got angry error messages. Whenever I fixed one problem, another cropped up. I had read the “Harry Potter” books and felt as if I were in possession of a broom but had not yet learned the incantation to make it fly. Knowing what might be possible if I did, I kept at it with single-minded devotion. What I learned was that programming is not really about knowledge or skill but simply about patience, or maybe obsession. Programmers are people who can endure an endless parade of tedious obstacles. Imagine explaining to a simpleton how to assemble furniture over the phone, with no pictures, in a language you barely speak. Imagine, too, that the only response you ever get is that you’ve suggested an absurdity and the whole thing has gone awry. All the sweeter, then, when you manage to get something assembled. I have a distinct memory of lying on my stomach in the airplane aisle, and then hitting Enter one last time. I sat up. The computer, for once, had done what I’d told it to do. The words “Hello, world” appeared above my cursor, now in the computer’s own voice. It seemed as if an intelligence had woken up and introduced itself to me.
Most of us never became the kind of hackers depicted in “Hackers.” To “hack,” in the parlance of a programmer, is just to tinker—to express ingenuity through code. I never formally studied programming; I just kept messing around, making computers do helpful or delightful little things. In my freshman year of college, I knew that I’d be on the road during the third round of the 2006 Masters Tournament, when Tiger Woods was moving up the field, and I wanted to know what was happening in real time. So I made a program that scraped the leaderboard on pgatour.com and sent me a text message anytime he birdied or bogeyed. Later, after reading “Ulysses” in an English class, I wrote a program that pulled random sentences from the book, counted their syllables, and assembled haikus—a more primitive regurgitation of language than you’d get from a chatbot these days, but nonetheless capable, I thought, of real poetry:
I’ll flay him alive
Uncertainly he waited
Heavy of the past
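The original program isn’t reproduced here, but a rough sketch of the idea in Python, assuming a plain-text copy of the novel (the filename below is hypothetical) and a crude vowel-group syllable count, might run along these lines:

import random
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def line_syllables(line):
    return sum(count_syllables(w) for w in re.findall(r"[a-zA-Z']+", line))

def lines_with(sentences, target):
    return [s for s in sentences if line_syllables(s) == target]

def haiku_from_text(path):
    # Assumes a plain-text copy of the novel, e.g. "ulysses.txt".
    with open(path) as f:
        text = f.read()
    # Split into short, sentence-like fragments.
    sentences = [s.strip() for s in re.split(r"[.!?\n]+", text) if 0 < len(s.strip()) < 60]
    fives, sevens = lines_with(sentences, 5), lines_with(sentences, 7)
    if not fives or not sevens:
        raise ValueError("not enough candidate lines")
    # Assemble a 5-7-5 haiku from randomly chosen fragments.
    return [random.choice(fives), random.choice(sevens), random.choice(fives)]

if __name__ == "__main__":
    for line in haiku_from_text("ulysses.txt"):
        print(line)

The syllable heuristic miscounts plenty of words, which is partly why the results read as found poetry rather than strict haiku.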
I began taking coding seriously. I offered to do programming for a friend’s startup. The world of computing, I came to learn, is vast but organized almost geologically, as if deposited in layers. From the Web browser down to the transistor, each sub-area or system is built atop some other, older sub-area or system, the layers dense but legible. The more one digs, the more one develops what the race-car driver Jackie Stewart called “mechanical sympathy,” a sense for the machine’s strengths and limits, of what one could make it do.
At my friend’s company, I felt my mechanical sympathy developing. In my sophomore year, I was watching “Jeopardy!” with a friend when he suggested that I make a playable version of the show. I thought about it for a few hours before deciding, with much disappointment, that it was beyond me. But when the idea came up again, in my junior year, I could see a way through it. I now had a better sense of what one could do with the machine. I spent the next fourteen hours building the game. Within weeks, playing “Jimbo Jeopardy!” had become a regular activity among my friends. The experience was profound. I could understand why people poured their lives into craft: there is nothing quite like watching someone enjoy a thing you’ve made.
In the midst of all this, I had gone full “Paper Chase” and begun ignoring my grades. I worked voraciously, just not on my coursework. One night, I took over a half-dozen machines in a basement computer lab to run a program in parallel. I laid printouts full of numbers across the floor, thinking through a pathfinding algorithm. The cost was that I experienced for real that recurring nightmare in which you show up for a final exam knowing nothing of the material. (Mine was in Real Analysis, in the math department.) In 2009, during the most severe financial crisis in decades, I graduated with a 2.9 G.P.A.