wired.com
RISC Architecture Really Did Change Everything
Jason Kehe
Incredibly, Angelina Jolie called it. The year was 1995. Picture Jolie, short of both hair and acting experience, as a teenage hacker in Hackers. Not a lot of people saw this movie. Even fewer appreciated its relevance. Hackers was “grating,” Entertainment Weekly huffed at the time, for the way it embraced “the computer-kid-as-elite-rebel mystique currently being peddled by magazines like WIRED.” Thirty years later, Entertainment Weekly no longer publishes a magazine, WIRED does, and Hackers ranks among the foundational documents of the digital age. The last time I saw the movie, it was being projected onto the wall of a cool-kids bar down the street from my house.
But that’s not the incredible thing. The incredible thing, again, is that Jolie called it. It. The future. Midway through Hackers, she’s watching her crush (played by Jonny Lee Miller, whom she’d later marry in real life) type passionately on a next-gen laptop. “Has a killer refresh rate,” Miller says, breathing fast. Jolie replies: “P6 chip. Triple the speed of the Pentium.” Miller’s really worked up now. Then Jolie leans forward and, in that come-closer register soon to make her world-famous, says this: “RISC architecture is gonna change everything.”
You have to believe me when I say, one more time, that this is incredible. And what’s incredible is not just that the filmmakers knew what RISC architecture was. Or that Jolie pronounced it correctly (“risk”). Or even that Jolie’s character was right. What’s incredible is that she’s still right—arguably even more right—today. Because RISC architecture is, somehow, changing everything again, here in the 21st century. Who makes what. Who controls the future. The very soul of technology. Everything.
And nobody’s talking about it.
And that’s probably because the vast majority of people everywhere, who use tech built on it every single day, still don’t know what in the computer-geek hell a RISC architecture even is.
Unless you’re in computer-geek hell, as I am, right now. I’ve just arrived at the annual international RISC-V (that’s “risk five”) summit in Santa Clara, California. Here, people don’t just know what RISC is. They also know what, oh, vector extensions and AI accelerators and matrix engines are. At the coffee bar, I overhear one guy say to another: “This is a very technical conference. This is a very technical community.” To which the other guy replies: “It ought to be. It ought to be.”
OK, but where are the cool kids? It’s hard not to fixate on appearances at an event like this—a generic convention center, with generic coffee, in a generic town. I guess I was hoping for neon lights and pixie cuts. Instead it’s frumpy, forgettable menswear as far as the eye can see. There are 30 men for every woman, I count, as everyone gathers in the main hall for the morning presentations.
Then someone takes the stage, and she’s not just a she. She is Calista Redmond, the CEO of RISC-V International, and, Angelina Jolie be praised, she’s wearing a nifty jacket, a statement belt, and gold-and-silver … pumps? stilettos? Wait, what’s the difference? Of all the things to ask Redmond when I run into her at a happy hour later that day, that’s what I choose. She looks at me, smiles blankly, and just says, “I don’t know.”
In shame I retreat to the bar, where I decide I must redeem myself. So, cautiously, I make my way back to Redmond, who’s now deep in conversation with the chief marketing officer of a semiconductor startup. I try to impress them with a technical observation, something about RISC and AI. Redmond turns to me and says, “I thought you wanted to talk about shoes.” I assure her I’m not here to talk about what’s on the outside. I’m here to talk about what’s on the inside.
“Jason here is writing a story about RISC for WIRED,” Redmond tells the CMO. She’s not sure, frankly, that this is a great idea. Not because she isn’t a believer. In many ways, she’s the believer, the face of the brand. Attendees at the conference invoke her name with casual reverence: Calista says this, Calista thinks that. And did you hear her morning keynote? In fact I did. “We have fundamentally launched!” she announced, to the yelps of the business-casuals. RISC-V will transform, is transforming, machinery everywhere, she said, from cars to laptops to spaceships. If anyone doubts this, Redmond sends them the Hackers clip.
As CEO of RISC-V International, Calista Redmond moved the foundation’s headquarters from the US to Switzerland to allay members’ concerns about geopolitical neutrality.
Photograph: Jenna Garrett
So why, I press her now, should I not support the cause and write the big, cyberpunky, untold story of RISC? Because, Redmond says, not only does no one know what RISC is. No one cares what RISC is. And no one should. People don’t buy “this or that widget,” she says, because of what’s inside it. All they want to know is: Does the thing work, and can I afford it?
To my dismay, almost everyone I talk to at the conference agrees with Redmond. Executives, engineers, marketers, the people refilling the coffee: “Calista’s probably right,” they say. Now it’s my turn to get annoyed. I thought insides mattered! RISC is one of the great and ongoing stories of our time! People should care.
So I resolve to talk to the one person I think must agree with me, who has to be on my side: the legendary inventor of RISC itself.
The inner workings of a computer, David Patterson says, should be kept simple, stupid. We’re sitting in an engineering lab at UC Berkeley, and Patterson—77 years old, partial to no-frills athleisure—is scribbling on a whiteboard. A computer’s base operation, he explains, is the simplest of all: ADD. From there you can derive SUBTRACT. With LOAD and STORE, plus 30 or so other core functions, you have a complete basis for digital computation. Computer architects call this the “instruction set architecture,” or the ISA. (They switch between saying each letter, “I-S-A,” and—the neater option—pronouncing it as a word, “eye-suh.”)
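Patterson's point, that a few dozen simple operations form a complete basis for computation, can be sketched as a toy interpreter. This is purely illustrative: the opcode names and register scheme below are invented for the example, not any real ISA. Note that SUBTRACT never appears; it falls out of ADD and negation.

```python
# A toy "reduced" instruction set: a handful of simple operations is
# enough to build everything else. Opcodes and registers are invented
# for illustration, not taken from any real architecture.

def run(program, memory):
    """Execute a list of (op, *args) tuples against registers and memory."""
    regs = {}
    for op, *args in program:
        if op == "LOAD":      # LOAD rd, addr : copy memory into a register
            rd, addr = args
            regs[rd] = memory[addr]
        elif op == "STORE":   # STORE rs, addr : copy a register into memory
            rs, addr = args
            memory[addr] = regs[rs]
        elif op == "ADD":     # ADD rd, ra, rb : rd = ra + rb
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        elif op == "NEG":     # NEG rd, ra : rd = -ra
            rd, ra = args
            regs[rd] = -regs[ra]
        else:
            raise ValueError(f"unknown op {op}")
    return memory

# Compute 10 - 3 without a SUBTRACT instruction: a - b is ADD(a, NEG(b)).
memory = {0: 10, 1: 3, 2: None}
program = [
    ("LOAD", "r1", 0),
    ("LOAD", "r2", 1),
    ("NEG", "r2", "r2"),
    ("ADD", "r3", "r1", "r2"),
    ("STORE", "r3", 2),
]
run(program, memory)  # memory[2] is now 7
```

The same move scales up: multiplication is repeated addition, comparison is subtraction plus a sign check, and so on, which is why a vocabulary of 30-odd core functions suffices.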
Computer architectures are so named because, well, that’s exactly what they are—architectures not of bricks but of bits. The people who made Hackers plainly understood this. In sequences of dorky-awesome special effects, we fly through futuristic streets, look up at futuristic buildings, only to realize: This isn’t a city. This is a microchip.
Even within a chip, there are subarchitectures. First come the silicon atoms themselves, and on top of those go the transistors, the circuits and gates, the microprocessors, and so on. You’ll find the ISA at the highest layer of the hardware. It is, I think, the most profound architecture ever devised by humans, at any scale. It runs the CPU, the computer’s brain. It’s the precise point, in other words, at which dead, inert, hard silicon becomes, via a set of powerful animating conjurations, soft and malleable—alive.
Everyone has their own way of explaining it. The ISA is the bridge, or the interface, between the hardware and the software. Or it’s the blueprint. Or it’s the computer’s DNA. These are helpful enough, as is the common comparison of an ISA to a language. “You and I are using English,” as Redmond said to me at the conference. “That’s our ISA.” But it gets confusing. Software speaks in languages too—programming languages. That’s why Patterson prefers dictionary or vocabulary. The ISA is less a specific language, more a set of generally available words.
Back when Patterson started out, in the 1970s, the early ISAs were spinning out of control. Established tech companies figured that as hardware design improved and programming languages got more sophisticated, computers shouldn’t remain simple; they should be taught larger vocabularies, with longer words. The more types of operations they were capable of, the logic went, the more efficient their calculations would be.
On the whiteboard, Patterson scrawls the word POLYNOMIAL in big letters—just one of the hundreds of operations that Intel and others added to their ISAs. Even as a young recruit at Berkeley, Patterson suspected that the bigwigs had it backward, that exactly none of these esoteric add-ons were necessary. That a bigger dictionary did not lead to clearer sentences.
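Patterson's counterargument can be made concrete. A dedicated POLYNOMIAL instruction does nothing that a short loop of ordinary multiplies and adds cannot; the sketch below uses Horner's method, the standard trick, and is illustrative rather than drawn from any actual chip's instruction set.

```python
# What a hypothetical POLYNOMIAL instruction would compute, rebuilt
# from the two humblest operations an ISA has: multiply and add.

def poly_eval(coeffs, x):
    """Evaluate c0 + c1*x + c2*x^2 + ... via Horner's method."""
    result = 0
    for c in reversed(coeffs):  # ((cN*x + cN-1)*x + ...)*x + c0
        result = result * x + c
    return result

poly_eval([1, 2, 3], 2)  # 1 + 2*2 + 3*4 = 17
```

If the big dictionary's exotic words reduce to short phrases of small ones, the exotic words are dead weight in the hardware.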
So he and a senior colleague decided to strip the cruft from the instruction sets of midcentury computing. At the time, the Defense Advanced Research Projects Agency was giving out grants for “high-risk” research. Patterson says they chose the acronym RISC—reduced instruction set computer—as a fundraising ploy. Darpa gave them the money.
Patterson then did as aspiring academics do: He wrote a spicy paper. Called “The Case for the Reduced Instruction Set Computer” and published in 1980, it set off a great war of architectures. “The question then,” as Patterson would later say in an acceptance speech for a major prize, “was whether RISC or CISC was faster.” CISC (pronounced “sisk”) was the name Patterson gave the rival camp: complex instruction set computer. The CISCites fired back with a paper of their own and, at international conferences throughout the early ’80s, battled it out with the RISCites onstage, the bloodshed often spilling into the hallways and late-night afterparties. Patterson taunted his opponents: They were driving lumbering trucks while he was in a feather-light roadster. If you magnify a RISC-based microchip from those years, you’ll spot a sports car etched into the upper left corner, just 0.4 millimeters in length.
The RISCites won. With vigilant testing, they proved that their machines were between three and four times faster than the CISC equivalents. The RISC chips had to perform more operations to get a job done, it’s true—but would you rather read a paragraph of simple words, or a sentence of polysyllabic verbiage? In the end, CISCites retracted their claims to supremacy, and the likes of Intel turned to RISC for their architectural needs.
David Patterson, who remains semiretired, is now studying the life-cycle carbon emissions of AI hardware.
Photograph: Jenna Garrett
Not that anybody outside tech circles talked about this at the time. When Hackers came out in 1995, Patterson was flabbergasted to hear his life’s work, 15 years old by that point, mentioned so casually and seductively by a Hollywood starlet. Computers were still too geeky, surely, to matter to the masses. (When I make Patterson rewatch the scene, he’s all smiles and pride, though he does note they say “refresh rate” when they mean “clock rate.”)