66 Highlights
Programmers are thus among the most quietly influential people on the planet. As we live in a world made of software, they’re the architects. The decisions they make guide our behavior.
If you look at the history of the world, there are points in time when different professions become suddenly crucial, and their practitioners suddenly powerful. The world abruptly needs and rewards their particular set of skills.
The first recorded use of “Hello, World!” was in 1972, when a young computer scientist named Brian Kernighan was writing the manual explaining how to program in the coding language called B. He wanted to show the simplest thing you could get B to do, which was to print a message. As he told me, he’d seen a cartoon of a chick coming out of an egg, saying Hello, World!, and liked its funny, quirky ring. So he wrote a simple snippet of B code that displayed that little message.
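Not Kernighan’s original B code, but a rough modern analogue of what that little snippet did, sketched here in Python:

    # A modern stand-in (not the original B program): print one short greeting.
    print("Hello, World!")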
Programming languages are languages, a method of speaking to machines; but to speak to a computer is to speak to the most literal-minded entity on the planet, a ruthlessly prissy grammarian.
(A study in the ’80s concluded that coders were “less loyal to their employers than to their profession.”)
At the level of the machine, code truly does feel meritocratic: Crappily written software crashes, and better-written stuff doesn’t. Indeed, that binary clarity is what software engineers love about it; like long-distance running, succeeding at a fiendish coding problem feels like a true measure of yourself.
The field of software can also appear more democratically accessible than many other fields, because it’s one where self-taught amateurs work alongside people with PhDs. (That certainly isn’t true of surgery, law, or aerospace engineering.)
Most computers’ capacity back then was quite limited; the IBM 704 could handle only about 4,000 “words” of code in its memory. Writing a program was like writing a haiku or a sonnet. A good programmer was concise, elegant, and never wasted a word. They were poets of bits.
It’s often a surprise to people today, but at MIT’s Lincoln Labs in the 1960s, when Wilkes worked there, most of the “career programmers” were female. Indeed, it was often assumed back then that women were naturals at programming.
It was often a fiercely anticommercial world. Code was a form of artistic expression, they felt—but it wasn’t one they wanted to copyright and make money off of. On the contrary, they believed in freely giving it away and showing it to everyone who wondered, Hey, how’d you do that? That’s how people were going to learn, right?
But they were also the first generation that began to push women out of the field. Unlike Wilkes’s earlier cohort, the hackers at the core of the MIT lab scene were exclusively men—often stilted in conversation and living in “bachelor mode,” as they put it, with no interest in dealing with anyone except those like themselves.
Pretty soon, people around the world were clicking “view source” and getting a glimpse into how this crazy new world, the web, really worked. It was much like the BASIC revolution on the Commodore 64, except even faster and more widespread. Everingham and his peers in the ’80s found it pretty slow going to get their hands on BASIC code to study and learn from; they had to download it from a BBS or buy a tech magazine or book that had printed some programs. There was a long gap between each opportunity to learn something new. The web collapsed that time frame to zero. Every single web page you visited contained the code showing how it was created. The entire internet became a library of how-to guides on programming.
BASIC took programming out from the ivory towers and into teenagers’ basements—but the web planted it firmly into the mainstream.
Neither Krieger nor Systrom actively set out to erode anyone’s self-esteem, of course. They loved photography, dug code, and aimed to unlock the latent energy of a world where everyone’s already carrying a camera 24/7. But social software has impacts that the inventors, who are usually focused on the short-term goal of simply getting their new prototypes to work (and then scale), often fail to predict.
And frankly, the money was deforming decisions—what code gets written and why.
“Never before in history have basically fifty mostly men, mostly twenty to thirty-five, mostly white engineer designer types within fifty miles of where we are right now, had control of what a billion people think and do when they wake up in the morning and turn their phone over.”
If you had to pick the central plank of coder psychology, the one common thread in nearly everyone who gravitates to this weird craft? It’s a boundless, nigh-masochistic ability to endure brutal, grinding frustration.
“The distance between looking like a genius and looking like an idiot in programming? It’s one character wide.”
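A toy illustration in Python of how thin that one-character margin can be; the function and inputs are invented for the example:

    def is_adult(age):
        return age >= 18   # correct: eighteen counts as adult

    def is_adult_buggy(age):
        return age > 18    # one missing character: every 18-year-old is rejected

    print(is_adult(18))        # True
    print(is_adult_buggy(18))  # False -- no crash, just quietly wrong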
One morning in early 2017, there was a massive collapse of Amazon Web Services, a huge cloud-computing system used by thousands of web apps, including some huge ones like Quora and Trello. For over three hours, many of those big internet services were impaired. When Amazon finally got things up and running again and sent in a team to figure out what had gone wrong, it turned out that the catastrophe had been set off by a single mistyped command from one of their systems engineers.
The word bug is deceptive. It makes bugs sound like an organic process—something that just sort of happens to the machine, as if through an accident of nature. One early use of the term was in 1876, when Thomas Edison complained about malfunctioning telegraph equipment he was developing. (“Awful lot of bugs still,” as he wrote in his notebook later while working on glitchy incandescent lights.)
Coding is, in a profound way, less about making things than about fixing them.
In one of the most famous bugs in history, NASA was forced to blow up its Mariner 1 spacecraft only minutes after launch when it became clear that a bug was causing it to veer off course and possibly crash in a populated area. The bug itself was caused by a single incorrect character.
“The programmer personality is someone who has the ability to derive a tremendous sense of joy from an incredibly small moment of success.”
The rapid fluctuation between grinding frustration and abrupt euphoria creates whiplashing self-esteem in programmers. Catch them in the low moments, when nothing is functioning, and you’ll find the most despondent, self-flagellating employee on earth. But if you randomly come back an hour later—during which, hello, a three-week-long problem has been figured out—you might find them crowing and preening, abruptly transformed into the most arrogant, grandiose person you’ve ever met.
Indeed, for many programmers, a profound allure of coding is that it’s a refuge from the unpredictability of humans, from their grayscale emotions and needs.
But coding had one pleasure his old job didn’t: a sense of clarity, of proof that his work actually was valid.
“Learning to code is hard, but you get the self-esteem of ‘I built this, and it works.’”
(The goal of a good puzzle, he said, is to make it always feel like it’s just about to be solved, when it isn’t.)
“They don’t like people,” Perry and Cannon concluded bluntly. “They dislike activities involving close personal interaction; they generally are more interested in things than in people.”
Konrad Zuse, the inventor of the first digital computer, once argued that people wielding computers would be affected deeply by that exposure. “The danger that computers will become like humans,” he noted, “is not as great as the danger that humans will become like computers.”
As my friend Clara Jeffery, the editor in chief of Mother Jones, noted in a tweet, “So many Silicon Valley startups are about dudes wanting to replicate mom.”
As Meredith L. Patterson wrote in a 2014 essay: “Code is no respecter of persons. Your code makes you great, not the other way around.”
Antonio García Martínez, a former Facebook ad-tech employee, jokes in his book Chaos Monkeys: “The world crowns you a genius, and you start acting like one.”
“Meritocracies say ‘your GitHub is your résumé,’ then they act surprised that their candidate pool doesn’t include a lot of single moms without time to hack on hobby projects,” as Johnathan Nightingale, a former general manager of the open source browser Firefox, writes.
Even online, participating in projects like Linux could require one to withstand verbal storms of derision, particularly if you fell afoul of Torvalds himself, who had long been known to write furious emails to contributors he thought were acting like idiots. (“Please just kill yourself now. The world will be a better place,” as he wrote in one sample email; “SHUT THE FUCK UP!” in another.)
(If you like today’s AI, thank the Canadian government. It spent plenty of taxpayers’ dollars helping to support crucial deep-learning research for years at the nation’s public universities, back when that style of AI was being pooh-poohed worldwide.) When R&D magazine surveyed the top innovations from 1971 to 2006, they found 88 percent had been funded by federal research dollars. None of these fields were being sufficiently funded by the free market.
“In other words,” as the researchers concluded in their paper, “technology entrepreneurs are not libertarians.” They were, in many ways, just traditional Californian leftish thinkers.
They’re happy to share some of the wealth via taxation, but they want to make sure nothing interferes with how they amass it in the first place.
Most truly useful coding isn’t a lone-gunman activity. It’s a deeply social team sport.
After the war, coding jobs shifted from the military to the workplace, and industry desperately needed more programmers—and thus some way to make coding easier than onerously writing cryptic, number-based “machine code.” Here, women again wound up being pioneers. They designed some of the first “compilers.”
Grace Hopper was wildly productive in this field, often credited with creating the first compiler as well as the “FLOW-MATIC” language aimed at nontechnical businesspeople. Later, she worked with a team to create COBOL, the language that became massively used by corporations.
(The 1968 book Your Career in Computers argued that people who liked “cooking from a cookbook” would make good programmers.)
“I had it easy,” she later told her son. “The computer didn’t care that I was a woman or that I was black. Most women had it much harder.”
By the 1983–84 academic year, women were fully 37.1 percent of all student coders. In only one decade, they’d more than doubled their participation rate. But that was the peak. From 1984 onward, the percentage slid downhill, slumping even more in the ’90s, such that by the time 2010 rolled around, it had been cut in half. Only 17.6 percent of the students in computer science programs were women.
So beginning in the mid-’80s, some of these students showed up for their first class having already done a lot of programming.
As it turns out, the kids who’d had this previous experience were mostly boys, as two academics discovered, when they researched the reasons why women’s enrollment was so low. One of these two researchers was Allan Fisher, then the associate dean of the computer science program at Carnegie Mellon University.
She realized the takeover by guys in the ’90s had now turned into a self-perpetuating cycle. Since the people in charge were mostly white guys, they preferred hiring people like them; they only recognized talent when it walked and talked as they did.
“Programmers tend to be extremely self-assured of their own rationality and objectivity,” says Cynthia Lee, who should know: She’s a programmer who was an early stage hire at several tech start-ups, became an expert in high-performance computing, and now teaches computer science at Stanford. “They have a very big blind spot for biases on their side, because they don’t think they have blind spots.”
Many scientists who spend their days seriously studying sex differences say biology alone simply isn’t that strong a determinant of preference and ability. Certainly, psychologists have long documented lower levels of self-confidence among many professional women and students. But the detectable differences in boy and girl cognition and behavior are too small to explain such divergence in life and career paths; instead, they’re like a biological signal that gets heavily amplified by cultural feedback into life-changing decisions and preferences.
“To a hacker, a closed door is an insult, and a locked door is an outrage,” as Steven Levy wrote of those MIT coders in Hackers.
In 1976, though, Diffie and a colleague had a breakthrough idea. They invented what’s known as “public/private key” crypto. In this system, we each have two keys: a public one that anyone can see—and a private one known only to each of us individually. Sending a message to someone else meant using two keys—your private one, and their public one. The way the crypto math works out, the only person on the planet who can decipher and read the message is the recipient.
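A toy sketch of the key-exchange flavor of that 1976 idea, written in Python with deliberately tiny, insecure numbers; real systems use enormous primes and vetted libraries, never hand-rolled code like this:

    # Toy Diffie-Hellman-style exchange, purely for illustration.
    P = 23   # a small public prime (real systems use huge ones)
    G = 5    # a public generator, also known to everyone

    alice_private = 6                        # known only to Alice
    bob_private = 15                         # known only to Bob

    alice_public = pow(G, alice_private, P)  # safe for anyone to see
    bob_public = pow(G, bob_private, P)      # safe for anyone to see

    # Each side combines its own private key with the other's public key.
    alice_secret = pow(bob_public, alice_private, P)
    bob_secret = pow(alice_public, bob_private, P)

    assert alice_secret == bob_secret        # same shared secret, never sent over the wire
    print(alice_secret)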
While speaking on a panel at a security conference, the crypto hacker Moxie Marlinspike—creator of the Signal messaging app—argued that making things occasionally easier for lawbreakers was an acceptable cost for securing everyone’s privacy. “I actually think that law enforcement should be difficult. And I think it should actually be possible to break the law,” he said.
For years, coders have been programming computers to do our repetitive actions. Now they’re automating our repetitive thoughts.
Computer programs break when they reach an “edge case,” when the user tries to do something that the coder never anticipated. And human interactions are filled with edge cases.
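A minimal Python sketch of an edge case; the function and inputs are invented for illustration:

    def average_rating(ratings):
        # Fine for every input the author happened to test...
        return sum(ratings) / len(ratings)

    print(average_rating([4, 5, 3]))   # 4.0
    print(average_rating([]))          # the unanticipated case: ZeroDivisionError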
So what neural-net coders really do, in many ways, is gather data, experiment, tweak, and pray. Easily the biggest part of their work is simply assembling the sample data to train their neural net.
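A minimal sketch, not from the book, of that gather-tweak-repeat loop, using invented toy data and a “network” of exactly one weight:

    import numpy as np

    # Invented sample data: a noisy version of y = 2x.
    rng = np.random.default_rng(0)
    xs = rng.random(100)
    ys = 2.0 * xs + rng.normal(0, 0.05, 100)

    w = 0.0                  # the entire "model": a single weight
    learning_rate = 0.1
    for step in range(500):
        predictions = w * xs
        error = predictions - ys
        gradient = 2 * np.mean(error * xs)   # derivative of mean squared error
        w -= learning_rate * gradient        # the "tweak" step

    print(round(w, 2))       # lands near 2.0 -- if the data and the tweaks were right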
Some of the best deep-learning model trainers worry that there’s still too much mystery about why the models work so well sometimes, yet in other cases don’t. Sure, you can tell if a model is working—it’s accurately recognizing pictures of pedestrians, maybe 90 percent of the time! But you can’t always completely explain why it’s suddenly doing so, or give precise advice for someone training a neural-net model on a very different task, such as translating language.
“Designing any AI system involves moral choices,” she adds. “And if you try not to make those moral choices, you’re still making moral choices.”
In 2018, the EU put into effect a new regulation that establishes an interesting new right for European citizens: the right to an explanation when their lives are impacted by AI.
This is the world of code, after all, where a single aha insight can take an algorithm from “not working” to “working” in a few minutes. In 1933, the physicist Ernest Rutherford pooh-poohed the idea of nuclear energy as impractical, but merely a decade later, the US was creating nuclear reactors and setting off atom bombs.
“It makes for a good Hollywood movie, but AI is very different from human intelligence,” he concludes. Andrew Ng was less scornful. The risk may be real, but it’s so many decades away we’ll have plenty of time to help forestall it. “Worrying about killer AI,” he told me, “is like worrying about overpopulation on Mars.”
Once advertising and growth become the two pillars of a big-tech firm, then it’s nearly inevitable that they’ll seduce their users into endless, compulsive use—or “engagement,” as it’s euphemistically called.
“When your customers are paying you money, you can actually call them customers and not users, which is a term from drug dealing,” jokes David Heinemeier Hansson.
“Learn as if you are going to live forever, live as if you are going to die tomorrow.”
“It was motivation through starvation,” he notes. “My daddy used to say, Life ain’t fair, so wear a helmet.”
One of the things that makes coding weird, as an industry, is that people can teach themselves how to do it. There aren’t very many technical professions that work this way.
Quoting John McCarthy, he said: “Everybody needs computer programming. It will be the way we speak to the servants.”