NOT JUST A PRETTY FACE: TECH’S ORIGINAL SIN
Lena Söderberg started out as just another Playboy centerfold. The twenty-one-year-old Swedish model told the magazine she’d left her native Stockholm for Chicago because she’d been swept up in “America Fever.” In November 1972, Playboy returned her enthusiasm by featuring her, under the name Lenna Sjööblom, in its signature spread. If Söderberg had followed the path of her predecessors, her image would have been briefly famous, then relegated to gathering dust under the beds of teenage boys. But one particular photo of Lena Söderberg would not fade into obscurity. Instead, her face would become as famous and recognizable as Mona Lisa’s—not to most Americans, but to everyone studying computer science for the next half a century.
In engineering circles, some refer to Lena as the first lady of the internet. But others call her the industry’s original sin, the first step in Silicon Valley’s exclusion of women. Both views stem from an event that took place back in 1973 at a University of Southern California computer lab, where a team of researchers, led by William Pratt, PhD, was trying to turn physical photographs into digital bits. The work would pave the way for the development of the JPEG, a compression scheme that allows large image files to be efficiently transferred between devices. But the JPEG was far into the future. In 1973, researchers needed to test their algorithms on suitable photos—pictures full of detail and texture. And their search for the ideal test photo led them to Lena.
Until now, the role of Dr. William Pratt in the choice of Lena’s photo has been completely unknown. I tracked Pratt down thanks to a passing lead on an old message board. He had left USC to take a job at Sun Microsystems and was working pro bono at Stanford Hospital, scouring MRIs and CT scans.
In a telephone interview, Pratt explained how he and his team had just received a large grant from ARPA (today known as DARPA), a Department of Defense agency that would lay the groundwork for the invention of the internet. The grad students were gathering photos that would provide good test subjects for their algorithms. Conveniently, a student had recently brought in a copy of the previous November’s Playboy. “I think they were enjoying the magazine, and it just happened to be there,” Pratt told me. When I asked if he or any of the grad students had been concerned that using Playboy photos for their research might offend anyone, he said that issue simply didn’t come up.
Pratt’s team flipped through the glossy magazine looking for usable images. “I said, ‘There are some pretty nice-looking pictures in there,’” he remembered, “and the grad students picked the one that was in the centerfold.” The full three-page spread of Lena, wearing boots, a boa, and a floppy, feathered hat, shows her bare backside and one exposed breast. But because the 1970s-era scanners they were experimenting with were much smaller than current models, the chosen photo was cropped into a relatively chaste square in which Lena looks suggestively over her bare shoulder.
From a technical standpoint, Pratt told me, Lena’s photo was ideal because all the different colors and textures made it a challenge to process. “She is wearing a hat with a big feather on it with lots of high-frequency detail that is difficult to code,” he said.
Over the next several years, Pratt’s team developed a whole library of digital images, not all of them from Playboy. The original data set included photos of a brightly colored mandrill, a rainbow of bell peppers, and several photos of other fully clothed women simply titled “Girl.” Scanners were relatively rare at that time, so they made some of this library available for other imaging scientists to test their algorithms. “One of the things you want to do is compare your work to others in the field,” Pratt said, “and in order to do that, you have to start with the same original. Each of us tried to code algorithms better than our neighbors.”
All of these photos, including Lena’s, are still available to download for free from the USC website, but for decades Lena’s has been by far the most popular. Her image has been displayed in countless projects, slide-show presentations, journals, books, and conference papers. She has served as a test subject for myriad editing techniques, including color correction and auto-focus. New research featuring her picture is published monthly. Playboy, notoriously vigilant about copyright infringement, decided to let the burgeoning image-processing industry make Lena its go-to. Company executives saw the photo’s ubiquity as free publicity rather than the precursor of an internet sex industry that would profoundly disrupt the soft-porn magazine business. In a 2013 article, Playboy highlighted an industry newsletter’s assertion that Lena was, to early computer scientists, what Rita Hayworth was to World War II soldiers: the top pinup girl of the era.
For fifty years, this woman’s face and bare shoulder have served as a benchmark for image-processing quality, from the teams working on Apple’s iPhone camera to Google Images. Engineers joke that if you want your algorithm to perform well, it better perform well on Lena. Some know her photo so intimately that with little more than a glance they can easily evaluate any image algorithm run on her.
Deanna Needell remembers the moment when she first saw Lena in a textbook during one of her computer science classes at the University of Nevada, Reno. “Some of the boys were giggling and I remember thinking, ‘What are they giggling about?’ And they were looking at her picture,” Needell recalls. Shortly afterward, she learned that the smiling woman was in fact fully nude. “It made me realize, ‘Oh, I am the only woman. I am different.’ It made gender an issue for me where it wasn’t before.” Another female engineer told me that, as a young computer science student, she thought Lena was just a pretty face, until she saw the full centerfold taped onto the door of a male classmate’s dorm room. Needell, who went on to become valedictorian of her college class and a mathematics professor at UCLA, strongly believes Lena’s photo is one reason women have been left behind in technology. In 2013, she took a stand that has evolved into somewhat of a campaign to rid the industry of the image for good. Needell’s humorous starting point was this: in an otherwise serious paper about a particular image-processing technique, she and her co-author Rachel Ward tested an image of the Italian male model Fabio. “We contacted Fabio’s agent . . . and apparently Fabio was thrilled,” Needell recalls. She chose an image that, like Lena’s, featured a variety of detail and textures, from Fabio’s long blond hair to bricks in the background. The paper was published in the SIAM Journal on Imaging Sciences. If the men didn’t seem to mind subjecting women in the field to overly idealized images of women, she’d simply do the same in reverse.
Needell didn’t leave it at that. While giving talks about her work, she would throw Fabio’s photo into the slide show, which usually elicited light chuckles from the audience. Other researchers started emailing her to ask if they too could use the image. Needell would share the photo with Fabio’s permission. “It definitely got people talking,” Needell says. “It got a conversation started which hadn’t been started.”
Needell is certain that many other women in the field have reacted to Lena’s image the same way she did. “I don’t think I’ve ever talked to a woman who says, ‘Oh yeah, we should keep Lena,’” she said. “Now when that picture of Lena comes up, heads turn toward my direction. It’s not something I’m going to jump up and scream about, but I just kind of roll my eyes.”
In the mid-1990s, the editor of one trade journal, David Munson, received many requests asking him to ban Lena’s image from the publication. Instead, he wrote an editorial encouraging engineers to use other images. Another industry leader, Jeff Seideman, however, campaigned to keep Lena in circulation, arguing that, far from being sexist, the image memorialized one of the most important events in the history of electronic imaging. “When you use a picture like that for so long, it’s not a person anymore; it’s just pixels,” Seideman told the Atlantic in 2016, unwittingly highlighting the problem Needell and others were trying to point out. The dehumanization of women through digitized and overly sexualized images that could fly across computer networks was the danger.
When I asked Pratt why he had never shared his role in Lena’s story, he told me I was the first reporter to ask him about it. He seemed nonplussed when I pressed him about the controversy that still surrounds the choice of this test photo. “I haven’t paid attention to [the controversy] at all,” he said. “It didn’t make any sense to me . . . We didn’t even think about those things at all when we were doing this. It was just natural that we would use a good-quality image, and some of the best images were in Playboy. It was not sexist.”
Besides, no one could have been offended, he told me, because there were no women in the classroom at the time.
As an isolated incident, the lab’s use of a Playboy centerfold is not especially upsetting. There was no nudity in the cropped version researchers used—just a pretty face, a bare shoulder, and a silly hat. Pratt’s students were guilty of, at worst, an ignorant and juvenile decision. However, more than four decades after its initial selection, the prolific use of Lena’s photo can be seen as a harbinger of behavior within the tech industry that is far less innocent. In Silicon Valley today, women are second-class citizens and most men are blind to it. The tragedy is, it didn’t have to be this way. The exclusion of women from this critical industry was not inevitable. In many ways, the industry sabotaged itself and its own pipeline of bright female talent.
While there might have been no women in Pratt’s lab on the day Lena’s image was chosen, what many don’t realize is that women played crucial roles in the burgeoning technology industry. In the 1840s, a brilliant mathematician named Ada Lovelace wrote the first program for a computer that had yet to be built. A century later, women were among the pioneers who worked on the first computing devices for the military during World War II. Women were marginalized once peace was restored. After that setback, however, the percentage of computer science bachelor’s degrees awarded to women steadily increased. For a time, women were charging into the field at about the same rate they were moving into other traditionally male realms, including medicine and the law.
Women and men reached parity on college campuses in the United States in 1980, and today more women than men graduate from college. Starting in 1970, the number of women in schools of law and medicine steadily increased, until eventually men and women began to graduate from both in equal numbers. In 1984, the year the Macintosh was unveiled, women in tech reached a high point, receiving almost 40 percent of computer science degrees. Unfortunately, that’s when women’s progress in tech suddenly stalled.
By that time, women were entering the workforce in droves, and the growing tech industry could have drawn on that influx of smart and ambitious women to staff its expansion. Just as computers began to head into the mainstream, however, women’s participation in the field started to plummet. Today women earn just 22 percent of computer science degrees, a number that has remained basically flat for a decade. The tech industry—taking root in the heart of the left-leaning West Coast—might have become a beacon of inclusion and diversity. To say that it did not is a grand understatement.
According to recent data, women hold a mere quarter of computing jobs in the United States, down from 36 percent in 1991. The numbers are actually worse at big companies such as Google and Facebook. In 2017, women at Google accounted for 31 percent of jobs overall and only 20 percent of vital technical roles. At Facebook, women make up 35 percent of the total workforce and 19 percent of technical jobs. The statistics are downright depressing for women of color: black women hold 3 percent of computing jobs, and Latina women hold 1 percent. Additionally, the women who are employed in the field don’t necessarily stick with it; women are leaving jobs in technology and engineering more than twice as fast as their male peers.
When it comes to tech start-up entrepreneurs, the minor royalty of Silicon Valley, the disparity is even starker. In the larger American workforce, women make up almost half of all employees and are majority owners of nearly 40 percent of businesses. But women-led companies received only 2 percent of venture funding in 2016. The vast majority of venture capitalists (VCs) are men, and they largely invest their capital in companies run by men. Women accounted for only 7 percent of VC partners at top funds in 2016. Of nearly seven thousand VC-backed companies surveyed in a study at Babson College, just 2.7 percent of them had a female CEO. All this despite research that shows women-led companies outperform their peers.
I wrote this book to ask—and answer—several important questions: What went wrong? How did women get pushed to the sidelines? And what can be done? Go to any Silicon Valley conference or cocktail party and you’ll hear people earnestly asking similar questions. You’ll also hear the standard answers, given so often they can now be delivered in code words such as “meritocracy.” That term implies both that a level playing field exists and that men deserve their prominence because they have outcompeted women or possess a special type of intelligence. You also might hear that it’s a “pipeline problem,” a “leaky bucket problem,” or a “women just don’t like nerds” problem. The blame is shifted to society, schools, parents, or girls and women themselves. All of these offhand answers—and the myths and half-truths they contain—need to be taken apart and closely examined, not just because technology is a critical slice of our modern economy, but also because of the preeminent role the Valley plays in shaping the future of humanity.
“When you write a line of code, you can affect a lot of people,” Sheryl Sandberg, Facebook’s COO, told me as we sat in her so-called Only Good News conference room at the social network’s headquarters in Menlo Park, California. “It matters that there aren’t enough women in computer science. It matters that there aren’t enough women in engineering. It matters that there aren’t enough women CEOs. It matters that there aren’t enough women VCs. It matters that there isn’t enough of a track record of entrepreneurs to fund,” she told me. “Everyone is looking for the next Bill Gates, Steve Jobs, Mark Zuckerberg. There’s pattern matching that goes on there, and they don’t look like you and they don’t look like me.”
The absence of women in tech has real effects. “The best technology and the best products are built by people who have really diverse perspectives,” Marissa Mayer, the former Yahoo CEO, told me. “And I do think women and men have diverse perspectives.”
The unfortunate truth is that right now men’s voices dominate and we see the results. Popular products from the tech boom—including violent and sexist video games that a generation of children has become addicted to—are designed with little to no input from women. Apple’s first version of its highly touted health application could track your blood-alcohol level but not menstruation. Everything from plus-sized smartphones to artificial hearts has been built at a size better suited to male anatomy. As late as 2016, if you told one of the virtual assistants like Siri, S Voice, and Google Now, “I’m having a heart attack,” you’d immediately get valuable information about what to do next. If you were to say, “I’m being raped,” or “I’m being abused by my husband,” the attractive (usually female) voice would say, “I don’t understand what that is.” The technology that turned images like Lena’s and film into easily streamed pixels has given rise to a tsunami of ever more graphic pornography. Social media platforms that have become a go-to place to spew online harassment and cyber hate—which is disproportionately targeted at girls and women—may be the internet’s single biggest problem today, not simply because some humans can just be downright mean, but because of how men have designed the very systems that allow this hate to propagate. The exclusion of women matters—not just to job seekers, but to all of us.
When it comes to overt sexism, sexual harassment, and even sexual assault, the last few years have offered a stunning demonstration of men abusing their power to take advantage of women—and women coming forward to share their stories. Outside Silicon Valley, allegations of sexual improprieties imploded the careers of Hollywood producer Harvey Weinstein, comedians Bill Cosby and Louis C.K., television anchors Matt Lauer, Charlie Rose, and Bill O’Reilly, and media mogul Roger Ailes. Politicians were dogged by allegations as well, including Congressman John Conyers, Senator Al Franken, and Senate candidate Roy Moore, who was accused of molesting teenage girls. During the 2016 presidential election, an Access Hollywood tape revealed Donald Trump bragging about grabbing women “by the pussy.” Although Trump won the election, many women, it seems, became furious and emboldened, and 2017 turned into a watershed year, with more women coming forward daily, shining a spotlight on men who had grossly overstepped.
In Silicon Valley, the scandals were just as serious. Dozens of women made claims of unwanted advances by high-profile men in technology, who finally had to face the consequences of their actions. Venture capitalists Justin Caldbeck, Dave McClure, and Steve Jurvetson all exited their own funds amid allegations of sexual assault, harassment, or misconduct. Many of their accusers—and victims—were female entrepreneurs. I reported the accounts of multiple women who accused Shervin Pishevar—a prominent tech investor and major Democratic Party donor—of sexual harassment and assault. The head of Amazon Studios, Roy Price, resigned after being accused of sexually harassing a producer, and it was revealed that two top Google executives, Andy Rubin and Amit Singhal, left the company due to inappropriate behavior. Setting all this in motion was a young engineer at Uber, Susan Fowler, who accused her manager of propositioning her for sex. Her memo, remarkably, led to a companywide investigation of Uber’s bro culture that revealed forty-seven cases of sexual harassment, resulting in the departure of twenty employees. In a dramatic climax, Uber’s investors forced out CEO Travis Kalanick.
Many women who have been victimized have been silenced by a long tradition of settlements and nondisparagement agreements, especially in the tech industry. A few have chosen to go public with their claims, filing sexual harassment suits with varying outcomes. Then, in 2017, as reports of unwanted advances piled up, women across industries and backgrounds banded together on social media to speak up in a #MeToo campaign. In this moving outpouring, women—including prominent women in technology—shared personal stories of sexual harassment and assault. “I know that so many women in the workforce—and for me, especially in the early years—deal with unwanted advances and harassment the best we can,” Sheryl Sandberg posted on Facebook. “We know that at its core this is about power no one should have over anyone.”
While such cases make headlines, there is another type of discrimination in the industry that exists in a subtler, more ambient form, not unlike the attitudes behind the selection of Lena’s image, which turned her into an industry icon. Women in tech are held back not only by overt sexism and sexual harassment but also by less obvious yet still dangerous patterns of behavior that are difficult to pinpoint and call out. Several tech companies, including Google, Microsoft, and Twitter, now face gender discrimination lawsuits, some with class action status, representing other female employees.
Copyright © 2018 by Emily Chang. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.