Sarah Jeong has a warning: Fix the internet before it destroys America.
"The internet, in its current shape, has brought out the worst parts of humanity and is hiding the best parts," she says. "It's possible the internet has made us lose our minds."
When Jeong goes online, here's what she sees: YouTube videos spreading anti-Semitic conspiracy theories. Digital mobs copy-and-pasting harassing memes. Data brokers buying your private information from tech monoliths and placing ads targeting your weaknesses.
"We're living in a scammer's paradise," Jeong says, "not just economic scams, but intellectual scams, too."
People are beginning to listen to her.
Jeong, 30, may be the most influential Oregonian you've never heard of. While she lives quietly on the westside and spends her free time knitting lace, she is also a member of The New York Times editorial board, penning its opinions about the tech industry and internet policy.
She enjoys one of the most prominent perches in America to witness the growing backlash against Facebook and its fellow big-tech companies.
It's hard to keep up. Just last week, the U.S. Department of Housing and Urban Development sued Facebook for letting real estate brokers use personal data to hide housing ads from people of color, and the International Monetary Fund blasted big-tech companies for avoiding taxes. In March, U.S. Sen. Elizabeth Warren (D-Mass.) called for the breakup of Facebook, Google and Amazon. And last month, the world watched in horror as a white supremacist live-streamed the murder of Muslim worshippers in New Zealand.
The techlash has started—and Jeong saw much of this coming.
"Sarah just has an absolute unique combination of mastery of legal issues and the ability to connect those issues to real emergent behaviors online," says her former editor Nilay Patel, editor-in-chief of the Verge, a digital publication dedicated to the tech industry.
And she's doing it from Portland.
Jeong moved here in October 2016 after she "rage-quit" San Francisco and its rent hikes. In some ways, Jeong's arrival is a mirror image of Portland's tech boom: Silicon Valley programmers are arriving here in waves.
In the Old Portland wine bar beneath a Slabtown freeway, Jeong chatted about what she sees as a broken internet, cracked under the weight of billions of garbage posts, filth and disinformation. "It's the new front of the culture war," she says.
She didn't always feel this way about tech. Growing up, she used the web as a road map out of religious fundamentalism.
Raised Southern Baptist outside Los Angeles, Jeong studied so-called creation science at her religious high school—where teachers told her T. rex walked the earth alongside biblical figures like Adam and died out in Noah's flood.
"I got online and it's how I unbrainwashed myself," she says. "The internet was protecting me from disinformation."
A first-generation South Korean immigrant (her family arrived in the U.S. when she was 3), she studied philosophy at the University of California, Berkeley and got a law degree at Harvard.
In her 2015 book, The Internet of Garbage, Jeong argued that social media gatekeepers had failed to take out the trash by tolerating unfettered harassment.
Last summer, she joined the Times' editorial board. Almost immediately, a group of far-right trolls dredged up a handful of years-old tweets Jeong wrote satirizing racist and sexist comments from people targeting her. "Are white people genetically predisposed to burn faster in the sun, thus logically being only fit to live underground like groveling goblins," she tweeted in 2014.
The Times received calls for Jeong to be fired. The far-right media machine kicked into gear, spinning off stories calling Jeong a "reverse racist" for weeks. She received death threats. The newspaper refused to cave.
"It was a bad-faith thing," she says. "If this is the worst from my past that's going to come back and haunt me, I'm good. I'm not embarrassed by any of those tweets. There was a context to all of it."
That controversy has quieted down, but Jeong says she sees the same strategies being applied to higher-profile women of color in Congress, like Rep. Ilhan Omar (D-Minn.) and Alexandria Ocasio-Cortez (D-N.Y.).
"You see this happen now in a very political way. A lot of the tactics used are the tactics we saw in Gamergate"—the 2014 online war over women in video game journalism in which right-wing trolls perfected the spread of harassing messages in a coordinated attack.
Four years ago, Jeong described the internet as a trash heap. Now she says the disinformation clogging up cyberspace in 2019 is more like nuclear waste—like the spent fuel rods slated for burial inside a Nevada mountain.
"It's like our Yucca Mountain moment," she says. "Tech might be hitting that moment where we're realizing there are these things we don't know how to counteract."
Jeong will talk about Bitcoin and blockchain this week at TechfestNW. Before her talk, WW sat down with her to discuss her work, digital threats to democracy, and whether Facebook is destroying America.
WW: You moved to Portland one month before the 2016 election and the protests that followed. Did that change how you felt about this city?
Sarah Jeong: If anything, the defining moment for Portland for me was actually the TriMet stabbing. The stabbing was a big moment in my sense of normalcy and safety. Because at that point, Hollywood [Transit Center] was my TriMet stop. It was a really traumatizing moment for me. There already was such a heightened sense of fear in the city, especially for nonwhite people and especially people whose immigration status wasn't settled.
And I was in a relatively safe position, all things considered, but I was still super freaked out. Not being a citizen, there's such a defining line, even if you've got your green card. There is such a strong dividing line between citizens and everybody else. But because it made national news, it didn't feel like it was a Portland thing. It felt like it was a national thing. It felt like everyone in the country was in the same boat.
Things were maybe playing out in Portland in a more flagrant way. I did tell my friends outside Portland: Things are playing out in public in Portland. Mainly the confrontations between various alt-right groups and Antifa. Extreme polarization turned into violence, and I don't mean that in a "both sides" way, but it's the most diplomatic way of putting what's going on.
The violence we're seeing in Portland not only is fueled by social media—it actually feels like internet screaming matches come to life in the streets. Did the internet make us all hate each other?
I'm actually really unsure. There's a study that looked at Germany, and it went town by town and found anti-refugee attacks increased where there was more Facebook adoption. It really is possible that the internet is doing this. It's part of the feedback loop, certainly. I don't know what's chicken and egg. All of these people who are hurting each other were born before Facebook existed.
You've dealt with a lot of harassment, right?
I got my first death threat because of Bitcoin [after writing a story about a controversial proposal to make the Bitcoin block size bigger]. It goes with the territory.
Many people issuing violent threats online do so from anonymous accounts. Should anonymity be allowed on the internet?
We know who started Gamergate. We know who the ringleaders of Gamergate are. We know their names. We know a fair amount about them. We know who [Vice Media co-founder and now far-right provocateur] Gavin McInnes is. I think it's easier to be afraid of anonymous people because of so many unknowns, but from what we can see from internet conflagrations, they're always egged on and led by people who are absolutely not anonymous. I don't think anonymity is the key.
There are platforms now like Gab, 4chan and 8chan that are sewers, where people argue for race wars and genocide. What the hell do we do about those places?
I think a lot of it is a symptom of law enforcement not really taking it seriously. We have a problem with federal law enforcement not taking right-wing extremism seriously in the U.S. And a lot of these places harbor child porn. This one is a no-brainer to me. Enforce the laws when it comes to these spaces. There's so much shit you could get these places on. They all eventually cross the line and produce a crime. It happens. There's just such a strange disinterest in it.
Can you remember a single moment that most disillusioned you about the internet?
It happened over a long period of time. But I think the thing that stuck in my head first must have been the "Irish slaves" meme in 2016. It's a theory, not based in historical reality, that the Irish were chattel slaves in America similar to enslaved Africans—that they were bought and sold as African slaves were. A lot of the "evidence" is pulled from blanket false assertions and mislabeled photographs. And it was just proliferating at such a rate and pulling in all kinds of lies that were spreading like wildfire. It was so alive and rampant.
What's the most frightening thing you've seen online?
Let me come up with a top three.
One would be the proliferation of memes like the Irish slaves meme. Memes and conspiracy theories that place people in an alternate reality that allows them not to grapple with racism, sexism or real problems, and allows them to live in a kind of collective social denial because it's convenient for them.
A second one would be a market for and community of support for surveillance technologies and hacking to assist abusers who want to control their intimate partners—so, basically, the world of domestic violence apps. The apps allow people to control their spouses by looking at everything on their phone. That's very frightening to me.
And then, incels and incel-adjacent movements, where it's men who are entitled to women's bodies and harbor grievances because of that. It's so young. I think that's really scary, the idea that this is going to be with us for a long time.
Has tech made racism and radicalization worse?
I'm not going to pull a Satanic panic and blame any one thing. I am half kidding, but it would probably be good for everyone if we left our computers behind. If we got rid of all of our computers and phones, maybe it would be better for everyone, but that's obviously not going to happen.
It has hit me a little personally because people threatened to kill me on the alt-right, and a lot of that stuff happens on the internet. Like, the MAGA bomber threatened to kill me.
Have Mark Zuckerberg and Sheryl Sandberg done more damage to democratic institutions than anyone else this century?
That's a good question. No. I don't think they're the worst, by far. I think it's Rupert Murdoch, actually. But they haven't helped.
He created a new lowest form of discourse. He basically fed people's appetites for tabloidism, but masquerading through a supposedly mainstream and unbiased network. He sowed distrust in media as an institution through the media.
Last week, Zuckerberg said he wanted more government regulation of Facebook. Do you believe him?
Do I think he would be happy if there was more government regulation of the internet? Yes. I think it would be perfectly in line with Facebook's business model. Do I believe he thinks it's going to happen? No. So in a way, when he calls for it, a really cynical read would be that he's just drumming up good press. But I don't have a microscope into Mark Zuckerberg's heart.
Are there patterns to how lies spread online?
The organizers of Gamergate would have these documents floating around with responses to copy-paste at people. It's the phenomenon that we loosely refer to as "Russian bots." But that's not correct. These people are neither bots nor are they Russian. They had essentially hijacked people to operate as nodes in a botnet. So, these people—like, real people—were behaving as though they were bots. We saw so much of it in the lead-up to the 2016 election and afterward.
If it's not Russians, and it's not bots, that means…the problem is us?
We're all capable of really terrible things. We're all capable of spreading rumors and lies, and buying into scams and promoting scams, and promoting lies that excuse bad behavior on our part and help us live in denial. I think the internet helps promote that kind of behavior. It helps promote in-group thinking at the expense of the other. So, xenophobia, misogyny, anti-immigration rhetoric.
We've always had a little evil in us. It's just a question of whether the society and structures around us work to bring out the best in us or work to bring out the worst in us.
Tech has brought a lot of good things, too, hasn't it?
Yeah, but so has nuclear power. Nuclear power is great until it's not. It's mostly clean, it's really efficient. And also there's this nuclear waste problem and nuclear weapons problem that we never fixed. So it's really, really great until you have these downsides we can never fix. And then you realize, oh, we shouldn't have just gone forward with this and assumed we would be able to fix these externalities. And there's no way out of it. There's a lot of good stuff. It's hard to remember that right now.
What can we be proud of?
Digitized books. Large repositories of free information that we've never been able to make accessible to everyone on the planet before. Some forms of automation that get rid of bullshit jobs. The ability to connect people across borders. Making communications less expensive. I think a lot of us don't remember or don't think about the times when long-distance phone calls were extremely expensive.
What's one thing you've gotten wrong?
I thought Gamergate was going to be a rare occurrence. I thought that energy could not sustain itself in multiple instances with great frequency, let alone metastasize into a national political movement. The rhetoric of Gamergate, of promoting fear against "social justice warriors" as a dog whistle for fear of the other and fear of demographic change, that is central to our national politics right now.
What obligation does big tech have to help journalism? Should the platforms be forced to fund local news?
It would certainly be a good thing if they did so, but I don't know if it's fair to apply moral responsibilities to corporations. It would reject basic principles of capitalism. What's happened is that the paper boys are taking the vast majority of the money that the newspapers make. It just doesn't make any sense. It's not going to help anyone in the long run. But this is just one of those capitalistic things that even if it doesn't make sense, it keeps moving forward. I think the platforms have a responsibility not to follow their economic interests, and that's just something that is hard to say. Why would we expect them to do that? Why would we trust them to do that? I mean, this goes against all principles of capitalism, right? We're in a Catch-22.
Do you agree that tech giants should be broken up?
In some ways, breaking up the companies will make things worse, because the bigger they are, the more powerful they are, and the more powerful they are, the better they are able to administer security and use big systemic solutions. But breaking up the companies also means breaking up monopolistic control over information that makes the big systemic failures and informational dystopian problems less catastrophic.
How would you break up a big company like Facebook or Google?
Make them give up Instagram is the obvious one. Some of the companies are not quite as cut-and-dried because of how entangled their products are. Some more ideas: Google could be broken up into so many companies. YouTube: Break that out from search, break that out from mail. There's just so much, when you start thinking about it. These companies are so big, so sprawling and so anti-competitive, it's pretty striking. I think just having a more competitive marketplace changes a lot.
How else can we make things better?
One is creating a central philosophy around search engine optimization. To have the correct answers float up to the top. So all the anti-vaxx results are pushed down if you're asking if vaccines are safe. But on the other hand, you're not instituting a Tiananmen Square-type of regime where if you're typing in "What is Tiananmen Square?" you don't get censored results that remove mentions of the massacre.
I've heard people say that creating a "central philosophy" sounds like thought policing. Is that just a censorship state with a friendly centrist face?
All search engines are a form of censorship. Because there's always a page one and there's always a page 100, or all of the other pages. As long as you're not on page one, you might as well be censored. As long as there is ordering, there is a form of censorship. What's useful about this other philosophy is that it reorients First Amendment rights around the listener instead of the speaker. There's a difference between what you want to see and what you don't want to see. That's the animating idea behind spam filters, and I think we have to move toward that. [Jeong credits James Grimmelmann, a professor at Cornell Tech and Cornell Law School, for inspiring her thinking on reorienting free speech rights for search engine results.]
Ten years ago, the consensus view was that the internet would free people to be creative and original. Are humans just much shittier than we imagined?
There might be this thing where the shitty aspects of humanity are just getting more play because they're better at gaming the system or the system itself is flawed. It's also just possible humanity is shittier than we thought. It is also entirely possible that humanity is hopelessly garbage. I'm not sure. I guess I choose to believe it's possible to build a society in a way where humanity doesn't devolve into complete, horrible dystopia.
GO: Sarah Jeong discusses Bitcoin and the Silk Road at TechfestNW at Portland State University's Viking Pavilion, 930 SW Hall St., on Friday, April 5. 10 am. Tickets start at $99 and are available at techfestnw.com
The More You Know
Sarah Jeong knows a lot about the internet—but you might not. Here's a glossary.
Bitcoin, or blockchain:
A blockchain is a technology that allows users to record exchanges in an irreversible ledger that tracks each transaction. Bitcoin, a digital currency built on a blockchain, is used as money by some, often because it offers a degree of anonymity.
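The "irreversible ledger" idea can be sketched in a few lines of code: each record stores a hash of the record before it, so quietly editing an old entry breaks every link that follows. This is a simplified illustration of the chaining concept, not Bitcoin's actual protocol (it omits mining, signatures and the peer-to-peer network).

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_transaction(chain, tx):
    # Each new block stores the previous block's hash, so altering
    # any earlier record changes every hash that comes after it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"tx": tx, "prev_hash": prev})
    return chain

def verify(chain):
    # The chain is valid only if every stored prev_hash still matches.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
add_transaction(ledger, {"from": "alice", "to": "bob", "amount": 5})
add_transaction(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(verify(ledger))   # True

ledger[0]["tx"]["amount"] = 500   # tamper with an early record
print(verify(ledger))   # False: the chain exposes the edit
```

The names and transaction fields here are invented for illustration; the point is only that tampering with any early record invalidates the whole chain.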
Gamergate:
A 2014 controversy and harassment campaign spurred by false allegations against a woman in the video game industry. Thousands of trolls harassed the woman after her ex-boyfriend alleged her games had received positive reviews from a journalist she had dated. Tactics included rapidly sharing copied-and-pasted messages and publicly disclosing personal information, such as home addresses, employers and even Social Security numbers.
Irish slaves meme:
A meme shared widely on the internet in the run-up to the 2016 election and afterward that claimed Irish immigrants were bought and sold as slaves in the U.S. in the same way Africans were. The theory is false, yet it gained traction in some online circles as an excuse to disregard discussions of racism in America.
Bots, or "Russian bots":
Computer programs that automatically generate content, like the programs that send spam emails trying to sell Viagra. The term bots—especially Russian bots—usually refers, however, to real people acting like computers by sending copied-and-pasted messages or memes in a loosely coordinated campaign to harass or spread misinformation.
Incel:
A contraction of "involuntary celibate," describing members of an online community of men who feel entitled to sex from women of their choice, and express indignation when sexually rejected. Men who identify as incels have been responsible for mass murder, including a 2014 stabbing and shooting rampage near the University of California, Santa Barbara that killed six people.
Gab, 4chan, 8chan:
Online forums with lax community standards, in which users frequently discuss taboo topics such as genocide.
TriMet stabbing:
In May 2017, Jeremy Christian stabbed three men on a MAX train in Northeast Portland after he was confronted for allegedly hurling racist insults at two black teenage girls, one of whom wore a hijab. Two of the men died. Christian awaits trial for murder and hate crimes.