As a model for how those data-sharing deals can actually be struck, though, Stamos points to another project known as Social Science One. Created in April 2018, that initiative hammered out a deal with Facebook to access some of its user data as part of an effort to combat disinformation aimed specifically at influencing democratic elections. The arrangement uses a form of so-called differential privacy, a still-developing class of techniques that allows data to be queried in aggregate while limiting the detail included in responses, so that no uniquely identifying information about individuals is ever shared.
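The core idea behind differential privacy, adding calibrated random noise to aggregate answers so that no single person's presence in the data can be inferred, can be sketched in a few lines of Python. This is a minimal illustration of the standard Laplace mechanism under assumed parameters, not the actual system Social Science One uses; all function names here are hypothetical:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(records, predicate, epsilon: float = 1.0) -> float:
    """Answer a count query with differential privacy.

    Adding or removing one person changes the true count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon masks any
    single individual's contribution while keeping aggregates useful.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of epsilon add more noise and give a stronger privacy guarantee; real deployments also track a cumulative privacy budget across queries, since each noisy answer leaks a little information.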
“This really rose out of the ashes of the Cambridge Analytica scandal,” says Nate Persily, who founded Social Science One and also serves as codirector of the Stanford Cyber Policy Center that will house the Internet Observatory. “This is an attempt to figure out a safe, secure, privacy-protective way to make this data available to academics.”
While Stamos’ observatory hopes to access some companies’ data through Social Science One and through its own direct negotiations, in other cases it plans to take a more direct approach: simply scraping public data without asking permission. After all, Stamos points out, much of the internet’s extremist and abusive behavior lives on sites like 4chan, 8chan, Voat, and Gab, not the mainstream platforms that might partner with his project. And while scraping those sites might seem intrusive, he notes that their users are generally anonymous by default and public in their postings.
“Right now you cannot study what led to the Christchurch shooting, because that data has been intentionally pulled off and purged to cover tracks,” Stamos says, referring to the shooting of 51 people in a New Zealand mosque, an act whose perpetrator posted a manifesto to the fringe social media site 8chan. Privacy issues around those sites, says Stamos, are “something we’re well aware of, and we’re trying to be careful about in our use of this. But in the end, if you want to understand these problems, you can’t do so without understanding the darkest corners of the internet.”
From Security to ‘Abusability’ Education
Stamos’ Internet Observatory idea came into being when he met Craig Newmark at an Aspen Cybersecurity Summit reception last summer. Newmark, who has given more than $100 million to projects focused on what he describes as “information warfare”—including tens of millions to journalism outlets, journalism schools, and a competition focused on ethics in computer science—says he was impressed with the approach Stamos described. “This is real World War II, greatest-generation stuff. The need is dire, the emergency is real,” Newmark says. “People have to stand up and be patriots. That means platforms and researchers and funders. Alex and people like him are on the front lines.”
More broadly, Stamos says his goal with the observatory—and his plans for a Stanford undergraduate education program linked to it—is to push for more systematic thinking about abuse across tech firms, a shift he describes as similar to the cybersecurity evolution the tech industry underwent 20 years ago. Back then, when Stamos was beginning his career at the legendary cybersecurity consultancy @stake, companies were just waking up to the insecurity of their code and learning to cooperate with the academic researchers and white-hat hackers poking holes in their products.
“We’re now in that same place with bigger trust, safety, and privacy issues in that our industry doesn’t know how to build software that can be trusted by users to operate in their best interest and protect them from all these kinds of abuse,” Stamos says. He argues a new generation of engineers needs to learn to think just as systematically about abusability as they do about security—how their tools can have unexpected and dangerous effects in the real world. “If you just have the skill set you normally get from a computer science education, you’re completely unprepared for the kinds of abuse that will happen on your product.”