David Kris, Paul Rosenzweig, and I dive deep on the big tech issue of the COVID-19 contagion: Whether (but mostly how) to use mobile phone location services to fight the virus. We cover the Israeli approach, as well as a host of solutions adopted in Singapore, Taiwan, South Korea, and elsewhere. I’m a big fan of Singapore, which produced in a week an app that Nick Weaver thought would take a year.

In our interview, evelyn douek, currently at the Berkman Klein Center and an SJD candidate at Harvard, takes us deep into content moderation. Displaying a talent for complexifying an issue we all want to simplify, she explains why we can’t live with social platform censorship and why we can’t live without it. She walks us through the growth of content moderation, from spam, through child porn, and on to terrorism and “coordinated inauthentic behavior” – the identification of which, evelyn assures me, does not require an existentialist dance instructor. Instead, it’s the latest and least easily defined category of speech to be suppressed by Big Tech. It’s a mare’s nest, but I, for one, intend to aggravate our new Tech Overlords for as long as possible.

This episode features a lively (and – fair warning – long) interview with Daphne Keller, Director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center. We explore themes from her recent paper on regulation of online speech. It turns out that more or less everyone has the ability to restrict users’ speech online, and pretty much no one has both the authority and an interest in fostering free-speech values. The ironies abound: Conservatives may be discriminated against, but so are Black Lives Matter activists. In fact, it looks to me as though any group that doesn’t think it’s the victim of biased content moderation would be well advised to scream as loudly about censorship as the others do, for fear of losing the victimization sweepstakes. Feeling a little like a carny at the sideshow, I serve up one solution for biased moderation after another, and Daphne methodically shoots them down. Transparency? None of the companies is willing, and the government may have a constitutional problem forcing them to disclose how they make their moderation decisions. Competition law? A long haul, and besides, most users like a moderated Internet experience. Regulation? Only if we take the First Amendment back to the heyday of broadcast regulation. As a particularly egregious example of foreign governments and platforms ganging up to censor Americans, we touch on the CJEU’s insufferable decision encouraging the export of European defamation law to the US – with an extra margin of censorship to keep the platform from any risk of liability. I offer to risk my Facebook account to see if that’s already happening.

If you’ve lost the Germans on privacy, you’ve lost Europe, and maybe the world. That’s the lesson that emerges from my conversation with David Kris and Paul Rosenzweig about the German interior minister’s latest declaration that he wants to force messaging apps to decrypt chats. This comes at the same time that industry and civil society groups are claiming that GCHQ’s “ghost proposal” for breaking end-to-end encryption should be rejected. The paper, signed by all the social media giants, says that GCHQ’s proposal would erode the trust that users place in Silicon Valley. I argue that that argument is well past its sell-by date.

Our interview is with two men who overcame careers as lawyers and journalists to become serial entrepreneurs now trying to solve the “fake news” problem. Gordon Crovitz and Steve Brill co-founded NewsGuard to rate news sites on nine journalistic criteria – using, of all things, real people instead of algorithms. By the end of the interview, I’ve confessed myself a reluctant convert to the effort. This is despite NewsGuard’s treatment of Instapundit, which Gordon Crovitz and I both read regularly but which has not received a green check.
