In the wake of the New Zealand massacre, there’s been a lot of talk about how we can better moderate content on social networks. That’s a worthy conversation, but it ignores the tougher issues at the heart of the problem. From the NYT’s Charlie Warzel: “Focusing only on moderation means that Facebook, YouTube and other platforms, such as Reddit, don’t have to answer for the ways in which their platforms are meticulously engineered to encourage the creation of incendiary content, rewarding it with eyeballs, likes and, in some cases, ad dollars. Or how that reward system creates a feedback loop that slowly pushes unsuspecting users further down a rabbit hole toward extremist ideas and communities.” In We’re Asking the Wrong Questions of YouTube and Facebook After New Zealand, Warzel argues, in short, that “the horror of the New Zealand massacre should be a wake-up call for Big Tech and an occasion to interrogate the architecture of social networks that incentivize and reward the creation of extremist communities and content.” (We’ve had wake-up calls before. The question is whether we’ll ever stop hitting the snooze button.)