YouTube CEO Susan Wojcicki took to the service’s Creator Blog last night to outline some broad goals for the year ahead. The plans largely revolve around increased transparency on the company’s part and tighter enforcement — a pretty clear reaction to multiple creator controversies over the past year and change, including, most recently, the suicide video posted by YouTube star Logan Paul.
Wojcicki doesn’t actually refer to Paul — or any other creators — by name here, and the fixes outlined in the piece are admittedly pretty abstract. In many cases, they’re a simple reaffirmation of policies the service has already put in place, including a crackdown on impersonating accounts and misleading thumbnails.
The executive does, however, promise to improve the enforcement of existing policies, with a combination of machine learning and human policing that will bring the total number of Google/YouTube employees checking for questionable content north of 10,000.
“We realize we have a serious social responsibility to get these emerging policy issues right, so we seek advice from dozens of expert advisors and third-parties,” Wojcicki writes. “For example, on issues of hate speech we work with the Anti-Defamation League in the U.S. and on issues of self-harm, we work with the National Suicide Prevention Lifeline.”
In addition to Paul’s recent video, which showed a body hanging in Aokigahara, Japan’s “suicide forest,” YouTube’s platform has come under fire a number of times in the past year. Early last year, the site canceled a premium show with PewDiePie after the Swedish internet personality paid people to carry a sign bearing the phrase “Death to all Jews.”
Wojcicki adds that the company is working toward a better method for demonetizing content. “While we worked hard this year to provide an appeals system and quicker responses to creators when a video is demonetized,” she writes, “we’ve heard loud and clear that we need a better system. We’re currently working on a more accurate solution that includes more human review of your content, while also taking your own input into account.”
Following in Facebook’s footsteps, YouTube is also cracking down on “fake news,” after a spike in malicious content designed to sway political opinion on a massive scale. That includes harsher penalties for channels caught doing “something egregious that causes significant harm to our community as a whole.”