YouTube star Logan Paul came under fire for posting a video of an apparent suicide victim in Japan’s “suicide forest,” Aokigahara.
HUNTINGTON BEACH, Calif. — He’s the bad boy of YouTube, but the video website isn’t ready to kick off video personality Logan Paul — yet.
“We want to be consistent,” YouTube CEO Susan Wojcicki said at a conference in this seaside city. “When someone violates our policies three times, we terminate. We terminate accounts all the time.”
YouTube has come under fire over the past year in a series of controversies, including allowing videos with extremist political views and not-safe-for-work sexual or violent content that could be seen by kids.
This year, much of the outrage has concerned the 23-year-old Paul, the popular video blogger who mixes youth-oriented comedy with outrageous pranks that some say go too far.
YouTube recently suspended advertising on Paul’s YouTube channel after he shocked a rat with a Taser and joked on Twitter about ingesting Tide Pods, capsules containing laundry detergent. Weeks earlier, he had filmed himself next to the body of an apparent suicide victim in Japan, a move that was widely criticized.
Paul responded with an apology tour, first with a short video, then with a longer video on suicide prevention. But then he returned with the Tide Pods tweet and the rat video.
His infractions count as two strikes, Wojcicki said during her appearance Monday at the CodeMedia industry conference. “We can’t just be pulling people off of our platform,” she asserted.
YouTube updated its overall violations policy Friday. Creators who violate the new policy can face sanctions such as removal from the preferred ad program, which means they’ll earn less from their videos, or losing ads altogether. Their work can also be taken off the site’s list of recommended videos.
Wojcicki said the policy update “gives us more levers and opportunity to pull back services” if creators violate YouTube terms.
In late 2017, YouTube updated another policy to address videos with violent and sexual themes that were aimed at children.
Alex Kruglov, who runs the Los Angeles tech start-up pop.in, asked Wojcicki why his 9-year-old daughter was able to see content involving popular kids’ animated characters like Curious George on YouTube that had been altered and were no longer suitable for kids.
She said parents should stick with the YouTube Kids app, which is intended to be a safer place for children, and that when they find objectionable content, they should report the videos so they can be removed from the site.
“We’re working really hard on this,” she said. “We’re working through the content to make sure we’re making the right recommendations.”
Wojcicki has said YouTube will increase the number of workers who oversee and review content to more than 10,000 next year. “Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” she said in a blog post last year.
At CodeMedia, she said that computers would flag the content first and that the workers would follow up to double-check. “That many people and the machines will help,” she said. “And if it doesn’t, we’ll add more people and more machines.”