Are you actually wasting time on YouTube when you’re watching a cooking video instead of scrolling/tapping mindlessly through one of your various News Feeds elsewhere? Is it pacifying your grabby infant so you can be an adult and clean the bathroom? Are you going to learn how to knit or repair something in your home any other way? See, useful.
YouTube, which recorded 1.5 billion monthly logged-in users last year, also has the downsides of drawing some users into more extreme-content rabbit holes, surfacing disturbing videos on the kid-friendly version of the platform, and amplifying creators like those Paul brothers who stupidly vlog from Japanese forests. Not so useful.
Still, when a one-hour outage on the platform can result in a 20 percent jump in traffic to publishers’ websites (compared to a 2.3 percent increase when Facebook was down), YouTube’s got a special share of the attention economy.

The Pew Research Center has new data on just how useful YouTube is — including its recommendations algorithm, which apparently drives 70 percent of consumption. 35 percent of all U.S. adults use YouTube, and 51 percent of those say YouTube has helped them learn how to do something for the first time, according to a new report drawing on a survey of 4,500 Americans. The percentage of YouTube users who say they get news or headlines there has doubled since 2013 (38 percent today, compared to 20 percent then).
YouTube also plays a big role in occupying those who aren’t yet of reading age. 81 percent of all parents with kids age 11 and under have used YouTube to placate their spawn at least once; more than a third allow their kid to watch videos on the platform regularly. The Pew report points out that YouTube, by YouTube/Google’s own policies, is intended for those age 13 and older, though YouTube Kids is supposed to be a safer version of the platform.
There’s still plenty of questionable content on YouTube, and a majority of respondents noted that they often encounter “troubling or problematic” videos. 60 percent told Pew that they end up watching videos of “dangerous or troubling behavior,” and 64 percent see videos that “seem obviously false or untrue.” This persists in the kids’ content as well: One example The New York Times highlighted was a three-year-old boy coming across “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized.” This is pretty much the opposite of useful.
Crises like the PAW Patrol incident uncovered by the Times, not to mention a whipsawing 2017 for the platform — The Verge highlighted the downfall of its biggest star, the apparently anti-Semitic gamer PewDiePie, and a near-boycott from big brands whose advertising was running alongside racist videos — spurred YouTube to release a transparency report in May. Users have always had the opportunity to flag inappropriate content, as we wrote at the time, but it turns out YouTube doesn’t rely too heavily on those signals:

YouTube’s latest transparency report tells us a great deal about how user flags now matter to its content moderation process — and it’s not much. Clearly, automated software designed to detect possible violations and “flag” them for review does the majority of the work. In the three-month period between October and December 2017, 8.2 million videos were removed; 80 percent of those removed were flagged by software, 13 percent by trusted flaggers, and only 4 percent by regular users. Strikingly, 75 percent of the videos removed were gone before they’d been viewed even once, which means they simply could not have been flagged by a user.
On the other hand, according to this data, YouTube received 9.3 million flags in the same three months, 94 percent from regular users. But those flags led to very few removals. In the report, YouTube is diplomatic about the value of these flags: “user flags are critical to identifying some violative content that needs to be removed, but users also flag lots of benign content, which is why trained reviewers and systems are critical to ensure we only act on videos that violate our policies.”
Pew researchers also explored the recommendation algorithm, which 81 percent of those polled say at least “occasionally” drives their video consumption choices.
Earlier this year, YouTube announced its plan for improving the platform’s news discovery experience. It includes $25 million in grants for news organizations to build out their video operations and experiments with boosting local news in YouTube’s connected TV app — not to mention adding text-based news article snippets from “authoritative sources” alongside search results in breaking situations — but it’s still TBD whether that initiative will succeed. If YouTube really wants to be the most useful platform, it might want to make sure it’s not scarring children for the rest of their lives or radicalizing someone who just wants to learn how to clean a gun.