A 15-second YouTube clip titled “The Flag at FedEx 9/11/2011” isn’t exactly video for the ages. This shot of a giant American flag being unfurled across FedEx Field on the 10th anniversary of the Sept. 11 terrorist attacks shakes a bit, and the audio is too muddy to make out the announcer’s words.
YouTube’s content screeners, however, had a different problem with that video of a patriotic pregame ritual: They judged it not “advertiser friendly” and therefore disqualified it from featuring ads that might make its author some spare change.
The Alphabet, Inc. (GOOG, GOOGL) subsidiary fixed the mistake after I inquired about it. But the underlying problem remains: It’s hard to screen the stuff random people upload to YouTube.
Advertiser anxiety
Historically, YouTube has been a money-making machine for its corporate parent. Google doesn’t break out YouTube’s share of its total ad revenues, but it has long touted the video-sharing site as a strong contributor to them.
In February, however, the Times of London reported that YouTube was pairing mainstream ads with videos uploaded by jihadists, neo-Nazis and other extremist elements. That represented a massive failure of YouTube’s “programmatic” ad-matching software, which is supposed to fit ads to the interests of a clip’s expected audience.
More than 250 brands quickly responded by pulling their ads from YouTube. In April, the ad-analytics firm MediaRadar estimated that 5% of YouTube’s U.S. and Canadian advertisers had fled the service.
Google apologized and said it would implement stronger safeguards against ads showing up next to videos that would embarrass or horrify ad clients.
That’s what David Heyman ran into with his video of a giant American flag. Heyman is a D.C. sports fan whose most-viewed video is a 2008 clip of President George W. Bush throwing out the first pitch at Nationals Park. He was surprised to get a “Your video can’t be monetized” email from YouTube.
YouTube’s rules
That message explained that Heyman’s video “may not be advertiser friendly.” A Google support document says that term covers “sexually suggestive content,” “violence,” “inappropriate language,” “promotion of drugs and regulated substances,” and “controversial or sensitive subjects and events.”
Heyman was annoyed more by the principle of the thing than by the potential lost revenue, since he says he’s “never received a dime” from YouTube ads. He requested a review of the decision and got the same answer.
“My guess is since my video has ‘9/11’ in the title, that is it,” he wrote in an email. “There is no reference or voiceover to the attack.”