‘I can’t trust YouTube any more’: creators speak out in Google advertising row

Inconsistencies behind the company’s ability to police advertising on controversial content are coming to light

Google’s decision-making process over which YouTube videos are deemed “advertiser friendly” faces scrutiny from both brands and creators, highlighting once again the challenge of large-scale moderation.

The company last week pledged to change its advertising policies after several big brands pulled their budgets from YouTube following an investigation that revealed their ads were shown alongside extremist content, such as videos promoting terrorism or antisemitism.

Havas, the world’s sixth-largest advertising and marketing company, pulled all of its UK clients’ ads, including those for O2, the BBC and Domino’s Pizza, from Google and YouTube on Friday, following similar moves by the UK government, the Guardian, Transport for London and L’Oréal.

Google responded with a blog post promising to update its ad policies, acknowledging that with 400 hours of video uploaded to YouTube each minute “we don’t always get it right”.

However, the inconsistencies behind the company’s policing of advertising on controversial content are coming to light, and it’s not just advertisers who are complaining. Some YouTube creators argue their videos are being unfairly and inconsistently demonetized by the platform, cutting off the income they earn from a share of the revenue on ads placed against their videos.

Matan Uziel runs a YouTube channel called Real Women, Real Stories that features interviews with women about hardship, including sex trafficking, abuse and racism. The videos are not graphic, and Uziel relied on the advertising revenue to fund their production. However, after a year, Google has pulled the plug.

“It’s a nightmare,” he said. “I can’t trust YouTube any more.”

“It’s staggering because YouTube has a CEO [Susan Wojcicki] who is a feminist and a big champion for gender equality,” he said, pointing out that other, far more extreme videos, such as those promoting anorexia and self-harm, continued to be monetized. He also referenced PewDiePie’s videos featuring antisemitic jokes, which were allowed on the platform for months.

“It’s bad that YouTube attempts to censor this very important topic and is not putting its efforts into censoring white supremacy, antisemitism, Islamophobia, racism, jihadists and stuff like that,” Uziel said.

He wants Google to be more open about how exactly it moderates content. “I want them to be transparent about what they think to be advertiser friendly,” he said.

Google currently uses a mixture of automated screening and human moderation to police its video sharing platform and to ensure that ads are only placed against appropriate content. Videos considered not advertiser-friendly include those that are sexually suggestive, violent, contain foul language, promote drug use or deal with controversial topics such as war, political conflict and natural disasters.

Transgender activist Quinby Stewart agrees there needs to be more transparency. He complained after YouTube demonetized a video about disordered eating habits. “I definitely don’t think the video was even close to the least advertiser-friendly content I’ve posted,” he said.

QueerBean (@QuinbyStewart) tweeted on 20 March 2017: “lmao of course the first video i had marked as not advertiser-friendly was the one about my disordered eating habits” pic.twitter.com/UObYPe4fmM

He complained to the platform and the company has since approved the video for monetization.

“YouTube’s policy is just very vague, which makes sense because I think demonetization needs to be handled on a case-by-case basis. Their policies seem more reasonable when you ask a human to check it, but the algorithm that catches videos originally is really unfair,” he said.

Sarah T Roberts, an information studies professor at UCLA who studies large-scale moderation of online platforms, said that large technology companies need to be more honest about their shortcomings when it comes to policing content.

“I’m not sure they fully apprehend the extent to which this is a social issue and not just a technical one,” she said.

Companies such as Google and Facebook need to think carefully through their cultural values and then make sure they are applied consistently, taking into account local laws and social norms. Roberts said the drive to blame either humans or algorithms for decisions rested on a false dichotomy, since human values are embedded in the algorithms. “The truth is they are both engaged in almost every case,” she said.

The fact that the controversy is now hitting Google’s bottom line should be a wake-up call, she suggested. “Now it’s financial and is going to hit them where it hurts. That should create some kind of impetus.”

The Guardian asked Google for more clarification over how the moderation process works, but the company did not respond.

Read more: https://www.theguardian.com/technology/2017/mar/21/youtube-google-advertising-policies-controversial-content