YouTube video promotion AI change is a "historic victory"

by Mark Tyson on 11 February 2019, 11:21

Tags: YouTube (NASDAQ:GOOG), Google (NASDAQ:GOOG)

Quick Link: HEXUS.net/qad4lg


Late last month the official YouTube blog was updated with a low-key article about how its developers are "continuing our work to improve recommendations on YouTube." The piece put a classic Google positive spin on something that appears to have affected some individuals very negatively: being drawn down a rabbit hole of content that can produce various flavours of extremist, conspiracy theorist, anti-vaxxer, and so on. The YouTube blog touches on this thorny topic only in its middle section, opening instead with hopes of doing things like "help users find a new song to fall in love with".

The key change YouTube is implementing this year is in the way it "can reduce the spread of content that comes close to - but doesn’t quite cross the line of - violating our Community Guidelines". Content that "could misinform users in harmful ways" will have its influence reduced. Videos "promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11" will be affected by the tweaked recommendation AI, we are told.

YouTube is clear that it won't be deleting these videos, as long as they comply with its Community Guidelines. Furthermore, such borderline videos will still be surfaced for users who have the source channels in their subscriptions.
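To make the described policy concrete, here is a minimal sketch of how a "down-rank, don't delete" layer could sit on top of a recommendation score. This is purely illustrative and not YouTube's actual system: the Video fields, the classifier threshold, and the penalty factor are all assumptions, as the real weighting is unpublished.

from dataclasses import dataclass

@dataclass
class Video:
    channel: str
    relevance: float        # base recommendation score, higher is better
    borderline_prob: float  # a classifier's estimate that the content is borderline

BORDERLINE_PENALTY = 0.1    # assumed down-weighting factor, chosen for illustration

def rank(videos: list[Video], subscriptions: set[str]) -> list[Video]:
    """Order candidate videos, suppressing (not deleting) likely-borderline content."""
    def score(v: Video) -> float:
        if v.channel in subscriptions:
            return v.relevance  # subscribers still see the channel at full weight
        if v.borderline_prob > 0.5:
            return v.relevance * BORDERLINE_PENALTY  # down-rank rather than remove
        return v.relevance
    return sorted(videos, key=score, reverse=True)

Note that nothing is filtered out of the returned list; borderline videos simply fall far enough down the ranking that, in practice, they stop being recommended to non-subscribers.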

At the weekend, former Google engineer Guillaume Chaslot admitted that he helped build the current AI used to promote recommended videos on YouTube. In a thread of tweets, Chaslot described the impending changes as a "historic victory". His opinion comes from seeing, and hearing of, people falling down the "rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co".

In Chaslot's personal circle, one acquaintance, 'Brian', has essentially withdrawn from society due to YouTube's promotion of conspiracy theory content. To YouTube's AI, however, Brian will have looked like a 'jackpot', as it tuned his video recommendations to inspire hyper-engagement. More views mean more ad dollars for YouTube, and having seen its 'success' with Brian, the AI would try to replicate this engagement pattern with others.
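The incentive Chaslot describes can be shown with a toy calculation (all numbers invented for illustration, and not YouTube's actual objective): if the system optimises expected watch time alone, a lower-probability click that leads to much longer viewing still wins.

def expected_watch_time(click_prob: float, avg_minutes: float) -> float:
    # toy objective: probability of clicking times average minutes watched
    return click_prob * avg_minutes

candidates = {
    "next part of the series": expected_watch_time(0.20, 12.0),  # 2.4 minutes
    "conspiracy rabbit hole":  expected_watch_time(0.15, 45.0),  # 6.75 minutes
}
print(max(candidates, key=candidates.get))  # -> "conspiracy rabbit hole"

With no content check in the objective, the rabbit hole is the rational recommendation, which is exactly the behaviour the announced change is meant to counteract.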

In a particularly extreme case, a Seattle man ended up killing his brother, believing him to be a lizard; the man's public YouTube likes are available to browse, giving an indication of his journey down the rabbit hole.

Chaslot asserts that the YouTube AI change will "save thousands". These are extreme examples, but they are likely just the tip of an iceberg of people who believe in conspiracies and alternative facts without critical thinking, perhaps due to being fed these videos by YouTube and other social media companies.

Unfortunately, there is an asymmetry with BS, as noted by Alberto Brandolini: the amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.



HEXUS Forums :: 31 Comments

When it goes back to giving suggestions relevant to the video I'm watching, I'll be happy again. I could be watching a multi-part video and the likelihood that the next video in the chain even shows up is less than 10%.

Right now it just seems to suggest the next video based on my video watch metrics.
Agree with Tabby, and it's worse than that. It used to be that if you were on a music video which was clearly part of an album or collection, you'd have at least some of the rest of that album or collection in your recommendations. Now you're lucky to get one in the “next up” spot.

Once you watch a video on any somewhat popular topic, it'll start vomiting up related videos, even if you disliked that video, even if you say you're not interested in 100% of the recommendations it gives you from that topic. What bothers me more, though, is that there are lots of topics on YouTube which the algorithm just ignores, or the best it can do is recommend what you've already seen.

Add all of this to the obviously paid recommendations, and the fact that new, interesting, relevant content proves hard to find at the best of times, and I'd argue it's halfway to useless.
My concern here is who defines what is “bull”? Who is making that distinction and what right do they have to essentially “peer review” and then decide if it might be harmful?

The examples they give are all quite acceptable - anti-vaxxers, for example. But I absolutely GUARANTEE that this is a slippery slope: it will be misused, and it will extend to people with the “wrong” political opinions… because we can all look at the same dataset and come out with different views. The problem is when someone else looks at it and decides your perspective is “bull” and deserves to be censored.

The other issue here is the very real possibility that only established convention will be promoted, and non-standard / creative ideas will be suppressed as “bull” by someone.

It was once believed the Earth was flat; now we know it is round. If you'd gone back in time and said it was round, you'd have been accused of spouting “bull”. This could stop creatives from having their ideas promoted and, as a result, they'll stop bothering. That is dangerous, as it's the creatives who are willing to challenge conventional wisdom with their “bull” who push us forward.

YouTube is a free marketplace of ideas. There will be some negative fallout from that, as there is with all systems which are generally beneficial. For example, the infernal combustion engine… how many people die a year due to that? And how many don't?

Once you start deciding what we can and can't see, you lose a key element of societal freedom. It becomes censorship very quickly. So the question is: who do you trust to make the decision about what you can and can't see? About what ideas are good and what ideas are bad?

It's the slippery slope which started when they began putting some videos in a “restricted state”.
philehidiot said:
now we know it is round

Proof?

mr_jon said:
Proof?

s i g h