The crime and punishment of YouTube's recommendation algorithm

  • [Hunyun (WeChat:)] reported on April 1 (compiled by Xiaxia). Hunyun's note: a "rabbit hole" is a metaphor for being drawn into a wondrous or troublesome, surreal state or situation.


    On the Internet, a "rabbit hole" usually refers to a fascinating and time-consuming topic.


    Kevin Roose, a technology and business columnist for the New York Times, has called it "one of the most powerful radicalization tools of the 21st century," a "petri dish of divisive, conspiratorial and hateful content," and a tool that "drives people to the darkest corners of the Internet."


    Kevin is talking about YouTube, or more precisely, YouTube's recommendation algorithm.


    YouTube's recommendation engine determines which videos a user is offered next after watching one.


    The recommendation engine is YouTube's powerful heart, able to keep users lingering on the site for hours.


    YouTube says recommendations drive 70% of the time users spend on the site.


    But now, accused of steering users toward extreme content, the recommendation engine has become a growing liability for YouTube.


    The recent mass shooting in Christchurch, New Zealand, was carried out by a gunman who showed signs of having been radicalized online.


    Critics charge that YouTube and other platforms not only allow hateful and violent content to exist online, but actively promote it to users.


    YouTube's biggest rival, Facebook, said this week that it would ban white nationalism and white separatism on the platform.


    Kevin recently interviewed Neal Mohan, YouTube's chief product officer, about the public criticism of its recommendation engine and the company's response to radical and violent extremist content.


    Kevin and Neal talked about YouTube's future plans and the controls the company has adopted for extreme content, such as hiring additional human reviewers, introducing breaking-news panels that activate after major news events, and modifying the recommendation engine to reduce the spread of conspiracy theories and other extreme content.


    The following is a transcript of the interview, condensed and edited by Hunyun's editors.


    Kevin: What do you think of the public criticism of radical content on YouTube?


    Neal: I think I need a minute to step back first, and then I'll talk about our view of this issue.


    As you know, YouTube started out as an open platform for content, voices, opinions and ideas, and of course it still is.


    I think the problem has a lot to do with this fact.


    A lot of the content on the site genuinely pushes boundaries, and you, I and others may not agree with it.


    But if I didn't believe in the power of dissenting voices and opinions, I wouldn't be doing the job I do at YouTube.


    That said, we now take very seriously the spread of harmful information, hateful content and, in some cases, violence.


    Kevin: I've heard a lot about the rabbit hole effect.


    You start watching one video, then you're recommended a slightly more extreme one, and so on, until suddenly you're watching something genuinely extreme.


    Is this true?


    Neal: Yes, I've heard this characterization before.


    I think this description is partly fiction, and it's worth my taking a moment to dispel the myths in it.


    First, there is the view that recommending extreme or violent content extends users' watch time and thereby serves the company's bottom line to some extent.


    I can tell you clearly that YouTube's recommendation system is not designed in this way.


    Watch time is only one of the signals considered in the system's design and use; the design also takes into account user engagement and satisfaction.
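

    To make this concrete, here is a minimal sketch of a multi-objective ranking score, under invented assumptions: the weights, signal names and 30-minute normalization cap below are illustrative only, not YouTube's actual model. The point is simply that watch time is one blended signal among several.

```python
from dataclasses import dataclass

@dataclass
class CandidateVideo:
    video_id: str
    predicted_watch_minutes: float  # expected watch time if recommended
    predicted_engagement: float     # e.g. chance of a like/share/comment, 0..1
    predicted_satisfaction: float   # e.g. predicted user-survey score, 0..1

# Hypothetical weights: watch time does not dominate the blended score.
W_WATCH, W_ENGAGE, W_SATISFY = 0.4, 0.3, 0.3

def score(c: CandidateVideo) -> float:
    """Blend several objectives into a single ranking score."""
    # Normalize watch time into 0..1 (the 30-minute cap is arbitrary here)
    # so that no single signal can overwhelm the others.
    watch_norm = min(c.predicted_watch_minutes / 30.0, 1.0)
    return (W_WATCH * watch_norm
            + W_ENGAGE * c.predicted_engagement
            + W_SATISFY * c.predicted_satisfaction)

candidates = [
    CandidateVideo("a", 25.0, 0.2, 0.3),  # long watch time, low satisfaction
    CandidateVideo("b", 12.0, 0.6, 0.8),  # shorter watch time, high satisfaction
]
ranked = sorted(candidates, key=score, reverse=True)
print([c.video_id for c in ranked])  # ['b', 'a']: "b" wins despite less watch time
```

    Under a blend like this, a video that maximizes raw watch time can still rank below one that users engage with and report being more satisfied by.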


    Compared with other types of content, extreme content does not drive higher user engagement or longer watch time.


    Second, promoting videos with extreme content is not in our company's business interest.


    Extreme-content videos do not meaningfully increase users' watch time.


    And in many cases, advertisers do not want to be associated with such content, so whatever extra watch time radical content brings is not profitable.


    So I think it is simply false to say this problem is tied to our company's business interests.


    Kevin: Then why do people talk about the rabbit hole effect? You know, I watched a video about President Trump, and now I get a string of recommendations for partisan content.


    People think this is what happens on YouTube.


    What do you think is the reason?


    Neal: This is something we looked at closely when we revised the recommendation system, which I mentioned to you a few weeks ago.


    Our focus is on what happens in the recommended video section.


    The first point I want to make is that when a user watches a video, we recommend other videos, but the system does not consider whether those videos are more or less radical.


    So when we look at the data, as you might expect, it is hardly surprising that the recommended videos are related to the video the user just watched.


    A video the user picks from the recommended video section may well be somewhat more extreme than the one before it.


    But users also watch what you might call mainstream videos, which are not that extreme.


    Which kind of video gets watched depends on the user's own behavior.


    Some users could have clicked on the more extreme content at first, but in fact they watch the regular videos.


    That is what we found when we studied the recommendation question carefully.


    That doesn't mean we don't want to solve the problem we're talking about; it's just...


    Kevin: Excuse me, may I interrupt? I want to be clear: are you saying there is no rabbit hole effect on YouTube?


    Neal: I'm trying to explain to you the nature of the problem.


    When a user watches a video, they see a series of recommended videos.


    Some of those videos may strike viewers as quite extreme, while others may not.


    Once again, to be clear: YouTube's recommendation system does not do this deliberately, because extremeness is not one of its considerations. It is a conclusion users draw from looking at the recommended video section.


    I'm not saying a user can't click on one of those so-called extreme videos, watch it, then follow the system's recommendations and keep moving in one direction or the other. What I am saying is that this is not inevitable.


    Kevin: For breaking news, YouTube shows users authoritative information from vetted, reliable sources rather than relying on the conventional recommendation algorithm, which fundamentally changes how recommendations and search results operate.


    So why not do that across the board?


    Neal: I'd like to make a few points on this.


    First, relying on vetted or reliable sources can indeed be applied beyond breaking news to other kinds of information.


    That said, when the approach is extended to other areas, its effectiveness drops sharply, and there is a corresponding cost.


    For example, many use cases, as you know, are not about information search at all, so how would you designate "reliable" sources across the whole of YouTube? Entertainment, such as music and comedy, is usually driven by personal taste, and it is hard to regulate that way.


    Kevin: Right, but it could be done just on the political side; for example, require all political videos to come from reliable sources.


    Neal: I think even in a field as broad as politics there would be a cost.


    I would just point out one thing: political speech would then be limited to a predetermined set of views.


    I think, especially when it comes to things like politics that are dynamic and have an impact on society, there needs to be a platform where different voices can be heard.


    Kevin: Since the New Zealand shootings, we have heard this question: many platforms work together to crack down on ISIS-related content, so why don't they do the same against white supremacism or violent right-wing extremism? What do you think the answer is?


    Neal: First, with the shooting in New Zealand there were actually two challenges.


    One is what we've just been talking about: presenting authoritative, high-quality information rather than conspiracy theories or harmful information.


    Another challenge has to do with the speed at which videos are re-uploaded to different platforms.


    Re-upload speed is one of the areas where we work closely with other platforms.


    We work closely together to make sure every company has the video's hashes and can share them.
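

    As a rough sketch of that hash-sharing idea, under stated assumptions: the blocklist variable and the byte-level SHA-256 below are invented for illustration, whereas real cross-platform systems share perceptual hashes that also catch re-encoded or trimmed copies.

```python
import hashlib
from pathlib import Path

# Hypothetical shared blocklist of known-bad video fingerprints,
# e.g. contributed by multiple platforms after an incident.
shared_blocklist: set[str] = set()

def fingerprint(video_path: Path) -> str:
    """Hash a video file's bytes in 1 MiB chunks.

    A byte-level hash only catches exact re-uploads; production systems
    use perceptual hashing so altered copies still match.
    """
    h = hashlib.sha256()
    with video_path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_known_upload(video_path: Path) -> bool:
    """True if this upload matches a fingerprint another platform shared."""
    return fingerprint(video_path) in shared_blocklist
```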


    Second, more generally, on violent extremism and restricting these videos on the platform: the reason the same measures applied to ISIS are not applied to white supremacism or violent right-wing extremism is that ISIS videos take a specific form.


    They are often produced for propaganda and recruitment, so they carry recognizable brands and logos, both visually and in the music they tend to use.


    Those cues let us identify and remove the content.
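

    As a toy illustration of that cue-based detection (the signal names and threshold are hypothetical; the interview only says that branding and music serve as cues):

```python
from dataclasses import dataclass

@dataclass
class ContentSignals:
    """Hypothetical per-video signals; names invented for this sketch."""
    logo_match: float   # similarity to known extremist branding, 0..1
    audio_match: float  # similarity to known propaganda audio, 0..1

def should_review(signals: ContentSignals, threshold: float = 0.8) -> bool:
    """Queue a video for removal review when any single cue is strong.

    A real pipeline would combine many more signals plus human review.
    """
    return max(signals.logo_match, signals.audio_match) >= threshold

print(should_review(ContentSignals(logo_match=0.95, audio_match=0.1)))  # True
```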


    And of course, we are working with other platforms to remove those videos as well.


    But in some cases, the difficulties are even greater.


    For example, the line between hate speech and political speech that people find offensive and object to is often very blurry and hard to judge.


    People may feel something is hate speech even when it comes from political speech, such as remarks by candidates in an election.


    Kevin: A lot has been said about how YouTube has become an alternative form of media over the years.


    People don't go to YouTube because they want to see the same content they could see on TV.


    Rather, people go to YouTube because they have built a relationship of trust with its bloggers.


    When Logan Paul releases a documentary about flat-earth theory, or Shane Dawson questions whether 9/11 really happened, people come to treat YouTube as the place to find answers, and maybe that is what makes the problem so hard to reverse.


    Neal: I've been thinking a lot about this, and I think everything we've talked about in the last half hour lines up with my view of it.


    Personally, I think this comes down to the fact that YouTube gets 2 billion user visits each month.


    Everyone comes to YouTube for a specific reason.


    They come either for the latest or best music videos, or for YouTube's original videos, or for their favorite bloggers.


    Another important reason is information.


    The proportion of people accessing YouTube for information has been increasing over the past few years, which partly reflects what is happening in the world.


    And I think people coming to YouTube for information has shifted the way we think about the platform's responsibility.


    The result of that shift is that the product team is weighing all the possible solutions, many of which we have discussed here.


    These solutions are an expression of the platform's responsibility: they ensure that YouTube does everything it can to give users information when they come looking for it.


    But YouTube still leaves the initiative with users: what they intend to do and what information they seek.


    I think we have made real progress in this area, but obviously there is still a lot of work to be done.


    We will continue to work hard.


