
YouTube to change its recommendation algorithm to stop promoting conspiracy videos
On Friday, YouTube said it is retooling the recommendation algorithm that suggests new videos to users, in order to stop promoting conspiracies and false information. The move reflects a growing willingness to curb misinformation on the world's largest video platform after several public missteps.
In a blog post published Friday, the company said it was taking a closer look at how it can reduce the spread of content that "comes close to" breaking its rules but "doesn't quite cross the line." YouTube has been criticized for steering users toward conspiracies and false content when they begin by watching legitimate news.
The change to the company's so-called recommendation algorithms is the result of a six-month technical effort. It will be small at first: YouTube said it would apply to less than one percent of the site's content and would affect only English-language videos, meaning that much unwanted content will still slip through the cracks.
The company emphasized that none of the videos would be deleted from YouTube. They will still be available to people who search for them or subscribe to the channels that post them.
"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," the blog post said.
YouTube, which has historically taken a hands-off approach rooted in free-speech concerns, does not prohibit conspiracy theories or other forms of false information. The company does ban hate speech, which it defines as speech that promotes violence or hatred against vulnerable groups.
Critics say those policies do not go far enough to keep people from encountering misleading information, and that the company's own software often pushes users toward the political fringes by feeding them extremist content they were not looking for.
YouTube's recommendation feature suggests new videos to users based on the videos they have previously watched. The algorithm takes into account "watch time," the amount of time people spend watching a video, along with the number of views, as key signals in deciding whether to promote content. If a video is watched many times, the software may conclude it is a quality video and automatically begin promoting it to others. Since 2016, the company has also incorporated satisfaction, likes, dislikes and other metrics into its recommendations.
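The engagement signals described above can be sketched as a simple scoring heuristic. To be clear, this is an invented illustration: the signals (watch time, views, likes and dislikes) come from the article, but the weights, normalization and function name are assumptions, not YouTube's actual formula.

```python
# Illustrative sketch of an engagement-based ranking score.
# The signals are those named in the article; the weights and the
# square-root view normalization are invented for illustration only.

def recommendation_score(watch_time_sec, views, likes, dislikes):
    """Combine engagement signals into a single hypothetical ranking score."""
    avg_watch = watch_time_sec / views if views else 0.0      # mean seconds watched
    satisfaction = (likes - dislikes) / max(likes + dislikes, 1)
    # Assumed weights: watch time dominates, per the article's emphasis.
    return 0.7 * avg_watch + 0.2 * (views ** 0.5) + 0.1 * satisfaction * 100

# A long-watched video outranks a briefly watched one with identical view
# and like counts, which is how such a heuristic can reward content that
# keeps people watching regardless of its accuracy.
a = recommendation_score(watch_time_sec=500_000, views=1_000, likes=900, dislikes=100)
b = recommendation_score(watch_time_sec=50_000, views=1_000, likes=900, dislikes=100)
print(a > b)  # True
```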
But the algorithm's emphasis on engagement can sharply tilt it toward promoting extremist ideas. In December, The Washington Post reported that YouTube continued to recommend hateful and conspiratorial videos, including racist and anti-Semitic content.
YouTube has previously adjusted its software to stop conspiracy theories from spreading during breaking news events. Last February, in the aftermath of the shooting at a high school in Parkland, a conspiracy theory claiming that a survivor of the shooting was a so-called "crisis actor" rose to the top of YouTube. After the Las Vegas massacre in October 2017, videos claiming the shooting was a hoax drew millions of views.
YouTube's search feature has also been criticized for promoting conspiracies and false content. Earlier this month, for example, a search for "RBG," the nickname of Supreme Court Justice Ruth Bader Ginsburg, returned a slew of far-right videos pushing conspiracy theories about her, and little authoritative content about her recovery from surgery.
Six months ago, YouTube began recruiting human evaluators, who were asked to review videos against a set of guidelines. The company then took the evaluators' assessments and used them to train the algorithms that generate recommendations.
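One plausible way rater judgments could feed back into ranking is by demoting videos that human evaluators score as borderline. The article only says the assessments were used to train the recommendation algorithms; the threshold-and-penalty logic, function name and scores below are hypothetical.

```python
# Hypothetical sketch of folding human evaluators' ratings into ranking.
# The demotion-by-threshold rule is an invented illustration, not a
# description of YouTube's actual system.

def demote_borderline(videos, rater_scores, threshold=0.5, penalty=0.1):
    """Scale down the rank score of videos human raters flagged as borderline.

    videos:       dict of video_id -> base ranking score
    rater_scores: dict of video_id -> mean rater quality score in [0, 1]
    """
    adjusted = {}
    for vid, score in videos.items():
        quality = rater_scores.get(vid, 1.0)   # unreviewed videos pass through
        adjusted[vid] = score * penalty if quality < threshold else score
    return adjusted

# A highly engaging but low-quality video ends up ranked below ordinary news,
# even though its base engagement score was higher.
ranked = demote_borderline({"news_clip": 8.0, "hoax_clip": 9.0},
                           {"news_clip": 0.9, "hoax_clip": 0.2})
print(ranked["hoax_clip"] < ranked["news_clip"])  # True
```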
© The Washington Post 2019