
YouTube executives ignored warnings, letting toxic videos run rampant

A year ago, Susan Wojcicki was on stage defending YouTube. Her company, hammered for months for fuelling false information online, was facing yet another flare-up: a conspiracy theory video about the Parkland, Florida high school shooting that claimed the victims were "crisis actors".

Wojcicki, YouTube's chief executive officer, is a reluctant public ambassador, but she was at the South by Southwest conference in Austin to unveil a solution she hoped would help quell conspiracy theories: a small text box, drawn from sites such as Wikipedia, that would sit under videos questioning well-established facts, such as the moon landing, and connect viewers with the truth.
Wojcicki's media giant, bent on overtaking television, is expected to bring in sales of more than US$16bil (RM65.33bil) a year.
But on that day, Wojcicki compared her video site to a very different kind of institution. "We're really more like a library," she said, staking out a familiar position as a defender of free speech. "There have always been controversies, if you look back at libraries."

Since Wojcicki took the stage, prominent conspiracy theories on the platform, including one about child vaccinations and another linking Hillary Clinton to Satanic worship, have stoked outrage among lawmakers eager to regulate technology companies.
A year later, YouTube is even more entwined with the darker parts of the web. The problem is not simply that videos questioning the moon landing or the efficacy of vaccines are on YouTube. This massive "library", generated by users with little editorial oversight, is bound to contain untrue nonsense. Instead, YouTube's problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.
Wojcicki and her deputies know this. In recent years, scores of people inside YouTube and Google, its owner, have raised concerns about the mass of false, incendiary and toxic content that the world's largest video site surfaces and spreads.
One employee wanted to flag troubling videos that fell just short of the hate speech rules and stop recommending them to viewers. Another wanted to track those videos in a spreadsheet to chart their popularity. A third, alarmed by the spread of "alt-right" video bloggers, created an internal vertical showing just how popular they were. Each time, they got the same basic response: don't rock the boat.
The company spent years chasing one business goal above all others: "engagement", a measure of the views, time spent and interactions with online videos. Conversations with more than twenty people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.
Wojcicki "would never put her fingers on the scale", said one person who worked for her. "Her view was, 'My job is to run the company, not deal with this.'" This person, like the others interviewed by Bloomberg News, asked not to be named for fear of retaliation.
YouTube declined Bloomberg's requests to speak with Wojcicki, with other executives at Google, and with the board of Alphabet Inc, its parent company. Last week, Neal Mohan, its chief product officer, told The New York Times that the company has "made great strides" in addressing its problems with recommendations and radical content.
A YouTube spokesperson contested the notion that Wojcicki is inattentive to these issues and that the company prioritizes engagement above all else. Instead, the spokesperson said, the company has spent the past two years focused squarely on finding solutions to its content problems. Since 2017, YouTube has recommended videos based on a metric called "responsibility", which includes input from the satisfaction surveys it shows after videos. YouTube declined to describe the metric more fully, but said it receives "millions" of survey responses every week.
"Our main focus is on addressing some of the most difficult content challenges on the platform," a spokesperson said in an email statement . ".
"We have taken some important steps, including updating our recommendation system to prevent the spread of harmful error messages and improve the news experience on YouTube, bringing Google's focus on content issues to 10,000 people, investing in machine learning is able to find and delete offending content faster, reviewing and updating our policies-only in 2018, we have updated more than 30 policies.
Responsibility remains our priority, and that is not the purpose.
In response to criticism about prioritizing growth over safety, Facebook Inc has proposed a dramatic shift in its core product. YouTube, by contrast, still struggles to explain any new corporate vision to the public and investors, and sometimes to its own staff. Five senior people who left YouTube and Google in the past two years privately cited the platform's inability to tame extreme, disturbing videos as the reason for their departure. Within Google, YouTube's failure to fix its problems has remained a major gripe.
In recent weeks, a deadly measles outbreak has drawn public attention to vaccination conspiracies on social media, and YouTube's inertia has come into focus once again. New data from Moonshot CVE, a London-based firm that studies extremism, found that fewer than twenty YouTube channels spreading these lies reached more than 170 million viewers, many of whom were then recommended other videos laden with conspiracy theories.
The company's lackluster response to explicit videos aimed at children has drawn criticism from the tech industry itself. Patrick Copeland, a former Google director who left in 2016, recently posted a damning indictment of his old company on LinkedIn. While watching YouTube, Copeland's daughter was recommended a clip that featured both a Snow White character drawn with exaggerated sexual features and a horse engaged in a sex act. "Most companies would fire someone for watching this video at work," he wrote. "Unbelievable!!" Copeland, who spent a decade at Google, decided to block the YouTube.com domain.
Micah Schaffer joined YouTube in 2006, nine months before it was acquired by Google, and well before it had become part of the cultural firmament. He was assigned to write policies for the freewheeling site. At the time, YouTube was focused on convincing people why they should watch videos from amateurs and upload their own.
A few years later, when he left YouTube, the site was still unprofitable and was known mostly for frivolity ("David After Dentist", a home video of a woozy seven-year-old just back from a dental procedure, was the second most-watched video that year).
But even then, there were problems with malicious content. Around that time, YouTube noticed an uptick in videos praising anorexia. In response, staff moderators frantically combed through the clips, restricting them by age, cutting them from recommendations or deleting them altogether. They "threatened the health of our users", Schaffer recalled.
He was reminded of those days recently, as videos about the risks of vaccination began to spread on YouTube. In his era, he thought, the decision would have been a no-brainer. "We would have severely restricted them or banned them entirely," Schaffer said. "YouTube should never have allowed dangerous conspiracy theories to become such a dominant part of the platform's culture."
Somewhere over the past decade, he added, YouTube put the chase for profits above the safety of its users. "We may have been hemorrhaging money," he said. "But at least dogs riding skateboards never killed anyone."
Beginning around 2009, Google took tighter control of YouTube. It brought in executives, such as Robert Kyncl, formerly of Netflix, who devised a technology strategy and business plan to sustain its rapid growth. In 2012, YouTube concluded that the more people watched, the more ads it could run, and that recommending videos, alongside a clip or after one finished, was the best way to keep eyes on the site.
So YouTube, then run by Google veteran Salar Kamangar, set a company-wide objective of reaching one billion hours of viewing a day, and rewrote its recommendation engine to maximize that goal. When Wojcicki took over in 2014, YouTube was a third of the way there, she recalled in investor John Doerr's 2018 book Measure What Matters. "They thought it would break the internet! But it seemed to me that such a clear and measurable goal would energize people, and I cheered them on," she told Doerr. "The billion hours of daily watch time gave our tech people a North Star." YouTube hit the mark in October 2016.
That fall, three Google coders published a paper on how YouTube's recommendation system coped with the massive volume of newly uploaded videos. It outlined how YouTube's neural network, an artificial intelligence system that mimics the human brain, could better predict what a viewer would watch next. The research noted how the AI could try to suppress "clickbait", videos that lie about their subject matter and lose viewers' attention. But it made no mention of the landmines, the misinformation, political extremism and repellent children's content that have since garnered millions upon millions of views and rattled the company. Those topics rarely came up before the 2016 U.S. election.
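The mechanics matter here: a ranker that maximizes expected watch time will demote clickbait (clicked, then quickly abandoned) while happily promoting anything that holds attention, outrage included. The sketch below illustrates that distinction under stated assumptions; it is a toy example, and every name and number in it is invented, not YouTube's actual system.

```python
# Illustrative sketch only -- not YouTube's code. It contrasts ranking
# by click probability with ranking by expected watch time, the broad
# idea described in the 2016 recommendations paper.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float           # model's predicted click probability (assumed)
    expected_minutes: float  # predicted watch time if clicked (assumed)

def rank_by_click(candidates):
    """Naive ranking: optimizes clicks, so clickbait that loses
    viewers right after the click can still rise to the top."""
    return sorted(candidates, key=lambda c: c.p_click, reverse=True)

def rank_by_watch_time(candidates):
    """Watch-time ranking: score = p(click) * expected minutes watched.
    This demotes clickbait people abandon, but says nothing about *why*
    a video holds attention -- outrage holds it too."""
    return sorted(candidates,
                  key=lambda c: c.p_click * c.expected_minutes,
                  reverse=True)

if __name__ == "__main__":
    pool = [
        Candidate("clickbait", p_click=0.30, expected_minutes=0.5),
        Candidate("tutorial",  p_click=0.10, expected_minutes=12.0),
        Candidate("outrage",   p_click=0.15, expected_minutes=9.0),
    ]
    print([c.video_id for c in rank_by_click(pool)])       # clickbait ranks first
    print([c.video_id for c in rank_by_watch_time(pool)])  # clickbait falls last; outrage rises
```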
"We are working hard to achieve our goals and drive the use of the site," said a former senior manager . ".
"I don't know if we really looked up.
YouTube doesn't give an exact recipe for virality. But in the race to a billion hours, a formula emerged: outrage equals attention. It is one that people on the political fringes have easily exploited, said Brittan Heller, a fellow at Harvard University's Carr Center. "They don't know how the algorithm works," she said. "But they do know that the more outrageous the content is, the more views they get."
People inside YouTube knew about this dynamic. Over the years, there were many wrenching debates about what to do with troublesome videos, the ones that don't violate its content policies and so remain on the site. Some software engineers nicknamed the problem "bad virality".
Yonatan Zunger, a privacy engineer at Google, recalled a suggestion he made to YouTube staff before leaving the company in 2016. He proposed a third tier: videos that were allowed to stay on YouTube but, because they were "close to the line" of the takedown policy, would be removed from recommendations. "Bad actors quickly get very good at understanding where the bright lines are and skating as close to those lines as possible," Zunger said. His proposal was rejected at the time by YouTube's policy heads. "I can say with a lot of confidence that they were deeply wrong," he said.
Rather than revamp its recommendation engine, YouTube doubled down. The neural network described in the 2016 research went into effect in YouTube's recommendations starting in 2015. By the measures available, it has achieved its goal of keeping people on YouTube.
"It's an addictive engine," said Francis Owen, a computer scientist who criticized YouTube's artificial intelligence system.
Owen said he had raised those concerns with YouTube staff.
They responded with skepticism, he said, or said they had no incentive to change the way the software works.
"This is not a catastrophic failure algorithm," Owen added . ".
"It works well for a lot of people and makes a lot of money.
Paul Covington, a senior Google engineer who co-wrote the 2016 recommendation engine research, presented the findings at a conference afterward. He was asked how the engineers decided what goal to give their algorithms. "It's a product decision," Covington said at the conference, referring to a separate YouTube division. "Product tells us that we want to increase this metric, and then we go and increase it. So it's not really up to us." Covington didn't respond to an email requesting comment.
A YouTube spokesperson said that, since late 2016, the company has added a measure of "social responsibility" to its recommendation algorithm. Those inputs include how many times people share a video and click its "like" and "dislike" buttons. But YouTube declined to share any more detail about the metric or its impact.
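YouTube has never said how these inputs are combined. As a purely hypothetical illustration of how shares, likes, dislikes and survey responses could temper a raw engagement score, consider the following sketch; the signal names and weights are invented for this example.

```python
# Hypothetical sketch of blending engagement with satisfaction signals.
# YouTube has not disclosed how its "responsibility" inputs are weighted;
# every weight below is an assumption made for illustration.

def responsibility_score(watch_minutes: float,
                         shares: int,
                         likes: int,
                         dislikes: int,
                         survey_satisfaction: float) -> float:
    """Combine raw engagement with user-feedback signals.

    survey_satisfaction is a 0..1 score from post-watch surveys.
    A video that racks up watch time but draws dislikes and poor
    survey responses scores lower than engagement alone would imply.
    """
    engagement = watch_minutes + 2.0 * shares          # assumed weights
    vote_total = likes + dislikes
    approval = likes / vote_total if vote_total else 0.5
    return engagement * (0.5 * approval + 0.5 * survey_satisfaction)
```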
Three days after Donald Trump was elected, Wojcicki convened her entire staff for their weekly meeting. One employee fretted aloud about the site's most-watched election-related videos. They were dominated by publishers like Breitbart News and Infowars, known for their outrage and provocation. Breitbart had a popular section called "black crime". The episode, according to one participant, prompted a broad conversation but no immediate policy decisions. A spokeswoman declined to comment on the incident, but said that "generally extreme content does not perform well on the platform".
At the time, YouTube's management was focused on a very different crisis. Its "creators", the people who upload videos to the site, were deeply frustrated. Some grumbled about pay; others threatened openly to defect to rival sites. Wojcicki and her deputies drew up a plan. YouTube called it Project Bean or, at times, "Boil The Ocean", to indicate the scale of the task. (Some inside YouTube called it BTO3, the third dramatic overhaul of YouTube, after the pushes into mobile viewing and subscriptions.)
The plan was to rewrite YouTube's entire business model, according to three former senior staffers. It centered on a way to pay creators that wasn't based on the ads their videos hosted. Instead, YouTube would pay on engagement: how many viewers watched a video and how long they watched it. A special algorithm would pool the incoming cash, then dole it out to creators, even if no ads ran on their videos. The idea was to reward video stars shorted by the existing system, such as those making sex education and music videos, which marquee advertisers found too risqué to sponsor.
Coders at YouTube labored for at least a year to make the project workable. But company managers failed to appreciate how it could backfire: paying based on engagement risked making the "bad virality" problem worse, since it could reward videos that achieved popularity through outrage. One person involved said the algorithms for doling out payments were tightly guarded. If it had gone into effect then, this person said, someone like Alex Jones, the Infowars creator and conspiracy theorist who had a huge following on the site before YouTube banned him last August, would likely have suddenly become one of the highest-paid stars on YouTube.
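Described this way, Project Bean's payout step reduces to dividing a revenue pool in proportion to engagement. The sketch below makes the failure mode concrete: nothing in such a formula distinguishes how the engagement was earned, so outrage pays exactly like anything else. The formula is an assumption for illustration; the real algorithms were reportedly closely guarded.

```python
# Hypothetical sketch of an engagement-proportional payout pool, as the
# reporting describes Project Bean. Not YouTube's algorithm; the weighting
# (views * average minutes watched) is an assumption.

def distribute_pool(pool_dollars: float,
                    channels: dict[str, tuple[int, float]]) -> dict[str, float]:
    """channels maps channel name -> (views, avg_minutes_watched).

    Each channel's share is proportional to views * avg minutes.
    Note the flaw flagged in the text: a channel that earns its
    engagement through outrage is paid exactly like any other.
    """
    weights = {name: views * minutes
               for name, (views, minutes) in channels.items()}
    total = sum(weights.values())
    if total == 0:
        return {name: 0.0 for name in channels}
    return {name: pool_dollars * w / total for name, w in weights.items()}
```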
The project went up to Google's leadership in October 2017. By then, YouTube and other social media sites were facing the first wave of condemnation for creating "filter bubbles", pointing people toward their pre-existing beliefs and then feeding them more of the same. Wojcicki's boss, Sundar Pichai, rejected YouTube's proposal, in part because he felt it could make the filter-bubble problem worse, according to two people familiar with the matter. Another person familiar with the matter said the effort was shelved over concerns it would complicate how creators are paid. YouTube declined to comment on the project.
In November 2017, YouTube finally took decisive action against channels peddling noxious videos, cutting thousands off from advertising, or from the site itself, almost overnight. Creators dubbed it "The Purge". The company was facing sustained pushback from advertisers, but the real catalyst was a surge in media coverage of disturbing videos aimed at children. The worst was "Toy Freaks", a channel where a father posted videos of his two daughters, sometimes showing them vomiting or in extreme pain. YouTube removed Toy Freaks and quickly distanced itself from it. But the channel had hardly been operating in the shadows: it reportedly had more than 8 million subscribers and had been among the 100 most-watched channels on the site. A former employee said such disturbing videos were an "open secret" inside the company, where their presence was often justified with appeals to free speech.
YouTube had held another debate over its programming for children. Before launching YouTube Kids, a dedicated app for minors, some argued that the company should offer only hand-picked videos in the service, to avoid any content mishaps. That argument lost out, and the app selects videos algorithmically.
YouTube did invest considerable sums in tackling its content problems. It hired thousands of people to screen videos for violations of the site's rules. But to some inside, the fixes took too long to arrive, or paled next to the scale of the problem. As of 2017, YouTube had no policy for how content moderators should handle conspiracy theories, according to a former moderator who specialized in foreign-language content. At the end of that year, fewer than twenty people staffed "trust and safety", the unit that oversees content policies, according to a former employee. The team had to "fight tooth and nail" for more resources from the tech giant, the person said. A YouTube spokesperson said the division has grown "significantly" since then, but declined to share exact numbers.
In February 2018, a video calling the victims of the Parkland shooting "crisis actors" went viral on YouTube's trending page. Policy staff suggested soon afterward that recommendations on the page be limited to vetted news sources. YouTube management rejected the proposal, according to a person with knowledge of the event. The person didn't know the reasoning behind the rejection, but noted that YouTube was then intent on accelerating watch time for news-related videos.
However, YouTube did soon address its issues around news-related content. Last July, YouTube announced it would add links to Google News results inside YouTube search, and began featuring "authoritative" sources from established media outlets in its news section. YouTube also gave US$25mil (RM102.08mil) in grants to news organizations that produce videos.
During the last quarter of 2018, YouTube said, it removed more than 8.8 million channels for violating its guidelines. The measures are meant to help bury troubling videos on the site, and the company now points to these efforts as a sign of its attention to its content problems.
In the past, however, YouTube actively discouraged staff from being proactive. Lawyers verbally advised employees not assigned to moderation to avoid searching on their own for questionable videos, such as viral lies about Supreme Court Justice Ruth Bader Ginsburg, according to a former executive upset by the practice. The directive was never put in writing, but the message was clear: if YouTube knew these videos existed, its legal footing grew thinner, the person said. Federal law shields YouTube and other tech giants from liability for the content on their sites, but the companies risk losing that protection if they take too active an editorial role.
Some employees looked for these videos anyway. One telling moment happened around early 2018, according to two people familiar with it. An employee decided to create a new YouTube "vertical", a category the company uses to group its mountain of footage. This person gathered the videos under an imagined vertical for the "alt-right", the political ensemble loosely tied to Trump. Based on engagement, the hypothetical alt-right category sat alongside music, sports and gaming as one of the most popular channels on YouTube, an attempt to show how critical these videos were to YouTube's business. A person familiar with the executive team said they did not recall seeing the experiment.
Still, as the company's algorithm kept causing headaches, the knives came out. Several former employees said Wojcicki inherited a business oriented around engagement and has failed to change its direction meaningfully. YouTube's head of business, Kyncl, has also been blamed; he oversees creator relations and content moderation decisions. While Wojcicki and YouTube product chief Neal Mohan have given several public addresses on content-related issues, Kyncl has been less outspoken on the subject.
Even so, the executive has made other public moves that some inside Google saw as self-promotional. Last August, a week after a damning report on the popularity of extremist videos on YouTube, he modelled a suit in an advertisement for the luxury brand Brioni. Coming when it did, the ad raised concerns among several Google employees about Kyncl's priorities, according to a person familiar with the matter. Representatives for the company and for Kyncl declined to comment.
This past January, YouTube followed the advice of Zunger, the former Google employee, and created a new tier for problematic videos. So-called "borderline content", which does not violate the terms of service, can stay on the site but will no longer be recommended to viewers. A month later, after a spate of news stories about vaccination conspiracies, YouTube said it was placing some of those videos in the category.
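Zunger's three-tier idea, and the "borderline content" policy YouTube eventually adopted, can be pictured as a simple gate in front of the recommendation pool: removal for policy violations, a non-recommendable middle tier for near-the-line videos, and full eligibility for everything else. The sketch below is a schematic reading of that description; the class names and threshold are assumptions, not YouTube's system.

```python
# A minimal sketch of the three-tier approach described above.
from enum import Enum, auto

class Tier(Enum):
    REMOVE = auto()        # violates the terms of service
    BORDERLINE = auto()    # stays on the site, excluded from recommendations
    RECOMMENDABLE = auto()

def classify(violates_tos: bool, borderline_score: float,
             threshold: float = 0.8) -> Tier:
    """borderline_score: a classifier's 0..1 estimate that a video is
    'close to the line' of the takedown policy (assumed signal)."""
    if violates_tos:
        return Tier.REMOVE
    if borderline_score >= threshold:
        return Tier.BORDERLINE
    return Tier.RECOMMENDABLE

def recommendation_pool(videos):
    """Only fully compliant videos stay eligible for recommendations;
    borderline ones remain watchable via search or direct links."""
    return [v for v in videos
            if classify(v["violates_tos"], v["borderline_score"])
            is Tier.RECOMMENDABLE]
```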
In February, Google also released a lengthy document detailing how it tackles misinformation across its services, including YouTube. "The primary goal of our recommendation systems today is to create a trusted and positive experience for our users," the document reads. "YouTube's company-wide goal is framed not just as 'growth', but as 'responsible growth'." The company has been applying the fix Wojcicki proposed a year ago: YouTube says the information panels from Wikipedia and other sources, which she debuted in Austin, are now shown "tens of millions of times a week".
A 2015 clip about vaccination from iHealthTube.com, a "natural health" YouTube channel, is one of the videos that now carries a small grey box linking to the Wikipedia entry for the MMR vaccine. Moonshot CVE, the London-based anti-extremism firm, identified the channel as one of the most consistent sources of anti-vaccination theories on YouTube.
But YouTube appears to apply the fix only sporadically. One of iHealthTube.com's most popular videos isn't about vaccines. It's a seven-minute clip titled "Every cancer can be cured in weeks". While YouTube says it no longer recommends the video to viewers, there is no Wikipedia box on its page. The clip has been viewed more than 7 million times.