Harvard researchers have found a disturbing trend in YouTube's algorithm, which promotes videos of young children to users seeking out erotic content.
by Matt Agorist, guest writer
The company that cracks down on peaceful channels and censors alternative media has long allowed this disturbing content on its platform. Despite advertisers fleeing over such content, YouTube is once again in the spotlight for promoting pedophilia — this time in an especially troubling manner.
Researchers at the Berkman Klein Center for Internet and Society at Harvard University found a disturbing pattern when examining the recommendations YouTube's algorithm serves to users who view erotic content.
The study showed that after regular users watch erotic videos, they are recommended videos of women dressing as young girls before the algorithm eventually shows them videos of “girls as young as 5 or 6” wearing bathing suits or getting dressed.
As CNBC points out:
“According to the piece, YouTube’s recommendation system changed to no longer link some of the revealing videos together, but the company told the New York Times it was ‘probably a result of routine tweaks to its algorithms, rather than a deliberate policy change.’ YouTube also said that turning off its recommendation system on videos of children would ‘hurt creators who rely on those clicks’ but did say it would limit recommendations on videos it deems putting children at risk, the report said.
“The Berkman Klein Center didn’t immediately respond to a request for comment on whether the researchers will be publishing anything on the discovery. Google did not immediately respond to a request for comment.”
AT&T and Hasbro are among the major corporations that announced earlier this year that they will no longer purchase advertising on YouTube because it allows this pedophile content to flourish.
“Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube,” an AT&T spokesperson told CNBC.
“Hasbro is pausing all advertising on YouTube, and has reached out to Google/YouTube to understand what actions they are taking to address this issue and prevent such content from appearing on their platform in the future,” read a statement from Hasbro.
As TFTP has reported numerous times, children are the last ones YouTube appears to be concerned with, instead targeting those who’d dare challenge the status quo.
While there is certainly a free speech issue at hand with some of these videos, YouTube has no problem deleting and demonetizing channels that expose government crimes and corruption. So why does it ignore and allow these actual bad actors, and promote videos of children to people seeking out pornography?
TFTP has even been a target of this censorship on multiple occasions. On the same day we were banned from Facebook and Twitter in October of last year, YouTube doled out a strike to us as well — for a three-year-old video that also appeared on dozens of mainstream media channels.
The company has been known to target peaceful activists for challenging the paradigm, so there is no question that it has the capability to remove these videos of child exploitation. Yet it appears either incapable of, or unconcerned with, doing so. And, in fact, its algorithm appears to be actively promoting them.
Indeed, one mother was horrified to find that a video of her 10-year-old innocently playing in a backyard pool was picked up by the YouTube algorithm, which promoted it to more than 400,000 people who were viewing erotic videos, according to the NY Times.
“I’m really scared of it,” said Christiane C. “Scared of the fact that a video like this fell into such a category.”
While YouTube will likely claim this is an unintentional function of its algorithm, the intent does not matter — the end result is the same. Unsuspecting users will have this content pushed on them, a de facto promotion of pedophilia.
Although YouTube announced that it has adjusted the algorithm to try to prevent this, the study points out that the company refused to make the one change that would actually succeed.
As the NY Times points out, YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, though the platform can identify such videos automatically.