
YouTube star Logan Paul has faced severe criticism for posting a video to his channel that appeared to show a body hanging in a Japanese "suicide forest," a video he later deleted and apologized for. But the platform itself also appears to share the blame.

Paul deleted the video less than 24 hours after it appeared on his channel, but it had been approved by YouTube's reviewers, according to a report in TechCrunch.

BuzzFeed first reported that the video had been approved on Jan. 1, after it was initially flagged by concerned viewers.

AS OUTRAGE INTENSIFIES, GOOGLE HIGHLIGHTS NEW WAYS TO CLEAN UP YOUTUBE

Paul's channel is part of YouTube's Red subscription service and has more than 15 million subscribers. The video in question was titled "We found a dead body in the Japanese Suicide Forest."

In a statement to TechCrunch, YouTube said:

"Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center."

The statement does not address why the video was initially approved after being flagged over the nature of its content. The company has not yet responded to a request for comment from Fox News.

Reaction on social media has been largely critical of the platform for leaving Paul's video up as long as it did.

LOGAN PAUL APOLOGIZES FOR 'SUICIDE FOREST' VIDEO: WHAT TO KNOW ABOUT THE YOUTUBE STAR

YouTube has come under extraordinary fire in recent months for the way it handles content on its platform, including video approval.

Toward the end of 2017, the company attempted to address the issues in a series of blog posts by CEO Susan Wojcicki. The posts said Google, YouTube's parent company, would grow its team to more than 10,000 employees dedicated to reviewing videos to help get rid of "problematic content" on YouTube.

Wojcicki added that the company is using machine learning to help human reviewers "find and terminate hundreds of accounts and shut down hundreds of thousands of comments."

Follow Chris Ciaccia on Twitter @Chris_Ciaccia