YouTube goes strict. In an effort to prevent children from watching adult content, YouTube, a unit of Alphabet Inc.'s Google, has stepped up enforcement of its guidelines for videos aimed at children. Google also confirmed that the measure is a response to criticism that it has failed to protect children from adult content.

According to a recent blog post by YouTube vice president Johanna Wright, as part of this measure, the streaming video service removed more than 50 user channels in the last week and has stopped running ads on over 3.5 million videos since June.

The post further read: “Across the board, we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge.”

YouTube has become one of Google’s fastest-growing operations in terms of sales by simplifying the process of distributing video online while putting in place few limits on content.

Parents, regulators, advertisers and law enforcement officials have become increasingly concerned about the open nature of the service. They contend that Google must do more to banish and restrict access to inappropriate videos, whether it is propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned.

Concern about children’s videos gained new momentum in the last two weeks after reports published by BuzzFeed and The New York Times, along with an online essay by British writer James Bridle, pointed out questionable clips.

Several forum posts on Wednesday showed support for YouTube’s actions while noting that this vetting must expand even further.

YouTube’s Wright also cited “a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not” for the new efforts “to remove them from YouTube.”

The company relies on review requests from users, a panel of experts and an automated computer program to help its moderators identify material possibly worth removing.

Moderators are now instructed to delete videos “featuring minors that may be endangering a child, even if that was not the uploader’s intent,” Wright wrote. She added that videos with popular characters “but containing mature themes or adult humor” will be restricted to adults.