An Australian regulator has given the internet industry six months to develop an enforceable code to protect children from pornography and other inappropriate material online. If the industry fails to do so, the government says it will step in and impose mandatory regulations.
To ensure children do not prematurely encounter high-impact material, including content dealing with suicide and eating disorders such as bulimia and anorexia, the eSafety commissioner has asked key players in the internet industry to submit their plans by 3 October.
According to the commissioner, app stores, pornographic websites, dating sites, search engines, social media networks, chat platforms and multiplayer gaming services must adopt consistent measures to verify that their content is appropriate for users.
This follows an earlier phase in which the regulator approved codes governing how internet companies tackle terrorist content and child sexual exploitation material.
The regulator said measures including age verification, default parental controls and software that blurs unwanted sexual content should feature in pornography protection codes for children.
“Many parents are concerned about their children seeing violent or extreme forms of pornography,” eSafety commissioner Julie Inman Grant said in a statement, adding that parents can stop it.
A spokesperson for Alphabet’s Google said the company would work with industry stakeholders on the new rules, while Meta, the owner of Facebook and Instagram, said it was continuing productive talks with the eSafety Commissioner about the initiative.
Apple and representatives for X, formerly known as Twitter, were not immediately available for comment.