
Australians will soon be required to prove their age to access pornographic websites, R-rated video games and explicit chatbots when new online safety rules come into effect on Monday.
Under the new age-restricted material codes, search engines, social media platforms, websites, app stores, gaming providers and generative AI systems will be required to introduce safeguards to prevent children from accessing age-inappropriate content online.
The restrictions will apply to pornography, violent content and other harmful online material that could negatively affect young users.
Research conducted by the eSafety Commission found that one in three children aged 10 to 17 has been exposed to sexual images or videos online, and more than 70 percent have seen or heard violent or harmful content on the internet.
eSafety Commissioner Julie Inman Grant warned that children’s emotional and psychological well-being is at risk without stronger online protections. She said the new rules aim to ensure young users are protected in online spaces where they spend much of their time.
Under the new system, if a young person searches for harmful material online, the first results will direct them to support services rather than harmful content.
Adults will still be able to access legal adult content, but they may be required to verify their age before using services that host explicit material. Search engines such as Google will also blur explicit search results by default unless an adult user is logged into their account.
Websites hosting pornography will also be required to introduce stronger age-verification systems instead of relying on simple “Are you 18 or older?” prompts.
Companies that fail to implement effective age-restriction measures could face fines of up to $49.5 million per violation. The eSafety Commission said it will closely monitor online providers to ensure the new regulations are properly enforced.