These regulations, part of the Online Safety Act 2023, require websites and apps to verify users' ages using tools such as facial imaging or credit card checks. The UK's communications regulator, Ofcom, will oversee implementation.
According to Ofcom Chief Executive Melanie Dawes, about 6,000 websites have agreed to adopt the measures. She said other platforms must also ensure children are protected from harmful content, including pornography, hate speech and violence.
Ofcom estimates that in the past month alone, around 500,000 children aged 8 to 14 encountered pornography online.
The new measures target not only pornography but also content related to suicide, self-harm, eating disorders and other risks. Technology companies now have a legal obligation to protect children and adults online or face sanctions.
According to the UK government, companies that violate the rules could be fined up to £18 million ($23 million) or 10% of global revenue, whichever is higher. Senior executives could also face criminal charges for failing to comply with Ofcom's information requests.
After a preparation period for industry and regulators, the rules have now come into full effect.
Technology Minister Peter Kyle said children will "experience a completely different internet for the first time". Speaking to Sky News, he said he had very high expectations for the changes.
In a separate interview with a parenting website, Kyle apologized to young people who had been exposed to harmful content.
"I would like to apologize to any child over 13 who has not received any of these protections," he said.
Rani Govender of child protection charity NSPCC called the changes "a real milestone," adding that technology companies need to take responsibility.
Prime Minister Keir Starmer's government is also considering additional regulations, including a proposal to limit social media access to two hours per day for children under 16.
Kyle said more details about the rules for young users would be announced "in the near future".