Balancing innovation and regulatory compliance for online child safety 


Innovation and regulation must work together to foster trust in new technologies. Innovation pushes the boundaries of what can be achieved, keeping the industry continuously moving forward.

Regulation, meanwhile, drives the adoption of frameworks, providing mechanisms for accountability and helping organisations adapt to evolving challenges. It guides technological progress in a way that aligns with societal values and long-term goals.

On one hand, digital innovation often happens at an accelerated pace. It drives the development of advanced new tools and platforms online, generating new ways for people to interact and share content. However, this can create online spaces that raise fresh considerations and challenges for online safety. When environments are created without the correct regulatory measures in place, people, especially children and young adults, can be exposed to inappropriate, harmful or illegal content, or to other people with nefarious intentions.

On the other hand, regulators can be slow to catch up with innovation, often stifling ambitious ideas and hindering growth and development within the industry. Therefore, it is imperative that the two work hand in hand.

Finding the balance and practical solutions

Regulation and innovation should create an environment that protects consumers and allows businesses to thrive: one that paves the way for an online environment that’s both safe and full of exciting opportunities. It’s a complex balance to strike, but online safety should always be considered first before rolling out new tools and services that potentially put young people at risk.

While overly restrictive regulations can hinder creativity, lax rules can lead to unintended consequences, so the right middle ground must be found. Regulators, industry leaders and innovators must therefore keep an open dialogue to create clear, achievable regulatory frameworks that can adapt to new developments. Businesses should work with experts to understand and implement the practical solutions that best fit their platform's risks and use cases.

When developing online tools and services, online platforms and social media firms should consider the following key areas:

- monitoring the content they or their users produce
- ensuring only those of the correct age can access age-restricted content, services, or products
- restricting or monitoring communication between specific age brackets

Age assurance

Platforms that offer age-restricted content, services or products must implement robust age checks that clearly determine the ages of their users, ensuring they deliver age-appropriate experiences for minors while complying with legislative requirements.

Common use cases of age assurance include age-gating, whereby platforms can restrict access to content, products, or services based on a user’s age. This is commonly seen in various industries, such as alcohol, tobacco, and adult entertainment. Platforms can also focus on creating online environments with activities, content, or products suitable for specific age groups and restricting communication between certain age brackets. This is often seen on social media, video sharing, and online video gaming platforms.
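In code terms, age-gating reduces to a simple policy check once an assured age is available. A minimal sketch in Python, where the category names and age thresholds are purely illustrative assumptions, not any specific platform's policy:

```python
# Illustrative minimum ages per restricted category; thresholds vary by
# jurisdiction and are assumptions here, not legal guidance.
RESTRICTED_CONTENT_MIN_AGE = {
    "alcohol": 18,
    "tobacco": 18,
    "adult_entertainment": 18,
    "teen_social_feed": 13,
}

def can_access(assured_age: int, category: str) -> bool:
    """Allow access only when the assured age meets the category's threshold.

    Categories with no entry are treated as unrestricted.
    """
    min_age = RESTRICTED_CONTENT_MIN_AGE.get(category)
    if min_age is None:
        return True  # unrestricted category
    return assured_age >= min_age
```

The real complexity lies not in this check but in obtaining a trustworthy `assured_age` in the first place, which is where the assurance methods below come in.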

There is no one-size-fits-all approach, online or offline, so optionality is key. Businesses should work with safety tech providers to offer a range of methods that address individual preferences, minimise business disruption and mitigate the risks of bias and exclusion. A range of innovative methods is available to estimate age, including email address analysis and facial age estimation, alongside verification methods such as government-issued ID, credit card, mobile phone number, and name and address checks.
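The optionality described above can be pictured as a fallback chain: each assurance method is tried in the user's preferred order, so no single method excludes a user outright. A minimal sketch, where the method functions and their ordering are hypothetical stand-ins rather than any provider's API:

```python
from typing import Callable, Optional

# Each method takes a user record and returns an estimated/verified age in
# years, or None if it cannot produce a result for that user.
AgeMethod = Callable[[dict], Optional[int]]

def assure_age(user: dict, methods: list[AgeMethod]) -> Optional[int]:
    """Try each age-assurance method in order and return the first result.

    Returning None means no method succeeded, so restricted access
    should be denied rather than assumed.
    """
    for method in methods:
        age = method(user)
        if age is not None:
            return age
    return None
```

Offering several routes to the same outcome is what keeps the check low-friction for most users while still covering edge cases.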

Ultimately, these technologies must be robust, low-friction and privacy-preserving, ensuring minimal disruption to the user journey. Implementing age assurance measures shouldn't cost platforms legitimate business, and educating users on why these measures are necessary is critical to maintaining business as usual. Clarity, a frictionless user experience and minimal business disruption will help boost trust in technology while keeping organisations compliant.

Content moderation

In an environment where one in 10 children is exposed to age-restricted content within 10 minutes of going online, it has never been more critical for platforms to have solutions in place that can prevent, monitor and remove harmful, age-restricted or illegal content from their websites.

Content moderation solutions incorporating AI, human reviewers or, ideally, both can remove harmful and illegal content as soon as it is identified or, better still, prevent it from being published in the first place. This creates better, safer experiences on platforms while ensuring regulatory compliance.
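The hybrid AI-plus-human approach typically works as a triage: an automated classifier blocks clear violations before publication, routes borderline cases to human review, and publishes the rest. A minimal sketch, where the score thresholds are illustrative assumptions rather than recommended values:

```python
def moderate(classifier_score: float,
             block_threshold: float = 0.9,
             review_threshold: float = 0.5) -> str:
    """Triage a post by its harm-classifier score (0.0 = benign, 1.0 = harmful).

    Thresholds are hypothetical; real systems tune them per harm type
    and jurisdiction.
    """
    if classifier_score >= block_threshold:
        return "blocked"        # clear violation: prevented from publication
    if classifier_score >= review_threshold:
        return "human_review"   # borderline: queued for a human moderator
    return "published"          # benign: goes live immediately
```

Blocking at publication time, rather than removing content after the fact, is what makes the "prevent it from being published in the first place" outcome possible.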

Regulators globally are actively developing guidance for protecting children from online risk, and enforcement is either live or imminent in many jurisdictions. For example, the UK's Online Safety Act 2023 requires technology platforms to keep children safe online and ensure content is age-appropriate, legal and within specific guidelines. The regulator, Ofcom, can also hold websites accountable for exposing users to harmful content, with hefty fines for firms that fail to comply.

Striking the balance

Innovation and technology adoption are happening faster than ever, and collaboration with regulators is key. Regulatory bodies need to take the lead, proactively creating and enforcing frameworks that allow innovation to flourish without undue risk. For example, regulations in the tech industry might focus on data privacy, ensuring new technologies don't compromise personally identifiable information. In turn, this fosters trust and encourages quicker, wider adoption.

Above all, online platforms must keep children safe by preventing them from viewing or accessing age-restricted content or environments. Safety technology providers can assist platforms in the effective implementation of these tools to ensure compliance and help safeguard children and society online.



Lina Ghazal

Head of Regulatory and Public Affairs at VerifyMy, specialising in online ethics, regulation and safety. Previous roles include Meta (formerly Facebook) and Ofcom.




