Attempts to protect children’s safety in the two-dimensional realm of online social media could adversely impact the 3D world of augmented and virtual reality, according to a report released Tuesday by a Washington, D.C., technology think tank.

Legislative efforts, like the Kids Online Safety and Privacy Act (KOPSA), which has passed the U.S. Senate and is now before the House of Representatives, could lead to harmful censorship of AR/VR content, maintained the report by the Information Technology & Innovation Foundation.

If KOPSA becomes law, AR/VR platforms may be forced to ramp up enforcement in the same manner as traditional social media platforms, the report explained.

By giving the FTC authority to deem content on these platforms harmful, it continued, the FTC may over-censor content on AR/VR platforms, or the platforms themselves may censor content to avoid liability, which could include content pertinent to children’s education, entertainment, and identity.

“One of our fears that we have with KOPSA is that it opens the door for potential over-censorship by giving the FTC [Federal Trade Commission] power to decide what qualifies as harmful,” said the report’s author, Policy Analyst Alex Ambrose.

“It’s another way for a political party to decide what’s harmful,” she told TechNewsWorld. “The FTC could say content like environmental protection, global warming, and climate change is anxiety-inducing. So we need to completely get rid of anything related to climate change because it could lead to anxiety in children.”

Andy Lulham, COO of VerifyMy, an age and content verification provider based in London, acknowledged that the specter of over-censorship looms large in discussions about online regulation. “But I firmly believe this fear, while understandable, is largely misplaced,” he told TechNewsWorld. “Well-crafted government regulations are not the enemy of free expression, but rather its guardian in the digital age.”

Lulham maintained that the key to regulation lies in the approach. “Blanket, heavy-handed regulations risk tipping the scales towards over-censorship,” he said. “However, I envision a more nuanced, principle-based regulatory framework that can enhance online freedom while protecting vulnerable users. We’ve seen examples of such balanced approaches in privacy regulations like GDPR.”

The GDPR — General Data Protection Regulation — which has been in effect since 2018, is a comprehensive data protection law in the European Union that regulates how companies collect, store, and use the personal data of EU residents.

“I strongly believe that regulations should focus on mandating robust safety systems and processes rather than dictating specific content decisions,” Lulham continued. “This approach shifts the responsibility to platforms to develop comprehensive trust and safety strategies, fostering innovation rather than creating a culture of fear and over-removal.”

He asserted that transparency will be the linchpin of effective regulation. “Mandating detailed transparency reports can hold platforms accountable without resorting to heavy-handed content policing,” he explained. “This not only helps prevent overreach but also builds public trust in both the platforms and the regulatory framework.”

“Furthermore,” he added, “I advocate for regulations requiring clear, accessible appeal processes for content removal decisions. This safety valve can help correct inevitable mistakes and prevent unwarranted censorship.”

“Critics might argue that any regulation will inevitably lead to some censorship,” Lulham conceded. “However, I contend that the greater threat to free expression comes from unregulated spaces where vulnerable users are silenced by abuse and harassment. Well-designed regulations can create a more level playing field, amplifying diverse voices that might otherwise be drowned out.”

The ITIF report noted that conversations about online safety often overlook AR/VR technologies. Immersive technologies foster social connection and stimulate creativity and imagination, it explained. Play, imagination, and creativity are all imperative for children to develop.

The report acknowledged, however, that properly addressing the risks children face with immersive technologies is a challenge. Most existing immersive technologies are not made for children under 13, it continued. Children explore adult-designed spaces, which leads to exposure to age-inappropriate content and can build harmful habits and behaviors in children’s mental and social development.

Addressing these risks will require a combination of market innovation and thoughtful policymaking, it added. Companies’ design decisions, content moderation practices, parental control tools, and trust and safety strategies will largely shape the safety environment in the metaverse.

The report acknowledged that public policy measures are crucial for addressing certain safety risks. There is already a movement among policymakers to safeguard children on traditional social platforms, which could influence regulations for augmented and virtual reality (AR/VR) technologies, according to ITIF.

The report advised that before implementing these regulations, policymakers should evaluate the safety measures already in place by AR/VR developers and ensure that these tools remain effective. Where safety measures fall short, the focus should be on creating targeted policies for confirmed dangers rather than theoretical ones.

“Most online services attempt to eliminate harmful content, yet the vast volume of such content inevitably means some will evade detection,” commented Ambrose. “The problems we currently encounter on digital platforms, like incitement to violence, destruction, and the dissemination of harmful content and misinformation, are likely to persist and evolve in immersive environments.”

“Given that the metaverse will be fueled by vast quantities of data, we can expect these challenges to be ubiquitous — perhaps even more so than we currently experience,” she further noted.

Lulham concurred with the idea presented in the report that the design choices of companies will influence the safety dynamics within the metaverse.

“From my perspective, the decisions made by companies concerning online safety are crucial in establishing a safe digital space for kids,” he commented. “The existing environment is full of dangers, and I posit that companies hold the obligation and capability to transform it.”

He emphasized that the design of user interfaces serves as the primary barrier to safeguard children. “When companies focus on developing clear, age-suitable designs, it can transform the way youngsters engage with digital platforms fundamentally,” he stated. “Interfaces that intuitively steer and enlighten users about safe practices can drastically cut down on negative experiences.”

He pointed out that content moderation is at a pivotal point. “The sheer amount of content requires a new strategy,” he noted. “Though AI-driven tools are crucial, they alone cannot fix everything. I suggest that the future should focus on a mixed method that uses both sophisticated AI and human monitoring to balance protection with freedom of expression.”

Parental control tools are essential yet frequently underutilized, he argued. Rather than being optional extras, they should be fundamental components crafted with as much care as the main application itself. He envisions a future where such tools are both natural and powerful, becoming a key part of digital family life.

He believed the success or failure of platforms could hinge on their trust and safety strategies. “Platforms that incorporate comprehensive age verification systems, continuous monitoring, and clear communication will set benchmarks,” he asserted. “Continuous dialogue with experts in child safety and policymakers will become imperative for those committed to the safeguarding of younger users.”

“Ultimately,” he elaborated, “I project that the future of online children’s safety will regard ‘safety by design’ not just as a popular phrase, but as a core doctrine that influences every facet of platform creation.”

The report highlighted that children are critical to the metaverse’s market penetration due to their role in adopting and shaping immersive technologies.

Recognizing that nurturing innovation in the emerging AR/VR sector while maintaining safety for all users is a complex challenge, the report concluded that parents, corporations, and regulators all have a part to play in harmonizing privacy and safety with the creation of engaging and innovative immersive experiences.


