Nevada Attorney General Aaron Ford has negotiated what may be the most sweeping child-safety agreement yet from an online platform company. According to the announcement released on April 15, Roblox has agreed to pay $12.5 million to settle the state's claims and avoid a lawsuit.


The timing of the deal could hardly be better, given Roblox's immense underage user base. Nearly half of US children under 16 are active on Roblox, and 42% of its total user base is younger than 13. That reach has drawn considerable attention from regulators concerned about children's online interactions.


The agreement was framed not as a one-off case but as a model for how other tech companies should handle child-safety issues. The bottom line is simple: fail to address potential dangers promptly, and you risk being sued.


The settlement money is divided into three major categories.


Roblox Spends $12.5 Million to Protect Kids and Tighten Security


Of the $12.5 million, $10 million will go to youth organisations, including the Boys & Girls Clubs. These funds will support initiatives that encourage children to spend less time on screens.


Another $1.5 million will fund a law enforcement liaison. This position will serve as an intermediary between Roblox and any police team that needs information from the company. When safety issues arise, police will receive prompt responses rather than waiting for extended periods.



Lastly, the remaining $1 million will fund an internet-safety campaign aimed at educating parents and kids on the proper use of such platforms and how to identify threats.


The agreement goes beyond monetary contributions: it also mandates changes within the platform itself.


Every user must provide proof of age, either through facial recognition software or an official state ID. The requirement is intended to stop minors from falsely claiming an older age to circumvent safety measures.


In addition, stricter chat regulations will be introduced. Adults and children under 16 may not chat with each other unless they are listed as trusted friends, a status that requires users to verify that they know each other personally.


Nevada Reaches Landmark Safety Agreement with Roblox: Encryption Curtailed for Minors


The agreement also changes how Roblox handles in-app communication. Messages created by minors will no longer be encrypted, meaning law enforcement can access them during investigations of child-abuse crimes. Advocates believe this will help protect kids; privacy concerns may be raised, but the state's priorities are evident.


Notifications will change as well. Minors under 13 will receive no notifications outside the application, and teens aged 13 to 18 will receive none at night.


The agreement is not a standalone measure: Ford's office is also suing other major social media platforms, including Meta, TikTok, Snapchat, YouTube, and Kik.


The allegations are much the same: insufficient safety measures, inadequate age verification, and risky designs for online interaction. Roblox, meanwhile, continues to face lawsuits in Texas and Kentucky.


The agreement sets the tone in Nevada: regulators are prepared to negotiate, provided companies change how they operate.


A New Regulatory Blueprint for Child Protection


It may also set an example for other states, since the agreement does not rely solely on penalties; it requires product changes as well. Other states may adopt the same approach to compel quick action from platforms.


It also shows that safeguarding children is becoming a priority in regulators' eyes. Platforms must introduce age verification and other controls that prevent contact between children and adults, and they must give authorities access to their products.


Platforms can either take measures to improve the security of their users or face legal consequences for negligence.


Ford’s point of view is straightforward: any platform that provides its services to children should be built in such a way as to ensure their safety.


