Coroner Andrew Walker calls for social media content for adults and children to be separated.
A coroner has written to social media firms and the government calling for action following the inquest into the death of schoolgirl Molly Russell.
The 14-year-old, from Harrow, ended her life in November 2017 after viewing suicide and self-harm content online.
Coroner Andrew Walker issued six recommendations including separating platforms for adults and children and reviewing algorithms used by sites.
Molly’s father Ian urged social media firms not to “drag their feet”.
At the inquest held last month at North London Coroner’s Court, the coroner concluded the schoolgirl died while suffering from the “negative effects of online content”.
In a prevention of future deaths report sent on Thursday to firms including Meta, Pinterest, Twitter and Snapchat, as well as the UK government, Mr Walker identified these concerns and action points for media firms and government to consider:
- Separate platforms for adults and children
- Age verification before joining a platform
- Provision of age-specific content
- Review the use of algorithms to provide content
- Government to review the use of advertising
- Parental, guardian or carer control including access to material viewed by a child, and retention of material viewed by a child
The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, and the use of algorithms to serve content alongside adverts.
Other issues included the lack of access or control for parents and guardians, and the absence of any capability to link a child's account to a parent or guardian's account.
In his report, Mr Walker said: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content.”
He also recommended that “consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content”.
The coroner added that while any regulation “would be a matter for government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation”.
“I believe you and/or your organisation have the power to take such action,” he wrote.
Meta, Pinterest, Twitter and Snapchat now all have 56 days to respond with a timetable of action they propose to take or explain why no action is proposed.
Molly’s father Ian Russell said he welcomed the coroner’s report and called on social media firms “to heed the coroner’s words and not drag their feet waiting for legislation and regulation”.
“They should think long and hard about whether their platforms are suitable for young people at all.”
He added that the government “must also act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content, and that platforms and their senior managers face strong sanctions if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly”.
Representatives from both Meta, Instagram’s parent company, and Pinterest gave evidence during the inquest.
Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family say “encouraged” suicide, were safe, while Pinterest’s Judson Hoffman told the inquest the site was “not safe” when the schoolgirl used it.
Reacting to the report, Meta agreed that “regulation is needed” and it was “reviewing” the recommendations.
“We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving,” the firm added.
In a statement, Pinterest said it was “committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care”.