I don't have a horse in this race, but in my opinion a more graceful way to deal with this is to freeze the account until the under-16 user turns 16, so they don't lose their friend connections, history, etc. The under-16 should have time to add a comment saying how to contact them elsewhere (a Discord group, etc.). There must be a reason to remove the account outright that I cannot see.
Could a possible solution be to use the same language-detection platforms used for flagging terrorist activity to also flag possible grooming for human moderator review? Or might that be too subjective for current language models, leading to many false positives?
This is far too pat a dismissal of something that happens regularly. You can argue that it's not frequent enough to justify this action, or that it would happen anyway through other means, but it's a real problem that isn't so freakishly rare we can dismiss it.
I am not sure I meant to reply to you, to be honest. It is an issue, but so far the solutions are terrible. Outsourcing parenting to the government or to companies is also meh. I am sure there are parents who know ways to reduce their children's screen time, ranging from installing a program that blocks a website or another program until certain conditions are met, to taking the phone from the kid's hand and going for a walk or studying, whatever.