The team says that YouTube needs to do more, especially when it comes to handling content as sensitive as videos dealing with suicide help and prevention. One key proposal is that the recommendation algorithm should be fine-tuned so that it does not surface such content as suggested viewing, because that can pull users down a “rabbit hole” of disturbing material. Another suggestion is that AI-based tools should be deployed for monitoring and timely intervention, so that YouTube content does not end up exacerbating users’ loneliness and anxiety.
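To make the first proposal concrete, here is a minimal sketch of what such a guardrail could look like: candidate videos are checked against a sensitive-topic filter before they ever reach the suggested-viewing feed. Everything here (the Video class, the keyword list, the function names) is a hypothetical illustration, not YouTube’s actual pipeline, and a real system would rely on a trained classifier rather than keyword matching.

```python
# Hypothetical sketch: keep sensitive videos searchable, but never
# auto-suggest them. Names and keyword matching are illustrative only.
from dataclasses import dataclass, field

SENSITIVE_KEYWORDS = {"suicide", "self-harm", "self harm"}

@dataclass
class Video:
    video_id: str
    title: str
    tags: list = field(default_factory=list)

def is_sensitive(video: Video) -> bool:
    """Crude keyword check standing in for a real topic classifier."""
    text = " ".join([video.title.lower(), *(t.lower() for t in video.tags)])
    return any(keyword in text for keyword in SENSITIVE_KEYWORDS)

def build_suggested_feed(candidates: list) -> list:
    """Return only videos safe for algorithmic suggestion; sensitive items
    are excluded so users can still find them via search, but are never
    funneled toward them automatically."""
    return [v for v in candidates if not is_sensitive(v)]

if __name__ == "__main__":
    feed = build_suggested_feed([
        Video("a1", "10-minute yoga routine", ["fitness"]),
        Video("b2", "Coping with suicide loss", ["mental health"]),
    ])
    print([v.video_id for v in feed])  # only "a1" is auto-suggested
```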
“The disconnection resulting from increased social media use is reportedly linked to a lack of deep human relationships and social connections, especially in young people,” says the research published in the journal Informatics. The experts behind the study suggest that instead of relying heavily on online content or creators for support, users should seek out real social interactions to support their mental health and combat loneliness.
The researchers propose the creation, or at least concept testing, of “independent-of-YouTube algorithms [that] aim to detect recommendation bias and errors as well as moderate interventions through recommending safe and appropriate mental health content.” They argue that this independent algorithmic protocol must be built with assistance from mental health experts, so that potentially at-risk users are directed to appropriate mental health and wellness content, along with the resources they need to seek real support.
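As a rough illustration of the “independent-of-YouTube” idea, the sketch below replays a recommendation chain from a sensitive seed video, measures how often the chain stays on sensitive content (a crude proxy for “rabbit hole” bias), and substitutes expert-vetted resources when that rate crosses a threshold. The recommend callable, the flagged set, and VETTED_RESOURCES are placeholders; the paper proposes the concept, not this implementation.

```python
# Hypothetical external auditor for "rabbit hole" bias. The recommendation
# source is passed in as a callable, since an independent tool would only
# observe the platform from the outside.
from typing import Callable, List

VETTED_RESOURCES = [
    "Crisis-line and helpline information vetted by clinicians",
    "Guided coping exercise from a licensed therapist",
]

def rabbit_hole_bias(
    seed_id: str,
    recommend: Callable[[str], List[str]],
    is_sensitive: Callable[[str], bool],
    depth: int = 5,
) -> float:
    """Follow the top recommendation `depth` steps from a sensitive seed
    and return the fraction of suggested videos that are also sensitive."""
    current, sensitive_hits = seed_id, 0
    for _ in range(depth):
        suggestions = recommend(current)
        if not suggestions:
            break
        current = suggestions[0]          # follow the chain, top pick only
        sensitive_hits += is_sensitive(current)
    return sensitive_hits / depth

def moderate(
    seed_id: str,
    recommend: Callable[[str], List[str]],
    is_sensitive: Callable[[str], bool],
    threshold: float = 0.4,
) -> List[str]:
    """If the chain from this seed skews toward sensitive content, return
    vetted mental health resources instead of the raw recommendations."""
    if rabbit_hole_bias(seed_id, recommend, is_sensitive) >= threshold:
        return VETTED_RESOURCES
    return recommend(seed_id)

if __name__ == "__main__":
    # Tiny mock recommendation graph for demonstration purposes only.
    mock_graph = {"s0": ["s1"], "s1": ["s2"], "s2": ["s3"],
                  "s3": ["s4"], "s4": ["ok"]}
    flagged = {"s1", "s2", "s3"}
    result = moderate("s0",
                      lambda vid: mock_graph.get(vid, []),
                      lambda vid: vid in flagged)
    print(result)  # bias is 0.6 >= 0.4, so vetted resources are returned
```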