Using AI to reach users who need support in a time of crisis
Sep 01, 2022

We are incredibly proud to announce that our custom moderation AI, created by the Swedish company Oterlu, is now helping our human moderators detect potential expressions of self-harm. This is the result of hard work and collaboration between Oterlu, Recolor’s moderation team, and Recolor’s content strategy team, and it will help us find and assist users struggling with thoughts or behavior related to self-harm.
How did it start?
The community team at Recolor already had an established flow for providing users with support and resources whenever there were indications or expressions of potential self-harm. But like all manual moderation, this relied heavily on reports from other users.
In our drive for continual improvement, we wanted a better, faster way to find users who need help. Oterlu expanded the AI model’s capabilities so that it can now flag, in real time, comments that could indicate a user needs support.
An additional layer of safety
The AI model from Oterlu that is used on the Recolor platform is not intended to operate on its own; it is an additional tool for the community team. The AI flags publicly posted comments, which are then reviewed by the moderation team, making it faster to reach out with help and resources to those who may be in crisis and need them.
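For readers curious about what a human-in-the-loop flow like this can look like, here is a minimal sketch. It is purely illustrative: the actual Oterlu/Recolor integration is not public, and the names, scoring function, and threshold below are assumptions invented for this example.

```python
# Illustrative sketch only: the real Oterlu/Recolor integration is not public,
# so every name, the scoring logic, and the threshold here are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Comment:
    comment_id: str
    text: str


@dataclass
class ReviewQueue:
    """Comments flagged by the AI, waiting for a human moderator to review."""
    flagged: List[Comment] = field(default_factory=list)

    def add(self, comment: Comment) -> None:
        self.flagged.append(comment)


def self_harm_score(comment: Comment) -> float:
    """Stand-in for the moderation model's classifier.

    A real deployment would call the trained model; this dummy score only
    exists so the example runs end to end.
    """
    return 0.9 if "hypothetical risk phrase" in comment.text else 0.0


def flag_for_review(comment: Comment, queue: ReviewQueue, threshold: float = 0.8) -> None:
    """Flag a publicly posted comment for human review; never auto-action it."""
    if self_harm_score(comment) >= threshold:
        queue.add(comment)  # moderators decide whether and how to reach out


if __name__ == "__main__":
    queue = ReviewQueue()
    flag_for_review(Comment("c1", "hypothetical risk phrase"), queue)
    print(f"{len(queue.flagged)} comment(s) awaiting moderator review")
```

The key design choice, as described above, is that the AI only routes comments to people: every flagged comment ends up in front of a human moderator rather than triggering any automatic action.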
We hope this will inspire other platforms to focus on supporting the users who need help the most.