Triaging content severity in online mental health forums
Journal of the American Society for Information Science and Technology
Published online on September 25, 2017
Abstract
In recent years, social media has become a significant resource for improving healthcare and mental health. Mental health forums are online communities where people express their issues and seek help from moderators and other users. In such forums, there are often posts with severe content indicating that the user is in acute distress and at risk of attempted self‐harm. Moderators need to respond to these severe posts in a timely manner to prevent potential self‐harm. However, the large volume of daily posted content makes it difficult for the moderators to locate and respond to these critical posts. We propose an approach for triaging user content into four severity categories that are defined based on indications of self‐harm ideation. Our models are based on a feature‐rich classification framework, which includes lexical, psycholinguistic, contextual, and topic modeling features. Our approach improves over the state of the art in triaging content severity in mental health forums by large margins (up to a 17% improvement in F1 score). Furthermore, using our proposed model, we analyze the mental state of users and show that, overall, long‐term users of the forum demonstrate decreased severity of risk over time. Our analysis of the interactions between moderators and users further indicates that without an automatic way to identify critical content, it is indeed challenging for the moderators to provide timely responses to the users in need.
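To make the feature-rich classification framework mentioned in the abstract concrete, the following is a minimal, illustrative sketch in Python using scikit-learn. It combines only lexical (TF-IDF) and topic-model (LDA) features with a linear classifier; the toy posts, label names, and choice of scikit-learn components are assumptions for illustration and not the authors' implementation. Psycholinguistic (e.g., LIWC-style category counts) and contextual (thread-level) features would be added as further transformers in the feature union.

```python
# Illustrative sketch of a feature-rich severity classifier:
# lexical (TF-IDF) and topic-model (LDA) features feeding a linear model.
# The posts and severity labels below are placeholders, not data from the paper.
from sklearn.pipeline import Pipeline, FeatureUnion
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

posts = [
    "feeling a bit down today but coping",
    "I can't see any way out anymore",
    "thanks everyone, the advice really helped",
    "I have been thinking about hurting myself",
]
labels = ["green", "red", "green", "crisis"]  # hypothetical severity categories

features = FeatureUnion([
    # Lexical features: word uni/bigram TF-IDF weights.
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    # Topic features: document-topic proportions from LDA over raw counts.
    ("topics", Pipeline([
        ("counts", CountVectorizer(min_df=1)),
        ("lda", LatentDirichletAllocation(n_components=5, random_state=0)),
    ])),
])

clf = Pipeline([
    ("features", features),
    ("model", LogisticRegression(max_iter=1000)),
])

clf.fit(posts, labels)
print(clf.predict(["I don't think I can keep going"]))
```

In a realistic setting, the classifier would be trained on labeled forum posts and evaluated with per-class F1 scores, since the severe categories are typically rare relative to routine posts.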