
How to deal with a suicide threat in an online community


Online communities play a very real and important role in an increasing number of people’s lives. Communities often form bridges to isolated individuals, whether that isolation is geographic or social. As taboo as the topic may be, as a Community Manager you must be prepared to help your community deal with a suicide threat.

As with all communities, people might initially join to seek information – be it about sport, hobbies or fashion – but they will stay for the friendships they form. No matter what the subject of your community, it is not out of the question that a suicide threat may be posted at some point.

If a member of your community posts a suicide threat, you’ll have no way of knowing whether they are seeking intervention or attention. Either way you should take it seriously – it is important to be prepared.

Things to consider:

  • How will you contact the member in question? Can you send them a private message? Do you have any ‘real-life’ contact details? Are they known IRL to other members? Do you know who they are friends with online?
  • Can you call 000 and send police for a welfare check?
  • Do you have a list of contact helplines and resources at hand?
  • Will you remove the post/thread from view? Will you allow comments and discussion about the issue? The event will trigger all sorts of emotion & behaviour – how will you respond and support the community?
  • Will you reveal the actions you are taking?
  • Is there any way the member can post images/video on your site? Can you block them from doing so? (Tragically, suicides have been carried out online.)

Steps to take:

I have narrowed it down to two steps that will help you address a threat.

Develop a Crisis Management Process

This is a specific example for a suicide threat. Escalate the issue to the Community Manager, who can in turn:

  • Contact the member privately. In a professional yet compassionate manner, let them know the community is concerned, and urge them to seek professional help immediately. Supply contact numbers.
  • If you have an address, arrange a 000 police welfare check; if you do not, request that the member’s friends do so.
  • Lock or remove the original thread.
  • It is important to note that, despite good intentions, members’ advice may do more harm than good. Be sure to explain this to the member so they do not feel they have no avenue to be heard.
  • Post to let the community know the issue is being addressed.
  • Escalate the issue and the action taken to senior management.

Develop an Escalation Policy

This should include the complete ‘chain of command’ from a Moderator finding the threat, through to your bosses/clients and legal team. Be sure you have everyone’s out-of-hours mobile numbers correct and accessible.

A note regarding reporting Facebook content

Facebook has a link to report suicidal content – I cannot vouch for their response unit, but I personally would not rely on this approach alone. By all means use it in addition to your existing process. If you take the risk of having an un-moderated Facebook page, you might consider posting this link where members can see it, so they can use it if required.


If you run an Australian-based community, you can provide the Beyond Blue or Lifeline help numbers to your members. For more info about suicide prevention please see Suicide Prevention Australia (SPA). For youth-specific info please see Reach Out.

Beyond Blue
Lifeline 13 11 14
Reach Out

Suicide Prevention Australia (SPA)

Credit: A recent blog post by Jonathan Nguyen prompted me to write this post, so I thank him as a fellow community peep for the discussion we had.

Do you have any suggestions, tips or advice about how to manage crises in your online community?

Article by Alison Michalk
  • http://www.beeurd.com/ Andy

    Great post. I've drawn up contingency plans for such an event in old forum communities before, but thankfully never had to act on them. I think it's important not to ignore suicide threats, whether believed to be serious or not.

  • http://www.quiip.com.au/blog Alison Michalk

    Thanks Andy, I appreciate your comment. My mod team and I have only had to act on a few over the years, but thankfully all interventions were successful. I agree they should always be acted on – dismissing them (even with good reason) is a blow to the trust and relationship you have with 'your' community. I think it also serves to negate the perception that online interactions are 'real'. There are times when they may be made as a flippant comment, but the community rallies to offer support, which may be all the person needed – a great example of the depth and benefit of online communities.

  • Sue

    Great points Alison and a great plan.

  • http://www.quiip.com.au/blog Alison Michalk

    Thanks Sue. I know it's likely you've dealt with this issue in your decade of community management, so I appreciate your comment :)

  • http://www.jumpstartmypc.com/ Chris | JumpstartMyPC.com

    Great information Alison. I am currently putting together a social media initiative for a large company and these are the kinds of things that we are trying to prepare for. There is so little information available out there. I had no idea that Facebook had a page for reporting “suicidal content.” Your article helped us tremendously.

  • http://www.quiip.com.au/blog Alison Michalk

    Hi Chris, thanks for your comment, I'm glad you found it useful. You might enjoy my newest post on Identifying Patterns of Behaviour: http://bit.ly/9BMFKw I also have a popular post on Developing Community Guidelines if this is use for your policy. If you have any questions or blog topic ideas, please let me know!