In what may signal a shift towards emphasising brands' responsibility for their page content, Facebook has rolled out moderation tools that are useful for page admins, community managers and moderators.
As it's still common for companies to be unaware that they are legally responsible for all of their page content – including user submissions – these moderation tools are a welcome aid to risk mitigation.
Before you start dreaming of setting your page to "auto-moderate" and putting your feet up on the desk, keep in mind that, if anything, these tools make it more apparent that you need a community manager or team to manage your page(s) and run them. The new moderation features include:
A profanity filter
It is not (yet) customisable: you can choose between None, Medium or Strong, but the difference between the levels isn't explained. Facebook says only that it "will block the most commonly reported words and phrases marked as offensive by the broader community."
Filtered content is marked as spam, so you will need to keep an eye on your spam filter to ensure what's been caught aligns with your own profanity guidelines. As always, context is vital – and you'll need a human to make that determination!
A moderation blocklist
You can add comma-separated keywords to the "Moderation Blocklist". When users include a blocklisted keyword in a post or comment on your page, the content is automatically marked as spam (see Facebook's help pages for more). Again, you'll need to monitor your spam filter, as above.
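To make that behaviour concrete, here's a minimal sketch in Python of how a comma-separated blocklist might be matched against incoming comments. The keywords and matching rules are illustrative assumptions – Facebook doesn't document exactly how its matching works:

```python
# Hypothetical sketch of comma-separated blocklist matching.
# Not Facebook's actual implementation; keywords are made up.

def parse_blocklist(raw):
    """Split a comma-separated keyword string into clean, lowercase terms."""
    return {term.strip().lower() for term in raw.split(",") if term.strip()}

def is_spam(comment, blocklist):
    """Flag a comment as spam if it contains any blocklisted term."""
    text = comment.lower()
    return any(term in text for term in blocklist)

# Example usage with invented keywords:
blocklist = parse_blocklist("freebie, cheap pills, spamword")
print(is_spam("Get your FREEBIE now!", blocklist))  # flagged
print(is_spam("Great post, thanks!", blocklist))    # not flagged
```

Note that simple substring matching like this is easy for users to dodge with creative spelling – which is exactly why a human still needs to review the spam queue.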
Email notifications
Finally, you can receive notifications of new content on your page. While free services like Hyper Alerts offer more functionality for now, I suspect Facebook will develop this further. It's a useful asset for low-volume pages, especially during business hours, but if you have high-volume activity you'll need someone at the coalface regardless. Unless you like obsessively checking your email!
Commenting/liking as a “page” or “user”
Not a moderation tool per se, but the ability to switch between commenting as the page or as a user could be useful for those who like, or need, to have individuals personally represent their company. It's possibly an easier tactic for building rapport, though not necessarily the right fit for every company. Angela Connor has a great post highlighting the potential risks of this move, in which she says: "We are finally going to see the difference between true community managers who understand their craft and those who simply play one on the internet."
Seasoned community managers know it won't take long for users to find workarounds for blocklists and filters. However, Facebook has been notoriously difficult to moderate due to its lack of functionality, so these tools are a welcome first step (hopefully tracking of deleted content is on the to-do list!).
Have an opinion?
We'd love to hear what you have to say in the comments below.