Moderation is key to any strategy involving online social content. But, like housecleaning, people tend to pay attention to it only when it's urgently needed. As a result, some of the basics of moderation aren't well known. Here's a simple question our prospective clients often ask:
What exactly is moderation and what does it involve?
Moderation of online social content falls into one of two categories: content moderation, in which moderators vet posted content against published community guidelines, and social engagement moderation, in which moderators interact with the users of the online venue.
Content moderators read, view, or listen to posted material, whether text, photos, audio, or video, checking that none of it violates the community guidelines. They remove inappropriate material and escalate unclear cases, or issues the client has asked to be flagged, to the client. As part of the moderation process, they can also watch for other material important to the brand, such as customer issues, complaints, or requests.
Social engagement moderators interact with people around specific topics: starting or stirring conversations, orienting newcomers, answering questions, and escalating community needs or requests to brand management.
Appropriate moderation helps set the story, tone, and context for a social venue. Done well, it embodies the culture envisioned by the venue's founders and models constructive interactions for the community.
Moderation can also be critical for legal reasons, such as checking for trademark and copyright violations or for compliance with pharmaceutical or financial industry regulations.