Being a Moderator
Moderation Console
The moderation console is built around a case-centric approach, consolidating all reports on a given piece of content, whether from AI proactive detection or from user reports, into a single view. This gives moderators a comprehensive summary of reports and fuller context around each case, while individual reports can still be accessed and reviewed independently within the case view.

The interface is structured around three panels, with navigation and controls at the top.

Top bar

On the top bar, you can see:

- Navigation breadcrumb: shows where you are (Reports / queue name / case number)
- Undo button: reverts your last action
- Timer: shows how long you have been reviewing the current case

Left panel: report details

This panel lists all the reports associated with the case, along with their audit trails. Each report card contains:

- Source: the reporting channel (user report, proactive detection, etc.)
- Input labels: tags applied to categorize the report, either by the AI detection provider or by the users reporting the content
- Reason for report: what users may have added when reporting the content
- Confidence score: when the report comes from proactive AI detection, the confidence score returned by the detection engine is displayed
- Audit trail: the history of actions, decisions, and changes related to that report

Reports can be filtered to help moderators focus on what they need most.

Center panel: content, attributes, and reported account

This panel shows the reported content, its details, and the reported account section.

Content preview

- View the actual content (image, video, text, etc.), or the multiple pieces of content in the case of a collection
- If a preview isn't available, click the download link to inspect the original file
- Toggle blur/unblur for sensitive content

Content details

- Preview: content text description or transcript, when available
- Content ID: unique identifier for this piece of content
- Content type: format (image, video, text, listing, etc.)
- Country/locale: location or language information
- Custom attributes: all the additional metadata, such as prior reviewer signals or system-detected risks, as configured in docid 2fmg8gdwrlps9p7lslnuf

Reported account

This section displays key insights about the reported account, such as report count and policy violations. The reported account section is fully customisable: teams can configure which custom attributes are displayed, allowing you to surface the information most relevant to your review workflow.

By clicking on the account, you will get the full history of all the cases ingested in nima for that account.

Right panel: actions and policies

This is where you make your decision and apply the outcome:

- docid\ o4mxk fvhcwa88dmpctva to select whether the content violates a policy. Some policies have sub-options to let you specify the exact type of violation.
- docid\ lsfkyoqsi365x alowit1

You can choose only one policy or workflow action per report. Both the actions and the policies displayed for each queue are configurable in docid\ azsiick6eehflzwytql v. When reviewing docid\ hadsvd38gpyc7canb4oyi, only moderation actions will be displayed instead of policies.

Additional actions:

- Re-route flag: send the case to another queue for specialized review
- Escalate flag: send the case to senior moderators or escalation queues
- Request information: ask the reported account for more context

Once a report has been moderated, if it is re-opened from the report list, you will find here the details of the policy and action that were applied. By selecting a different policy, you change the decision.

Additional capabilities

Revert actions

When a case has already been processed, you can revert the action that was applied. This is useful for correcting mistakes identified during quality control, or when new information changes the original decision.

To revert an action:

1. Open a processed case in the moderation console.
2. Click the "Revert action" button.
3. Select a revert action from the list (only actions marked as revert actions are shown).

When you apply a revert action:

- The original enforcement is undone on your platform via the configured webhook.
- The affected user receives a notification explaining the reversal (if a template is configured in user notification).
- The revert action appears in the audit trail and in the flag list view.

Revert actions are configured in Settings > Actions, like any other action.

Blur functionality

To protect moderator wellbeing, you can blur sensitive images. Click the unblur button next to any blurred image to view the original, or click blur to reapply the effect. You can configure blur and greyscale intensity in your docid\ aa7jy1u8cvlirssikv i, accessible by clicking on your user icon at the top left.

Translating content

When reviewing text in other languages, use the built-in translation feature. Select your target language from the dropdown and click translate to instantly view the content in that language.

Escalating reports

For complex or sensitive reports that need expert review, click the escalate flag button. Select the appropriate escalation queue from the dropdown and confirm to move the case. The report is immediately removed from your queue and sent to the specialized team. Escalation queues are created \[here] (link to be added). Cases can also be escalated from the list view.

Request information from users

When you need additional context before making a decision, use the request information feature to engage directly with the reported account. This allows you to ask clarifying questions without leaving the moderation console.

How it works:

1. Click request information in the right panel.
2. Compose your question or request for clarification.
3. The user receives a formal email with a call-to-action button linking to a nima form.
4. The user submits their response through the form.
5. Their response appears as a chat-like thread directly on the case in your moderation console.
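For platform engineers wiring up the configured webhook, the revert flow can be sketched as follows. This is a minimal illustration only: the payload field names (`event`, `content_id`) and the `handle_webhook` function are assumptions made for this sketch, not nima's actual webhook contract, which is defined by your own configuration.

```python
# Hypothetical in-memory stand-in for the platform's enforcement state.
REMOVED_CONTENT: set[str] = set()

def handle_webhook(payload: dict) -> str:
    """Apply or undo an enforcement based on a moderation webhook event.

    Payload shape is an assumption for illustration:
    {"event": "action" | "revert_action", "content_id": "..."}
    """
    content_id = payload["content_id"]
    if payload["event"] == "action":
        # Original enforcement: hide the content on the platform.
        REMOVED_CONTENT.add(content_id)
        return "removed"
    if payload["event"] == "revert_action":
        # Revert action: restore the previously hidden content.
        REMOVED_CONTENT.discard(content_id)
        return "restored"
    # Unknown events are ignored rather than failing the delivery.
    return "ignored"
```

The key design point mirrored here is that a revert action is delivered as its own event, so the platform undoes the enforcement without needing to replay or diff earlier decisions.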
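The request-information steps above can be modelled as a simple two-way thread attached to the case. The `InfoRequest` class and its field names below are purely illustrative assumptions, not nima's real data model.

```python
from dataclasses import dataclass, field

@dataclass
class InfoRequest:
    """Illustrative sketch of a request-information exchange on a case.

    Hypothetical model: class and field names are assumptions made for
    this example, not nima's actual schema.
    """
    question: str
    thread: list = field(default_factory=list)  # chat-like thread shown on the case
    status: str = "pending"

    def send(self) -> None:
        # The moderator's question goes out as an email with a
        # call-to-action button linking to a form.
        self.thread.append(("moderator", self.question))
        self.status = "sent"

    def receive(self, answer: str) -> None:
        # The user's form submission lands back on the same case thread.
        self.thread.append(("user", answer))
        self.status = "answered"
```

After `send()` and `receive(...)`, the thread holds both messages in order, mirroring the chat-like view the moderator sees on the case.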