Being a Moderator
As a moderator, when you log in to Nima you land on the moderation dashboard: the central hub for your work, designed to make human review as efficient and accurate as possible. Nima offers a customizable, flexible experience to meet the specific needs of you and your team and to optimize your workflow.

Managing content queues

Dashboard overview

When you log in as a moderator, you'll see a dashboard with all the queues you're assigned to. Team assignments to queues are managed in Queue settings. The dashboard has two tabs, both organized by queue:

- Reports: reports waiting to be reviewed by moderators for the first time.
- Appeals: reports that users have appealed after the first review, pending a moderator's second review.

You can see how many reports are pending and how long they've been waiting (less than 24 hours, less than 48 hours, or more than 48 hours). The dashboard shows when it was last updated and automatically refreshes to keep the information current.

Queue viewing modes

You can view queues in two different ways, each suited to different tasks.

List view displays reports in a table format, giving you a comprehensive overview of the queue. This mode works well for bulk moderation and manual prioritization:

- Cases appear as rows that are easy to scan.
- You can manually select specific reports to prioritize or skip.
- Customize which columns appear per queue to see the details that matter to your team's workflow.
- Configure which custom attributes are visible in each queue: T&S managers set this in queue settings to control what information is disclosed to moderators per queue.
- Blur settings automatically protect sensitive information during review, based on queue configuration.
- Handle specific cases out of the normal order when needed.

Grid view displays cases in a card-based layout, ideal for visual scanning and quick case assessment. This mode works well for high-volume queues where you need to process cases rapidly:

- Cases appear as cards with key information at a glance.
- The visual layout makes it easy to spot patterns across multiple cases.
- Card fields for content, reporter, and reported account are customizable per queue, just like in list view.

The Moderation console shows cases one at a time when you click the play button, using a case-centric approach that consolidates all reports on a given piece of content into a single view. It provides a full-screen view with comprehensive context for each case, so nothing gets overlooked. This mode is ideal for high-volume queues where you need to process cases quickly and sequentially:

- Cases appear in a continuous sequence, moving automatically to the next case after you make a decision.
- Each case view includes a summary of all reports, a customizable reported account section, and the full audit trail.
- The streamlined workflow keeps you focused on making decisions without having to select each case manually.
- As with list and grid view, the moderation console is customizable by T&S managers.

See the Moderation console section for a detailed walkthrough of the interface.

Customizing your list and grid views

Your list and grid view experiences are fully customizable by queue.

What moderators can customize:

- Column selection: choose which data fields appear in the list view (e.g., report ID, content type, custom attributes).
- Filters: save filter preferences per queue to quickly surface cases matching your criteria.
- Display preferences: toggle between list and grid view based on your task, and personal blur settings.

What T&S managers configure per queue (in queue settings):

- Custom attributes visibility: select which custom attributes appear for moderators reviewing the queue, controlling information access and compliance with data minimization principles.
- General blur settings: set the default blur/greyscale intensity for the queue to protect moderator wellbeing.
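The split between queue-level settings (set by T&S managers) and per-moderator preferences could be modeled roughly like this. This is a minimal sketch; the class and field names are illustrative, not Nima's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class QueueViewConfig:
    """Queue-level settings controlled by T&S managers (illustrative names)."""
    visible_custom_attributes: list = field(default_factory=list)
    default_blur_pct: int = 100       # default blur intensity for this queue
    default_greyscale_pct: int = 100  # default greyscale intensity

@dataclass
class ModeratorPrefs:
    """Per-moderator preferences, saved per queue and synced across devices."""
    columns: list = field(default_factory=lambda: ["report_id", "content_type"])
    view_mode: str = "list"  # "list" or "grid"

def effective_columns(queue: QueueViewConfig, prefs: ModeratorPrefs) -> list:
    """A moderator's chosen columns, minus any custom attribute
    (prefixed "attr:" here for illustration) the queue does not disclose."""
    return [c for c in prefs.columns
            if not c.startswith("attr:")
            or c[len("attr:"):] in queue.visible_custom_attributes]
```

The point of the sketch is the precedence: the queue configuration acts as an allow-list, so a moderator's column choices can never surface an attribute the queue withholds.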
These settings ensure moderators only see what's relevant to their role and expertise, while respecting content sensitivity policies. Your view preferences sync across devices and persist until you change them.

How cases appear and are ordered

Which cases appear in which queues depends on the rules you've configured: both proactive detection rules and queue routing rules control where cases go. You can customize how cases are prioritized in the queue prioritization settings to match your platform's specific needs. The order of cases within each queue is based on their severity: Nima automatically puts the most critical content first, so you can address urgent issues quickly.

Moderation console

The moderation console is built around a case-centric approach, consolidating all reports on a given piece of content, whether from AI proactive detection or from user reports, into a single view. This gives moderators a comprehensive summary of reports and the full context around each case. The interface is structured around three panels, with navigation and controls at the top.

On the top bar, you can see:

- Navigation breadcrumb: shows where you are (Reports / queue name / case number).
- Undo button: reverts your last action.
- Timer: shows how long you've been reviewing the current case.

Left panel: report details

This panel lists all the reports associated with the case, along with their audit trails. Each report card contains:

- Source: the reporting source (user report, proactive detection, etc.).
- Report age: the time elapsed since the report was created, displayed in a human-readable format (e.g., "2 days 5 hours ago") to help moderators prioritize cases at a glance.
- Input labels: tags applied to categorize the report, either by the AI detection provider or by the users reporting the content.
- Reason for report: what users may have added when reporting the content.
- Confidence score: when the report comes from proactive AI detection, the confidence score returned by the AI detection provider is displayed.
- Audit trails: the history of actions, decisions, and changes related to that report.

Reports can be filtered to help moderators focus on what they need most.

Center panel: content, attributes, and reported account

This panel shows the reported content, its details, and the reported account section.

Content preview:

- View the actual content (image, video, text, etc.), or the multiple pieces of content in the case of a collection.
- If a preview isn't available, click the download link to inspect the original file.
- Toggle blur/unblur for sensitive content.

Content details / metadata preview:

- Content text: description or transcript, when available.
- Content ID: unique identifier for this piece of content.
- Content type: format (image, video, text, listing, etc.).
- Country/locale: location or language information.
- Custom attributes: all the additional metadata, such as prior reviewer signals or system-detected risks, as configured in Custom attributes.

Reported account: this section displays key insights about the reported account, such as report counts and policy violations. The reported account section is fully customizable per queue: T&S managers configure which custom attributes appear for each queue in queue settings, allowing different teams to surface the information most relevant to their review workflow and role. By clicking on the account, you get a full history of all the cases ingested in Nima for that account. Moderators can reach out to reported accounts through Request information to engage with the owners of content.

Right panel: actions and policies

This is where you make your decision and apply the outcome:

- Policies: select whether the content violates a policy. Some policies have sub-options to let you specify the exact type of violation.
- Workflow actions: you can choose only one policy or workflow action per report.
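The one-decision-per-report rule (exactly one policy or one workflow action, never both and never neither) can be illustrated with a small validation sketch. The function and field names are hypothetical, not Nima's actual API:

```python
from typing import Optional

def validate_decision(policy: Optional[str],
                      workflow_action: Optional[str]) -> str:
    """Return the single decision applied to a report.

    A report resolves to exactly one policy OR one workflow action;
    passing both, or neither, is rejected."""
    if (policy is None) == (workflow_action is None):
        raise ValueError("choose exactly one: a policy or a workflow action")
    return policy if policy is not None else workflow_action
```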
Both the actions and policies displayed for each queue are configurable in Queue settings. When reviewing Appeals, only moderation actions are displayed instead of policies; however, if the original decision was "no policy violation", moderators can take a policy decision in appeals too. Once a report has been moderated, if it is re-opened from the report list, you will find here the details of the policy and action applied; by selecting a different policy, you change the decision.

Additional capabilities

Revert actions

When a case has already been processed, you can use revert actions to change the previously applied decision. This is useful for correcting mistakes identified during quality control, or when new information changes the original decision.

To revert an action:

1. Open a processed case in the moderation console.
2. Click the "Revert action" button.
3. Select a revert action from the list (only actions marked as revert actions are shown).

When you apply a revert action:

- The original enforcement is undone on your platform via the configured webhook.
- The affected user receives a notification (if a template is configured in user notifications) explaining the reversal.
- The revert action appears in the audit trail and in the report list view.

Revert actions are configured in Settings > Actions, like any other action.

Blur functionality

To protect moderator wellbeing, you can blur sensitive images. Click the unblur button next to any blurred image to view the original, or click blur to reapply the effect. You can configure blur and greyscale intensity in your blur settings, accessible by clicking your user icon at the top left.

Translating content

When reviewing text in other languages, use the built-in translation feature. Select your target language from the dropdown and click translate to instantly view the content in that language.

Policy guidelines drawer

Click the policy guidelines icon in the console to open a right-side drawer listing all your platform's policies and their descriptions. Moderators can reference policy details without navigating away from the current case, reducing context switching during active moderation.

Escalating or re-routing cases

For complex or sensitive cases that need expert review, click the escalate button. Select the appropriate escalation queue from the dropdown and confirm to move the case. The case is immediately removed from your queue and sent to the specialized team. Escalation queues are created in Queues. Cases can also be escalated from the list view.

When a case is better suited to a different workflow or team, you can re-route it instead of escalating. Use re-route case to send the case to another queue (for example, a different language, country, or product queue). Once re-routed, the case leaves your queue and appears in the destination queue's backlog.

Request information from reported accounts

When you need more context before making a decision, use the request information feature to contact the reported account directly. This lets you ask clarifying questions without leaving the moderation console and view the full conversation inside Nima.

How it works:

1. Click request information in the reported account panel of the moderation console.
2. Write your question or request for clarification.
3. The user receives an email with a call-to-action button that links to a Nima form to submit a response.
4. The response appears as a chat-like thread directly on the case in the moderation console.
5. Moderators are notified whenever a new answer is submitted to one of their information requests.

Tools for complex cases

For challenging or ambiguous flags, the console provides the built-in tools above, such as translation, policy guidelines, escalation, and information requests, to assist in the decision-making process.
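The revert flow, in which the original enforcement is undone on your platform via the configured webhook, might deliver a payload along these lines. This is a sketch only: the event name and payload shape are assumptions, not Nima's documented webhook contract:

```python
import json

def build_revert_webhook_payload(case_id: str, original_action: str,
                                 revert_action: str, moderator: str) -> str:
    """Assemble a hypothetical webhook body a platform could receive
    when a moderator applies a revert action to a processed case."""
    payload = {
        "event": "action.reverted",          # assumed event name
        "case_id": case_id,
        "original_action": original_action,  # e.g. the enforcement to undo
        "revert_action": revert_action,      # the revert action selected
        "actor": moderator,                  # recorded in the audit trail
    }
    return json.dumps(payload)
```

On the receiving side, your platform would map `original_action` back to the enforcement it applied (content removal, account suspension, etc.) and reverse it.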
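The case-centric approach that underpins the console, where every report on the same piece of content lands in a single reviewable case, can be sketched as follows. The structures are illustrative, not Nima's data model:

```python
from collections import defaultdict

def consolidate_reports(reports: list) -> dict:
    """Group user reports and proactive-detection reports by the content
    they target, so each piece of content becomes one reviewable case."""
    cases = defaultdict(list)
    for report in reports:
        cases[report["content_id"]].append(report)
    return dict(cases)
```

A moderator then reviews each case once, seeing all of its reports side by side, rather than handling the same content repeatedly for each report.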
Moderator wellbeing

Blur functionality protects moderator wellbeing by reducing exposure to explicit or disturbing content. This feature automatically applies blur and greyscale filters to sensitive images in the moderation interface.

Configuring blur effects

The blur effects configuration lets you adjust how images are displayed within your moderation interface, applying blur and greyscale effects to mitigate exposure to sensitive content.

Enable filters: choose whether the blur effect is applied to images automatically, or disable it.

- Toggle on: the blur effect is applied to images based on the configured settings.
- Toggle off: no blur is applied to images, and they are displayed normally.

Blur percentage (0% to 100%): controls the intensity of the blur effect on images.

- 0%: no blur effect (the image is displayed in its original form).
- 100%: maximum blur effect (the image is heavily blurred).

Greyscale percentage (0% to 100%): applies a black-and-white filter to the image.

- 0%: the image is displayed in full color.
- 100%: the image is fully greyscale.

Once you save these settings, all images in the moderation interface are automatically blurred according to your configuration.

Viewing and blurring images

When you need to see the original image to make a decision, you can easily control the blur:

- Unblur: the unblur button is located next to the blurred image; clicking it immediately displays the unblurred image in its full resolution.
- Blur again: click the blur button to reapply the effect and hide the image.
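As a rough illustration of how the 0-100% settings could translate into a display-layer filter, here is a sketch that maps them to a CSS-style filter string. The pixel mapping (100% blur = 20px here) is an assumption for illustration; Nima does not document its rendering internals:

```python
def css_filter(blur_pct: int, greyscale_pct: int, max_blur_px: int = 20) -> str:
    """Map blur/greyscale percentages to a CSS-style filter string.

    0% blur shows the original image; 100% applies the maximum blur
    radius (assumed 20px). Greyscale maps directly to grayscale()."""
    blur_pct = max(0, min(100, blur_pct))            # clamp to valid range
    greyscale_pct = max(0, min(100, greyscale_pct))  # clamp to valid range
    blur_px = max_blur_px * blur_pct / 100
    return f"blur({blur_px:g}px) grayscale({greyscale_pct}%)"
```

For example, a 50% blur with 30% greyscale yields `blur(10px) grayscale(30%)`, while 0%/0% leaves the image untouched.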