Behind the screens, these unpaid moderators are keeping online communities safe
CBC
Lisa Rufiange wakes up with her partner at 5 a.m. and checks her Facebook.
Specifically, she reviews the two groups she moderates: Edmonton Moms Community Group and Edmonton Apartment, Houses, Rooms For Rent.
She sorts through requests to join the groups, along with requests from users to make anonymous posts, then checks overnight posts to ensure none broke any rules and need removing.
When she's finished, she goes back to sleep until her alarm goes off at 7:30 a.m.
"I know if I don't check it, it will just be more overwhelming later in the morning," Rufiange said in an interview.
"It's best to bite it off in small increments."
Depending on the volume of content, Rufiange spends at least two or three hours a day refereeing her Facebook groups.
She's one of many moderators on Facebook, Reddit and other social media sites. Most communities have multiple moderators who filter out rude, inappropriate or harmful content in order to make the online spaces safe and positive experiences for members.
For some, moderating is a paid job. Facebook parent company Meta, blog sites and news organizations hire content management companies that employ hundreds of moderators whose sole job is to remove violent and graphic content.
Those moderators are exposed to the worst parts of the internet, including videos showing murder and sexual assault.
Although volunteer moderators in Alberta don't deal with that level of gruesome content, the unpaid labour does come with its own challenges.
Moderators often face hate, vitriol, death threats and doxxing — when someone's personal information, such as where they live or work, is shared publicly online.
For these reasons, some moderators — especially on Reddit — prefer to remain anonymous.
They also monitor their online activity closely to avoid revealing where they live or what they do for work.