The Trauma Floor: The Secret Lives of Facebook Moderators in America

Content warning: This story contains discussion of serious mental health problems and racism.

The panic attacks started after Chloe watched a man die.

She spent the past three and a half weeks in training, trying to harden herself against the daily onslaught of disturbing posts: the hate speech, the violent attacks, the graphic pornography. In a few more days, she will become a full-time Facebook content moderator, or what the company she works for, a professional services vendor named Cognizant, opaquely calls a "process executive."

For this portion of her education, Chloe will have to moderate a Facebook post in front of her fellow trainees. When it's her turn, she walks to the front of the room, where a monitor displays a video that has been posted to the world's largest social network. None of the trainees have seen it before, Chloe included. She presses play.

The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe's job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.

Returning to her seat, Chloe feels an overwhelming urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.

No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.

Over the past three months, I interviewed a dozen current and former employees of Cognizant in Phoenix. All had signed non-disclosure agreements with Cognizant in which they pledged not to discuss their work for Facebook — or even acknowledge that Facebook is Cognizant's client. The shroud of secrecy is meant to protect employees from users who may be angry about a content moderation decision and seek to resolve it with a known Facebook contractor. The NDAs are also meant to prevent contractors from sharing Facebook users' personal information with the outside world, at a time of intense scrutiny over data privacy issues.

But the secrecy also insulates Cognizant and Facebook from criticism about their working conditions, moderators told me. They are pressured not to discuss the emotional toll that their job takes on them, even with loved ones, leading to increased feelings of isolation and anxiety. To protect them from potential retaliation, both from their employers and from Facebook users, I agreed to use pseudonyms for everyone named in this story except Cognizant's vice president of operations for business process services, Bob Duncan, and Facebook's director of global partner vendor management, Mark Davidson.

Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions. It's a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance.

It's a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators' every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.

The moderators told me it's a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: "I no longer believe 9/11 was a terrorist attack."

Chloe cries for a while in the break room, and then in the bathroom, but begins to worry that she is missing too much training. She had been frantic for a job when she applied, as a recent college graduate with no other immediate prospects. When she becomes a full-time moderator, Chloe will make $15 an hour — $4 more than the minimum wage in Arizona, where she lives, and better than she can expect from most retail jobs.

The tears eventually stop coming, and her breathing returns to normal. When she goes back to the training room, one of her peers is discussing another violent video. She sees that a drone is shooting people from the air. Chloe watches the bodies go limp as they die.

She leaves the room again.

Eventually a supervisor finds her in the bathroom, and offers a weak hug. Cognizant makes a counselor available to employees, but only for part of the day, and he has yet to get to work. Chloe waits for him for the better part of an hour.

When the counselor sees her, he explains that she has had a panic attack. He tells her that, when she graduates, she will have more control over the Facebook videos than she had in the training room. You will be able to pause the video, he tells her, or watch it without audio. Focus on your breathing, he says. Make sure you don't get too caught up in what you're watching.

"He said not to worry — that I could probably still do the job," Chloe says. Then she catches herself: "His concern was: don't worry, you can do the job."

On May 3, 2017, Mark Zuckerberg announced the expansion of Facebook's "community operations" team. The new employees, who would be added to 4,500 existing moderators, would be responsible for reviewing every piece of content reported for violating the company's community standards. By the end of 2018, in response to criticism of the prevalence of violent and exploitative content on the social network, Facebook had more than 30,000 employees working on safety and security — about half of whom were content moderators.

The moderators include some full-time employees, but Facebook relies heavily on contract labor to do the job. Ellen Silver, Facebook's vice president of operations, said in a blog post last year that the use of contract labor allowed Facebook to "scale globally" — to have content moderators working around the clock, evaluating posts in more than 50 languages, at more than 20 sites around the world.

The use of contract labor also has a practical benefit for Facebook: it is radically cheaper. The median Facebook employee earns $240,000 annually in salary, bonuses, and stock options. A content moderator working for Cognizant in Arizona, on the other hand, will earn just $28,800 per year. The arrangement helps Facebook maintain a high profit margin. In its most recent quarter, the company earned $6.9 billion in profits, on $16.9 billion in revenue. And while Zuckerberg had warned investors that Facebook's investment in security would reduce the company's profitability, profits were up 61 percent over the previous year.

Since 2014, when Adrian Chen detailed the harsh working conditions for content moderators at social networks for Wired, Facebook has been sensitive to the criticism that it is traumatizing some of its lowest-paid workers. In her blog post, Silver said that Facebook assesses potential moderators' "ability to deal with violent imagery," screening them for their coping skills.

Bob Duncan, who oversees Cognizant's content moderation operations in North America, says recruiters carefully explain the graphic nature of the job to applicants. "We share examples of the kinds of things you can see … so that they have an understanding," he says. "The intention of all that is to ensure people understand it. And if they don't feel that work is potentially suited for them based on their situation, they can make those decisions as appropriate."

Until recently, most Facebook content moderation has been done outside the United States. But as Facebook's demand for labor has grown, it has expanded its domestic operations to include sites in California, Arizona, Texas, and Florida.

The United States is the company's home and one of the countries in which it is most popular, says Facebook's Davidson. American moderators are more likely to have the cultural context necessary to evaluate U.S. content that may involve bullying and hate speech, which often involve country-specific slang, he says.

Facebook also worked to build what Davidson calls "state-of-the-art facilities, so they replicated a Facebook office and had that Facebook look and feel to them. That was important because there's also a perception out there in the market sometimes … that our people sit in very dark, dingy basements, lit only by a green screen. That's really not the case."

It is true that Cognizant's Phoenix location is neither dark nor dingy. And to the extent that it offers employees desks with computers on them, it may faintly resemble other Facebook offices. But while employees at Facebook's Menlo Park headquarters work in an airy, sunlit complex designed by Frank Gehry, its contractors in Arizona labor in an often cramped space where long lines for the few available bathroom stalls can take up most of employees' limited break time. And while Facebook employees enjoy a wide degree of freedom in how they manage their days, Cognizant workers' time is managed down to the second.

A content moderator named Miguel arrives for the day shift just before it begins, at 7 a.m. He's one of about 300 workers who will eventually filter into the workplace, which occupies two floors in a Phoenix office park.

Security personnel keep watch over the entrance, on the lookout for disgruntled ex-employees and Facebook users who might confront moderators over removed posts. Miguel badges in to the office and heads to the lockers. There are barely enough lockers to go around, so some employees have taken to keeping items in them overnight to ensure they will have one the next day.

The lockers occupy a narrow hallway that, during breaks, becomes choked with people. To protect the privacy of the Facebook users whose posts they review, workers are required to store their phones in lockers while they work.

Writing utensils and paper are also not allowed, in case Miguel might be tempted to write down a Facebook user's personal information. This policy extends to small paper scraps, such as gum wrappers. Smaller items, like hand lotion, are required to be placed in clear plastic bags so they are always visible to managers.

To accommodate four daily shifts — and high employee turnover — most people will not be assigned a permanent desk on what Cognizant calls "the production floor." Instead, Miguel finds an open workstation and logs in to a piece of software known as the Single Review Tool, or SRT. When he is ready to work, he clicks a button labeled "resume reviewing," and dives into the queue of posts.

Last April, a year after many of the documents had been published in the Guardian, Facebook made public the community standards by which it attempts to govern its 2.3 billion monthly users. In the months afterward, Motherboard and Radiolab published detailed investigations into the challenges of moderating such a vast amount of speech.

Those challenges include the sheer volume of posts; the need to train a global army of low-paid workers to consistently apply a single set of rules; near-daily changes and clarifications to those rules; a lack of cultural or political context on the part of the moderators; missing context in posts that makes their meaning ambiguous; and frequent disagreements among moderators about whether the rules should apply in individual cases.

Despite the high degree of difficulty in applying such a policy, Facebook has instructed Cognizant and its other contractors to emphasize a metric called "accuracy" over all else. Accuracy, in this case, means that when Facebook audits a subset of contractors' decisions, its full-time employees agree with the contractors. The company has set an accuracy target of 95 percent, a number that always seems just out of reach. Cognizant has never hit the target for a sustained period of time — it usually floats in the high 80s or low 90s, and was hovering around 92 at press time.

Miguel diligently applies the policy — even though, he tells me, it often makes no sense to him.

A post calling someone "my favorite n-----" is allowed to stay up, because under the policy it is considered "explicitly positive content."

"Autistic people should be sterilized" seems offensive to him, but it stays up as well. Autism is not a "protected characteristic" the way race and gender are, so it doesn't violate the policy. ("Men should be sterilized" would be taken down.)

In January, Facebook distributes a policy update stating that moderators should take into account recent romantic upheaval when evaluating posts that express hatred toward a gender. "I hate all men" has always violated the policy. But "I just broke up with my boyfriend, and I hate all men" no longer does.

Miguel works the posts in his queue. They arrive in no particular order at all.

Here is a racist joke. Here is a man having sex with a farm animal. Here is a graphic video of murder recorded by a drug cartel. Some of the posts Miguel reviews are on Facebook, where he says bullying and hate speech are more common; others are on Instagram, where users can post under pseudonyms, and tend to share more violence, nudity, and sexual activity.

Each post presents Miguel with two separate but related tests. First, he must determine whether a post violates the community standards. Then, he must select the correct reason why it violates the standards. If he accurately recognizes that a post should be removed, but selects the "wrong" reason, this will count against his accuracy score.

Miguel is very good at his job. He will take the correct action on each of these posts, striving to purge Facebook of its worst content while protecting the maximum amount of legitimate (if uncomfortable) speech. He will spend less than 30 seconds on each item, and he will do this up to 400 times a day.

When Miguel has a question, he raises his hand, and a "subject matter expert" (SME) — a contractor expected to have more comprehensive knowledge of Facebook's policies, who makes $1 more per hour than Miguel does — will walk over and assist him. This will cost Miguel time, though, and while he does not have a quota of posts to review, managers monitor his productivity, and ask him to explain himself when the number slips into the 200s.

From Miguel's 1,500 or so weekly decisions, Facebook will randomly select 50 or 60 to audit. These posts will be reviewed by a second Cognizant employee — a quality assurance worker, known internally as a QA, who also makes $1 per hour more than Miguel. Full-time Facebook employees then audit a subset of QA decisions, and from these collective deliberations, an accuracy score is generated.

Miguel takes a dim view of the accuracy figure.

"Accuracy is only judged past agreement. If me and the accountant both permit the obvious auction of heroin, Cognizant was 'right,' because we both agreed," he says. "This number is false."

Facebook's single-minded focus on accuracy developed after sustaining years of criticism over its handling of moderation bug. With billions of new posts arriving each 24-hour interval, Facebook feels pressure on all sides. In some cases, the company has been criticized for non doing plenty — as when United Nations investigators plant that information technology had been complicit in spreading detest speech during the genocide of the Rohingya community in Myanmar. In others, it has been criticized for overreach — every bit when a moderator removed a postal service that excerpted the Declaration of Independence. (Thomas Jefferson was ultimately granted a posthumous exemption to Facebook's spoken communication guidelines, which prohibit the use of the phrase "Indian savages.")

One reason moderators struggle to striking their accuracy target is that for whatsoever given policy enforcement decision, they take several sources of truth to consider.

The canonical source for enforcement is Facebook's community guidelines — which consist of two sets of documents: the publicly posted ones, and the longer internal guidelines, which offer more granular detail on complex issues. These documents are further augmented by a 15,000-word secondary document, called "Known Questions," which offers additional commentary and guidance on thorny questions of moderation — a kind of Talmud to the community guidelines' Torah. Known Questions used to occupy a single lengthy document that moderators had to cross-reference daily; last year it was incorporated into the internal community guidelines for easier searching.

A third major source of truth is the discussions moderators have among themselves. During breaking news events, such as a mass shooting, moderators will try to reach a consensus on whether a graphic image meets the criteria to be deleted or marked as disturbing. But sometimes they reach the wrong consensus, moderators said, and managers have to walk the floor explaining the correct decision.

The fourth source is perhaps the most problematic: Facebook's own internal tools for distributing information. While official policy changes typically arrive every other Wednesday, incremental guidance about developing issues is distributed on a near-daily basis. Often, this guidance is posted to Workplace, the enterprise version of Facebook that the company introduced in 2016. Like Facebook itself, Workplace has an algorithmic News Feed that displays posts based on engagement. During a breaking news event, such as a mass shooting, managers will often post conflicting information about how to moderate individual pieces of content, which then appears out of chronological order on Workplace. Six current and former employees told me that they had made moderation mistakes based on seeing an outdated post at the top of their feed. At times, it feels as if Facebook's own product is working against them. The irony is not lost on the moderators.

"Information technology happened all the time," says Diana, a former moderator. "Information technology was horrible — i of the worst things I had to personally bargain with, to practice my job properly." During times of national tragedy, such equally the 2017 Las Vegas shooting, managers would tell moderators to remove a video — and then, in a split post a few hours later, to leave it upwards. The moderators would make a decision based on whichever post Workplace served up.

"It was such a big mess," Diana says. "Nosotros're supposed to be upward to par with our decision making, and it was messing up our numbers."

Workplace posts about policy changes are supplemented by occasional slide decks that are shared with Cognizant workers most special topics in moderation — often tied to grim anniversaries, such as the Parkland shooting. But these presentations and other supplementary materials frequently contain embarrassing errors, moderators told me. Over the by year, communications from Facebook incorrectly identified certain U.S. representatives as senators; misstated the date of an election; and gave the incorrect name for the high school at which the Parkland shooting took place. (Information technology is Marjory Stoneman Douglas High Schoolhouse, not "Stoneham Douglas High School.")

Even with an e'er-changing rulebook, moderators are granted simply the slimmest margins of mistake. The task resembles a high-stakes video game in which yous start out with 100 points — a perfect accuracy score — and then scratch and claw to go on equally many of those points as you tin can. Because in one case y'all fall below 95, your job is at risk.

If a quality assurance manager marks Miguel's decision wrong, he can appeal the decision. Getting the QA to agree with you is known as "getting the point back." In the short term, an "error" is whatever a QA says it is, and so moderators have good reason to appeal every time they are marked wrong. (Recently, Cognizant made it even harder to get a point back, by requiring moderators to first get a SME to approve their appeal before it would be forwarded to the QA.)

Sometimes, questions about confusing subjects are escalated to Facebook. But every moderator I asked about this said that Cognizant managers discourage employees from raising issues to the client, apparently out of fear that too many questions would annoy Facebook.

This has resulted in Cognizant inventing policy on the fly. When the community standards did not explicitly prohibit erotic asphyxiation, three former moderators told me, a team leader declared that images depicting choking would be permitted unless the fingers depressed the skin of the person being choked.

Before workers are fired, they are offered coaching and placed into a remedial program designed to make sure they master the policy. But often this serves as a pretext for managing workers out of the job, three former moderators told me. Other times, contractors who have missed too many points will escalate their appeals to Facebook for a final decision. But the company does not always get through the backlog of requests before the employee in question is fired, I was told.

Officially, moderators are prohibited from approaching QAs and lobbying them to reverse a decision. But it is still a regular occurrence, two former QAs told me.

One, named Randy, would sometimes return to his car at the end of a workday to find moderators waiting for him. Five or six times over the course of a year, someone would attempt to intimidate him into changing his ruling. "They would confront me in the parking lot and tell me they were going to beat the shit out of me," he says. "There wasn't even a single case where it was respectful or nice. It was just, You audited me wrong! That was a boob! That was full areola, come on man!"

Fearing for his safety, Randy began bringing a concealed gun to work. Fired employees regularly threatened to return to work and harm their old colleagues, and Randy believed that some of them were serious. A former coworker told me she was aware that Randy brought a gun to work, and approved of it, fearing on-site security would not be sufficient in the case of an attack.

Cognizant's Duncan told me the company would investigate the various safety and management issues that moderators had disclosed to me. He said bringing a gun to work was a violation of policy and that, had management been aware of it, they would have intervened and taken action against the employee.

Randy quit after a year. He never had occasion to fire the gun, but his anxiety lingers.

"Part of the reason I left was how unsafe I felt in my own home and my own skin," he says.

Before Miguel can take a break, he clicks a browser extension to let Cognizant know he is leaving his desk. ("That's a standard thing in this type of industry," Facebook's Davidson tells me. "To be able to track, so you know where your workforce is.")

Miguel is allowed two 15-minute breaks, and one 30-minute lunch. During breaks, he often finds long lines for the restrooms. Hundreds of employees share just one urinal and two stalls in the men's room, and three stalls in the women's. Cognizant eventually allowed employees to use a restroom on another floor, but getting there and back will take Miguel precious minutes. By the time he has used the restroom and fought the crowd to his locker, he might have five minutes to look at his phone before returning to his desk.

Miguel is also allotted nine minutes per day of "wellness time," which he is supposed to use if he feels traumatized and needs to step away from his desk. Several moderators told me that they routinely used their wellness time to go to the restroom when lines were shorter. But management eventually realized what they were doing, and ordered employees not to use wellness time to relieve themselves. (Recently a group of Facebook moderators hired through Accenture in Austin complained about "inhumane" conditions related to break periods; Facebook attributed the issue to a misunderstanding of its policies.)

At the Phoenix site, Muslim workers who used wellness time to perform one of their five daily prayers were told to stop the practice and do it on their other break time instead, current and former employees told me. It was unclear to the employees I spoke with why their managers did not consider prayer to be a valid use of the wellness program. (Cognizant did not offer a comment about these incidents, although a person familiar with one case told me a worker requested more than 40 minutes for daily prayer, which the company considered excessive.)

Cognizant employees are told to cope with the stress of the job by visiting counselors, when they are available; by calling a hotline; and by using an employee assistance program, which offers a handful of therapy sessions. More recently, yoga and other therapeutic activities have been added to the workweek. But aside from occasional visits to the counselor, six employees I spoke with told me they found these resources inadequate. They told me they coped with the stress of the job in other ways: with sex, drugs, and offensive jokes.

Among the places that Cognizant employees have been found having sex at work: the bathroom stalls, the stairwells, the parking garage, and the room reserved for lactating mothers. In early 2018, the security team sent out a memo to managers alerting them to the behavior, a person familiar with the matter told me. The solution: management removed door locks from the mother's room and from a handful of other private rooms. (The mother's room now locks again, but would-be users must first check out a key from an administrator.)

A former moderator named Sara said that the secrecy around their work, coupled with the difficulty of the job, forged strong bonds between employees. "You get really close to your coworkers really quickly," she says. "If you're not allowed to talk to your friends or family about your job, that's going to create some distance. You might feel closer to these people. It feels like an emotional connection, when in reality you're just trauma bonding."

Employees also cope using drugs and booze, both on and off campus. One former moderator, Li, told me he used marijuana on the job nearly daily, through a vaporizer. During breaks, he says, small groups of employees often head outside and smoke. (Medical marijuana use is legal in Arizona.)

"I can't even tell you how many people I've smoked with," Li says. "It's so sad, when I think back about it — it really does hurt my heart. We'd go down and get stoned and go back to work. That's not professional. Knowing that the content moderators for the world's biggest social media platform are doing this on the job, while they are moderating content …"

He trailed off.

Li, who worked as a moderator for about a year, was one of several employees who said the workplace was rife with pitch-black humor. Employees would compete to send each other the most racist or offensive memes, he said, in an effort to lighten the mood. As an ethnic minority, Li was a frequent target of his coworkers, and he embraced what he saw as good-natured racist jokes at his expense, he says.

But over time, he grew concerned for his mental health.

"We were doing something that was concealment our soul — or whatever you call it," he says. "What else exercise you practice at that point? The 1 thing that makes us laugh is actually damaging us. I had to scout myself when I was joking around in public. I would accidentally say [offensive] things all the time — and then be similar, Oh shit, I'm at the grocery store. I cannot be talking similar this."

Jokes about self-harm were also common. "Drinking to forget," Sara heard a coworker once say, when the counselor asked him how he was doing. (The counselor did not invite the employee in for further discussion.) On bad days, Sara says, people would talk about it being "time to go hang out on the roof" — the joke being that Cognizant employees might one day throw themselves off it.

One day, Sara said, moderators looked up from their computers to see a man standing on top of the office building next door. Most of them had watched hundreds of suicides that began just this way. The moderators got up and hurried to the windows.

The man didn't jump, though. Eventually everyone realized that he was a fellow employee, taking a break.

Like most of the former moderators I spoke with, Chloe quit after about a year.

Among other things, she had grown concerned about the spread of conspiracy theories among her colleagues. One QA often discussed his belief that the Earth is flat with colleagues, and "was actively trying to recruit other people" into believing, another moderator told me. One of Miguel's colleagues once referred casually to "the Holohoax," in what Miguel took as a signal that the man was a Holocaust denier.

Conspiracy theories were often well received on the production floor, six moderators told me. After the Parkland shooting last year, moderators were initially horrified by the attacks. But as more conspiracy content was posted to Facebook and Instagram, some of Chloe's colleagues began expressing doubts.

"People really started to believe these posts they were supposed to be moderating," she says. "They were saying, 'Oh gosh, they weren't really there. Look at this CNN video of David Hogg — he's too old to be in school.' People started Googling things instead of doing their jobs and looking into conspiracy theories about them. We were like, 'Guys, no, this is the crazy stuff we're supposed to be moderating. What are you doing?'"

Most of all, though, Chloe worried about the long-term impacts of the job on her mental health. Several moderators told me they experienced symptoms of secondary traumatic stress — a disorder that can result from observing firsthand trauma experienced by others. The disorder, whose symptoms can be identical to those of post-traumatic stress disorder, is often seen in physicians, psychotherapists, and social workers. People experiencing secondary traumatic stress report feelings of anxiety, sleep loss, loneliness, and dissociation, among other ailments.

Last year, a former Facebook moderator in California sued the company, saying her job as a contractor with the firm Pro Unlimited had left her with PTSD. In the complaint, her lawyers said she "seeks to protect herself from the dangers of psychological trauma resulting from Facebook's failure to provide a safe workplace for the thousands of contractors who are entrusted to provide the safest possible environment for Facebook users." (The suit is still unresolved.)

Chloe has experienced trauma symptoms in the months since leaving her job. She started to have a panic attack in a movie theater during the film Mother!, when a violent stabbing spree triggered a memory of that first video she moderated in front of her fellow trainees. Another time, she was sleeping on the couch when she heard machine gun fire, and had a panic attack. Someone in her house had turned on a violent TV show. She "started freaking out," she says. "I was begging them to shut it off."

The attacks make her think of her fellow trainees, particularly the ones who fail out of the program before they can start. "A lot of people don't actually make it through the training," she says. "They go through those four weeks and then they get fired. They could have had that same experience that I did, and had absolutely no access to counselors after that."

Last week, Davidson told me, Facebook began surveying a test group of moderators to measure what the company calls their "resiliency" — their ability to bounce back from seeing traumatic content and continue doing their jobs. The company hopes to expand the test to all of its moderators globally, he said.

"Every mean solar day I see that," Randy says, "I have a 18-carat fright over knives. I like cooking — getting back into the kitchen and being around the knives is really hard for me."

The job also changed the way he saw the earth. After he saw so many videos saying that ix/xi was non a terrorist attack, he came to believe them. Conspiracy videos about the Las Vegas massacre were too very persuasive, he says, and he at present believes that multiple shooters were responsible for the attack. (The FBI found that the massacre was the work of a unmarried gunman.)

Randy now sleeps with a gun at his side. He runs mental drills about how he would escape his domicile in the event that it were attacked. When he wakes up in the morning, he sweeps the house with his gun raised, looking for invaders.

He has recently begun seeing a new therapist, after beingness diagnosed with PTSD and generalized anxiety disorder.

"I'm fucked up, man," Randy says. "My mental health — it's merely so up and downwards. One twenty-four hour period I tin can be really happy, and doing actually good. The side by side twenty-four hours, I'm more or less of a zombie. It's not that I'yard depressed. I'm just stuck."

He adds: "I don't think it'due south possible to do the job and not come out of information technology with some acute stress disorder or PTSD."

A common complaint of the moderators I spoke with was that the on-site counselors were largely passive, relying on workers to recognize the signs of anxiety and depression and seek help.

"There was nothing that they were doing for us," Li says, "other than expecting us to be able to identify when we're broken. Most of the people there that are deteriorating — they don't even see it. And that's what kills me."

Last week, after I told Facebook about my conversations with moderators, the company invited me to Phoenix to see the site for myself. It is the first time Facebook has allowed a reporter to visit an American content moderation site since the company began building dedicated facilities here two years ago. A spokeswoman who met me at the site says that the stories I have been told do not reflect the day-to-day experiences of most of its contractors, either at Phoenix or at its other sites around the world.

The day before I arrived at the office park where Cognizant resides, one source tells me, new motivational posters were hung up on the walls. On the whole, the space is much more colorful than I expect. A neon wall chart outlines the month's activities, which read like a cross between the activities at summer camp and a senior center: yoga, pet therapy, meditation, and a Mean Girls-inspired event called On Wednesdays We Wear Pink. The day I was there marked the end of Random Acts of Kindness Week, in which employees were encouraged to write inspirational messages on colorful cards, and attach them to a wall with a piece of candy.

After meetings with executives from Cognizant and Facebook, I interview five workers who had volunteered to speak with me. They stream into a conference room, along with the man who is responsible for running the site. With their boss sitting at their side, employees acknowledge the challenges of the job but tell me they feel safe, supported, and believe the job will lead to better-paying opportunities — within Cognizant, if not Facebook.

Brad, who holds the title of policy director, tells me that the majority of content that he and his colleagues review is essentially benign, and warns me against overstating the mental health risks of doing the job.

"There'south this perception that we're bombarded past these graphic images and content all the time, when in fact the contrary is the truth," says Brad, who has worked on the site for nearly two years. "About of the stuff we see is balmy, very mild. It's people going on rants. It'south people reporting photos or videos just because they don't want to run across it — not because in that location's any outcome with the content. That's actually the bulk of the stuff that we see."

When I ask well-nigh the high difficulty of applying the policy, a reviewer named Michael says that he regularly finds himself stumped past catchy decisions. "There is an infinite possibility of what'due south gonna exist the next job, and that does create an essence of anarchy," he says. "Merely information technology likewise keeps it interesting. You're never going to go an entire shift already knowing the answer to every question."

In whatever case, Michael says, he enjoys the work better than he did at his last chore, at Walmart, where he was often berated by customers. "I do not accept people yelling in my face," he says.

The moderators stream out, and I'm introduced to two counselors on the site, including the doctor who started the on-site counseling programme here. Both ask me not to utilize their existent names. They tell me that they cheque in with every employee every day. They say that the combination of on-site services, a hotline, and an employee assistance plan are sufficient to protect workers' well-being.

When I ask about the risks of contractors developing PTSD, a counselor I'll call Logan tells me about a different psychological phenomenon: "post-traumatic growth," an effect whereby some trauma victims emerge from the experience feeling stronger than before. The example he gives me is that of Malala Yousafzai, the women's education activist, who was shot in the head as a teenager by the Taliban.

"That's an extremely traumatic event that she experienced in her life," Logan says. "It seems like she came back extremely resilient and strong. She won a Nobel Peace Prize... So there are many examples of people that experience difficult times and come back stronger than before."

The day ends with a tour, in which I walk the production floor and talk with other employees. I am struck by how young they are: almost everyone seems to be in their twenties or early thirties. All work stops while I'm on the floor, to ensure I do not see any Facebook user's private information, and so employees chat amiably with their deskmates as I walk by. I take note of the posters. One, from Cognizant, bears the enigmatic slogan "empathy at scale." Another, made famous by Facebook COO Sheryl Sandberg, reads "What would you do if you weren't afraid?"

It makes me think of Randy and his gun.

Everyone I meet at the site expresses great care for the employees, and appears to be doing their best for them, within the context of the system they have all been plugged into. Facebook takes pride in the fact that it pays contractors at least 20 percent above minimum wage at all of its content review sites, provides full healthcare benefits, and offers mental health resources that far exceed those of the larger call center industry.

And yet the more moderators I spoke with, the more I came to doubt the use of the call center model for content moderation. This model has long been standard across big tech companies — it's also used by Twitter and Google, and therefore YouTube. Beyond cost savings, the benefit of outsourcing is that it allows tech companies to rapidly expand their services into new markets and languages. But it also entrusts essential questions of speech and safety to people who are paid as if they were handling customer service calls for Best Buy.

Every moderator I spoke with took great pride in their work, and talked about the job with profound seriousness. They wished only that Facebook employees would think of them as peers, and treat them with something resembling equality.

"If we weren't there doing that job, Facebook would be so ugly," Li says. "We're seeing all that stuff on their behalf. And hell yeah, we make some wrong calls. But people don't know that there's actually human beings behind those seats."

That people don't know there are human beings doing this work is, of course, by design. Facebook would rather talk about its advancements in artificial intelligence, and dangle the prospect that its reliance on human moderators will decline over time.

But given the limits of the technology, and the infinite varieties of human speech, such a day appears to be very far away. In the meantime, the call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, an NDA ensures that they retreat even further into the shadows.

To Facebook, it will seem as if they never worked there at all. Technically, they never did.

Have you done content moderation work for a tech giant? Email Casey Newton at casey@theverge.com, send him a direct message on Twitter @CaseyNewton, or ask him for his Signal at either address.


Source: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
