The digital rubbish collectors of the Philippines

Have you ever wondered why user-generated Internet sites such as Facebook, Twitter, Instagram and the like are so “clean”? So appropriate for family use? Largely free of nudity, blood and gore, violence, political agitation, depictions of special sexual interests and most other disgusting stuff?

You might think smart software algorithms deployed by Silicon Valley’s social media giants filter the dark junk out. But that’s a misconception. There is no artificial intelligence to date that could distinguish, say, a cheap selfie showing bare female breasts from a Titian painting showing exactly the same thing yet recognised as Renaissance art. Or a picture of Hitler accompanying a historical excursus from outright Nazi agitation. A subtle cartoon from open vilification. Enacted violence from real violence. Irony from bullying.

It is not software doing the dirty work. Nor is it someone at national police headquarters sitting there making sure nobody posts child porn, pictures of beheadings or other sick stuff.

This work is in fact done by third-party service providers, most of which operate in the Philippines.

Silicon Valley corporations employ tens of thousands of digital house cleaners to get rid of the unwanted junk people try to post on social media sites and other platforms. They are human filters, brigades that brush inappropriate stuff off the Internet around the clock in digital sweatshops.

These are large-scale business process outsourcing (BPO) companies such as TaskUs, Arvato and MicroSourcing, with offices in the Philippines, which employ digital garbage disposers to ensure that content on Internet platforms and social media complies with legal and regulatory requirements, community guidelines and user agreements, and that it falls within a site’s norms of taste and acceptability.

In plain terms, they sit in front of a monitor and clean the web of things such as dick pics, highway accident gore, Islamist hate speech, animal violence and all the junk posted by the Internet’s myriad jerks, racists, sexually deranged, crooks, slobs and bullies.

The work they do is called “commercial content moderation” and is an important source of revenue for BPO companies in the Philippines, where the field employs more than 100,000 people, almost eight times the number of Facebook’s own staff.

They sit for eight to ten hours a day clicking through images that most would consider gruesome, distasteful or outright sick, to keep them out of sight of the ordinary Internet user. Whether it is hardcore porn, political agitation or bloody violence, such pics or videos get nuked by a mouse click in a matter of seconds.

And the flow of disturbing or distasteful content never stops. Once the odd close-up dildo selfie has gone to the trash, a picture of a severed, impaled head posted by IS sympathisers goes the same way, and other obscenities and horrifying trash follow.

Sarah T. Roberts, a researcher at the Faculty of Information and Media Studies at the University of Western Ontario, looked behind the scenes of “commercial content moderation” and presented her findings at a recent seminar at the Heinrich Böll Foundation in Berlin, Germany, together with German theatre director Moritz Riesewieck, who went to the Philippines to visit content moderation firms and talk anonymously to the moderators.

What is filtered out by the mostly young Filipino BPO labourers, Roberts calls “e-waste”, “psycho garbage”, “digital refuse” and “techno trash”. Staff exposed day in, day out to sickening depictions of the worst of humanity do in fact get sick, at least psychologically: they suffer from nightmares, become alienated from other people, run into relationship problems, start abusing narcotics or alcohol, lose interest in sex owing to non-stop exposure to extreme pornography, or become paranoid, depressive or impotent.

“They are constantly exposed to pictures showing rape, animal sex, child porn, beheadings, mutilations, gore, excrement and whatever vile stuff gets posted by people,” says Riesewieck.

“They have to sign non-disclosure agreements, and they are forbidden to talk about what they do even with those closest to them or their partners,” he adds. “They are threatened with penalties if they violate the agreement. Most of them quit the job after a year or two and are left with psychological scars.”

The full impact of looking endlessly at shocking and horrifying material in this job setting has not yet been studied. Most BPO employers wave away concerns about worker well-being simply because the work causes no physical damage.

Roberts likens the outsourcing of such digital clean-up work to the offloading of real (illegal) garbage from North America to the Philippines, a practice that was common in past decades – and probably still is – in which shiploads of hazardous waste, disguised as recyclable materials, were shipped to Manila and dumped somewhere in the Philippines without proper treatment.

“Now most of the toxic techno waste produced by the Internet arrives in the Philippines through worldwide networks of deep-sea cables for trashing by BPO workers,” says Roberts.

“And, much like the trash collectors and dismantlers of physical refuse, they are unseen and their work goes largely without acknowledgement. After all, who misses what is not there?” she adds.

There are guidelines for what has to be disposed of: a wide range of material, but mostly content that is highly sexual or pornographic, that depicts the physical and sexual abuse of adults and children or the abuse and torture of animals, content coming from war zones and other areas besieged by violent conflict, political and religious hate rants, and any material designed to be racist, shocking, prurient or offensive.
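
To make that concrete, here is a minimal, purely illustrative Python sketch of how such category-based removal rules might be encoded. Everything in it – the category labels, the Post structure, the moderate() function – is an invention for this sketch; the platforms’ actual guidelines are confidential, as noted below.

```python
# Purely illustrative sketch of category-based content triage.
# All names and categories here are invented; real platform
# guidelines are confidential and far more detailed.

from dataclasses import dataclass, field

# Categories the article says moderation guidelines mostly target
REMOVE_CATEGORIES = {
    "pornographic",
    "abuse_of_adults_or_children",
    "animal_abuse_or_torture",
    "war_zone_violence",
    "hate_rant",
    "racist_shocking_or_prurient",
}

@dataclass
class Post:
    post_id: str
    flags: set = field(default_factory=set)  # labels attached, e.g., by user reports

def moderate(post: Post) -> str:
    """The binary, seconds-long call the article describes:
    'remove' if any flagged category is on the removal list, else 'keep'."""
    return "remove" if post.flags & REMOVE_CATEGORIES else "keep"

if __name__ == "__main__":
    print(moderate(Post("p1", {"hate_rant"})))   # -> remove
    print(moderate(Post("p2", {"cat_video"})))   # -> keep
```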

However, commercial content moderators are often left alone with these decisions, which they have to make in a matter of seconds, as the flow of gigabytes of digital trash never stops.

The researchers explain the global “market leadership” of Filipinos in getting rid of digital waste by the country’s close cultural ties to the US and by Filipinos’ better understanding of Western culture from an Asian viewpoint. Filipinos, they say, know Western moral and religious standards. Combined with their diligence, rigour and forbearance, this seems to make them the ideal people for a nasty job the big social media corporations are reluctant to talk about.

But it is not just about cleaning up the Internet. Commercial content moderation also has political implications. For example, pictures of IS beheadings are not supposed to be deleted, as they have “journalistic” significance, while beheadings in the Mexican drug war get trashed.

“The Internet firms don’t detail their exact guidelines for content management,” says Riesewieck. “They are generally tight-lipped about the issue.”

However, one such guideline from a large social media firm was leaked in 2012, he notes.

“It said that, for example, showing a smashed head is okay as long as no cerebral matter can be seen. Pictures of breastfeeding mothers had to be deleted, though, as well as maps showing Kurdistan,” he says.
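
Purely as an illustration of how such context-dependent rules might look in practice, the examples from the leaked document, together with the IS-versus-drug-war distinction mentioned above, could be encoded roughly as follows. Every attribute name here is invented; the rules merely paraphrase what is quoted in the article.

```python
# Hypothetical encoding of the context-dependent rules reportedly found
# in the leaked 2012 guideline. Attribute names are invented; the rules
# paraphrase the examples quoted in the article.

def decide(image: dict) -> str:
    """Return 'keep' or 'remove' for a described image, rule by rule."""
    # Graphic injury: reportedly allowed unless cerebral matter is visible
    if image.get("smashed_head"):
        return "remove" if image.get("cerebral_matter_visible") else "keep"
    # Beheadings: kept when deemed journalistically significant (e.g. IS),
    # removed otherwise (e.g. the Mexican drug war), as described above
    if image.get("beheading"):
        return "keep" if image.get("journalistic_significance") else "remove"
    # Blanket removals reportedly listed in the leaked document
    if image.get("breastfeeding") or image.get("kurdistan_map"):
        return "remove"
    return "keep"

if __name__ == "__main__":
    print(decide({"beheading": True, "journalistic_significance": True}))   # -> keep
    print(decide({"smashed_head": True, "cerebral_matter_visible": True}))  # -> remove
```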

The opacity of the content guidelines of Facebook & Co is seen as a general issue.

“What the moderation is based on reflects the foreign policy agenda of the country where the Internet company is headquartered,” says Roberts. She also argues that commercial content moderation could be seen as the “outsourcing of censorship to private companies,” something that would be worrying for a democracy because it would put “profit seeking over freedom of speech.”



