
According to a three-part report published by The Wall Street Journal, Facebook responded inadequately or not at all to reports from its own employees that the social media platform was being used by drug cartels and human traffickers to drum up business and recruits, among other crimes and atrocities. According to the WSJ report, internal Facebook documents reveal employees raising red flags that "human traffickers in the Middle East used the site to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses on organ selling, pornography and government action against political dissent, according to the documents." Facebook removed some pages, though it left up dozens more. Critics claim that Facebook is not working hard enough to address this exploitation, and may even be profiting from it. The WSJ reports that Brian Boland, a former Facebook vice president, claims that Facebook "treats harm in developing countries as 'simply the cost of doing business.'" Boland says, "There is very rarely a significant, concerted effort to invest in fixing those areas."
In one case, a Facebook employee who was a former police officer investigated a Mexican drug cartel's activities on Facebook and Instagram. He found messages between cartel recruiters and recruits warning that recruits would be beaten or killed if they tried to escape from the training camp, as well as several Facebook pages showing weapons and crime scenes. The employee reported this, but Facebook failed to remove the cartel from the platform, only removing some of the content. Days later, a new Instagram account tied to the cartel was discovered with more violent postings.
Last June, the Texas Supreme Court ruled that three teens who are trafficking survivors can move forward with their lawsuit against Facebook. The teens claim that each of them met their pimps or abusers through Facebook Messenger and that Facebook was negligent because it "failed to warn about or attempt to prevent sex trafficking from taking place on its internet platforms." Facebook's lawyers argued that the social media giant was immune from such lawsuits, being protected by Section 230 of the federal Communications Decency Act, which says that social media platforms cannot be held responsible for what their users post. The majority of justices on the Texas court, however, wrote, "We do not understand Section 230 to 'create a lawless no-man's-land on the Internet' in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online trafficking." The court wrote: "Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that Section 230 does not allow it. … Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking." The justices pointed out that Congress amended Section 230 to add civil liability for websites that violate state and federal laws against human trafficking.
Facebook, of course, defends its efforts to monitor and remove such criminal activity, and I can believe that the effort is overwhelming. Still, critics say that Facebook's efforts are either not enough or that the company simply isn't interested in taking care of the problem. And the problem is huge. According to the Federal Human Trafficking Report for 2020, 59 percent of online recruitment of sex trafficking victims took place on Facebook, and 65 percent of child sex trafficking victims recruited on social media were recruited on Facebook.
There seem to be two versions of what's going on here. Facebook says that it's trying to monitor its platform and eliminate its use by drug cartels, sex traffickers and other nefarious actors, but that the effort is massive and it's not able to catch everyone. Critics, including current and former employees, insist that Facebook is either failing to act adequately or failing to act at all when it learns of criminal activity exploiting the platform. I gather it's a bit of both. I certainly can understand that the effort to eliminate all criminal activity on a platform as large as Facebook is enormous. I can also believe that Facebook isn't terribly motivated to make that effort. The reports of employees and former employees suggest Facebook's negligence is real. Call me cynical, but it's not hard to believe that Facebook is more interested in keeping its customers happy and its platform free and easy than it is in monitoring criminal activity. It's worth remembering that pro-Trump extremists were able to spend months using Facebook to help plan the January 6 attack on the Capitol Building.
Be Christ for all. Bring Christ to all. See Christ in all.