How Facebook Messenger and Meta Pay are used to buy child sexual abuse material | Technology



When police in Pennsylvania arrested 29-year-old Jennifer Louise Whelan in November 2022, they charged her with dozens of counts of serious crimes, including sex trafficking and indecent assault of three young children.

One month earlier, police said they had discovered Whelan was using three children as young as six, all in her care, to produce child sex abuse material. She was allegedly selling and sending videos and photos to a customer over Facebook Messenger. She pleaded not guilty.

The alleged buyer, Brandon Warren, was indicted by a grand jury in February 2022 and accused of distribution of material depicting minors engaged in sexually explicit conduct. Warren also pleaded not guilty.

Court documents seen by the Guardian quote Facebook messages between the two in which Warren allegedly describes to Whelan how he wants her to make these videos.

“I’ll throw in a little extra if you tell him it makes mommy feel good and get a good length video,” he tells Whelan, according to the criminal complaint document used for her arrest.

Whelan received payment for the footage over Meta Pay, Meta’s payment system, according to the criminal complaint against him. “Another 250 right? Heehee,” she allegedly wrote to Warren after sending him a video of her abusing a young girl.

Meta Pay, known as Facebook Pay before rebranding in 2022, is a peer-to-peer payment service enabling users to transfer money over the company’s social networks. Users upload their credit card, debit card or PayPal account information to Facebook Messenger or Instagram to send and receive money.

A spokesperson for Meta confirmed that the company has seen and reported payments via Meta Pay on Facebook Messenger that are suspected of being linked to child sexual exploitation.

“Child sexual exploitation is a horrific crime. We support law enforcement in its efforts to prosecute these criminals and invest in the best tools and expert teams to detect and respond to suspicious activity. Meta reports all apparent child sexual exploitation to NCMEC [the National Center for Missing and Exploited Children], including cases involving payment transactions,” the spokesperson said.

Through reviewing documents and interviewing former Meta content moderators, a Guardian investigation has found that payments for child sexual abuse content taking place on Meta Pay are probably going undetected, and unreported, by the company.

Court documents show Whelan and Warren’s actions were not spotted or flagged by Meta. Instead, Kik Messenger, another social platform, reported that Warren had uploaded videos suspected to be child sexual abuse material (CSAM) to share with other users. This triggered a police investigation in West Virginia, where Warren lives. His electronics were seized, and police then discovered the eight videos and five images that he had allegedly bought from Whelan over Facebook Messenger.

“We responded to valid legal process,” said a Meta spokesperson, in response to the Guardian’s findings that the company did not detect these crimes.

Additionally, two former Meta content moderators, employed between 2019 and 2022, told the Guardian that they saw suspicious transactions taking place via Meta Pay that they believed to be related to child sex trafficking, yet they were unable to communicate with Meta Pay compliance teams to flag these payments.

“It felt like [Meta Pay] was an easy-to-use payment method since these people were communicating on Messenger. The amounts sent could be hundreds of dollars at a time,” said one former moderator, who spoke on condition of anonymity because they had signed a non-disclosure agreement as a condition of employment. The moderator, employed for four years until mid-2022 by Accenture, a Meta contractor, reviewed interactions between adults and children over Facebook Messenger for inappropriate content.

Payments for sex or CSAM are typically just a few hundred dollars or less in the cases reviewed by the Guardian. According to a former Meta Pay compliance analyst, transactions of such small amounts are unlikely to be flagged for review by Meta’s systems.

This means that payments connected to illicit activities are probably taking place undetected, financial crime experts said.

A Meta spokesperson said that the company uses a combination of automated and human review to detect suspicious financial activity in payment transactions in Messenger.

“The size of the payment is just one signal our teams use to identify potentially suspicious activity, and our compliance analysts are trained to assess a variety of signals,” said the Meta spokesperson. “If our teams had reason to suspect suspicious activity, especially activity involving a child and even if the payments are small, it would be investigated and reported appropriately.” The spokesperson also said that the company had “a strong ‘see something, say something’ culture”.

In situations where American men were grooming underage girls abroad, payments could be for things like getting a phone and school supplies, the moderator said.

“Most of what we saw were older men from America, targeting girls in Asian countries and often travelling there,” the moderator added.

“When it comes to child exploitation and CSAM, it’s really all about small amounts,” said Silvija Krupena, director of the financial intelligence unit at RedCompass Labs, a London-based financial consultancy. “It’s a global crime and a pandemic, with different types of offenders. In low-income countries like the Philippines, $20 is big money. The production usually happens in those countries. These are small amounts that can fall through the cracks when it comes to traditional money-laundering controls.”

Meta has a team of about 15,000 moderators and compliance analysts who are tasked with monitoring its platforms for harmful and illegal content. Possible criminal behavior is supposed to be escalated by Meta and reported to law enforcement. Anti-money laundering regulations also require money service businesses to ensure their compliance staff are trained and have access to enough information to detect when illegal financing occurs.

Yet contractors monitoring Meta Pay transaction activity do not receive specific training for detecting and reporting money flows that could be related to human trafficking, including the language, codewords and slang that traffickers typically use, a former Meta Pay payment compliance analyst contractor said.

“If a human trafficker is using a codeword for selling girls, we didn’t get into that. We didn’t really get trained on those,” said the former compliance analyst. “You don’t even give it a second thought or even dig into that kind of stuff at all.”

A Meta spokesperson disputed the payment compliance analyst’s claims.

“Compliance analysts receive both initial and ongoing training on how to detect potentially suspicious activity – which includes signs of possible human trafficking and child sexual exploitation. Our program is regularly updated to reflect the latest guidance from financial crime regulators and safety experts,” the spokesperson said.

Meta’s history with accusations of child exploitation

Meta’s platforms have been linked to alleged child exploitation and the distribution of CSAM in the past. In December, the New Mexico attorney general’s office filed a lawsuit against the company, alleging Facebook and Instagram are “breeding grounds” for predators targeting children for human trafficking, grooming and solicitation. The suit followed an April 2023 Guardian investigation, which revealed how child traffickers were using Meta’s platforms to buy and sell children into sexual exploitation.

As a money services business, Meta Pay is subject to US anti-money laundering and “know your customer” (KYC) banking regulations, which require businesses to report illicit financing to the US treasury department’s Financial Crimes Enforcement Network (FinCEN).

If Meta fails to detect and report these payments, it could be in violation of US anti-money laundering laws, financial crime experts have said.

“Regulations apply to any company that participates in a payments business. But for social media, because they can see users, they see their lives, their transactions, they can see abuse and see contact. It’s such a low-hanging fruit for them to detect this,” said Krupena.

Other peer-to-peer payment apps have faced scrutiny for their practices in preventing illicit activity. In 2023, Senate Democrats requested detailed fraud detection and prevention methods from PayPal, Venmo and Cash App. Sex trafficking “ran rampant” on Cash App, according to a report published last year by US investment research firm Hindenburg. Block, Cash App’s owner, disputed these claims, threatening legal action.

Meta introduced end-to-end encryption to Facebook Messenger in late 2023, but even before this, payment compliance analyst contractors could not access the Messenger chat between the two users exchanging funds. The former Meta compliance analyst told the Guardian their team could only see the transactions, any attached notes and the relationship between the two users.

“I don’t know how you do compliance in general without being able to see intentions around transacting,” said Frances Haugen, a former Facebook employee turned whistleblower, who released tens of thousands of damaging documents about the company’s inner workings in 2021. “If the platforms actually wanted to keep these kids safe, they could.”

Siloed work prevents flagging suspicious transactions, say ex-moderators

Other former content moderators interviewed by the Guardian compared their jobs to call center or factory work. Their jobs entailed reviewing content flagged as suspicious by users and artificial intelligence software and making quick decisions on whether to ignore, remove or escalate the content to Meta through a software program. They say they could not communicate with the Meta Pay compliance analysts about suspicious transactions they witnessed.

“We were not allowed to contact Facebook employees or other teams,” one former moderator said. “Our managers didn’t tell us why this was.”

Gretchen Peters, executive director of the Alliance to Counter Crime Online, has documented the sale of narcotics, including fentanyl, over Meta’s platforms. She also interviewed Meta moderators who were not permitted to communicate with other teams in the company. She said this siloing was a “major violation” of “know your customer” banking regulations.

“We’ve heard from moderators at Meta that they can see illegal conduct is occurring and that there are concurrent transactions through Meta Pay, but they have no way of communicating what they are seeing internally to moderators at Meta Pay,” said Peters.

A Meta spokesperson said the company prohibits the sale or purchase of narcotics on its platforms and removes that content when it finds it.

“Meta complies with all applicable US anti-money laundering laws,” the spokesperson said. “It is also untrue to suggest that there is a lack of communication between teams. Content moderators are trained to escalate to a specific point of contact, who brings in the appropriate specialist team.”

In December, Meta announced it had rolled out end-to-end encryption for messages sent on Facebook and via Messenger. Encryption hides the contents of messages from anyone but the sender and intended recipient by converting text and images into unreadable cyphers that are unscrambled on receipt.

Yet this move could also affect the company’s ability to prevent illicit transactions on Meta Pay. Child safety experts, policymakers, parents and law enforcement criticized the move, arguing encryption obstructs efforts to rescue child sex trafficking victims and to prosecute predators.

“When Meta Pay is linked to Messenger or Instagram, the messages associated with payments could uncover illicit behaviors,” said Krupena. “Now that this context is removed, the implications are significant. It almost feels like encryption is inadvertently facilitating illicit activity. This opens many opportunities for criminals to hide in plain sight.”

A Meta spokesperson said the decision to move to encryption was to “provide people with privacy”, and that the company encourages users to self-report private messages related to child exploitation to the company.

“Moving to an encrypted messaging environment does not mean we will sacrifice safety, and we have developed over 30 safety tools, all of which work in encrypted messaging,” said the spokesperson. “We’ve now made our reporting tools easier to find, reduced the number of steps to report and started encouraging teens to report at relevant moments.”

FinCEN declined to comment.

Accenture did not respond to a request for comment.


