Combatting CSAM

YumyHub fights to stop the production and distribution of child sexual abuse material (CSAM).

YumyHub is building the most secure digital media platform in the world. We do not tolerate CSAM on our platform and actively work to keep it off. Producing or distributing CSAM is criminal, morally repugnant, and a violation of our Terms of Service and Acceptable Use Policy.

We have a dedicated team working around the clock to prevent suspected CSAM from reaching our platform and to promptly remove it if it does.

What is CSAM?

Any photograph or video of sexually explicit conduct, including nudity, involving a minor under the age of 18 is considered CSAM. These images are a form of child exploitation and abuse.

How frequently does CSAM appear on YumyHub?

YumyHub aggressively targets and reports people who attempt to use our platform to distribute CSAM.

Suspected CSAM accounts for less than 0.001% of all content that creators submit for posting on YumyHub. We report all suspected CSAM incidents to NCMEC. Every suspected image or video is blocked and removed from the platform while we investigate. Many of these suspected images ultimately turn out not to be CSAM, or are stopped before they ever reach YumyHub.

How does YumyHub identify CSAM on its platform?

We constantly scan our platform to stop CSAM from being posted. Every one of our content moderators is trained to spot and report suspected CSAM.

We use cutting-edge digital technology to evaluate whether content is permitted on the platform before it can appear on a newsfeed. Every piece of content that passes this preliminary inspection is then manually reviewed by our trained human moderators within 24 hours. Any content our moderators suspect to be CSAM is promptly flagged and escalated.

What does YumyHub look for when attempting to identify CSAM?

To stop the distribution of known CSAM, we check all content against databases and other resources used by law enforcement before it can be published on a creator’s newsfeed.
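Checking content against databases of known material is typically done by comparing content fingerprints (hashes) rather than the files themselves. The sketch below is a hypothetical illustration only, not YumyHub’s actual implementation: real industry systems use perceptual hashes (such as PhotoDNA) that tolerate resizing and re-encoding, whereas this minimal example uses a plain SHA-256 digest against an invented blocklist.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known prohibited files.
# (Real systems query shared industry hash databases and use perceptual,
# not exact, hashes so that altered copies are still matched.)
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_prohibited(data: bytes) -> bool:
    """Return True if the content's digest appears in the blocklist."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

Because a hash lookup is cheap, this kind of check can run on every upload before anything is published, which is why matching against known material is the first line of defense.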

“New” CSAM, which does not already exist in the databases and tools used by law enforcement, can be harder to identify. To find potential “new” CSAM, we carefully examine image, text, and audio data. Our technology and personnel work together to report suspected CSAM that has not been previously identified. After verification, we share this information with law enforcement and non-governmental organizations to help identify the offenders.

What happens when YumyHub discovers suspected CSAM on its platform?

We immediately remove any suspected CSAM and submit a “CyberTipline” report to the National Center for Missing & Exploited Children (NCMEC). The number of reports sent to NCMEC by YumyHub and other digital media companies can be found here.

To help law enforcement combat CSAM, NCMEC reviews these reports and distributes them to the relevant agencies worldwide. We work closely with law enforcement to investigate, charge, and prosecute anyone who attempts to use our platform to produce or distribute CSAM.

Any time a user attempts to share suspected CSAM on YumyHub, we investigate immediately and take appropriate action. No one is permitted to produce or distribute CSAM on YumyHub.

How can YumyHub identify CSAM in a direct message or other private post? Are these posts and messages encrypted?

YumyHub does not use end-to-end encryption. Our team of trained reviewers can see everything on the platform. YumyHub has no “secret” areas, “hidden” posts, or disappearing messages.

We have the right to review and delete any image or video posted on YumyHub, including in all direct messages.

Can CSAM be distributed via the YumyHub subscription model?

No. Our subscription model makes the creation and distribution of CSAM more difficult.

Users must pass our stringent identity verification checks before they can subscribe to or post content on YumyHub. This means that, unlike many other digital media platforms, we know the true identity of every one of our users. YumyHub does not allow anonymous posting.

Because YumyHub users are not anonymous and end-to-end encryption is not available, they are less likely to attempt to produce or distribute CSAM. If any user attempts to publish or produce CSAM on our platform, we know who they are, we report them, and we ban them from YumyHub.

How can I report suspected CSAM?

YumyHub provides a report button on every post and account. If you see any content on YumyHub that you believe may constitute CSAM, please click the report button or send an email to to let us know what you saw.

How can I be sure YumyHub is taking this matter seriously?

YumyHub’s goal is to build the world’s safest social networking platform. We hold ourselves accountable, and we regularly publish data to support the measures we claim to be taking. This information is available in our Transparency Reports, which can be found here.

We have gone a step further and appointed a neutral third-party monitor to oversee the procedures we have in place to combat the production and distribution of CSAM.

What other steps do you take to stop CSAM from being produced or distributed?

We work closely with governments, regulators, law enforcement, non-governmental organizations, foundations, and other businesses to combat CSAM.

Please send us an email at if you would like more information about our work to stop the production and distribution of CSAM.