To increase transparency around our safety compliance program, we proactively hired an Independent Monitor to assess and validate the program's conception, execution, and success. For the purpose of this ongoing examination, the Monitor's team receives significant access to our leadership, staff, and data. The Monitor will measure how we put best-in-class safety practices into effect, assess our controls, and provide a full report of their findings.
YumyHub users’ security and privacy may be affected by legal proceedings, technological developments, and other factors. This report includes details on our disclosures of user data as well as technical initiatives to strengthen security.
As a multinational company, YumyHub monitors legal and regulatory developments that affect our creators and our fans. We continue to make any required modifications to our policies and procedures to ensure that we comply with all relevant laws and regulations.
YumyHub receives requests for user information from governments all around the world. We thoroughly examine each request to ensure that it complies with applicable laws. YumyHub responds to government agency requests for assistance with administrative, criminal, and civil legal matters.
In every instance, our legal teams demand that the agencies issue formal legal process, such as a subpoena, production order, search warrant, or court order, in order to compel the disclosure of basic subscriber registration information, specific IP addresses, and other data YumyHub may have access to.
Below are details on the volume of requests we receive.
In all cases, before releasing basic subscriber registration information, specific IP addresses, or other information, YumyHub requires the government to issue a subpoena (or comparable court order) to compel disclosure. A rolling update of the total number of law enforcement requests is published at the end of each month.
Any photograph or video of sexually explicit conduct, including nudity, involving a minor under the age of 18 is considered CSAM. These images are a form of child exploitation and abuse. We use a combination of technology and human review to deter, detect, and remove CSAM from our platforms. We have a zero-tolerance stance against child sexual exploitation online and invest heavily in combating it.
How does YumyHub detect CSAM on its platform?
We employ both automated and human assessment in our proactive prevention and detection efforts. We also promptly respond to reports submitted by our users and other parties, such as NGOs, to find, remove, and report suspected CSAM on our platforms. We use hash matching provided by NCMEC, Microsoft PhotoDNA, and other outside resources. We also use cutting-edge technology to find CSAM that has never been seen before, which is then verified by our expert review teams. Learn more about the actions we take to thwart CSAM here.
When YumyHub discovers CSAM on its platform, what does it do?
When we find suspected CSAM on our platforms, we promptly remove it from public view, file a “CyberTipline” report with NCMEC, deactivate the user’s account, and/or take other appropriate action (account data is preserved on a secure server for 90 days).
In the United States, NCMEC serves as a clearinghouse and comprehensive reporting center for matters involving child exploitation. NCMEC may forward the reports it receives to law enforcement agencies around the world.
YumyHub complies with any follow-up inquiries from law enforcement regarding NCMEC reports. Beyond CyberTipline reports, when necessary, we cooperate directly with law enforcement to ensure prompt reporting and provide relevant contact information. This applies in every country in which we do business.
How do I report suspected CSAM?
You can email support@yumyhub1stg.wpenginepowered.com with any information you see, and each post and account includes a report button. Our moderation process relies heavily on community reporting.
A report sent to NCMEC may identify the user and the child victim, and may include any useful contextual information. When content is discovered from multiple sources, for instance, more than one report may be filed on a specific user or piece of content. Conversely, in line with NCMEC guidelines and our standard operating procedures, if the same content is found linked to a user with multiple accounts, we report it only once. NCMEC may distribute these reports to law enforcement agencies worldwide.
When we find new CSAM, we generate a hash of the material, add it to our internal repository, and share it with NCMEC and other third-party tools when necessary or appropriate. Hashing technology enables us and others to locate CSAM that has already been identified. Because we also share the hash values with other providers, they can make use of these hashes as well. Contributing to the hash databases maintained by NCMEC and others is one of the most important industry-wide approaches to combating online CSAM. This metric shows the total hashes YumyHub has contributed to the effort so far this month. Visit the NCMEC CyberTipline 2021 Reports for further data demonstrating how our company and peer companies provide hashes to NCMEC.
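As an illustrative sketch only (not YumyHub's actual pipeline), exact-match hashing can be understood as computing a cryptographic digest of a file's bytes and checking it against a shared database of known hashes. The function names and the example hash set below are hypothetical; production systems such as Microsoft PhotoDNA instead use perceptual hashes, which also match re-encoded or resized copies of the same media.

```python
import hashlib

def media_hash(data: bytes) -> str:
    """Return the hex SHA-256 digest of a media file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical internal repository of hashes of previously
# identified material (populated from NCMEC and other sources).
known_hashes = {media_hash(b"previously-identified-example")}

def matches_known(data: bytes) -> bool:
    """Exact-match lookup: True if this exact file was seen before."""
    return media_hash(data) in known_hashes
```

A byte-identical re-upload matches immediately, which is why sharing hash values across providers lets each of them flag material first identified elsewhere; an exact cryptographic hash, however, changes completely if even one byte of the file differs, which is the limitation perceptual hashing addresses.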
Less than 0.001% of the content that creators submit (or attempt to submit) to YumyHub contains suspected CSAM. We report all suspected CSAM incidents to NCMEC via the CyberTipline, which tracks reports submitted by YumyHub and other digital media organizations. Once a CyberTipline report has been filed, the suspected media is removed and blocked so that it can be investigated. A large percentage of the suspected media turns out not to be CSAM and/or consists of duplicate images or videos; small fluctuations in the number of reports reflect false positives or duplicate reports. The number of reported CSAM cases is listed in our transparency reports.
We respect and protect our content creators’ rights to their works and images, and YumyHub upholds those rights by acting against the improper use or republishing of those works on other websites or for improper purposes. To enforce those rights, YumyHub sends DMCA notices to other websites. Just as we protect our creators’ property rights, YumyHub respects third-party property rights and does not tolerate the illegal use of others’ images or creations.
Our policy is to respond to clear and detailed reports of alleged copyright infringement. The notice format we outline on our website (https://www.yumyhub.com/dmca) complies with the Digital Millennium Copyright Act (DMCA) and offers copyright holders worldwide a quick and effective way to assert their rights over content submitted to YumyHub. When a copyright owner finds a URL that leads to allegedly infringing material, they send us a takedown notice to start the process of removing that content from YumyHub. When we receive a takedown notice, our employees thoroughly examine it for completeness and other issues. If the notice is complete and there are no additional problems, we remove the content from YumyHub.
We also use ActiveFence, a leading intelligence and machine-learning platform, to identify and investigate the largest and most harmful online piracy networks.
DMCA requests for third-party takedowns (requests to remove copyright-infringing content from third-party websites)
External Trademark Issues
YumyHub has a strong content moderation team that adheres to high standards and our Acceptable Use Policy.