Combating CSAM
BiMaNiA fights the creation and distribution of child sexual abuse material (CSAM).
BiMaNiA is building the safest digital media platform in the world. We do not tolerate CSAM on our platform and actively work to block it. The creation or distribution of CSAM is immoral, illegal, and a violation of our Terms of Service and Acceptable Use Policy.
We have a dedicated team working around the clock to keep suspected CSAM off our platform and to remove it swiftly whenever it is found.
What is CSAM?
How does BiMaNiA identify CSAM on its platform?
What happens when BiMaNiA finds suspected CSAM on its platform?
How can BiMaNiA tell if a direct message or other private post contains CSAM, and is this content encrypted?
Does BiMaNiA’s subscription model enable the distribution of CSAM?
How do I report suspected CSAM?
How can I trust that BiMaNiA takes this issue seriously?
What else do you do to prevent the creation or distribution of CSAM?