Censorship and Harmful Content

    Last updated: September 24th, 2020

    Our stance on harmful content and censorship in Rocket.Chat

    Rocket.Chat is built to be an open and free communication platform. We want everyone to be able to run and use it freely, and to improve people's lives through it. Today, our platform serves myriad purposes, according to how our users see fit.

    We also do not want to be the judges of what constitutes the “right” way to use Rocket.Chat. Moral values differ from person to person, and laws often collide over highly debated issues such as free speech. We believe the users of Rocket.Chat know best how to use our platform to their own and everyone else's benefit.

    How does Rocket.Chat deal with reports of illegal content?

    Sometimes, our organization receives a request from a law enforcement body to produce a certain set of user data to aid in a criminal investigation. Or there might be a request to take down content that has been deemed illegal. We treat these requests very carefully and—where we are able and obliged to help—bring this to the attention of the instance's administrator. Often, though, we cannot do anything.

    That is because we have no way to access or control self-managed Rocket.Chat servers. Self-managed means that Rocket.Chat is installed on a server we do not own. Our platform is open source and has no backdoors whatsoever that would allow us to access your installations remotely.

    In the case of Rocket.Chat instances hosted by us for others, we forward reports to the respective administrator, and if we determine an obvious breach of our terms of service, we can terminate the hosting. Where the request concerns content on a server directly under our control, such as our Open Server, you can contact us directly at [email protected], and we will take action.

    Resources for administrators to be notified about harmful content

    At the same time, we see that many organizations using our platform are subject to strict content moderation requirements. We already provide various features for these organizations to administer their instances. For example, notifications for keywords can help identify potential abuse quickly. Administrators who want to use these features can do so, but we do not force them to. Ultimately, the administrator is responsible for the content being processed within their Rocket.Chat instance.

    With this being the current situation, we want to share our stance on how we plan to address harmful content going forward.

    Our principles

    Our policy principles in this matter are:

    • We do not endorse illegal or unethical usage of Rocket.Chat in any way. We understand these terms to be relative and to be interpreted in their local context.

    • We want Rocket.Chat to be a platform that allows for free and unrestricted communication. We do not plan or want to build any kind of backdoor, censorship tool, or hidden remote control mechanism into Rocket.Chat.

    • Administrators are the ones in control over their installation. Administrators are responsible for configuration and content moderation decisions within their instance.

    • We comply with valid local or international law enforcement requests to remove content or produce user data and inform our users affected by these requests.

    • On our Open Server, which we run, we want to provide users with a positive and fun environment to test our platform and get in touch with us.

    While none of these principles is absolute, they guide our actions.

    What can you do when dealing with harmful content in Rocket.Chat?

    For users: On our Open Server, you can report harmful content per our code of conduct, and we will look into removing it. We want you to be able to use our open server hassle-free.

    If you encounter another Rocket.Chat instance that is not hosted by us and that you think contains illegal or otherwise harmful content, we recommend contacting the administrator of that instance and asking them to moderate the content in question.

    If you do not know who the administrator is, you can check the DNS records of the instance's domain for contact information. For instances that are hosted by us, we can contact the administrator on your behalf. Whether an instance is hosted by us is not always obvious, since no one is obliged to use the Rocket.Chat logo or name; we have published a tool (server lookup) where you can find out.

    As a last resort, you may want to contact the law enforcement body in charge of investigating the potential offense in question. They can tell you the legal remedies available and the potential next steps to take.

    For administrators: If you are an administrator, you might be interested in moderating the content that users create in or upload to your instance. Notable features that can help you with that are:

    • Making use of the Moderator permission in channels to appoint individuals to purge or modify inappropriate messages

    • Notification feature for the use of specified words or phrases

    • Blacklisting certain words or phrases

    • Notifying your users of applicable policies, e.g., by pinning messages or adding an announcement to the room

    • Requiring confirmation of user registration by an administrator to prevent unvetted users from posting messages

    • Enabling or disabling end-to-end encryption: With end-to-end encryption enabled, only an encrypted string of the message is stored on the server. This, however, prevents content auditing by administrators and moves responsibility for content moderation to users.

    • Turning on the Google Vision integration for image uploads, which has options to block images containing graphic or adult content

    All of these features are optional, giving you total flexibility in what to apply in your specific case. Let us know which features you are currently missing - but would find useful - by opening feature requests in our GitHub repository.
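    Conceptually, the word-blacklist feature behaves like a simple substitution filter over message text. The sketch below is an illustrative approximation only, not Rocket.Chat's actual implementation; the function and parameter names are hypothetical:

```python
import re

def censor(message: str, blacklist: list[str]) -> str:
    """Replace whole-word, case-insensitive matches of blacklisted
    words with asterisks of the same length (illustrative sketch,
    not Rocket.Chat's real filter)."""
    for word in blacklist:
        # \b boundaries avoid censoring substrings of harmless words.
        pattern = re.compile(rf"\b{re.escape(word)}\b", re.IGNORECASE)
        message = pattern.sub(lambda m: "*" * len(m.group(0)), message)
    return message

print(censor("This deal is a scam, a total SCAM.", ["scam"]))
```

    A whole-word, case-insensitive match is a deliberate trade-off: it catches trivial capitalization changes but leaves embedded substrings (e.g., "sale" inside "wholesale") untouched.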

    For law enforcement: We sometimes receive requests from law enforcement to remove content from certain Rocket.Chat instances. We have published guidelines for law enforcement describing how we deal with such requests and what to consider before submitting a request to us as the legal entity behind Rocket.Chat.

    In summary: We cannot remove most content because it resides on servers outside our control, to which we do not have (and do not want) access. If the content in question is on our Open Server, we remove it if it breaches our code of conduct or if we are compelled by a law enforcement request. For servers hosted by us but under the control of our customers, we remove content after notifying and in collaboration with the customer, or directly in case of a violation of our terms of service. For questions or contact, please use [email protected].

    For reporters and media requests: Are you researching an article about Rocket.Chat, or one in which Rocket.Chat plays a role?

    We would love to explain our stance in detail or comment before you publish your article. Please reach out to [email protected] to get a comment from us on the topic in question.

