Scaling Rocket.Chat
As the number of concurrent users on your workspace grows, you may experience system latency. Monitoring system performance is essential to determine whether additional resources are needed. For example, you may see the Rocket.Chat Node.js process approaching 100% CPU even while the overall host CPU load remains low. This happens because Node.js applications are effectively single-threaded and cannot natively take advantage of multiple cores.
To overcome these limitations and ensure smooth performance as your workspace grows, consider scaling your deployment, either by running multiple Rocket.Chat instances or by deploying Rocket.Chat as microservices (smaller components, each focused on a specific feature).
The scaling approach you choose depends on how your workspace is deployed. For Docker deployments, running multiple Rocket.Chat instances behind a load balancer is a well-established approach (sketched below). For Kubernetes deployments, microservices offer a more granular and efficient scaling option.
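As a rough illustration of the multi-instance approach, the Docker Compose sketch below runs two Rocket.Chat instances against a shared MongoDB replica set (required for Rocket.Chat's oplog tailing). The image tag, hostnames, `ROOT_URL`, and the service names used for `INSTANCE_IP` are placeholders; in particular, `INSTANCE_IP` must resolve to an address the sibling instances can actually reach in your network.

```yaml
version: "3.8"

services:
  rocketchat-1:
    image: registry.rocket.chat/rocketchat/rocket.chat:latest
    environment:
      MONGO_URL: mongodb://mongo:27017/rocketchat?replicaSet=rs0
      MONGO_OPLOG_URL: mongodb://mongo:27017/local?replicaSet=rs0
      ROOT_URL: https://chat.example.com   # placeholder workspace URL
      PORT: "3000"
      # Address the sibling instances use to reach this one for
      # instance-to-instance communication (illustrative value).
      INSTANCE_IP: rocketchat-1
    depends_on:
      - mongo

  rocketchat-2:
    image: registry.rocket.chat/rocketchat/rocket.chat:latest
    environment:
      MONGO_URL: mongodb://mongo:27017/rocketchat?replicaSet=rs0
      MONGO_OPLOG_URL: mongodb://mongo:27017/local?replicaSet=rs0
      ROOT_URL: https://chat.example.com
      PORT: "3000"
      INSTANCE_IP: rocketchat-2
    depends_on:
      - mongo

  mongo:
    image: mongo:6.0
    # Single-node replica set for illustration only; use a proper
    # multi-member replica set in production.
    command: mongod --replSet rs0 --oplogSize 128
```

A reverse proxy (for example, NGINX or Traefik) in front of the instances should distribute traffic with sticky sessions so that each WebSocket connection stays pinned to a single instance.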
For deployments approaching 1,000 concurrent users and above, the microservices architecture is recommended for optimal scalability.
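With the official Rocket.Chat Helm chart, enabling microservices is primarily a values change. The following is a minimal, illustrative `values.yaml`; the key names reflect the chart at the time of writing, and the hostname and replica count are assumptions to adjust for your environment and load.

```yaml
# values.yaml -- minimal sketch for the official Rocket.Chat Helm chart
host: chat.example.com   # placeholder workspace hostname

mongodb:
  enabled: true          # chart-managed MongoDB; point the chart at an
                         # external replica set in production if you have one

microservices:
  enabled: true          # run components such as ddp-streamer and presence
                         # as separate workloads instead of one monolith

replicaCount: 2          # main Rocket.Chat pods; scale per observed load
```

Applied with something like `helm install rocketchat -f values.yaml rocketchat/rocketchat`, the chart deploys each service as its own workload, so individual components (for example, the ddp-streamer handling WebSocket traffic) can be scaled independently of the rest.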
This documentation will guide you through both approaches, ensuring your Rocket.Chat workspace scales alongside your growing user base.
As of December 15, 2023, Rocket.Chat has ceased support for connections from cloud services and official mobile and desktop apps to workspaces running legacy versions outside our support window. Users on unsupported legacy servers are advised to upgrade to the latest Rocket.Chat version to ensure continued access to cloud, mobile, and desktop services. Each Rocket.Chat version is supported for six months after release.