Since our last blog post, Clubhouse has gone from a small community of beta testers to a growing network of communities, made up of people with vastly different opinions, experiences, worldviews and perspectives. This past week, people on Clubhouse have hosted several intense conversations on topics of identity, ethnicity, gender, racism, and religion. These conversations led to a number of serious incident reports, and we received questions and concerns from our community about how we plan to scale safety and moderation on Clubhouse. In the wake of this, we wanted to share some thoughts regarding what we stand for as a company, what we will and will not tolerate, what we are doing to prevent abuse, and how we plan to empower conversation hosts with better moderation tools as we grow.

First, we unequivocally condemn Anti-Blackness, Anti-Semitism, and all other forms of racism, hate speech and abuse on Clubhouse. Our Community Guidelines and Terms of Service make this clear, and we have trust and safety procedures in place to address any violation of these rules. People who violate them are warned, suspended, or removed completely from the platform, depending on the severity of the offense. This is a critical area of investment for us as a company and we are working hard to continue building tools and policies that are robust and that account for the unique dynamics of real-time voice conversations and group discussions.

Second, we celebrate the fact that Clubhouse is not one single community, but a network of interconnected and diverse communities. As these communities grow, we need to provide moderators and club leaders with better tools and infrastructure to bring people together. Our goal is to empower them to host important, and even difficult, conversations—because some of the most powerful moments on Clubhouse happen when you find yourself speaking with a room full of people whose backgrounds and experiences are completely different from your own. These conversations often go on for hours, spilling out into breakout rooms full of people connecting, debating, evolving their worldviews and recognizing their blind spots. Our hope for Clubhouse is that it can be a new type of network based on empathy, discussion and sensemaking, rather than polarization. We think social media needs more of this.

PREVENTING ABUSE

Our Terms of Service and Community Guidelines define what type of behavior is allowed on Clubhouse and we are committed to addressing behavior that violates these rules. Here is what we’re doing to help with that:

  • We’re taking action on all incident reports. Any time someone reports a violation of our Terms of Service or Community Guidelines, we immediately investigate it. We don’t discuss these investigations publicly for user privacy reasons, but they are happening, and when rules are violated, corrective action is taken. This week, we’re also shipping real-time systems that let us investigate incidents more quickly and that empower moderators to restrict and end rooms.
  • We’re continuing to scale our trust and safety operations. This is an ongoing effort for us that spans people, policy and product. On the people side, we’re focused on:
      ◦ Adding advisors. We are building a team of advisors with deep expertise in trust, safety, diversity and inclusion to provide ongoing advice and input.
      ◦ Engaging directly with the community. Since the earliest days of Clubhouse we’ve been engaging deeply with a diverse cross-section of our community to understand their needs—through weekly Town Halls, New User Orientation sessions and deeper discussions, both on Clubhouse and off. We plan to continue the dialogue and see how these formats can be improved. We also use these discussions to continuously evolve our Terms of Service, Privacy Policy and Community Guidelines. These will be living documents.
      ◦ Growing our team. Our trust and safety efforts are staffed to respond swiftly to incident reports, and we plan to proactively scale this operation as we grow.
  • We’re shipping a wave of new safety features. Over the past couple of months we introduced blocking, muting, in-room reporting, and the ability for moderators to end a room. This week we are shipping enhancements to make in-room reporting faster, more specific and more robust. We are also making the Community Guidelines accessible from every room and shipping new features to empower Clubhouse moderators.

EMPOWERING MODERATORS AND CLUB LEADERS

As we take these steps, we want to avoid conflating abuse with other things that can feel uncomfortable—like differences in opinion or conversational style. Abuse, racism, religious intolerance, sexism and hate speech are never okay. Targeted and coordinated harassment is never okay. But what about general rudeness? Or holding opposing political viewpoints? While these things might seem jarring, we don’t believe they should be banned. We want to make sure that when you use Clubhouse, you get to choose your communities, your rooms, and your style of conversation. Here’s what we’re working on to enable this:

  • Allowing clubs to set their own norms. With our next release, club founders will be able to write rules that are specific to their clubs—to share their community values, communicate their norms, and define the dos and don'ts for speaking. When people join the club they'll be asked to agree to the rules. And when the club hosts a public conversation, non-members will be asked to agree to the rules before speaking. We think this will help people create intentional gathering spaces that cater to many interests and styles. These rules will supplement the Community Guidelines, which still apply to everyone.
  • Hosting formal moderator training sessions. There is no single way to moderate, and each room can have its own style. To help with this, we’re going to start offering regular moderator training sessions on the app, to ensure that people who wish to host discussions are equipped with the tools and knowledge they need.
  • Improving moderator tooling. Great moderators create great conversations, and we need to empower them with the right tools. This week we are building infrastructure that will allow us to notify moderators when there is a safety concern related to their room. Moderators can also tap the “End Room” button anytime if they feel the conversation is getting out of hand.
  • Adding moderator badges. This is a small thing, but it’s easier to provide a speaker with feedback when you know who’s in charge of the room. These will be live in the next release.

The world is not a monoculture, and we want Clubhouse to reflect that. Ideally the experience is more like a town square, where people with different backgrounds, religions, political affiliations, sexual orientations, genders, ethnicities, and ideas about the world come together to share their views, be heard and learn. Some of these communities come together to debate. Some come to relax and joke around. Others hold listening parties and fireside chats. We think many styles should be supported, and we’re working on tools to help everyone create their own space, deepen friendships, meet new people and have meaningful discussions—in the way that suits them best.

Clubhouse is nothing without the community, and we are immensely grateful for all of your ideas, emails, tweets, support and critiques. We’ll continue working around the clock on all of this as we open it up to more of the world. Thank you! 🙏🏽