emphasis on “openness and freedom” and creativity
seems aligned with this: by reducing upfront
filtering, Cluster lowers barriers for users and content creators, which likely aids its adoption.
At the same time, this permissive approach creates “grey-area” content: material that is legal but
perhaps “profoundly violate[s] people’s sense of decency, morality, or justice”. In VR these cases
become salient: academic studies describe VR experiences (e.g. harassment or hateful speech in virtual
worlds) as especially intense due to immersion. Cluster’s own policy implicitly recognizes this: it
encourages users to be sensitive to diversity and not to assume all content is appropriate for public
spaces.
For example, the Community Guideline notes that a graphic or sexual avatar “can cause
discomfort, distress, or even fear,” and urges users to consider the mixed audience (including minors) in
public worlds. However, as long as such content does not clearly break a rule, Cluster leaves
moderation to user reports. This combination of broad permission and reactive policing means “lawful
but awful” content can circulate freely until someone objects.
Community Policing and Exclusion Tactics
Given the ambiguous boundaries, some community members take it upon themselves to judge and
exclude others. Anecdotal reports from Cluster users indicate that event organizers or influential
attendees sometimes accuse others of being “illegal” or “unacceptable” based on subjective
interpretations of the rules, and then kick or shun them from events. For instance, if an avatar or world
uses unlicensed copyrighted material (like game character models or music), some hosts will loudly label
the participant’s presence as “違法 (illegal)” and bar them from the room. Similarly, users have
described seeing notices or hearing moderators say things like “このイベントは禁止です (this event is
forbidden)” or “お前は出禁だ (you are banned from entry)” when they violate a site-specific rule. These
actions effectively use Cluster’s vague “public order and morality” clause as a lever to exclude
individuals.
Importantly, Cluster’s official rules disapprove of such vigilante exclusions. The Community
Guideline explicitly lists “attempts to remove certain users from worlds or events (without
justification)” as prohibited harassment, and it warns that even admonishing others can be a
violation of the guidelines on its own. In practice, however, many worlds allow the host to kick or
mute participants, so organizers can exercise control. When combined with strong rhetoric (“illegal,”
“banned”), this yields a powerful social tool. In effect, some groups exploit the “grey zone” by
applying moral pressure rather than formal rules. We could illustrate this with a hypothetical dialogue: a
host might admonish a user, “Your avatar is not authorized – this is illegal activity! You can’t be here,”
even if the user’s actions technically fall within Cluster’s broad Terms. Such exchanges show how
community norms, not just platform policy, determine participation.
Reporting and Moderation Process
To address violations, Cluster provides an in-app reporting mechanism. Users can report individuals
or whole worlds/events by clicking the menu (triple-dot) on a profile or event page. The Help
Center describes exactly how to file a report from the Cluster app or website, but gives no detail on what
happens afterward. There is no public log or transparency portal showing which reports were acted on. In
practice, enforcement seems largely reactive: if enough users report an offending avatar or world (or if
intellectual-property holders complain), Cluster staff will review it. Penalties range from feature bans to
temporary account suspensions. In its Community Guideline, Cluster notes that proven guideline
violations can lead to temporary or permanent bans, and in severe cases the company may even pursue
legal remedies.