Overview of Cluster’s Content Policies and “Grey Zone” Dynamics
Cluster is a Japanese social VR platform that lets users create and share custom avatars (VRM) and  
worlds/events. Its official policies emphasize free expression and creativity, but also ban explicit  
harassment or illegal content. For example, the Terms of Service expressly forbid “content related to  
criminal acts or contrary to public order and morality”, and require that copyrighted audio/video be licensed. The Content Guideline (for items sold in the Cluster store) similarly bars “Representations of or suggestive of sexual acts”, nudity, excessive exposure, child pornography, and extreme violence or gore.
In practice, Cluster relies heavily on user self-policing: its guidelines state that if a world or avatar is  
technically “within the bounds of the Terms”, the company will not proactively remove it – only if  
other users find it “troubling or offensive” and report it will Cluster consider penalties. In other words, most content is allowed by default unless someone complains.
Within these broad rules, Cluster imposes few technical barriers on uploads. Users can freely upload VRM  
avatars (up to 100 files) as long as they meet basic size/bone limits. World creation via Unity or the in-app World Craft tool is similarly open. The only explicit limitations on uploads are on file size and complexity – for example, VRM files must meet Unity’s humanoid rig requirements and stay under polygon/texture limits. Cluster’s FAQ even notes that it has only “internal restrictions that encourage common sense usage” of avatars, and that “generally, you can upload your avatar without worrying about the restrictions.” This reflects Cluster’s trust in users to abide by the basic rules themselves. At the same time, Cluster publishes a detailed Community Guideline forbidding various forms of harassment (e.g. “stalking,” “molesting an avatar,” hate speech, etc.) and stressing that abusive or exclusionary behavior (such as attempts to forcibly remove users without just cause) is prohibited. The guideline specifically warns users not to eject or block others unjustly: “Attempts to remove certain users from worlds or events (without justification based on indicated rules)” are listed as harassing behavior. Finally, Cluster’s IP rules forbid using copyrighted game or character images as avatars without permission, and require that any music played (e.g. at events) be rights-approved (events can apply via “Cluster perform,” but actual master licenses are not granted). Taken together, Cluster’s official policies allow broad freedom for user-created VR content, while banning clear-cut illegal or extreme material.
Platform Growth and Content Openness  
Cluster’s user base has grown rapidly as an event-driven VR platform. For example, a 2024 press release  
touts “cumulative 35 million event attendees” on Cluster (making it “one of the largest metaverse  
platforms in Japan”). Similarly, Cluster’s app has been downloaded millions of times since launch.
This growth coincides with Cluster’s open-content model: by permitting users to upload virtually any  
non-criminal VRM avatars and worlds, the platform encourages a wide variety of experiences. In practice,  
many niche or fan-driven events (e.g. anime concerts, live DJ sets, educational conferences) thrive on  
Cluster precisely because users can share custom avatars, worlds, and media. The platform’s “no  
policing unless reported” stance  
means content tends to be more permissive than in tightly  
moderated spaces. In general, researchers note that social VR platforms which maximize user creativity  
and freedom—while managing harassment—tend to attract larger, active communities. Cluster’s  
emphasis on “openness and freedom” and creativity  
seems aligned with this: by reducing upfront  
filtering, Cluster lowers barriers for users and content creators, which likely aids its adoption.  
At the same time, this permissive approach creates “grey-area” content: material that is legal but  
perhaps “profoundly violate[s] people’s sense of decency, morality, or justice”. In VR these cases become salient: academic studies describe VR experiences (e.g. harassment or hateful speech in virtual worlds) as especially intense due to immersion. Cluster’s own policy implicitly recognizes this: it
encourages users to be sensitive to diversity and not assume all content is appropriate for public spaces. For example, the Community Guideline notes that a graphic or sexual avatar “can cause
discomfort, distress, or even fear,” and urges users to consider the mixed audience (including minors) in  
public worlds. However, as long as such content does not clearly break a rule, Cluster leaves
moderation to user reports. This mixture of broad permission plus reactive policing means “lawful but  
awful” content can circulate freely until someone objects.  
Community Policing and Exclusion Tactics  
Given the ambiguous boundaries, some community members take it upon themselves to judge and  
exclude others. Anecdotal reports from Cluster users indicate that event organizers or influential  
attendees sometimes accuse others of being “illegal” or “unacceptable” based on subjective  
interpretations of the rules, and then kick or shun them from events. For instance, if an avatar or world  
uses unlicensed copyrighted material (like game character models or music), some hosts will loudly label  
the participant’s presence as “違法” (illegal) and bar them from the room. Similarly, users have
described seeing notices or hearing moderators say things like “このイベントは禁止です” (this event is forbidden) or “お前は出禁だ” (you are banned from entry) when they violate a site-specific rule. These
actions effectively use Cluster’s vague “public order and morality” clause as a lever to exclude  
individuals.  
Importantly, Cluster’s official rules disapprove of such vigilante exclusions. The Community  
Guideline explicitly lists “attempts to remove certain users from worlds or events (without  
justification)” as prohibited harassment, and it warns that even admonishing others can be a violation of the guidelines on its own. In practice, however, many worlds allow the host to kick or
mute participants, so organizers can exercise control. When combined with strong rhetoric (“illegal,”  
“banned”), this yields a powerful social tool. In effect, some groups exploit the “grey zone” by
applying moral pressure rather than formal rules. We could illustrate this with a hypothetical dialogue: a  
host might admonish a user, “Your avatar is not authorized – this is illegal activity! You can’t be here,”  
even if the user’s actions technically fall within Cluster’s broad Terms. Such exchanges show how  
community norms, not just platform policy, determine participation.  
Reporting and Moderation Process  
For addressing violations, Cluster provides an in-app reporting mechanism. Users can report individuals  
or whole worlds/events by clicking the triple-dot menu on a profile or event page. The Help
Center describes exactly how to file a report from the Cluster app or website, but gives no detail on what  
happens afterward. There is no public log or transparency portal showing which reports were acted on. In  
practice, enforcement seems largely reactive: if enough users report an offending avatar or world (or if  
intellectual-property holders complain), Cluster staff will review it. Penalties range from feature bans to  
temporary account suspensions. In its Community Guideline, Cluster notes that proven guideline
violations can lead to temporary or permanent bans, and in severe cases the company may even pursue  
legal remedies.
However, from a user perspective the moderation process is a “black box.” Cluster does not publish  
clear statistics on reports or appeals, nor does it explain decisions to the community. This opacity can  
embolden users who distrust the official process: if someone feels a reported offense is not being  
punished, they may take matters into their own hands by publicly shaming or exiling the violator.  
Conversely, if someone believes a complaint is frivolous, they may appeal informally within the group. In  
sum, reporting exists, but the platform’s actual response is neither visible nor easily understood by  
ordinary users.  
Community Self-Governance  
Cluster’s culture emphasizes peer respect and caution in enforcement. The official guidelines advise  
users not to publicly scold others for minor infractions: “chiding other users for a violation…can be  
considered a violation on your part”. Instead, it urges members to discreetly point newcomers to the published rules or to simply report issues to Cluster staff. The recommended user tools are actions like muting, blocking, or moving to a different space, rather than shaming. This reflects an ideal of mutual support: users should “praise one another” and respect diversity, using the platform’s built-in safety features.
In reality, however, groups of veteran users often form tight-knit event communities (“circles”) with
their own social norms. Experienced hosts may rigorously enforce their version of the rules, sometimes  
exceeding Cluster’s written policies. For example, an anime fan event’s organizers might insist on pre-  
approval of all avatars and eject anyone wearing an unlicensed character model, effectively policing  
copyright as if it were enforced by law. Similarly, some groups have reportedly used reporting links in  
chat as a threat: “If you don’t leave, I’ll report you for [X] violation.” These dynamics create a  
subculture where community norms govern behavior as much as platform rules do.  
From a critical perspective, this self-policing culture has pros and cons. On one hand, it can help maintain  
order when official moderation is slow or unclear: vigilant users catch real problems (harassment, hate  
speech) more quickly than a remote moderator could. On the other hand, it can lead to abuse of  
authority: members with no formal role might expel others for petty or even imagined infractions,  
turning ambiguity in the rules into social weapons. Cluster’s guidelines implicitly acknowledge this  
tension by discouraging public shaming, but the advice relies on users’ goodwill.
Summary  
In summary, Cluster’s open platform—allowing users to upload custom VRM avatars and worlds with  
few upfront checks—naturally creates a “grey zone” where content sits between official permissibility  
and community standards. Officially, only clearly illegal or extremely offensive material is banned;
everything else is largely left to user judgment. The platform’s growth suggests that this freedom  
attracts many creators and participants, but it also shifts moderation burden onto the community. Some  
users or event hosts exploit the vagueness of the rules to label others as “違法” (illegal) or “禁止” (forbidden) as a means of exclusion, even when Cluster’s own rules would not automatically mandate such punishment. Cluster provides reporting tools to address disputes, but with
little transparency on outcomes. Thus the safety and culture of Cluster’s virtual spaces depend heavily  
on how community members choose to enforce norms. As one review notes, VR platforms must balance  
freedom of expression with immersive safety – a challenge Cluster navigates by trusting its users’ “common sense” and by encouraging them to moderate each other via the reporting system.
Sources: Cluster’s official documentation (Terms of Service, Content/Community Guidelines) provides  
the above policy details. Cluster’s own announcements report user/adoption statistics.