Moderation transparency
How we review content and apply measures on YouVisible
Introduction
YouVisible publishes this Moderation transparency policy to explain:
- The principles guiding our decisions.
- The tools we use (human and automated).
- How measures are applied (warnings, removals, suspensions).
- The channels available to you for appealing and defending your rights.
Our goal is to guarantee legality, safety, proportionality, due process, non-discrimination and traceability.
Moderation principles
We are guided by the following principles:
- Legality and safety: We comply with applicable regulations and prioritise harm prevention, especially child protection.
- Proportionality: We apply graduated measures based on severity, recurrence and impact, ranging from educational warnings and feature limitations to content removal, suspension or closure.
- Transparency and reasons: Whenever the law allows, we notify the affected user, explaining the reason and the applicable rule or policy.
- Due process: We offer accessible appeal channels, with human review where needed and the possibility to submit arguments and evidence.
- Non-discrimination: Decisions are consistent and free from bias based on origin, gender, orientation, religion, ideology or other protected traits.
- Privacy: We minimise data exposure in moderation processes and protect good-faith reporters.
Tools and processes
We combine automated systems and human review:
- Automated detection: We use classifiers, perceptual hash matching and heuristics to prioritise reviews and block manifestly illegal material (a simplified matching sketch follows this list).
- Human review: Borderline cases, statements and appeals are reviewed by human moderators, with escalation to specialised teams for minors, intellectual property or serious risks.
- Traceable logging: For every decision we record what measure is taken, why (rule or law), who adopted it (system or person) and when (see the record sketch after this list).
- Graduated measures:
  - Educational warning.
  - Content removal.
  - Temporary limits on comments, uploads or messaging.
  - Suspension or closure for serious or repeated violations.
- Specific restrictions for minor accounts: Minor accounts have limited visibility, restricted messaging and strengthened discovery controls.
- Local blocks where required by law: We may apply geo-restrictions or local blocks if required by a competent authority or applicable law.
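To illustrate the perceptual hash matching mentioned above, the sketch below flags content whose 64-bit hash sits within a small Hamming distance of a known-bad hash. It is a minimal, non-authoritative example: the hash values, the threshold and every name (shouldPrioritise, KNOWN_BAD_HASHES) are assumptions for this sketch, not YouVisible's actual detection pipeline.

```typescript
// Minimal sketch of perceptual-hash matching for review prioritisation.
// Hash values, threshold and names are illustrative assumptions only.
function hammingDistance(a: bigint, b: bigint): number {
  let x = a ^ b;
  let bits = 0;
  while (x > 0n) {
    bits += Number(x & 1n);
    x >>= 1n;
  }
  return bits;
}

// Placeholder 64-bit hashes of known illegal material (hypothetical values).
const KNOWN_BAD_HASHES: bigint[] = [0x9f3a5c7e12d48b06n];
const MATCH_THRESHOLD = 8; // assumed maximum bit difference for a match

function shouldPrioritise(contentHash: bigint): boolean {
  return KNOWN_BAD_HASHES.some(
    (known) => hammingDistance(contentHash, known) <= MATCH_THRESHOLD
  );
}
```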
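Likewise, the traceable logging described above amounts to recording a structured decision entry: what, why, who and when. The shape below is a sketch under assumed names (ModerationDecision, actor, basis and so on); it is not a description of our real schema.

```typescript
// Hypothetical shape of a traceable moderation log entry.
// All names here are illustrative assumptions, not a real schema.
type Actor = "automated-system" | "human-moderator" | "specialised-team";

type Measure =
  | "educational-warning"
  | "content-removal"
  | "feature-limitation"
  | "suspension"
  | "account-closure";

interface ModerationDecision {
  decisionId: string; // unique identifier for audit trails
  contentId: string;  // the affected content or account action
  measure: Measure;   // what measure was taken
  basis: string;      // why: the rule, policy or law applied
  actor: Actor;       // who adopted it: system or person
  decidedAt: string;  // when, as an ISO 8601 timestamp
}

// Example record, as it might be appended to an immutable log:
const example: ModerationDecision = {
  decisionId: "dec-0001",
  contentId: "post-4567",
  measure: "content-removal",
  basis: "Community rule on manifestly illegal material",
  actor: "automated-system",
  decidedAt: new Date().toISOString(),
};
```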
Recurrence and warning (strike) policy
- Accumulation of violations: Each confirmed violation may generate a warning (strike) with a temporary expiry (see the sketch after this list).
- Escalation:
  - 1st warning: removal and education.
  - 2nd–3rd warning: feature or reach limitations.
  - Serious recurrence: prolonged suspension or closure.
- Reinstatement: Successful appeals remove the warning. Educational measures may reduce expiry time.
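As a rough illustration of how strikes with a temporary expiry could drive the escalation tiers above, the sketch below counts only unexpired, unappealed strikes before choosing a measure. The 90-day window, the tier thresholds and all names are assumed values for this example, not the actual policy parameters.

```typescript
// Illustrative strike bookkeeping; the 90-day expiry and tier
// thresholds are assumed values for this sketch only.
const STRIKE_TTL_DAYS = 90;

interface Strike {
  issuedAt: Date;
  removedOnAppeal: boolean; // successful appeals remove the warning
}

function activeStrikes(strikes: Strike[], now: Date): number {
  const ttlMs = STRIKE_TTL_DAYS * 24 * 60 * 60 * 1000;
  return strikes.filter(
    (s) => !s.removedOnAppeal && now.getTime() - s.issuedAt.getTime() < ttlMs
  ).length;
}

// Called after the newest confirmed violation has been recorded.
function nextMeasure(strikes: Strike[], now: Date): string {
  const n = activeStrikes(strikes, now);
  if (n <= 1) return "removal and education";        // 1st warning
  if (n <= 3) return "feature or reach limitations"; // 2nd–3rd warning
  return "prolonged suspension or closure";          // serious recurrence
}
```

For instance, under these assumed values a user whose second strike is confirmed within the window would receive feature or reach limitations, while a strike removed on appeal no longer counts towards escalation.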
Protection of minors
- Strengthened controls: Minor profiles have additional controls for visibility and messaging.
- Prioritisation of reports: Reports involving minors are prioritised and escalated to specialised teams where appropriate.
- Immediate removal in serious risk: Manifestly illegal or harmful material involving children is removed immediately, with notification afterwards where appropriate.
Handling of evidence and retention
- We retain evidence, records and moderation communications for as long as necessary to:
  - Investigate incidents.
  - Defend claims or appeals.
  - Comply with valid orders and legal obligations.
- We implement access controls and, where appropriate, encryption and logging.
- Access is limited to authorised personnel bound by confidentiality duties.
Remedies and appeals
If you receive a moderation measure, you can:
- Request internal review through the available channels (e.g. support form or help area).
- Provide evidence: context, image authorisation, usage rights and similar.
- Receive a reasoned resolution within a reasonable time.
- Use legal appeal mechanisms in cases with legal implications (e.g. intellectual property or official orders).
If an error is confirmed, we will reverse the measure and reflect it in our internal metrics.
In situations of serious and imminent risk (e.g. child safety), we may act immediately (remove or block) and notify afterwards.
Notices and notification content
When we take a moderation measure, the notification may include:
- The affected content or action.
- The applicable rule, policy or law.
- The measure adopted and its duration.
- How you can appeal and within which timeframe.
- Whether automated systems were involved in detection or decision.
In certain cases we may omit some details when required by an authority or when notification would compromise an investigation.
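Purely for illustration, the elements above can be read as a notification payload like the following sketch; every field name here (MeasureNotice, appealDeadlineDays and so on) is an assumption for the example rather than an actual YouVisible data format.

```typescript
// Hypothetical notification payload; field names are illustrative only.
interface MeasureNotice {
  contentOrAction: string;    // the affected content or action
  basis: string;              // applicable rule, policy or law
  measure: string;            // the measure adopted
  durationDays?: number;      // duration, if the measure is temporary
  appealChannel: string;      // how to appeal
  appealDeadlineDays: number; // timeframe for appealing
  automatedDetection: boolean; // whether automated systems were involved
  detailsOmitted?: string;    // e.g. withheld at an authority's request
}
```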
Formal external notice channel
Any person may report allegedly unlawful content through:
- The internal reporting system available on the platform.
- The legal contact email: [LEGAL EMAIL].
The notice should include, where possible:
- Identification of the claimant.
- URL or clear description of the content.
- Legal grounds or rule allegedly infringed.
- Supporting documentation where applicable.
YouVisible may adopt immediate precautionary measures (removal or blocking) in cases of serious risk, particularly involving minors.
Evidence linked to moderation procedures will be retained only for the time strictly necessary to comply with legal obligations and defend potential claims.
Cooperation with authorities
We cooperate with competent authorities under valid orders or requirements, limiting disclosure to what is strictly necessary and recording actions taken.
We will inform the affected user whenever the law allows.
Publication and changes
This policy will be updated when moderation processes or applicable regulations change.
We indicate the version and effective date. Material changes will be communicated through reasonable means (for example, in-app notices or email).
Last review: 09 February 2026