Codidact is a non‑profit, community‑governed network of Q&A communities. This policy defines the standards for acceptable content across the Codidact platform and outlines how we identify, assess, and mitigate risks in line with the UK Online Safety Act.
This policy applies to:
* All users and visitors
* All posts, comments, messages, and uploaded materials
* All communities, regardless of topic
Communities may adopt stricter rules but may not weaken or contradict this policy.
## Core Principles
### Safety First
We will take reasonable and proportionate steps to reduce the risk of harm to our users, including the risks presented by encountering illegal content on our platform.
### Community Self-Governance
Individual communities may set community- and topic-specific rules by general consensus, but may not adopt rules which contradict or weaken those set out in this policy.
### Transparency
Moderation actions will be logged and auditable by peer moderators and staff members. Escalation pathways and appeals processes will be clear, accessible, and auditable.
### Freedom of Expression
We support respectful, good-faith interaction and knowledge sharing while removing illegal or harmful content.
## Prohibited Content
### Illegal Content
Content which constitutes a criminal offence in the UK is prohibited on the platform. This includes, but is not limited to:
* Terrorist content
* Child sexual abuse material, or links to it
* Harassment or hate speech
* Threatening or inciting violence
* Encouraging self-harm or suicide attempts
* Sharing anyone else’s personal information without consent
* Encouraging or assisting a criminal offence
### Adult Content
While our platform is not designed or marketed specifically for children, we acknowledge that children are also capable of engaging in respectful, good-faith knowledge sharing. We do not ask for or record users’ ages, and we therefore acknowledge that a small number of children may use the platform. Adult content, including without limitation pornographic material, whether real, AI-generated, written, or depicted in any other way, is **prohibited** on the platform.
## Restricted Content
### Sensitive Topics
Whilst content fitting the definitions above is prohibited, legitimate questions may arise about some of these topic areas without the resulting posts themselves constituting content of that type.
Other topic areas which are not themselves prohibited may also be sensitive topics of discussion. These topics may include, for example, questions about mental health, self-harm, trauma, violence, hate crime, or criminal acts. This list is not exhaustive and Foundation staff and local moderators may take proportional moderation action on any sensitive topic. Posts on these topics
must be:
* Non-graphic in both text and any related media
* Educational, academic, analytical, or supportive
* Not encouraging harmful behaviour
## Moderation
Our communities are moderated through a combination of community moderation actions, elected or appointed volunteer local moderators, staff with moderation powers, and automated moderation systems.
Appointed, elected, or staff moderators may take any action they consider proportionate in moderating content to uphold community rules, our Terms of Service and Code of Conduct, and this Content Policy. This may include asking an author to change their post, editing content directly, removing content, or issuing warnings or suspensions. In serious cases, including deliberately posting illegal content, Foundation staff may permanently ban users.
### Appeals
Users may appeal moderation actions by posting in the relevant community’s Meta category. Where possible, this will be reviewed by a different moderator or a member of Foundation staff. Local moderators are empowered to resolve disputes and appeals; Foundation staff will not
overrule a local moderator decision unless there is a policy or legal reason to do so, or when local moderators are not available.
### Illegal content handling & appeals
When illegal content is identified, local moderators will escalate it to Foundation staff. Staff will remove the content, securely preserve evidence, and notify relevant authorities if necessary. Where content is removed by Foundation staff because it is suspected of being illegal content, its author will be notified and a report automatically filed in our Safety Centre. To appeal an illegal content removal, users should not post in the community’s Meta category, but should instead file an appeal via the Safety Centre.