Safety principles
Visible presence
Members should know when AlexAI is installed, which channels it can read, and when it is
active in voice.
Memory with recourse
Memory is useful only if people can request corrections, removals, or added context when the
system gets something wrong.
Adult context
The eR33t origin instance is not child-directed and is not designed for under-18
communities.
Operator accountability
Managed pilots need named admins, clear permissions, and a path for members to raise
issues.
Memory controls
AlexAI may store long-term memories, summaries, relationship state, mood signals, and
generated context. Admins should treat memory as a live community record, not as disposable
chat noise.
- Do not intentionally feed AlexAI secrets, private keys, passwords, private medical or
financial records, or other sensitive personal data.
- Use private or excluded channels for conversations that should not enter bot memory.
- Ask an admin to correct or delete memories that are wrong, stale, invasive, or unsafe.
- Managed pilots should decide up front which channels, roles, and events are in scope.
Voice boundaries
AlexAI voice uses real-time AI services for transcription and spoken response. Voice presence
should be obvious to members in the room. Server owners should not place AlexAI in voice
spaces where members expect no bot processing.
- Do not use AlexAI voice for covert recording, surveillance, or consent-sensitive spaces.
- Voice memory and episode recaps should be treated as opt-in pilot features, not a hidden
default for every room.
- Tell members how voice memory is configured before enabling it in a pilot server.
- Remove AlexAI from a voice channel when the room wants a bot-free conversation.
- Route transcript, recap, or voice-memory deletion requests through server admins or the
public Contact page.
Generated content boundaries
Generated images, videos, roasts, summaries, and blog-style posts can be funny, but they
can also miss context. Admins and operators can remove generated artifacts that cross a
community line or create unnecessary risk.
- No sexual content involving minors, non-consensual intimate content, or requests that
sexualize private people.
- No doxxing, credential theft, malware, evasion, targeted harassment, or real-world
threats.
- No impersonation that confuses people about who is speaking or what is official.
- No legal, medical, financial, or safety output presented as authoritative professional
guidance.
Moderation and admin controls
AlexAI can support moderation, summaries, diagnostics, and dashboards, but it is not a
replacement for human judgment. Admins are responsible for server rules, permissions, member
notices, appeals, and final moderation decisions.
If AlexAI produces something unsafe or wrong, treat it as a product issue to route,
correct, or remove. Do not assume the model "meant" it.
Managed pilot requirements
Before a pilot goes live, a community should have:
- A server owner or admin responsible for AlexAI configuration.
- A clear list of channels and voice spaces AlexAI can access.
- A member notice that explains memory, voice, generated media, and deletion requests.
- An escalation path for privacy, safety, or moderation problems.
Reporting a safety issue
If you are in the eR33t Discord, contact a server owner or admin. If you are outside the
Discord, use the Contact page. Start the message with "Safety Issue" and include the
server, channel, approximate time, and what happened.