AI Safety & Responsible Use
Essential guidelines for educators teaching AI to K-12 students. Safety is not a separate topic — it is woven into every lesson.
Important Notice for Educators
This curriculum provides educational content about artificial intelligence. It is the responsibility of educators, schools, and districts to ensure appropriate safeguards are in place when students interact with AI tools. Teachers should review all activities before classroom use, adapt content to their students' maturity level, and follow their school's acceptable use policies. AI Foundations provides educational resources only and does not monitor, supervise, or control student interactions with third-party AI tools.
Protecting Personal Information
What Students Should NEVER Share with AI
Students must understand that AI chatbots and tools are not confidential. Anything typed into an AI system may be stored, analyzed, and used for training. Teach students to treat AI interactions as public conversations.
Teaching Points for Educators
Data persistence
Explain that AI companies may retain conversation data indefinitely. Even 'deleted' conversations may exist in backups or training datasets.
No true privacy
AI chatbots are not therapists, counselors, or confidential services. Students should never use them as substitutes for trusted adults.
Third-party sharing
AI companies may share data with partners, use it for model training, or be compelled to provide it to law enforcement.
The screenshot rule
Teach students: 'Don't type anything you wouldn't want screenshotted and shared with the whole school.'
AI and Student Mental Health
Current Concerns
Recent incidents have raised serious concerns about AI chatbots and student wellbeing. Reports have linked AI companion chatbots to harmful outcomes for vulnerable young people, including emotional dependency, inappropriate interactions, and, in tragic cases, self-harm. These concerns underscore why AI safety education is critical and why students must understand the limitations of AI systems.
What Students Must Understand
AI does NOT have feelings, consciousness, or genuine empathy. It generates responses by predicting patterns in language, not through understanding.
AI chatbots are NOT friends, therapists, or counselors. They cannot replace human relationships or professional mental health support.
If you feel sad, anxious, or troubled, talk to a trusted adult — a parent, teacher, or school counselor — or call a crisis helpline.
AI can generate harmful, inaccurate, or manipulative content. It can say things that sound caring but are not genuinely supportive.
Guidance for Educators
Establish clear boundaries before any AI tool use: students should never seek emotional support from AI
Monitor student interactions during classroom AI activities
Create a safe environment where students can report concerning AI interactions
Know your school's crisis resources and referral procedures
Discuss real-world incidents age-appropriately to build critical awareness
Reinforce that AI-generated responses are not facts — they are statistical predictions
Teach students to recognize the difference between AI mimicking empathy and genuine human care
Partner with school counselors to address any concerns that arise
Crisis Resources
If a student is in crisis: 988 Suicide & Crisis Lifeline — call or text 988 | Crisis Text Line — text HOME to 741741 | SAMHSA Helpline — 1-800-662-4357. Always follow your school's mandatory reporting and crisis intervention procedures.
Educator Responsibilities
By using the AI Foundations curriculum, educators acknowledge and accept the following responsibilities:
Supervision
Teachers are responsible for supervising all student interactions with AI tools during classroom activities. Students should never be left unsupervised with AI chatbots or generative AI tools.
Age-Appropriate Adaptation
Teachers must review and adapt all curriculum materials to be appropriate for their specific students' age, maturity level, and cultural context before classroom use.
Acceptable Use Policies
Teachers must ensure compliance with their school's and district's technology acceptable use policies and obtain any required parental permissions before students use AI tools.
Safety Instruction
Before any hands-on AI activities, teachers must explicitly instruct students on personal data protection, appropriate use boundaries, and what to do if they encounter concerning AI outputs.
Incident Response
Teachers must immediately report any concerning student interactions with AI tools to school administration and counseling staff, following established procedures.
Tool Vetting
Teachers should vet all third-party AI tools for age-appropriateness, acceptable data privacy practices, and adequate content filtering before introducing them to students.
Parental Communication
Teachers should inform parents/guardians about AI tools used in the classroom and provide guidance for safe AI use at home.
Legal Disclaimer
AI Foundations provides educational curriculum materials only. The platform does not provide, host, operate, or control any third-party AI tools, chatbots, or generative AI systems referenced in lesson plans or activities. AI Foundations is not responsible for the content, behavior, data practices, or outcomes of any third-party AI tools.
Responsibility for student safety rests with educators, schools, and districts. Teachers, administrators, and school districts are solely responsible for: (a) supervising student use of AI tools, (b) ensuring compliance with applicable laws including FERPA, COPPA, and state student data privacy laws, (c) obtaining required parental consents, (d) implementing appropriate content filtering and monitoring, and (e) providing mental health support and crisis intervention services.
AI Foundations makes no warranties regarding the safety, appropriateness, or accuracy of any third-party AI tools. AI technology evolves rapidly, and tools referenced in this curriculum may change their features, terms of service, data practices, or content moderation policies without notice.
Limitation of liability: To the fullest extent permitted by law, AI Foundations, its creators, contributors, and affiliates shall not be liable for any direct, indirect, incidental, consequential, or special damages arising from: (a) student interactions with third-party AI tools, (b) emotional, psychological, or physical harm related to AI use, (c) data breaches or privacy violations by third-party AI providers, or (d) any outcomes resulting from the implementation of this curriculum.
By using this curriculum, educators acknowledge that they have read and understood these safety guidelines, accept responsibility for implementing appropriate safeguards, and agree to the terms outlined in our Terms of Use.