AI Policy
Introduction
Artificial intelligence (AI) has the potential to benefit student education and healthcare in exciting and transformative ways. As AI technologies become increasingly integrated into healthcare, education, and research, this communication aims to provide clear guidelines to ensure AI is used in alignment with the vision, mission, and values of Marshall B. Ketchum University (MBKU).
Purpose
This communication establishes standards for the responsible, ethical, and transparent use of AI at MBKU. It applies to all university students, faculty, staff, and administrators.
Guiding Principles
AI use at MBKU must reflect the institution's core values of accountability, caring, excellence, innovation, and respect.
The adoption of AI technologies should support academic integrity, adhere to existing policies, prioritize patient safety, and promote ethical responsibility. It should also enhance, rather than diminish, the MBKU institutional learning outcomes of communication, analytical reasoning and problem solving, interprofessional health education, health information literacy and lifelong learning, and professionalism.
To that end, MBKU community members are expected to respect and adhere to the following user responsibilities:
User Responsibilities
- Users must ensure that AI-generated or informed content is accurate, credible, and transparently identified.
- AI tools must be used in compliance with all university policies, as well as applicable data security standards and federal regulations such as HIPAA and FERPA.
- Uploading Protected Health Information (PHI) into AI tools is strictly prohibited without prior authorization from the Information Technology department.
- Uploading password-protected information from MBKU platforms to AI tools is prohibited without prior authorization. Doing so could expose proprietary or sensitive information to unauthorized parties or the public.
- Be alert for scammers and verify suspicious requests or activity through a known, trusted MBKU phone number or email address.
Policy Review
This policy is a living document that will be reviewed and revised periodically to ensure it continues to address emerging AI technologies, ethical standards, and best practices in healthcare education. It is the responsibility of all MBKU community members to ensure that they follow the most updated version of this policy. For further information or feedback regarding this policy, MBKU community members should contact MBKU IT Support at itsupport@ketchum.edu.
Acceptable Uses of AI
Academic Integrity
- Students may use AI to enhance their understanding of their coursework, but they should not rely on AI to complete their assignments, essays, case reports, or assessments unless explicitly authorized by the instructor and stated in the course syllabus. In the absence of guidance from the course instructor, students should refer to Table 1 regarding acceptable applications of artificial intelligence for student use.
- Faculty are expected to provide guidance regarding acceptable applications of AI for student use in their course syllabi. Sample syllabus statements are listed in Table 2.
- All AI-generated or informed material must be appropriately cited (an illustrative citation example follows this list). Unreported use may be considered academic dishonesty. See https://apastyle.apa.org/blog/cite-generative-ai-references; AMA Manual of Style, 11th edition, Section 14.5.2.
- AI tools may be used for other purposes (e.g., personal study) only if approved by the instructor and consistent with university policies.
- Students must not upload confidential or proprietary content, Protected Health Information, and/or research data into public AI platforms. Students may upload course materials (e.g., lecture slides, assignments, and assessments) only with the instructor's permission.
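As an illustrative example only (citation formats evolve, so confirm the current guidance at the APA link above), an APA-style reference for content generated with ChatGPT takes a form similar to the following, with (OpenAI, 2023) as the corresponding in-text citation:

OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat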
Teaching and Learning
- MBKU community members may use AI for the following tasks with certain restrictions:
- Clarifying complex topics
- Summarizing or synthesizing content
- Creating study questions for self-assessment and learning
- Improving accessibility (e.g., Blackboard Ally resources)
- Adaptive learning (i.e., personalizing learning experiences and tutoring for individual learners)
- Image or video creation (e.g., teaching complex topics with procedural animations or 3D model integration)
Restrictions include:
- If the use of AI requires uploading confidential or proprietary content, consent must be obtained from the instructor or owner of the material. In addition, it is the user's responsibility to review the privacy policy of the AI software to ensure that any uploaded content cannot be accessed, viewed, used, or shared with third parties for any other purpose (including training AI/machine learning models).
- Faculty may use AI to assist in grading student assignments while abiding by FERPA requirements.
- Faculty may use AI to help create course content, but are responsible for ensuring the accuracy and relevance of the final course materials.
- Faculty are encouraged to clearly communicate expectations regarding AI use in personal study, assignments, and examinations.
- Faculty may use AI tools to help detect plagiarism.
- Community members may use AI-powered screen reading or automatic closed-captioning for accessibility needs.
Research and Scholarship
- Community members are expected to maintain transparency regarding the use of AI tools for research and to comply with academic journal policies, including proper citation when AI is used for writing or editorial assistance. See https://apastyle.apa.org/blog/cite-generative-ai-references; AMA Manual of Style, 11th edition, Section 14.5.2.
- AI may be employed in various aspects of scholarly work (e.g., data analysis, literature review, or experimental design); however, researchers remain fully responsible for ensuring the accuracy, originality, and integrity of all work.
Administration and Service
- AI may be used to support student, faculty, staff, and administrative functions, while adhering to all university policies and federal regulations, for tasks such as:
- Writing assistance outside of academic assignments
- Automating repetitive tasks
- Enhancing the student or prospective student experience
- Resource management
Patient Care
- AI use should support, not replace, clinical judgment when used for the following functions. Community members must protect patient privacy and maintain compliance with HIPAA and applicable data security standards.
- Diagnostic assistance
- Clinical decision support
- Treatment plans
- Pharmaceutical support
- Charting
- Health record management
- AI tools may be used to supplement, but not replace, professional interpreters, and such use should be limited to non-critical or routine communication.
- If AI is used for patient care, patients should be informed and their consent obtained where applicable.
Table 1. Acceptable applications of artificial intelligence for student use at MBKU.
| Always OK | Requires Permission | Never OK |
| --- | --- | --- |
|  |  |  |
|  |  |  |
|  |  |  |
Adapted from Westling, C., and Mishra, M.K. (2025). Artificial Intelligence: Lessons Learned from a Graduate-Level Final Exam. Accessed on 13 May 2025 from https://er.educause.edu/articles/2025/4/artificial-intelligence-lessons-learned-from-a-graduate-level-final-exam.
Table 2. Sample syllabi excerpts outlining the acceptable use of artificial intelligence (AI).
| Course | Instructor and Institution | Syllabus Statement |
| --- | --- | --- |
| Life Sciences Course | Franklin Hays, University of Oklahoma | "Use of AI tools (e.g., ChatGPT, Bard, Claude) is encouraged in this course to facilitate the student learning experience and overall productivity. However, such use should follow three clear principles: 1) any and all use should be transparent, properly cited, and otherwise declared in any final work product produced for grading or credit; 2) students are responsible for ensuring accuracy of content produced including references and citations; and 3) students acknowledge that improper attribution or authorization is a form of academic dishonesty and subject to the Academic Misconduct Code as outlined in the Student Handbook and the Faculty Handbook. All work turned into the instructor for grading is assumed to be original unless otherwise identified and cited. If there is uncertainty about any content in regard to the above guidelines, please contact the instructor to discuss these questions prior to turning anything in for grading." |
| Art History Course | Mary Coffey, Dartmouth College | "In every way, AI writing supports contravene the learning objectives and pedagogical design of this course. For that reason, I do not permit AI writing supports in this class." [simple spell and grammar checks permitted] |
| First-Year Writing Course | Loretta Notareschi, Regis University | Suggests that students use an Artificial Intelligence Disclosure, such as the following: "I did not use artificial intelligence in creating this paper" or "I did use artificial intelligence in creating this paper, namely ____________ (ChatGPT, Bard, etc.). I used it in the following ways (check which of the following acceptable uses were utilized): |
Example syllabus statements adapted from Dartmouth College, Generative AI Syllabus Statements. Accessed on 13 May 2025 from https://dcal.dartmouth.edu/resources/course-design-preparation/syllabus-guide/example-syllabus-statements.
Violations of AI Policy
Violations of this AI policy may result in lower grades or other disciplinary action. Members of the MBKU community suspected of violating this policy will be adjudicated through the student conduct process found in the MBKU University Catalog (for students) or the procedures for disciplinary actions concerning faculty found in the MBKU Faculty Handbook. Relevant policy statements will be added to the appropriate documents for all constituents. Periodic training will be provided to promote the responsible and effective use of AI in the MBKU community.