The Clinical and Counseling Psychology (CCP) Artificial Intelligence and Electronic Assistance Policy provides guidance to CCP students on the use of generative AI and other assistance technologies throughout their graduate studies.
The University of South Alabama Statement on AI and Electronic Assistance
“Unless expressly permitted by the instructor, all usage of generative AI or similar electronic assistance on coursework at USA is considered unauthorized aid. Faculty may check student work for unauthorized assistance using tools that flag AI contributions. Such instances of academic misconduct are reported in the same fashion as other cases of cheating and plagiarism, following the protocols set forth in the Student Academic Conduct Policy.”
Clinical and Counseling Psychology Doctoral Program Policy
Usage Philosophy
Support for Critical Thinking, Not Substitution: AI should serve as a tool to stimulate ideas, provide research support, and help analyze problems (see below for example permitted uses), rather than directly answering questions or completing assignments. In short, AI should not be used to replace human critical thinking.
Critical Engagement: Any AI-generated output should be critically evaluated, with attention to potential biases and limitations. AI-generated responses may contain misinformation (false or inaccurate information) or hallucinations (plausible-sounding statements with no factual basis). Students must scrutinize AI content to ensure accuracy and avoid introducing unsupported or misleading information into clinical discussions or training.
Ownership and Responsibility: Students are fully accountable for the content they submit and must verify the accuracy of all information obtained from AI tools.
Transparency and Documentation: To align with principles of academic integrity, students must document and declare any AI use in their work, explaining how AI was utilized.
Confidentiality of Data: To avoid privacy risks, confidential or personal information should never be entered into AI tools.
Prohibited Academic Use
Within the USA CCP doctoral program, the use of artificial intelligence tools such as ChatGPT to generate writing for course assignments, programmatic milestones (DSK, Clinical Comprehensive Exam), or research products (proposal or defense documents) is prohibited. Violations of this policy will be treated as academic misconduct following the process for cheating and plagiarism.
Specific Examples:
Full AI-generated assignment: A student uses an AI tool to write an entire assignment and submits it as their own work.
Partial AI-generated assignment: A student uses an AI tool to write part of an assignment and submits it as though they wrote the entire piece.
Modified AI-generated assignment: A student heavily edits an AI-generated assignment, blending their own phrasing with the AI’s text, and submits it as fully their own work.
Paraphrased AI-generated assignment: A student rephrases an AI-generated assignment to create new text that mirrors the AI’s structure and ideas, then submits it as if they developed all content independently.
Faculty reserve the right, in suspected cases of unauthorized AI use, to request an oral presentation or defense to assess the student’s understanding of submitted material.
Prohibited Clinical Use
Within the USA Psychology Clinic and affiliated practicum training sites, students are NOT permitted to use AI tools, including but not limited to ChatGPT, Bastion, and Bard, for assignments, clinical notes, or report writing. While AI may provide general support in some areas, its use in clinical training raises ethical and legal concerns involving informed consent, safety, transparency, algorithmic bias, and data privacy. Furthermore, to become competent clinicians, students must develop a thorough understanding of clinical concepts and processes independently.
Prohibition on Private Client Information. Students, supervisors, and staff are strictly prohibited from entering, sharing, or transmitting private client information (including HIPAA-protected patient health records and FERPA-protected educational records) into AI or LLM systems. As these tools are not HIPAA/FERPA-compliant, they are unsuitable for handling any client-sensitive data.
Prohibition on Clinical Decision-Making Support. Because of their limited reliability and accuracy and their susceptibility to misinformation, AI and LLM systems must not be used by students in clinical training for clinical decision-making tasks, including test interpretation, diagnostic classification, or treatment planning within the Psychology Clinic.
Permitted Uses
We acknowledge the value of this fast-growing technology and recognize that AI can be beneficial in limited ways during training and professional activities. Examples of permitted activities include:
Background Research: Students may use AI to ask basic factual or thematic questions to gain a general understanding of a topic, similar to using Wikipedia for an overview.
Brainstorming: Students can use AI to explore concepts, ideas, principles, theories, or scholarly debates, similar to consulting scholarly articles for a topic overview.
Psychoeducation Resources: AI systems can recommend books, articles, or exercises based on diagnostic categories or treatment goals. AI may also be used to create or deliver educational content for psychoeducational purposes.
AI for Editing: Students may use AI to suggest stylistic or grammatical edits on their self-written work (non-clinical work) and choose which edits to apply.
Image Creation: AI may be used to create images based on content created by the user.
Transparency and Documentation: Students must document and declare any AI use in their work, explaining how AI was utilized. APA provides guidelines for the citation of AI usage: https://apastyle.apa.org/blog/how-to-cite-chatgpt
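For illustration, at the time of writing, the APA Style guidance linked above treats the tool's developer as the author and recommends references in the general form: Author of tool. (Year). Name of tool (Version) [Large language model]. URL. For example: OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat. Because citation conventions for AI tools are still evolving, students should consult the current APA guidance before submitting work.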