Welcome to the Teaching with AI website, the main guide for using AI to enhance teaching across IE University.
All resources on this site are managed by the Faculty Training & Support team at the Faculty Office to promote practical, reliable, and responsible use of AI in teaching and learning.
The Faculty Training & Support team offers a wide range of assistance for professors, including guidance on AI tools, access to institutional licenses, and support for adapting pedagogical strategies to this evolving context.
Faculty can receive support by email, through daily Zoom office hours with no registration required, via a custom GPT that answers frequent questions on ChatGPT, and through a dedicated calendar of training sessions led by the team or IE faculty. Links to each option are provided below:
QR code to access this site
Artificial Intelligence is technology that emulates human capabilities such as understanding language, making decisions, and producing content. Artificial intelligence has been discussed and applied in very common tools for many years, often for purposes such as analysis, automation, or personalization. The emergence of Generative AI, however, has changed this landscape by making AI directly accessible to students and faculty for creating content, ideas, and learning support.
Artificial Intelligence (AI)
A broad term for computer systems designed to perform tasks that usually require human intelligence, such as reasoning, recognizing patterns, or making decisions.
Machine Learning (ML)
A branch of AI where systems learn from data instead of following fixed rules, improving as they encounter more examples.
Deep Learning (DL)
A type of machine learning inspired by the human brain, where multiple layers refine information step by step to manage complex tasks like image or language recognition.
Generative AI
A form of AI focused on creating new content, such as text, images, audio, or code, rather than only analyzing existing information.
Large Language Models (LLMs)
Generative AI systems trained on very large collections of text so they can understand and produce human-like language.
GPT
The family of large language models behind ChatGPT, which generates text by predicting the most likely next word based on patterns learned from data.
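To make the idea of next-word prediction more concrete, the short Python sketch below picks each next word from a small, invented probability table. It is purely illustrative and does not reflect how ChatGPT is actually built: real models learn probabilities over a huge vocabulary from vast amounts of text rather than a hand-written table.

import random

# Toy next-word prediction (NOT how ChatGPT is actually implemented).
# The probability table below is invented purely for illustration.
next_word_probs = {
    ("the", "student"): {"writes": 0.5, "reads": 0.3, "revises": 0.2},
    ("student", "writes"): {"an": 0.6, "the": 0.3, "quickly": 0.1},
    ("writes", "an"): {"essay": 0.7, "exam": 0.2, "email": 0.1},
}

def generate(prompt, steps=3):
    words = prompt.lower().split()
    for _ in range(steps):
        context = tuple(words[-2:])              # last two words as context
        options = next_word_probs.get(context)
        if not options:                          # no learned continuation
            break
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("The student"))   # e.g. "the student writes an essay"

Real LLMs repeat the same kind of step-by-step prediction, but with probabilities computed by a neural network over tens of thousands of possible tokens.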
This image shows the relatively small number of AI systems that dominate the current landscape. While these tools may appear similar, they differ in capabilities, strengths, and the types of content they handle best, such as text, images, research, or multimodal tasks. In many cases, other digital tools embed AI features powered by the technology of these providers. Exploring several of them helps professors better understand their differences and make informed choices for teaching and learning in their fields.
Below, we outline the basic features of four of the most relevant AI systems to provide a clear point of reference for comparison. To use them, just sign up on their websites and start chatting:
ChatGPT (https://chat.openai.com/) is the most popular AI chatbot. ChatGPT is also available as a phone app (Android version) (iPhone and iPad version). As of August 2025, all previous models or versions were merged into one, ChatGPT 5:
ChatGPT-5.2 is the standard model designed to provide fast, accurate, and conversational responses across a wide variety of topics. It balances speed and reasoning ability, making it suitable for everyday use.
ChatGPT-5.2 Thinking is an enhanced version that spends more time reasoning through complex problems before responding. It prioritizes depth and logical consistency over speed, making it better for advanced problem-solving or analysis.
ChatGPT-5.2 Pro is the premium tier offering the most powerful performance, faster response times, and access to advanced features. It is optimized for professionals who need reliability, priority access, and extended capabilities beyond the standard model.
Copilot (https://copilot.microsoft.com/), formerly known as Bing Chat, is widely used in professional contexts. It combines traditional search with AI conversation, helping you retrieve and process information smoothly. Copilot is free for you and comes integrated with Microsoft 365 apps such as Word, Excel, PowerPoint, Outlook, and Teams.
Gemini (https://gemini.google.com) is Google's counterpart to Copilot and ChatGPT, functioning as a versatile AI chatbot. It can connect with Google's applications and services, such as NotebookLM, Maps, or YouTube.
Anthropic's Claude (https://claude.ai) is designed to be a helpful and safe assistant, with built-in rules that guide its behavior to avoid harmful or biased responses. The Claude Sonnet 4.5 model, released in September 2025, introduces hybrid reasoning capabilities, allowing users to balance response speed and depth, and includes features like a visible reasoning process to enhance transparency.
CHATGPT EDU VERSION
In February 2025, IE University became one of the first universities worldwide to partner with OpenAI, making ChatGPT Edu our standard AI platform for general use. If you are interested in training opportunities on ChatGPT and access to a free license, please visit our AI Training section.
HOW CAN I GET A CHATGPT EDU LICENSE FOR ME AND/OR MY STUDENTS?
Obtaining an individual ChatGPT Edu license at IE University is very straightforward.
Faculty can receive one individual license by completing the Co-Teaching certificate program, a three-level workshop focused on developing core AI skills for teaching.
In special cases, licenses are assigned in advance independently of that program, mainly for professors teaching AI-related subjects or participating in pilot initiatives.
Lastly, the Faculty Office also periodically contacts professors by email to offer group licenses, which provide ChatGPT access for both faculty and their students and allow AI use within a protected educational environment.
INDIVIDUAL LICENSES:
·PARTICIPATION IN CO-TEACHING CERTIFICATE PROGRAM → TO REGISTER USE THIS IE CONNECTS LINK AND QR CODE
·SPECIAL CASES → CONTACT FACULTY.TRAINING.SUPPORT@IE.EDU
GROUP LICENSES:
·SEASONAL SIGN-UP CAMPAIGNS → YOU WILL BE CONTACTED BY FACULTY OFFICE VIA EMAIL
COMMERCIAL CHATGPT VERSION VS EDU VERSION
Our ChatGPT EDU version allows structured and secure interaction between faculty and students within a safe academic environment.
It includes access to in-house GPTs and institutional support tailored to your teaching and learning needs.
It does not use your data to train the system, while still requiring respect for copyright rules.
It provides access to Pro mode features that are not available in the standard free version.
It offers a restricted and curated selection of in-house GPTs useful for your teaching journey.
These two images show an overview of the available dropdown menus and options for each version as of January 2026:
COMMERCIAL VERSION
EDU VERSION
OTHER AI RESOURCES
BLACKBOARD
AI is also available directly within Blackboard, the main platform professors use to manage their courses at IE University. Its AI features are already embedded and enabled for you, so there is no need to install additional tools to start using them. The Faculty Training & Support team provides regular webinars and training on how to get the most out of these functionalities to enrich your teaching.
On Blackboard, AI supports teaching in several practical ways. It can assist with rubrics by quickly generating clear evaluation criteria and learning outcomes that remain fully editable. For tests and assignments, AI supports the creation of multiple-choice tests and question banks based on course materials. AI Conversations, lastly, enable guided, customized interactions that help students practice concepts and receive formative feedback within a controlled, proctored environment.
TOOLS DEVELOPED AT IE UNIVERSITY
These three tools have been developed at IE University and are the first of many to come. They are available in Blackboard by clicking the “+” icon in any content area and selecting “Content Market.” Each tool supports a different level of teaching. AI Tutor functions as a chatbot that helps students prepare for a course and check their progress based on the professor’s materials and guidance. Feedback Assistant supports professors in improving the quality and efficiency of feedback for written essays. AI Mentor is designed for case creation and can be tailored by professors through natural conversation with the platform.
INSTITUTIONAL PHILOSOPHY
POLICIES
Innovation is one of the pillars of IE University. For this reason, since 2023 the institution has actively developed guidelines and documents to promote responsible and effective use of AI tools across its programs. In the IE Policies section of this website, and through the link below, you will find access to the most up-to-date documents related to AI in education. As the AI landscape continues to evolve, our tools and practices will also adapt, making it important to review these resources regularly to support educational values and protect academic integrity.
STUDENTS
AI has reshaped how professors and students interact and how technology is used in the classroom. It is therefore crucial for faculty to understand how students already use AI both inside and outside academic settings. A survey conducted at IE University, consistent with findings published elsewhere, captures students’ usage patterns and attitudes toward AI in education. ChatGPT emerged as the most widely used tool (89.3% of students), with 45% reporting regular use, mainly for research and information retrieval (85%). With 84% of students stating they feel confident using AI, it becomes especially important for professors to be prepared to address this reality.
In light of this, preparing students for the future in an AI-rich context requires careful balance. Acquiring strong foundations for each subject they study remains essential, since mastering content without assistance allows students to use AI confidently and assess its outputs critically. Exposure to AI also helps students prepare for the job market and, ideally, understand tools adopted across industries. At the same time, the risk of cognitive offloading must be addressed, ensuring students do not replace thinking and reasoning with automation. In parallel, traditional teaching practices that foster sustained attention, reading, and independent thinking continue to play an important role and they are not to be replaced, only complemented.
PROFESSORS
Faculty responses worldwide to AI in education generally follow three main patterns. Some professors resist or oppose AI in order to protect established teaching practices and reduce its impact. Others accept AI as part of the academic environment and adapt their courses to identify what remains fundamental, incorporating AI only when it clearly supports learning goals. A third group embraces AI more enthusiastically, embedding it into teaching and assessment, particularly in disciplines where some non-essential processes can be delegated to AI.
Professors retain the discretion to adopt the approach to AI they consider most appropriate, as long as learning objectives are met. However, it is important to recognize that, regardless of individual views, AI has already transformed how students engage with courses, the resources they access, and the effectiveness of traditional practices.
Two key guiding principles, explained in the following two sections, are variety and cumulative knowledge, and the selective use of tools and activities matched to specific learning goals:
VARIETY AND CUMULATIVE KNOWLEDGE
The Swiss cheese model, borrowed from other academic fields, has been applied to education to show the importance of preventing weaknesses from aligning. It uses the analogy of multiple layers of cheese, each with holes, where problems occur only when the holes line up. In practice, this involves using diverse assessment formats so that if one becomes vulnerable, for example through AI use, the same limitation does not recur across the course. It also emphasizes designing assessments that connect and build over time, reinforcing cumulative learning and helping students see knowledge as progressive rather than fragmented.
Following the principle of variety exemplified by the Swiss cheese model, a thoughtful selection of tools and strategies often proves most effective, based on the idea of “the right tools for the right tasks.” In some cases, AI-supported activities may align well with learning goals, while in others, supervised assignments or closed-laptop discussions better serve the purpose.
As suggested by the image of different hand tools with complementary functions, classroom policies can be adjusted for specific activities or moments, selecting the approach that best supports attention, participation, and learning outcomes.
These moments benefit from being carefully designed in advance, integrated into the session plan, and aligned with clear pedagogical intentions rather than being improvised on the fly.
ACADEMIC INTEGRITY
One of the main challenges posed by generative AI is ensuring that learning has genuinely taken place. Closely linked to this is the need to protect academic integrity in a context where AI can at times produce convincing academic outputs.
Please check IE University's Academic Integrity policies through the link below and the dedicated tab on this website:
In recent months, the focus in education has shifted from an ongoing race to detect cheating, or evidence of cheating, a race fueled by the possibilities opened up by rapid technological change, toward a stronger emphasis on finding evidence of learning.
In a technological context like this one, concrete proof can feel elusive, more like a shifting trace of light or smoke than something solid and easily grasped, as shown in the illustration on the left.
While vigilance around academic integrity remains necessary, greater attention is now placed on how students demonstrate understanding and skill development. When learning is clearly verified, educational objectives are effectively fulfilled.
In connection with the point above, AI detectors should be used with caution, as they are not reliable enough to serve as definitive evidence of misconduct.
This is because AI detectors rely on probabilistic patterns in language rather than on verifiable evidence of how a specific text was produced. In the image on the right, for example, a passage from the Book of Genesis is flagged as likely AI-generated, illustrating the uncertainty and limitations surrounding AI detection.
A good application of AI detectors is to initiate a conversation or a review process in which students are asked to explain and substantiate their work. Their primary value lies in their dissuasive role, since current technology is not capable of accurately detecting AI use in a way that justifies formal academic integrity proceedings.
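For readers who want a more concrete sense of why such scores are weak evidence, the short Python sketch below implements a deliberately naive "detector" that flags text whose sentences are unusually uniform in length. It is a toy heuristic, not how commercial detectors actually work, but it shows the core problem: the score reflects writing style, not proof of how a text was produced, so formulaic human writing can be flagged and lightly edited AI text can pass.

import statistics

def toy_ai_score(text):
    # Deliberately naive heuristic: measure how uniform sentence lengths are.
    sentences = [s.strip() for s in text.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return "not enough text to judge"
    variation = statistics.stdev(lengths) / statistics.mean(lengths)
    return "flagged as possibly AI-generated" if variation < 0.3 else "probably human"

# A careful, formulaic human writer is easily "flagged": style is not evidence.
print(toy_ai_score("The essay is clear. The argument is sound. The sources are cited."))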
PROCTORING TOOLS
Proctoring tools have gained importance as digital, in-class assessments increasingly rely on student laptops, requiring additional measures to support academic integrity. At IE University, two proctoring tools are available. Smowl CM tracks student activity on their laptops and generates reports to help identify irregular behavior. Respondus LockDown Browser restricts access to other applications and websites during an assessment. Both tools are integrated into Blackboard and can be enabled easily, but they do not replace faculty responsibility, which includes carefully reviewing reports (Smowl) and maintaining active vigilance in the classroom during activities. Their effectiveness increases when combined with assessment designs that make AI use more difficult, such as hybrid formats that provide part of the required information on paper. Additional guidance is available in the guides below and through the Faculty Training & Support team.
NEW CHALLENGES
WEARABLES
AI-enabled wearables present a growing challenge for academic integrity, as devices such as smart glasses, smartwatches, or discreet earbuds and neckpieces can provide real-time assistance during in-class activities. While their use is still emerging, these technologies complicate monitoring and raise new questions about fair assessment. A basic recommendation is to stay alert, set clear expectations, and design activities that emphasize reasoning, explanation, and process over easily assisted outputs.
FAKE SUBMISSIONS
Fake submissions present a risk in digital face-to-face exams when students leave the classroom without submitting their work, allowing them to complete it elsewhere using notes or AI while still within the official exam time (which makes it almost impossible to detect afterwards). This practice undermines the conditions of the assessment despite appearing compliant. A simple but effective precaution is to verify that every exam has been submitted before a student is allowed to leave the classroom.
ADD-ONS AND AGENTS
AI add-ons for browsers, as well as AI-native browsers such as Comet (Perplexity) or Atlas (ChatGPT), can assist students during assessments by summarizing content, rewriting text, or generating responses in real time. These tools can operate discreetly alongside exam platforms, with little or no intervention needed on the part of the student, making detection difficult. Clear policies on permitted tools, combined with supervised conditions and assessment designs that emphasize reasoning and explanation, help reduce their impact.
PROMPT INJECTION
Prompt injection or white text techniques involve hiding instructions for an AI system within an assignment, often using invisible text such as white font on a white background or extremely small font sizes, to influence how AI tools generate responses. These practices seek to bypass intended constraints or steer automated evaluation toward favorable grades or feedback. Prevention depends on careful review of submissions, the use of plain-text formats when appropriate, such as the Blackboard text editor instead of PDFs, and assessment designs that include in-class validation.
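As an illustration of what careful review of submissions can look like in practice, the Python sketch below scans a Word document for text formatted in white or in an extremely small font. It is only a sketch under several assumptions: submissions arrive as .docx files, the third-party python-docx library is installed, the file name is illustrative, and only body paragraphs are inspected (text hidden in headers, footers, text boxes, or PDFs would need other checks).

# Minimal sketch: flag white or near-invisible text in a .docx submission.
# Requires the python-docx library (pip install python-docx); file name is illustrative.
from docx import Document
from docx.shared import Pt, RGBColor

WHITE = RGBColor(0xFF, 0xFF, 0xFF)

def suspicious_runs(path):
    findings = []
    for paragraph in Document(path).paragraphs:
        for run in paragraph.runs:
            if not run.text.strip():
                continue
            hidden_color = run.font.color.rgb == WHITE              # explicit white font
            tiny_font = run.font.size is not None and run.font.size < Pt(5)
            if hidden_color or tiny_font:
                findings.append(run.text.strip())
    return findings

for snippet in suspicious_runs("submission.docx"):
    print("Possible hidden instruction:", snippet)

Requesting plain-text submissions through the Blackboard text editor, as noted above, sidesteps the problem entirely.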
CONCLUDING RECOMMENDED PRACTICES
Every course should include supervised opportunities where students demonstrate their learning without access to AI, even when AI use is permitted at other points in the semester.
Professors are encouraged to go beyond written assignments by using alternative formats such as videos, presentations, or projects, which can increase student engagement and, in some cases, require skills that AI alone cannot provide or simulate.
AI policies should be designed to accommodate different degrees of device and AI use, allowing instructors to limit or require electronic resources depending on the learning goals of each activity.
Careful revision of sources has become more important than ever: make sure references are accurate, relevant, and have actually been read, since AI can invent citations or push students toward easily retrievable rather than appropriate materials.
In this new AI environment, rather than focusing only on end results, a stronger emphasis should be placed on the learning journey by using intermediate deliverables, tracking progress over time, and, whenever possible, supporting students throughout the process by engaging more directly with them.
Regardless of how AI is used in a course, human involvement should remain central, with students engaging in genuine critical thinking and faculty maintaining direct responsibility for key academic decisions, especially grading and feedback.
No matter how fast technology advances, faculty expertise remains the most valuable resource. It offers deep knowledge of the field, real-world experience, and a trusted human voice, since professors, not machines, are ultimately responsible for evaluating student work.
THANK YOU!