Guidance for Instructors

Last modified: March 27, 2024

Disclaimer: The field of generative AI, and opinions about how academia should respond to these tools, are evolving rapidly. As an organization, LILE is working to stay up to date on this issue in order to provide robust and timely advice for faculty. Watch this page, as well as our events and blog, to learn more.

Quick Links

Looking for an AI policy? We have written an expanded guide to writing artificial intelligence policies.

Ready to incorporate AI in your class? Our resource on the design of AI assignments has examples and considerations to get you started.

Missed one of our AI workshops? Watch the recordings here.

Jump to:

Impact on Education

Shortcomings of AI

Opportunities That AI May Provide

Supporting AI Literacy for Students

Recommendations for Course Policies

AI Detection Software

Artificial Intelligence Tools

Artificial Intelligence (AI) is any technology that attempts to solve problems and complete tasks that would usually require human intelligence. For decades there have been steady advances in AI software and hardware, including tools many of us use every day such as GPS, Alexa, and speech-to-text. The most visible tools at the moment are generative AI models such as ChatGPT, Bing Copilot, Poe, and DALL-E. For some beginning guidance on trying these tools, watch our workshop on how to use ChatGPT and Midjourney (an image generator). These tools can generate written responses, images, and code in reply to a wide variety of user prompts and questions. Their power comes from machine learning models that predict likely responses to prompts based on the large amounts of data they were trained on.

Generative AI tools such as ChatGPT that produce written content use a natural language model that allows users to interact with the AI in a conversational manner.1 Some tasks that ChatGPT excels at are summarizing information, generating computer code, translating text from one language to another, analyzing data, and writing in specific genres and voices. Examples of prompts might be “Write a three-page paper on the effects of climate change,” “I am looking to buy a camera. Do you have suggestions?” or “Review my writing for errors.” The results of these prompts can be impressive, but it is important to note that generative AI does not actually think or feel, and it lacks important human skills such as critical thinking and fact-checking. You should not rely solely on AI to produce factual information.

Impact on Education

Generative AI will continue to grow in capability, and it is already being incorporated into word processors and search engines. The challenge for instructors is to discover how to incorporate AI content generators as a tool in their teaching rather than view them solely as a threat. In the past, other technology tools such as multi-function calculators, spelling and grammar checkers, and statistical analysis software shifted the ways we learn and teach. As in those cases, educators will need to help students differentiate when AI can help with learning versus when it is a shortcut around learning.

It must be acknowledged that faculty bandwidth to address the emergence of AI is limited. Learning about AI and how it will impact teaching will take time, especially because the technology is so new. This is likely a multi-year shift in education and as such does not need to be tackled all at once. Here are several steps instructors can take:

  • Try AI content generators and understand their capabilities. Learning the basics of how to write effective prompts is key to getting the most out of generative AI. You may also want to take time to understand what happens on the back end of any AI tool and what data the tool draws on.
  • Update course policies to include considerations for using AI content. Even if you choose not to integrate AI into your course, it is important not to ignore these technologies. Here are a few questions to ask yourself: First, do the values of these technologies align with your course values? Second, what do you want to communicate to your students about these technologies? Finally, if you do allow students to use AI, what are the intellectual gains and what guidelines will you set? Review our further considerations for developing AI policies for more guidance.
  • Learn about the ethical and legal questions surrounding generative AI. Understanding the risks will help you launch discussions in class that encourage students to understand their own values in relation to AI and mitigate ethical harm if you implement AI activities in your course.
  • Explore ways to change assignments to address AI concerns. Consider adding assignments that educate students about the strengths and limitations of AI and how it relates to your discipline.

Shortcomings of AI

ChatGPT creates text through statistical prediction, not critical thought, and the text it generates draws on a data set that includes questionable content. The design and implementation of generative AI models lead to several crucial limitations.

Bias 

An AI model’s output is only as good as its input. AI retains all of the biases of the information it takes in, including the stereotypes and misinformation present in human writing on the internet.

Inequity 

Depending on the future funding models for AI assistants, there may be a gap between who does and does not have access to them. 

Inaccuracies  

AI-generated content may contain factual errors, incomplete quotations, and erroneous findings. There may need to be a new adage: “Don’t believe everything you read on the internet, nor everything an AI bot generates based on it.”

Intellectual property  

It is not clear who owns AI-generated content or the prompts created by users. This ongoing conversation may affect the use of AI now and in the future. Some AI technologies have been shown to plagiarize from other sources when creating “original” content.

Ethics  

Training AI models can harm the environment. AI models have been used to replace workers unethically, and there are also concerns that exploitative labor was used to develop and maintain these tools.

Opportunities That AI May Provide

Generative AI has the capacity to fundamentally alter the way we think about education and learning. Over time, higher education may consider how AI and humans can work together to create content.

Efficiency

AI can help draft emails, blog posts, cover letters, and article and meeting summaries, saving time on everyday tasks. It can also help instructors plan lessons and provide feedback to students.

Stimulate Thinking

Students could annotate an AI-generated text, use it to search for counterarguments during a group discussion, or brainstorm ideas for a new project.

Editing

Used judiciously, AI can improve writing and debug code. The goal should be to give students guidelines for using AI in ways that further their learning and writing, rather than letting them turn to AI-generated content as a shortcut.

Accessibility

Students can benefit from using AI as a tutoring aid. For example, it could help neurodiverse students who may struggle to initiate work, and allow any student who does not understand a concept to find further resources quickly.

Reimagining Learning

As AI-generated content becomes more commonplace, some of the goals of education may shift. Leaders in higher education are already envisioning how learning and academia will change in the age of AI.

Rethinking Assignments

In the age of AI, instructors may change what they define as acceptable evidence of learning. The knowledge and skills students should demonstrate may shift to center on personalized learning, collaborative work, self-reflection and the real-world application of content. 

Supporting AI Literacy for Students

It is the responsibility of instructors and higher education institutions to help students navigate the ramifications and opportunities of generative AI. There are important intellectual questions to be unearthed and discussed.

Open Dialogue

Initiating a direct conversation with students about the use of AI is an opportunity to explain its impact on their education. Together, instructors and students can explore the ways in which AI can support their learning. These conversations should also touch on what students may lose intellectually if they use AI to complete their assignments. Students should understand that learning is difficult and challenging, but that is the point of education.

Original Thought

Instructors can emphasize why original writing (or coding or creativity) matters and what it means to develop one’s own voice and ideas. Understanding how writing and communication skills will help them in their careers can motivate students to avoid unwanted use of AI.

Limitations of AI

Students should be made aware of the ethical shortcomings of AI content to help them understand why passing it off as original content is not advisable. If students are allowed to use AI, they need to understand how AI content must be reviewed and verified before incorporating it into their own writing.

Help Students Learn AI

Students need to understand how generative AI works and the data behind it. They should also learn how to write effective prompts. The reality is that generative AI will be part of their careers and everyday lives, so they should have the skills to use it correctly.

Recommendations for Course Policies

We suggest that faculty clarify their expectations regarding the use of AI at the outset of their course. Instructors have discretion to set specific AI policies that fit their course and individual assignments; there is no one-size-fits-all policy. Below are some general principles, and we have written an expanded guide to writing artificial intelligence policies.

Plagiarism

Updated on October 30, 2023 for clarity.

Instructors should update their academic integrity policy to include guidance on the use of generative AI content and plagiarism. Sample syllabus language might be “Contributions from anyone or anything else in your writing—including AI sources—must be properly quoted and cited every time they are used.”2 Whether AI is banned outright or acceptable in some cases, define the consequences for plagiarism in your course.

Cheating

Updated on October 30, 2023 to reflect the changes to the Duke Community Standard.

Views of what counts as cheating will differ across classes. For example, you might allow the use of AI for generating early ideas and drafts. Other instructors are comfortable with any use of AI as long as it is properly cited. Some instructors do not limit the use of AI at all. If you intend to ban AI, sample language could be “Because this course requires you to use your own critical thinking to solve problems and practice skills, do not use generative AI.” To support instructors who limit or ban AI use, the Duke Community Standard has been updated to include the unauthorized use of generative AI as a form of cheating.

Attribution

If you allow AI, point students to proper citation rules. Students should understand how to cite or give credit to AI content generators; they can list ChatGPT as a reference work or quote it within their work. MLA, APA, and other style guides have developed specific guidelines. Instructors can also ask students to retain their original conversations and prompts and append them to writing assignments.

Acceptable Use

If students may consult AI, it is important to give them specific guidelines about what is and is not acceptable. Sample language might be “Students are allowed to use AI to help revise this draft assignment, but they may not consult AI to write the final paper after I grade the draft. When submitting the draft, students must clearly identify any writing, text, or media generated by AI.”3

AI Detection Software

Learning Innovation does not endorse any software or programs that claim to determine whether a student’s writing was produced by AI. If instructors use detection software, it should not be considered a definitive measure of cheating, but rather the starting point of a conversation with the student about potential plagiarism.

False Positives and Negatives

These are early days for detection software, and these tools are by no means foolproof at detecting AI-generated text. In the long term, relying on them will not be an effective strategy as AI continues to advance. The software can also be biased; for example, a Stanford study showed that non-native speakers’ writing was flagged as AI-generated more often than native speakers’ writing.

Interventions Confuse Software

Making small edits to AI-generated content, asking the AI to reword its output, or copying text from one AI model to another can increase the likelihood of passing detection.

Student Privacy

The use of plagiarism detection can signal that students should not be trusted, and the tools invade their privacy by collecting their work.

What’s Next?

Interested in further exploring the impact of AI on teaching? Here are a few in-depth articles to get you started.


Footnotes

  1.  For an introduction to AI and its programming models, refer to this piece written specifically for educators.
  2.  This policy language, plus many other examples, is being collected as part of a collaborative project.
  3.  This policy wording, plus other student guidelines, comes from Joel Gladd, College of Western Idaho, under Creative Commons licensing.