Generative AI in teaching

OVERVIEW

Generative artificial intelligence (AI) presents new opportunities and challenges for teaching and learning. Below, you’ll find some frequently asked questions and important considerations when engaging and experimenting with generative AI.

Like any technology, there are benefits and limitations for both instructors and students. While the rapid advancement of generative AI can make it difficult to keep up with the latest tools and types of outputs that are possible, CEE staff are well-situated to support the critical and creative uses of generative AI to enhance curriculum, courses, and assignments.  

CEE offers departmental and unit sessions on the following topics:

  • Generative AI and Academic Integrity

  • Generative AI and Online Courses

  • Generative AI and Student Experience

  • Generative AI and Personal and Department Teaching Goals

Don’t see the topic you’re interested in listed? Contact us and we will work with you to develop a session that works for your teaching and learning needs.

WHAT IS GENERATIVE AI?

Simply put, generative AI is:

“Technology that creates content — including text, images, video and computer code — by identifying patterns in large quantities of training data, and then creating original material that has similar characteristics” (Pasick, 2023).

More specifically:

Generative AI is a type of artificial intelligence that uses machine learning to generate new content by analyzing and processing vast amounts of data from diverse sources. Generative AI tools can generate text, images, video, sound, and code. Different tools are trained on different datasets and with different training methods. The generated responses of these tools are probabilistic, which can result in errors in responses. Large language models (LLMs), for instance, specialize in analyzing and processing text and generating new text. Different LLMs have distinct datasets and employ unique training methods. GPT 3 and GPT 4 are examples of LLMs. (Satia et al., 2023) 
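
To make “probabilistic” concrete, the short Python sketch below samples a next word from a small, hand-written probability table. The words and probabilities are invented for illustration only; a real LLM scores tens of thousands of possible tokens with a neural network, which is why the same prompt can produce different, and occasionally incorrect, answers.

    import random

    # Hypothetical next-token probabilities for the prompt
    # "The capital of British Columbia is ..." (values invented for illustration).
    next_token_probs = {
        "Victoria": 0.85,   # correct and most likely continuation
        "Vancouver": 0.12,  # plausible-sounding but wrong
        "Kelowna": 0.03,
    }

    def sample_next_token(probs):
        """Pick one token at random, weighted by its probability."""
        tokens = list(probs.keys())
        weights = list(probs.values())
        return random.choices(tokens, weights=weights, k=1)[0]

    # Running the same "prompt" several times usually gives the right answer,
    # but lower-probability (and sometimes incorrect) continuations also appear.
    for _ in range(5):
        print("The capital of British Columbia is", sample_next_token(next_token_probs))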

How reliable are generative AI tools like ChatGPT?

Generative AI tools create content based on their training dataset. This dataset may not include information after a certain date and can therefore be limited when generating content about recent events.

As a technology, generative AI’s purpose is to create content, but it cannot verify accuracy and may “hallucinate,” creating factually incorrect outputs that include false or irrelevant information. These hallucinations can include the creation of citations and references to works that do not exist.

Any output created by a generative AI tool should be carefully evaluated for accuracy.

What are the risks and benefits of generative AI tools?

Generative AI tools “cannot replace human critical thinking or the development of scholarly evidence-based arguments and subject knowledge that forms the basis of [a] university education” (University of Oxford, n.d.). Used ethically and appropriately, generative AI tools can support student learning in a variety of ways. They can summarize text, identify keywords, create outlines for projects, and provide commentary on writing.  

As with any tool, generative AI comes with both risks and benefits. Becoming aware of the risks (including disclosure of personal information, violations of intellectual property rights, and ethical implications) and the benefits (support for skill development and knowledge creation) can help instructors and students discuss and decide when to use and when to avoid generative AI tools.

The table below (Mollick & Mollick, 2023) provides an overview of how generative AI tools may offer different approaches to help students “harness the upsides while actively managing the downsides and risks of using AI”:

 

 

AI Use    | Role                       | Pedagogical Benefit                                                             | Pedagogical Risk
MENTOR    | Providing feedback         | Frequent feedback improves learning outcomes, even if all advice is not taken. | Not critically examining feedback, which may contain errors.
TUTOR     | Direct instruction         | Personalized direct instruction is very effective.                             | Uneven knowledge base of AI. Serious confabulation* risks.
COACH     | Prompt metacognition       | Opportunities for reflection and regulation, which improve learning outcomes.  | Tone or style of coaching may not match student. Risks of incorrect advice.
TEAMMATE  | Increase team performance  | Provide alternate viewpoints, help learning teams function better.             | Confabulation and errors. "Personality" conflicts with other team members.
STUDENT   | Receive explanations       | Teaching others is a powerful learning technique.                              | Confabulation and argumentation may derail the benefits of teaching.
SIMULATOR | Deliberate practice        | Practicing and applying knowledge aids transfer.                               | Inappropriate fidelity.
TOOL      | Accomplish tasks           | Helps students accomplish more within the same time frame.                     | Outsourcing thinking, rather than work.

*Confabulation: the production of potentially plausible, but ultimately incorrect or false, information

Can students use generative AI to complete assignments?

Currently, individual instructors at SFU determine if students are permitted to use generative AI to complete assignments.  

SFU Academic Integrity offers potential language to use in syllabi concerning the use of generative AI: https://www.sfu.ca/students/enrolment-services/academic-integrity/faculty/prevention/syllabus-statements.html   

Being explicit about when, how, and why generative AI may or may not be used in assignments and in your course is a way of being proactive with students. Engaging in conversation with students about the uses, misuses, and ethical issues associated with generative AI can help everyone better understand appropriate ways to draw on this powerful technology.

How can I use generative AI in my course?

Instructors may benefit from using generative AI tools to, for example, create material that students can critique for accuracy and reliability, or to demonstrate brainstorming activities that help students develop the metacognitive skills needed to refine research questions.

Remember that personal information from and about students should never be entered into a generative AI tool. If use of these tools is permitted, students who choose not to use them must be given an equal opportunity to meet the course, assignments, and/or learning goals.

How can I tell if students are using generative AI?

There is no reliable way to determine if a generative AI tool has been used in a student assignment. Tools that claim to ‘check’ whether writing was generated by AI often produce ‘false positives.’ Submitting any student work to a third-party tool is highly discouraged, as this can be a violation of privacy and intellectual property rights. If you, as an instructor, have concerns about submitted work, proactively discussing academic integrity with students and creating opportunities for conversation around assignments are better options than falsely accusing a student of academic dishonesty.

If the use of generative AI is permitted, how should it be cited?

You will find advice on citing generative AI on the respective reference pages from the SFU library: https://www.lib.sfu.ca/help/cite-write/citation-style-guides  

The University of Toronto Library offers a guide on generative AI citation for MLA, APA 7, and Chicago here: https://www.lib.sfu.ca/help/cite-write/citation-style-guides.

Can students be required to use generative AI tools for an assignment?

Instructors cannot require students to use any program, application, or tool that asks for personal information, including, but not limited to, name, email, or student number.

When instructors permit students to use such tools, alternatives must be provided so that students who choose not to use the tools have equal opportunity to meet the course, assignments, and/or learning goals.

CONSIDERATIONS

  • Generative AI applications are trained on different datasets to create predictive outputs (text, images, audio, and video, for example) in response to user prompts. Outputs will always reflect the biases and limitations of the dataset that the application draws on. With ChatGPT, the initial training dataset included 300 billion words drawn from a variety of online texts and resources.

  • Some generative AI tools require users to share personal information to create accounts. User inputs entered into generative AI tools to prompt responses may be added to the training dataset for the tool, meaning users are providing information, and in some cases, their intellectual property, to a databank. Learning how, where, and why information is shared with a generative AI tool can help users make informed choices about potential use.

  • To learn more about SFU’s Privacy Management Program and how you can navigate appropriate use of generative AI, follow British Columbia legislation, and protect your own privacy and student privacy, visit Archives and Records Management - Freedom of Information and Protection of Privacy Act: https://www.sfu.ca/archives/fippa.html

  • Generative AI tools, which are increasingly embedded in search engine platforms and various software, can use significant energy. The amount of carbon dioxide generated by these tools cannot yet be fully determined, but some estimates suggest that the dataset training alone, prior to any outputs being generated, “produces 626,000 pounds of planet-warming carbon dioxide, equal to the lifetime emissions of five cars” (Martineau, 2022).

GETTING STARTED WITH GENERATIVE AI

Reminder: Instructors cannot require student use of generative AI tools and must provide alternatives for students who choose not to use the tools. Alternatives must allow students equal opportunity to meet the course, assignments, and/or learning goals.

GENERATIVE AI IN CURRICULUM, COURSES, AND ASSIGNMENTS

Curriculum

  • At the curriculum level, programs, departments, faculties, and the university work with stakeholders to develop, assess, and refine curriculum.

  • The Centre for Educational Excellence actively supports curriculum mapping with respect to addressing generative AI, with a focus on overall educational priorities and goals, like those outlined by Dr. Kathleen Landy (2024): 

  • What do we want students in our academic program to know and be able to do with (or without) generative AI?

  • At what point in our academic program – that is, in what specific courses – will students learn these skills?

  • Does our academic program need a discipline-specific, program-level learning outcome about generative AI?

Courses

  • Permitting, or even encouraging, students to use generative AI as a tool that supports ongoing learning and the completion of assignments should be clearly articulated in the course. Creating opportunities and mechanisms for students to disclose their use of these tools builds openness and transparency.

  • Creating scaffolded assignments in which students critically analyze and reflect on their use of generative AI tools encourages them to describe their learning process and show how they are developing skills to assess, critique, and draw conclusions from information.

  • Including or refining course goals related to developing generative AI literacy can help students make explicit connections between their course work and skills that they will continue to use in their university careers and beyond.

Checking in with students about how they use generative AI can reveal critical and creative applications. Dr. Zafar Adeel of the School of Sustainable Energy Engineering notes that allowing the use of generative AI tools, with attribution, helped students develop critical questions:

"I told my class they were allowed to use ChatGPT for take-home assignments and that they would not be penalized as long as they identified how they are using it. I saw a couple of trends. One was that, overall, relatively few people used ChatGPT or admitted to using it. Another trend I noticed was that there were one or two students who were using ChatGPT a lot more innovatively. They weren't just plugging in the assignment problem, and copying and pasting the answer, but they were asking more probing questions and generating more in-depth responses, which I thought was quite interesting".

Computer Science instructor and AI researcher Parsa Rajabi shares his philosophy regarding student use of generative AI in his course syllabus:

"I view AI tools as a powerful resource that you can learn to embrace.The goal is to develop your resilience to automation, as these tools will become increasingly prevalent in the future. By incorporating these tools into your work process, you will be able to focus on skills that will remain relevant despite the rise of automation. Furthermore, I believe that these tools can be beneficial for students that consider English as their second [language] and those who have been disadvantaged, allowing them to express their ideas in a more articulate and efficient manner".

Assignments

  • As with any assignment design, choosing to incorporate or prohibit use of generative AI in an assignment should begin with careful consideration of the course and educational goals.

  • Demonstrating ethical and effective use of generative AI tools as they specifically relate to assignments can help students make progress towards learning goals and allow them to develop critical thinking skills as they create and refine prompts for LLMs (large language models) like ChatGPT (see the sketch below).
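
For instructors who want to demonstrate prompt refinement from code (in a computing course, for example), a minimal sketch follows. It sends a vague prompt and a refined prompt to an LLM and prints both responses for comparison; it assumes the OpenAI Python client, an OPENAI_API_KEY environment variable, and the model name "gpt-4o-mini", all of which are placeholders to be swapped for whichever tool your course actually permits.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    def ask(prompt: str) -> str:
        """Send a single user prompt and return the model's text reply."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    # A vague first attempt, then a refined version of the same request.
    draft_prompt = "Tell me about urban heat islands."
    refined_prompt = (
        "In about 150 words, explain the urban heat island effect for a "
        "first-year geography student, list two mitigation strategies, and "
        "note one limitation of your answer that the reader should verify."
    )

    print(ask(draft_prompt))
    print("---")
    print(ask(refined_prompt))

Comparing the two outputs in class makes visible how much the framing of a prompt shapes the accuracy, scope, and verifiability of a response.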

Dr. Leanne Ramer, SFU senior lecturer in the Department of Biomedical Physiology and Kinesiology, describes how she uses generative AI as a starting point for students working on research assignments and to demonstrate her process as a researcher:

"I show students how to use [ChatGPT] to generate ideas for research papers. It can provide a very high-level summary of a field and help us draft a strategic search in an academic database. We experiment with asking the right questions and using the right prompts – and this is crucial: leveraging AI effectively is a craft that students need to hone".

ChatGPT can also be used to help students practise metacognition and learn how to learn. As apprentices in their disciplines, students rely on faculty to make their strategies and thinking for tackling content explicit, and [generative AI] can be leveraged in this process.

Geography lecturer Dr. Leanne Roderick describes how she designed an assignment that offered students the opportunity to integrate generative AI into their work:

"I provided my students with a generative AI assignment option. In the assignment, I provided the prompt and their task was to improve the content so that it more accurately reflected the course material and demonstrate to me what strategies they used to do so. They were still working with the content, so their learning outcomes were the same—it was just a different route to them. In fact, some of the students said they wished they would have just answered the questions themselves because that would been easier".

Contact CEE today to learn more about how we can support your work with generative AI in teaching and learning.

REFERENCES

  • Landy, K. (2024, February 28). The next step in higher ed’s approach to AI. Inside Higher Ed. https://www.insidehighered.com/opinion/views/2024/02/28/next-step-higher-eds-approach-ai-opinion

  • Martineau, K. (2022, August 7). Shrinking deep learning’s carbon footprint. MIT News. https://news.mit.edu/2020/shrinking-deep-learning-carbon-footprint-0807

  • Mollick, E. R., & Mollick, L. (2023, September 23). Assigning AI: Seven approaches for students, with prompts. SSRN. https://ssrn.com/abstract=4475995 or http://dx.doi.org/10.2139/ssrn.4475995

  • Pasick, A. (2023, March 27). Artificial intelligence glossary: Neural networks and other terms explained. The New York Times. https://www.nytimes.com/article/ai-artificial-intelligence-glossary.html

  • University of Oxford. (n.d.). Use of generative AI tools to support learning. https://www.ox.ac.uk/students/academic/guidance/skills/ai-study#:~:text=Generative%20AI%20tools%20can%20be,through%20teaching%20and%20independent%20learning