
AI in education

The latest generation of artificial intelligence (AI) can use natural language to answer complex questions and carry out complex tasks. OpenAI launched the ChatGPT chatbot in late 2022, which has caused a stir in the world of education and is a cause of concern for many. What could AI in general, and ChatGPT in particular, mean for education? What are the opportunities and limitations of this form of AI?

We have brought together all the relevant information on this page. If you have suggestions for these pages or you think there is information that is missing, please contact Strategy and Academic Affairs.

If you are looking for support with using ChatGPT in your teaching, please contact your faculty’s teacher support desk. Specific questions at course level about what is and is not allowed in relation to AI in education are primarily for the teacher and the board of examiners to assess, and cannot be answered without the context of the course, the learning objectives and the assignment.

What is Artificial Intelligence (AI)?

It is difficult to provide an all-encompassing definition of AI, because the discipline is developing all the time. According to the National AI course, AI has at least two characteristics: it consists of intelligent systems that (1) can perform tasks independently in complex environments and (2) improve their performance by learning from experience. Spotify is a familiar example of AI: the platform suggests music based on the tracks users have already listened to. The personalised feed on Facebook is another example.

Artificial Intelligence in education

What opportunities does AI present for education? The Dutch AI Coalition (Leiden University is also a member of this alliance) describes this in its education manifesto:

  • Personalised learning: education that is better adapted to the student, with both better results and a better learning process.
  • Supporting teaching staff: providing holistic and informed insights.
  • Increasing the efficacy of digital teaching materials.
  • Improving the methods used to assess knowledge.
  • Reducing the workload of teaching staff by using AI to support tasks.

One example of AI in education is FeedbackFruits. This is an online platform for collaboration, communication and blended learning. The platform contains different tools that facilitate interaction between students and lecturers (and among students).

Experts do not expect AI ever to take on the role of a teacher in education. What seems more likely is a search for the best possible blend of human and artificial intelligence in education.

ChatGPT

ChatGPT is a user-friendly technology that is both an opportunity and a threat to education and society as a whole. There are many new opportunities for using ChatGPT (and other forms of AI) in education, as a form of educational innovation. But there are also risks, including plagiarism and fraud. Below is a list of frequently asked questions and answers about ChatGPT.

FAQ

ChatGPT is an AI-based chatbot. In response to a ‘prompt’ (a question or command from a user), it generates a text based on an enormous dataset of text material, including from the internet. The user can specify the form and length of the text. The program generates these texts by making a statistical prediction of what the most likely next word in the text would be. That prediction is based on previously used phrases or sentences.

ChatGPT is an advanced version of a text prediction tool. It is not a source of factual information. ChatGPT regularly makes mistakes, so cannot be seen as a reliable source for academic texts.
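
To make the idea of ‘predicting the most likely next word’ more concrete, here is a deliberately simplified sketch in Python. The word-probability table is invented purely for illustration; ChatGPT works with sub-word tokens, a large neural network and far more context than a single previous word, but the basic loop of repeatedly picking a plausible continuation is the same.

```python
# A toy illustration of 'predict the most likely next word'. The probability
# table below is invented for this example; it is not how ChatGPT is built,
# but the generation loop follows the same idea.
import random

# Hypothetical table: P(next word | current word)
NEXT_WORD = {
    "the": {"student": 0.5, "lecturer": 0.3, "essay": 0.2},
    "student": {"writes": 0.7, "submits": 0.3},
    "lecturer": {"reads": 0.6, "grades": 0.4},
    "writes": {"an": 1.0},
    "submits": {"an": 1.0},
    "reads": {"the": 1.0},
    "grades": {"the": 1.0},
    "an": {"essay": 1.0},
    "essay": {},  # no continuation: stop generating
}

def generate(prompt: str, max_words: int = 10) -> str:
    """Extend the prompt word by word, sampling each continuation
    according to its (made-up) probability."""
    words = [prompt]
    for _ in range(max_words):
        options = NEXT_WORD.get(words[-1], {})
        if not options:
            break
        words.append(random.choices(list(options), weights=options.values())[0])
    return " ".join(words)

print(generate("the"))  # e.g. "the student writes an essay"
```

Note that nothing in this loop checks whether the generated sentence is true; it only checks whether it is statistically plausible. That is, in essence, why a fluent ChatGPT answer can still be factually wrong.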

ChatGPT’s possibilities can be roughly divided into three categories:

  1. You can ask a question, and ChatGPT will answer it. This is similar to giving a command to Google. The difference is that Google produces sources from which you have to extract the information yourself. ChatGPT is more efficient and produces a written answer. The disadvantage is that it is not easy to verify whether the information is correct and what it is based on. 
  2. You can ask for a piece of text, e.g. an outline or a reflection on a certain topic. You can specify the length and writing style of the piece, and you can also ask for sources to be added. However, as ChatGPT often makes these sources up itself, this is not recommended. See What are the limitations of ChatGPT?
  3. You can enter a piece of text and ask the chatbot to summarise it, paraphrase it, translate it, correct spelling mistakes, give feedback, etc.

ChatGPT is owned by the company OpenAI, which is backed by several investors. OpenAI started as a non-profit organisation but later became a company in order to attract investment for development. Microsoft, for instance, made a hefty investment in the company to be able to integrate OpenAI technology into its services as well. The name ‘OpenAI’ may give the impression that ChatGPT is open-source software, but it is not. Open source means that the source code and data are publicly available and anyone can use them to create their own version. That is not the case here.

OpenAI has made ChatGPT available in the form of a public test version. Anyone can create an account and use ChatGPT. By making the model available, the company learns all kinds of things about how to use it in practice, how to improve the model, what measures can be taken against unwanted use, etc. Everyone who uses ChatGPT is therefore participating in this OpenAI test.

It is expensive to make ChatGPT available, but it also generates an enormous amount of data. The question is whether ChatGPT will remain openly available. The company can charge for the use of ChatGPT in business applications, such as mail programs or search engines. There are also subscriptions for individual users. Subscribers will always be able to use the chatbot, while those using the free service may not have access at busy times.

ChatGPT is a web application. All the questions asked, all the answers given and all further interaction with the website are recorded by the company OpenAI. The company also collects visitor data, such as visitor statistics, log-in times and internet browser data. OpenAI may use any data entered by users to improve the language model behind ChatGPT; the privacy policy states what data is involved. Entering privacy-sensitive information in ChatGPT, and in any other web application for which no agreement has been concluded with Leiden University, is not permitted. For example, a teacher may not instruct students to process privacy-sensitive data in ChatGPT, or enter texts that contain a student’s contact details. To deal with this, teachers can create accounts that students can use for the duration of the course. Students may also create an account with an impersonal email address.

Experts are concerned about the consequences of AI in education, particularly the use of ChatGPT. It is very simple for students to have a complete text produced by the chatbot. For a lecturer, it is not always easy to tell the difference between a text produced by a student and a text generated by ChatGPT.

AI will continue to develop further in the coming years, which makes it all the more important that we learn to handle it in education. ChatGPT will have to be regarded as a tool. The answers it gives, for example, are not always correct or strong in terms of content. Nor does it incorporate the student’s own opinion. This makes getting students to write reflective essays, for example, a good assessment method.

It is important to discuss this with colleagues. Try the tool out yourself to see what it can do (also see this article by LLInC with tips for trying out the tool). Discuss the opportunities and threats with one another. It can also help to talk to students. Are they familiar with the tool? And are they already using it? What opportunities does it offer? And are they aware that this development also has its drawbacks?

Developing skills related to AI

Whether and to what extent ChatGPT (and other AI) is or will become a functional skill (i.e. necessary to achieve results within a discipline), depends on developments within that discipline.

For students, it is important that they develop the critical skills needed to evaluate, interpret and identify the limitations and risks of ChatGPT and other AI. This requires skills such as analysing and social awareness. These skills are covered in all our degree programmes via Leiden University’s 13 shared transferable skills. In addition, alongside thoughtful assessment, students will be expected to demonstrate ethical awareness and integrity. The skills of researching, collaborating and reflecting contribute to this.

A tool like ChatGPT can be used as a resource to come up with creative ideas and improve teaching and assessment. For example, lecturers can have ChatGPT write an essay alongside students. They can then compare the outcome: What can you learn from the AI model and what flawed reasoning does the chatbot use? Another possibility is to have students reflect on using AI tools themselves. What does and does not work well? What moral issues are there?

The use of ChatGPT may also shift the emphasis to (peer) feedback discussions with students, mid-term and final presentations and asking students about what they have done and learned (self-reflection). This may increase the workload, but it will benefit the authenticity, personalisation and inclusiveness of the teaching.

Incorrect sources

ChatGPT uses probability, based on numerous sources, to generate answers. It does so when compiling pieces of text, but also when producing a simple factual answer to a question. As a result, you can never know exactly what sources have been used for the output. This means ChatGPT’s answers are not transparent and are difficult to verify, and it also makes accurate and complete source citations impossible. If you ask ChatGPT to provide the sources used to generate an answer, it often makes up non-existent sources. You can ask for existing sources or peer-reviewed sources, in which case you will usually get relevant sources, but they will not contain all the data on which an answer is based.

Falsehoods and prejudices

One of the disadvantages of ChatGPT is that it sometimes generates very convincing falsehoods. This is because the software is trained with huge amounts of data taken from the internet, where incorrect information can also be found. In addition, there are data biases. ChatGPT’s predecessor (GPT-3), for example, clearly made gender-stereotypical associations in its answers. OpenAI uses filters and human control to extract obvious untruths and the worst biases, but this still remains an inherent limitation.

Outdated information

The current version of ChatGPT (January 2023) has only been trained on data up to the end of September 2021. This makes it unable to interpret more recent data. This limitation is likely to be short-lived, given the speed of developments.

Calculation errors

Last but not least, besides content errors, the chatbot also makes calculation and logic errors. This has to do with how ChatGPT generates answers. It is a language model, not a calculator: when asked a mathematical question, it generates text that looks like an answer, and the number it produces is a plausible-sounding prediction rather than the outcome of an actual calculation.

If students use ChatGPT for a written assignment without mentioning this, it is deemed to be cheating. If a student makes extensive use of AI for a written assignment and the person marking it cannot see which parts were written by the student and which by the AI, the work is difficult to assess and this is therefore considered to be cheating. See also the Rules and Regulations of the Board of Examiners of your programme for more information, available on the regulations site.

Not allowed:

  • Any form of literal copying and copying without full attribution (citation, reference) of any material generated by AI.
  • Any use of AI that prevents the teacher from checking and/or evaluating the student's work.
  • Any use of AI during exams or other evaluations where it is indicated that the use of AI is not allowed.

You can’t always detect if ChatGPT has been used. This is one reason why ChatGPT is of such concern in the world of education. If a report is written in a different style from a student’s usual one, this may point to a ChatGPT text. It is worth noting that ChatGPT does not think for itself: it can only create real-looking texts, so the texts may contain factual errors. In addition, the language model is unable to check sources. If you ask ChatGPT for sources, in many cases it will come up with sources that seem real but do not actually exist.

The texts the software generates are unique, so they cannot be found on the internet. This means that standard plagiarism checkers do not work. Leiden University does not currently use software that can detect whether a chatbot has been used, because the results are not (yet) reliable enough; this also applies to the Turnitin tool. Turnitin integrated the AI Writing Detection Tool in FeedbackStudio in April 2023, but as the chance of a false positive or false negative is high with this tool, the decision was taken to deactivate it as of 16 June 2023.

The following tips will help students learn from their writing assignments and make it more difficult to rely on a program like ChatGPT:

Test your assignment first

Use ChatGPT to test the assignment yourself and see how easily it can be completed with AI.
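
Teachers who are comfortable with a little scripting can also run this test programmatically. The sketch below is only an illustration: it assumes the official openai Python package and an API key stored in the OPENAI_API_KEY environment variable, and the model name and assignment text are examples. Pasting the assignment into the ChatGPT web interface works just as well, and the usual rule applies: do not include privacy-sensitive information.

```python
# Minimal sketch: feed an assignment prompt to a language model and inspect the
# result. Assumes `pip install openai` and an OPENAI_API_KEY environment
# variable; model name and assignment text are examples only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

assignment = (
    "Write a 300-word reflection on the ethical implications of using "
    "AI chatbots in university education."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name
    messages=[{"role": "user", "content": assignment}],
)

# Would this output pass as student work for your learning objectives?
print(response.choices[0].message.content)
```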

Agree with students on two basic principles

Emphasise that students must write in their own words. Agree on the following two basic principles:

  1. As a student you are fully responsible for what you submit;
  2. As a student, you ensure that you have written the paper yourself and included the correct references, so that the teacher is able to evaluate which competences you have acquired.

Ask students to explain their work

Talk to students about the work they have submitted. Discuss how they reached their reasoning, analyses and conclusions. Announce that such discussions are part of a new (spot check) working method.

Ask for sources

ChatGPT makes no distinction between sources: it may use data from a scientific article, a satirical column by a comedian or a blog by a conspiracy theorist. It uses all these interchangeably. There is no direct link between the generated text and the collection of texts on which ChatGPT bases its prediction, making it impossible to provide a source.

Address topics about which no information is available online

AI works with online sources, which are often neither ‘live’ nor ‘complete’, limiting the AI’s search range. Use tests from practicals, reflections on a working group on conversation techniques, a case you’ve come up with yourself, etc. Without that content, AI will be unable to answer a question like: ‘What would have happened if the variable X was replaced by Y in the practical on ‘A’?’

Adapt the assessment method

  • Avoid knowledge questions, standard algorithms, conclusions, technical analyses and the like for essays and take-home exams. Set these questions via an exam with (online) invigilation.
  • Get students to explicitly answer questions about preparing, structuring ideas and working towards a conclusion when writing an essay. You can also combine the essay with a presentation or oral discussion. Carry out assignments during lessons under direct supervision and peer review. Consider interviews or oral explanations or tests.
  • For assessments, consider making written assignments about topics that are very specific to the students themselves: for example, their (voluntary) work or the neighbourhood where they live. The more specific the topic, the more difficult it is for ChatGPT to use its background data to produce a readable text.
  • Ask students for their personal opinion or reflection. Consciously include these elements in your assessment criteria and give them a heavier weighting than knowledge questions.
  • Ask about recent developments. So far, ChatGPT knows nothing about the world after 2021, but this is expected to change rapidly.