Artificial intelligence (AI) and your academic work

Updated August 8, 2024

Imagine if you had a friend who had a vast memory and was excellent at improv, but was a compulsive liar with no moral values. That person might be very entertaining, but you’d probably have a hard time trusting them. That’s a pretty good description of many generative AI tools.

Generative AI tools are essentially fast and powerful predictors. Large language models like ChatGPT quickly predict which word or phrase could come next, given what has come before and the associations they have learned from their training data. Knowing the basics of how a tool works can help you better understand its benefits and limitations.
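To make the idea of "predicting the next word" concrete, here is a deliberately simplified sketch in Python. It counts which word follows which in a tiny sample of text and then predicts the most frequent follower. A real large language model uses a neural network trained on vast amounts of text, not simple word-pair counts, so treat this only as an illustration of the prediction idea; the sample text and function names are invented for this example.

```python
from collections import Counter, defaultdict

# A tiny stand-in for "training data" (real models train on vast text corpora).
training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
)

# Count which word follows each word in the training text.
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the training data."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often in this sample
```

Notice that the prediction depends entirely on the patterns in the training data: the toy model "knows" nothing about cats or mats, only which words tend to follow which. The same is true, at enormous scale, of large language models, which is why they can confidently produce fluent text that is nonetheless wrong.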

To help put AI tools into context, check out the Headline Rephraser, which takes hype-filled news headlines and rewrites them more accurately. It's part of the AI Myths website developed by a Ph.D. researcher in philosophy and technology.

This brief guide discusses whether and how these entertaining but sometimes untrustworthy tools might play a role in your academic work.

Why do academic assignments exist?

You are required to do assignments in your courses so that you can practice what you have learned, and so that your professors can assess your learning. No matter how different they are, all course assignments share some things in common. Each assignment is

  1. a piece of original work, and
  2. completed wholly by the student (or students, in the case of group assignments).

Anything generated by artificial intelligence would not qualify as work completed wholly by a student. Just as you must ask your professor's permission to collaborate with other students on a solo assignment, you must also get permission to use an AI generator to complete an assignment.

AI and academic misconduct

It is an academic offence at Memorial University to submit work created through the unauthorized use of generative artificial intelligence (GAI) tools and present it as your own original work. This may include but is not limited to quizzes, tests, examinations, essays, research papers, computer code, and solutions to mathematical problems.

Outside of the university, anyone taking credit for something made by a generative AI tool would typically be violating the software's terms of use. In the case of ChatGPT, OpenAI's Terms of Use state, "You may not…represent that output from the Services was human-generated when it is not." Further, OpenAI’s Usage Policies specifically forbid academic misconduct: "We don’t allow the use of our models for ... Fraudulent or deceptive activity, including … Plagiarism… [and] Academic dishonesty."

AI: Citation, attribution, acknowledgement

We cite sources in academic writing for many reasons. One reason is to point to the person or people responsible for creating the evidence we are using to support our work. AI generators are unable to take responsibility for the work they produce, so the idea of citing AI-generated content raises some philosophical questions.

But citing sources is the standard academic way to acknowledge material borrowed from elsewhere. Therefore, you must acknowledge any use of AI-generated content in your academic work. You are not permitted to create work using unauthorized generative artificial intelligence (AI) tools and present it as your own original work.

If your instructor allows you to use AI in your coursework, the major citation styles offer guidance on citing AI-generated content.

AI: Errors and biases

It is important to remember that many popular generative AI tools are not reliable sources of information. Text generators like ChatGPT may sound very confident, but they are not trustworthy. For example, ChatGPT may generate "incorrect Output that does not accurately reflect real people, places, or facts," according to OpenAI's own Terms of Use.

AI “hallucinations” (when generative AI tools present fiction as fact) have been well documented. See, for example: "AI tools make things up a lot, and that’s a huge problem."

Generative AI tools also reproduce all the incorrect information, misinformation, disinformation, stereotypes, and biases that may already exist in their unfiltered training data. For example, check out this visually rich article reporting on the racial and gender stereotypes found in an analysis of more than 5,000 images created by generative AI: Humans are biased. Generative AI is even worse.

AI and academic literature

It is especially important to fact-check sources cited by generative AI. Popular AI tools often cite sources that do not exist, and even when the sources are real, the tool may wildly misrepresent or misinterpret their content. In general, you should not cite sources that you have not actually read yourself, and the same rule applies to sources suggested by generative AI: track down and read them before you cite them.

If you need help finding and using reliable academic sources, the Library can help: Ask us.

Tips for using generative AI for academic work

To ensure you maintain your academic integrity:

  • Seek permission from your instructor to use generative AI for academic work. For undergraduate courses, your instructor is required to include a statement in the course syllabus to clarify the permissible use of assistive tools, such as generative artificial intelligence, in the course. If you do not see such a statement, be sure to ask them about it.
  • Be transparent about how you have used generative AI in your work. In consultation with your instructor, you might include:
    • A narrative statement describing what tools you used and how you used them;
    • A record of the prompts you used and outputs you generated from the AI tool;
    • Citations, following the guidance of your citation style.
  • Be responsible. Remember that you are responsible for the academic work you submit. Confirm facts and recognize biases. When it comes to your academic work, don't ask genAI to do anything you wouldn't ask a person to do.

Ask us

If you have questions, please see the Libraries' Academic Integrity page or Ask us.