
Faculty Guide to Generative AI in Higher Education

This guide discusses generative AI tools (GAITs), with a focus on the ethical use of these tools in academic work. Other major topics include using GAITs for teaching and learning, incorporating GAITs into graded assignments, and citation guidelines.


What goes in is what comes out!

GAITs were trained mainly on information from the open internet.  Many of those sources are biased, and GAITs may magnify that bias.

Additionally, bias in a human-provided prompt can carry through to a biased response from the GAIT.

What's your privacy worth?

Tech companies collect user input to further train their GAITs. Many of those same companies do not allow their own employees to use GAITs for work, for fear that confidential information will be compromised.

Recently, Zoom had to revise its terms and conditions regarding data collection for training its GAIT. The original language implied that Zoom could collect all data from meetings on its platform and use it to train its GAIT. The outcry over the implied breach of privacy for schools and corporations was strong enough that Zoom walked back the policy.


To preserve their privacy, students are encouraged to create a separate email account, not tied to their personal or school identity, when signing up for GAITs. If you require GAIT use in the classroom, create a general login for all students to share.

You're smarter than GAITs!

GAITs do not understand or analyze their output. Like their (much less advanced) auto-fill cousins, GAITs simply "guess" which word is most likely to come next, regardless of sense or accuracy. They are also known to "hallucinate," producing content that is inaccurate or entirely fictional.

For example, I once asked ChatGPT for recipes that used no more than 1 cup of milk, 1/4 cup of sugar, and 1/3 cup of oil. Even though milk was one of the ingredients, ChatGPT called all of the recipes "dairy free."

 

ChatGPT does not provide accurate citations! 

GAITs may provide citations when asked, but few (if any) can generate accurate ones at this time.


Ethical Academic Use

The first step in ensuring ethical academic use of GAITs in your class is to put your expectations in the syllabus. There are essentially three options:

  1. Ban all GAIT use.
  2. Allow use only for specific, designated assignments or tasks, with appropriate citation.
  3. Assume and approve use for all assignments, with appropriate citation.

Further information on syllabus statements appears in the How tab.

Employment Impact

GAITs require human workers to train the models, and the companies responsible for that training often exploit and traumatize those workers.

OpenAI, the company responsible for ChatGPT, used Kenyan workers making less than $2.00 per hour to train ChatGPT and DALL-E. Many of these employees have developed severe mental disorders, lost their families, and filed lawsuits over their treatment and the materials they were exposed to.


Additionally, companies want to use GAITs to reduce staffing and personnel costs. Many have already laid people off in anticipation of AI taking over those jobs. One organization, an eating disorder hotline, had to re-hire human staff after its AI gave damaging advice.

Environmental Impact

GAITs require significant resources to run, including water to cool the servers and many rare metals and minerals. The water use contributes to drought, and mining those metals and minerals damages the environment. Additionally, the amount of electricity needed to run GAIT servers is astronomical. These are just a few of the environmental concerns exacerbated by GAIT use.