NEHA March 2024 Journal of Environmental Health


Even though there are challenges with proving plagiarism, another downfall of ChatGPT is that it makes mistakes. These mistakes include fabricated sources and outright misinformation. Identifying these mistakes requires considerable additional faculty time to assess student work. For us to be sure students are doing their own work, we now must consider checking every reference in a term paper, confirming that any data students present are accurate and valid, and comparing their writing to other work they have done in class. Some of you may think that this process is how we should always be evaluating student work, but when there are 30, 10-page research papers or essays to grade in 48 hours at the end of the semester, it is just not possible. It also means that we will spend more time on the mechanics of the paper rather than its substance.

The challenges with verifying the originality of writing assignments have led some faculty to completely redesign their courses, which includes eliminating writing assignments altogether in favor of multiple-choice tests. Since multiple-choice tests, however, can also be completed with the use of AI, some faculty are heading back to in-person exams with scantron sheets. Recalling the impact of pencils and mimeographs on teaching decades ago, students are once again using this handheld "technology" in an age of digital handheld technology.

When we address the ethical and accuracy issues of ChatGPT, we shift assignments away from analytical writing to assignments that require memorization rather than critical thinking. When we make this shift, there are implications for the workforce. Many students who graduated before generative AI often needed additional on-the-job training to write and communicate effectively, and this situation may become more dire. Recent graduates might have not only limited experience with interpersonal communication because of alternative internships but also limited experience in written communication because of AI.

There is no doubt that universities are not keeping up with rapidly evolving AI. It is possible for universities to use AI in many ways, such as predicting drop-out risks, grading essays, and providing study assistance. For universities to effectively use this technology, however, they must develop the infrastructure and train faculty and staff. While it appears that universities believe AI is important, very few have a strategy for integrating it into the educational environment (O'Dea & O'Dea, 2023). So, faculty are left to their own devices to determine how their classes will or will not use AI.

Workforce Impacts

Environmental health academic programs across the country are facing the same challenges that most academic programs are facing when it comes to AI in general and generative AI specifically. While we are figuring out how we can most effectively use this technology to educate emerging professionals, we need input from practitioners. We have to be aware of workforce applications and expectations so we can carefully construct our curricula, courses, and learning outcomes to provide experiences for students that will address these needs. Some of us envision scenarios in which environmental health professionals rely heavily on this technology for day-to-day communication. Others are concerned that AI might affect the qualifications currently required to become a registered environmental health specialist. All of us are contemplating the ethical implications of these powerful tools.

Change in higher education is challenging, especially when, just like the workforce, faculty are aging. An aging faculty workforce is happening across the country as so many universities shift from tenure-track positions to part-time or instructional faculty positions. Young, qualified applicants are discouraged from applying for these positions because of low pay, limited benefits, and a lack of job security.
In many EHAC-accredited programs, there are one or two faculty members responsible for delivering the entire curriculum. In some cases, faculty coordinating these programs might delay retirement due to the perception that university administration will use this departure to close the program. This situation is not optimal for students, but it does open the door for working professionals to become teachers. It also complicates the integration of AI in courses.

Less than 20 years ago, many of us could not have imagined that every student would always have an "answer machine" in their pocket. Generative AI takes this machine to new levels by not only answering questions but also generating content that applies in the classroom and in the field. The rapid advancement of this technology is either making learning better or worse, depending on who you talk to. We cannot, however, ignore it as a profession, so we must engage in productive discussion that includes educators and practitioners.

Corresponding Author: Michele Morrone, Professor, Environmental Health Science and Chair, Department of Social and Public Health, Ohio University, West 355 Grover Center, Athens, OH 45701. Email:

References

McMurtrie, B., & Supiano, B. (2023, December 11). ChatGPT has changed teaching. Our readers tell us how. The Chronicle of Higher Education. https://www.chronicle.com/article/chatgpt-has-changed-teaching-our-readers-told-us-how

O'Dea, X.(C.), & O'Dea, M. (2023). Is artificial intelligence really the next big thing in learning and teaching in higher education? A conceptual paper. Journal of University Teaching & Learning Practice, 20(5), Article 05.

Ramsey, J.L., & West, R.E. (2023). A recent history of learning design and technology. TechTrends, 67, 781–791. https://doi.org/10.1007/s11528-023-00883-5

Did You Know?

You can access back issues of our Journal online. We have provided open access to the current electronic versions of our Journal, as well as PDFs of past issues going back to 2012. No login is required, and you no longer need to save links or access issues through our online store.


Volume 86 • Number 7
