Strategy for Appropriate Use of Generative AI by Students


With the rapid emergence of generative artificial intelligence tools such as ChatGPT, many in academia are trying to determine how best to enable their students to gain experience with this increasingly valuable asset without enabling them to have the technology complete assignments for them, a practice tantamount to academic dishonesty.

That’s the subject of a study by Western Carolina University College of Business faculty members and a colleague from East Carolina University’s College of Business who have developed an instrument that they believe can be used to help students understand how to use AI effectively and appropriately in their academic pursuits – and beyond.

The study is co-authored by Lorrie Willey, professor of business law at WCU; Barbara Jo White, professor of computer information systems at WCU; and Cynthia S. Deale, professor of hospitality leadership at ECU. Through their research, Willey, White and Deale conceived what they call the “AI prompt development life cycle,” a four-phase model designed to introduce and enhance prompt-writing processes and skills by students in their use of generative AI.

Their article on the research, titled “Teaching AI in the College Course: Introducing the AI Prompt Development Life Cycle (PDLC),” was published in October 2023 in Volume 24 of Issues in Information Systems, a journal of the International Association for Computer Information Systems (IACIS).

In their study, White, Willey and Deale describe the four phases of prompt development. The planning and analysis phase includes identifying the subject, establishing phrases and keywords that describe the subject, determining the desired output and pinpointing available tools. The development phase consists of creating and refining the AI prompt based on that planning and analysis. The testing phase centers on running the prompt and collecting the output it generates. The evaluation phase determines the appropriateness of the output and checks for accuracy and bias.

The researchers believe their prompt development life cycle is a solid first step in the brave new world of generative artificial intelligence and its potentially disruptive influence in educational settings, where tools such as ChatGPT, an AI-powered chatbot, create human-like text based on prompts, previous conversations and user input.

“The prompt development life cycle was developed as a tool any faculty member can use to help students learn how to effectively use AI,” Willey said. “AI is not a search engine, and there are techniques in the prompting process that, when learned, can assist in outcomes that meet the students’ requirements.”

White said the PDLC can be useful in higher education as a way to teach the strategic process of writing effective AI prompts. “You can find prompting guides online, but they often emphasize different techniques and technical aspects of prompting. Our model highlights the processes, which are different from techniques,” she said. “For example, the process of decision-making is different from a technique, like brainstorming, which can be included as part of a larger decision-making process.”

The trio became interested in generative AI because of its constant appearance on the nightly news and in the headlines beginning in late 2022. Readers can’t skim an edition of Inside Higher Ed or The Chronicle of Higher Education without seeing the subject of the impact of AI somewhere in the publication.

“One hallmark I associate with being a professor is being curious and keeping up with the news,” White said. “Starting in November and December of 2022, generative AI and ChatGPT were all over the news. I also had a student come to me during final exams and describe a program they were using to help with their writing. I was pretty sure they were talking about ChatGPT, so I started looking into it.”

The story is the same for ECU’s Deale. “Like Barbara Jo, I became curious about AI from the news,” she said. “After hearing about ChatGPT on the news, I wanted to see what it could do for me – as a professor and for my students. I also had a student mention that they had ChatGPT on their phone, and I became very curious about how it can be used in positive ways in higher education.”

Much of the buzz surrounding generative AI was its potential to become a disruptor to traditional educational processes, Willey said. “AI allows students to complete assignments, papers and other work through a technological means rather than through their own efforts. That’s a major concern,” she said. “On the other hand, AI is a technology used in business and industry and is a valuable tool to which students need to be exposed. The questions for faculty are ‘When do I want to use AI and when do I want to avoid it?’ and ‘If I want students to use the tool, how can I help them use it effectively?’”

While AI tools can help students obtain answers to essay questions and create other written products, their use also can diminish what students learn through the process of creating self-written products, Deale said. “Furthermore, while it can create answers, it may not be a very useful research tool for students when they need to cite relevant sources for their work. And, students may rely on the natural-sounding answers created via AI and then not learn as much in terms of depth about various topics.”

Much of college life revolves around instructors assigning a paper or other written project, with the expectation that the student will do the work, not technology, Willey said. “The obvious drawback to AI is plagiarism and the ability to complete coursework without individual effort and study. Online testing is also a concern. Students could use AI to respond to test questions, and it would be difficult for faculty to identify that conduct. There is no clear solution to this other than faculty rethinking assignments and teaching methods to accommodate the concerns about AI.”

One factor to keep in mind, White said, is that people in all types of work settings, including higher education, are trying to figure out how AI can help with daily tasks. “A student’s job is going to school, learning and demonstrating what they know,” she said. “Faculty don’t want artificial intelligence to replace student intelligence. But at the same time, we need to make students aware of prompting skills.”

That’s because part of the role of higher education is preparing graduates to enter the workforce, the researchers said. “People who have AI prompting skills and experience with AI are likely to get jobs faster and get better jobs,” White said. “Microsoft made a $10 billion investment in AI and is integrating it with Office 365 applications like Word, Excel and PowerPoint. Hopefully, universities will make this version of Office available to students and faculty.”

AI also can be used appropriately for individual tutoring, research, resume and cover letter preparation, study guides, initial outlines for written assignments and the completion of other day-to-day tasks that are routine and time-consuming, the researchers said.

With the PDLC as a building block, higher education must take additional steps in evaluating and managing the use of AI in all its forms, they said, including administrative support through the development of more robust testing centers with stronger academic integrity assurance processes.

“But, we cannot ignore generative AI and just try to ban student use of the tool. It will not work,” Willey said. “Faculty will need to make decisions about coursework in terms of when it is appropriate and when it is not appropriate for students to use generative AI. Lastly, students will use AI in the workplace, and we must expose them to the tool and help them learn to use it efficiently.”

Deale said she finds it interesting to see the policies evolving regarding AI in higher education. “I think that people will probably continue to have differing views about the use of AI,” she said. “In higher education, I think there may be some instructors and professors who want to ban it, others who may want to allow students to use AI and ask the students to mention its use, and perhaps others will allow the use of AI freely.” Deale points to a recently developed AI policy at ECU, which requires faculty to include the policy as part of their course syllabi.
