Testing the Ability of ChatGPT to Write Effective Instructions
Submitter: Stuart A. Selber, Penn State U
——————————————————
The experiment:
To see how well ChatGPT might perform as a technical writer (AI evangelists claim that ChatGPT can automate the work of writers and editors), we purchased a prompt from PromptBase.com for generating instruction sets. We chose “Instructions Generator” because it claims to make it easy to communicate complex topics, and simplifying complexity is a hallmark goal of technical writing. After downloading the prompt, we replaced [INSERT SUBJECT] with “Flowflex COVID-19 antigen home test” because we wanted to try a procedure with real consequences for users. We then cut and pasted this one-shot prompt into ChatGPT. To analyze the results, students used the evidence-based guidelines in our technical communication textbook (Markel and Selber, 2021) for creating effective instructions. The guidelines include crafting an effective title, numbering the instructions, presenting the right amount of information in each step, using the imperative mood, separating steps and feedback statements, and including graphics. There are other conventions for the genre of instructions, but these are key elements.
Results:
On the positive side, ChatGPT decomposed the broad task into logical subtasks, and it used the imperative mood. We teach these important techniques routinely. But there were significant problems with the output. I will list just five of them:
- The headings were numbered, which is unconventional.
- The steps were bulleted, which is also unconventional.
- Users were not told how deeply to insert the nasal swab, so this step lacks specificity.
- Users should rotate the swab for 15 seconds in each nostril, not 10-15 seconds, so this step is inaccurate.
- Users were not told to swirl the nasal swab for 30 seconds in the extraction tube, so this step is missing.
Either of the last two problems could produce a false result. The output, of course, reflects the input, and there are many problems with the prompt. For example, the prompt does not account for a title (how-to and gerund forms are conventional), a preview of the task, a list of items users will need, or signal words (note, caution, warning, danger).
Despite these issues, ChatGPT produced a one-shot version that served as an instructive starting place for students. It illuminated both possibilities and problems, opened spaces for talking about prompt engineering, and allowed us to treat technology as both an educational subject and a platform for work. Students concluded that ChatGPT could serve as an assistant, not a replacement. I wholeheartedly agree.
Relevant resources:
- https://promptbase.com
- https://www.macmillanlearning.com/college/us/product/Technical-Communication/p/1319245005
Contact:
- Email: selber[AT]psu[DOT]edu
- Website: https://sites.psu.edu/selber