UIC Pharmacy takes on AI’s promises, pitfalls, and possibilities

Generative artificial intelligence (AI), programs like ChatGPT that can create new content, has swept into industries globally in recent years. Several UIC faculty and students are working to help the UIC Pharmacy community take on the new technology’s promise and risk.

AI Research and Training

In the spring of 2024, Dr. Faria Munir and Dr. Heather Ipema, clinical assistant professors, led a project in which two students served as AI educators for faculty. The 15-session series aimed to give students pedagogical and research experience while exploring technology training that wouldn’t further tax faculty. As a younger, digital-native population, UIC’s pharmacy students proved ideal candidates for the task.

“We wanted to inform our . . . drug information faculty more about artificial intelligence since it is becoming such a hot topic, not only in education but healthcare in general,” said Munir.

The approach was “an incredible opportunity not only to be involved with the faculty, but also to learn more about AI ourselves,” said Elma Abdulbaki, PharmD ’25, who conducted the sessions with Zeba Saiyad, PharmD ’26.

Topics ranged from prompt engineering to AI tools and regulations. Each session included discussion so attendees could ask questions and explore how AI might affect pharmacy, Saiyad said.

In before-and-after surveys, which formed the program’s research component, faculty seemed receptive to the ideas. Of the 12 faculty who attended at least one session, most completed the surveys, which showed increased knowledge of AI and decreased anxiety about it. For faculty who’d never used AI, “not only did they realize how much easier it is to use [than they’d thought] but the perceived risk they realized was not as high or . . . was more fueled by misinformation and general fear,” Abdulbaki said.

Even educators beyond UIC have been impressed. Saiyad and Abdulbaki presented posters of their results at the 2024 UIC Retzky College of Pharmacy Research Day and the American Society of Health-System Pharmacists Midyear Conference. Afterward, a professor from another school shared that “she found it incredibly encouraging to have students be the ones to teach and report back to the faculty,” Abdulbaki said.

Saiyad and Abdulbaki also coauthored—with Munir and Dr. Jennifer Phillips, clinical professor and Drug Information Group director—an article in the Illinois Council of Health-System Pharmacists “KeePosted” newsletter surveying AI policies and statements by healthcare organizations. “The membership was really trying to figure out what to do with AI and how to feel about AI,” said Phillips, who edits that newsletter, and the November 2024 article aimed to provide “a landscape survey” of AI statements. The students also have a manuscript in the works.

Future Research and Educating Students

Munir, Phillips, Ipema, and Dr. Kathy Sarna, clinical assistant professor, also secured a grant to study ChatGPT as a writing-evaluation tool. They’ll compare the technology’s performance with that of human graders on longer pharmacy papers. “We have historical results of when our faculty have graded these papers,” Phillips said. “We’d like to see how ChatGPT” does.

The project will additionally investigate ChatGPT’s potential for helping pharmacy students improve their writing skills.

Also in spring 2024, Munir and Phillips added a generative AI unit to two core drug information courses. The unit included a live demonstration asking AI about an FDA safety alert, information that was not yet in ChatGPT but was available in more traditional resources. “We used that to plant the seed that this is not something that you should be utilizing to answer clinical questions,” Phillips said, “and that there are more reputable references.” Integrating AI into the classroom will only grow in significance, said Saiyad, who took one of the classes. “It’s really important that faculty . . . utilize ChatGPT in our learning since it is something we have to deal with now and it’s going to be present in our future learnings.”

Putting AI to Use

Many at UIC Pharmacy have already taken advantage of AI’s potential, said Mary Sullivan Kopale, who presented on generative AI at a pharmacy faculty retreat. “I think a lot of faculty are using it [in ways] that we’re not even all aware of yet,” said Kopale, director of instructional design and learning innovation. “We’re just at the beginning. . . . I think it will be a great tool going forward.”

Kopale’s presentation covered several ways AI could save time in administrative and teaching tasks, such as devising patient cases for recitations. “AI is really good at that. You put in an example of the type of case you’re looking for and ask it to change the parameters,” she said. “And something that used to take faculty many hours . . . AI does it in a minute.”
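Kopale’s description maps onto a simple prompt-templating pattern. The sketch below is a hypothetical illustration of that pattern using the OpenAI API rather than anything the college actually runs; the library call is real, but the example case, prompt wording, and model name are assumptions, and any generated case would still need a faculty accuracy check.

```python
# A minimal sketch of the prompt pattern Kopale describes: supply an example
# case and ask the model to vary its parameters. Illustrative only, not UIC's
# actual workflow; the example case, prompt wording, and model name are
# assumptions, and an OPENAI_API_KEY environment variable is expected.
from openai import OpenAI

client = OpenAI()

example_case = (
    "A 58-year-old with type 2 diabetes and hypertension presents for a "
    "medication review. Current regimen: metformin 1,000 mg twice daily "
    "and lisinopril 10 mg daily."
)

prompt = (
    "Here is an example patient case used in a pharmacy recitation:\n\n"
    f"{example_case}\n\n"
    "Write a new case in the same format, but change the age, comorbidities, "
    "and drug regimen so the case tests heart-failure management instead."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any capable chat model would work
    messages=[{"role": "user", "content": prompt}],
)

# The draft case still needs faculty review for clinical accuracy.
print(response.choices[0].message.content)
```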

With all AI uses, of course, faculty need to check for accuracy, Kopale warned. Other use cases she shared included quiz writing; creating documents, such as agendas and reference letters; and “ask[ing] it to review your lecture and see if you’ve missed anything that would be usually taught. . . . You could ask it to reorganize it for you, make it more concise.”

For Dr. Kathryn Sawyer, clinical assistant professor, AI helps greatly by improving communication, such as fine-tuning instructions. “I just ask it to improve or clarify, or you can ask it to make it more engaging or humorous,” she said. “It doesn’t really change what I’m saying, but it definitely makes it clearer.”

Both Dr. Les Hanakahi, associate professor, and Dr. Adam Bursua, clinical assistant professor, use ChatGPT for crafting assessment questions, a first step into AI for many academic users, Bursua said. Most commonly, he said, “I give it questions that I’ve written . . . throughout the semester” and ask for similar questions for the final. “It does a really good job with that.”

In one unique usage, Sawyer has trained a custom ChatGPT to play a patient. This helped standardize model patients in her courses, versus the usual method of employing fourth-year students, whose responses may vary. One issue: although the specific ChatGPT instance is trained only on information Sawyer provides, it can still give away too much at once. “I’m still trying to fine-tune it to only say, ‘I take lisinopril,’ but it ends up saying, ‘I take lisinopril 10 mg by mouth every single day for blood pressure,’” she said. “It’s hard to get it to not do that.”
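In a custom GPT, that kind of restraint is usually imposed through the standing instructions (the system prompt) the model follows in every exchange. The sketch below is a hypothetical illustration of the idea, not Sawyer’s actual setup; the persona, rules, and model name are assumptions.

```python
# A hypothetical sketch of reining in a simulated patient so it answers only
# what the student asks, mirroring the oversharing problem Sawyer describes.
# Not her actual configuration; persona, rules, and model name are assumptions.
from openai import OpenAI

client = OpenAI()

system_prompt = (
    "You are role-playing a standardized patient in a pharmacy course.\n"
    "Persona: a 62-year-old with high blood pressure who takes lisinopril.\n"
    "Rules:\n"
    "- Answer only the specific question the student asks.\n"
    "- Never volunteer dose, route, frequency, or indication unless asked.\n"
    "- If asked what you take, reply only: 'I take lisinopril.'\n"
    "- Keep answers to one or two short sentences in plain, everyday language."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "What medications do you take?"},
    ],
)

# Ideally the reply is just "I take lisinopril." In practice the model can
# still drift, which is why Sawyer describes the tuning as ongoing.
print(response.choices[0].message.content)
```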

ChatGPT can also save faculty tremendous time in sifting through information from resumes for recommendation letters, Hanakahi said. “Our CVs get into the hundreds of pages, and I can say, ‘ChatGPT, show me just the slice of this enormous document that pertains to this type of service event’ . . . and it’ll condense it down into the three or four pages that are relevant.”

AI Dangers

Pharmacy users need to be keenly aware of the technology’s potential drawbacks, however. Those include hallucination, in which AI programs invent information that appears factual. “I saw a really scary one the other day, where it just gave the wrong information about a drug,” Bursua said. Only after a few corrections from Bursua, who’d queried about drug mechanisms, did the program get the answer right.

AI programs will also contradict themselves, even within the same answer, Hanakahi said. Overall, they can give you the wrong information very confidently, making it easy to be led astray, Bursua said. “It feels like you’re talking to somebody who knows their stuff,” he said. “There are no tells.”

All that underscores the need to verify the information and to ensure that the user has the proper knowledge base. “If you utilize it, [and] the information is wrong, you can cause harm,” Phillips said. “So it’s very important to validate the information in there. That is, I think, the number one thing.”

UIC faculty also cautioned that AI should not be used to make clinical decisions. Only a person with the appropriate clinical knowledge should do that. “It should not be used for clinical decision-making because the information is not referenced,” Phillips said. “I tell the students to think of this as a new person who you just met, and so when they say things . . . you can’t always take it at face value.”

For privacy and liability reasons, researchers also warned that patient and student information should not be uploaded into AI programs.

Bias in AI can become apparent in interesting ways. At the faculty retreat, Kopale asked ChatGPT to present a picture of a dean of a Midwestern college of pharmacy. “It looked very similar to our dean … a middle-aged, white man with facial hair,” she said. Meanwhile, several other nearby Midwestern pharmacy colleges, such as Roosevelt University, have female deans. “We have to remember that generative AI is based on the internet, and the internet has its own bias.”

Overall, pharmacy users should be wary of letting generative AI wholly compose content for them, as opposed to editing drafts about topics they understand well, Sawyer said. “You should not be using ChatGPT to write things [for you]. If you do, you’re not an expert,” Sawyer said. “You will have no idea if it’s wrong, if it’s right, but it’s going to sound super convincing.”

As the college embraces the future of AI in education, it is clear that the potential benefits are vast. “When integrated effectively into our curriculum, we believe AI can enhance knowledge delivery, improve student learning outcomes, and better equip graduates for career success,” says Dean Glen Schumock. With a balanced approach of embracing AI’s advantages while staying vigilant about its risks, the UIC Pharmacy community is well on its way to harnessing this transformative technology to better prepare the next generation of pharmacists.