Artificial Intelligence – Harness the Power for Medical Education

"Artificial Intelligence is going to replace teachers!" Not today, Satan.

There have been a lot of rumblings in the news lately about a particular breed of artificial intelligence (AI) bots and services that are having an impact on education. One of these that is causing alarm is called ChatGPT (the GPT stands for Generative Pre-trained Transformer). Here’s how the developers of this service describe it:

"We’ve trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. ChatGPT is a sibling model to InstructGPT, which is trained to follow an instruction in a prompt and provide a detailed response."

Many secondary school systems are banning it outright as they feel it can help students cheat. They have apparently determined that some students are using it to write regular essay assignments and even college entrance essays. I tried it out myself to see how it works. For now, signing up for an account and using the service is free. There are subtle hints that eventually it will not be free or perhaps there will be tiers of access.

How could we use this type of technology in health professions education? You can pose questions to it such as "tell me about renal physiology". I just did this with ChatGPT and received this response:

"Renal psychiatry is the subspecialty of psychiatry that focuses on the mental health of individuals with kidney disease. People with chronic kidney disease (CKD) are at an increased risk for developing mental health disorders such as depression, anxiety, and cognitive impairment. Renal psychiatrists work with patients to address these mental health issues and also work with nephrologists to manage the overall care of patients with CKD. They also may be involved in research on mental health and kidney disease."

Obviously, the system misinterpreted my question: I typed 'physiology', not 'psychiatry'. When I posed the same question in the same system about two weeks ago, the results were different and more accurate - about five paragraphs of text explaining renal physiology. I asked one of our faculty members, an expert in this area, to review that response, and they said it looked accurate. They also said (paraphrasing), "and I become obsolete as an instructor" - so there is a bit of fear brewing when these types of technologies appear.
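
For readers who want to tinker beyond the chat window, the same kind of question can also be sent to the model programmatically. Below is a minimal sketch using OpenAI's Python client; the package version and the model name are my own assumptions for illustration, not something from the service description above.

```python
# A minimal sketch of posing a question to a ChatGPT-style model via OpenAI's API.
# Assumes the `openai` Python package (v1.x) is installed and an API key is set
# in the OPENAI_API_KEY environment variable; the model name is an assumption.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Tell me about renal physiology."},
    ],
)

# The reply comes back as plain text, much like the quoted answer above.
print(response.choices[0].message.content)
```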

I believe we will continue to need humans to teach our students, if only to guide them in how to use this type of technology and to help them understand its pros and cons. It can also be potentially quite dangerous: if a student in a clerkship used one of these AI systems to help produce a differential diagnosis, how could they be confident that what the system told them was accurate?

Combating the System?

This enterprising 22-year-old has developed a tool to help teachers determine whether a given piece of text came from a human or from AI technology: https://www.npr.org/sections/money/2023/01/17/1149206188/this-22-year-old-is-trying-to-save-us-from-chatgpt-before-it-changes-writing-for. His tool is called GPTZero, and you can access it (for free) here: https://gptzero.me/. This is how he describes it: “GPTZero turns the very technologies used to build ChatGPT around — to detect AI. It uses variables like perplexity to fingerprint AI involvement.”
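
To make the 'perplexity' idea a little more concrete, here is a toy sketch (my own illustration, not GPTZero's actual code) that scores a passage with an open-source language model. Text the model finds highly predictable gets a low perplexity, which detectors treat as one signal of AI involvement. It assumes the Hugging Face transformers and torch packages and the public GPT-2 weights.

```python
# A toy illustration of perplexity as a signal for AI-generated text.
# Assumes the `transformers` and `torch` packages; this is NOT GPTZero's code.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    # The model's loss is the average negative log-likelihood per token,
    # so exp(loss) gives the perplexity of the passage.
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss
    return float(torch.exp(loss))

print(perplexity("Renal physiology is the study of kidney function."))
```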

So, that’s good news (right?). We are not letting the technology run us (for now).

Here is something that came across an education-related mailing list to which I subscribe. It is for those faculty who feel they will be replaced by AI:

"As I watch the hand-wringing over ChatGPT and other systems of its ilk, I can't help thinking about previous times when disruptors changed the way we learn and store information:

  • Computers would replace teachers!
  • Typewriters, and eventually word processors, would mark the end of cursive handwriting!
  • Spell checkers would destroy learning to spell!
  • Calculators would mean the end of memorizing multiplication tables and weaken our mental math abilities!
  • The internet, Wikipedia, and Google would mean students could plagiarize like never before!
  • And on and on...

and how we have adjusted our norms and expectations accordingly over time." - attributed to Carine Ullom, Ed.D., Associate Dean of Instructional Innovation at Ottawa University via the EDUCAUSE Instructional Technology group.

Another AI tool that might be of interest to try out is DiffusionAI. With this tool, you type in a description of something and see whether it can come up with an image based on that description. I typed in "lungs showing severe deterioration due to smoking" and the result can be found here: https://comfaculty.uic.edu/wp-content/uploads/sites/42/2023/01/diffusion-example.jpg.
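
For the technically curious, the text-to-image step behind tools like this can be sketched with the open-source diffusers library and Stable Diffusion weights. This is my own illustration of the general idea, not the code behind the specific site mentioned above, and it assumes a machine with a GPU and the listed packages installed.

```python
# A rough sketch of text-to-image generation with an open-source diffusion model.
# Assumes the `diffusers`, `transformers`, and `torch` packages and a CUDA GPU;
# the hosted tool mentioned above may work quite differently under the hood.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "lungs showing severe deterioration due to smoking"
image = pipe(prompt).images[0]  # a PIL image generated from the text description
image.save("diffusion-example.png")
```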

FakeYou is one of many 'fake voice' simulators that have been in the news recently. This tool makes it easy for a user to mimic the voice of a real or fictional (cartoon) person: you select which voice, and even which language, you would like the fake voice to use, and then type in some text for it to speak. I asked the system to have Arnold Schwarzenegger as the Terminator say, "I would like very much to learn more about renal physiology." Here is the result: https://fakeyou.com/tts/result/TR:tey4hq69xnfh5a8hymkv14tfs3drw. Not great, in my opinion, but I have seen examples in the news that are quite alarming. Does this mean that in the future people could use a system like this to make it seem as though their faculty, administrators, politicians, etc. said something they did not? Here is Morgan Freeman saying "no more classes until May 2023!": https://fakeyou.com/tts/result/TR:cf019hd137d6w3p4s2z3kad96rmjk.
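
The basic text-to-speech step that these 'fake voice' sites build on can be illustrated with the open-source Coqui TTS package. This is a generic sketch of turning typed text into audio with a stock voice; it is not FakeYou's own service and does not clone a specific person's voice.

```python
# A generic sketch of the text-to-speech step behind 'fake voice' tools.
# Uses the open-source Coqui `TTS` package with a stock English voice model;
# this is not FakeYou's service and it does not imitate a particular person.
from TTS.api import TTS

tts = TTS("tts_models/en/ljspeech/tacotron2-DDC")
tts.tts_to_file(
    text="I would like very much to learn more about renal physiology.",
    file_path="renal_physiology.wav",
)
```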

ChatGPT and other similar tools are not going away anytime soon. In fact, Microsoft is investing 'billions' of dollars in OpenAI: https://www.cnn.com/2023/01/23/tech/microsoft-invests-chatgpt-openai/index.html.

There is so much to talk about in AI technology – more to come on how it is actually being used in health professions education (and has been for many years).

Some additional reading that might be interesting to you:

Duckworth, A., & Ungar, L. (19 Jan 2023). Op-ed: Don’t ban chatbots in the classrooms – use them to change how we teach. Los Angeles Times. https://www.latimes.com/opinion/story/2023-01-19/chatgpt-ai-education-testing-teaching-changes

Lee, J., Wu, A. S., Li, D., & Kulasegaram, K. (2021). Artificial intelligence in undergraduate medical education: A scoping review. Academic Medicine. https://journals.lww.com/academicmedicine/Fulltext/2021/11001/Artificial_Intelligence_in_Undergraduate_Medical.14.aspx

Metzler, K. (and ChatGPT). (7 Dec 2022). How ChatGPT could transform higher education. Social Science Space. https://www.socialsciencespace.com/2022/12/how-chatgpt-could-transform-higher-education/

Mills, A. (2022). AI text generators: Sources to stimulate discussion among teachers. https://docs.google.com/document/d/1V1drRG1XlWTBrEwgGqd-cCySUB12JrcoamB5i16-Ezw/edit#heading=h.r4nxfwgxmhae

Mitrano, T. (17 Jan 2023). Coping with ChatGPT. Inside Higher Ed. https://www.insidehighered.com/blogs/law-policy%E2%80%94and-it/coping-chatgpt

Palmer, K. (12 Jan 2023). How medical schools are missing the mark on artificial intelligence. Stat. https://www.statnews.com/2023/01/12/medical-school-artificial-intelligence-health-curriculum/

Schroeder, R. (n.d.). AI in higher education metasite. https://sites.google.com/view/ai-highered?pli=1

A collection of articles from the Radiological Society of North America (RSNA) about AI, deep learning, and machine learning (ML): https://pubs.rsna.org/journal/ai