Archive FM

ASHPOfficial

Educator Essentials: Transforming Pharmacy Residency Programs With AI: Where Do We Go From Here?

Duration: 39m
Broadcast on: 05 Aug 2024
Audio Format: mp3

With the fast advancement of artificial intelligence (AI) and its widespread use, it is important to understand how residency programs can incorporate AI to improve administrative efficiency, residents' learning experience, and recruitment and interview processes. This podcast dives into the specifics of how AI can transform current processes in residency programs, from field experts.

The information presented during the podcast reflects solely the opinions of the presenters. The information and materials are not, and are not intended as, a comprehensive source of drug information on this topic. The contents of the podcast have not been reviewed by ASHP and should not be interpreted as the official policies of ASHP or an endorsement of any product(s), nor should they be considered a substitute for the professional judgment of the pharmacist or physician.

What happens at the biggest and best pharmacy event in the world? Join the best and brightest pharmacy professionals in New Orleans this December for an energizing, enriching, and enlightening experience like no other. Simply put, there is nothing like it. ASHP's Midyear Clinical Meeting offers everything you need for your career to blossom, including countless professional development and career advancement opportunities. Just imagine what you can accomplish at an event that brings together 20,000-plus pharmacy professionals from across the globe. Special rates are available when you register and book your hotel before September 27th. Learn more at midyear.ashp.org. That's M-I-D-Y-E-A-R.ASHP.org.

Welcome to the ASHP Official podcast, your guide to issues related to medication use, public health, and the profession of pharmacy. Thanks for joining us in this episode of Educator Essentials, the podcast where we talk with our members about success stories, best practices, and strategies for faculty, preceptors, and those involved in the education of the pharmacy workforce. My name is Michelle Chu, and my co-host is Brooke Bates, and today we'll be chatting with three AI experts on Transforming Pharmacy Residency Programs With AI: Where Do We Go From Here? Technology and AI are advancing at a fast pace, and residency preceptors and RPDs must understand and utilize them to enhance the resident learning environment and the recruitment, application screening, and interview process. Our three speakers will also share resources and some precautions for adopting AI use in a residency program.

Today we'll be chatting with Dr. Jeff Cain, Dr. Heather Ipema, and Dr. Adrian Wong. Dr. Cain is an associate professor and vice chair of pharmacy practice and science at the University of Kentucky College of Pharmacy. Dr. Heather Ipema is a clinical assistant professor at the University of Illinois Chicago College of Pharmacy, and Dr. Adrian Wong is a medical ICU clinical pharmacy specialist at Beth Israel Deaconess Medical Center. Thank you for joining us today.

So first we're going to talk about AI use in enhancing the learning environment and recruitment. We're going to ask our three speakers how we, as preceptors, can utilize AI to enhance residents' clinical learning and training.

Yeah, and thank you for that introduction. Thanks for having us. Dr. Chu, certainly a great question, and I think there are a lot of opportunities available for AI to enhance resident clinical learning and training, but I'd say there's not a lot of great data to support its use, similar to a lot of AI in clinical practice. When you think of artificial intelligence, or as some people term it, augmented intelligence, it's basically anything the human brain can do that a machine is trained, through data, to replicate. And just thinking of how you precept your residents, theoretically anything that you're able to perform, AI could also help with. Currently, when I'm precepting a resident, I could potentially use it to tailor content to the learning style that helps them best, to develop practice questions for situations they might encounter on rounds, to help facilitate their orientation to our institution and practices that might be specific to our institution, or for resource optimization, such as when they are staffing on weekends and how they can more efficiently work up patients.
And then something near and dear to my heart, since I'll be heading into pharmacy academics later, is how AI can succinctly evaluate feedback that has been submitted for a resident and then help determine goals tailored to a specific rotation based on the evaluated objectives. However, while we know there are potential opportunities, there's definitely a need for education of pharmacists and preceptors on how AI can be used, and, more importantly, on understanding the limitations of what AI currently does and what it could perform in the future.

You know, one thing that strikes me as a preceptor is that AI is a huge umbrella term, and it includes everything from chatbots to clinical calculators to generative AI tools, even image generators. So going forward, I think it will be helpful for us as preceptors to try to use more specific language so that residents have clarity, and we all have clarity, on the specific type of application we are talking about. And for me, one of the ways forward in this new AI-infused world is to really stay grounded in the basics. Our job as preceptors has not changed. We have to ensure that clinical learning and critical thinking are happening in our learners. Artificial intelligence really cannot fully replace those things. And so as a preceptor, I'm trying to continue to be diligent about assessing residents' clinical competency and ensuring that I'm aware of how their learning is actually progressing. Hopefully, all of us are already asking our residents questions like, how did you calculate that? Or what resource did you use? And like any tool or resource, AI does have limitations, as Dr. Wong just said. And so as preceptors, we need to discuss those limitations with our residents, and we ideally should be modeling appropriate use of AI-based tools as we would for any other clinical tool. One other thought: preceptors should be aware of institutional practices about using AI in our clinical settings, if there is an institutional policy. And if AI tools are already a part of our practice, then residents should really be taught very clearly that the specific tool is AI-based and what data inputs were used to train the model, again, so that they can critically evaluate the limitations of its output.

So one of the really great things I like about this podcast itself is that Dr. Wong, Dr. Ipema, and I come from three different perspectives, so I love listening to their thought processes and the way they've answered the questions. And I'll state up front: I'm a non-pharmacist. I come from a social and administrative background, and when I'm working with residents, it's primarily either on academia rotations or from the teaching side, and a little bit on the scholarship side. So if I think of some additional ways, and I agree completely with Dr. Wong and Dr. Ipema on the things they've suggested, I think some other ways that AI can help enhance residents' clinical learning and training are in preparing for things such as topic discussions, whether that's with preceptors or when they are precepting students: maybe getting some literature summaries, or developing their own discussion questions to gauge their own understanding. So I think that's some of the low-hanging fruit that's out there.
But one of the other things that hasn't been mentioned yet is that we're so new to the AI explosion. Even though aspects of it have been out there for a while, now that it's exploded, things are changing rapidly. And I'm hoping that by the time this podcast actually comes out, what I've said is not outdated, because there's a good chance it could be as new tools are developed. But also, the individuals who are going to be entering residency programs will be coming with a wide range of skills and knowledge of AI and AI use. There will be some who will hardly have used it at all, and there will be some who are probably even better than I am at using it. So we have to think about that range of skill sets that's coming, and when and where that training is going to occur. And you know, there are initiatives going on now trying to determine what the AI competencies for pharmacy graduates should be, which then leads into pharmacy residents and what they need to know to be prepared, and how they start entering a world where AI use is going to be more prevalent.

And I definitely agree with what Dr. Ipema and Dr. Cain mentioned. There's a lot of opportunity to make work more efficient, just thinking of when I was a resident, a long time ago now: topic discussions, developing a medication use evaluation. But also understanding the limitations and how to appropriately use these tools, which I think is really important. I think Dr. Cain mentioned a really important part about education of our future generation of students and how they might see this in the curriculum or classroom or even in clinical practice. I think there's a lot of need for future research on whether or not AI is better than a preceptor at training a resident, but certainly it'll make work more efficient and potentially reduce burnout, which I know is a big thing right now. But we also need to really understand the big limitations of AI. And I'd say that currently the big application of AI is the chatbots that Dr. Ipema mentioned, such as ChatGPT, and how they might be integrated into your institution. For example, we have one that is able to search throughout our policies, procedures, guidelines, and directives and spit out an answer from prompts, but I have rarely used it. And that's because I can find the answer more readily and faster based on my knowledge, and I also know where the guideline is. So I think there's a lot of work that needs to be done before AI is truly integrated into practice, education, and how we take care of patients.

Thank you for that. Some of the common themes that resonated with me were, basically, that there are so many different types of functionality under AI, so understanding and using specific language when we refer to AI matters. Also, as a preceptor, continuing the education of critical thinking: even if residents use AI, it's important to ask them what kind of tools they used and how they can evaluate the limitations of those tools. That was another thing that popped up. And I really liked the comment that people coming in as residents, or as preceptors, all have a wide range of experience in AI use, so we should consider those different skill sets and definitely understand the limitations that each AI functionality brings. That was really helpful for understanding how, as preceptors, we can utilize AI.
So we're going to go on to the next question. We talked about how preceptors can utilize AI, but now we want to know how residents can utilize AI to improve their own learning in clinical, research, and teaching areas. We'll start with Dr. Wong.

Yeah, another great question, and certainly a lot of opportunities are available. We already touched on some during the first question, but I think really trying to personalize your learning is one big aspect. That allows for hypothetical or case-based simulations, where you give a prompt for a certain patient, work through yourself how you might manage that patient's therapies, and then compare it to what the AI is able to provide. Second, I know that some people have used it to develop abstracts or a skeleton for manuscripts; certainly there are a lot of considerations for appropriate use there. But also, just getting feedback on something that you develop from a prompt, such as a note, and trying to identify areas for improvement, are certainly things that could help a resident improve their own learning. Another aspect would be understanding how an AI chatbot might provide guidance to a patient seeking medical advice, so that you know what they might receive, while comparing it to what might actually be more accurate. I know one term is hallucinations, where it comes out with a statement that might not be supported by evidence or any references; it may also make up references. So it's always important to validate the output from an AI tool. And then there are certainly other issues, such as the validity of information that I just mentioned and also the impact on protected health information.

Dr. Wong, I appreciate your comments about trying new things. And it strikes me that the structured support of a residency program really is a perfect time to practice, and that the support a preceptor and an RPD can provide is in helping the resident identify an appropriate moment to use AI, generative AI in particular. And I think it would be great if residency programs openly encouraged that experimentation. One key thing is that we want our residents to learn how to use AI thoughtfully. And that has to include self-reflection, and it has to include a self-awareness of why am I using this, what am I gaining from this, and is this really increasing my efficiency, like you said, or not. It strikes me too that the residency year is just a year of tremendous growth. There is a lot to learn. We're talking about basic knowledge, but also we're growing interpersonal skills, adaptability, resilience, leadership. And most of those are areas of learning where AI cannot help. But if there are areas or specific tasks where AI can provide some assistance and support, then it does make sense to help our residents identify those appropriate moments, as well as the inappropriate moments when it really would not be ethical, or would not even be safe, to use AI.

So both of you have alluded to some really good things about practicing and learning. Because I think with AI in particular, unlike even other technologies, this is an area in which you have to actually use it and play with it and practice it to actually see both the potential and the issues that come along with it. And as I've had various residents come on academia rotations with me, we inevitably do some stuff related to AI. And some have come very excited about it, and others have come very hesitant.
But I think what happens over time, as we give them some of the different tools to look at and some different assignments to do with them, is that once they get into it, they can actually see, well, this is how this could help, and this is where we could have questions about whether it's the right thing to do or not. And one of the things that I will tell residents, students, or whomever about using AI, and particularly generative AI, which is the kind used the most, is that if you're using it to do something and you then have to go check and verify whether it's correct or not, that might not be the best time to use it. The best time to use it is when AI spits something out to you and you can immediately check it, verify that it's fine, and go on. That's where the time saving comes. But if you put something into ChatGPT or another platform, you get a result back, and then you have to go research whether it's correct or not, then maybe that's a situation where you need to figure that out first. I think that's one of the interesting aspects of it.

Going back to the question that Dr. Chu asked about how residents can utilize it to improve their own learning, whether it's research or teaching: I think there are some additional ways on the scholarship side. There are some additional tools, Perplexity, Scholar AI, Elicit, that do some really interesting things with literature searches, literature summaries, and extraction. They can find relevant papers without perfect keyword matching, provide summaries, and extract data into spreadsheets, which makes it pretty easy to get a good overview of a topic, for example when someone is starting some research and needs that background to get going. So that's a really good way. Other things that also take a lot of time, whether you're a resident or faculty, come up when you're giving presentations: coming up with presentation titles, coming up with learning objectives, maybe even slide creation. There are some tools, like Gamma, that do a really good job of actually creating that first version of slides. Over and over, there are so many different ways we could get into. But I think the opportunities are there, and I do hope, and I think we've got a question coming up a little bit later about the precautions, that we always balance that optimism out with some real caution about where we are going with this and what is going to be the right thing to do.

All three of you have provided some really great, helpful information on how we can use AI to enhance the learning environment for our residents, and we will in a bit talk a little bit about precautions in using AI. But first I want to switch gears and talk a little bit more about how we interact with residency candidates throughout the recruitment process. So how do you envision using AI within the recruitment process, the application process, screening, and interviews over the next few years, and what are some approaches that we can take now to prepare for that?

Great question, Dr. Bates. I think, personally, we need to assume that AI is already affecting residency recruitment, particularly generative AI. ChatGPT can write CVs, letters of intent, letters of recommendation, and develop answers to common interview questions. And there are software applications out there that can help detect whether something was written by AI or by a human, but that software isn't fully reliable.
And there's always been the possibility that candidates were using work that was primarily written by another person. So that possibility has maybe only increased, but it isn't really new. So really, I think we need to continue with the robust interview processes that we are hopefully already trying to follow, asking really good discussion-starting questions with the hope of hearing unrehearsed answers. Really engaging the candidate during the interview, engaging in live-time clinical and creative exercises so that we see candidates thinking on their feet, those may be things that we have to rely on even more going forward. And, you know, it does strike me that there could be some positive effects from AI in the recruitment process too. For instance, there has already been scholarship among medical residencies about whether ChatGPT can reliably follow a rubric to evaluate resident applications, and I think we could certainly do studies in that area in the pharmacy application process as well.

So I think this is a really interesting question. And as Dr. Ipema stated, I'm positive that there are students who are using ChatGPT in their application process to improve their letter writing. On the surface, you might say, well, that's fine; the letter is the letter. But one of the interesting things in research is that human beings are really terrible at evaluating writing, at separating content from writing ability, from grammar. Let's just reduce it to grammar: someone could have really good content or a good message to convey, but if they can't write it in a clear, grammatically correct way, we rate it lower than someone who may not have the content we're looking for but can write really well. So now, if every student were to use ChatGPT, for instance, to assist in writing their letters, then maybe that levels the playing field on writing ability, and it makes it easier to tease out who has the best content. So, you know, that's just a question for thought. I think some other ways AI could potentially be beneficial in residency recruitment and selection go back to those interview questions: if you're doing interviews in a way that can be recorded, maybe via Zoom, you can take the transcript, and AI can develop some really good summaries. And I think there's probably potential to ask some really good questions of AI to see how it summarizes or ranks or rates answers. I don't know offhand what that would look like, but I think it would be interesting. Some other ways we can think about preparing: I think we, or you all as residency program directors, could definitely use AI to develop better interview questions and better ways of screening, and to help with the scoring and the scheduling; a lot of those efficiency things.

Yeah, really great points brought up by Dr. Ipema and Dr. Cain. I think an important point that Dr. Cain mentioned was leveling the playing field. I think of it especially for students or learners for whom English is a second language; that could certainly help. But I think the caution would be understanding whether leveling the playing field may help mask actual issues with English comprehension, which is certainly one thing to consider.
And when it comes to developing letters of intent for applications to residency programs, I think it would be challenging, especially since I find that letters of intent often don't have that personal touch, and using something such as ChatGPT might make them even more impersonal and may make them more of just a requirement that doesn't necessarily add anything to the residency application. And I think the other big thing is a more objective evaluation of applications. I know there's growing literature on sex and gender bias, and also race, and how, even though with implicit bias we don't intend to be biased toward certain applications, AI could offer a potentially more objective way of evaluating them, although depending on how the AI is trained, we may already have that bias built into it, because that's just how it was trained. One final thing: when evaluating the applicants who end up matching at a certain institution, or whom you rank the highest, it may help to develop more relevant application criteria specific to your institution, to identify those who are the best fit, rather than just a general rubric that would be applicable anywhere.

So many of your comments really resonate with me as an RPD. I can say that I've really found it more difficult to distinguish between candidates based on the written application alone, and I wonder if that's because of the increasing use of AI, so that letters and CVs are all starting to look a little bit more similar. And I really appreciate the comments about using AI to improve efficiency, because regardless of how large your program is or how many residency candidates you're interviewing, it can be quite a time-consuming process for the RPD, the RAC, and the other preceptors who are involved. So anything we can do to make that more efficient would be super helpful. We've sort of alluded to this along the way, but I want to make sure that we have some dedicated time to talk about precautions with the use of AI. So what are some policies and expectations we should have as a residency program on the use of AI by residents on their deliverables specifically?

And this is probably one of the most important questions, or at least one of the questions that comes to mind first for everyone. If you talk to professors, it's how do we prevent students from cheating and what policies do we have in the syllabi, and the same thing transitions into a residency program. So first and foremost, I think one thing is a policy or a set of guidelines, or both, depending on how deep you want to go with this, because the residents need to have clear expectations of what is acceptable use and what is unacceptable use. Being very clear with them on that is the first thing. And if you're going to have the guidelines or the policy, some of the general things that are recommended are that for any work that utilizes AI, there needs to be transparency involved in stating what was used and how it was used, citing that information, and then obviously making it clear to the resident that they are ultimately responsible for any output that they use from an AI platform. And then one of the things that gets interesting is the discussion of what actually counts as, and we'll use the term academic misuse instead of cheating, academic misuse of an AI platform.
Because academic misuse could be, yes, downright cheating, using something and passing it off as your own, or it could be shortcutting the learning circuit, which I think, when you get to the residency program, is the big fear: this is a learning environment, a learning arena, and there are things that you have to do that an AI platform could do maybe quicker or even better, but that may prevent you from learning the skills or the knowledge that you actually need. So one way to think about what is or is not misuse is to distinguish between what's most important, the process or the product. If it's the product, if it's just "I need this answer, I need this output" and that's the most important thing, then that tends to lead more toward deciding that if someone can use AI to get to the product faster, more efficiently, or to get a better product, then there you go. But if the process is important, so maybe yes, the answer that is given or the product that is produced matters, but the most important thing for the resident at this point in their life and in their training is going through the process and knowing how to do it, then that is where we need to steer clear. And you can go back to the simple example of calculators: for any of us now, if there's some complex math problem, the answer is what we're trying to get, and using the calculator to get the correct answer is perfectly fine and even recommended. But if someone is a first grader or a second grader, the answer is not as important as them actually learning how to do basic math, how to do those simple things. So at that point in time, the process may be as important as, or more important than, the product.

Yeah, really great points, Dr. Cain. I definitely agree with you on the development of clear guidelines on use and when it's appropriate, and also on validation of the output of an AI tool. I think the other big thing is that things happen all the time to the technology that we use. Some limitations for AI are that it doesn't work, it breaks, there's downtime, and we have procedures for when our electronic health record has downtime. But the other thing to consider is cybersecurity and the risk of an attack and how that might impact our artificial intelligence tools. So it will remain important that a resident is able to do the process without this technology, really preserving the critical thinking behind it and not forgetting it entirely by relying solely on AI.

One other concern that we haven't really touched on is the idea of whether the AI application keeps and internalizes the input in order to use it as part of future training models, and one boundary that we really should expect from our residents is that no patient-specific or identifiable information would go into the application unless they were turning off the ability of that information to be retained. And it strikes me that we may need more national guidance, because really pharmacy residents are pharmacists, and we don't have a lot of these standards of practice actually defined for the pharmacy profession as a whole. Some of the things that Dr. Cain mentioned, like transparency and accountability, it hasn't really been stated or articulated yet that those are expected of all pharmacists.
But until we have something that's more of a standard of practice for the profession as a whole, I think we can help residents understand, either through formal policies or through more informal conversations, that choosing to use AI is not a small matter, depending on the extent and the context of what they're using it for and how they're using it. And I think we just need to have open conversations with residents so that they understand that there are professionalism, quality, ethical, and privacy concerns. Some of what that can lead to is impaired patient safety due to misinformation from AI, or the authorship concerns that we've brought up, or liability concerns, the need to disclose, the need to protect patient information. And if we follow those risks through and help them see the potential scope of violations or problems that can arise, I think that just helping residents see those things clearly is going to be useful. And ultimately, we're mentoring each other and holding each other accountable within personal frameworks for using AI in ways that benefit the profession but also balance the responsibility that we have to the public.

I really appreciate all of your expertise, and I think you three have left us with a lot to think about in our own programs and with our own residents. But because the majority of us are really just getting familiar with the use of AI and what that looks like in our own practices or within our programs, I want to be sure that we can share some resources that people can reference back to as things come up or new information becomes available. So can you share some resources and ways that residency leadership and preceptors can stay up to date on the latest advancements in AI?

Dr. Wong and Dr. Ipema may have some more pharmacy-specific or resident-specific suggestions, but from what I follow, there are literally hundreds or thousands of websites or newsletters or LinkedIn pages or X (Twitter) accounts that you could follow on the topic. But Ethan Mollick, on the education side, who's a professor at the Wharton School, has a website called One Useful Thing, and that's all about the implications of artificial intelligence in work and education and life in general. He also has moreusefulthings.com/prompts, which is a prompt library with a plethora of prompts that can be used in the different generative platforms to perform a variety of tasks, and these are really long prompts, not the little short things, but really long, good prompts. I also follow a newsletter, AI Tool Report; you can find that at aitoolreport.com, and you just get that newsletter covering all the different tools and the things that are happening. There's a lot more, but I'll leave it at that, and we'll let Dr. Wong and Dr. Ipema chime in.

Yeah, I appreciate those very practical resource ideas, Dr. Cain. I haven't personally found anything yet that I could name as a go-to resource for pharmacy, and specifically clinical pharmacy or pharmacy practice, AI-type resources. So if anyone else knows of one, I would be interested to hear of it.
It's interesting to think about the fact that our residents really need to learn how to stay up to date in many different areas, and AI can, in a way, just become one of those things. So if there are hashtags or keywords or specific social media accounts or podcasts or email lists, those are things that our residents can curate and collect for themselves, and then it becomes one part of their overall staying-current methodology. It also strikes me that our residents are really the digital natives, and as newer residents and newer generations come into our residency programs, they really might help lead the way in staying current with these technologies. So I hope we can get to the point where it's a two-way conversation: younger pharmacists mentioning emerging tools and strategies and resources that they're aware of, and then the pharmacists in practice taking that information and modeling in live time how we assess and critically evaluate those things.

And I would actually like to go back to one that I forgot that is a little more pharmacy specific. It's not just AI use; it has a lot of other content and gets more into digital health, but I do lean a lot on Dr. Timothy Aungst at MCPHS, and he has a website, The Digital Apothecary, that has a lot of digital health material, and he's really advanced in the AI realm as well.

Yeah, similar to any other topic that you're interested in, the things I do are sign up for journal email listservs; I have an NCBI weekly email that gets sent to me specifically about AI. The journal I've paid the most attention to is the New England Journal of Medicine, which has an AI-specific journal, and they've released a lot of guidance, including on appropriate use of large language models, of which the most famous is ChatGPT. I know there are a lot of pharmacy-specific events focused on AI; I know ASHP has a summit on AI in pharmacy practice. But similar to learning about any other topic, I think it's just important for you to understand that this is going to be a big thing in the future. Obviously a lot of work needs to be done before it's appropriately used, but continuing education, seeking out continuing education on it, such as events, and incorporating that into your biannual CE topics, just actively seeking out education on AI, would be important. And as Dr. Ipema mentioned, speaking with your newer learners might be a good way of keeping on top of it as well. I know that ASHP has a lot of good articles that have had commentary on AI specifically; there was one published a few months ago that was a great summary of the impact of AI on clinical pharmacy practice. It's really just seeking out resources, the same as with any other topic of interest to you.

Thank you so much. That's all we have time for today. I want to thank Dr. Cain, Dr. Ipema, and Dr. Wong for joining us today to discuss Transforming Pharmacy Residency Programs With AI: Where Do We Go From Here? If you haven't before, I encourage you all to check out ASHP's educator resources. You can find member-exclusive offerings such as the preceptor toolkit and the research resource center, and exchange ideas with your peers on the ASHP Education Connect community. Thanks again for tuning in for this session of Educator Essentials.
We hope you enjoyed today's conversation, and be sure to subscribe to ASHP podcasts through your favorite podcast provider. Thank you for listening to ASHP Official, the voice of pharmacists advancing healthcare. Be sure to visit ashp.org/podcast to discover more great episodes, access show notes, and download the episode transcript. If you loved the episode and want to hear more, be sure to subscribe, rate, or leave a review. Join us next time on ASHP Official. [Music]