06/27/2023 Michael L. Brody, DPM
Artificial Intelligence and the Augmented Practice of Medicine (Wenjay Sung, DPM)
The Journal of the American Bar Association recently published an article titled "What cybersecurity threats do generative AI chatbots like ChatGPT pose to lawyers?"
Software tools such as ChatGPT are 'not ready for prime time' in healthcare, for the reasons outlined in the article. Still, I am optimistic about IA as a tool for the future of medicine. Note: I use the term IA (Intelligence Augmented) rather than artificial intelligence. These are tools to assist us in clinical decision-making, not tools to make decisions for us.
Tools such as ChatGPT are built on huge databases. What is in those databases? Is the information the tool relies on accurate, or has erroneous information made its way in? Suppose the database consisted of the clinical notes of every physician in the country. That would be a vast treasure trove of information to power the tool. But I ask you this question: how many times have you seen clinical notes generated by an EHR system that are full of contradictory and boilerplate language, language that is inaccurate or even fabricated by the EHR program? Is that the information in the database the tool would use to assist me in clinical decision-making?

When security is in place to ensure that using a clinical decision support tool will not compromise the privacy or security of my patients' information, I will begin to use these tools. Once I do, I will treat their output as one input into my clinical decision-making process. Until then, let's stay informed about what is happening and wait until these tools no longer pose a risk to our patients.
Many EHR vendors are looking at IA tools to enhance their products. Ask them what database their tool is built on, whether the information you enter into the tool (including potential patient information) is stored in that database, and whether, if you use the tool, your patients' PHI (protected health information) could be seen by other users.
Michael L. Brody, DPM, Commack, NY