Quality and Patient Safety
Leo Anthony Celi, MD, MPH, MS (he/him/his)
Beth Israel Deaconess Medical Center (BIDMC)
Disclosure(s): No relevant financial relationship(s) to disclose.
Hallucinations in artificial intelligence occur when a machine learning model generates false outputs or misinterprets data, the “hiccups” of machine learning. They have been observed across applications of AI, but they could be especially harmful if generated in the ICU without their ramifications being recognized. This presentation examines instances where AI systems in critical care settings generate false perceptions or misinterpret data, discusses their potential impact on patient outcomes, and outlines strategies for mitigating risk. It will delve into the problems with AI, define its limitations and the biases found in AI systems, and consider how these can be overcome so that AI can truly change the delivery of critical care.