Bringing together AI algorithms and data in ways that preserve privacy and intellectual property has long been a hurdle. Can software-based enclaves help clear it? Paul Cerrato and John Halamka, MD, of Mayo Clinic Platform have some thoughts.
When combined with the right data and the right models, social determinants of health become a powerful asset in predictive models of health outcomes and potential health disparities, according to researchers from Mayo Clinic Platform and Change Healthcare.
It’s not enough to provide evidence that AI tools are clinically effective; practitioners want reasonable explanations demonstrating that “they will do what they claim to do,” according to Dr. John Halamka and Paul Cerrato of Mayo Clinic Platform.
The most sophisticated algorithms for screening diseases and improving treatments won’t do much good if they misrepresent the populations they seek to serve, according to Paul Cerrato and Dr. John Halamka. In this blog, they discuss the “ethical dilemma” that can deter digital health initiatives.
One way to help drive adoption of AI tools? Creating “a set of standards to help clinicians and other decision makers separate useful tools from junk science,” according to John Halamka, MD, and Paul Cerrato of Mayo Clinic Platform.