Building Patient Journeys from Multi-Modal Healthcare Data Using Medical Language Models
- Published Nov 30, 2024
Data integration has been an enormous challenge in healthcare for decades. Issues of data standardization, data quality, legacy formats, unstructured data, and semantic inconsistencies have made it hard to answer basic questions about how a hospital operates or what should be done next for a patient. Recent advances in healthcare AI are transforming this long-standing problem, making it possible to automatically ingest large volumes of raw, multi-format, multi-modal, untrusted medical data and turn it into coherent longitudinal patient stories in an industry-standard format.
This webinar presents an integrated solution in action that uses John Snow Labs’ state-of-the-art Medical Language Models, healthcare-specific data preparation pipelines, and Text-to-OMOP question answering models running on Databricks’ secure, scalable, and compute-optimized AI platform. The solution takes in multi-modal data - structured (tabular), semi-structured (FHIR resources), and unstructured (free-text) - and generates an OMOP/OHDSI standard data model that:
- Builds a unified view of each patient over time.
- Draws that unified view from all of the multi-modal source data.
- Supports reasoning at the patient level.
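To make the multi-modal ingestion concrete, here is a minimal sketch of one step: mapping a semi-structured FHIR Condition resource into a simplified OMOP `condition_occurrence`-style row. The resource values, the `fhir_condition_to_omop` helper, and the flattened output fields are all illustrative assumptions, not the actual pipeline; a real implementation would resolve source codes against the OMOP vocabulary tables to obtain standard concept IDs.

```python
import json

# A minimal FHIR R4 Condition resource (illustrative values, not real patient data).
fhir_condition = json.loads("""
{
  "resourceType": "Condition",
  "subject": {"reference": "Patient/123"},
  "code": {"coding": [{"system": "http://snomed.info/sct",
                       "code": "44054006",
                       "display": "Type 2 diabetes mellitus"}]},
  "onsetDateTime": "2023-05-17"
}
""")

def fhir_condition_to_omop(resource: dict) -> dict:
    """Map a FHIR Condition to a simplified OMOP condition_occurrence row.
    Hypothetical helper: the concept-ID resolution is stubbed out; a real
    pipeline would look up the SNOMED code in the OMOP vocabulary tables."""
    coding = resource["code"]["coding"][0]
    return {
        "person_id": resource["subject"]["reference"].split("/")[-1],
        "condition_source_value": coding["code"],        # source SNOMED code
        "condition_source_name": coding["display"],
        "condition_start_date": resource["onsetDateTime"],
    }

row = fhir_condition_to_omop(fhir_condition)
print(row)
```

The same flattening idea extends to other FHIR resources (Observation, MedicationRequest), each landing in its corresponding OMOP CDM table.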
We’ll then show how the resulting patient data model can be used either for “AI” (building patient cohorts with natural language queries) or for “BI” (dashboards for patient risk scoring and quality measures), all from the same source of truth, with full explainability and traceability.
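To illustrate the “AI” path, here is a toy sketch of what a Text-to-OMOP model does: translate a natural-language cohort question into SQL over OMOP CDM tables. The in-memory schema, the sample rows, and the generated SQL are assumptions made up for this example, not output from the actual models.

```python
import sqlite3

# Build a toy fragment of the OMOP CDM in memory (person, condition_occurrence).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE person (person_id INTEGER, year_of_birth INTEGER);
CREATE TABLE condition_occurrence (
    person_id INTEGER, condition_concept_id INTEGER, condition_start_date TEXT);
INSERT INTO person VALUES (1, 1950), (2, 1985), (3, 1948);
-- 201826 is the OMOP standard concept for type 2 diabetes mellitus
INSERT INTO condition_occurrence VALUES
    (1, 201826, '2023-01-10'),
    (2, 201826, '2022-06-05'),
    (3, 4329847, '2021-03-02');
""")

# The kind of SQL a Text-to-OMOP model might emit for the question:
#   "Which patients over 60 have type 2 diabetes?"
sql = """
SELECT p.person_id
FROM person p
JOIN condition_occurrence co ON co.person_id = p.person_id
WHERE co.condition_concept_id = 201826
  AND 2024 - p.year_of_birth > 60
"""
cohort = [r[0] for r in conn.execute(sql)]
print(cohort)  # only person 1 matches both the condition and the age filter
```

Because the query runs against standard OMOP tables, the same cohort definition is portable to any OHDSI-conformant database, which is what lets the “AI” and “BI” views share one source of truth.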