Cardiology Practice

AI-Driven Echocardiogram Interpretation: Multisite Validation of the PanEcho System

Article Impact Level: HIGH
Data Quality: STRONG
Summary of an article in JAMA. https://doi.org/10.1001/jama.2025.8731
Gregory Holste et al.

Points

  • Researchers developed and validated an artificial intelligence system, PanEcho, to fully automate the interpretation of transthoracic echocardiograms using multitask deep learning on over one million videos.
  • The model performed 18 diagnostic tasks with a median AUC of 0.91 and estimated 21 parameters with a median normalized mean absolute error of 0.13 in internal validation.
  • It accurately detected conditions such as severe aortic stenosis with an AUC of 1.00 and estimated left ventricular ejection fraction with a mean absolute error of 4.5% in external validation.
  • The AI system maintained high accuracy on abbreviated studies, including real-world point-of-care ultrasound acquisitions from emergency departments, demonstrating robustness to lower-quality imaging.
  • This technology could serve as an adjunct reader in echocardiography labs or as a screening tool in low-resource settings, pending further prospective evaluation in clinical workflows.

Summary

A new study reports the development and retrospective, multisite validation of an artificial intelligence (AI) system, PanEcho, designed to automate the interpretation of transthoracic echocardiography (TTE). The model was developed using 1.2 million echocardiographic videos from 32,265 TTE studies and was validated internally on a temporally distinct cohort and externally across four diverse cohorts. PanEcho was evaluated on its ability to perform 39 diagnostic and quantitative tasks, comparing its predictions against the assessments of interpreting cardiologists.

In internal validation, the AI system performed 18 diagnostic classification tasks with a median area under the receiver operating characteristic curve (AUC) of 0.91 (IQR, 0.88-0.93) and estimated 21 parameters with a median normalized mean absolute error of 0.13 (IQR, 0.10-0.18). For specific high-value tasks, the model accurately estimated left ventricular ejection fraction with a mean absolute error of 4.2% internally and 4.5% externally. It also detected moderate or worse left ventricular systolic dysfunction (AUC: 0.98 internal; 0.99 external), right ventricular systolic dysfunction (AUC: 0.93 internal; 0.94 external), and severe aortic stenosis (AUC: 0.98 internal; 1.00 external).

The system maintained high performance even with limited imaging protocols. On abbreviated TTE studies, it performed 15 diagnostic tasks with a median AUC of 0.91 (IQR, 0.87-0.94). Furthermore, when tested on real-world point-of-care ultrasonography (POCUS) acquisitions from emergency departments, it completed 14 tasks with a median AUC of 0.85 (IQR, 0.77-0.87). These findings suggest that AI could serve as an effective adjunct reader in echocardiography laboratories or as a screening tool in resource-limited settings, pending prospective evaluation in clinical workflows.

Link to the article: https://jamanetwork.com/journals/jama/article-abstract/2835630


References

Holste, G., Oikonomou, E. K., Tokodi, M., Kovács, A., Wang, Z., & Khera, R. (2025). Complete AI-enabled echocardiography interpretation with multitask deep learning. JAMA. https://doi.org/10.1001/jama.2025.8731

About the author

Hippocrates Briefs Team