HeartPad: Real-time visual guidance for cardiac ultrasound

Steven T. Ford*, Ivan Viola, Stefan Bruckner, Hans Torp, Gabriel Kiss

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations


Medical ultrasound is a challenging modality when it comes to image interpretation. The goal of this work is to assist the ultrasound examiner and partially alleviate the burden of interpretation. We propose to address this goal with a visualization that provides clear cues on the orientation of, and the correspondence between, the anatomy and the data being imaged. Our system analyzes the stream of 3D ultrasound data and in real time identifies distinct features that serve as the basis for a dynamically deformed mesh model of the heart. The heart mesh is composited with the original ultrasound data to establish the data-to-anatomy correspondence. The visualization is broadcast over the internet, allowing, among other possibilities, direct visualization on the patient via a tablet computer. The examiner interacts with the transducer and adjusts the visualization parameters on the tablet. Domain specialists have characterized our system as useful for medical training and for guiding occasional ultrasound users.
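The abstract does not specify how the mesh model is fitted to the detected features; one common building block for such landmark-driven deformation is a rigid alignment of template landmarks to detected feature points (the Kabsch/Procrustes method). The sketch below is a hypothetical illustration of that step, not the authors' actual pipeline; all function names are invented for this example.

```python
import numpy as np

def align_template_to_landmarks(template_pts, detected_pts):
    """Rigid (Kabsch) alignment of template landmark points to detected
    ultrasound feature points. Returns R, t so that
    detected ≈ template @ R.T + t. Both inputs are (N, 3) arrays with
    corresponding rows."""
    mu_t = template_pts.mean(axis=0)
    mu_d = detected_pts.mean(axis=0)
    A = template_pts - mu_t          # centered template landmarks
    B = detected_pts - mu_d          # centered detected landmarks
    H = A.T @ B                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force a proper rotation (det(R) = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_t
    return R, t

def deform_mesh(vertices, R, t):
    """Apply the recovered pose to every mesh vertex so the heart model
    overlays the live 3D ultrasound volume."""
    return vertices @ R.T + t
```

In a full system, this rigid pose would typically be refined by a non-rigid deformation of the mesh toward the image features; the rigid step alone already places the anatomical model in the correct orientation for compositing.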

Original language: English (US)
Title of host publication: Proceedings - WASA 2012
Subtitle of host publication: Workshop at SIGGRAPH Asia 2012
Number of pages: 8
State: Published - 2012
Externally published: Yes
Event: Workshop at SIGGRAPH Asia 2012, WASA 2012 - Singapore, Singapore
Duration: Nov 26 2012 - Nov 27 2012

Publication series

Name: Proceedings - WASA 2012: Workshop at SIGGRAPH Asia 2012


Conference: Workshop at SIGGRAPH Asia 2012, WASA 2012


Keywords

  • 3D echocardiography
  • in-situ visualization

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition

