
Some Ideas for Visualizing Cardiac Sounds


Overview

Since 1997apr03, I've been talking (emailing) with Joan Freedman and W. Reid Thompson about the possibility of a tool for visualizing cardiac sounds, to aid students in learning to recognize and distinguish features of those sounds. I've created this document as a place to collect, summarize, and share some ideas about such a tool.

To the extent that I can, I will say where these ideas came from. Please let me know if anything I've written here is unclear. And most important: if there's something you think I should add, please tell me!

Stephen Malinowski


Ideas


1997apr03 It was Joan Freedman's idea that a display along the lines of the Music Animation Machine could be used to highlight the differences in heart sounds, and thus teach pediatricians to distinguish normal heart sounds from abnormal ones. I told her that although the Music Animation Machine display itself wasn't suitable, some sort of special-purpose sonogram (such as VoiceTracker, a tool designed for visualizing the singing voice) could be invented.
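For concreteness, here is a minimal sketch (in Python, using scipy and matplotlib; neither is implied by the original idea) of the kind of display a special-purpose sonogram might start from. The filename, sample rate handling, and the 600 Hz display limit are placeholders, not part of any actual tool.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram
    import matplotlib.pyplot as plt

    # Load a (hypothetical) heart-sound recording.
    rate, sound = wavfile.read("heart_sound.wav")
    sound = sound.astype(float)
    if sound.ndim > 1:                       # keep one channel if the file is stereo
        sound = sound[:, 0]

    # Short-time Fourier analysis; heart sounds carry most of their energy
    # below a few hundred Hz, so favor frequency resolution.
    freqs, times, power = spectrogram(sound, fs=rate, nperseg=1024, noverlap=768)

    keep = freqs <= 600                      # show only the band of interest
    plt.pcolormesh(times, freqs[keep], 10 * np.log10(power[keep] + 1e-12), shading="auto")
    plt.xlabel("time (s)")
    plt.ylabel("frequency (Hz)")
    plt.title("heart-sound sonogram (sketch)")
    plt.show()
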
1997apr04 I suggested having visual artists listen to heart sounds and draw pictures of what they heard, as a way to generate useful analogies between aural and visual patterns. I also suggested an AI (neural network) approach, with the network trained by expert cardiologists.
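Purely as an illustration of the second idea, here is a sketch of training a small neural network on expert-labeled recordings; scikit-learn is my choice here, and the feature vectors and labels are random placeholders standing in for real data.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Placeholder data: each row is a feature vector extracted from one recording
    # (e.g. band energies, timing measurements); each label is an expert's verdict.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(200, 8))                        # 200 recordings, 8 features
    labels = (features[:, 0] + features[:, 3] > 0).astype(int)  # stand-in labels

    X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)

    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X_train, y_train)                # the "training by expert cardiologists" step
    print("held-out accuracy:", net.score(X_test, y_test))
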
1997apr05 Since the heartbeat is a cyclic pattern, it might be useful to have a visualization that reflected this by using a circular, rather than linear, representation of time. In this way, changes between cycles of the heartbeat would become animations of a single object (at approximately the same position in the display); changes in heart rate would appear as displacements around the circle, etc. Some sketches for this are here. Later (1997apr09) I had the idea of an "average" heartbeat accumulating in this picture (either by actually averaging the data from multiple cycles, or by just drawing one cycle on top of another).
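A minimal sketch of the circular layout, assuming the beat period is already known (or detected elsewhere); successive cycles overdraw one another, and an explicit average of whole cycles is drawn on top. The synthetic test signal is a stand-in for a real recording.

    import numpy as np
    import matplotlib.pyplot as plt

    def circular_plot(signal, rate, period_s):
        """Wrap a heart-sound recording onto a circle, one revolution per beat."""
        scale = 0.3 / np.max(np.abs(signal))
        t = np.arange(len(signal)) / rate
        phase = 2 * np.pi * (t % period_s) / period_s          # angle around the circle
        ax = plt.subplot(projection="polar")
        ax.plot(phase, 1.0 + scale * signal, linewidth=0.5, alpha=0.4)

        # Explicit average of whole cycles, drawn on top in black.
        n = int(round(period_s * rate))
        cycles = signal[: (len(signal) // n) * n].reshape(-1, n)
        ax.plot(2 * np.pi * np.arange(n) / n, 1.0 + scale * cycles.mean(axis=0), color="k")
        plt.show()

    # Example with a synthetic stand-in for a real recording:
    rate, period = 4000, 0.8                                   # samples/s, seconds per beat
    t = np.arange(0, 8.0, 1 / rate)
    signal = np.sin(2 * np.pi * 60 * t) * np.abs(np.sin(np.pi * t / period)) ** 30
    circular_plot(signal, rate, period)
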
1997apr09 W. Reid Thompson explained that his project was twofold: a) to make a web-based teaching module, and b) to collect a database of normal and abnormal cardiac recordings to aid in the search for recognizable differences that could be incorporated into an automated diagnostic screening tool.
1997apr09 After listening to some sample cardiac sounds, I realized that differences between one recording and another could be caused by many factors that had little or nothing to do with the heart's functioning (e.g. the microphone, fatty tissues between the heart and the microphone, etc.), and that something akin to a stereo system's "graphic equalizer" would be useful for removing insignificant differences. A related idea was to somehow calibrate this by applying a known sound to the patient's body. 1997apr11, Reid pointed out that the diagnostician's hearing was also a factor, and that we would need a way to calibrate that as well.
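One way such an "equalizer" might be realized, sketched here as per-band gains applied with simple bandpass filters; the band edges and gain values are placeholders, not calibrated to anything.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def equalize(signal, rate, bands, gains):
        """Apply per-band gains (a crude 'graphic equalizer') to a recording."""
        out = np.zeros_like(signal, dtype=float)
        for (lo, hi), g in zip(bands, gains):
            sos = butter(4, [lo, hi], btype="bandpass", fs=rate, output="sos")
            out += g * sosfiltfilt(sos, signal)
        return out

    # Placeholder bands (Hz) covering the range where heart sounds carry energy,
    # with gains that might compensate for intervening tissue or the microphone.
    bands = [(20, 60), (60, 150), (150, 300), (300, 600)]
    gains = [1.0, 1.2, 1.5, 2.0]
    # equalized = equalize(recording, rate, bands, gains)
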
1997apr09 Animations of the heart. Currently, there are sonograms (and x-rays?) of the heart; an addition to this would be an animated schematic picture of the heart, showing the parts that are different when there is an abnormality. 1997apr10, Reid said he was planning to include motion video from sonograms. I started thinking about how we go from a sound to the idea of the object or action that produced that sound, and wondered whether a schematic (animation) could serve as an intermediate point between the sounds of the heart and the structures/events which created them. Reid pointed out that infant heart sounds presented special problems, because the faster beat rate could be misleading.
1997apr10 The sound of a heartbeat is the final result of a chain of events: electrical impulses result in muscle tension, which results in motion of muscles and other tissues, leading to blood flow (and turbulence), valves closing, etc.; these sounds are (selectively) absorbed as they pass through tissue. Differences at each stage can contribute to differences in the final sound. How could this chain of cause and effect be represented (and viewed) in the visualization of the sound? One possibility is to look not just at the final waveform, but at various transformations of it (more description and pictures here).
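A sketch of what "various transformations" could mean in practice; the particular transformations chosen here (smoothing, envelope, onset strength, spectrum) are my own illustrative picks, not a claim about which stages of the chain they correspond to.

    import numpy as np
    from scipy.signal import hilbert, butter, sosfiltfilt

    def derived_views(signal, rate):
        """Return several transformations of one heart-sound recording,
        each emphasizing a different aspect of the sound."""
        sos = butter(4, 600, btype="lowpass", fs=rate, output="sos")
        smoothed = sosfiltfilt(sos, signal)            # remove out-of-band noise
        envelope = np.abs(hilbert(smoothed))           # overall loudness contour
        onsets = np.abs(np.diff(envelope, prepend=envelope[0]))  # sudden events
        spectrum = np.abs(np.fft.rfft(smoothed))       # overall frequency content
        return {"smoothed": smoothed, "envelope": envelope,
                "onsets": onsets, "spectrum": spectrum}
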
1997apr10 Various attributes of a sound, once extracted algorithmically, could be presented visually in different ways (also discussed and diagrammed here).
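For instance (an illustrative mapping of my own, not from the notes above): per-frame loudness could drive the height of a drawn band, and the spectral centroid ("brightness") could drive its color.

    import numpy as np

    def frame_attributes(signal, rate, frame=0.02):
        """Per-frame loudness and spectral centroid, ready to be mapped onto
        visual attributes (e.g. height and color)."""
        n = int(frame * rate)
        frames = signal[: (len(signal) // n) * n].reshape(-1, n)
        loudness = np.sqrt((frames ** 2).mean(axis=1))              # -> bar height
        spectra = np.abs(np.fft.rfft(frames, axis=1))
        freqs = np.fft.rfftfreq(n, 1 / rate)
        centroid = (spectra * freqs).sum(axis=1) / (spectra.sum(axis=1) + 1e-12)  # -> color
        return loudness, centroid
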


1997apr10 "Compared to what?" is a useful question; with a heartbeat, the usual answer is "compared to a normal heartbeat." So, we could have a display which showed not just the waveform (or some transformation of it), but the differences between the patient's waveform and a normal one (discussed and pictured here).
1997apr11 Reid explained that there was a typical way for the rhythm of the heartbeat to change when the period changed. It might be useful to have something which made it obvious when changes did not fit the typical pattern.
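One way to make atypical changes stand out, sketched below: fit whatever relation the typical beats show between cycle length and a chosen within-cycle interval, and flag beats that fall far from it. The linear fit and the threshold are placeholders, not physiology.

    import numpy as np

    def flag_atypical(cycle_lengths, intervals, threshold=3.0):
        """Fit the typical relation between cycle length and a within-cycle
        interval, then flag beats far from that relation."""
        cycle_lengths = np.asarray(cycle_lengths, float)
        intervals = np.asarray(intervals, float)
        slope, intercept = np.polyfit(cycle_lengths, intervals, 1)
        residuals = intervals - (slope * cycle_lengths + intercept)
        return np.abs(residuals) > threshold * (np.std(residuals) + 1e-12)
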
1997apr11 A fundamental principle for a diagnostic tool: the diagnostician using the tool needs to understand what the tool does. This means that even if a tool is technically complex, it should be conceptually simple.
1997apr11 A heartbeat synthesizer could be useful in several ways. A related idea is a tool that allows a "morph" between two sounds: a way to create a waveform that is an interpolation between two sounds. This could be used to introduce a small amount of a certain feature, to sensitize a student to it.
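The crudest possible "morph" is a time-domain cross-fade between two aligned, equal-length beats, sketched below; interpolating in some spectral representation would likely sound more natural, but this shows the idea of adding a small amount of a feature.

    import numpy as np

    def morph(sound_a, sound_b, amount):
        """Return an interpolation between two aligned, equal-length sounds.
        amount=0 gives sound_a, amount=1 gives sound_b; a small amount adds
        just a trace of sound_b's features to sound_a."""
        assert len(sound_a) == len(sound_b)
        return (1.0 - amount) * np.asarray(sound_a, float) + amount * np.asarray(sound_b, float)

    # e.g. morph(normal_beat, murmur_beat, 0.1) -> a barely-there murmur for training
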
1997may04 We haven't said this explicitly anywhere, though we're assuming it: use FFTs and related tools for extraction of features of the heartbeat.
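As one small example of an FFT-based feature: the energy in a few frequency bands, per frame. The frame length and band edges below are placeholders.

    import numpy as np

    def band_energies(signal, rate, frame=0.05, edges=(20, 60, 150, 300, 600)):
        """Energy per frequency band per frame, a basic FFT-based feature set."""
        n = int(frame * rate)
        frames = signal[: (len(signal) // n) * n].reshape(-1, n)
        spectra = np.abs(np.fft.rfft(frames * np.hanning(n), axis=1)) ** 2
        freqs = np.fft.rfftfreq(n, 1 / rate)
        return np.stack([spectra[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
                         for lo, hi in zip(edges[:-1], edges[1:])], axis=1)
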
2005sep19 Eldon Nelson read this page and took the idea as the subject of a thesis for his Biosystems Instrumentation class.
2005dec19 Eldon Nelson (tenthousandfailures.com) posted Investigations into Visualization and Significance of Phase Relationships in Cardiac Rhythms online.
Last update 2006mar16.