

Title:
SYSTEMS AND METHODS FOR AUTOMATED IMAGE SELECTION IN DOPPLER ULTRASOUND IMAGING SYSTEMS
Document Type and Number:
WIPO Patent Application WO/2009/013686
Kind Code:
A2
Abstract:
An ultrasound system (10) is disclosed for selecting a diagnostic image from a series of ultrasound images. An image characterization parameter, such as the standard deviation, is calculated for each of a series of Doppler ultrasound images. The image characterization parameters are then analyzed to select an image corresponding to a predetermined point in a patient's cardiac cycle. The selected image is then displayed. In some embodiments, Doppler images are processed to identify an area of interest corresponding to an individual blood vessel. The image characterization parameter is then calculated based on the area of interest. The area of interest may be identified by receiving a user input, such as positioning a cursor at a specific point on an image. In other embodiments, ultrasound images are mapped to points on an ECG waveform and an image is selected based on analysis of the ECG waveform.

Inventors:
HILL, Steven (P.O. Box 3003, Bothell, Washington, 98041-3003, US)
SAAD, Ashraf (PO Box 3003, Bothell, Washington, 98041-3003, US)
LOUPAS, Thanasis (PO Box 3003, Bothell, Washington, 98041-3003, US)
SHI, Xuegong (PO Box 3003, Bothell, Washington, 98041-3003, US)
Application Number:
IB2008/052903
Publication Date:
January 29, 2009
Filing Date:
July 18, 2008
Assignee:
KONINKLIJKE PHILIPS ELECTRONICS, N.V. (Groenewoudseweg 1, BA Eindhoven, NL-5621, NL)
HILL, Steven (P.O. Box 3003, Bothell, Washington, 98041-3003, US)
SAAD, Ashraf (PO Box 3003, Bothell, Washington, 98041-3003, US)
LOUPAS, Thanasis (PO Box 3003, Bothell, Washington, 98041-3003, US)
SHI, Xuegong (PO Box 3003, Bothell, Washington, 98041-3003, US)
International Classes:
A61B8/06; G01S15/89
Domestic Patent References:
WO2002045586A1 (2002-06-13)
Foreign References:
EP1652477A1 (2006-05-03)
US20020062078A1 (2002-05-23)
Attorney, Agent or Firm:
SCHOUTEN, Marcus, M. (Philips Intellectual Property & Standards, High Tech Campus 44, P.O. Box 220, AE Eindhoven, NL-5600, NL)
Claims:
WHAT IS CLAIMED IS:

1. A method for presenting ultrasound data comprising: emitting a series of ultrasonic waves at a target area and receiving reflected waves; analyzing the reflected waves to generate velocity data corresponding to the velocity of blood within the target area; generating a plurality of images representing the velocity data; calculating an image characterization parameter for each image; analyzing the image characterization parameters to select one of the parameter values; and displaying the image corresponding to the selected parameter value.

2. The method of claim 1, further comprising providing an interface for receiving inputs from an operator and visually scrolling through the images according to the inputs, and wherein displaying the selected image comprises automatically scrolling to the selected image.

3. The method of claim 1, wherein calculating an image characterization parameter for each image comprises calculating a standard deviation of at least a portion of each of the images.

4. The method of claim 1, further comprising identifying a region of interest within each image and wherein analyzing the images comprises analyzing the regions of interest.

5. The method of claim 4, wherein identifying a region of interest comprises identifying a region corresponding to a blood vessel.

6. The method of claim 5, wherein identifying a region of interest comprises: analyzing a plurality of the images to determine blood flow regions; generating a mask corresponding to the blood flow regions; and applying the mask to the images to generate masked images.

7. The method of claim 6, further comprising removing flash artifacts from the masked images.

8. The method of claim 6, further comprising analyzing the masked images to identify vessel portions corresponding to individual blood vessels.

9. The method of claim 8, further comprising receiving an operator input identifying a selected vessel portion of the vessel portions and wherein analyzing the images to determine the selected image comprises analyzing the selected vessel portion of each image to determine the selected image.

10. The method of claim 9, wherein analyzing the selected vessel portion of each image comprises calculating and analyzing at least one of a standard deviation, velocity-weighted flow area, mean velocity, median velocity, and a specified percentile velocity of the selected vessel portion of each image to select the image.

11. The method of claim 1, wherein analyzing the image characterization parameters to select one of the plurality of images comprises identifying one of the image characterization parameters corresponding to a predetermined point in the patient's cardiac cycle.

12. The method of claim 11, wherein the predetermined point in the patient's cardiac cycle is one of the peak systolic and the diastole.

13. The method of claim 11, wherein the predetermined point in the patient's cardiac cycle is the occurrence of at least one of regurgitant jets and trans-valvular flow.

14. A method for presenting ultrasound data comprising: emitting ultrasonic waves at a target area and receiving reflected waves while simultaneously measuring the patient's electrocardiogram (ECG); analyzing the reflected waves for each ultrasound measurement to generate velocity data corresponding to the velocity of blood within the target area for each measurement; analyzing the ECG to determine an ECG data point corresponding to occurrence of a predetermined point in the patient's cardiac cycle and identifying a selected image taken substantially concurrently with measurement of the ECG data point; and displaying the selected image.

15. The method of claim 14, wherein the predetermined point in the patient's cardiac cycle is at least one of the peak systolic and the diastole.

16. A method for presenting ultrasound data comprising: generating a plurality of images of a patient using received ultrasound echoes; displaying a video loop containing the images; calculating an image characterization parameter for each image; analyzing the calculated image characterization parameter for each of the plurality of images to select one of the plurality of images; interrupting display of the video loop upon receiving a user generated interrupt; and in response to the user generated interrupt, statically displaying the selected diagnostic image.

17. The method of claim 16, wherein selecting a diagnostic image comprises: calculating an image characterization parameter for each image; analyzing the image characterization parameters to identify the diagnostic image corresponding to a predetermined point in the patient's cardiac cycle.

18. A system for presenting ultrasound data comprising: a transducer operable to emit ultrasonic waves at a target area within a patient, receive reflected waves from the target area, and to generate an output signal corresponding to the reflected waves; a processor coupled to the transducer and operable to receive the output signal and generate a series of images based on the output signal, the processor further operable to analyze the series of images to calculate respective values of an image characterization parameter, the processor further being operable to select one of the images based on the values of the image characterization parameter; and a display device coupled to the processor and operable to display the selected image.

19. The system of claim 18, further comprising a buffer storing the series of images having a pointer associated therewith identifying one of the series of images, wherein the display device is operable to display the image associated with the pointer and wherein the processor is operable to associate the pointer with the selected image.

20. The system of claim 18, further comprising an input device for receiving inputs from an operator, the processor operable to receive the inputs and to cause the display to scroll through the images according to the inputs.

21. The system of claim 18, wherein the image characterization parameter comprises a standard deviation of at least a portion of each of the images.

22. The system of claim 18, further comprising an input device for receiving inputs from an operator, the processor operable to receive an input indicating an area of interest within the images and to calculate the respective values of the image characterization parameter in the area of interest of each image.

23. The system of claim 22 wherein the processor is operable to present a cursor on the display and interpret the inputs from the operator to change a position of the cursor on the display, and wherein the area of interest corresponds to an area of the image proximate the cursor.

24. The system of claim 22 wherein the processor is operable to analyze the images and identify vessel portions corresponding to individual vessels and wherein the processor is operable to interpret the inputs as a selection of one of the vessel portions as the area of interest.

Description:

SYSTEMS AND METHODS FOR AUTOMATED IMAGE SELECTION IN DOPPLER ULTRASOUND IMAGING SYSTEMS

[001] This invention relates to systems and methods for processing and presenting Doppler mode ultrasound images to a clinician.

[002] Advances in ultrasound technology enable imaging of blood flow within a subject's body by measuring frequency shifts in reflected ultrasound waves. Typically, velocity measurements are mapped to a color scale and displayed as a color image overlaid on a grayscale tissue density image. Such images can be used to measure blood flow within major arteries or the heart itself in order to diagnose disease.

[003] In prior systems an operator performed a series of ultrasound measurements at a frequency high enough to generate images of blood flow at various stages within the cardiac cycle. The operator then scrolled through the images and attempted to identify the image corresponding to a specific point in the patient's cardiac cycle in order to assess, for example, the patient's heart function. This process tends to be time consuming and introduces variation into the process due to different operator experience levels and human error.

[004] In view of the foregoing, it would be advantageous to provide a system for conveniently and consistently identifying ultrasound images corresponding to predetermined points in a patient's cardiac cycle.

[005] In one aspect of the invention, a series of Doppler images of blood flow within the patient are acquired. An image characterization parameter is calculated for each Doppler image. The image characterization parameters are then analyzed to select one of the Doppler images, such as a Doppler image corresponding to a predetermined point in the patient's heart cycle. The selected image is then displayed to an operator. In some embodiments, the image characterization parameter is the standard deviation of the color of pixels constituting the image.

[006] In another aspect of the invention, the Doppler images are processed to identify an area of interest corresponding to an individual blood vessel. The image characterization parameter is then calculated based on the area of interest. The area of interest may be identified by receiving a user input, such as positioning a cursor at a specific point on an image.

[007] In another aspect of the invention, a patient's electrocardiogram (ECG) is measured simultaneously with performing ultrasound scans. The ECG is then analyzed to determine the occurrence of a predetermined point in the patient's cardiac cycle. The image based on the ultrasound measurement occurring substantially simultaneously with the occurrence of the predetermined point is then identified and displayed to an operator.

[008] In the drawings:

[009] Figure 1 is a block diagram of an ultrasonic diagnostic imaging system in accordance with an embodiment of the present invention.

[010] Figure 2 is a process flow diagram of a method for identifying a diagnostic image in accordance with an embodiment of the present invention.

[011] Figure 3 is a mosaic display of a cineloop of color Doppler frames produced using an ultrasound system.

[012] Figure 4 is a mosaic display of a cineloop of color Doppler frames produced using an ultrasound system following processing to remove flash artifacts.

[013] Figure 5 is an image of a segmented color Doppler ultrasound image.

[014] Figure 6 is an image produced using an ultrasound system having a cursor and associated area of interest superimposed thereon in accordance with an embodiment of the present invention.

[015] Figure 7 is a process flow diagram of a method for selecting an area of interest in accordance with an embodiment of the present invention.

[016] Figure 8 is a block diagram of a system for selecting a diagnostic image using an electrocardiogram (ECG) monitor and an ultrasound system in accordance with an embodiment of the present invention.

[017] Figure 9 is a graph illustrating the timing of ultrasound scans relative to an ECG waveform in accordance with an embodiment of the present invention.

[018] Figure 10 is a process flow diagram of a method for identifying a diagnostic image based on an ECG waveform in accordance with an embodiment of the present invention.

[019] Figure 11 is a process flow diagram of a method for identifying a diagnostic image in a video ultrasound display system.

[020] Figure 12 is a process flow diagram of an alternative method for identifying a diagnostic image in a video ultrasound display system.

[021] Referring to Figure 1, an ultrasonic diagnostic imaging system 10 includes a probe 12 positioned in contact with a patient 14. The probe preferably includes a transducer emitting ultrasonic waves into the patient 14 and receiving waves reflected from the patient's tissues. In some embodiments, the transducer is a phased-array transducer including piezoelectric elements. The elements of the transducer are coupled to a beamformer 16 operable to produce drive signals for the transducer and process received signals in order to steer and focus beams over the patient's anatomy as known in the art.

[022] The output of the beamformer 16 is coupled to an image processor 18 and a Doppler processor 20. The Doppler processor 20 analyzes the output of the beamformer 16 as known in the art in order to determine characteristics of movement, such as blood flow, within the patient's tissues. The image processor 18 converts the output of the beamformer 16 into an image representing the densities and boundaries of tissue within the scanning area of the probe 12. The image processor 18 may also receive the output of the Doppler processor 20 and produce a color, grayscale, or other graphical representation of movement within the scanning area. Images 22 from the image processor 18 are stored in an image buffer 24. The image buffer 24 has a pointer 26 associated therewith that refers to one of the images 22 and is varied by a user interface 34.

[023] The images 22 may be analyzed by an image analysis module 28 in order to facilitate diagnosis based on the images 22. In some embodiments, the image analysis module 28 masks portions of the images 22 or creates image overlays. The images 22 and image overlays from the analysis module 28 are input to a video processor 30 that generates signals to display one or more of the images 22 and image overlays on a display 32. In some embodiments, the image overlays are stored in the image memory 24 and the video processor 30 reads the images and overlays from the image memory 24. The image 22 presented by the video processor 30 on the display 32 may be the image referred to by the pointer 26. The user interface 34 is coupled to the image analysis module 28 and receives operator inputs regarding the analysis performed, overlays to be displayed, and the like. In some embodiments, the user interface 34 enables an operator to scroll through the images 22 on the display 32. For example, the user interface 34 may be coupled with the image memory 24 to adjust the value of the pointer 26 or otherwise adjust which image 22 is displayed in order to enable a user to scroll through the images 22. The user interface 34 may include a mouse, keyboard, touch screen, or like device. In some embodiments, the user interface 34 is coupled to the video processor 30 in order to display graphical user interface elements.

[024] The image processor 18, Doppler processor 20, image buffer 24, and image analysis module 28 may represent individual components of an ultrasound system or represent different functions performed by a single computing device. The system 10 may be a dedicated ultrasound system or a general purpose computer executing software suitable for processing signals from the probe 12 and generating control signals for the probe 12.

[025] Referring to Figure 2, an ultrasound system, such as the system 10, may execute a method 36 to facilitate diagnosis using the system 10. At step 38, a series of scans are performed, preferably at closely spaced intervals within a contiguous period of time at a rate sufficiently high to capture variations in blood flow within an individual cardiac cycle. Accordingly, an image acquisition rate that is, for example, between ten and thirty times the patient's heart rate may be adequate. The scans are also preferably performed over a period including at least one complete cardiac cycle.
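As a numeric illustration of the acquisition-rate guidance above (the function name and default value are ours, not part of the disclosure):

```python
def min_frame_rate_hz(heart_rate_bpm: float, frames_per_cycle: int = 10) -> float:
    """Frames per second needed to capture `frames_per_cycle` images within
    each cardiac cycle (the description suggests 10 to 30 per cycle)."""
    return heart_rate_bpm / 60.0 * frames_per_cycle
```

For a patient at 72 beats per minute, ten frames per cycle implies an acquisition rate of at least about 12 frames per second.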

[026] At step 40, tissue density (grayscale) images representing the density of tissues within the scan area of the probe 12 are generated for each scan by the image processor 18. At step 42, Doppler images representing the speed of blood flow within the scan area of the probe 12 are generated for each scan by the Doppler processor 20. The Doppler images typically represent velocity information in color. Doppler images 44 and tissue density images 46 for each scan may be combined as shown in Figure 3. As is apparent in Figure 3, a wall 48 and lumen 50 of the proximal aorta are seen in a series of tissue density images. Blood flow 52 is shown in a Doppler image overlay 44.

[027] The Doppler and tissue density images 44, 46 may be subject to further processing in order to improve the image quality and remove noise. For example, at step 54, a threshold may be applied to the Doppler images 44 in order to remove noise. A flow mask is generated at step 56. Generating a flow mask may include analyzing the Doppler images 44 and/or the tissue density images 46 to identify portions of the image corresponding to blood vessels. Generating the flow mask may include averaging a number of the images 44 to remove pulsatility effects. Different averaging algorithms can be applied to generate the flow mask including absolute velocity averaging and vector velocity averaging. Absolute velocity averaging ignores the sign (or direction) of the averaged velocity pixels, whereas vector velocity averaging treats the velocity pixels as angular phase values. Vector averaging will show high values in regions with coherent flow and it will show small values in regions of color Doppler noise due to vector cancellation.
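The two averaging strategies described above can be sketched as follows. This is an illustrative reading of the passage only: the phase mapping (signed velocity scaled into ±π) and the threshold are our assumptions, not values given in the disclosure.

```python
import numpy as np

def flow_mask(frames: np.ndarray, v_max: float, threshold: float,
              mode: str = "vector") -> np.ndarray:
    """Reduce a stack of Doppler velocity frames (T, H, W) to a binary mask.

    'absolute' averaging ignores the sign (direction) of each velocity
    pixel; 'vector' averaging treats each pixel as a phasor whose angle
    encodes the signed velocity, so incoherent Doppler noise cancels
    while coherent flow reinforces.
    """
    if mode == "absolute":
        score = np.abs(frames).mean(axis=0)
    else:
        # Phasor with magnitude |v| and angle proportional to signed v.
        z = np.abs(frames) * np.exp(1j * np.pi * frames / v_max)
        score = np.abs(z.mean(axis=0))
    return score > threshold
```

A pixel whose velocity flips sign between frames survives absolute averaging but cancels under vector averaging, matching the noise-suppression behavior the passage describes.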

[028] The flow mask is applied to the Doppler images 44 at step 58. Applying the flow mask at step 58 removes portions of the Doppler images 44 that correspond to noise.

[029] At step 60, flash reduction is performed on the Doppler images. Flash refers to Doppler image data that is the result of transducer or tissue movement, rather than blood flow. Flash defects 62 such as those shown in Figure 3 are removed during the flash reduction step 60, as seen in Figure 4, in which the defects 62 have been removed.
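The disclosure states that flash defects are removed but does not specify an algorithm. Purely as a hypothetical sketch, transient flash pixels can be suppressed by comparing each frame against its temporal median; the thresholding rule below is our choice:

```python
import numpy as np

def reduce_flash(frames: np.ndarray, k: float = 3.0) -> np.ndarray:
    """Zero out pixels that deviate from their temporal median by more than
    k median absolute deviations (a hypothetical criterion; the disclosure
    only states that flash defects are removed)."""
    med = np.median(frames, axis=0)
    mad = np.median(np.abs(frames - med), axis=0) + 1e-9  # guard zero MAD
    out = frames.copy()
    out[np.abs(frames - med) > k * mad] = 0.0  # suppress transient outliers
    return out
```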

[030] At step 64, vessel segmentation is performed. For purposes of this disclosure, vessels may also include chambers of the heart. Vessel segmentation in this example includes analyzing the Doppler images 44 to associate blocks of pixels with individual blood vessels. The flow mask applied at step 58 may facilitate the segmentation step by identifying the extent of each blood vessel. In some embodiments, a classical "connected component algorithm" is applied to the flow mask image to segment the different vessels within the image. For example, as shown in Figure 5, area 66 has been identified as a contiguous block of pixels corresponding to blood flow within the proximal aorta.
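A minimal version of the classical connected-component labelling mentioned above can be written as a breadth-first flood fill over the flow mask (4-connectivity; in practice an off-the-shelf routine such as `scipy.ndimage.label` does the same job):

```python
import numpy as np
from collections import deque

def label_vessels(mask: np.ndarray):
    """4-connected component labelling of a binary flow mask.
    Returns (labels, count) where labels[r, c] > 0 names a vessel blob."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # already assigned to a component
        count += 1
        labels[seed] = count
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < mask.shape[0] and 0 <= nc < mask.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = count
                    queue.append((nr, nc))
    return labels, count
```

Two flow regions separated by non-flow pixels receive distinct labels, giving the per-vessel pixel blocks (such as area 66) that later steps analyze.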

[031] At step 68, the segmented vessels are displayed to an operator. At step 70, the system 10 receives an operator input indicating which of the segmented vessels to analyze. In an alternative embodiment, one or more individual vessels are selected automatically based on certain criteria such as size, image quality, or the like, without manually performing the steps 68 and 70. Different techniques of vessel selection may be applied, including shape analysis techniques to select the largest longitudinal vessel using measures such as object area and eccentricity. Heuristic information may be used in the vessel selection process where context information provided by the ultrasound system, such as the current clinical preset (arterial versus venous) and the current Doppler cursor (sample volume) position can guide the vessel selection algorithm.

[032] At step 72, a vessel selection mask is applied to the Doppler images 44 to exclude pixels other than those corresponding to blood flow in the vessel selected at step 70. At step 74, one or more image characterization parameters are calculated for each image 44. The image characterization parameters are values derived from the velocity of blood within the chosen vessel, as represented by the color and/or intensity of pixels in the image 44, which preferably includes a single blood vessel selected in step 72. The image characterization parameters may include statistical measures of the flow area and velocity content of each image 44 such as velocity-weighted flow area; mean signed or unsigned velocity; 50th, 90th, or other velocity percentiles; and/or signed or unsigned velocity spread measures, such as the standard deviation. In some embodiments, only one of the above mentioned parameters is calculated for each image 44. In other embodiments, two or more of the parameters are calculated for each image 44. In some embodiments, an operator may input which of the parameters are to be calculated and used for analyzing the images 44.
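The statistics listed in paragraph [032] can be sketched as below. The dictionary keys, and the reading of "velocity-weighted flow area" as a sum of pixel speeds, are our assumptions rather than definitions from the disclosure:

```python
import numpy as np

def characterize(frame: np.ndarray, vessel: np.ndarray) -> dict:
    """Image characterization parameters over one vessel's pixels.

    `frame` holds per-pixel velocities; `vessel` is the boolean mask
    produced by the vessel-selection step.
    """
    v = frame[vessel]
    return {
        "flow_area": int(vessel.sum()),            # pixels of flow
        "vw_flow_area": float(np.abs(v).sum()),    # assumed reading
        "mean_velocity": float(v.mean()),
        "median_velocity": float(np.median(v)),
        "p90_velocity": float(np.percentile(np.abs(v), 90)),
        "std": float(v.std()),                     # velocity spread
    }
```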

[033] At step 76, the image characterization parameters of the images 44 are analyzed to identify one of the images 44 that is useful for diagnosing heart conditions. The identified image may preferably correspond to a predetermined point in the cardiac cycle of the patient 14, such as the occurrence of peak systolic flow. For example, the standard deviation of blood flow within a vessel is typically at its largest during peak systolic flow. Accordingly, in some embodiments, the image characterization parameter is the standard deviation and analyzing the image characterization parameters at step 76 may include identifying the images 44 having the largest standard deviation.
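In the standard-deviation case, the selection rule reduces to an argmax over frames, sketched here on a velocity stack and vessel mask:

```python
import numpy as np

def select_peak_systolic(frames: np.ndarray, vessel: np.ndarray) -> int:
    """Index of the frame whose in-vessel velocity standard deviation is
    largest, which the description associates with peak systolic flow."""
    stds = [float(frames[t][vessel].std()) for t in range(frames.shape[0])]
    return int(np.argmax(stds))
```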

[034] However, images 44 associated with other points in the cardiac cycle may be identified, such as the atrial systole or the cardiac diastole. The images 44 may also be analyzed to identify the images 44 having the most aliasing or manifesting the most turbulence. The images 44 may be analyzed to determine which manifests regurgitant jets or peak trans-valvular flow.

[035] At step 78, the diagnostic image is selected according to the analysis performed at step 76, and, at step 80, the diagnostic image is displayed. In some embodiments of the invention, displaying the diagnostic image may include setting the pointer 26 to refer to the diagnostic image.

[036] Referring to Figure 6, in an alternative embodiment, the user interface 34 may receive inputs from an operator, such as from a mouse, trackball, touch screen or the like, indicating a desired position of a cursor 82 over a displayed image, such as a Doppler image 44, tissue density image 46, or a combination of Doppler and tissue density images 44, 46. The image analysis module 28 may then analyze an area 84 surrounding the cursor for each image 44, as described with respect to step 76 in the method 36, in order to determine which image 44 to display as the diagnostic image. The area 84 may be a preset size surrounding the cursor 82 or may be of a size specified by the operator.
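Extracting the area 84 around the cursor, clipped to the image bounds, can be sketched as follows (the half-width default is a stand-in for the preset or operator-specified size):

```python
import numpy as np

def area_around_cursor(frame: np.ndarray, row: int, col: int,
                       half: int = 8) -> np.ndarray:
    """Return the square area of interest centred on the cursor position,
    clipped at the image edges."""
    r0, r1 = max(0, row - half), min(frame.shape[0], row + half + 1)
    c0, c1 = max(0, col - half), min(frame.shape[1], col + half + 1)
    return frame[r0:r1, c0:c1]
```

Each Doppler frame's characterization parameters would then be computed over this patch rather than over a segmented vessel.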

[037] Referring to Figure 7, a method 86 for facilitating diagnosis using the ultrasound system 10 may include performing the image acquisition and processing steps 38-54 as in the method 36, and further includes the steps of displaying an image 44 and/or its corresponding tissue density image 46 at step 88. The image or images 44, 46 may be chosen by the operator or at random or may be the image or images 44, 46 corresponding to the first, last, or other scan within the series of scans performed at step 38. The image or images 44, 46 displayed may be chosen by the operator by, for example, allowing the operator to scroll through the images 44, 46 and provide an input indicating which of the images 44, 46 to use. Alternatively, the user interface 34 may receive inputs from the operator causing the images to scroll across the display 32, and the last displayed image will remain displayed to permit selection of a point of interest using the cursor 82.

[038] At step 90, the user interface 34 receives an operator input indicating a cursor position on the image or images 44, 46. At step 92, the area 84 containing or adjacent the position determined at step 90 is analyzed for each Doppler image 44 in order to calculate one or more characterization parameters for each image 44. The size of the area 84 may be fixed, automatically determined based on the image 44, or specified by an operator.

[039] The image characterization parameters may include statistical measures of the flow area and velocity content of image data within the area 84 for each image 44, such as: velocity-weighted flow area; mean signed or unsigned velocity; 50th, 90th, or other velocity percentiles; and/or signed or unsigned velocity spread measures, such as the standard deviation. In some embodiments, only one of the above mentioned parameters is calculated for each image 44. In other embodiments, two or more of the parameters are calculated for each image 44. In some embodiments, an operator may input which of the parameters is to be calculated and used for analyzing the images 44. The diagnostic image may then be selected at step 78 and displayed at step 80 as in the method 36 described above.

[040] In the method 86, the steps of generating a flow mask (step 56), applying the flow mask (step 58), performing vessel segmentation (step 64), displaying the segmented vessel (step 68), and receiving an operator selection of the segmented vessels (step 70), may be omitted. Alternatively, some or all of these steps may be included in the method. For example, segmentation of the vessels (step 64) and receiving an operator selection of a vessel (step 70) may precede the step of specifying the location of the cursor 82, such that the cursor 82 may be positioned within a sub-area 84 of the vessel chosen at step 70.

[041] Referring to Figure 8, in an alternative embodiment, the ultrasound system 10 is coupled to receive signals from an electrocardiogram (ECG) monitoring system 94, which receives signals from electrodes 96 coupled to the patient 14 to measure the electrical activity of the patient's heart.

[042] Referring to Figure 9, the system of Figure 8 may be used to select an ultrasound image 44 based on a measured ECG waveform 98. In one embodiment, the ECG waveform is measured simultaneously with the performance of a series of ultrasound scans, indicated by marks 100. The waveform 98 may be analyzed to determine the timing of an occurrence 102 in the patient's heart cycle such as the peak systolic, the atrial systole, the cardiac diastole, or other normal or abnormal points in the cardiac cycle. The image 44 corresponding to the scan 104 performed simultaneously, or close to simultaneously, with the occurrence 102 is then selected as the diagnostic image displayed to an operator.

[043] Referring to Figure 10, a method 108 using the system of Figure 8 includes performing a series of ultrasound scans at step 110 while simultaneously measuring the patient's ECG at step 112. The steps 110 and 112 may be registered with respect to one another or synchronized such that timing of ultrasound scans may be mapped to points along the ECG output. For example, the begin and end times of the ECG measurement may be determined according to a first clock, and each ultrasound scan may be time stamped according to either the first clock or a second clock synchronized with the first clock. Alternatively, the begin and end times of the series of ultrasound scans may be recorded or caused to be identical to the begin and end times of the ECG measurement. The rate of the ultrasound scans may then be used to map each ultrasound scan to a point on the ECG waveform.
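Under the simplest registration the passage allows (the scan series begins and ends with the ECG measurement, at a uniform scan rate), the mapping is linear interpolation over the recording window:

```python
def scan_times(ecg_start: float, ecg_end: float, n_scans: int) -> list:
    """Timestamp n_scans evenly spaced ultrasound scans over the ECG
    recording window, so each scan maps to a point on the waveform."""
    step = (ecg_end - ecg_start) / (n_scans - 1)
    return [ecg_start + i * step for i in range(n_scans)]
```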

[044] At step 114, the ultrasound signals may be processed to produce Doppler images 44, and at step 116 the signals may be processed to produce tissue density (grayscale) images 46. At step 118 the images are processed. Processing the images at step 118 may include performing some or all of the steps of the method 36, including applying a threshold (step 54), generating a flow mask (step 56), applying the flow mask (step 58), performing vessel segmentation (step 64), displaying the segmented vessel (step 68), and receiving an operator selection of the segmented vessels (step 70).

[045] At step 120, the ECG waveform is analyzed to locate an occurrence of an event in the patient's cardiac cycle. At step 122, the image 44 corresponding temporally to the event is identified by determining the timing of the event located at step 120, and identifying the image 44 corresponding to the scan that occurred closest to that time. At step 123, the image 44 identified at step 122 is displayed.
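Steps 120 through 122 amount to locating the event time on the ECG and taking the temporally closest scan, for example:

```python
def nearest_scan(times, event_time: float) -> int:
    """Index of the scan acquired closest in time to the located ECG event.
    `times` is the list of per-scan timestamps."""
    return min(range(len(times)), key=lambda i: abs(times[i] - event_time))
```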

[046] In some embodiments, a series of ultrasound images, such as one or both of the Doppler images and tissue density images 44, 46 are displayed as a video, graphically displaying blood flow over time. In such embodiments, a user may provide an input causing interruption of the video display and static display of the image being displayed upon receiving the input. Accordingly, in some embodiments, an ultrasound system such as the system 10 may optionally execute a method 124 as shown in Figure 11.

[047] At step 126 of the method 124, ultrasound scans are performed, preferably a series of consecutive scans at a rate suitable for capturing blood flow at multiple times within a single cardiac cycle. At step 128, images are generated for each scan, such as Doppler images 44 and/or tissue density (grayscale) images 46. At step 130, the images are displayed as a video. The video may be looped continuously or displayed a single time upon receiving an input from the operator causing the video display to begin. At step 132, a user interrupt is received. The user interrupt may be generated upon the user pushing a button or interacting with a graphic user interface element labeled "freeze," "stop," or the like. At step 134, the video display is interrupted. At step 136, the diagnostic image is selected according to one of the methods described above, such as the method illustrated in Figures 2 through 6. At step 138, the diagnostic image selected at step 136 is statically displayed.

[048] With further reference to Figure 1, in an alternative embodiment an ultrasound system such as the system 10 may perform a method 140 as shown in Figure 12. At step 142 in the method 140, ultrasound scans are performed, preferably a series of consecutive scans at a rate suitable for capturing blood flow at multiple times within a single cardiac cycle. At step 144, images are generated for each scan, such as Doppler images 44 and/or tissue density (grayscale) images 46. At step 146, the images are stored in an image buffer, such as the image buffer 24 of the system 10. At step 148, a video loop of the images within the image buffer 24 is displayed. For example, the video processor 30 may automatically sequentially display the images within the image buffer 24.

[049] At step 150, a user interrupt is received, such as in response to the user pushing a button or interacting with a graphic user interface element labeled "freeze," "stop," or the like. In other embodiments of the invention, the user interrupt occurs when the operator switches between different viewing modes. For example, an ultrasound system 10 may have a duplex display mode in which one portion of the display 32 shows the Doppler image 44 and/or tissue density image 46, and another portion of the display 32 shows spectral Doppler data, or a selected portion of each Doppler image 44. In such systems, only one portion of the display is "live" (i.e., scrolling continuously through the images in the image buffer 24 or continuously acquiring and displaying spectral data from acquired Doppler data). A user may select which of the portions is live by providing an input, such as pressing an "update" key. Upon providing the input, the live portion is frozen and the other portion becomes live. Accordingly, where the Doppler image 44 and/or tissue density image 46 have been live, pressing the update key in such a system will stop video display of these images and provide the user interrupt that triggers automatic selection and display of the diagnostic image as described below. Alternatively, in some embodiments pressing the update key does not trigger selection and display of the diagnostic image; rather, selection is triggered only when the "freeze" button is pressed to stop video display of images or acquisition of spectral data.

[050] At step 152, the video display is interrupted. At step 154, the images are read from the image buffer 24. In some embodiments, the image analysis module 28 reads the images from the image buffer 24. At step 156, the diagnostic image is selected according to any of the methods described hereinabove, such as the method illustrated in Figures 2 through 6. The images in the image buffer may be updated at step 158, and the diagnostic image selected at step 156 is statically displayed at step 160.