US20140105478A1 - Ultrasound image processing apparatus - Google Patents
- Publication number
- US20140105478A1 (U.S. application Ser. No. 14/125,131)
- Authority
- US
- United States
- Prior art keywords
- image data
- movement
- point
- search
- diagnostic information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0883—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
- A61B8/145—Echo-tomography characterised by scanning multiple planes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/486—Diagnostic techniques involving arbitrary m-mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
Definitions
- the present invention relates to an ultrasound image processing apparatus, and in particular, to a device which executes a correlation calculation between image data.
- Patent Documents 1 and 2 disclose epoch-making techniques in which a motion of a cardiac muscle is traced over a plurality of frames by pattern matching based on the correlation calculation.
- a template is set at a location of interest in one of the sets of image data, and a correlation of the image data in the template is calculated while the template is moved in the other set of image data.
- a position of the template in the other set of image data having the largest similarity is set as a position corresponding to the location of interest.
- a position corresponding to the location of interest is searched (tracked) over a plurality of time phases, to enable observation of the movement of the location of interest.
- a fixed point which forms a reference point is set, and a distance from the fixed point is calculated, to enable quantitative evaluation of the movement of the location of interest.
- the present inventors have researched and developed the correlation calculation between ultrasound image data.
- the present inventors have focused attention on the function to observe the movement of the location of interest.
- the present invention was made in the course of the research and development, and an advantage of the present invention is provision of a technique for appropriately evaluating movement of a set point which is set on a location of interest.
- an ultrasound image processing apparatus comprising an image storage unit which stores ultrasound image data of a plurality of time phases, an image processor which searches image data of a search time phase for a movement point corresponding to a set point which is set in image data of a reference time phase, based on a correlation calculation between image data, and a diagnostic information generator which determines a primary direction of movement of the set point over a plurality of time phases based on the movement points searched over a plurality of time phases and obtains diagnostic information by evaluating the movement of the set point with the primary direction as a reference.
- the diagnostic information generator determines the primary direction according to a spatial variation of the movement points searched over a plurality of time phases.
- the diagnostic information generator sets a fixed point which becomes a reference for diagnosis on a line corresponding to the primary direction.
- the diagnostic information generator forms a displacement waveform showing a change with respect to time of a distance from the fixed point to the movement point.
- the image processor comprises a template generation function to generate, based on a set point which is set in the image data of the reference time phase, a template corresponding to the set point, a search area setting function to set a search area in the image data of a search time phase, a correlation calculation function to execute a correlation calculation for each position in the search area while moving the template in the search area, based on image data of a template of the reference time phase and image data overlapping a template of the search time phase, and a weighting process function to apply a weighting process on a result of the correlation calculation obtained at each position in the search area based on the distance from a reference position to the position, wherein a movement point corresponding to the set point is searched in the search area based on a result of the correlation calculation to which the weighting process is applied.
- a program which, when executed, causes a computer which processes ultrasound image data of a plurality of time phases to realize an image processing function to search image data of a search time phase for a movement point corresponding to a set point which is set in image data of a reference time phase, based on a correlation calculation between image data, and a diagnostic information generating function to determine a primary direction of movement of the set point over a plurality of time phases based on the movement points searched over a plurality of time phases and to obtain diagnostic information by evaluating the movement of the set point with the primary direction as a reference.
- the above-described program is stored on a computer-readable storage medium such as, for example, a disk and a memory, and is provided to a computer through the storage medium.
- the program may be provided to the computer through an electrical communication line such as the Internet.
- movement of a set point which is set at a location of interest can be appropriately evaluated.
- movement of the set point is evaluated with the primary direction of movement over a plurality of time phases as a reference, there can be obtained diagnostic information in which influence of movement is relatively strongly reflected and the movement is sensitively captured.
- FIG. 1 is a diagram showing an overall structure of an ultrasound diagnostic apparatus according to a preferred embodiment of the present invention.
- FIG. 2 is a diagram for explaining a pattern matching between image data.
- FIG. 3 is a diagram showing an example setting of a search area according to a position of a movement point.
- FIG. 4 is a diagram showing a specific example of a weighting coefficient used for a weighting process.
- FIG. 5 is a flowchart showing a process at a pattern matching processor.
- FIG. 6 is a diagram for explaining setting of a fixed point which is used as a reference in diagnosis.
- FIG. 7 is a diagram showing an example display image including a displacement waveform.
- FIG. 8 is a flowchart showing a process performed in a diagnosis information generator.
- FIG. 1 is a diagram showing an overall structure of an ultrasound diagnostic apparatus preferable in practicing the present invention.
- the ultrasound diagnostic apparatus of FIG. 1 has functions of an ultrasound image processing apparatus according to a preferred embodiment of the present invention.
- a probe 10 is an ultrasound probe which transmits and receives ultrasound to and from an area including a target object such as, for example, a heart and a muscle.
- the probe 10 comprises a plurality of transducer elements which transmit and receive ultrasound, and transmission of the plurality of transducer elements is controlled by a transmitting and receiving unit 12 , to form a transmission beam.
- the plurality of transducer elements also receive the ultrasound obtained from the area including the target object, a signal thus obtained is output to the transmitting and receiving unit 12 , and the transmitting and receiving unit 12 forms a reception beam, to collect echo data along the reception beam.
- the probe 10 scans the ultrasound beam (transmission beam and reception beam) in a two-dimensional plane, and collects the echo data.
- alternatively, the probe 10 may be a three-dimensional probe which three-dimensionally scans the ultrasound beam in a three-dimensional space.
- an image forming unit 20 forms ultrasound image data based on the collected echo data.
- the image forming unit 20 forms image data of, for example, a B mode image.
- the image forming unit 20 also forms a plurality of sets of image data corresponding to a plurality of ultrasound images.
- the image forming unit 20 forms a plurality of sets of image data showing the target object over a plurality of points of time (plurality of time phases).
- a plurality of sets of image data showing the target object at different positions may be formed while the probe 10 is gradually moved.
- the plurality of sets of image data formed by the image forming unit 20 are stored in an image storage unit 22 .
- a pattern matching processor 30 functions as an image processor which executes the pattern matching between image data.
- the pattern matching processor 30 has a function to generate a template which is set in image data, a function to set a search area in the image data, a function to execute a correlation calculation based on the image data in the template, and a function to apply a weighting process on a result of the correlation calculation.
- the pattern matching processor 30 executes the pattern matching between image data based on the correlation calculation on the plurality of sets of image data stored in the image storage unit 22 .
- FIG. 2 is a diagram for explaining the pattern matching between image data, and shows a process between image data of a reference time phase and image data of each search time phase.
- the image data of the reference time phase and the image data of the search time phase are, for example, image data obtained from the same heart at different points of time.
- a set point P is set by a user, such as an inspector, in the image data of the reference time phase.
- a template T is set surrounding the set point in the image data of the reference time phase.
- FIG. 2 shows a template T having a square shape centered at the set point.
- a size of the template T in terms of a number of pixels is, for example, about 20 pixels in the vertical direction and 20 pixels in the horizontal direction.
- the size, shape, and position of the template T are not limited to those of the specific example configuration of FIG. 2 . Alternatively, the size, shape, and position of the template T may be changed by the user.
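As a concrete illustration of the template setting described above, the following sketch (in Python with NumPy; the function and parameter names are illustrative, not from the patent) cuts an approximately 20 × 20-pixel template centered at a set point:

```python
import numpy as np

def extract_template(image, set_point, half=10):
    """Cut a square template centered at the set point.

    `image` is a 2-D array of pixel values; `set_point` is (row, col).
    With half=10 the template is about 20x20 pixels, matching the
    example size given above. Names are illustrative assumptions.
    """
    r, c = set_point
    return image[r - half:r + half, c - half:c + half]
```

The same slicing can be reused when extracting the area overlapping the template in the image data of each search time phase.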
- a search area SA is set in the image data of each search time phase.
- the search area SA is fixedly set, for example, at the same position in the image data over a plurality of search time phases.
- the size and shape of the search area SA are also fixedly set.
- the size (breadth) of the search area SA is preferably relatively large.
- the entire image data of each search time phase may be set as the search area SA.
- the search area SA may be set, over a plurality of search time phases and for each search time phase, to surround a position of a movement point searched in the image data of the time phase adjacent to the search time phase.
- the search area SA may be determined for a certain search time phase using, as a reference, a movement point which is searched one time phase prior to that search time phase.
- FIG. 3 is a diagram showing an example setting of the search area SA corresponding to a position of the movement point. Specifically, a specific example configuration is shown in which, for each search time phase, the search area SA of the search time phase is set corresponding to a movement point P′ searched in the previous time phase of the search time phase.
- a rectangle shown by a dotted line shows an area corresponding to the template T which is set in the image data of the reference time phase.
- a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set, and the area corresponding to the template T is located within the search area SA.
- a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set.
- the area corresponding to the template T extends beyond the search area SA.
- the search area SA is expanded such that the area corresponding to the template T is within the search area SA.
- the upper side and the right side are translated to expand the search area SA such that the template T is included.
- a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set.
- the area corresponding to the template T is outside of the search area SA.
- the search area SA is expanded such that the area corresponding to the template T is within the search area SA.
- the right side is translated to expand the search area SA such that the area corresponding to the template T is included.
- the search area SA is set centered at the movement point which is searched in the previous time phase as in the setting examples shown in FIG. 3 , so that the movement point to be originally detected can be searched even when the search area SA is set relatively narrow.
- the search area SA is expanded to include the area corresponding to the template T of the reference time phase, even in a case, for example, in which an organ which involves periodical motion such as a heart is diagnosed over a plurality of time phases and the organ returns to the state corresponding to the reference time phase, an area corresponding to the template T of the reference time phase can be included as a candidate of the movement point.
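The search-area expansion illustrated in FIG. 3 can be sketched as a simple rectangle operation. This hypothetical helper (names and the tuple representation are assumptions) translates only the sides that need to move so that the area corresponding to the template fits inside the search area:

```python
def expand_to_include(search_area, template_rect):
    """Expand a rectangular search area so a template rectangle fits inside.

    Rectangles are (top, left, bottom, right) in pixel coordinates.
    Each side is translated outward only as far as needed, as in the
    setting examples of FIG. 3.
    """
    t0, l0, b0, r0 = search_area
    t1, l1, b1, r1 = template_rect
    return (min(t0, t1), min(l0, l1), max(b0, b1), max(r0, r1))
```

When the template rectangle is already inside the search area, the search area is returned unchanged, matching the first case of FIG. 3.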
- the template T and the search area SA are set, the template T is moved in the search area SA of the image data of each search time phase, and, at each position, a correlation value is calculated based on a plurality of pixels within the template T of the image data of the reference time phase and a plurality of pixels in an area overlapping the template T of the image data of each search time phase.
- a position shown by a dotted rectangle in the search area SA in FIG. 2 is set as an initial position
- the template T is moved stepwise from the initial position in the x direction and the y direction, a correlation value is calculated at each position, and a plurality of correlation values are calculated corresponding to a plurality of positions over the entire area in the search area SA.
- the correlation value is a numerical value indicating a degree of correlation (degree of similarity) between image data, and known equations corresponding to each method of correlation calculation may be used to compute it. For example, as in a phase-only correlation or a cross-correlation, a correlation value which shows a larger value as the degree of similarity becomes larger may be used, or, as in a sum of absolute differences, a correlation value which shows a smaller value as the degree of similarity becomes larger may be used. In the present embodiment, a self-correlation value which shows a smaller value as the degree of similarity becomes larger is used as a specific example of the correlation value. In addition, in the present embodiment, a weighting process to be described later is applied to the result of the correlation calculation (self-correlation value) obtained at each position in the search area SA.
- a position having the largest degree of similarity is identified from the plurality of positions and is set as a movement point which is a point to which the set point has moved.
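One possible reading of this search step, assuming a "smaller is more similar" measure (a sum of absolute differences is used here in place of the embodiment's self-correlation value), is the following brute-force sketch; the function name, the rectangle convention, and the loop structure are illustrative assumptions:

```python
import numpy as np

def search_movement_point(ref_image, search_image, set_point, search_area, half=10):
    """Find the candidate position whose patch best matches the template.

    The template is cut from the reference-time-phase image around the
    set point; for each candidate center in search_area (top, left,
    bottom, right), a sum of absolute differences is computed against
    the overlapping patch of the search-time-phase image, and the
    position with the smallest value (largest similarity) is returned.
    """
    r0, c0 = set_point
    template = ref_image[r0 - half:r0 + half, c0 - half:c0 + half].astype(float)
    top, left, bottom, right = search_area
    best, best_pos = None, None
    for r in range(top, bottom):
        for c in range(left, right):
            patch = search_image[r - half:r + half, c - half:c + half].astype(float)
            sad = np.abs(patch - template).sum()
            if best is None or sad < best:
                best, best_pos = sad, (r, c)
    return best_pos
```

In the embodiment the weighting process described next would be applied to each value before the minimum is taken.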
- FIG. 4 is a diagram showing a specific example of a weighting coefficient used for the weighting process.
- a horizontal axis of FIG. 4 represents a distance from a reference position for each position for which the self-correlation value is calculated.
- the reference position is, for example, the position of the movement point searched in the previous time phase (for example, reference numeral P′ in FIG. 3 ) or the position of the set point which is set in the image data of the reference time phase (for example, reference numeral P in FIG. 2 ).
- a vertical axis of FIG. 4 represents a weighting coefficient. As shown in FIG. 4 , a weighting coefficient at a distance d is k(d).
- the weighting coefficient is set smaller for an area closer to the reference position having a higher possibility of having a high degree of similarity, and the weighting coefficient is increased as the position becomes farther away from the reference position.
- the self-correlation value at each position is multiplied by the weighting coefficient corresponding to the position, and the movement point which is a point to which the set point has moved is searched based on the multiplication result; that is, the self-correlation value to which the weighting process is applied.
- a search which places more importance on the area near the reference position, where a relatively high degree of correlation is expected, is thus enabled, while overlooking of the true movement point is reduced by setting a relatively wide search area.
- significant deviation of the searched movement point from the actual moved point of the set point can be inhibited, and, as a result, the precision of the search based on the correlation calculation can be improved.
- the weighting coefficient shown in FIG. 4 is merely exemplary; alternatively, for example, a weighting coefficient which changes non-linearly with distance or which changes stepwise (in steps) with distance may be used.
- conversely, when a correlation value which shows a larger value as the degree of similarity becomes larger is used, a weighting coefficient which becomes larger as the position becomes closer to the reference position and which becomes smaller as the position becomes farther away from the reference position is preferable.
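A minimal sketch of a linear weighting coefficient of the kind shown in FIG. 4, for the "smaller is more similar" case; the parameters k0 and slope are illustrative assumptions, since the patent does not give concrete values:

```python
def weight(d, k0=1.0, slope=0.05):
    """Linear weighting coefficient k(d) for a distance d from the
    reference position: small near the reference, growing with distance.

    For a 'smaller is more similar' value a (such as the self-correlation
    value of the embodiment), the weighted value a' = k(d) * a favors
    candidate positions near the reference position.
    """
    return k0 + slope * d
```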
- FIG. 5 is a flowchart showing a process by the pattern matching processor 30 ( FIG. 1 ).
- a template T corresponding to the set point P is set in the image data of the reference time phase (S 502 ; refer to FIG. 2 ).
- a search area SA is set in the image data of each search time phase (S 503 ; refer to FIGS. 2 and 3 ), a template T is set in the search area SA, and a self-correlation value a is calculated in the position of the template T based on a plurality of pixels in the template T of the image data of the reference time phase and a plurality of pixels in an area overlapping the template T in the image data of each search time phase (S 504 ).
- a distance d from the reference position to each position is calculated, a weighting coefficient k(d) for that position is determined (S 505 ; refer to FIG. 4 ), the self-correlation value a is multiplied by the weighting coefficient k(d), and a self-correlation value a′ after the weighting process is calculated (S 506 ).
- when the search for image data of all search time phases is completed at the pattern matching processor 30, a diagnostic information generator 40 generates diagnostic information related to a target tissue based on the result of the search.
- the diagnostic information generator 40 has a function to set a fixed point which becomes a reference in the diagnosis, and a function to generate, as diagnostic information, a displacement waveform related to a movement point, with the fixed point as a reference.
- a display image including the displacement waveform formed in the diagnostic information generator 40 is displayed on a display unit 50 .
- FIG. 6 is a diagram for explaining setting of a fixed point which forms a reference in the diagnosis.
- FIG. 6 shows movement points P′ of a plurality of time phases searched in the image data over a plurality of search time phases for a certain set point P.
- the diagnostic information generator 40 determines a primary direction of movement of the set point P over the plurality of time phases based on the movement points P′ of the plurality of time phases for the set point P. For example, as shown in FIG. 6 , a rectangle R which surrounds all of the plurality of movement points P′ and whose four sides circumscribe the movement points P′ is set, and the major axis passing through the center of the rectangle R is set as the primary direction D.
- the determination of the primary direction D is not limited to the method of using the rectangle R.
- a known method commonly referred to as principal component analysis may be used, and a direction which most clearly represents the spatial variation of the movement points P′; that is, a direction where the variance of the movement points P′ is the maximum, may be set as the primary direction D.
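The principal-component-analysis variant mentioned above can be sketched as follows: the direction of maximum variance of the movement points is the eigenvector of their covariance matrix with the largest eigenvalue. This is a standard PCA computation, offered as an illustration rather than the patent's own implementation:

```python
import numpy as np

def primary_direction(points):
    """Direction of maximum variance of tracked movement points.

    Computed by principal component analysis: the eigenvector of the
    covariance matrix belonging to the largest eigenvalue. Returns a
    unit vector (its sign is arbitrary).
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)
    d = eigvecs[:, np.argmax(eigvals)]
    return d / np.linalg.norm(d)
```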
- the diagnostic information generator 40 sets a fixed point F on the determined primary direction D.
- the user may set a provisional fixed point F in advance, and the provisional fixed point F may be moved to the fixed point F determined by the diagnostic information generator 40 .
- the diagnostic information generator 40 calculates a distance from the set fixed point F to each movement point P′, and generates a displacement waveform representing a change of the distance over the plurality of time phases.
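A minimal sketch of this displacement-waveform computation, returning the distance from the fixed point F to the movement point in each time phase (plotting these values against time gives the waveform of FIG. 7; the function name is an assumption):

```python
import numpy as np

def displacement_waveform(fixed_point, movement_points):
    """Distance from the fixed point F to the tracked movement point
    in each time phase, in the order the time phases were searched."""
    f = np.asarray(fixed_point, dtype=float)
    return [float(np.linalg.norm(np.asarray(p, dtype=float) - f))
            for p in movement_points]
```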
- FIG. 7 is a diagram showing an example display image including the displacement waveform.
- FIG. 7 shows an example display image obtained when a heart is diagnosed as the target tissue.
- the display image of FIG. 7 includes a tomographic image related to the heart, and 4 set points A, B, C, and D which are set for the diagnosis of the heart are shown in the tomographic image.
- FIG. 7 shows an example display image including displacement waveforms L 1 , L 2 , L 3 , and L 4 obtained from the set points A, B, C, and D, respectively.
- because the fixed point F is set on the primary direction of movement over a plurality of time phases, in each displacement waveform obtained with reference to the fixed point F, an influence of movement of each set point is relatively strongly reflected and the movement of each set point is relatively sensitively represented.
- the fixed point F is desirably set according to a unified reference.
- all fixed points F for all set points are set outside of the cardiac muscle.
- all fixed points for all set points may be set inside the cardiac muscle.
- the example display image shown in FIG. 7 also includes, as an auxiliary display when the transmission status of motion is to be observed, a connection curve 104 connecting maximum value points of the displacement waveforms L 1 , L 2 , L 3 , and L 4 , graphs M 1 , M 2 and M 3 representing the time differences of motion between set points, and a cursor 106 which is used when the user designates a particular time phase.
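The auxiliary displays mentioned above could be derived as in this hypothetical sketch, which locates the maximum-value point (peak time index) of each displacement waveform and the time differences between adjacent set points; how the patent actually computes them is not specified, so this is one plausible reading:

```python
import numpy as np

def peak_time_differences(waveforms):
    """Time index of the maximum of each displacement waveform, and the
    differences between peak times of adjacent set points, as one way
    to quantify how motion propagates between set points."""
    peaks = [int(np.argmax(w)) for w in waveforms]
    diffs = [b - a for a, b in zip(peaks, peaks[1:])]
    return peaks, diffs
```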
- FIG. 8 is a flowchart showing a process in the diagnostic information generator 40 ( FIG. 1 ).
- the primary direction D of movement over the plurality of time phases for the set point is determined based on the plurality of movement points (S 802 ; refer to FIG. 6 ), and the fixed point F is set on the primary direction D (S 803 ; refer to FIG. 6 ).
- the user may designate, on the primary direction D, an outer side or an inner side of the cardiac muscle, and the fixed point F may be set according to the designation.
- An ultrasound diagnostic apparatus has been described.
- the functions of the pattern matching processor 30 and the diagnostic information generator 40 shown in FIG. 1 may be realized on a computer, and the computer may be made to function as an ultrasound image processing apparatus.
- the above-described preferred embodiment is merely exemplary in every aspect, and does not limit the scope of the present invention thereto.
- the present invention includes various modified configurations within the scope of the essentials thereof.
Description
- An ultrasound image processing apparatus and an ultrasound diagnostic apparatus are known which execute a correlation calculation on image data of ultrasound images obtained by transmitting and receiving ultrasound. For example, Patent Documents 1 and 2 disclose techniques in which a motion of a cardiac muscle is traced over a plurality of frames by pattern matching based on the correlation calculation.
-
- [Patent Document 1] JP 2007-130063 A
- [Patent Document 2] JP 2007-143606 A
- In view of the related art described above, the present inventors have researched and developed the correlation calculation between ultrasound image data. In particular, the present inventors have focused attention in the function to observe the movement of the location of interest.
- The present invention was made in the course of the research and development, and an advantage of the present invention is provision of a technique for appropriately evaluating movement of a set point which is set on a location of interest.
- According to one aspect of the present invention, there is provided an ultrasound image processing apparatus comprising an image storage unit which stores ultrasound image data of a plurality of time phases, an image processor which searches image data of a search time phase for a movement point corresponding to a set point which is set in image data of a reference time phase, based on a correlation calculation between image data, and a diagnostic information generator which determines a primary direction of movement of the set point over a plurality of time phases based on the movement points searched over a plurality of time phases and obtains diagnostic information by evaluating the movement of the set point with the primary direction as a reference.
- According to above-described configuration, because the movement of the set point is evaluated with the primary direction of movement over a plurality of phases as a reference, for example, there can be obtained diagnostic information in which influence of the movement is relatively strongly reflected and the movement is sensitively captured.
- According to another aspect of the present invention, preferably, the diagnostic information generator determines the primary direction according to a spatial variation of the movement points searched over a plurality of time phases.
- According to another aspect of the present invention, preferably, the diagnostic information generator sets a fixed point which becomes a reference for diagnosis on a line corresponding to the primary direction.
- According to another aspect of the present invention, preferably, the diagnostic information generator forms a displacement waveform showing a change with respect to time of a distance from the fixed point to the movement point.
- According to another aspect of the present invention, preferably, the image processor comprises a template generation function to generate, based on a set point which is set in the image data of the reference time phase, a template corresponding to the set point, a search area setting function to set a search area in the image data of a search time phase, a correlation calculation function to execute a correlation calculation for each position in the search area while moving the template in the search area, based on image data of a template of the reference time phase and image data overlapping a template of the search time phase, and a weighting process function to apply a weighting process on a result of the correlation calculation obtained at each position in the search area based on the distance from a reference position to the position, wherein a movement point corresponding to the set point is searched in the search area based on a result of the correlation calculation to which the weighting process is applied.
- According to another aspect of the present invention, there is provided a program which, when executed, causes a computer which processes ultrasound image data of a plurality of time phases to realize an image processing function to search image data of a search time phase for a movement point corresponding to a set point which is set in image data of a reference time phase, based on a correlation calculation between image data, and a diagnostic information generating function to determine a primary direction of movement of the set point over a plurality of time phases based on the movement points searched over a plurality of time phases and to obtain diagnostic information by evaluating the movement of the set point with the primary direction as a reference.
- The above-described program is stored on a computer-readable storage medium such as, for example, a disk and a memory, and is provided to a computer through the storage medium. Alternatively, the program may be provided to the computer through an electrical communication line such as the Internet.
- According to various aspects of the present invention, movement of a set point which is set at a location of interest can be appropriately evaluated. For example, according to a preferred configuration of the present invention, because movement of the set point is evaluated with the primary direction of movement over a plurality of time phases as a reference, there can be obtained diagnostic information in which influence of movement is relatively strongly reflected and the movement is sensitively captured.
FIG. 1 is a diagram showing an overall structure of an ultrasound diagnostic apparatus according to a preferred embodiment of the present invention. -
FIG. 2 is a diagram for explaining a pattern matching between image data. -
FIG. 3 is a diagram showing an example setting of a search area according to a position of a movement point. -
FIG. 4 is a diagram showing a specific example of a weighting coefficient used for a weighting process. -
FIG. 5 is a flowchart showing a process at a pattern matching processor. -
FIG. 6 is a diagram for explaining setting of a fixed point which is used as a reference in diagnosis. -
FIG. 7 is a diagram showing an example display image including a displacement waveform. -
FIG. 8 is a flowchart showing a process performed in a diagnostic information generator. -
FIG. 1 is a diagram showing an overall structure of an ultrasound diagnostic apparatus preferable in practicing the present invention. The ultrasound diagnostic apparatus of FIG. 1 has the functions of an ultrasound image processing apparatus according to a preferred embodiment of the present invention. - A
probe 10 is an ultrasound probe which transmits and receives ultrasound to and from an area including a target object such as, for example, a heart and a muscle. The probe 10 comprises a plurality of transducer elements which transmit and receive ultrasound, and transmission of the plurality of transducer elements is controlled by a transmitting and receiving unit 12, to form a transmission beam. The plurality of transducer elements also receive the ultrasound obtained from the area including the target object, a signal thus obtained is output to the transmitting and receiving unit 12, and the transmitting and receiving unit 12 forms a reception beam, to collect echo data along the reception beam. - The
probe 10 scans the ultrasound beam (transmission beam and reception beam) in a two-dimensional plane, and collects the echo data. Alternatively, there may be used a three-dimensional probe which three-dimensionally scans the ultrasound beam in a three-dimensional space. - When the ultrasound beam is scanned in the area including the target object and the echo data are collected by the transmitting and receiving
unit 12, an image forming unit 20 forms ultrasound image data based on the collected echo data. The image forming unit 20 forms image data of, for example, a B mode image. The image forming unit 20 also forms a plurality of sets of image data corresponding to a plurality of ultrasound images. For example, the image forming unit 20 forms a plurality of sets of image data showing the target object over a plurality of points of time (a plurality of time phases). Alternatively, a plurality of sets of image data showing the target object at different positions may be formed while the probe 10 is gradually moved. The plurality of sets of image data formed by the image forming unit 20 are stored in an image storage unit 22. - A pattern matching
processor 30 functions as an image processor which executes the pattern matching between image data. The pattern matching processor 30 has a function to generate a template which is set in image data, a function to set a search area in the image data, a function to execute a correlation calculation based on the image data in the template, and a function to apply a weighting process on a result of the correlation calculation. The pattern matching processor 30 executes the pattern matching between image data based on the correlation calculation on the plurality of sets of image data stored in the image storage unit 22. -
FIG. 2 is a diagram for explaining the pattern matching between image data, and shows a process between image data of a reference time phase and image data of each search time phase. The image data of the reference time phase and the image data of the search time phase are, for example, image data obtained from the same heart at different points of time. In the pattern matching, first, a set point P is set by a user such as an inspector in the image data of the reference time phase, and a template T is set surrounding the set point in the image data of the reference time phase. FIG. 2 shows a template T having a square shape centered at the set point. A size of the template T in terms of a number of pixels is, for example, about 20 pixels in the vertical direction and 20 pixels in the horizontal direction. The size, shape, and position of the template T are not limited to those of the specific example configuration of FIG. 2. Alternatively, the size, shape, and position of the template T may be changed by the user. - When the template T is set, a search area SA is set in the image data of each search time phase. The search area SA is fixedly set, for example, at the same position in the image data over a plurality of search time phases. In this case, for example, the size and shape of the search area SA are also fixedly set. In order to fix the position of the search area SA, the size (breadth) of the search area SA is preferably relatively large. Alternatively, the entire image data of each search time phase may be set as the search area SA.
- The search area SA may be set, over a plurality of search time phases and for each search time phase, to surround a position of a movement point searched in the image data of the time phase adjacent to the search time phase. In other words, as will be described later, because a movement point corresponding to a set point P which is set in the image data of the reference time phase is searched in the search area SA which is set for the image data of each search time phase, when the movement points are sequentially searched over a plurality of search time phases, the search area SA may be determined for a certain search time phase using, as a reference, a movement point which is searched one time phase prior to that search time phase.
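As a sketch of this per-phase setting, the search area can be re-centered on the movement point found in the previous time phase and then stretched so that the area corresponding to the reference-phase template always remains a candidate. The rectangle representation, function name, and margin values below are illustrative assumptions, not the patented implementation:

```python
def next_search_area(prev_point, half_w, half_h, template_rect, bounds):
    """Set the search area SA for a search time phase: center a rectangle on
    the movement point P' found in the previous time phase, clip it to the
    image, then translate its sides outward until the area corresponding to
    the reference-phase template T is contained (cf. setting examples 2, 3)."""
    px, py = prev_point
    left, top = max(0, px - half_w), max(0, py - half_h)
    right, bottom = min(bounds[0], px + half_w), min(bounds[1], py + half_h)
    t_left, t_top, t_right, t_bottom = template_rect
    # Expand only as far as needed so the template area lies inside SA.
    left, top = min(left, t_left), min(top, t_top)
    right, bottom = max(right, t_right), max(bottom, t_bottom)
    return (left, top, right, bottom)
```

With a template area lying entirely to the right of the previous movement point, only the right side of the rectangle is translated, mirroring setting example 3; a template already inside the rectangle leaves it unchanged, as in setting example 1.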
FIG. 3 is a diagram showing an example setting of the search area SA corresponding to a position of the movement point. Specifically, a configuration is shown in which, for each search time phase, the search area SA of the search time phase is set corresponding to a movement point P′ searched in the previous time phase of the search time phase. In FIG. 3, a rectangle shown by a dotted line shows an area corresponding to the template T which is set in the image data of the reference time phase. - In setting example 1, a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set, and the area corresponding to the template T is located within the search area SA.
- In setting example 2 as well, a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set. However, in this state, the area corresponding to the template T extends beyond the search area SA. When the template T extends beyond the search area SA in this manner, the search area SA is expanded such that the area corresponding to the template T falls within the search area SA. For example, in setting example 2, an upper side and a right side are translated to expand the search area SA so that the template T is included.
- In setting example 3 as well, first, a rectangular search area SA centered at the movement point P′ searched in the previous time phase is set. However, in this state, the area corresponding to the template T is completely outside of the search area SA. In such a case also, the search area SA is expanded such that the area corresponding to the template T falls within the search area SA. For example, in setting example 3, the right side is translated to expand the search area SA so that the template T is included.
- Because it is highly likely that the movement point which is searched is located near the movement point which is searched in the previous time phase, the search area SA is set centered at the movement point which is searched in the previous time phase as in the setting examples shown in
FIG. 3 , so that the movement point to be originally detected can be searched even when the search area SA is set relatively narrow. In addition, because the search area SA is expanded to include the area corresponding to the template T of the reference time phase, even in a case, for example, in which an organ which involves periodical motion such as a heart is diagnosed over a plurality of time phases and the organ returns to the state corresponding to the reference time phase, an area corresponding to the template T of the reference time phase can be included as a candidate of the movement point. - Referring again to
FIG. 2 , when the template T and the search area SA are set, the template T is moved in the search area SA of the image data of each search time phase, and, at each position, a correlation value is calculated based on a plurality of pixels within the template T of the image data of the reference time phase and a plurality of pixels in an area overlapping the template T of the image data of each search time phase. For example, a position shown by a dotted rectangle in the search area SA in FIG. 2 is set as an initial position, the template T is moved stepwise from the initial position in the x direction and the y direction, a correlation value is calculated at each position, and a plurality of correlation values are calculated corresponding to a plurality of positions over the entire area in the search area SA. - The correlation value is a numerical value indicating a degree of correlation (degree of similarity) between image data, and, for its calculation, known equations may be used corresponding to each method of correlation calculation. For example, as in a phase-only correlation or a cross-correlation, a correlation value which shows a larger value as the degree of similarity becomes larger may be used, or, as in a sum of absolute differences, a correlation value which shows a smaller value as the degree of similarity becomes larger may be used. In the present embodiment, a self-correlation value which shows a smaller value as the degree of similarity becomes larger is used as a specific example of the correlation value. In addition, in the present embodiment, a weighting process to be described later is applied on the result of the correlation calculation (self-correlation value) obtained at each position in the search area SA.
- When the self-correlation value after the weighting process is calculated at each of a plurality of positions over the entire area of the search area SA in this manner, a position having the largest degree of similarity (that is, the minimum weighted self-correlation value) is identified from the plurality of positions and is set as the movement point, which is the point to which the set point has moved.
FIG. 4 is a diagram showing a specific example of a weighting coefficient used for the weighting process. A horizontal axis of FIG. 4 represents, for each position for which the self-correlation value is calculated, a distance from a reference position. The reference position is, for example, the position of the movement point searched in the previous time phase (for example, reference numeral P′ in FIG. 3 ) or the position of the set point which is set in the image data of the reference time phase (for example, reference numeral P in FIG. 2 ). A vertical axis of FIG. 4 represents the weighting coefficient. As shown in FIG. 4 , the weighting coefficient at a distance d is k(d). - In the present embodiment, as a specific example of the correlation value, a self-correlation value which shows a smaller value as the degree of similarity becomes larger is used. Thus, in the present embodiment, the weighting coefficient is set smaller for an area closer to the reference position, which has a higher possibility of having a high degree of similarity, and the weighting coefficient is increased as the position becomes farther away from the reference position. The self-correlation value at each position is multiplied by the weighting coefficient corresponding to the position, and the movement point, which is the point to which the set point has moved, is searched based on the multiplication result; that is, the self-correlation value to which the weighting process is applied.
- With this process, the search can place more importance on the area near the reference position, which is expected to have a relatively high degree of correlation, while a relatively wide search area still reduces the risk of overlooking the true match. Thus, significant deviation of the searched movement point from the actual moved point of the set point can be inhibited, and, as a result, the precision of the search based on the correlation calculation can be improved.
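The weighted search described above can be sketched as follows, using a sum of absolute differences as the smaller-is-more-similar correlation value. The linear weight `1 + slope*d`, the argument layout, and the function name are illustrative assumptions, not the patent's exact method:

```python
import numpy as np

def search_movement_point(ref_img, cur_img, set_pt, half, search_rect, ref_pt, slope=0.01):
    """Slide the reference-phase template over the search area SA of the
    search-phase image; at each candidate position compute a sum of absolute
    differences (smaller = more similar), multiply it by a weight k(d) that
    grows with the distance d from the reference position, and return the
    position whose weighted value is the minimum."""
    sy, sx = set_pt                      # (row, col) of the set point P
    template = ref_img[sy - half:sy + half, sx - half:sx + half].astype(float)
    top, left, bottom, right = search_rect
    best_val, best_pt = np.inf, None
    for y in range(top + half, bottom - half):
        for x in range(left + half, right - half):
            patch = cur_img[y - half:y + half, x - half:x + half].astype(float)
            sad = np.abs(patch - template).sum()
            d = np.hypot(y - ref_pt[0], x - ref_pt[1])
            weighted = sad * (1.0 + slope * d)   # assumed linear k(d)
            if weighted < best_val:
                best_val, best_pt = weighted, (y, x)
    return best_pt
```

On a synthetic image pair related by a pure shift, the weighted minimum recovers the shift exactly; on real echo data the weight mainly biases near-ties toward positions close to the reference position.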
- The specific example of the weighting coefficient shown in
FIG. 4 is merely exemplary, and, alternatively, for example, there may be used a weighting coefficient which changes non-linearly with distance or a weighting coefficient which changes stepwise with distance. When a correlation value which shows a larger value as the degree of similarity becomes larger is used, a weighting coefficient which becomes larger as the position becomes closer to the reference position and smaller as the position becomes farther away from the reference position is preferable. -
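The variants mentioned here, a linearly growing weight, a stepwise weight, and the inverted shape for a larger-is-more-similar correlation value, can be written as simple functions of the distance d; all constants are arbitrary examples, not values from the patent:

```python
def k_linear(d, slope=0.01):
    # Grows with distance d: pairs with a smaller-is-more-similar value.
    return 1.0 + slope * d

def k_stepwise(d, step=10.0, inc=0.2):
    # Changes in steps with distance.
    return 1.0 + inc * (d // step)

def k_inverted(d, slope=0.01):
    # Shrinks with distance: pairs with a larger-is-more-similar value
    # (e.g. cross-correlation), so that positions near the reference
    # position receive the larger weight.
    return 1.0 / (1.0 + slope * d)
```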
FIG. 5 is a flowchart showing a process by the pattern matching processor 30 (FIG. 1 ). When the user sets a set point P in the image data of the reference time phase (S501; refer toFIG. 2 ), a template T corresponding to the set point P is set in the image data of the reference time phase (S502; refer toFIG. 2 ). - Next, a search area SA is set in the image data of each search time phase (S503; refer to
FIGS. 2 and 3 ), a template T is set in the search area SA, and a self-correlation value a is calculated in the position of the template T based on a plurality of pixels in the template T of the image data of the reference time phase and a plurality of pixels in an area overlapping the template T in the image data of each search time phase (S504). - Then, a distance d from the reference position to each position is calculated, a weighting coefficient k(d) for that position is determined (S505; refer to
FIG. 4 ), the self-correlation value a is multiplied by the weighting coefficient k(d), and a self-correlation value a′ after the weighting process is calculated (S506). - It is then checked whether or not the search over the entire area in the search area SA is completed (S507), and, if the search is not completed, the process returns to S504, the template T is moved to a next position, and the processes from S504 to S506 are executed at the new position.
- When the processes from S504 to S506 are repeatedly executed and it is confirmed in S507 that the search over the entire area in the search area SA is completed, a position, among all positions in the search area SA, where the self-correlation value a′ after the weighting process is the minimum is identified, and the identified position is set as the movement point of the set point (S508). In this manner, the movement point for the image data of each search time phase is identified.
- It is then checked whether or not a search for all search time phases is completed (S509). If the search is not completed, the process returns to S503 and processes related to the image data of the next search time phase are executed. When the processes from S503 to S508 are repeatedly executed and it is confirmed in S509 that the search for all search time phases is completed, the process at the
pattern matching processor 30 is completed. - Referring again to
FIG. 1 , when the search for image data of all search time phases is completed at the pattern matching processor 30, a diagnostic information generator 40 generates diagnostic information related to a target tissue based on the result of the search. The diagnostic information generator 40 has a function to set a fixed point which becomes a reference in the diagnosis, and a function to generate, as diagnostic information, a displacement waveform related to a movement point, with the fixed point as a reference. A display image including the displacement waveform formed in the diagnostic information generator 40 is displayed on a display unit 50.
FIG. 6 is a diagram for explaining setting of a fixed point which forms a reference in the diagnosis. FIG. 6 shows movement points P′ of a plurality of time phases searched in the image data over a plurality of search time phases for a certain set point P. - The diagnostic information generator 40 (FIG. 1 ) determines a primary direction of movement of the set point P over the plurality of time phases based on the movement points P′ of the plurality of time phases for the set point P. For example, as shown in FIG. 6 , a rectangle R which surrounds all of the plurality of movement points P′ and whose four sides circumscribe the movement points P′ is set, and a major axis passing through the center of the rectangle R is set as the primary direction D. The determination of the primary direction D is not limited to the method of using the rectangle R. For example, a known method commonly referred to as principal component analysis may be used, and a direction which most clearly represents the spatial variation of the movement points P′; that is, a direction in which the variance of the movement points P′ is the maximum, may be set as the primary direction D. - The diagnostic information generator 40 then sets a fixed point F on the determined primary direction D. Alternatively, the user may set a provisional fixed point F in advance, and the provisional fixed point F may be moved to the fixed point F determined by the diagnostic information generator 40. When the fixed point F is set, the diagnostic information generator 40 calculates a distance from the set fixed point F to each movement point P′, and generates a displacement waveform representing a change of the distance over the plurality of time phases. -
FIG. 7 is a diagram showing an example display image including the displacement waveform. FIG. 7 shows an example display image obtained when a heart is diagnosed as the target tissue. The display image of FIG. 7 includes a tomographic image related to the heart, and 4 set points A, B, C, and D which are set for the diagnosis of the heart are shown in the tomographic image. - For each set point, the movement point is searched (tracked) over a plurality of time phases, to set the fixed point F (refer to FIG. 6 ). In addition, for each set point, a distance from the fixed point F to each movement point P′ (refer to FIG. 6 ) is calculated, and a displacement waveform showing a change of the distance over a plurality of time phases is generated. FIG. 7 shows an example display image including displacement waveforms L1, L2, L3, and L4 obtained from the set points A, B, C, and D, respectively. - In the present embodiment, because the fixed point F is set on the primary direction of movement over a plurality of time phases, in each displacement waveform obtained with reference to the fixed point F, an influence of the movement of each set point is relatively strongly reflected and the movement of each set point is relatively sensitively represented. For the plurality of set points A, B, C, and D, the fixed point F is desirably set according to a unified reference. For example, all fixed points F for all set points are set outside of the cardiac muscle. Alternatively, all fixed points for all set points may be set inside the cardiac muscle. With this configuration, for the heart, which repeats a contraction and expansion motion as a whole, a transmission status of motion can be observed based on a time difference of the start of motion at set points A, B, C, and D, and the cardiac muscle can be diagnosed.
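Under the principal component analysis reading, the primary direction D, a fixed point F on it, and the resulting displacement waveform can be sketched as follows. The fixed-point offset, function name, and eigen-decomposition route are illustrative assumptions:

```python
import numpy as np

def displacement_waveform(movement_pts, offset=30.0):
    """movement_pts: (N, 2) positions of one set point tracked over N time
    phases. Determines the primary direction D as the direction of maximum
    variance of the points, places a fixed point F on that line, and returns
    F together with the distance from F to each movement point (the
    displacement waveform, plotted against the time phases)."""
    pts = np.asarray(movement_pts, dtype=float)
    mean = pts.mean(axis=0)
    # Eigenvector of the covariance matrix with the largest eigenvalue
    # = direction in which the spatial variation of the points is maximal.
    vals, vecs = np.linalg.eigh(np.cov((pts - mean).T))
    direction = vecs[:, np.argmax(vals)]
    # Fixed point: on the primary direction, offset away from the points
    # (e.g. outside the cardiac muscle); the offset value is arbitrary here.
    fixed = mean + offset * direction
    dists = np.linalg.norm(pts - fixed, axis=1)
    return fixed, dists
```

Running this once per set point A, B, C, and D would yield the four waveforms L1 to L4 of the example display, each measured from its own fixed point on its own primary direction.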
- The example display image shown in
FIG. 7 also includes, as an auxiliary display when the transmission status of motion is to be observed, a connection curve 104 connecting maximum value points of the displacement waveforms L1, L2, L3, and L4, graphs M1, M2 and M3 representing the time differences of motion between set points, and a cursor 106 which is used when the user designates a particular time phase. -
FIG. 8 is a flowchart showing a process in the diagnostic information generator 40 (FIG. 1 ). For a certain set point, when information of movement points related to all search time phases is obtained from the pattern matching processor 30 (FIG. 1 ) (S801), the primary direction D of movement over the plurality of time phases for the set point is determined based on the plurality of movement points (S802; refer to FIG. 6 ), and the fixed point F is set on the primary direction D (S803; refer to FIG. 6 ). In the setting of the fixed point F, the user may designate, on the primary direction D, an outer side or an inner side of the cardiac muscle, and the fixed point F may be set according to the designation. - When the fixed point F is set, a distance between the fixed point F and each movement point P′ is calculated (S804; refer to FIG. 6 ), and a displacement waveform representing a change of the distance over a plurality of time phases is generated (S805). It is then checked whether or not the processes for all set points are completed (S806), and, if the processes are not completed, the process returns to S801, and the process for the next set point is executed. When the processes from S801 to S805 are repeatedly executed and it is confirmed in S806 that the processes for all set points are completed, the process at the diagnostic information generator 40 is completed, and the display image, for example, shown in FIG. 7 is displayed on the display unit 50 (FIG. 1 ). - An ultrasound diagnostic apparatus according to a preferred embodiment of the present invention has been described. Alternatively, for example, with programs corresponding to the processes described above and shown in
FIG. 5 and in FIG. 8 , the functions of the pattern matching processor 30 and the diagnostic information generator 40 shown in FIG. 1 may be realized on a computer, and the computer may function as an ultrasound image processing apparatus. Moreover, the above-described preferred embodiment is merely exemplary in every aspect, and does not limit the scope of the present invention. The present invention includes various modified configurations within the scope of the essentials thereof. -
- 10 PROBE; 20 IMAGE PROCESSOR; 30 PATTERN MATCHING PROCESSOR; 40 DIAGNOSTIC INFORMATION GENERATOR
Claims (13)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-163990 | 2011-07-27 | ||
JP2011163990A JP5409719B2 (en) | 2011-07-27 | 2011-07-27 | Ultrasonic image processing device |
PCT/JP2012/067897 WO2013015135A1 (en) | 2011-07-27 | 2012-07-13 | Ultrasonic image processor |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140105478A1 true US20140105478A1 (en) | 2014-04-17 |
US9349190B2 US9349190B2 (en) | 2016-05-24 |
Family
ID=47600986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/125,131 Expired - Fee Related US9349190B2 (en) | 2011-07-27 | 2012-07-13 | Ultrasound image processing apparatus |
Country Status (4)
Country | Link |
---|---|
US (1) | US9349190B2 (en) |
JP (1) | JP5409719B2 (en) |
CN (1) | CN103717145B (en) |
WO (1) | WO2013015135A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9349190B2 (en) * | 2011-07-27 | 2016-05-24 | Hitachi Aloka Medical, Ltd. | Ultrasound image processing apparatus |
US11308622B2 (en) * | 2017-05-31 | 2022-04-19 | Canon Kabushiki Kaisha | Information processing apparatus and method for controlling the same to generate a difference image from first and second inspection images |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110772280B (en) * | 2018-07-31 | 2023-05-23 | 佳能医疗系统株式会社 | Ultrasonic diagnostic apparatus and method, and image processing apparatus and method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030210812A1 (en) * | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
US6980679B2 (en) * | 1998-10-23 | 2005-12-27 | Varian Medical System Technologies, Inc. | Method and system for monitoring breathing activity of a subject |
US20070071295A1 (en) * | 2005-09-27 | 2007-03-29 | Siemens Medical Solutions Usa, Inc. | Orientation-based assessment of cardiac synchrony in medical imaging |
US7251352B2 (en) * | 2001-08-16 | 2007-07-31 | Siemens Corporate Research, Inc. | Marking 3D locations from ultrasound images |
US7301530B2 (en) * | 2001-09-11 | 2007-11-27 | Samsung Electronics Co., Ltd. | Pointer control method, pointing apparatus, and host apparatus therefor |
US20080287803A1 (en) * | 2007-05-16 | 2008-11-20 | General Electric Company | Intracardiac echocardiography image reconstruction in combination with position tracking system |
US20090270732A1 (en) * | 2008-04-25 | 2009-10-29 | Toshiba Medical Systems Corporation | Ultrasound imaging apparatus and method for processing ultrasound image |
US20090303252A1 (en) * | 2008-06-04 | 2009-12-10 | Dong Gyu Hyun | Registration Of CT Image Onto Ultrasound Images |
US20110257525A1 (en) * | 2005-10-11 | 2011-10-20 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Sensor guided catheter navigation system |
US20130346050A1 (en) * | 2012-06-21 | 2013-12-26 | Samsung Electronics Co., Ltd. | Method and apparatus for determining focus of high-intensity focused ultrasound |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3734443B2 (en) * | 2001-12-26 | 2006-01-11 | 株式会社東芝 | Ultrasonic diagnostic equipment |
CN100563582C (en) * | 2004-08-25 | 2009-12-02 | 株式会社日立医药 | Diagnostic ultrasound equipment |
WO2006054635A1 (en) | 2004-11-17 | 2006-05-26 | Hitachi Medical Corporation | Ultrasonograph and ultrasonic image display method |
JP2007130063A (en) | 2005-11-08 | 2007-05-31 | Aloka Co Ltd | Ultrasonographic apparatus |
JP4758736B2 (en) | 2005-11-24 | 2011-08-31 | 日立アロカメディカル株式会社 | Ultrasonic diagnostic equipment |
JP2009000444A (en) * | 2007-06-25 | 2009-01-08 | Panasonic Corp | Ultrasonic diagnostic equipment |
JP4971114B2 (en) * | 2007-11-28 | 2012-07-11 | 日本システムウエア株式会社 | Object recognition apparatus, object recognition method, object recognition program, and computer-readable medium storing the program |
JP5156421B2 (en) * | 2008-02-07 | 2013-03-06 | 株式会社日立メディコ | Ultrasonic diagnostic equipment |
JP5409719B2 (en) * | 2011-07-27 | 2014-02-05 | 日立アロカメディカル株式会社 | Ultrasonic image processing device |
-
2011
- 2011-07-27 JP JP2011163990A patent/JP5409719B2/en not_active Expired - Fee Related
-
2012
- 2012-07-13 US US14/125,131 patent/US9349190B2/en not_active Expired - Fee Related
- 2012-07-13 CN CN201280037593.8A patent/CN103717145B/en not_active Expired - Fee Related
- 2012-07-13 WO PCT/JP2012/067897 patent/WO2013015135A1/en active Application Filing
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6980679B2 (en) * | 1998-10-23 | 2005-12-27 | Varian Medical System Technologies, Inc. | Method and system for monitoring breathing activity of a subject |
US7251352B2 (en) * | 2001-08-16 | 2007-07-31 | Siemens Corporate Research, Inc. | Marking 3D locations from ultrasound images |
US7301530B2 (en) * | 2001-09-11 | 2007-11-27 | Samsung Electronics Co., Ltd. | Pointer control method, pointing apparatus, and host apparatus therefor |
US20030210812A1 (en) * | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
US20070071295A1 (en) * | 2005-09-27 | 2007-03-29 | Siemens Medical Solutions Usa, Inc. | Orientation-based assessment of cardiac synchrony in medical imaging |
US20110257525A1 (en) * | 2005-10-11 | 2011-10-20 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Sensor guided catheter navigation system |
US20080287803A1 (en) * | 2007-05-16 | 2008-11-20 | General Electric Company | Intracardiac echocardiography image reconstruction in combination with position tracking system |
US20090270732A1 (en) * | 2008-04-25 | 2009-10-29 | Toshiba Medical Systems Corporation | Ultrasound imaging apparatus and method for processing ultrasound image |
US20090303252A1 (en) * | 2008-06-04 | 2009-12-10 | Dong Gyu Hyun | Registration Of CT Image Onto Ultrasound Images |
US20130346050A1 (en) * | 2012-06-21 | 2013-12-26 | Samsung Electronics Co., Ltd. | Method and apparatus for determining focus of high-intensity focused ultrasound |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9349190B2 (en) * | 2011-07-27 | 2016-05-24 | Hitachi Aloka Medical, Ltd. | Ultrasound image processing apparatus |
US11308622B2 (en) * | 2017-05-31 | 2022-04-19 | Canon Kabushiki Kaisha | Information processing apparatus and method for controlling the same to generate a difference image from first and second inspection images |
Also Published As
Publication number | Publication date |
---|---|
JP5409719B2 (en) | 2014-02-05 |
CN103717145A (en) | 2014-04-09 |
US9349190B2 (en) | 2016-05-24 |
JP2013027453A (en) | 2013-02-07 |
CN103717145B (en) | 2016-06-08 |
WO2013015135A1 (en) | 2013-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6935020B2 (en) | | Systems and methods for identifying features of ultrasound images |
US7925068B2 (en) | | Method and apparatus for forming a guide image for an ultrasound image scanner |
JP4745133B2 (en) | | Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program |
JP5386001B2 (en) | | Ultrasonic diagnostic equipment |
JP2017079977A (en) | | Ultrasound diagnostic apparatus and ultrasound signal processing method |
CN108882916A (en) | | Blood flow parameter display method for ultrasonic blood flow and ultrasonic imaging system therefor |
US9349190B2 (en) | | Ultrasound image processing apparatus |
JP5102475B2 (en) | | Ultrasonic diagnostic equipment |
JP4598652B2 (en) | | Ultrasonic diagnostic equipment |
CN106955125A (en) | | Motion independence in acoustic radiation force pulse imaging |
JP5999935B2 (en) | | Ultrasonic diagnostic equipment |
JP4761999B2 (en) | | Ultrasonic diagnostic apparatus, image processing method thereof, and image processing program thereof |
JP2008289548A (en) | | Ultrasonograph and diagnostic parameter measuring device |
JP2012115387A (en) | | Ultrasonic image processor |
CN110811674B (en) | | Ultrasonic diagnostic apparatus and storage medium |
JP2016055040A (en) | | Ultrasonic diagnostic device |
JP5746926B2 (en) | | Ultrasonic image processing device |
JP2010246630A (en) | | Ultrasound diagnostic apparatus |
JP5551627B2 (en) | | Ultrasonic image processing device |
JP6591199B2 (en) | | Ultrasonic diagnostic apparatus and program |
JP5587751B2 (en) | | Ultrasonic image processing device |
JP2013042929A (en) | | Ultrasonic image processing apparatus |
JP6389084B2 (en) | | Ultrasonic diagnostic apparatus and control program therefor |
JP2013017716A (en) | | Ultrasonic diagnostic apparatus |
JP2010088632A (en) | | Ultrasonic diagnostic apparatus |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: HITACHI ALOKA MEDICAL, LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: KASAHARA, EIJI; REEL/FRAME: 031748/0776; Effective date: 20131111 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
AS | Assignment | Owner name: HITACHI, LTD., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HITACHI ALOKA MEDICAL, LTD.; REEL/FRAME: 041891/0325; Effective date: 20160401 |
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20200524 |