WO2012164919A1 - Ultrasonic image generation apparatus and ultrasonic image generation method - Google Patents
Ultrasonic image generation apparatus and ultrasonic image generation method
- Publication number
- WO2012164919A1 (PCT/JP2012/003523)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- movement
- ultrasonic
- movement vector
- image
- vector
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5269—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
- A61B8/5276—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0891—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4263—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors not mounted on the probe, e.g. mounted on an external reference frame
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8934—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration
- G01S15/8936—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a dynamic transducer configuration using transducers mounted for mechanical movement in three dimensions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
Definitions
- the present invention relates to an ultrasonic image generating apparatus and an ultrasonic image generating method.
- the present invention relates to an ultrasonic image generation apparatus and an ultrasonic image generation method for generating an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe.
- X-ray diagnostic apparatuses, MR (magnetic resonance) diagnostic apparatuses, and ultrasonic diagnostic apparatuses are widely used as biological image diagnostic apparatuses.
- An ultrasonic diagnostic apparatus has advantages such as noninvasiveness and real-time characteristics, and is widely used for diagnosis including medical examination.
- Diagnostic sites for the ultrasonic diagnostic apparatus include a wide variety of organs, such as the heart, blood vessels including the carotid artery, the liver, and the breast. Among them, the carotid artery is one of the most important diagnostic sites because of the recent increase in the number of people suffering from arteriosclerosis.
- In carotid artery diagnosis, the intima-media thickness (IMT), that is, the thickness of the intima-media complex, and the presence of plaque are important indices.
- To construct a three-dimensional image, the position information (position and orientation) of the ultrasonic probe at the time each ultrasonic image is acquired is obtained, and each ultrasonic image is mapped into a three-dimensional space based on that position information.
- As a method for estimating the position information, there is a method in which a marker attached to the ultrasonic probe is photographed with a camera, and the position of the ultrasonic probe is estimated based on changes in the position and shape of the marker in the photographed image (for example, Patent Reference 1).
- In view of such problems, an object of the present invention is to provide an ultrasonic image generation apparatus that suppresses the direction dependency of the position acquisition accuracy of the ultrasonic probe.
- An ultrasonic image generating apparatus according to an aspect of the present invention generates an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe.
- The apparatus includes a first movement estimation unit that estimates, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe while the plurality of images corresponding to the plurality of ultrasonic signals are acquired,
- and a second movement estimation unit that estimates the movement vector as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method.
- The apparatus further includes a position reconstruction unit that calculates a combined movement vector by combining the first movement vector and the second movement vector with weighting based on the direction of the first or second movement vector, and constructs an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- With this ultrasonic image generation device, highly accurate position information can be acquired for any direction of movement of the ultrasonic probe, and a three-dimensional image can be constructed with higher accuracy than with a conventional ultrasonic image generation device.
- FIG. 1A is a configuration diagram of an example of the ultrasonic image generation apparatus according to the first embodiment.
- FIG. 1B is a configuration diagram of another example of the ultrasonic image generating apparatus according to the first embodiment.
- FIG. 2 is an explanatory diagram of a movement vector acquisition method by image processing.
- FIG. 3 is a flowchart showing the operation of the ultrasonic image generating apparatus according to the first embodiment.
- FIG. 4 is a diagram illustrating the configuration of the position reconstruction unit.
- FIG. 5 is an explanatory diagram of an angle formed by the reference direction and the movement vector.
- FIG. 6 is a flowchart showing the operation of the position reconstruction unit.
- FIG. 7 is a diagram illustrating a method of decomposing a movement vector into a reference direction component and an orthogonal component.
- FIG. 8 is a diagram illustrating a method of decomposing a movement vector into absolute coordinate space components.
- FIG. 9 is an explanatory diagram of a position information acquisition method using a plurality of cameras.
- FIG. 10 is a diagram illustrating an effect of the ultrasonic image generating apparatus according to the first embodiment.
- FIG. 11 is a diagram illustrating an effect of the ultrasonic image generation apparatus according to the second embodiment.
- FIG. 12 is a flowchart illustrating the operation of the ultrasonic image generating apparatus according to the second embodiment.
- FIG. 13 is an explanatory diagram of camera arrangement and auxiliary information for camera arrangement.
- FIG. 14A is a flowchart illustrating an operation of a modification of the ultrasonic image generating apparatus according to the second embodiment.
- FIG. 14B is a configuration diagram of an example of an ultrasound image generation apparatus including a feedback unit.
- FIG. 15 is a configuration diagram of an example of a conventional ultrasonic image generation apparatus.
- FIG. 16 is a flowchart showing the operation of the conventional ultrasonic image generating apparatus.
- FIG. 17 is an explanatory diagram of a position acquisition method of a conventional ultrasonic image generation apparatus.
- FIG. 18 is an explanatory diagram when the ultrasonic image generation method is implemented by a computer system using a program recorded on a recording medium such as a flexible disk.
- a conventional ultrasonic diagnostic apparatus that constructs a three-dimensional image based on the position information of the ultrasonic probe acquired at the time of scanning will be described with reference to FIGS. 15 and 16.
- FIG. 15 is a configuration diagram of an example of a conventional ultrasonic diagnostic apparatus 1501.
- the ultrasonic diagnostic apparatus 1501 includes an ultrasonic probe 101, a transmission unit 102, a reception unit 103, a transmission / reception control unit 104, an ultrasonic image generation unit 105, an image memory 1506, A position acquisition unit 1507, a position reconstruction unit 1508, and a display unit 1509 are provided.
- An element such as a piezoelectric element arranged in the ultrasonic probe 101 generates an ultrasonic signal based on a drive signal output from the transmission unit 102.
- the ultrasonic signal is reflected by the in-vivo structure of the subject, such as a blood vessel wall or muscle, and a part of the reflected component is received by the ultrasonic probe 101.
- the receiving unit 103 sequentially performs amplification, A / D (analog / digital) conversion, delay addition processing of the signal of each element, and the like on the received reflected signal to generate a received RF (Radio Frequency) signal.
- the transmission / reception control unit 104 controls operations of the transmission unit 102 and the reception unit 103.
- For the transmission unit 102, switching of the driving voltage, setting of the transmission frequency, and the like are performed in order to carry out a predetermined scan.
- For the reception unit 103, delay times for reception beamforming are set.
- the ultrasonic image generation unit 105 converts the received RF signal into an ultrasonic image and stores it in the image memory 1506.
- Examples of the generated ultrasonic image include a B-mode image, in which signal intensity is represented by luminance, and a Doppler image, which indicates blood flow or tissue motion speed calculated from the Doppler effect on the received RF signal.
- the position acquisition unit 1507 acquires the position information LocInf0 of the ultrasonic probe and outputs it to the position reconstruction unit 1508.
- the position acquisition unit 1507 is realized by a camera, for example.
- The position reconstruction unit 1508 maps the ultrasonic images stored in the image memory 1506 into the three-dimensional space based on the position information LocInf0 received from the position acquisition unit 1507, and constructs a three-dimensional image ProcImg0.
- Display unit 1509 displays the three-dimensional image ProcImg0 on a display device such as a monitor.
- FIG. 16 is a flowchart showing the operation of the conventional ultrasonic diagnostic apparatus 1501.
- The conventional ultrasonic diagnostic apparatus 1501 acquires an ultrasonic image in step S1601.
- In step S1602, position information corresponding to the ultrasonic image acquired in step S1601 is acquired from the position acquisition unit 1507.
- In step S1603, it is determined whether or not the acquisition of ultrasound images has ended.
- Steps S1601 and S1602 are repeated until it is determined in step S1603 that the acquisition of ultrasound images has ended.
- In step S1604, the acquired ultrasonic images are mapped into the three-dimensional space based on the position information.
- In step S1605, a three-dimensional image of the ultrasonic images is displayed.
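The loop of steps S1601 to S1605 can be sketched as follows; the function names here are illustrative stand-ins supplied by the reader, not part of the patent.

```python
# Hypothetical sketch of the conventional acquisition loop (steps S1601-S1605).
# All callables are injected stand-ins; the patent does not define this API.
def acquire_and_reconstruct(acquire_image, acquire_position, done, map_to_3d, display):
    images, positions = [], []
    while not done():                         # S1603: loop until acquisition ends
        images.append(acquire_image())        # S1601: acquire an ultrasonic image
        positions.append(acquire_position())  # S1602: acquire matching position info
    volume = map_to_3d(images, positions)     # S1604: map images into 3-D space
    display(volume)                           # S1605: display the 3-D image
```

This simply pairs each acquired image with one position reading before the batch reconstruction step, which is the structure FIG. 16 describes.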
- position information is determined based on information from a single position acquisition unit.
- Because the position acquisition accuracy of the position acquisition means is direction-dependent, there is a problem that the accuracy of the position information is reduced in directions where the acquisition accuracy is low, and the construction accuracy of the three-dimensional image deteriorates accordingly.
- Position information acquisition methods include a method using a position sensor such as a camera and a method using image processing on ultrasonic images.
- Below, the direction dependency of the position resolution is described, taking position estimation by a camera and by image processing as examples.
- FIG. 17 is an explanatory diagram of a position information acquisition method of a conventional ultrasonic image generation apparatus.
- (A) of FIG. 17 is an explanatory diagram of the position information acquisition method using a camera.
- the camera is arranged in the x-axis direction, tracks an optical marker (marker) attached to the ultrasonic probe, and determines marker position information based on the marker shape or posture change amount.
- The amount of change in the shape or posture of the marker for the same moving distance differs depending on the moving direction, and is smaller for movement in the x-axis direction than in the y-axis or z-axis direction. A smaller amount of change means lower resolution of the position information. Therefore, in position information acquisition by a camera, the resolution in the x-axis direction, that is, the depth direction of the camera, is reduced.
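The depth-resolution loss described above can be illustrated with a simple pinhole-projection model. This is an illustrative sketch, not the patent's camera model; the focal length and distances are arbitrary assumptions.

```python
# Illustrative pinhole-projection sketch: a marker displacement along the camera's
# optical axis (depth, x) changes the image far less than the same displacement
# parallel to the image plane (y or z), so depth estimates are coarser.
def project(point, focal=500.0):
    """Project a 3-D point (x = distance along the optical axis, y, z) to pixels."""
    x, y, z = point
    return (focal * y / x, focal * z / x)

def pixel_shift(p0, p1):
    u0, v0 = project(p0)
    u1, v1 = project(p1)
    return ((u1 - u0) ** 2 + (v1 - v0) ** 2) ** 0.5

marker = (1000.0, 50.0, 50.0)                        # marker 1000 mm from the camera
lateral = pixel_shift(marker, (1000.0, 60.0, 50.0))  # 10 mm move in y: 5.0 px
depth = pixel_shift(marker, (1010.0, 50.0, 50.0))    # 10 mm move in x: ~0.35 px
```

The same 10 mm motion produces roughly a fourteen-fold smaller image change in the depth direction, which is the direction dependency the text describes.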
- (B) of FIG. 17 is an explanatory diagram of the position information acquisition method using image processing.
- the relative positional relationship between images is determined by determining the amount of positional deviation between continuously scanned ultrasonic images based on the correlation between images.
- For example, the movement vector between the (N−1)th and Nth images can be determined by examining the amount of positional deviation between the two consecutive B-mode images, that is, the (N−1)th and Nth images.
- In this case, the resolution in the traveling direction of the probe (the y-axis direction) is lower than the resolution for positional deviation within the image plane (the plane formed by the x-axis and the z-axis).
- Since position information acquired by a single method has direction-dependent resolution, there is always some direction in which the resolution of the position information decreases when imaging while moving the probe in an arbitrary direction.
- Because the ultrasound images are mapped into the three-dimensional space based on the position information, the construction accuracy of the three-dimensional image is reduced wherever the resolution of the position information decreases.
- In view of such problems, an object of the present invention is to provide an ultrasonic image generation apparatus that suppresses the direction dependency of the position acquisition accuracy of the ultrasonic probe.
- An ultrasonic image generation apparatus according to an aspect of the present invention generates an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe.
- The apparatus includes a first movement estimation unit that estimates, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe while the plurality of images corresponding to the plurality of ultrasonic signals are acquired,
- and a second movement estimation unit that estimates the movement vector as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method.
- The apparatus further includes a position reconstruction unit that calculates a combined movement vector by combining, with weighting based on the direction of the first or second movement vector, the first movement vector estimated by the first movement estimation unit and the second movement vector estimated by the second movement estimation unit, and that constructs an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- According to this configuration, a plurality of movement vectors are estimated using the first and second estimation methods, whose estimation accuracies have different direction dependencies, and are combined, so that the movement vector of the ultrasonic probe can be calculated accurately even in directions where the estimation accuracy of the first estimation method alone is relatively low.
- Thus, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- Specifically, the first movement estimation unit may estimate the movement vector of the ultrasonic probe from when one of two images among the plurality of images is acquired until the other is acquired, by calculating the movement vector based on the image shift between the two images.
- According to this configuration, the displacement between the two images can be analyzed and the amount of movement of the ultrasonic probe can be estimated. In particular, movement of the ultrasonic probe within the plane containing the cross section of the subject through which the ultrasonic signal corresponding to the image was transmitted can be detected with relatively high accuracy.
- Further, the first movement estimation unit may detect the image shift amount between the two images based on the correlation between the pixel values constituting the two images, and thereby estimate the movement vector of the ultrasonic probe from when one of the two images is acquired until the other is acquired.
- According to this configuration, the movement amount of a corresponding pixel, or of a region that is a collection of pixels, is calculated, and the movement amount of the ultrasonic probe can be estimated from it. Movement of the ultrasonic probe within the plane containing the cross section of the subject corresponding to the image can thus be detected with relatively high accuracy.
- Further, the position reconstruction unit may construct the ultrasonic diagnostic image by calculating the combined movement vector while increasing the weight of the first movement vector or the second movement vector as the angle increases between the moving direction of the ultrasonic probe and the reference direction used in the estimation by the first movement estimation unit or the second movement estimation unit.
- According to this configuration, the weight of the component in the direction with relatively high estimation accuracy is set large when weighting the estimated movement vectors to generate the combined movement vector, so the accuracy of the combined movement vector can be increased. Therefore, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- Further, the position reconstruction unit may construct the ultrasonic diagnostic image by calculating the combined movement vector while increasing the weight of the movement vector estimated by the first movement estimation unit as the angle increases between the moving direction of the ultrasonic probe and the low-estimation-accuracy direction that serves as the reference direction of the first movement estimation unit.
- According to this configuration, the combined movement vector is generated after setting a large weight on the direction component within the plane containing the cross section of the subject corresponding to the image, so the accuracy of the combined movement vector can be increased.
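The weighting rule above can be made concrete with a small sketch. This is an assumed formulation, not the patent's exact formula: each estimator's vector is weighted by the sine of the angle between the estimated motion and that estimator's low-accuracy reference direction, so an estimator contributes little when the probe moves along its weak direction.

```python
import math

# Hedged sketch of direction-weighted synthesis of two movement vectors.
# v1, v2: movement vectors (3-tuples) from the two estimators.
# ref1, ref2: unit vectors along each estimator's LOW-accuracy (reference) direction.
def combine(v1, v2, ref1, ref2):
    def angle_weight(v, ref):
        norm = math.sqrt(sum(c * c for c in v))
        if norm == 0.0:
            return 0.5  # no information about direction: neutral weight
        cos = abs(sum(a * b for a, b in zip(v, ref))) / norm
        return math.sqrt(max(0.0, 1.0 - cos * cos))  # sin of angle to ref direction
    w1 = angle_weight(v1, ref1)
    w2 = angle_weight(v2, ref2)
    total = (w1 + w2) or 1.0  # avoid division by zero if both weights vanish
    w1, w2 = w1 / total, w2 / total
    return tuple(w1 * a + w2 * b for a, b in zip(v1, v2))
```

For example, with image processing weak in the probe travel direction (ref1 along y) and the camera weak in depth (ref2 along x), motion along x takes the image-processing estimate almost entirely, while motion along z averages the two.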
- At least one of the first movement estimation unit and the second movement estimation unit may image an optical marker attached to the ultrasonic probe with a camera, and calculate the position and angle of the ultrasonic probe based on the imaged position or shape of the optical marker.
- According to this configuration, the movement vector of the ultrasonic probe can be estimated using the optical marker and the camera. Movement of the ultrasonic probe in a plane parallel to the imaging surface of the camera can thereby be detected with relatively high accuracy.
- Further, the position reconstruction unit may construct the image by combining the movement vectors while increasing the weight of the movement vector estimated by the optical movement estimation unit as the angle increases between the moving direction of the ultrasonic probe and the normal direction of the camera's imaging surface, which is the reference direction of the optical movement estimation unit.
- According to this configuration, the combined movement vector is generated after setting a large weight on the direction components within the plane parallel to the imaging surface of the camera, so the accuracy of the combined movement vector can be increased.
- Further, the first movement estimation unit may estimate the movement vector of the ultrasonic probe between the acquisition of one of two images and the acquisition of the other by calculating the movement vector based on the image shift between the two images, while the second movement estimation unit images the optical marker attached to the ultrasonic probe with a camera and estimates the movement vector of the ultrasonic probe by calculating the position and angle of the probe based on the imaged position or shape of the optical marker.
- According to this configuration, position estimation of the ultrasonic probe by image processing and position estimation by the optical marker and camera are used together, and the movement vectors estimated by the two methods are combined to generate a combined movement vector.
- The ultrasonic image generation device may further include an arrangement assist unit that presents information prompting the operator to change the reference directions so that the reference directions of the first movement estimation unit and the second movement estimation unit become substantially orthogonal to each other.
- According to this configuration, the operator can set the reference directions of the estimation methods so as to suppress the direction dependency of the estimation accuracy of the combined movement vector. Therefore, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- An ultrasonic image generation method according to an aspect of the present invention generates an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe.
- The method includes a first movement estimation step of estimating, by a first estimation method, a first movement vector indicating the movement of the ultrasonic probe while the plurality of images corresponding to the plurality of ultrasonic signals are acquired, and a second movement estimation step of estimating a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method.
- The method further includes a position reconstruction step of calculating a combined movement vector by combining the first movement vector and the second movement vector with weighting based on the direction of the first or second movement vector, and constructing an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- In the first movement estimation step of the ultrasonic image generation method, the movement vector of the ultrasonic probe from when one of two images among the plurality of images is acquired until the other is acquired may be estimated by calculating the movement vector based on the image shift between the two images.
- FIG. 1A is a configuration diagram of the ultrasonic image generating apparatus 10 according to the first embodiment of the present invention.
- the ultrasonic image generating apparatus 10 includes a first movement estimation unit 10A, a second movement estimation unit 10B, and a position reconstruction unit 10C.
- the first movement estimation unit 10A estimates the movement vector of the ultrasonic probe when a plurality of images corresponding to each of the plurality of ultrasonic signals are acquired by the first estimation method.
- the second movement estimation unit 10B estimates the movement vector of the ultrasonic probe by a second estimation method that is different from the first estimation method in the direction dependency of estimation accuracy.
- The position reconstruction unit 10C calculates a combined movement vector by weighting and combining the movement vectors estimated by the first movement estimation unit and the second movement estimation unit based on the moving direction of the ultrasonic probe, and generates an ultrasonic diagnostic image of the subject by arranging the plurality of images according to the combined movement vector.
- FIG. 1B is a configuration diagram of the ultrasonic image generating apparatus 11 according to the first embodiment of the present invention.
- The ultrasonic image generation device 11 includes an ultrasonic probe 101, a transmission unit 102, a reception unit 103, a transmission / reception control unit 104, an ultrasonic image generation unit 105, an image memory 115, a position acquisition unit 111, an image position estimation unit 112, a position reconstruction unit 113, and a display unit 114. The operation of the functional blocks that transmit and receive an ultrasonic signal and generate an ultrasonic image such as a B-mode image or a Doppler image is the same as in the conventional ultrasonic diagnostic apparatus 1501, so the same reference symbols are used and the description is omitted.
- the image position estimation unit 112 corresponds to the first movement estimation unit 10A.
- the position acquisition unit 111 corresponds to the second movement estimation unit 10B.
- As the ultrasonic probe 101, a linear probe, which is composed of at least one row of ultrasonic transducers and obtains a two-dimensional image; an oscillating 3D probe, in which a row of transducers oscillates or translates to continuously generate two-dimensional images and thereby obtain a three-dimensional image; or a matrix probe, which obtains a three-dimensional image using transducers arranged two-dimensionally, can be used.
- In this embodiment, a linear probe is described as an example.
- the position acquisition unit 111 acquires the position information LocInf1 of the ultrasonic probe 101 and inputs it to the position reconstruction unit 113.
- Here, the position information is information indicating the position in the three-dimensional space (corresponding to the coordinate values on the x, y, and z axes) and the orientation (the amounts of rotation around the three axes, that is, information determining the attitude of the ultrasonic probe in the three-dimensional space).
- As the position acquisition unit 111, various systems can be used, such as an optical system like a camera, a magnetic sensor, a gyroscope, an acceleration sensor, or GPS. In this embodiment, a camera is described as an example.
- the image position estimation unit 112 estimates the movement vector MEInf between images based on the correlation between the images based on the ultrasonic image held in the image memory 115 and inputs the estimated vector to the position reconstruction unit 113.
- FIG. 2 is an explanatory diagram of a method for acquiring a movement vector by image processing.
- FIG. 2A shows the Nth acquired ultrasonic image.
- the solid line in FIG. 2B shows the (N + 1) th ultrasonic image.
- This displacement is determined based on an evaluation value computed for the entire image or for each evaluation unit obtained by dividing the image. For example, using the sum of differences of pixel values between evaluation units as the evaluation value, the evaluation unit in the (N+1)th image that minimizes the evaluation value with respect to an evaluation unit at a specific position in the Nth image may be found, and the displacement may be determined as the positional deviation between the two evaluation units.
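A minimal version of this evaluation-unit matching might look like the following. This is an illustrative sketch using the sum of absolute differences; the block size and search range are arbitrary assumptions, and the search window is assumed to stay inside the image.

```python
# Block matching between the Nth and (N+1)th images: find the shift of an
# evaluation unit that minimizes the sum of absolute pixel differences (SAD).
def sad(block_a, block_b):
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def block(img, top, left, h, w):
    """Extract an h-by-w evaluation unit from a 2-D list of pixel values."""
    return [row[left:left + w] for row in img[top:top + h]]

def match_block(img_n, img_n1, top, left, h, w, search=2):
    """Return the (dy, dx) shift minimizing SAD over a small search window.
    Caller must keep top+dy and left+dx (plus h, w) inside the image bounds."""
    ref = block(img_n, top, left, h, w)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            score = sad(ref, block(img_n1, top + dy, left + dx, h, w))
            if best is None or score < best[0]:
                best = (score, (dy, dx))
    return best[1]
```

The returned in-plane shift corresponds to the positional deviation vector of FIG. 2B; the text's alternatives (normalized correlation, mutual information) would replace only the `sad` evaluation function.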
- the contour of the imaging target may be extracted from the image, and a region including the extracted contour may be used.
- Alternatively, a measure such as the correlation of pixel values between evaluation units or the mutual information may be used as the evaluation value.
- Alternatively, feature points may be detected by a method such as SIFT (Scale-Invariant Feature Transform), or by an active contour model such as SNAKES, and the displacement vector may be determined from the average or median of the displacements of multiple feature points.
- the center of gravity of the contour may be calculated from the detected feature points, and the position shift of the center of gravity may be used as a position shift vector.
- alternatively, the contour of the imaging target may be extracted from the image, and the displacement vector may be obtained using points on the extracted contour as feature points. Since the displacement vector shown in FIG. 2B is a vector in the image plane, it is converted into a vector in the three-dimensional space based on the position information of the Nth image. Hereinafter, a vector in the three-dimensional space indicating the positional deviation between two images is referred to as a movement vector.
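Lifting the in-plane displacement vector of FIG. 2B into the three-dimensional movement vector of FIG. 2C can be sketched as below, assuming the position information of the Nth image supplies two orthonormal in-plane axes of the image in three-dimensional space (the parameter names are hypothetical).

```python
def displacement_to_movement_vector(du, dv, u_axis, v_axis):
    """Combine the two in-plane axes of the N-th image (unit vectors in 3-D
    space, taken from its position information), scaled by the in-plane
    displacement (du, dv), into a movement vector in 3-D space."""
    return tuple(du * u + dv * v for u, v in zip(u_axis, v_axis))
```

For example, an image whose horizontal axis points along x and whose vertical axis points along z maps the displacement (2, 3) to the movement vector (2, 0, 3).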
- FIG. 2C shows the result of converting the displacement vector of FIG. 2B into a movement vector.
- the movement vector corresponds to the moving direction and moving distance of the ultrasonic probe in the three-dimensional space. Note that the conversion into the movement vector may be performed by the position reconstruction unit 113.
- the movement vector can be calculated from the difference between the position information of the Nth image and the (N + 1) th image acquired by the position acquisition unit 111.
- the position reconstruction unit 113 maps each ultrasonic image into the three-dimensional space based on position information determined by weighting the position information LocInf1 and the movement vector MEInf, constructs a three-dimensional image, and generates a three-dimensional image ProcImg1 for display. Finally, the display unit 114 displays the three-dimensional image ProcImg1 on an output device such as a monitor. Since the ultrasonic image generation device 11 of the present invention is characterized by the operations of the position acquisition unit 111, the image position estimation unit 112, and the position reconstruction unit 113, the operation of these functional blocks is mainly described below; description of the other functional blocks is omitted as appropriate.
- FIG. 3 is a flowchart showing the operation of the ultrasonic image generating apparatus 11.
- First, in step S101, the ultrasonic image generating device 11 acquires an ultrasonic image.
- In step S102, position information corresponding to the ultrasonic image acquired in step S101 is acquired from the position acquisition unit 111.
- In step S103, it is determined whether or not the acquisition of ultrasonic images has ended.
- The processes in steps S101 and S102 are repeated until it is determined in step S103 that the acquisition of ultrasonic images is complete.
- In step S104, the amount of relative positional deviation between the ultrasonic images is estimated by image processing, and a movement vector is calculated.
- In step S105, the position information of each ultrasonic image is determined by weighting the movement vector calculated from the position information acquired by the position acquisition unit 111 and the movement vector obtained by image processing.
- In step S106, a three-dimensional image is constructed by mapping each ultrasonic image into the three-dimensional coordinate space based on the position information determined in step S105.
- In step S107, the three-dimensional image is displayed.
- FIG. 4 is a block diagram showing the configuration of the position reconstruction unit 113.
- the position reconstruction unit 113 includes a direction difference acquisition unit 1131, a weight determination unit 1132, and a 3D image construction unit 1133.
- the direction difference acquisition unit 1131 calculates the direction difference Diff between the movement vector acquired by the position acquisition unit 111 and the reference direction, that is, the angle formed by both vectors.
- the reference direction indicates a specific direction with respect to the position acquisition unit 111.
- the reference direction is, for example, the depth direction of the camera.
- the reference direction may be the normal direction of the imaging surface of the camera.
- the weight determination unit 1132 weights the movement vectors acquired by the position acquisition unit 111 and the image position estimation unit 112 based on the direction difference Diff to determine a final movement vector. This movement vector is then added to the position information of the image immediately preceding the processing target image to determine the position information of the processing target image, which is input to the three-dimensional image construction unit 1133 as the position information LocInf2.
- the 3D image construction unit 1133 maps the ultrasound image in the 3D space based on the position information LocInf2 to construct a 3D image.
- when a representation format that can be directly added or subtracted, such as a quaternion, is used, the position information itself can be added or subtracted; when a representation format such as Euler angles is used, the orientation in the position information is first converted into a format that can be added or subtracted, and then the position information is added or subtracted. Further, since the position acquisition unit 111 obtains position information relative to the reference position of the marker, when mapping the ultrasonic image into the three-dimensional space, the pixel positions on the ultrasonic image are rotated and translated in consideration of the offset between the reference position and orientation of the marker.
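The orientation arithmetic mentioned above can be illustrated with a minimal sketch: quaternions compose directly by multiplication, whereas Euler angles must first be converted. The z-y-x rotation order assumed in `euler_to_quat` is an illustrative choice, not specified by the patent.

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def euler_to_quat(roll, pitch, yaw):
    """Convert z-y-x Euler angles (radians) to a unit quaternion."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (cr*cp*cy + sr*sp*sy,
            sr*cp*cy - cr*sp*sy,
            cr*sp*cy + sr*cp*sy,
            cr*cp*sy - sr*sp*cy)
```

Composing two 90-degree yaw rotations in quaternion form yields the quaternion of a 180-degree yaw rotation, which is the additive behaviour the text relies on.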
- FIG. 5 is an explanatory diagram of the angle formed by the reference direction and the movement vector.
- in FIG. 5, the depth direction of the camera, in which the position resolution of the camera is low, is set as the reference direction. If the angle between the movement vector and the reference direction is θ, the angle θ corresponds to the direction difference Diff. Note that the movement vector obtained from the camera is used when determining the angle θ.
- the weight determination unit 1132 weights the movement vector obtained from the position acquisition unit 111 and the movement vector obtained by image processing based on the angle θ, by the method exemplified in (Equation 1).
- mv_3 represents a weighted movement vector
- mv_1 represents a movement vector obtained from the position acquisition unit
- mv_2 represents a movement vector obtained by image processing.
- in (Equation 1), the weight of mv_1 decreases as the movement vector approaches the depth direction, that is, as cos θ approaches 1.
- the method of calculating the movement vector is not limited to that of (Equation 1); any function that monotonically increases or decreases as θ changes may also be used.
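Since (Equation 1) itself is not reproduced in this text, the sketch below assumes one form consistent with the description: the weight of mv_1 falls toward zero as |cos θ| approaches 1 (movement along the camera depth direction).

```python
import math

def combine_movement_vectors(mv_1, mv_2, theta):
    """Blend the camera vector mv_1 and the image-processing vector mv_2
    according to the direction difference theta; one plausible form of
    (Equation 1), not the patent's exact formula."""
    w = abs(math.cos(theta))  # 1.0 when moving along the camera depth direction
    return tuple((1.0 - w) * a + w * b for a, b in zip(mv_1, mv_2))
```

With theta = 0 the result is mv_2 (the camera is blind along its depth axis), and with theta = 90 degrees it is mv_1.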
- FIG. 6 is a flowchart showing the operation of the position reconstruction unit 113.
- first, the position reconstruction unit 113 calculates a movement vector 1, the difference between the positions of the (N−1)th and Nth images obtained from the position acquisition unit 111.
- next, a movement vector 2 indicating the amount of positional deviation between the two images is calculated by image processing.
- then, the angle θ formed by the movement vector 1 and the reference direction is calculated.
- the movement vector 1 and the movement vector 2 are weighted based on the angle θ, and a movement vector 3, the final movement vector of the Nth image, is determined.
- in step S1035, the movement vector 3 is added to the position vector of the (N−1)th image to determine the position of the Nth image.
- the position vector is a vector determined by the position and orientation of the image.
- note that the angle θ may instead be calculated from the angle formed by the movement vector 3 of the (N−1)th image and the reference direction; when no preceding movement vector 3 exists, the movement vector 1 is regarded as the movement vector 3.
- in step S1036, the ultrasonic image is mapped into the three-dimensional space based on the position information to construct a three-dimensional image.
- in (Equation 1), the movement vector 1 and the movement vector 2 are weighted as whole vectors based on the angle θ.
- alternatively, the movement vectors may be weighted after being decomposed into direction components.
- FIG. 7 shows an example of decomposing the movement vector based on the reference direction.
- FIG. 7A shows the relationship between the movement vector 1 and the reference direction.
- FIG. 7B shows an example in which the movement vector 1 is decomposed into three components: a reference-direction component, a first component orthogonal to the reference direction, and a second component orthogonal to the reference direction. Since the reference-direction component is significantly affected by the decrease in the position resolution of the camera, it is weighted differently from the other two components. (Equation 2) shows an example of such weighting.
- each vector of mv_1, mv_2, mv_3 is the same as (Equation 1).
- [ref], [crs1], and [crs2] indicate a reference direction component of the movement vector, a first orthogonal component of the reference direction, and a second orthogonal component of the reference direction, respectively.
- α represents a weight for the first orthogonal component
- β represents a weight for the second orthogonal component. Note that by setting α or β to 0, the contribution of the component in the corresponding direction can be eliminated entirely.
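A sketch of the component-wise weighting of FIG. 7 and (Equation 2); the exact formula is not reproduced in this text, so the blend below is one plausible form, in which the reference-direction component (where the camera resolution is low) is taken from image processing and the orthogonal components are blended with the two weights.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def decompose(v, ref, crs1, crs2):
    """Project v onto the orthonormal basis (ref, crs1, crs2) of FIG. 7(b)."""
    return (dot(v, ref), dot(v, crs1), dot(v, crs2))

def combine_by_components(mv_1, mv_2, alpha, beta):
    """One plausible form of (Equation 2), applied to (ref, crs1, crs2)
    components: the depth component comes from image processing, and the
    orthogonal components are blended with weights alpha and beta."""
    return (mv_2[0],
            alpha * mv_1[1] + (1.0 - alpha) * mv_2[1],
            beta * mv_1[2] + (1.0 - beta) * mv_2[2])
```

Setting alpha = 1 and beta = 0 takes the first orthogonal component entirely from the camera and the second entirely from image processing, the all-or-nothing case mentioned above.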
- FIG. 8 shows an example in which a movement vector is decomposed into components of an absolute coordinate space.
- the absolute coordinate space is a three-dimensional space determined by preset three axes orthogonal to each other (the x axis, the y axis, and the z axis in FIG. 8), and the movement vector 1 is in the three axis directions of x, y, and z. It is decomposed into components.
- the components in the respective axial directions are determined based on angles formed by the respective axes, the reference direction C (camera reference direction), and the reference direction I (image processing reference direction).
- An example of movement vector weighting is shown in (Equation 3).
- φ represents the angle formed by the reference direction C and the x axis
- ψ represents the angle formed by the reference direction I and the x axis
- [x] represents the x-axis direction component of the movement vector.
- c1 and c2 are coefficients that normalize the sum of the weights for mv_1[x] and mv_2[x] to 1. The same processing can be performed for the y-axis and z-axis directions. This method has the advantage that the weighting between the movement vector obtained from the camera and the movement vector obtained by image processing can be set flexibly.
- by adjusting c1 and c2, the ratio of the contributions of mv_1[x] and mv_2[x] to mv_3[x] can be set. For example, by making c1 and c2 equal, the contributions of mv_1[x] and mv_2[x] become equal; by making c1 a constant multiple of c2, the contribution of mv_1[x] becomes that constant multiple of the contribution of mv_2[x].
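(Equation 3) is likewise not reproduced here; the sketch below is one plausible per-axis reading of the description, in which each method's weight grows as its low-resolution direction turns away from the axis, the two weights are normalized to sum to 1, and `ratio` plays the role of c1/c2.

```python
import math

def axis_component(mv1_x, mv2_x, phi, psi, ratio=1.0):
    """One plausible reading of (Equation 3) for the x components.  phi and
    psi are the angles between the x axis and the reference directions C
    (camera depth) and I (image-plane normal)."""
    w1 = ratio * abs(math.sin(phi))  # camera weight: small when depth is along x
    w2 = abs(math.sin(psi))          # image-processing weight, analogously
    total = w1 + w2
    if total == 0.0:                 # both methods blind along this axis
        return 0.5 * (mv1_x + mv2_x)
    return (w1 / total) * mv1_x + (w2 / total) * mv2_x
```

With phi = 90 degrees and psi = 0 the camera fully resolves the x axis while image processing is blind along it, so the result equals mv1_x; swapping the angles yields mv2_x.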
- FIG. 9 is an explanatory diagram of a position information acquisition method using a plurality of cameras.
- FIG. 9A shows an example in which position information is acquired by two cameras, a camera 901 and a camera 902. The cameras are desirably arranged so that they are spatially separated and face different directions. Such an arrangement (1) prevents the marker from being obstructed by the operator of the ultrasonic probe or by the subject, and (2) because the reference directions of the cameras differ, reduces the directions in which the position resolution decreases.
- FIG. 9B shows the angles formed by the movement vector with the reference direction C1 (the reference direction of the camera 901) and the reference direction C2 (the reference direction of the camera 902).
- as the movement vector for determining the angle θ, the movement vector acquired by the camera 1, the movement vector acquired by the camera 2, or the movement vector 3 of the immediately preceding image can be used.
- the movement vector 3, the final movement vector, is obtained by first combining the movement vectors acquired by the plurality of cameras into the movement vector 1 and then weighting it against the movement vector 2 as in (Equation 1).
- alternatively, the plurality of movement vectors obtained from the cameras and the movement vector 2 may be weighted against one another directly.
- the former example is shown in (Equation 4).
- mv_1, mv_11, and mv_12 represent a movement vector 1 obtained by combining movement vectors of a plurality of cameras, a movement vector obtained from the camera 1, and a movement vector obtained from the camera 2, respectively.
- [ref], [crs1], and [crs2] indicate a reference direction component of the movement vector, a first orthogonal component in the reference direction, and a second orthogonal component in the reference direction, respectively.
- α represents a weight for the first orthogonal component
- β represents a weight for the second orthogonal component.
- for example, when the camera 1 has low resolution with respect to movement in the x-axis direction, the amount of movement in the x-axis direction obtained by the camera 2 is used when determining position information for the camera 1, thereby improving the position resolution.
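A sketch of the camera-combination step attributed to (Equation 4) above; as before the exact formula is not reproduced in this text, so the blend below is one plausible form, in which camera 1's depth component is taken from camera 2, whose resolution along that axis is high.

```python
def combine_two_cameras(mv_11, mv_12, alpha, beta):
    """One plausible form of (Equation 4), applied to camera 1's
    (ref, crs1, crs2) components: the camera 1 depth component is taken from
    camera 2, and the orthogonal components are blended with alpha, beta."""
    return (mv_12[0],
            alpha * mv_11[1] + (1.0 - alpha) * mv_12[1],
            beta * mv_11[2] + (1.0 - beta) * mv_12[2])
```

With alpha = beta = 1 only the depth component is replaced, which matches the motivating example of camera 2 supplying the x-axis movement amount for camera 1.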
- although FIG. 9 illustrates the case where two cameras are used, the method can be extended to three or more cameras.
- a plurality of types of position sensors such as a camera and a gyro may be used in combination.
- when the position acquisition unit 111 and image processing are used together and only a relative positional relationship can be obtained from the image processing, movement vectors are introduced in determining the position information.
- in a position acquisition unit 111 such as a camera, the absolute value of the position information with respect to the origin of the position acquisition unit is obtained. Therefore, when determining the position information only from the position information of a plurality of position acquisition units, the absolute values of the position information obtained from each position acquisition unit may be weighted without using movement vectors.
- FIG. 10 is a diagram showing the effect of the ultrasonic image generating apparatus 11 according to the first embodiment of the present invention.
- the ultrasonic probe moves in the y-axis direction
- in FIG. 10, the reference direction of the camera is the x-axis direction. The camera can therefore acquire high-resolution position information for movement of the ultrasonic probe within the y-z plane, and image processing can do so within the image plane. Therefore, by combining the camera and image processing, high-resolution position information can be acquired in all directions. In this way, by combining a plurality of position information acquisition methods, position information can be acquired with high resolution for movement of the ultrasonic probe in an arbitrary direction.
- as described above, a plurality of movement vectors are estimated using first and second estimation methods whose estimation accuracies have different direction dependencies, and the estimated vectors are combined.
- accordingly, the component in a direction in which the first estimation method has relatively low estimation accuracy can be compensated by the component in a direction in which the second estimation method has relatively high estimation accuracy, and the movement vector of the ultrasonic probe can be calculated.
- by decreasing the weight of the component with the lower estimation accuracy and increasing the weight of the component with the higher estimation accuracy, a movement vector can be calculated with high accuracy in all directions. Therefore, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- the displacement of the two images can be analyzed and the amount of movement of the ultrasonic probe can be estimated.
- thereby, the movement of the ultrasonic probe in the plane including the cross section of the subject through which the ultrasonic signal corresponding to the image was transmitted can be detected with relatively high accuracy.
- between two images, the movement amount of corresponding pixels, or of regions that are collections of a plurality of pixels, can be calculated, and the movement amount of the ultrasonic probe can be estimated from it.
- the movement of the ultrasonic probe in the plane including the cross section of the subject corresponding to the image can be detected with relatively high accuracy.
- by setting a large weight for the component in the direction in which the estimation accuracy is relatively high when weighting the estimated position vectors to generate the combined position vector, the accuracy of the combined movement vector can be increased. Therefore, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- by setting a large weight for the direction component in the plane including the cross section of the subject corresponding to the image and then generating the combined position vector, the accuracy of the combined movement vector can be increased.
- the movement vector of the ultrasonic probe can be estimated by the optical marker and the camera. Thereby, the movement of the ultrasonic probe in a plane parallel to the imaging surface of the camera can be detected with relatively high accuracy.
- by setting a large weight for a direction component in a plane parallel to the imaging surface of the camera and then generating the combined position vector, the accuracy of the combined movement vector can be increased.
- the position estimation of the ultrasonic probe by image processing and that by the optical marker and the camera can be used together, and the position vectors of the ultrasonic probe estimated by the respective methods can be synthesized to generate a combined position vector.
- FIG. 11 is a block diagram showing a configuration of the ultrasonic image generating apparatus 12 according to the second embodiment of the present invention.
- the ultrasonic image generation device 12 includes an ultrasonic probe 101, a transmission unit 102, a reception unit 103, a transmission/reception control unit 104, an ultrasonic image generation unit 105, an image memory 115, an image position estimation unit 112, a position reconstruction unit 113, a position acquisition unit 201, an arrangement assist unit 202, and a display unit 203.
- in the ultrasonic image generation device 12, the position acquisition units are arranged so that the directions in which the position resolution decreases do not coincide between the plurality of position acquisition units, or between the position acquisition unit and the image processing.
- the ultrasonic image generation device 12 is a device in which an arrangement assist unit 202 for determining the arrangement position of the position acquisition unit 201 such as a camera is newly added to the ultrasonic image generation device 11. Note that the same functional blocks as those of the ultrasonic image generating apparatus 11 are denoted by the same reference numerals and description thereof is omitted.
- the arrangement assist unit 202 determines the arrangement target position of the position acquisition unit 201 based on the initial position information LocInf2 acquired from the position acquisition unit 201, generates auxiliary information NaviInf for guiding the position acquisition unit to the arrangement target position, and outputs it to the display unit 203. The display unit 203 displays the auxiliary information NaviInf on the display device.
- FIG. 12 is a flowchart showing the operation of the ultrasonic image generating apparatus 12.
- in step S301, the current position of the position acquisition unit 201 is acquired, and the arrangement target position of the position acquisition unit 201 is determined so that the position acquisition unit 201 is at a predetermined position with respect to the imaging target.
- processing when a camera is used as the position acquisition unit 201 will be described.
- first, a calibration marker placed near the imaging target site of the subject is photographed with the camera at its current position, and the relative position of the camera with respect to the calibration marker is calculated.
- the relative position of the camera with respect to the imaging target position may be determined by scanning the imaging target site with an ultrasonic probe to which the marker is attached to acquire the marker position.
- next, a method for determining the arrangement target position is described. In image processing, the position resolution in the normal direction of the imaging cross section of the ultrasonic probe is low. Therefore, the camera is arranged so that the difference between the direction in which the position resolution of the camera is high and the normal direction of the imaging cross section is equal to or less than a threshold value.
- the normal direction of the imaging cross section can be acquired by arranging the calibration marker so that the calibration marker and the normal direction have a predetermined positional relationship, and imaging the calibration marker with a camera.
- the normal direction of the imaging section can also be obtained by scanning the imaging region and photographing the moving direction of the ultrasonic probe with a camera. Further, in the camera, the position resolution in the depth direction is reduced. Therefore, when using a plurality of cameras, the depth direction of each camera is set to be a direction in which the position resolution of the other camera is high.
- it is desirable that the angle formed by the depth directions of the cameras be close to 90 degrees (substantially orthogonal). If that angle is 90 degrees, then even when the movement vector of the ultrasonic probe matches the depth direction of one camera, greatly reducing that camera's position resolution, it does not match the depth direction of the other camera. For this reason, the resolution of position acquisition by the two cameras together is maintained above a certain level.
- the angle formed by the depth direction of each camera may be set within a predetermined range near 90 degrees. For example, it may be from 80 degrees to 100 degrees, or may be other ranges. Note that the distance between the camera and the imaging target region is determined so that the moving range of the ultrasonic probe is within the field of view of the camera.
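The near-orthogonality criterion for two cameras can be checked with a small helper; the 80 to 100 degree range follows the example above, and the function name is hypothetical.

```python
import math

def placement_ok(depth_dir_1, depth_dir_2, lo_deg=80.0, hi_deg=100.0):
    """Check that the angle between the two cameras' depth directions lies in
    the near-orthogonal range suggested in the text (80-100 degrees here)."""
    dot = sum(a * b for a, b in zip(depth_dir_1, depth_dir_2))
    norm = (math.sqrt(sum(a * a for a in depth_dir_1))
            * math.sqrt(sum(b * b for b in depth_dir_2)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return lo_deg <= angle <= hi_deg
```

Two cameras whose depth directions are exactly orthogonal pass the check, while two cameras looking along the same axis fail it.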
- in step S302, auxiliary information for guiding the position acquisition unit 201 to the arrangement target position is generated and displayed based on the current position and the arrangement target position of the position acquisition unit 201. The user then moves the position acquisition unit 201 toward the arrangement target position according to the auxiliary information.
- in step S303, it is determined whether the difference in position and orientation between the current position and the arrangement target position is equal to or less than a threshold value, and the processes in steps S301 and S302 are repeated until it is.
- if it is determined in step S303 that the difference in position and orientation between the current position and the arrangement target position is equal to or less than the threshold value, auxiliary information indicating that the arrangement of the position acquisition unit 201 is complete is displayed in step S304. Thereafter, the same operation flow as that of the ultrasonic image generating apparatus 11 is performed.
- the position acquisition unit 201 may be attached to a movable device such as an electric stage or a robot arm and automatically moved to the arrangement target position.
- the position acquisition unit 201 is arranged only once before the ultrasonic image is picked up.
- however, when the traveling direction of the ultrasonic probe changes, the arrangement target position also changes. Therefore, when the position acquisition unit 201 can be moved automatically, the arrangement target position may be recalculated according to the traveling direction and the position acquisition unit 201 moved to it.
- FIG. 13 is an explanatory diagram of camera arrangement and auxiliary information for camera arrangement.
- (A) of FIG. 13 is an example of the target position of the camera when imaging the carotid artery located near the human neck.
- the camera 1 (1301) and the camera 2 (1302) are arranged so that the neck 1313 that is an imaging target part enters the field of view, and the depth direction of both forms an angle close to a right angle.
- the calibration marker 1312 is used to acquire the current position, and is installed so that the relative position between the neck 1313 and the calibration marker 1312 is constant. In this way, the relative position between the neck 1313 and the cameras (1301 and 1302) can be acquired.
- (B) of FIG. 13 shows a display example of auxiliary information.
- the current position (1301A) of the camera 1 and the current position (1302A) of the camera 2 are indicated by solid lines.
- the broken lines indicate the arrangement target position (1301B) of the camera 1 and the arrangement target position (1302B) of the camera 2.
- the arrows in the figure indicate the movement directions for guiding the respective cameras to the arrangement target positions, and the user moves the cameras in the direction indicated by the arrows.
- since the ultrasonic probe is often moved in the x-axis or y-axis direction rather than in the z-axis direction shown in FIG. 13, and the position resolution of image processing decreases for such movement, arranging a camera at a position looking down at the neck, like the camera 2 (1302) in the figure, allows movement in the x-axis and y-axis directions to be acquired with high resolution.
- FIG. 14A is a flowchart showing the operation of a modification of the ultrasonic image generation apparatus 11 according to Embodiment 1. In this modification, the movement vectors obtained from one or more position acquisition units are weighted to determine the movement vector of the ultrasonic image, and the movement vector is then corrected by image processing. The same steps as those in the flowchart of FIG. 3 are denoted by the same reference numerals, and description thereof is omitted.
- first, in step S201, position information is acquired from one or more position acquisition units.
- in step S202, the movement vector is determined by weighting the position information from the one or more position acquisition units acquired in step S201.
- in step S203, the relative displacement amount between the ultrasonic images is estimated by image processing, and a correction vector obtained by converting the displacement amount into a vector in the three-dimensional space is calculated.
- note that the amount of displacement between the (N−1)th image and the Nth image may be estimated after moving the Nth image based on the movement vector of the Nth image determined in step S202. In this way, the displacement is first estimated roughly using the movement vector obtained from the position acquisition unit and then estimated with high accuracy by image processing, which reduces the amount of processing and increases the robustness of the position estimation.
- the correction vector may be set so that the movement vector changes smoothly in consideration of the continuity of the movement amount between a plurality of consecutive images.
- when the movement vector obtained from the position acquisition unit fluctuates due to camera shake or errors in the position acquisition unit, smoothing the movement vector by image processing reduces the fluctuation and brings the estimate closer to the true position.
- alternatively, the movement vector from the position acquisition unit may be smoothed not by correlation between ultrasonic images but by interpolation processing such as spline interpolation or by filtering, and the difference between the movement vectors before and after smoothing may be used as the correction vector.
- the position information acquired from the position acquisition unit may be smoothed, and the difference between the position information before and after the smoothing may be used as the movement vector.
- in step S204, the correction vector is added to the movement vector to correct it.
- in step S205, a three-dimensional image is constructed by mapping the ultrasonic images into the three-dimensional space based on the corrected movement vectors, and the image is displayed in step S107.
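The smoothing-based correction described above can be sketched as follows; a 3-point moving average stands in for spline interpolation or filtering, and the function names are hypothetical.

```python
def smooth_movement_vectors(mvs):
    """3-point moving average of a sequence of (x, y, z) movement vectors
    (shorter windows at the sequence ends)."""
    out = []
    for i in range(len(mvs)):
        window = mvs[max(0, i - 1):i + 2]
        out.append(tuple(sum(c) / len(window) for c in zip(*window)))
    return out

def correction_vectors(mvs):
    """Difference between the smoothed and raw movement vectors; adding each
    correction to its raw vector yields the smoothed trajectory."""
    return [tuple(s - r for s, r in zip(sm, raw))
            for sm, raw in zip(smooth_movement_vectors(mvs), mvs)]
```

A constant sequence of movement vectors is already smooth, so every correction vector is zero.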
- in addition, the position reconstruction unit 113 may perform the following operations.
- the reliability of the position information obtained by the position acquisition unit or the image processing may be set, and the position information with low reliability may not be used. For example, when using one camera and position information obtained by image processing, if the reliability of the position information obtained by image processing is equal to or less than a threshold, only the position information obtained by the camera is used. Alternatively, in the case of using position information from two cameras and image processing, if the reliability of the position information of one camera is low, position information from another camera and image processing is used.
- when no reliable position information is obtained, the corresponding ultrasonic image may be excluded from the three-dimensional image construction.
- as the reliability of the position acquisition unit, the amount of change in the movement vector between successive images can be used. For example, if the difference in absolute value or in direction between the movement vector of the (N−1)th image and that of the Nth image exceeds a threshold value, the position information of the Nth image is invalidated.
- as the reliability of the image processing, the minimum evaluation value can be used, and the position information may be invalidated if the minimum evaluation value exceeds a threshold value.
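The reliability test on successive movement vectors can be sketched as follows; the thresholds and the function name are illustrative assumptions.

```python
import math

def motion_consistent(mv_prev, mv_curr, mag_thresh, ang_thresh_rad):
    """Return False (invalidate the N-th position) when the movement vector's
    magnitude or direction changes too sharply between successive images."""
    def norm(v):
        return math.sqrt(sum(a * a for a in v))
    n1, n2 = norm(mv_prev), norm(mv_curr)
    if abs(n2 - n1) > mag_thresh:
        return False
    if n1 == 0.0 or n2 == 0.0:
        return True  # no direction defined for a zero vector
    cos_a = sum(a * b for a, b in zip(mv_prev, mv_curr)) / (n1 * n2)
    return math.acos(max(-1.0, min(1.0, cos_a))) <= ang_thresh_rad
```

A small change in magnitude with no change in direction passes the test, while a 90-degree turn between consecutive images fails it.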
- the accumulation of errors may also be reduced by resetting the position information at predetermined scanning time intervals or every specific number of acquired images, so that the position information of an image is determined only relative to images near the processing target image.
- the position information is weighted based on the difference in direction between the reference direction and the movement vector, taking the camera as an example of the position acquisition unit.
- however, the weighting scale is not limited to the difference in direction. For example, when the resolution differs depending on the distance between the magnetic transmitter and the probe, as in a magnetic sensor, or depending on the rotation direction, as in a gyroscope, weighting may be performed according to that distance or rotation direction.
- an apparatus for constructing a three-dimensional image based on position information acquired with high position resolution has been described.
- however, the application is not limited to the construction of a three-dimensional image; the present invention can also be applied to the alignment of an ultrasonic image with a two-dimensional or three-dimensional image captured by a modality other than ultrasound, such as CT or MRI, or to the alignment of ultrasonic images captured at different dates and times, as in periodic diagnosis.
- the combined use of the position acquisition unit and the image processing is not essential, and in particular, when using a plurality of position acquisition units, the position information may be determined only from the position acquisition unit.
- the improvement in the accuracy of position acquisition in the position acquisition unit 111 and the image position estimation unit 112 and the improvement in the frequency are in a trade-off relationship.
- therefore, the frequency of position acquisition by the position acquisition unit 111 and the image position estimation unit 112 is changed according to the accuracy of the positions they acquire. Specifically, the difference between the position information acquired by the position acquisition unit 111 and that acquired by the image position estimation unit 112 is fed back to control the frequency of position acquisition.
- the position acquisition unit 111 and the image position estimation unit 112 are respectively an operation mode (high frequency mode) with low accuracy or robustness and high position acquisition frequency, and an operation mode (high accuracy) with high accuracy or robustness and low position acquisition frequency. These operation modes are switched by feedback information. By doing so, the accuracy or frequency of position acquisition of the position acquisition unit 111 and the image position estimation unit 112 can be improved according to the position acquisition situation.
- An example of feedback control is shown in FIG. 14B.
- FIG. 14B is a configuration diagram of an example of an ultrasonic image generation apparatus including a feedback unit. As shown in FIG. 14B, in this example, the ultrasonic image generating apparatus includes a feedback unit 116. Since the operation of other functional blocks is the same as that of the ultrasonic diagnostic apparatus 11, the same reference numerals are given and the description thereof is omitted.
- the feedback unit 116 acquires the position information obtained by the position acquisition unit 111 and the image position estimation unit 112, and calculates the difference between these pieces of position information. If the difference is larger than a threshold value, the position acquisition unit 111 and the image position estimation unit 112 are switched to the high accuracy mode; if the difference is smaller than the threshold, they are switched to the high frequency mode. In this way, the accuracy or frequency of position acquisition can be improved according to the position acquisition situation.
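The threshold-based switching performed by the feedback unit 116 can be sketched as below. This is a simplified illustration; the Euclidean-distance metric and the mode labels are assumptions.

```python
def select_mode(pos_a, pos_b, threshold):
    """Choose an operation mode from the discrepancy between two
    independently acquired positions, mimicking the feedback unit 116:
    large disagreement -> favour accuracy, small -> favour frequency."""
    diff = sum((a - b) ** 2 for a, b in zip(pos_a, pos_b)) ** 0.5
    return "high_accuracy" if diff > threshold else "high_frequency"
```

In a running system this decision would be re-evaluated each time both estimators produce a new position.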
- the direction in which the estimation accuracy of one estimation method is low is aligned with the direction in which the estimation accuracy of the other estimation method is high.
- the operator can thus set the directions of the estimation methods so as to suppress the direction dependency of the estimation accuracy. Therefore, the direction dependency of the position acquisition accuracy of the ultrasonic probe can be suppressed.
- FIG. 18 is an explanatory diagram when the ultrasonic image generation method of each of the above embodiments is executed by a computer system using a program recorded on a recording medium such as a flexible disk.
- FIG. 18B shows the front appearance and cross-sectional structure of the flexible disk, and the flexible disk itself.
- FIG. 18A shows an example of the physical format of the flexible disk, which is the main body of the recording medium.
- the flexible disk FD is housed in a case F; a plurality of tracks Tr are formed concentrically on the surface of the disk from the outer periphery toward the inner periphery, and each track is divided into 16 sectors Se in the angular direction. Therefore, on a flexible disk storing the program, the program is recorded in an area allocated on the flexible disk FD.
- FIG. 18C shows a configuration for recording and reproducing the program on the flexible disk FD.
- when the program for realizing the image processing method is recorded on the flexible disk FD, the program is written from the computer system Cs via a flexible disk drive.
- when the system that realizes the image processing method is to be constructed in the computer system using the program on the flexible disk, the program is read from the flexible disk by the flexible disk drive and transferred to the computer system.
- the recording medium is not limited to this, and any recording medium such as an IC card or a ROM cassette capable of recording a program can be similarly implemented.
- blocks such as the ultrasonic image generation unit 105, the position acquisition unit 111, the image position estimation unit 112, the position reconstruction unit 113, and the image memory 115 in FIG. 1B are typically realized as an integrated circuit such as an LSI (Large Scale Integration). These may be made into individual chips, or may be made into a single chip that includes some or all of them.
- the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- a dedicated circuit for graphics processing, such as a GPU (Graphics Processing Unit), can also be used.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- the software that realizes the ultrasonic image generation apparatus of each of the above embodiments is the following program.
- this program causes a computer to execute an ultrasonic image generation method for generating an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while an ultrasonic probe is moved, the method including:
- a first movement estimation step of estimating, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe when a plurality of images corresponding to the respective ultrasonic signals are acquired;
- a second movement estimation step of estimating the movement vector when the plurality of images corresponding to the respective ultrasonic signals are acquired, as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method; and
- a position reconstruction step of calculating a combined movement vector by combining the first movement vector and the second movement vector with weighting based on the direction of the first movement vector or the second movement vector, and constructing an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- in the first movement estimation step, the program may cause the computer to estimate the movement vector of the ultrasonic probe by calculating, based on the image shift between two of the plurality of images, the movement vector for the period from when one of the two images is acquired until the other is acquired.
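The position reconstruction step's direction-based weighting can be sketched as follows. The specific weight function `1 - |cos|` (weight grows with the angle between the movement direction and an estimator's reference direction) is an assumption for illustration; the patent only specifies that a larger angle yields a larger weight.

```python
import math

def combine_movement_vectors(v1, v2, ref1, ref2):
    """Weight each estimator's movement vector by the angle between the
    probe's movement direction and that estimator's reference (low
    accuracy) direction -- the larger the angle, the larger the weight --
    then blend the two vectors into a combined movement vector (sketch)."""
    def weight(v, ref):
        nv = math.sqrt(sum(c * c for c in v))
        nr = math.sqrt(sum(c * c for c in ref))
        if nv == 0.0 or nr == 0.0:
            return 0.5  # no usable direction: fall back to equal weight
        cos = sum(a * b for a, b in zip(v, ref)) / (nv * nr)
        return 1.0 - abs(cos)  # 90 degrees to reference -> weight 1.0

    w1, w2 = weight(v1, ref1), weight(v2, ref2)
    total = w1 + w2
    if total == 0.0:
        w1 = w2 = 0.5
        total = 1.0
    return tuple((w1 * a + w2 * b) / total for a, b in zip(v1, v2))
```

For a camera whose reference direction is the imaging-plane normal, motion within the imaging plane (90 degrees to the normal) receives the maximum weight, matching the behaviour described in the claims.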
- the ultrasonic image generation apparatus has been described above based on the embodiments, but the present invention is not limited to these embodiments. Various modifications conceived by those skilled in the art, and forms constructed by combining components of different embodiments, may also be included within the scope of one or more aspects of the present invention, as long as they do not depart from the gist of the present invention.
- in the ultrasonic image generating apparatus and method of the present invention, when a three-dimensional image is constructed from ultrasonic images obtained by scanning an imaging target from an arbitrary direction together with position information, the final position information is determined using position information from a plurality of position acquisition units. Highly accurate position information can therefore be acquired for movement in an arbitrary direction, and a three-dimensional image can be constructed with high accuracy. Since the ultrasonic image generating apparatus and method according to the present invention make it possible to observe the three-dimensional shape of a diagnostic region with high accuracy, improved diagnostic accuracy can be expected, and the invention is particularly applicable in the medical diagnostic equipment industry.
Abstract
Description
The present inventor found that the following problems arise with the ultrasonic diagnostic apparatus described in the "Background Art" section.
FIG. 1A is a configuration diagram of an ultrasonic image generation apparatus 10 according to Embodiment 1 of the present invention.
FIG. 11 is a block diagram showing the configuration of an ultrasonic image generation apparatus 12 according to Embodiment 2 of the present invention. The ultrasonic image generation apparatus 12 includes an ultrasonic probe 101, a transmission unit 102, a reception unit 103, a transmission/reception control unit 104, an ultrasonic image generation unit 105, an image memory 115, an image position estimation unit 112, a position reconstruction unit 113, a position acquisition unit 201, a placement assist unit 202, and a display unit 203. As described with reference to FIG. 10, in the ultrasonic image generation apparatus of the present invention, a plurality of position acquisition units, or a position acquisition unit and image processing, complement one another to reduce the directions in which position resolution deteriorates. The ultrasonic image generation apparatus 12 is obtained by newly adding, to the ultrasonic image generation apparatus 11, a placement assist unit 202 for determining the placement position of a position acquisition unit 201 such as a camera. Functional blocks identical to those of the ultrasonic image generation apparatus 11 are given the same reference numerals, and their description is omitted.
By recording the program for realizing the ultrasonic image generation method described in each of the above embodiments on a recording medium such as a flexible disk, the processing described in the above embodiments can easily be carried out on an independent computer system.
10A First movement estimation unit
10B Second movement estimation unit
10C, 113, 1508 Position reconstruction unit
101 Ultrasonic probe
102 Transmission unit
103 Reception unit
104 Transmission/reception control unit
105 Ultrasonic image generation unit
111, 201, 1507 Position acquisition unit
112 Image position estimation unit
114, 203, 1509 Display unit
115, 1506 Image memory
1131 Direction difference acquisition unit
1132 Weight determination unit
1133 Three-dimensional image construction unit
Claims (13)
- An ultrasonic image generation apparatus that generates an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe, the apparatus comprising: a first movement estimation unit that estimates, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe when a plurality of images corresponding to the respective ultrasonic signals are acquired; a second movement estimation unit that estimates, as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method, the movement vector when the plurality of images corresponding to the respective ultrasonic signals are acquired; and a position reconstruction unit that calculates a combined movement vector by weighting and combining the first movement vector estimated by the first movement estimation unit and the second movement vector estimated by the second movement estimation unit based on the direction of the first movement vector or the second movement vector, and constructs an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- The ultrasonic image generation apparatus according to claim 1, wherein the first movement estimation unit estimates the movement vector of the ultrasonic probe by calculating, based on an image shift between two of the plurality of images, the movement vector for the period from when one of the two images is acquired until the other is acquired.
- The ultrasonic image generation apparatus according to claim 2, wherein the first movement estimation unit estimates the movement vector of the ultrasonic probe between the acquisition of one of the two images and the acquisition of the other by detecting the amount of image shift between the two images based on the correlation of the pixel values constituting the two images.
- The ultrasonic image generation apparatus according to any one of claims 1 to 3, wherein the position reconstruction unit constructs the ultrasonic diagnostic image by calculating the combined movement vector such that the weight of the first movement vector or the second movement vector is increased as the angle between the movement direction of the ultrasonic probe and the reference direction used for estimation in the first movement estimation unit or the second movement estimation unit increases.
- The ultrasonic image generation apparatus according to claim 4, wherein the position reconstruction unit constructs the ultrasonic diagnostic image by calculating the combined movement vector such that the weight of the movement vector estimated by the first movement estimation unit is increased as the angle between the movement direction of the ultrasonic probe and the direction of low estimation accuracy, which is the reference direction of the first movement estimation unit, increases.
- The ultrasonic image generation apparatus according to claim 1, wherein at least one of the first movement estimation unit and the second movement estimation unit images an optical marker attached to the ultrasonic probe with a camera, and calculates the position and angle of the ultrasonic probe based on the position or shape of the imaged optical marker.
- The ultrasonic image generation apparatus according to claim 6, wherein the position reconstruction unit constructs the plurality of images by increasing the weight of the movement vector estimated by the optical movement estimation unit as the angle between the movement direction of the ultrasonic probe and the normal direction of the imaging plane of the camera, which is the reference direction of the optical movement estimation unit, increases, and combining the movement vectors.
- The ultrasonic image generation apparatus according to claim 1, wherein the first movement estimation unit estimates the movement vector of the ultrasonic probe by calculating, based on an image shift between two of the plurality of images, the movement vector of the ultrasonic probe for the period from when one of the two images is acquired until the other is acquired, and the second movement estimation unit estimates the movement vector of the ultrasonic probe by imaging an optical marker attached to the ultrasonic probe with a camera and calculating the position and angle of the ultrasonic probe based on the position or shape of the imaged optical marker.
- The ultrasonic image generation apparatus according to any one of claims 1 to 8, further comprising a placement assist unit that presents information prompting the operator to perform an operation for changing the reference directions so that the reference directions of the first movement estimation unit and the second movement estimation unit are substantially orthogonal to each other.
- An ultrasonic image generation method for generating an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe, the method comprising: a first movement estimation step of estimating, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe when a plurality of images corresponding to the respective ultrasonic signals are acquired; a second movement estimation step of estimating, as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method, the movement vector when the plurality of images corresponding to the respective ultrasonic signals are acquired; and a position reconstruction step of calculating a combined movement vector by weighting and combining the first movement vector estimated in the first movement estimation step and the second movement vector estimated in the second movement estimation step based on the direction of the first movement vector or the second movement vector, and constructing an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
- The ultrasonic image generation method according to claim 10, wherein in the first movement estimation step, the movement vector of the ultrasonic probe is estimated by calculating, based on an image shift between two of the plurality of images, the movement vector for the period from when one of the two images is acquired until the other is acquired.
- A program for causing a computer to execute the method according to claim 10.
- An integrated circuit that generates an ultrasonic diagnostic image using a plurality of ultrasonic signals obtained from a subject while moving an ultrasonic probe, the integrated circuit comprising: a first movement estimation unit that estimates, as a first movement vector by a first estimation method, a movement vector indicating the movement of the ultrasonic probe when a plurality of images corresponding to the respective ultrasonic signals are acquired; a second movement estimation unit that estimates, as a second movement vector by a second estimation method whose direction dependency of estimation accuracy differs from that of the first estimation method, the movement vector when the plurality of images corresponding to the respective ultrasonic signals are acquired; and a position reconstruction unit that calculates a combined movement vector by weighting and combining the first movement vector estimated by the first movement estimation unit and the second movement vector estimated by the second movement estimation unit based on the direction of the first movement vector or the second movement vector, and constructs an ultrasonic diagnostic image of the subject using the combined movement vector and the plurality of images.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12792497.5A EP2716230A4 (en) | 2011-05-30 | 2012-05-30 | ULTRASONIC IMAGE GENERATING DEVICE AND ULTRASONIC IMAGE GENERATION METHOD |
JP2012540616A JP5862571B2 (ja) | 2011-05-30 | 2012-05-30 | 超音波画像生成装置および超音波画像生成方法 |
US13/812,062 US20130131510A1 (en) | 2011-05-30 | 2012-05-30 | Ultrasound image generation apparatus and ultrasound image generation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011119986 | 2011-05-30 | ||
JP2011-119986 | 2011-05-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012164919A1 (ja) | 2012-12-06 |
Family
ID=47258792
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/003523 WO2012164919A1 (ja) | 2011-05-30 | 2012-05-30 | 超音波画像生成装置および超音波画像生成方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130131510A1 (ja) |
EP (1) | EP2716230A4 (ja) |
JP (1) | JP5862571B2 (ja) |
WO (1) | WO2012164919A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160135782A1 (en) * | 2014-11-14 | 2016-05-19 | General Electric Company | Finger joint ultrasound imaging |
JP6405058B2 (ja) | 2015-03-31 | 2018-10-17 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 医療イメージング装置 |
WO2017100920A1 (en) * | 2015-12-14 | 2017-06-22 | The Governors Of The University Of Alberta | Apparatus and method for generating a fused scan image of a patient |
USD843399S1 (en) | 2016-07-29 | 2019-03-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with transitional graphical user interface |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08332187A (ja) * | 1995-06-08 | 1996-12-17 | Aloka Co Ltd | 超音波画像処理装置 |
JPH11313822A (ja) * | 1998-03-20 | 1999-11-16 | General Electric Co <Ge> | 3次元イメ―ジング・システム並びに走査平面の動きを追跡する方法及び装置 |
JP2001157677A (ja) * | 1999-12-01 | 2001-06-12 | Hitachi Medical Corp | 超音波診断装置 |
JP2002102223A (ja) * | 2000-10-03 | 2002-04-09 | Mitani Sangyo Co Ltd | 超音波断層画像における面座標検出方法ならびにシステムおよび同方法がプログラムされ記録された記録媒体 |
JP2005103328A (ja) * | 2005-01-18 | 2005-04-21 | Hitachi Ltd | 超音波像診断装置及びそれに使用するプログラム |
JP2008125692A (ja) * | 2006-11-20 | 2008-06-05 | Aloka Co Ltd | 超音波診断装置 |
JP2010075503A (ja) | 2008-09-26 | 2010-04-08 | Hitachi Medical Corp | マルチモダリティ手術支援装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5608849A (en) * | 1991-08-27 | 1997-03-04 | King, Jr.; Donald | Method of visual guidance for positioning images or data in three-dimensional space |
US5899861A (en) * | 1995-03-31 | 1999-05-04 | Siemens Medical Systems, Inc. | 3-dimensional volume by aggregating ultrasound fields of view |
US5566674A (en) * | 1995-06-30 | 1996-10-22 | Siemens Medical Systems, Inc. | Method and apparatus for reducing ultrasound image shadowing and speckle |
AU1983397A (en) * | 1996-02-29 | 1997-09-16 | Acuson Corporation | Multiple ultrasound image registration system, method and transducer |
JP4636696B2 (ja) * | 1999-04-20 | 2011-02-23 | アーオー テクノロジー アクチエンゲゼルシャフト | ヒト又は動物の器官の表面における3d座標の経皮的獲得用の装置 |
WO2006127142A2 (en) * | 2005-03-30 | 2006-11-30 | Worcester Polytechnic Institute | Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors |
US7844094B2 (en) * | 2005-04-29 | 2010-11-30 | Varian Medical Systems, Inc. | Systems and methods for determining geometric parameters of imaging devices |
CA2751629C (en) * | 2007-10-19 | 2016-08-23 | Metritrack, Llc | Three dimensional mapping display system for diagnostic ultrasound machines and method |
WO2009070696A1 (en) * | 2007-11-26 | 2009-06-04 | Proiam, Llc | Enrollment apparatus, system, and method |
- 2012
- 2012-05-30 US US13/812,062 patent/US20130131510A1/en not_active Abandoned
- 2012-05-30 JP JP2012540616A patent/JP5862571B2/ja active Active
- 2012-05-30 WO PCT/JP2012/003523 patent/WO2012164919A1/ja active Application Filing
- 2012-05-30 EP EP12792497.5A patent/EP2716230A4/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08332187A (ja) * | 1995-06-08 | 1996-12-17 | Aloka Co Ltd | 超音波画像処理装置 |
JPH11313822A (ja) * | 1998-03-20 | 1999-11-16 | General Electric Co <Ge> | 3次元イメ―ジング・システム並びに走査平面の動きを追跡する方法及び装置 |
JP2001157677A (ja) * | 1999-12-01 | 2001-06-12 | Hitachi Medical Corp | 超音波診断装置 |
JP2002102223A (ja) * | 2000-10-03 | 2002-04-09 | Mitani Sangyo Co Ltd | 超音波断層画像における面座標検出方法ならびにシステムおよび同方法がプログラムされ記録された記録媒体 |
JP2005103328A (ja) * | 2005-01-18 | 2005-04-21 | Hitachi Ltd | 超音波像診断装置及びそれに使用するプログラム |
JP2008125692A (ja) * | 2006-11-20 | 2008-06-05 | Aloka Co Ltd | 超音波診断装置 |
JP2010075503A (ja) | 2008-09-26 | 2010-04-08 | Hitachi Medical Corp | マルチモダリティ手術支援装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2716230A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016527998A (ja) * | 2013-08-20 | 2016-09-15 | キュアファブ テクノロジーズ ゲゼルシャフト ミット ベシュレンクテル ハフツング | 光学追跡 |
WO2022044391A1 (ja) * | 2020-08-26 | 2022-03-03 | 富士フイルム株式会社 | 超音波診断システムおよび超音波診断システムの制御方法 |
JP7476320B2 (ja) | 2020-08-26 | 2024-04-30 | 富士フイルム株式会社 | 超音波診断システムおよび超音波診断システムの制御方法 |
Also Published As
Publication number | Publication date |
---|---|
JP5862571B2 (ja) | 2016-02-16 |
US20130131510A1 (en) | 2013-05-23 |
JPWO2012164919A1 (ja) | 2015-02-23 |
EP2716230A1 (en) | 2014-04-09 |
EP2716230A4 (en) | 2014-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5862571B2 (ja) | 超音波画像生成装置および超音波画像生成方法 | |
US9492141B2 (en) | Ultrasonic image generating device and image generating method | |
CN102047140B (zh) | 具有引导efov扫描的扩展视野超声成像 | |
US20090306509A1 (en) | Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors | |
US20050085729A1 (en) | Ultrasonic image processor and ultrasonic diagnostic instrument | |
US20090054776A1 (en) | Ultrasound diagnosis apparatus and method for acquiring 3-d images | |
US20180092628A1 (en) | Ultrasonic diagnostic apparatus | |
US9592028B2 (en) | Ultrasonic diagnostic apparatus | |
CN107072635A (zh) | 用于中间用户反馈的多跳超声心动图采集的质量度量 | |
US20130261460A1 (en) | Ultrasonic processing apparatus and probe supporting apparatus | |
JP4598652B2 (ja) | 超音波診断装置 | |
JP4276595B2 (ja) | 超音波診断装置 | |
JP3263131B2 (ja) | 超音波診断装置 | |
JP7275261B2 (ja) | 3次元超音波画像生成装置、方法、及びプログラム | |
EP3752984B1 (en) | An imaging system and method with stitching of multiple images | |
CN112022202A (zh) | 用于确定超声探头运动的技术 | |
JP4944582B2 (ja) | 超音波診断装置 | |
JP2005152192A (ja) | 超音波診断装置 | |
JP5182932B2 (ja) | 超音波ボリュームデータ処理装置 | |
US20220287686A1 (en) | System and method for real-time fusion of acoustic image with reference image | |
JP2005130877A (ja) | 超音波診断装置 | |
KR101487688B1 (ko) | 단면의 위치를 가이드하기 위한 내비게이터를 제공하는 초음파 시스템 및 방법 | |
Abbas et al. | MEMS Gyroscope and the Ego-Motion Estimation Information Fusion for the Low-Cost Freehand Ultrasound Scanner | |
JP5383253B2 (ja) | 超音波診断装置及び画像データ生成装置 | |
US20220207743A1 (en) | System and method for two dimensional acoustic image compounding via deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2012540616 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12792497 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13812062 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012792497 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |