WO2017195540A1 - Ultrasonic imaging apparatus, image processing apparatus, and associated method - Google Patents

Ultrasonic imaging apparatus, image processing apparatus, and associated method

Info

Publication number
WO2017195540A1
WO2017195540A1 (PCT/JP2017/015573)
Authority
WO
WIPO (PCT)
Prior art keywords
image
ultrasonic
imaging apparatus
unit
ultrasound
Prior art date
Application number
PCT/JP2017/015573
Other languages
English (en)
Japanese (ja)
Inventor
子盛 黎
荒井 修
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所
Priority to CN201780013492.XA (patent CN108697410B)
Publication of WO2017195540A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography

Definitions

  • The present invention relates to an ultrasonic imaging apparatus, and more particularly to an imaging technique for simultaneously displaying a captured ultrasonic image and predetermined characteristic parts in a subject.
  • Since the ultrasonic imaging apparatus irradiates the subject with ultrasonic waves and images the internal structure of the subject using the reflected signal, the patient can be observed non-invasively and in real time.
  • An image diagnostic system is also becoming popular in which a position sensor is attached to the ultrasonic probe to calculate the position of the scan plane, and a two-dimensional cross-sectional image corresponding to the image of the ultrasonic scan plane is constructed and displayed from three-dimensional diagnostic volume (3D image) data captured from a medical image diagnostic apparatus.
  • The diagnostic 3D image data is generally image data captured by another medical imaging device, such as an X-ray CT (Computed Tomography) device or an MRI (Magnetic Resonance Imaging) device, in addition to ultrasound.
  • In Patent Document 1, a two-dimensional cross-sectional image corresponding to an ultrasonic two-dimensional (2D) image, that is, the image of the ultrasonic scan plane, is constructed from diagnostic three-dimensional (3D) image data captured from a medical image diagnostic apparatus.
  • The cross-sectional direction of the two-dimensional cross-sectional image is set according to the purpose of diagnosis and the type of the ultrasonic probe.
  • The ultrasonic scan plane image and the diagnostic 3D image are aligned, and a two-dimensional cross-sectional image is constructed from the diagnostic 3D image and displayed.
  • Non-Patent Document 1 discloses an ultrasonic 3D image stitching method and the like.
  • In Patent Document 1, the user must set a point on a landmark that can be observed in both the ultrasonic scan plane image and the two-dimensional cross-sectional image constructed from the diagnostic 3D image, and must align the two positions manually. Such a complicated user operation, and the burden it places on a subject in an open surgical state, are major issues. Moreover, the technique of Patent Document 1 cannot display information such as the position and name of an anatomical feature part in real time.
  • An object of the present invention is to provide an ultrasonic imaging apparatus, an image processing apparatus, and a method therefor that can automatically display, in real time, information on characteristic parts in an intraoperative ultrasonic image and accurately guide surgery.
  • To achieve the above object, the present invention provides an ultrasonic imaging apparatus comprising: an ultrasonic probe that transmits ultrasonic waves to a subject and receives ultrasonic waves from the subject; an image generation unit that generates an ultrasonic 2D image from the received signal of the probe and generates an ultrasonic 3D image by transmitting and receiving ultrasonic waves a plurality of times; and an image processing apparatus that receives and processes the ultrasonic 2D image and the ultrasonic 3D image. The image processing apparatus estimates and identifies characteristic parts of the subject from the ultrasonic 3D image, aligns the ultrasonic 2D image with the ultrasonic 3D image, and displays the characteristic part information on the ultrasonic 2D image.
  • To achieve the above object, the present invention also provides an image processing apparatus comprising: a feature part position estimation/identification unit that estimates and identifies characteristic parts of a subject from an ultrasonic 3D image of the subject; and an image alignment unit that aligns an ultrasonic 2D image of the subject with the ultrasonic 3D image and calculates the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
  • To achieve the above object, the present invention further provides an image processing method in an image processing apparatus, in which the image processing apparatus estimates and identifies characteristic parts of a subject from an ultrasonic 3D image of the subject.
  • According to the present invention, the characteristic parts of the subject are estimated and identified from the ultrasonic 3D image, the ultrasonic 2D image and the ultrasonic 3D image are aligned, and the characteristic part information is displayed on the ultrasonic 2D image, so that surgery can be guided accurately.
  • FIG. 1 is a block diagram showing an example of the overall configuration of an ultrasonic imaging apparatus according to Embodiment 1.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the ultrasonic imaging apparatus according to Embodiment 1.
  • FIG. 3 is a functional block diagram of the image processing apparatus of the ultrasonic imaging apparatus according to Embodiment 1.
  • FIG. 4 is a flowchart showing the process flow of the ultrasonic imaging apparatus according to Embodiment 1.
  • FIGS. 5A, 5B, and 5C are explanatory diagrams illustrating examples of characteristic parts according to Embodiment 1.
  • FIG. 6 is a diagram illustrating an example of ultrasonic feature part information according to Embodiment 1.
  • FIG. 7 is a flowchart showing the position estimation/identification process of characteristic parts from volume data according to Embodiment 1.
  • FIG. 8 is a flowchart illustrating the alignment process between an ultrasonic 2D image and an ultrasonic 3D image according to Embodiment 1.
  • FIG. 9 is a diagram illustrating initial positions of the alignment process between an ultrasonic 2D image and an ultrasonic 3D image according to Embodiment 1.
  • FIG. 10 is a functional block diagram of the image processing apparatus of an ultrasonic imaging apparatus according to Embodiment 2.
  • FIG. 11 shows an example of the display screen of the display and the button selection means according to the embodiments.
  • FIG. 12 is a block diagram illustrating an example of the overall configuration of an ultrasonic imaging apparatus according to Embodiment 3.
  • FIG. 13 is a block diagram illustrating a hardware configuration example of an ultrasonic imaging apparatus according to Embodiment 3.
  • In this specification, the feature part information means the position and name of the feature part together with distance relation information, and the distance relation means the projection distance from the feature part to the ultrasonic 2D image.
  • The first embodiment is an embodiment of an ultrasonic imaging apparatus comprising: an ultrasonic probe that transmits ultrasonic waves to a subject and receives ultrasonic waves from the subject; an image generation unit that generates an ultrasonic 2D image from the received signal of the ultrasonic probe and generates an ultrasonic 3D image by transmitting and receiving ultrasonic waves a plurality of times; and an image processing apparatus that receives and processes the ultrasonic 2D image and the ultrasonic 3D image, wherein the image processing apparatus estimates and identifies the characteristic parts of the subject from the ultrasonic 3D image, aligns the ultrasonic 2D image with the ultrasonic 3D image, and displays the characteristic part information on the ultrasonic 2D image.
  • It is also an embodiment of an image processing apparatus, and of its method, comprising: a feature part position estimation/identification unit that estimates and identifies the characteristic parts of a subject from an ultrasonic 3D image of the subject; and an image alignment unit that aligns an ultrasonic 2D image of the subject with the ultrasonic 3D image and calculates the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
  • In the first embodiment, position estimation and name identification are performed for predetermined anatomical feature parts from an ultrasonic 3D image obtained by imaging the subject.
  • An ultrasonic 2D image captured in real time during surgery, that is, the 2D image of the ultrasonic scan plane, is aligned with the ultrasonic 3D image, a geometric transformation matrix for the alignment is calculated, and the distance relationship between each estimated feature part and the position of the captured ultrasonic 2D image is obtained.
  • The distance relationship, which is one item of the feature part information in the present embodiment, means the projection distance from the feature part in the subject to the ultrasonic 2D image.
  • That is, the projection distance is calculated from the three-dimensional position of the characteristic part of the subject estimated from the ultrasonic 3D image, i.e., the coordinates of a point in 3D space, to the position in three-dimensional space of the plane of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image, and the calculated projection distance is set as the distance relationship between the characteristic part and the ultrasonic 2D image.
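  • As a concrete illustration of this calculation: the projection distance is simply the point-to-plane distance from the feature position to the plane of the current scan. The following minimal Python/NumPy sketch (all names are hypothetical; the patent does not publish an implementation, and the plane is assumed to be recovered from the alignment transformation matrix) computes it:

```python
import numpy as np

def projection_distance(feature_xyz, plane_origin, plane_normal):
    """Distance from a 3D feature point to the plane of the current
    ultrasonic 2D scan, given a point on the plane and its normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return abs(np.dot(feature_xyz - plane_origin, n))

# Example: a feature part at (42.0, 18.5, 60.2) mm in the 3D-image
# coordinate system, with the scan plane at z = 55 mm.
d = projection_distance(np.array([42.0, 18.5, 60.2]),
                        plane_origin=np.array([0.0, 0.0, 55.0]),
                        plane_normal=np.array([0.0, 0.0, 1.0]))
print(f"projection distance: {d:.1f} mm")  # -> 5.2 mm
```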
  • As shown in FIG. 1, the ultrasonic imaging apparatus of the present embodiment includes an ultrasonic probe 7, an image generation unit 107, and an image processing apparatus 108, and further includes a transmission unit 102, a transmission/reception switching unit 101, a reception unit 105, a user interface (UI) 121, and a control unit 106. An image obtained by the ultrasonic imaging apparatus 100 is displayed on the display 16. The display 16 may be included in the user interface (UI) 121.
  • The configuration example of the ultrasonic imaging apparatus shown in FIG. 1 is shared by the other embodiments.
  • The transmission unit 102 generates a transmission signal under the control of the control unit 106 and passes it to each of the plurality of ultrasonic elements constituting the ultrasonic probe 7. Thereby, each of the plurality of ultrasonic elements of the ultrasonic probe 7 transmits an ultrasonic wave toward the subject 120.
  • The ultrasonic waves reflected by the subject 120 reach the plurality of ultrasonic elements of the ultrasonic probe 7 again, where they are received and converted into electric signals.
  • The signals received by the ultrasonic elements are delayed by predetermined delay amounts corresponding to the position of the reception focal point, and are phased and summed by the reception unit 105. This is repeated for each of a plurality of reception focal points.
  • The phased-sum signal is transferred from the reception unit 105 to the image generation unit 107.
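  • The phasing addition performed by the reception unit 105 is classical delay-and-sum beamforming. A minimal sketch, assuming integer-sample delays precomputed for one reception focal point (array shapes and units are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def delay_and_sum(element_signals, delays_samples):
    """Phased addition for one reception focal point: shift each element's
    signal by its focus-dependent delay, then sum across elements.
    element_signals: (n_elements, n_samples); delays_samples: ints >= 0."""
    n_elements, n_samples = element_signals.shape
    out = np.zeros(n_samples, dtype=np.float64)
    for signal, delay in zip(element_signals, delays_samples):
        out[delay:] += signal[:n_samples - delay]
    return out
```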
  • The transmission/reception switching unit 101 selectively connects the transmission unit 102 or the reception unit 105 to the ultrasonic probe 7.
  • The image generation unit 107 performs processing such as arranging the phased-sum signals received from the reception unit 105 at positions corresponding to their reception foci, and generates an ultrasonic 2D image. While the user sweeps the ultrasonic probe 7, the image generation unit 107 can generate a plurality of ultrasonic 2D images and synthesize an ultrasonic 3D image from them.
  • The image processing apparatus 108 receives the ultrasonic 3D image from the image generation unit 107 and performs name identification and position estimation of predetermined anatomical feature parts. The image processing apparatus 108 also receives the ultrasonic 2D image generated in real time, aligns the ultrasonic 2D image with the ultrasonic 3D image, and displays the obtained name and position of each characteristic part, together with its distance relationship to the ultrasonic 2D image, on the ultrasonic 2D image generated in real time.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the image processing apparatus 108 and the user interface 121.
  • The hardware configuration example shown in FIG. 2, like the configuration of the ultrasonic imaging apparatus in FIG. 1, is shared by the other embodiments.
  • The image processing apparatus 108 includes a CPU (processor) 1, a ROM (nonvolatile memory: a read-only storage medium) 2, a RAM (volatile memory: a storage medium from which data can be read and to which data can be written) 3, a storage device 4, and a display control unit 15.
  • The user interface 121 includes a medium input unit 11, an input control unit 13, an input device 14, and a display 16.
  • The image generation unit 107, the image processing apparatus 108, and the user interface 121 are connected to each other via the bus 5.
  • At least one of the ROM 2 and the RAM 3 of the image processing apparatus 108 stores in advance a program and data for the arithmetic processing of the CPU 1 necessary for realizing the operation of the image processing apparatus 108.
  • Various processes of the image processing apparatus 108 are realized by the CPU 1 executing a program stored in advance in at least one of the ROM 2 and the RAM 3.
  • The program executed by the CPU 1 may be stored in a storage medium 12 such as an optical disk, in which case the medium input unit 11 (for example, an optical disk drive) reads the program and stores it in the RAM 3.
  • Alternatively, the program may be stored in the storage device 4 and loaded from the storage device 4 into the RAM 3.
  • The program may also be stored in the ROM 2 in advance.
  • The storage device 4 may include, for example, a nonvolatile semiconductor storage medium such as a flash memory.
  • An external storage device connected via a network or the like may also be used.
  • The input device 14 is a device that receives user operations and includes, for example, a keyboard, a trackball, an operation panel, and a foot switch.
  • The input control unit 13 receives the operation inputs entered by the user.
  • The operation inputs received by the input control unit 13 are processed by the CPU 1.
  • The display control unit 15 controls the display, on the display 16, of the image data obtained by the processing of the CPU 1.
  • The display 16 displays images under the control of the display control unit 15.
  • FIG. 3 is a functional block diagram showing the functions of the image processing apparatus 108 of the present embodiment.
  • The image processing apparatus 108 includes an ultrasonic 3D image acquisition unit 21, a characteristic part position estimation/identification unit 22 for the ultrasonic 3D image, and an ultrasonic 2D image acquisition unit 24. The image processing apparatus 108 further includes the ultrasonic feature part information 23, which holds the names and positions of the feature parts, an ultrasonic 2D-3D image registration unit 25, and an image display unit 26.
  • In step S201, the ultrasonic probe 7 is applied to the subject, and a message prompting the user to scan while rotating the probe is displayed on the display 16.
  • Meanwhile, the transmission unit 102, the reception unit 105, and the image generation unit 107 continuously generate ultrasonic 2D images.
  • The image generation unit 107 synthesizes an ultrasonic 3D image from the continuously generated ultrasonic 2D images.
  • The ultrasonic 3D image acquisition unit 21 then receives the synthesized ultrasonic 3D image.
  • The characteristic part position estimation/identification unit 22 estimates the positions of predetermined anatomical characteristic parts from the ultrasonic 3D image using a known machine learning technique and, according to the estimation results, identifies the name of each characteristic part.
  • A characteristic part is a medically defined organ, or a site within an organ, such as the umbilical portion of the portal vein of the liver, the inflow portion of the inferior vena cava, the gallbladder, or a bifurcation point of the hepatic portal vein or hepatic vein.
  • FIGS. 5A, 5B, and 5C are explanatory diagrams showing, as examples of features in the ultrasonic 3D image, the three-dimensional positions of the umbilical portion of the portal vein of the liver, the inflow portion of the inferior vena cava, and the gallbladder.
  • The cube 50 shown in each of FIGS. 5A, 5B, and 5C indicates the local region surrounding the feature.
  • FIG. 6 shows, as an example of the ultrasonic feature part information 23, the names of the characteristic parts estimated and identified from the ultrasonic 3D image together with their three-dimensional position information.
  • This ultrasonic feature part information 23 can be stored as a table in the RAM 3 or the storage device 4.
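  • As a concrete illustration, the table of FIG. 6 can be held as a simple mapping from part names to coordinates. The sketch below is an assumed layout (field names and values are hypothetical, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class FeaturePart:
    name: str            # anatomical name identified by unit 22
    position_mm: tuple   # (x, y, z) in the ultrasonic 3D image coordinates
    score: float         # identification score from the classifier

# Hypothetical contents of the ultrasonic feature part information 23
feature_table = [
    FeaturePart("portal vein umbilical portion", (42.0, 18.5, 60.2), 0.93),
    FeaturePart("inferior vena cava inflow",     (10.3, 25.1, 48.7), 0.88),
    FeaturePart("gallbladder",                   (55.6, 40.0, 33.9), 0.91),
]
```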
  • In step S203, the ultrasonic 2D image acquisition unit 24 receives the ultrasonic 2D image acquired in real time from the image generation unit 107.
  • Next, the ultrasonic 2D-3D image registration unit 25 receives the ultrasonic 3D image and the ultrasonic 2D image from the ultrasonic 3D image acquisition unit 21 and the ultrasonic 2D image acquisition unit 24, respectively, and calculates an alignment transformation matrix for aligning the two. Details of the calculation of the alignment transformation matrix are described later.
  • In step S205, the image display unit 26 receives the ultrasonic 3D image, the ultrasonic 2D image, the ultrasonic feature part information 23, the alignment transformation matrix, and the position of the 2D cross-sectional image.
  • Using these data, the image display unit 26 calculates the projection distance from the 3D-space coordinates of each characteristic part of the subject to the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
  • The calculated projection distance is set as the distance relationship between that characteristic part and the ultrasonic 2D image.
  • The image display unit 26 displays the ultrasonic 2D image on the screen of the display 16, as shown in the example of FIG. 11A.
  • Using the alignment transformation matrix, the image display unit 26 projects the position of each characteristic part in the three-dimensional coordinate system of the ultrasonic 3D image onto the currently displayed ultrasonic 2D image, and displays the markers 17B and 18B (X marks) at the projected locations of the characteristic parts. That is, the image display unit 26 displays in real time, on the ultrasonic 2D image on the display 16, the names of the characteristic parts estimated from the ultrasonic 3D image and the positional relationship between each characteristic part and the ultrasonic 2D image acquired in real time, so that accurate surgical navigation for the user can be realized.
  • The image display unit 26 calculates, as the distance relationship between each characteristic part and the ultrasonic 2D image acquired in real time, the projection distance from the 3D-space coordinates of the characteristic part estimated from the ultrasonic 3D image to the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image. The sizes of the markers 17B and 18B are then displayed in proportion to the calculated projection distance.
  • Because the sizes of the markers 17B and 18B are displayed in proportion to the calculated projection distance, the positional relationship between each characteristic part and the scan plane can be grasped at a glance, further improving usability for the user.
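  • One possible reading of this proportional display, as a small sketch (the scaling constants are assumptions; the patent states only the proportionality):

```python
def marker_radius_px(projection_distance_mm, base_px=6.0,
                     px_per_mm=0.8, max_px=40.0):
    """A marker grows with the projection distance, so a large X tells the
    user that the feature part lies far from the current scan plane."""
    return min(base_px + px_per_mm * projection_distance_mm, max_px)
```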
  • The image display unit 26 can turn the display of the feature part names 17A and 18A and the markers 17B and 18B on and off through user selection of the check box 28, so that they are displayed only when the user needs them. The touch panel operation buttons 19 in FIG. 11A, and FIG. 11B, are described in the second embodiment.
  • The image display unit 26 can also change the color of one of the real-time ultrasonic 2D image and the 2D cross-sectional image of the ultrasonic 3D image at the position corresponding to the ultrasonic 2D image, generate an image in which the two are transparently superimposed, and display it on the display 16.
  • The image display unit 26 displays the feature part names 17A and 18A and the markers 17B and 18B on the superimposed 2D image. In this case as well, the image display unit 26 can display the sizes of the markers 17B and 18B in proportion to the calculated projection distance from each characteristic part to the ultrasonic 2D image.
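  • Such a transparent superimposition is essentially alpha blending with a color tint applied to one of the two images. A minimal sketch, assuming 8-bit grayscale frames of equal size (the tint color and alpha value are illustrative choices):

```python
import numpy as np

def overlay(us_2d, cross_section, alpha=0.5):
    """Tint the 2D cross-sectional image (here green) and blend it
    transparently over the real-time ultrasonic 2D image."""
    rgb = np.stack([us_2d, us_2d, us_2d], axis=-1).astype(np.float32)
    tint = np.zeros_like(rgb)
    tint[..., 1] = cross_section                  # green channel only
    return ((1.0 - alpha) * rgb + alpha * tint).astype(np.uint8)
```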
  • First, the characteristic part position estimation/identification unit 22 of the ultrasonic 3D image receives the ultrasonic 3D image from the image generation unit 107.
  • Next, the characteristic part position estimation/identification unit 22 performs position estimation and name identification of characteristic part candidates.
  • Specifically, the unit reduces the size of the ultrasonic 3D image and searches for characteristic part candidates at a coarse resolution using machine learning.
  • As the method of position estimation and name identification of a characteristic part, for example, the Hough Forest method, a known machine learning method, can be used.
  • Next, the characteristic part position estimation/identification unit 22 obtains a local 3D image, that is, the local region around each searched characteristic part candidate, from the normal-size ultrasonic 3D image.
  • The characteristic part position estimation/identification unit 22 then searches for and identifies the characteristic part in detail within the local region surrounding the candidate.
  • For this detailed search, the Hough Forest method mentioned above can be used.
  • Alternatively, the 3D CNN (convolutional neural network) method, a known deep learning method, can be used.
  • In step S405, the characteristic part position estimation/identification unit 22 excludes a characteristic part as a misidentified part when the identification score obtained in the search of step S404 is equal to or less than a predetermined threshold.
  • In step S406, the characteristic part position estimation/identification unit 22 outputs the position and name information of the identified characteristic parts as the ultrasonic feature part information 23.
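  • The coarse-to-fine flow above (search at reduced resolution, refine in a local region, discard low scores) can be summarized as follows. This is a sketch only: detect_candidates and classify_local stand in for the Hough Forest or 3D CNN models, which the patent names but does not specify:

```python
import numpy as np
from scipy.ndimage import zoom

def estimate_feature_parts(volume, detect_candidates, classify_local,
                           scale=0.25, patch=32, score_threshold=0.5):
    """Coarse-to-fine feature part estimation (sketch of FIG. 7)."""
    coarse = zoom(volume, scale, order=1)            # reduced-size volume, coarse search
    results = []
    for pos in detect_candidates(coarse):            # candidates on the coarse grid
        z, y, x = (int(round(c / scale)) for c in pos)
        local = volume[max(z - patch, 0):z + patch,  # local region around the candidate
                       max(y - patch, 0):y + patch,
                       max(x - patch, 0):x + patch]
        name, score = classify_local(local)          # detailed search and identification
        if score > score_threshold:                  # exclude misidentified parts
            results.append((name, (z, y, x), score))
    return results                                   # becomes feature part information 23
```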
  • Next, the processing of the ultrasonic 2D-3D image registration unit 25 of the present embodiment will be described in detail using the flowchart shown in FIG. 8.
  • The processing of the ultrasonic 2D-3D image registration unit 25 is likewise realized by the CPU 1 executing a program.
  • First, the ultrasonic 2D-3D image registration unit 25 receives the ultrasonic 3D image from the ultrasonic 3D image acquisition unit 21 and the ultrasonic 2D image from the ultrasonic 2D image acquisition unit 24, and roughly estimates the three-dimensional position within the ultrasonic 3D image that corresponds to the ultrasonic 2D image; that is, it estimates the initial position of the corresponding ultrasonic 2D image.
  • FIG. 9 shows 15 examples of three-dimensional initial position candidates used by the ultrasonic 2D-3D image registration unit 25; the position 91 is shown in each pattern.
  • The ultrasonic 2D image obtained from the ultrasonic 2D image acquisition unit 24 is input to discriminators for the 15 patterns, built using machine learning, and the position 91 of the pattern with the highest discrimination score is taken as the initial position of the ultrasonic 2D image.
  • In other words, the ultrasonic 2D-3D image registration unit 25 selects, from within the ultrasonic 3D image, a 3D position candidate for the 2D cross-sectional image corresponding to the ultrasonic 2D image, and uses it as the initial position.
  • Starting from this initial position, alignment between the ultrasonic 2D image and the ultrasonic 3D image is performed, and the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image is calculated.
  • ⁇ Classifiers for each pattern are created by learning.
  • a large number of ultrasonic 3D images are collected as learning data, 2D cross-sectional images at each position 91 in FIG. 9 are extracted from each ultrasonic 3D image, and created as learning data of a pattern corresponding to the position. Further, in order to increase the quantity and diversity of learning data, a 2D cross-sectional image is extracted from each position 91 in FIG. 9 by performing a random and small range of translation and rotation angle, and used as learning data at that position. .
  • learning data of the three-dimensional initial positions of the 15 patterns in FIG. 9 can be created, and a discriminator for each pattern can be created by using machine learning.
  • As the machine learning method, for example, the well-known AdaBoost method or a deep learning method can be used.
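  • The training-data generation described above might be realized as in the sketch below; the perturbation ranges and the AdaBoost call are illustrative assumptions, since the patent names the method but gives no parameters (extract_slice is an assumed helper that resamples a 2D cross-section at a perturbed pose):

```python
import numpy as np

def make_training_slices(volumes, canonical_poses, extract_slice,
                         n_aug=20, max_shift_mm=5.0, max_rot_deg=5.0, seed=0):
    """Build labeled 2D slices for the 15 initial-position patterns of FIG. 9."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for vol in volumes:                    # many ultrasonic 3D images as learning data
        for label, pose in enumerate(canonical_poses):
            for _ in range(n_aug):         # random small translations and rotations
                shift = rng.uniform(-max_shift_mm, max_shift_mm, 3)
                rot = rng.uniform(-max_rot_deg, max_rot_deg, 3)
                X.append(extract_slice(vol, pose, shift, rot).ravel())
                y.append(label)
    return np.array(X), np.array(y)

# A 15-class discriminator could then be trained, for example:
#   from sklearn.ensemble import AdaBoostClassifier
#   clf = AdaBoostClassifier(n_estimators=200).fit(X, y)
```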
  • In step S302, the ultrasonic 2D-3D image registration unit 25 estimates, starting from the estimated initial position of the ultrasonic 2D image, the translation and rotation angles that constitute the geometric transformation information for ultrasonic 2D-3D alignment.
  • In step S303, the ultrasonic 2D-3D image registration unit 25 constructs a 2D cross-sectional image corresponding to the ultrasonic 2D image from the ultrasonic 3D image using the obtained translation and rotation angles.
  • In step S304, the ultrasonic 2D-3D image registration unit 25 evaluates an image similarity function between the 2D cross-sectional image obtained from the ultrasonic 3D image and the ultrasonic 2D image acquired by the ultrasonic 2D image acquisition unit 24.
  • As the image similarity, the well-known mutual information measure can be used.
  • In step S305, the ultrasonic 2D-3D image registration unit 25 performs a convergence calculation to obtain the translation and rotation angles that maximize, globally or locally, the image similarity between the 2D cross-sectional image obtained from the ultrasonic 3D image and the ultrasonic 2D image.
  • In step S306, if the image similarity has not converged, the translation and rotation angles are updated to seek a higher similarity, and steps S303 to S305 are performed again using the updated translation and rotation angles.
  • In step S307, the ultrasonic 2D-3D image registration unit 25 outputs the obtained translation and rotation angle information, the position of the 2D cross-sectional image, and the like.
  • With the above, the processing of the ultrasonic 2D-3D image registration unit 25 in FIG. 3 is completed.
  • After alignment of the first ultrasonic 2D image is completed, the translation and rotation angle information obtained by aligning the previously acquired ultrasonic 2D image can be used as the initial position for aligning each subsequent ultrasonic 2D image.
  • That is, if the processes of steps S303 to S307 are performed starting from the translation and rotation angles of the previous ultrasonic 2D image, real-time ultrasonic 2D-3D image alignment becomes possible.
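  • Putting steps S302 to S307 together, the similarity-driven refinement can be sketched as a simple hill-climbing loop; the mutual-information and slice-extraction helpers are assumptions (e.g., supplied by an image-registration toolkit), not details taken from the patent:

```python
import numpy as np

def register_2d_to_3d(us2d, volume3d, init_pose, extract_slice, mutual_info,
                      step=0.5, max_iter=100, tol=1e-4):
    """Refine pose = (tx, ty, tz, rx, ry, rz) so that the 2D cross-section of
    volume3d at that pose is most similar to us2d (sketch of S302-S307)."""
    pose = np.asarray(init_pose, dtype=float)
    best = mutual_info(us2d, extract_slice(volume3d, pose))
    for _ in range(max_iter):
        improved = False
        for i in range(6):                        # perturb each parameter +/- step
            for delta in (step, -step):
                trial = pose.copy()
                trial[i] += delta
                score = mutual_info(us2d, extract_slice(volume3d, trial))
                if score > best + tol:            # S306: keep updates that improve
                    pose, best, improved = trial, score, True
        if not improved:                          # converged (S305)
            break
    return pose, best                             # S307: transform and final score

# For each new real-time frame, pass the previous frame's result as init_pose.
```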
  • As described above, according to the present embodiment, the names and positions of the characteristic parts in the patient, and the projection distances from the characteristic parts to the ultrasonic 2D image, can be displayed on the ultrasonic 2D image in real time, realizing automatic and accurate surgical navigation.
  • In the above description, the image processing apparatus 108 is provided inside the ultrasonic imaging apparatus 100.
  • However, the image processing apparatus 108 illustrated in FIGS. 1 and 2 may also be a device separate from the ultrasonic imaging apparatus 100. In that case, the image processing apparatus 108 and the ultrasonic imaging apparatus 100 are connected via a signal line or a network.
  • For example, the image processing apparatus 108 is mounted on a general computer or on a processing apparatus such as a workstation, and is connected to the ultrasonic imaging apparatus 100 via a network.
  • In this configuration, the image processing apparatus 108 receives, via the network, the ultrasonic 3D image for identifying the characteristic parts and the ultrasonic 2D image for alignment from the ultrasonic imaging apparatus acting as a client terminal.
  • The image processing apparatus 108 then performs the position estimation/identification process of FIG. 7 and the image alignment process of FIG. 8.
  • The name and position information of the identified characteristic parts and the ultrasonic 2D-3D image alignment result are transmitted back to the client terminal.
  • In this way, the ultrasonic imaging apparatus 100 can perform the alignment processing using the computing capability of the image processing apparatus 108 connected via the network, so that even a small and simple ultrasonic imaging apparatus can display the names of characteristic parts and their distance relationships on the ultrasonic 2D image in real time.
  • As described above, position estimation and name identification are performed for predetermined anatomical feature parts from an ultrasonic 3D image obtained by imaging the subject, and those parts and names, together with their distance relationships to the ultrasonic 2D image captured in real time during surgery, can be displayed on the ultrasonic 2D image, realizing automatic and accurate surgical navigation.
  • The second embodiment, like the first, performs position estimation and name identification for predetermined anatomical feature parts from an ultrasonic 3D image, aligns the ultrasonic 2D image captured in real time during surgery with the ultrasonic 3D image, and displays the position and name of each identified characteristic part and its distance relationship to the ultrasonic 2D image on the ultrasonic 2D image.
  • In addition, the addition and correction of feature parts, and the correction of the geometric transformation calculation for alignment, can be performed based on user instructions.
  • That is, this is an example of an ultrasonic imaging apparatus in which the image processing apparatus stitches together the ultrasonic 3D image and another ultrasonic 3D image obtained by imaging with the ultrasonic probe to generate a second ultrasonic 3D image, and displays the position, name, and distance relationship of each characteristic part on the ultrasonic 2D image using the positions of the characteristic parts and the position of the 2D cross-sectional image of the second ultrasonic 3D image.
  • In other words, this is an embodiment of an image processing apparatus, and of its image processing method, in which the image processing apparatus includes a correction unit that joins the ultrasonic 3D image with another ultrasonic 3D image to generate a second ultrasonic 3D image; the feature part position estimation/identification unit estimates and identifies the characteristic parts of the subject from the second ultrasonic 3D image; and the image alignment unit aligns the ultrasonic 2D image with the second ultrasonic 3D image and calculates the position of the 2D cross-sectional image of the second ultrasonic 3D image corresponding to the ultrasonic 2D image.
  • In the following, the same components and processes as those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
  • FIG. 10 is a functional block diagram illustrating the functions of the image processing apparatus 108 according to the second embodiment.
  • FIG. 11 is a diagram illustrating an example of the display screen and the button selection means according to the second embodiment.
  • As in the first embodiment, the image processing apparatus 108 according to the present embodiment includes an ultrasonic 3D image acquisition unit 21, a characteristic part position estimation/identification unit 22 for the ultrasonic 3D image, and an ultrasonic 2D image acquisition unit 24.
  • The image processing apparatus 108 further includes a feature part identification and alignment result correction unit 27.
  • As in the first embodiment, the image display unit 26 displays the ultrasonic 2D image on the screen of the display 16, as shown in the example of FIG. 11A. Based on the identified feature part information and its distance relationship to the ultrasonic 2D image, the image display unit 26 displays on the screen the positions of the identified feature parts, their names 17A and 18A, and the markers 17B and 18B indicating the distance relationship between each feature part and the ultrasonic 2D image.
  • In this way, the positions and names of the identified feature parts and their distance relationships to the ultrasonic 2D image are displayed on the display 16.
  • Next, the feature part identification and alignment result correction unit 27 displays the touch panel operation buttons 19 as a display asking the user to judge whether the feature part identification and alignment have succeeded. That is, touch panel operation buttons 19 such as volume addition, feature part manual correction, alignment initial position correction, and alignment detail correction are displayed on the display 16, and the user's judgment is accepted via button selection means of the input device 14, such as a mouse. When the user inputs, via the input device 14, that the feature part identification and alignment have succeeded, the alignment process is completed.
  • Otherwise, the feature part identification and alignment result correction unit 27 of the present embodiment executes correction processing of the feature part identification and alignment.
  • First, the feature part identification and alignment result correction unit 27 displays on the display 16 a message asking the user whether to additionally acquire an ultrasonic volume, and accepts the user's judgment via the input device 14 or the touch panel operation buttons 19.
  • When the user judges that the feature part information is insufficient and inputs, via the input device 14 or the touch panel operation buttons 19, an instruction to additionally acquire one or more ultrasonic 3D images with the ultrasonic probe in addition to the ultrasonic 3D image described above, the ultrasonic imaging apparatus 100 additionally acquires an ultrasonic 3D image.
  • The feature part identification and alignment result correction unit 27 then stitches the additionally acquired ultrasonic 3D image and the original ultrasonic 3D image together, generating one synthesized ultrasonic 3D image, that is, a second ultrasonic 3D image.
  • As the stitching method, for example, the method described in Non-Patent Document 1 can be used.
  • The feature part identification and alignment result correction unit 27 then uses the second ultrasonic 3D image generated by the stitching process to perform the characteristic part identification process of FIG. 7 and the ultrasonic 2D-3D image alignment process of FIG. 8. Since the characteristic part identification process and the alignment process are the same as in the first embodiment, their description is omitted.
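  • As a rough illustration of the volume stitching step (the actual method is that of Non-Patent Document 1, which is not reproduced here), the sketch below naively resamples the added volume into the original volume's grid with a known rigid transform and averages the overlap:

```python
import numpy as np
from scipy.ndimage import affine_transform

def stitch_volumes(vol_a, vol_b, a_to_b_matrix, a_to_b_offset):
    """Form a second ultrasonic 3D image: scipy's affine_transform maps output
    (A-grid) coordinates into vol_b, so the transform must be given A -> B."""
    b_in_a = affine_transform(vol_b, a_to_b_matrix, offset=a_to_b_offset,
                              output_shape=vol_a.shape, order=1, cval=0.0)
    stitched = vol_a.astype(np.float32)
    mask = b_in_a > 0                       # voxels covered by the added volume
    stitched[mask] = 0.5 * (stitched[mask] + b_in_a[mask])
    return stitched
```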
  • When feature part manual correction is selected, the user manually corrects, using the input device 14 or the like, the ultrasonic feature part information 23 shown as an example in FIG. 6, that is, the positions and names of the characteristic parts of the ultrasonic 3D image.
  • The feature part identification and alignment result correction unit 27 receives the corrected position and name information of the 3D image feature parts and outputs it to the image display unit 26.
  • The image display unit 26 then executes the image display process of step S205 in FIG. 4.
  • When alignment initial position correction is selected, the feature part identification and alignment result correction unit 27 displays the alignment initial position patterns on the display 16, as shown in FIG. 11B, and accepts the user's selection.
  • The user can also manually fine-tune the selected alignment initial position pattern.
  • The user then acquires an ultrasonic 2D image from the corrected alignment initial position.
  • The feature part identification and alignment result correction unit 27 receives the corrected alignment initial position pattern and the ultrasonic 2D image, and executes the processes of steps S302 to S307 in FIG. 8.
  • When alignment detail correction is selected, the feature part identification and alignment result correction unit 27 superimposes, on the display 16, the ultrasonic 2D image received from the ultrasonic 2D image acquisition unit 24 and the 2D cross-sectional image of the ultrasonic 3D image corresponding to it.
  • The user corrects the alignment with the ultrasonic 2D image while manually adjusting the position and rotation angle of the 2D cross-sectional image.
  • The feature part identification and alignment result correction unit 27 receives the corrected alignment result and outputs it to the image display unit 26.
  • The image display unit 26 then executes the image display process of step S205 in FIG. 4 described above. In the present embodiment as well, the image display unit 26 can display the names 17A and 18A of the characteristic parts and the markers 17B and 18B only when the user needs them, through user selection of the check box 28.
  • As described above, according to the present embodiment, an ultrasonic imaging apparatus can be provided that can execute the addition and correction of feature parts and the recalculation of the coordinate transformation information for alignment based on user instructions.
  • The third embodiment is an embodiment of an ultrasonic imaging apparatus in which the ultrasonic probe includes a position sensor, and the image generation unit generates an ultrasonic 3D image from the ultrasonic 2D images and the position information of the ultrasonic probe obtained from the position sensor.
  • In the embodiments described above, position estimation and name identification were performed for predetermined anatomical feature parts from an ultrasonic 3D image generated by transmitting and receiving ultrasonic waves a plurality of times, an ultrasonic 2D image captured in real time during surgery was aligned with the ultrasonic 3D image, and the positions and names of the identified characteristic parts and their distance relationships to the ultrasonic 2D image were displayed on the ultrasonic 2D image.
  • In contrast, in the third embodiment, a position sensor is attached to the ultrasonic probe, an ultrasonic image is generated from the received signal of the ultrasonic probe, an ultrasonic 3D image is generated from the ultrasonic image and the position information of the ultrasonic probe obtained from the position sensor, and position estimation and name identification are performed for predetermined anatomical feature parts from the ultrasonic 3D image.
  • Further, the position information from the position sensor of the ultrasonic probe is used to align the ultrasonic 2D image with the ultrasonic 3D image, and the position and name of each identified characteristic part and its distance relationship to the ultrasonic 2D image are displayed on the ultrasonic 2D image.
  • In the following, the same components and processes as those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
  • FIG. 12 illustrates a configuration example of the ultrasonic imaging apparatus according to the third embodiment.
  • FIG. 13 is a block diagram illustrating a hardware configuration example of the image processing apparatus 108 and the user interface 121 according to the third embodiment. In FIGS. 12 and 13, a position detection unit 6 and a position sensor 8 are added to the configuration of the first embodiment.
  • The position detection unit 6 detects the position of the ultrasonic probe 7 from the output of the position sensor 8.
  • For example, a magnetic sensor unit can be used as the position detection unit 6.
  • By forming a magnetic field space and having the magnetic sensor serving as the position sensor 8 detect the magnetic field, the position detection unit 6 can detect the coordinates relative to a reference point, that is, the position information of the ultrasonic probe.
  • The functional block diagram of FIG. 3 also illustrates a functional example of the image processing apparatus 108 according to the third embodiment, and the operation of the image processing apparatus 108 in the third embodiment follows the flowchart of FIG. 4.
  • In the third embodiment, the image generation unit 107 generates an ultrasonic image from the received signal of the ultrasonic probe 7, and generates an ultrasonic 3D image from the ultrasonic image and the position information of the ultrasonic probe obtained from the position sensor 8.
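  • A sketch of this sensor-based volume generation: each tracked 2D frame is scattered into a voxel grid using the probe pose reported by the position sensor. The pose format and grid spacing are assumptions for illustration:

```python
import numpy as np

def compound_volume(frames, poses, vol_shape, voxel_mm=0.5):
    """Place tracked ultrasonic 2D frames into a 3D voxel grid. poses[i] is a
    4x4 matrix mapping frame-i pixel coordinates (col, row, 0, 1) to mm in the
    position sensor's reference frame (pixel spacing folded into the matrix)."""
    acc = np.zeros(vol_shape, dtype=np.float32)
    cnt = np.zeros(vol_shape, dtype=np.uint16)
    bounds = np.array(vol_shape)[:, None]
    for frame, pose in zip(frames, poses):
        h, w = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pix = np.stack([xs.ravel(), ys.ravel(),
                        np.zeros(h * w), np.ones(h * w)])
        idx = np.round((pose @ pix)[:3] / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx < bounds), axis=0)
        np.add.at(acc, tuple(idx[:, ok]), frame.ravel()[ok].astype(np.float32))
        np.add.at(cnt, tuple(idx[:, ok]), 1)
    return acc / np.maximum(cnt, 1)         # average overlapping samples
```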
  • In the present embodiment, the ultrasonic 2D-3D image registration unit 25 calculates the alignment transformation matrix using the position information of the ultrasonic probe attached to the ultrasonic 3D image and the position information of the ultrasonic probe attached to the ultrasonic 2D image captured in real time during surgery.
  • Other processes in the third embodiment are the same as those in the first embodiment.
  • As described above, according to the third embodiment, an ultrasonic imaging apparatus can be configured in which a position sensor is attached to the ultrasonic probe, an ultrasonic image is generated from the received signal of the ultrasonic probe, an ultrasonic 3D image is generated from the ultrasonic image and the position information of the ultrasonic probe obtained from the position sensor, position estimation and name identification are performed for predetermined anatomical feature parts from the ultrasonic 3D image, the ultrasonic 2D image and the ultrasonic 3D image are aligned using the position information of the ultrasonic probe, and the position and name of each identified characteristic part and its distance relationship to the ultrasonic 2D image are displayed on the ultrasonic 2D image.
  • As described in detail above, according to the embodiments of the present invention, an ultrasonic imaging apparatus can be provided that estimates the positions and identifies the names of predetermined anatomical characteristic parts from an ultrasonic 3D image, aligns the ultrasonic 3D image with the ultrasonic 2D image captured in real time, that is, the two-dimensional image of the ultrasonic scan plane, projects the positions of the feature parts onto the ultrasonic 2D image, calculates the distance relationship between each feature part and the ultrasonic 2D image, and displays the position and name of each feature part and its distance-relationship information on the ultrasonic 2D image.
  • The present invention is not limited to an ultrasonic imaging apparatus; it can also be realized as an image processing apparatus connected to the ultrasonic imaging apparatus via a network, and as an image processing method therefor.
  • In addition, a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. Furthermore, other configurations can be added to, deleted from, or substituted for a part of the configuration of each embodiment.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present invention provides a device that automatically and in real time displays information on an anatomical feature part of a subject in an intraoperative ultrasonic image and accurately guides an operation. An ultrasonic imaging apparatus comprises: an image generation unit that transmits ultrasonic waves toward a subject, generates an ultrasonic 2D image from the reception signals of an ultrasonic probe that receives ultrasonic waves from the subject, and generates an ultrasonic 3D image by transmitting and receiving ultrasonic waves multiple times; and an image processing device that processes the ultrasonic 2D image and the ultrasonic 3D image. The image processing device comprises: an ultrasonic 3D image feature part position estimation/identification unit that estimates and identifies a feature part of the subject from an ultrasonic 3D image acquired by an ultrasonic 3D image acquisition unit 21; an ultrasonic 2D-3D image registration unit 25 that aligns the positions of the ultrasonic 2D image and the ultrasonic 3D image; and an image display unit 26 that uses the obtained feature part position information and the alignment results to display, on the real-time ultrasonic 2D image, information on the position and name of the feature part and its distance relationship to the ultrasonic 2D image.
PCT/JP2017/015573 2016-05-12 2017-04-18 Ultrasonic imaging apparatus, image processing apparatus, and associated method WO2017195540A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780013492.XA CN108697410B (zh) 2016-05-12 2017-04-18 超声波拍摄装置、图像处理装置及其方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016095761A JP6689666B2 (ja) 2016-05-12 2016-05-12 超音波撮像装置
JP2016-095761 2016-05-12

Publications (1)

Publication Number Publication Date
WO2017195540A1 true WO2017195540A1 (fr) 2017-11-16

Family

ID=60266475

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015573 WO2017195540A1 (fr) 2016-05-12 2017-04-18 Dispositif d'imagerie ultrasonore, dispositif de traitement d'image et procédé associé

Country Status (3)

Country Link
JP (1) JP6689666B2 (fr)
CN (1) CN108697410B (fr)
WO (1) WO2017195540A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102512104B1 (ko) * 2020-05-07 2023-03-22 한국과학기술연구원 Apparatus and method for generating three-dimensional ultrasound images
JP7502899B2 (ja) * 2020-05-28 2024-06-19 富士フイルムヘルスケア株式会社 Ultrasonic imaging apparatus and surgery support system using the same


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8435181B2 (en) * 2002-06-07 2013-05-07 Verathon Inc. System and method to identify and measure organ wall boundaries
CN101271526B * 2008-04-22 2010-05-12 深圳先进技术研究院 Method for automatic object recognition and three-dimensional reconstruction in image processing
JP5395538B2 (ja) * 2009-06-30 2014-01-22 株式会社東芝 Ultrasonic diagnostic apparatus and control program for image data display
US8824762B2 (en) * 2010-10-22 2014-09-02 The Johns Hopkins University Method and system for processing ultrasound data
JP6058290B2 (ja) * 2011-07-19 2017-01-11 東芝メディカルシステムズ株式会社 Image processing system, apparatus, method, and medical image diagnostic apparatus
US9357981B2 (en) * 2011-12-21 2016-06-07 Konica Minolta, Inc. Ultrasound diagnostic device for extracting organ contour in target ultrasound image based on manually corrected contour image in manual correction target ultrasound image, and method for same
JP6073563B2 (ja) * 2012-03-21 2017-02-01 東芝メディカルシステムズ株式会社 Ultrasonic diagnostic apparatus, image processing apparatus, and image processing program
WO2014050596A1 * 2012-09-26 2014-04-03 日立アロカメディカル株式会社 Ultrasonic diagnostic device and method for generating a two-dimensional ultrasonic tomographic image
KR102106535B1 * 2013-02-06 2020-05-06 삼성전자주식회사 Method, apparatus, and system for generating a model representing changes in the shape and position of an organ during one respiratory cycle
AU2014228281A1 (en) * 2013-03-15 2015-09-17 Stephanie Littell Evaluating electromagnetic imagery by comparing to other individuals' imagery
EP2807978A1 (fr) * 2013-05-28 2014-12-03 Universität Bern Procédé et système d'acquisition en 3D d'images ultrasonores
CN105433977B * 2014-07-31 2020-02-07 东芝医疗系统株式会社 Medical imaging system, surgical guidance system, and medical imaging method
CN104398272B * 2014-10-21 2017-09-19 无锡海斯凯尔医学技术有限公司 Method and apparatus for selecting a detection region, and elasticity detection system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008246264A * 2003-05-08 2008-10-16 Hitachi Medical Corp Ultrasonic diagnostic apparatus
JP2008212680A * 2007-03-06 2008-09-18 General Electric Co &lt;Ge&gt; Method and apparatus for tracking a predetermined point in an ultrasound image
JP2010042190A * 2008-08-18 2010-02-25 Toshiba Corp Medical image processing apparatus, ultrasonic diagnostic apparatus, and medical image processing program
JP2013165936A * 2011-04-06 2013-08-29 Canon Inc Information processing apparatus

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020174778A1 (fr) * 2019-02-28 2020-09-03 富士フイルム株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
CN113490455A (zh) * 2019-02-28 2021-10-08 富士胶片株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
JPWO2020174778A1 (ja) * 2019-02-28 2021-12-16 富士フイルム株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
JP7218425B2 (ja) 2019-02-28 2023-02-06 富士フイルム株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
CN113490455B (zh) * 2019-02-28 2024-09-20 富士胶片株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
JPWO2021039101A1 (fr) * 2019-08-27 2021-03-04
WO2021039101A1 (fr) * 2019-08-27 2021-03-04 富士フイルム株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
CN114302679A (zh) * 2019-08-27 2022-04-08 富士胶片株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system
JP7158596B2 (ja) 2019-08-27 2022-10-21 富士フイルム株式会社 Ultrasonic endoscope system and method of operating an ultrasonic endoscope system

Also Published As

Publication number Publication date
CN108697410A (zh) 2018-10-23
JP6689666B2 (ja) 2020-04-28
JP2017202125A (ja) 2017-11-16
CN108697410B (zh) 2021-06-04

Similar Documents

Publication Publication Date Title
RU2748435C2 Ultrasound system and method for breast tissue visualization
JP6490820B2 (ja) Ultrasonic imaging apparatus, image processing apparatus, and method
CN109310400B (zh) Ultrasound system and method for breast tissue imaging and annotating breast ultrasound images
JP5858636B2 (ja) Image processing apparatus, processing method therefor, and program
JP6097452B2 (ja) Ultrasonic imaging system and ultrasonic imaging method
WO2017195540A1 (fr) Ultrasonic imaging apparatus, image processing apparatus, and associated method
US20180360427A1 (en) Ultrasonic diagnostic apparatus and medical image processing apparatus
EP3832599A1 (fr) Device for providing 3D image registration and associated method
JP6383483B2 (ja) Ultrasonic imaging apparatus and image processing apparatus
JP7321836B2 (ja) Information processing apparatus, inspection system, and information processing method
CA3102807A1 (fr) Orientation detection in fluoroscopic images
JP2017225835A (ja) Image processing apparatus
US10521069B2 (en) Ultrasonic apparatus and method for controlling the same
JP6887942B2 (ja) Ultrasonic imaging apparatus, image processing apparatus, and method
US20190388061A1 (en) Ultrasound diagnosis apparatus displaying shear wave data for object and method for operating same
JP6382031B2 (ja) Ultrasonic diagnostic apparatus and control program therefor
JP7027029B2 (ja) Ultrasonic diagnostic apparatus and medical image processing apparatus
JP6391544B2 (ja) Medical image processing apparatus, medical image processing method, and program
US20210038184A1 (en) Ultrasound diagnostic device and ultrasound image processing method
JP2008259764A (ja) Ultrasonic diagnostic apparatus and diagnostic program for the apparatus
JP6598565B2 (ja) Image processing apparatus, image processing method, and program
CN118717170A (zh) Ultrasound device and method for displaying body position maps and lesion maps
KR20200140683A (ko) Apparatus and method for aligning an ultrasound image with a three-dimensional medical image

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17795901

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17795901

Country of ref document: EP

Kind code of ref document: A1