CN108697410B - Ultrasonic imaging apparatus, image processing apparatus, and method thereof - Google Patents

Info

Publication number: CN108697410B
Application number: CN201780013492.XA
Authority: CN (China)
Other versions: CN108697410A (Chinese, zh)
Inventors: 黎子盛, 荒井修
Original Assignee: Hitachi Ltd (application filed by Hitachi Ltd)
Current Assignee: Fujifilm Healthcare Corp
Legal status: Active (CN108697410A published as application; CN108697410B published as grant)

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/14: Echo-tomography

Abstract

Information on anatomical characteristic portions of a subject is automatically displayed in real time on an intra-operative ultrasound image, so that surgery can be guided accurately. The ultrasonic imaging device is provided with: an image generation unit that generates an ultrasound 2D image from a reception signal of an ultrasound probe that transmits and receives ultrasound to and from a subject, and generates an ultrasound 3D image through a plurality of ultrasound transmissions and receptions; and an image processing device that processes the ultrasonic 2D image and the ultrasonic 3D image. The image processing device includes: a characteristic portion position estimation/recognition unit (22) for the ultrasonic 3D image, which estimates and recognizes characteristic portions of the subject from the ultrasonic 3D image acquired by an ultrasonic 3D image acquisition unit (21); an ultrasonic 2D-3D image position alignment unit (25) that performs position alignment of the ultrasonic 2D image and the ultrasonic 3D image; and an image display unit (26) that displays the position and name of each characteristic portion, and information on its distance relationship to the ultrasonic 2D image, on the real-time ultrasonic 2D image using the acquired characteristic portion position information and the position alignment result.

Description

Ultrasonic imaging apparatus, image processing apparatus, and method thereof
Technical Field
The present invention relates to an ultrasonic imaging apparatus, and more particularly to an imaging technique for simultaneously displaying an ultrasonic image and a predetermined characteristic portion in a subject.
Background
The ultrasound imaging apparatus irradiates an object with ultrasound waves and images the internal structure of the object from the reflected signals, enabling non-invasive, real-time observation of a patient. Diagnostic imaging systems have also come into widespread use in which a position sensor is attached to an ultrasonic probe to calculate the position of the scanning plane, and a two-dimensional cross-sectional image corresponding to the image of the ultrasonic scanning plane is constructed and displayed from three-dimensional diagnostic volume (3D image) data captured by a medical diagnostic imaging apparatus. Besides ultrasound, the diagnostic 3D image data is generally image data captured by other medical imaging devices such as an X-ray CT (Computed Tomography) device or an MRI (Magnetic Resonance Imaging) device.
In patent document 1, when a two-dimensional cross-sectional image corresponding to an ultrasound two-dimensional (2D) image of an ultrasound scanning plane is constructed from three-dimensional (3D) image data for diagnosis captured by a medical image diagnostic apparatus, the cross-sectional direction of the two-dimensional cross-sectional image is set according to the purpose of diagnosis and the type of ultrasound probe. Based on the acquired cross-sectional direction and the positional information of the position sensor attached to the ultrasonic probe, the image of the ultrasonic scanning plane and the 3D image for diagnosis are aligned in position, and a two-dimensional cross-sectional image is constructed and displayed from the 3D image for diagnosis. Further, non-patent document 1 discloses an ultrasonic 3D image stitching method and the like.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. 2014-239731
Non-patent document
Non-patent document 1: Ni, Dong, et al., "Volumetric Ultrasound Panorama Based on 3D SIFT", Medical Image Computing and Computer-Assisted Intervention (MICCAI 2008), Springer Berlin Heidelberg, 2008, pp. 52-60.
Disclosure of Invention
Problems to be solved by the invention
In recent years, there has been demand for a technique that confirms a region requiring surgery, such as a tumor, from intra-operative ultrasound images captured non-invasively and in real time. To guide surgery accurately, it is desirable to display information such as the position and name of a tumor or anatomical feature in the subject, and its distance relationship, in real time on the intra-operative ultrasound image. When aligning the intra-operative ultrasound image with the position of the subject, it is desirable that the hands of a user such as a physician touch switches, a mouse, or other manual input devices as little as possible during surgery. Furthermore, to reduce the burden on the subject whose body is surgically opened, it is desirable to complete the alignment in as short a time as possible.
In the technique of patent document 1, however, the user must set corresponding points on observable targets and manually align those targets between the image of the ultrasound scanning plane and the two-dimensional cross-sectional image constructed from the diagnostic 3D image. Such complicated user operations, and the burden they place on the subject whose body is surgically opened, are significant problems. The technique of patent document 1 also cannot display information such as the position and name of an anatomical feature in real time.
An object of the present invention is to provide an ultrasound imaging apparatus, an image processing apparatus, and a method thereof that automatically display characteristic portion information in real time on an intra-operative ultrasound image and accurately guide surgery.
Means for solving the problems
In order to achieve the above object, the present invention provides an ultrasonic imaging apparatus including: an ultrasonic probe that transmits ultrasonic waves to a subject and receives ultrasonic waves from the subject; an image generation unit that generates an ultrasonic 2D image from a reception signal of the ultrasonic probe and generates an ultrasonic 3D image through a plurality of ultrasonic transmissions and receptions; and an image processing device that receives and processes the ultrasonic 2D image and the ultrasonic 3D image. The image processing device estimates and recognizes a characteristic region of the subject from the ultrasound 3D image, aligns the ultrasound 2D image and the ultrasound 3D image, and displays information of the characteristic region on the ultrasound 2D image.
In order to achieve the above object, the present invention provides an image processing apparatus including: a characteristic portion position estimating/recognizing unit that estimates and recognizes a characteristic portion of a subject from an ultrasound 3D image of the subject; and an image position alignment unit that performs position alignment of an ultrasound 2D image and the ultrasound 3D image of the subject and calculates the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image.
In order to achieve the above object, the present invention also provides an image processing method in an image processing apparatus, the method estimating and recognizing a characteristic portion of a subject from an ultrasound 3D image of the subject, performing position alignment of an ultrasound 2D image and the ultrasound 3D image of the subject, and calculating the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image.
Effects of the invention
According to the present invention, a characteristic region of a subject is estimated and recognized from an ultrasound 3D image, the ultrasound 2D image and the ultrasound 3D image are aligned with each other, and information on the characteristic region is displayed on the ultrasound 2D image, enabling accurate guidance of surgery.
Drawings
Fig. 1 is a block diagram showing an example of the overall configuration of an ultrasonic imaging apparatus according to embodiment 1.
Fig. 2 is a block diagram showing an example of a hardware configuration of the ultrasonic imaging apparatus according to embodiment 1.
Fig. 3 is a functional block diagram showing an image processing apparatus of the ultrasonic imaging apparatus according to embodiment 1.
Fig. 4 is a flowchart showing a processing flow of the ultrasonic imaging apparatus according to embodiment 1.
Fig. 5A is an explanatory diagram illustrating an example of the characteristic portion according to example 1.
Fig. 5B is an explanatory diagram showing another example of the characteristic portion according to embodiment 1.
Fig. 5C is an explanatory diagram showing another example of the characteristic portion according to embodiment 1.
Fig. 6 is a diagram showing an example of ultrasonic characteristic region information according to example 1.
Fig. 7 is a flowchart showing the process of estimating and identifying the position of a feature portion from volume data according to example 1.
Fig. 8 is a flowchart showing the process of aligning the ultrasonic 2D image and the ultrasonic 3D image according to example 1.
Fig. 9 is a diagram showing an initial position of the alignment process of the ultrasonic 2D image and the ultrasonic 3D image according to example 1.
Fig. 10 is a functional block diagram showing an image processing apparatus of an ultrasonic imaging apparatus according to embodiment 2.
Fig. 11 is a diagram showing an example of a display screen and a key selection unit of the display according to each embodiment.
Fig. 12 is a block diagram showing an example of the overall configuration of the ultrasonic imaging apparatus according to embodiment 3.
Fig. 13 is a block diagram showing an example of a hardware configuration of the ultrasonic imaging apparatus according to embodiment 3.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In all the drawings for describing the embodiments, the same components are denoted by the same reference numerals in principle, and redundant description thereof is omitted. In this specification, the information of the feature portion refers to information of a position, a name, and a distance relationship of the feature portion, and the distance relationship refers to a projection distance from the feature portion to the ultrasonic 2D image.
(example 1)
Embodiment 1 is an embodiment of an ultrasonic imaging apparatus including: an ultrasonic probe that transmits ultrasonic waves to a subject and receives ultrasonic waves from the subject; an image generation unit that generates an ultrasonic 2D image from a reception signal of the ultrasonic probe and generates an ultrasonic 3D image by a plurality of times of transmission and reception of ultrasonic waves; and an image processing device that receives and processes the ultrasound 2D image and the ultrasound 3D image, wherein the image processing device estimates and recognizes a characteristic region of the subject from the ultrasound 3D image, performs position alignment between the ultrasound 2D image and the ultrasound 3D image, and displays information of the characteristic region on the ultrasound 2D image. Further, embodiment 1 is also an embodiment of an image processing apparatus and a method thereof, the image processing apparatus including: a characteristic portion position estimating/recognizing unit that estimates and recognizes a characteristic portion of a subject from an ultrasound 3D image of the subject; and an image position alignment unit that performs position alignment between the ultrasound 2D image and the ultrasound 3D image of the subject, and calculates the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image.
In the ultrasound imaging apparatus according to the present embodiment, position estimation and name recognition are performed for predetermined anatomical feature portions from an ultrasound 3D image obtained by imaging the subject. During surgery, the 2D image of the ultrasound scanning plane, i.e., the ultrasound 2D image captured in real time, is aligned with the ultrasound 3D image, the geometric transformation matrix for the alignment is calculated, and the distance relationship between the estimated position of each feature portion and the captured ultrasound 2D image is calculated. The obtained names and positions of the feature portions and their distance relationships to the ultrasound 2D image are displayed on the ultrasound 2D image as feature information, enabling real-time surgical guidance.
Here, the distance relationship, one of the items of feature information in the present embodiment, is the projection distance from a characteristic portion in the subject to the ultrasonic 2D image. Preferably, the projection distance is calculated from the three-dimensional position of the characteristic portion estimated from the ultrasonic 3D image (i.e., the coordinates of a point in 3D space) to the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image (i.e., the position of a plane in 3D space), and this projection distance is defined as the distance relationship between the characteristic portion and the ultrasonic 2D image.
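As a minimal sketch of this projection-distance calculation, assuming the 2D cross-section plane is given by an origin point and two orthonormal in-plane axes in the 3D coordinate system of the ultrasound 3D image (all names are hypothetical):

    import numpy as np

    def projection_distance(p, plane_origin, plane_u, plane_v):
        """Projection distance from a 3D feature point p to the plane of the
        ultrasound 2D cross-section, plus the in-plane projection coordinates."""
        n = np.cross(plane_u, plane_v)          # plane normal from the in-plane axes
        n = n / np.linalg.norm(n)
        d = np.dot(p - plane_origin, n)         # signed point-to-plane distance
        q = p - d * n                           # foot of the perpendicular on the plane
        uv = np.array([np.dot(q - plane_origin, plane_u),
                       np.dot(q - plane_origin, plane_v)])
        return abs(d), uv                       # distance relationship, marker position

The in-plane coordinates uv give where a marker for the feature would be drawn on the 2D image, and the returned distance can drive the marker size described later.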
(construction and action)
Hereinafter, a specific configuration example of the ultrasonic imaging apparatus according to embodiment 1 will be described in detail. As shown in fig. 1, the ultrasound imaging apparatus of the present embodiment includes an ultrasound probe 7, an image generating unit 107, and an image processing device 108, and further includes a transmitting unit 102, a transmission/reception switching unit 101, a receiving unit 105, a user interface (UI) 121, and a control unit 106. The image obtained by the ultrasonic imaging apparatus 100 is displayed on the display 16. The display 16 may be included in the user interface (UI) 121. The configuration example of the ultrasonic imaging apparatus shown in fig. 1 is also common to the other embodiments.
The transmission unit 102 generates transmission signals under the control of the control unit 106 and sends them to each of the plurality of ultrasonic elements constituting the ultrasonic probe 7. The plurality of ultrasonic elements of the ultrasonic probe 7 thereby transmit ultrasonic waves to the subject 120. The ultrasonic waves reflected by the subject 120 return to the plurality of ultrasonic elements of the ultrasonic probe 7, where they are received and converted into electric signals. The signals received by the ultrasonic elements are delayed by the reception unit 105 with predetermined delay amounts corresponding to the position of the reception focus, and phasing addition (phase-aligned summation) is performed. This operation is repeated for each of the plurality of reception foci. The phasing addition signal is sent from the receiving unit 105 to the image generating unit 107. The transmission/reception switching unit 101 selectively connects the transmission unit 102 or the reception unit 105 to the ultrasonic probe 7.
The image generation unit 107 performs processing such as arranging the phasing addition signals received from the reception unit 105 at positions corresponding to the reception foci, and generates an ultrasonic 2D image. While the user sweeps the ultrasound probe 7, the image generating unit 107 can generate a plurality of ultrasound 2D images and synthesize them into an ultrasound 3D image, as in the sketch below.
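One way such a synthesis can be realized, assuming a pose is available for every 2D frame (from a position sensor as in embodiment 3, or estimated by image-based stitching), is nearest-voxel compounding. A rough sketch under that assumption, not the device's actual implementation:

    import numpy as np

    def compound_volume(frames, poses, vol_shape, voxel_size):
        """Nearest-voxel compounding of swept 2D frames into a 3D volume.
        frames: list of (H, W) images; poses: 4x4 frame-to-volume transforms
        (assumed to include the pixel spacing of the frames)."""
        vol = np.zeros(vol_shape, dtype=np.float32)
        cnt = np.zeros(vol_shape, dtype=np.uint16)
        for img, T in zip(frames, poses):
            h, w = img.shape
            ys, xs = np.mgrid[0:h, 0:w]
            pix = np.stack([xs.ravel(), ys.ravel(),
                            np.zeros(h * w), np.ones(h * w)])  # homogeneous coords
            xyz = (T @ pix)[:3] / voxel_size                   # frame plane -> voxel grid
            idx = np.round(xyz).astype(int)
            ok = np.all((idx >= 0) & (idx < np.array(vol_shape)[:, None]), axis=0)
            i, j, k = idx[:, ok]
            np.add.at(vol, (i, j, k), img.ravel()[ok])         # accumulate samples
            np.add.at(cnt, (i, j, k), 1)
        return vol / np.maximum(cnt, 1)                        # average where frames overlap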
The image processing device 108 receives the ultrasound 3D image from the image generating unit 107 and performs name recognition and position estimation of predetermined anatomical features. The image processing device 108 also receives the ultrasound 2D image generated in real time, aligns the ultrasound 2D image with the ultrasound 3D image, and displays the name and position of each obtained feature, and its distance relationship to the ultrasound 2D image, on the ultrasound 2D image generated in real time.
The specific configuration and operation of the image processing apparatus 108 and the User Interface (UI)121 will be described in detail below.
Fig. 2 is a block diagram showing an example of the hardware configuration of the image processing apparatus 108 and the user interface 121. The hardware configuration example shown in fig. 2 corresponds to the ultrasonic imaging apparatus of fig. 1 and can likewise be applied to the other embodiments.
The image processing device 108 includes: a CPU (processor) 1, a ROM (nonvolatile memory: read only memory) 2, a RAM (volatile memory: memory medium capable of reading and writing data) 3, a storage device 4, and a display control section 15. The user interface 121 includes: a medium input unit 11, an input control unit 13, an input device 14, and a display 16. The image generating unit 107, the image processing apparatus 108, and the user interface 121 are connected to each other via a bus 5.
At least one of the ROM 2 and the RAM 3 of the image processing apparatus 108 stores in advance the programs and data needed for the arithmetic processing of the CPU 1 and the operation of the image processing apparatus 108. The CPU 1 realizes the various kinds of processing of the image processing apparatus 108 by executing programs stored in advance in at least one of the ROM 2 and the RAM 3. The program executed by the CPU 1 may be stored in a storage medium 12 such as an optical disk and read by the media input unit 11 (e.g., an optical disk drive) into the RAM 3. Alternatively, the program may be stored in the storage device 4 and loaded from there into the RAM 3, or stored in the ROM 2 in advance.
The storage device 4 may be a nonvolatile semiconductor storage medium such as a flash memory. Alternatively, an external storage device connected via a network or the like may be used.
The input device 14 is a device that receives user operations, and includes, for example, a keyboard, a trackball, an operation panel, a foot switch, and the like. The input control unit 13 receives an operation input by a user. The operation input received by the input control unit 13 is processed by the CPU 1. The display control unit 15 performs control of displaying image data obtained by the processing of the CPU1 on the display 16, for example. The display 16 displays an image under the control of the display control section 15.
Fig. 3 is a functional block diagram showing a function of the image processing apparatus 108 according to the present embodiment. As shown in fig. 3, the image processing apparatus 108 includes an ultrasonic 3D image acquisition unit 21, a characteristic portion position estimation/recognition unit 22 for an ultrasonic 3D image, and an ultrasonic 2D image acquisition unit 24. The image processing apparatus 108 further includes ultrasonic feature information 23 indicating information on the name and position of a feature, an ultrasonic 2D-3D image position alignment unit 25, and an image display unit 26.
The operation of the image processing apparatus 108 shown in fig. 3 will be described with reference to the flowchart shown in fig. 4. First, in step S201, the display 16 shows a message prompting the user to bring the ultrasonic probe 7 into contact with the subject and sweep it while scanning. When the user scans the ultrasound probe 7 over the organ region in accordance with the display, the transmission unit 102, the reception unit 105, and the image generation unit 107 continuously generate ultrasound 2D images. The image generating unit 107 continuously synthesizes an ultrasound 3D image from the generated ultrasound 2D images. The ultrasonic 3D image acquisition unit 21 receives the synthesized ultrasonic 3D image.
In step S202, the feature position estimation/recognition unit 22 for the ultrasound 3D image estimates the positions of predetermined anatomical features from the ultrasound 3D image by a known machine learning method and recognizes the name of each feature from the estimation result. A characteristic site here is a medically defined organ or intra-organ site, such as the umbilical portion of the portal vein of the liver, the inflow portion of the inferior vena cava, the gallbladder, or a branch point of the portal vein or hepatic veins.
Fig. 5A, 5B, and 5C are explanatory diagrams showing three-dimensional positions of a portal vein umbilical region of a liver, an inflow portion of an inferior vena cava, and a gallbladder as characteristic portions in an ultrasound 3D image, and characteristics of the image. The cube 50 shown in fig. 5A, 5B, and 5C represents the local area around each position of the anatomical feature. The estimation of the position and name recognition of the feature portion in the feature portion position estimation/recognition unit 22 of the ultrasonic 3D image are described in detail below.
Fig. 6 shows an example of the names and three-dimensional position information of the feature portions estimated and recognized from the ultrasonic 3D image, held as the ultrasonic feature information 23. This ultrasonic characteristic region information 23 can be stored as a table in the RAM 3, the storage device 4, or the like, for example structured as in the sketch below.
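A minimal sketch of how the table of fig. 6 might be held in memory; the field names and coordinate values here are illustrative assumptions, not the actual contents of fig. 6:

    from dataclasses import dataclass

    @dataclass
    class FeatureInfo:
        name: str        # recognized feature name
        position: tuple  # (x, y, z) in the coordinate system of the ultrasound 3D image
        score: float     # recognition value from the machine-learning search

    ultrasonic_feature_info = [
        FeatureInfo("portal vein umbilical region", (102.0, 84.5, 60.2), 0.93),
        FeatureInfo("inferior vena cava inflow portion", (88.1, 40.7, 75.9), 0.88),
        FeatureInfo("gallbladder", (130.4, 95.0, 52.3), 0.91),
    ]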
In step S203, the ultrasonic 2D image acquisition unit 24 receives the 2D ultrasonic image acquired in real time by the image generation unit 107.
In step S204, the ultrasonic 2D-3D image position alignment unit 25 receives the ultrasonic 3D image and the ultrasonic 2D image from the ultrasonic 3D image acquisition unit 21 and the ultrasonic 2D image acquisition unit 24, respectively, and calculates an alignment transformation matrix for position alignment of the two images. The calculation of the alignment transformation matrix is described in detail below.
In step S205, the image display unit 26 receives the ultrasound 3D image, the ultrasound 2D image, the ultrasound feature information 23, the alignment transformation matrix, and the position of the 2D cross-sectional image. Using these data, the image display unit 26 calculates the projection distance from the 3D coordinates of each characteristic region of the subject to the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image, and takes the calculated projection distance as the distance relationship between the characteristic region and the ultrasound 2D image. As shown in fig. 11(a), the image display unit 26 displays the ultrasonic 2D image on the screen of the display 16, together with the positions of the recognized features, their names 17A and 18A, and their distance relationships to the ultrasonic 2D image. Here, the image display unit 26 uses the alignment transformation matrix to project each feature position in the three-dimensional coordinate system of the ultrasound 3D image onto the currently displayed ultrasound 2D image, and marks the projected positions with the x-shaped marks 17B and 18B. Since the image display unit 26 can thus display, in real time on the display 16, the names of the features estimated from the ultrasound 3D image and their positional relationships to the ultrasound 2D image acquired in real time, accurate surgical navigation for the user can be realized.
As described above, the image display unit 26 calculates, as the distance relationship between a characteristic portion and the ultrasound 2D image acquired in real time, the projection distance from the 3D coordinates of the characteristic portion estimated from the ultrasound 3D image to the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image. The marks 17B and 18B are drawn with sizes proportional to this calculated projection distance. Because the sizes of the marks 17B and 18B directly reflect the projection distance, the positional relationship between the feature portion and the ultrasonic 2D image can be grasped at a glance, further improving usability.
Further, the image display unit 26 can turn the display of the feature names 17A, 18A and the marks 17B, 18B on and off according to the user's selection of the check box 28, so that they are shown only when the user needs them. The touch panel operation keys 19 shown in fig. 11(a) and fig. 11(b) are explained in embodiment 2.
In the display of step S205, the image display unit 26 can also change the color of one of the real-time ultrasound 2D image and the 2D cross-sectional image of the ultrasound 3D image at the corresponding position, and generate and display on the display 16 a semi-transparent superimposition of the two images. The image display unit 26 then displays the feature names 17A and 18A and the marks 17B and 18B on the superimposed 2D image. In this case as well, the image display unit 26 may draw the marks 17B and 18B with sizes proportional to the calculated projection distance from the feature portion to the ultrasonic 2D image.
Next, the processing of the characteristic portion position estimating/recognizing unit 22 for the ultrasonic 3D image according to the present embodiment will be described in detail with reference to the flowchart shown in fig. 7. As described above, the image processing apparatus 108 can be realized by program execution on the CPU 1, and therefore the processes of fig. 7 can likewise be realized by program processing of the CPU 1.
First, in step S401, the characteristic portion position estimation/recognition unit 22 for the ultrasound 3D image receives the ultrasound 3D image from the image generation unit 107. In step S402, it estimates the positions of feature candidates and recognizes their names. To increase the processing speed, it reduces the size of the ultrasound 3D image and searches for feature candidates at a coarse resolution using machine learning. As the method for estimating feature positions and recognizing names, for example, the Hough forest method, a known machine learning method, can be used.
Next, in step S403, the feature position estimation/recognition unit 22 acquires a local 3D image of the area around each feature candidate from the full-size ultrasonic 3D image. In step S404, it searches for and recognizes the feature in detail within that local area. Here, the Hough forest method described above may again be used; when a more accurate position estimation/recognition result is desired, the 3D CNN (Convolutional Neural Network) method, a known deep learning method, may be used instead.
In step S405, when the recognition value of a feature obtained in the search of step S404 is equal to or less than a predetermined threshold, the feature position estimation/recognition unit 22 excludes that feature as a false recognition. In step S406, the feature position estimation/recognition unit 22 outputs the position and name information of the recognized features as the ultrasonic feature information 23. The overall flow can be sketched as follows.
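Putting steps S401-S406 together, the coarse-to-fine search can be sketched as follows, with coarse_model and fine_model standing in as assumed, pre-trained detectors (e.g., a Hough forest and a 3D CNN):

    import numpy as np

    def estimate_features(volume, coarse_model, fine_model, threshold=0.5,
                          scale=4, win=32):
        """Coarse-to-fine feature position estimation and name recognition."""
        small = volume[::scale, ::scale, ::scale]        # S402: reduced-size volume
        candidates = coarse_model.detect(small)          # [(name, position, score), ...]
        results = []
        for name, pos, _ in candidates:
            c = np.array(pos) * scale                    # candidate at full resolution
            lo = np.maximum(c.astype(int) - win, 0)      # S403: local region around it
            hi = np.minimum(lo + 2 * win, volume.shape)
            local = volume[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
            refined, score = fine_model.localize(local, name)  # S404: detailed search
            if score > threshold:                        # S405: drop false recognitions
                results.append((name, lo + refined, score))
        return results                                   # S406: feature info output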
Next, the processing of the ultrasonic 2D-3D image position alignment unit 25 according to the present embodiment will be described in detail with reference to the flowchart shown in fig. 8. The processing of the ultrasonic 2D-3D image position alignment unit 25 is also realized by program execution on the CPU 1.
In step S301, the ultrasonic 2D-3D image alignment unit 25 receives the ultrasonic 3D image from the ultrasonic 3D image acquisition unit 21 and the ultrasonic 2D image from the ultrasonic 2D image acquisition unit 24, and roughly estimates a three-dimensional position corresponding to the ultrasonic 2D image from the ultrasonic 3D image. That is, the initial position of the corresponding ultrasonic 2D image is estimated.
Fig. 9 shows, as 15 patterns, examples of the three-dimensional initial position candidates used by the ultrasonic 2D-3D image position alignment unit 25; a position 91 is shown in each pattern. When estimating the initial position, the ultrasonic 2D image is input to the recognizers of the 15 patterns, built by machine learning, and the position 91 of the pattern yielding the highest recognition value is taken as the initial position of the ultrasonic 2D image obtained from the ultrasonic 2D image acquisition unit 24. In this way, the ultrasonic 2D-3D image position alignment unit 25 selects from the ultrasonic 3D image a candidate three-dimensional position of the 2D cross-sectional image corresponding to the ultrasonic 2D image as the initial position, performs position alignment of the ultrasonic 2D image with the ultrasonic 3D image, and calculates the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
The recognizer for each pattern is created by learning. A large number of ultrasound 3D images are collected as learning data, and the 2D cross-sectional image at each position 91 of fig. 9 is extracted from each ultrasound 3D image to create learning data for the pattern corresponding to that position. To increase the number and diversity of the learning data, 2D cross-sectional images are also extracted at each position 91 of fig. 9 with small random translations and rotations applied. In this way, learning data for the 15 three-dimensional initial position patterns shown in fig. 9 can be created, and a recognizer for each pattern can be built by machine learning, for example with the known AdaBoost method or a deep learning method. The resulting run-time selection is sketched below.
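At run time, the initial-position estimation then reduces to scoring the real-time 2D image with each pattern's recognizer and taking the best; a minimal sketch (the recognizers and pattern_poses objects are assumptions):

    import numpy as np

    def estimate_initial_position(us2d, recognizers, pattern_poses):
        """Choose the initial pose of the 2D image among the 15 candidate patterns."""
        scores = [r.score(us2d) for r in recognizers]  # one recognizer per pattern
        best = int(np.argmax(scores))                  # highest recognition value
        return pattern_poses[best]                     # pose of that pattern's position 91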
In step S302, the ultrasonic 2D-3D image position alignment unit 25 estimates the parallel movement and the rotation angle of the geometric transformation data for ultrasonic 2D-3D image position alignment from the estimated initial position of the ultrasonic 2D image. In step S303, the ultrasonic 2D-3D image position alignment unit 25 constructs a 2D cross-sectional image corresponding to the ultrasonic 2D image from the ultrasonic 3D image using the obtained parallel translation and rotation angle.
In step S304, the ultrasonic 2D-3D image position alignment unit 25 calculates an image similarity evaluation function between the 2D cross-sectional image obtained from the ultrasonic 3D image and the ultrasonic 2D image obtained by the ultrasonic 2D image acquisition unit 24. As the image similarity, a known measure such as mutual information can be used.
In step S305, the ultrasonic 2D-3D image position alignment unit 25 performs a convergence calculation to obtain the parallel translation and rotation angle that maximize the image similarity between the 2D cross-sectional image obtained from the ultrasonic 3D image and the ultrasonic 2D image.
In step S306, if the image similarity has not converged, the parallel translation and rotation angle are updated toward a higher similarity, and steps S303 to S305 are executed again with the updated values.
On the other hand, when the similarity has converged in step S305, the ultrasonic 2D-3D image position alignment unit 25 outputs, in step S307, the obtained parallel translation and rotation angle information, the position of the 2D cross-sectional image, and the like, completing the processing of the ultrasonic 2D-3D image position alignment unit 25 of fig. 3.
For a series of ultrasound 2D images acquired continuously in real time, once the position alignment of the first ultrasound 2D image is completed, the initial position for aligning each subsequent ultrasound 2D image can reuse the parallel movement and rotation angle from the previous frame's alignment result. That is, by performing steps S303 to S307 starting from the previous ultrasound 2D image's parallel movement and rotation angle, the ultrasonic 2D-3D image position alignment can run in real time.
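A condensed sketch of the convergence loop of steps S302-S307, using mutual information as the similarity measure and a generic optimizer in place of the device's actual convergence calculation (the extract_section resampling helper, which cuts the 2D cross-section of the volume at a given pose, is assumed):

    import numpy as np
    from scipy.optimize import minimize

    def mutual_information(a, b, bins=32):
        """Mutual information of two equally sized images via their joint histogram."""
        h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = h / h.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        return np.sum(p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz]))

    def register_2d_3d(us2d, us3d, init_pose, extract_section):
        """S302-S307: find the translation + rotation maximizing image similarity."""
        def cost(pose):
            section = extract_section(us3d, pose)       # S303: 2D cross-section at pose
            return -mutual_information(us2d, section)   # S304: similarity (negated)
        res = minimize(cost, init_pose, method="Powell")  # S305/S306: convergence loop
        return res.x                                    # S307: pose of the cross-section

    # For each subsequent real-time frame, pass the previous frame's result as
    # init_pose so that steps S303-S307 start close to the optimum.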
As described above, according to the ultrasound imaging apparatus of the present embodiment, the name and position of the feature portion in the patient's body and the distance relationship indicating the projection distance from the feature portion to the ultrasound 2D image can be displayed in real time on the ultrasound 2D image, and automatic and accurate surgical navigation can be realized.
Although in embodiment 1 the image processing apparatus 108 is provided inside the ultrasonic imaging apparatus 100, the image processing apparatus 108 shown in figs. 1 and 2 may also be a device separate from the ultrasonic imaging apparatus 100, connected to it by a signal line or a network. For example, the image processing apparatus 108 may be installed on a general-purpose computer or a processing apparatus such as a workstation and connected to the ultrasound imaging apparatus 100 via a network.
In this case, the image processing device 108 receives, via the network, an ultrasonic 3D image for feature recognition and an ultrasonic 2D image for alignment from the ultrasonic imaging device acting as a client terminal, performs the feature position estimation/recognition processing of fig. 7 and the image alignment processing of fig. 8, and sends the name and position information of the identified features and the ultrasonic 2D-3D image alignment result back to the client terminal. The computation-heavy image processing device 108 therefore need not be mounted in the ultrasound imaging device 100 of the client terminal. Because the ultrasound imaging apparatus 100 can perform the alignment processing using the computing capability of the image processing apparatus 108 connected via the network, a compact ultrasound imaging apparatus can be provided that still displays the names and distance relationships of features in the patient's body on the ultrasound 2D image in real time.
As described above, according to the present embodiment, by capturing an ultrasound 3D image of the subject, performing position estimation and name recognition of predetermined anatomical features, and displaying their positions, names, and distance relationships on the ultrasound 2D image captured in real time during surgery, automatic and accurate surgical navigation can be realized.
(example 2)
In example 1, the positions and names of predetermined anatomical features are estimated and identified from an ultrasound 3D image, the ultrasound 2D image captured in real time during surgery is aligned with the ultrasound 3D image, and the positions and names of the identified features and their distance relationships to the ultrasound 2D image are displayed on the ultrasound 2D image. Embodiment 2 adds to the configuration of embodiment 1 the ability to add or correct features, and to correct the geometric transformation of the alignment, in accordance with user instructions. That is, in the ultrasonic imaging apparatus of this embodiment, the image processing apparatus generates a second ultrasonic 3D image by combining the ultrasonic 3D image with another ultrasonic 3D image captured by the ultrasonic probe, performs position alignment of the ultrasonic 2D image and the second ultrasonic 3D image, calculates the position of the 2D cross-sectional image of the second ultrasonic 3D image corresponding to the ultrasonic 2D image, and displays the positions, names, and distance relationships of the features on the ultrasonic 2D image using the feature positions and the position of the 2D cross-sectional image of the second ultrasonic 3D image.
In the image processing apparatus and image processing method of this embodiment, the image processing apparatus includes a correction unit that generates the second ultrasound 3D image by combining the ultrasound 3D image with another ultrasound 3D image; the feature position estimation/recognition unit estimates and recognizes features of the subject from the second ultrasound 3D image, and the image position alignment unit performs position alignment of the ultrasound 2D image and the second ultrasound 3D image and calculates the position of the 2D cross-sectional image of the second ultrasound 3D image corresponding to the ultrasound 2D image. In the description of embodiment 2, the same components and processes as in embodiment 1 are denoted by the same reference numerals, and their description is omitted.
(construction and action)
Fig. 10 is a functional block diagram showing the functions of the image processing apparatus 108 according to embodiment 2, and fig. 11 shows an example of a display screen and key selection means of the display according to embodiment 2. As shown in fig. 10, the image processing apparatus 108 of the present embodiment includes the ultrasonic 3D image acquisition unit 21, the feature position estimation/recognition unit 22 for the ultrasonic 3D image, and the ultrasonic 2D image acquisition unit 24. In addition to the ultrasonic feature information 23, the ultrasonic 2D-3D image position alignment unit 25, and the image display unit 26, it further includes a correction unit 27 for feature recognition and alignment results. In the present embodiment, as in the example of fig. 11(a), the image display unit 26 displays the ultrasonic 2D image on the screen of the display 16 and, based on the ultrasonic feature information and the distance relationships of the features to the ultrasonic 2D image, shows the positions of the recognized features, their names 17A and 18A, and the marks 17B and 18B, whose sizes indicate the distance relationship between each feature and the ultrasonic 2D image.
In the present embodiment, as shown in the example of fig. 11(a), with the positions and names of the recognized features and their distance relationships to the ultrasonic 2D image displayed on the display 16, the correction unit 27 for feature recognition and alignment results displays the touch panel operation keys 19 to ask the user whether the feature recognition and the alignment are judged successful. That is, touch panel operation keys 19 for volume addition, manual correction of features, correction of the alignment initial position, detailed correction of the alignment, and the like are displayed on the display 16, and the user's judgment is received through key selection means of the input device 14 such as a mouse. If the user inputs through the input device 14 that the feature recognition and the position alignment have succeeded, the position alignment processing ends. Instead of the input device 14 and the touch panel operation keys 19, a trackball, or a foot switch connected by the USB cable 20B for use during surgery as shown in fig. 11(b), may be used as the key selection means.
On the other hand, when the user determines that the feature recognition and the alignment are not successful, the correction unit 27 of the feature recognition and the alignment result according to the present embodiment performs the correction process of the feature recognition and the alignment.
The correction unit 27 for feature recognition and alignment results shows on the display 16 a prompt asking the user whether to add an ultrasonic volume, and receives the decision through the input device 14 or the touch panel operation keys 19. When the user judges that the feature information is insufficient and inputs, through the input device 14 or the touch panel operation keys 19, that one or more additional ultrasonic 3D images should be acquired with the ultrasonic probe, the ultrasonic imaging apparatus 100 acquires the additional ultrasonic 3D image(s). The correction unit 27 then combines, i.e., stitches, the additionally acquired ultrasonic 3D image with the original ultrasonic 3D image to generate the second ultrasonic 3D image, a single synthesized ultrasonic 3D image. As a stitching method, for example, the method described in non-patent document 1 can be used.
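Non-patent document 1 builds the panorama using 3D SIFT features; as a greatly simplified illustration of only the compositing step, two volumes with an already-known rigid transform between them can be blended by averaging in their overlap. A sketch under that assumption, not the paper's method (zero is taken as the empty-background value):

    import numpy as np
    from scipy.ndimage import affine_transform

    def stitch_volumes(vol_a, vol_b, T_b_to_a, out_shape):
        """Composite two overlapping ultrasound volumes into one second 3D image."""
        R, t = T_b_to_a[:3, :3], T_b_to_a[:3, 3]
        R_inv = np.linalg.inv(R)
        # Resample vol_b onto vol_a's grid (affine_transform pulls with the inverse map).
        b_in_a = affine_transform(vol_b, R_inv, offset=-R_inv @ t,
                                  output_shape=out_shape)
        a = np.zeros(out_shape, dtype=np.float32)
        a[:vol_a.shape[0], :vol_a.shape[1], :vol_a.shape[2]] = vol_a
        overlap = (a > 0) & (b_in_a > 0)
        return np.where(overlap, 0.5 * (a + b_in_a), a + b_in_a)  # average in overlap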
The correction unit 27 for feature recognition and alignment results then performs the feature recognition of fig. 7 and the ultrasonic 2D-3D image alignment processing of fig. 8 using the second ultrasonic 3D image generated by the stitching processing. Since the feature recognition processing and the alignment processing are the same as in embodiment 1, their description is omitted.
On the other hand, if the user does not acquire an additional ultrasonic 3D image but instead selects manual correction, the user manually corrects the positions and names of the features of the ultrasonic 3D image, i.e., the ultrasonic feature information 23 exemplified in fig. 6, using the input device 14 or the like. The correction unit 27 for feature recognition and alignment results receives the corrected position/name information of the features and outputs it to the image display unit 26. The image display unit 26 then executes the image display processing of step S205 of fig. 4 described above.
When the user selects correction of the alignment initial position through the input device 14 or the touch panel operation keys 19, the correction unit 27 for feature recognition and alignment results displays, for example, the alignment initial position patterns of fig. 9 on the display 16 and receives the user's selection. The user can also manually fine-tune the selected initial position pattern. The user then acquires an ultrasonic 2D image at the corrected initial alignment position. The correction unit 27 receives the corrected initial alignment position pattern and the ultrasonic 2D image, and executes the processing of steps S302 to S307 of fig. 8.
When the user selects detailed correction of the alignment through the input device 14 or the touch panel operation keys 19, the correction unit 27 for feature recognition and alignment results displays on the display 16 the ultrasonic 2D image from the ultrasonic 2D image acquisition unit 24 superimposed on the corresponding 2D cross-sectional image of the ultrasonic 3D image. The user corrects the alignment with the ultrasonic 2D image while manually adjusting the position or rotation angle of the 2D cross-sectional image. The correction unit 27 receives the corrected alignment result and outputs it to the image display unit 26.
The image display unit 26 executes the image display processing of step S205 in fig. 4. In the present embodiment, the image display unit 26 can display the feature names 17A and 18A and the marks 17B and 18B when the user needs them, in accordance with the user's selection of the check box 28.
As described above, the ultrasonic imaging apparatus according to the present embodiment can be configured to perform addition or correction of the feature portion and recalculation of the coordinate conversion information for the alignment based on the user instruction.
(example 3)
Embodiment 3 is an embodiment of an ultrasound imaging apparatus in which the ultrasound probe includes a position sensor and the image generating unit generates the ultrasound 2D image and the ultrasound 3D image based on the position information of the ultrasound probe obtained by the position sensor. In example 1, the positions and names of predetermined anatomical features are estimated and identified from an ultrasonic 3D image generated by a plurality of transmissions and receptions of ultrasonic waves, the ultrasonic 2D image captured in real time during surgery is aligned with the ultrasonic 3D image, and the positions and names of the identified features and their distance relationships to the ultrasonic 2D image are displayed on the ultrasonic 2D image. In example 3, a position sensor is attached to the ultrasonic probe, an ultrasonic image is generated from the reception signal of the ultrasonic probe, an ultrasonic 3D image is generated from the ultrasonic image and the position information of the ultrasonic probe obtained by the position sensor, and position estimation and name recognition of predetermined anatomical features are performed from that ultrasonic 3D image.
Further, in example 3, when the ultrasonic 2D image is captured in real time during surgery, the ultrasonic 2D image and the ultrasonic 3D image are aligned using the position information from the position sensor of the ultrasonic probe, and the positions and names of the identified features and their distance relationships to the ultrasonic 2D image are displayed on the ultrasonic 2D image. In the description of this embodiment, the same components and processes as in embodiment 1 are denoted by the same reference numerals, and their description is omitted.
(construction and action)
Fig. 12 shows a configuration example of the ultrasonic imaging apparatus according to embodiment 3, and fig. 13 is a block diagram showing an example of the hardware configuration of the image processing apparatus 108 and the user interface 121 in embodiment 3. In figs. 12 and 13, a position detection unit 6 and a position sensor 8 are added to the configuration of embodiment 1. The position detection unit 6 detects the position of the ultrasonic probe 7 from the output of the position sensor 8. For example, a magnetic sensor unit may be used as the position detection unit 6: a magnetic field space is formed, the position sensor 8 (a magnetic sensor) detects the magnetic field, and the position detection unit 6 thereby obtains the position information of the ultrasonic probe as coordinates relative to a reference point.
The functional block diagram of the image processing apparatus 108 according to embodiment 3 is the same as that of embodiment 1, shown in fig. 3, and its operation follows the flowchart of fig. 4.
The image generating unit 107 in embodiment 3 generates an ultrasonic image from the reception signal of the ultrasonic probe 7, and generates an ultrasonic 3D image from the ultrasonic image and the position information of the ultrasonic probe obtained from the position sensor 8. In step S204 of fig. 4, the ultrasound 2D-3D image position alignment unit 25 calculates the alignment transformation matrix using the position information of the ultrasound probe recorded with the ultrasound 3D image and the position information of the ultrasound probe for the ultrasound 2D image captured in real time during surgery, as in the sketch below. The other processing in embodiment 3 is the same as in embodiment 1.
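With the sensor, the alignment transformation matrix of step S204 can be composed directly from the two recorded probe poses; a minimal sketch with 4x4 homogeneous matrices (names hypothetical):

    import numpy as np

    def alignment_matrix(T_probe_at_3d, T_probe_at_2d):
        """Map real-time 2D frame coordinates into the ultrasound 3D image's frame.
        Both arguments are 4x4 probe poses in the position sensor's world frame."""
        return np.linalg.inv(T_probe_at_3d) @ T_probe_at_2d

The resulting matrix places the real-time scanning plane inside the ultrasound 3D image, giving the 2D cross-section position used for projecting the feature information.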
As described above, in embodiment 3, the position sensor is attached to the ultrasonic probe, an ultrasonic image is generated from the reception signal of the ultrasonic probe, an ultrasonic 3D image is generated from the ultrasonic image and the position information of the ultrasonic probe obtained by the position sensor, and position estimation and name recognition of predetermined anatomical features are performed from the ultrasonic 3D image. Further, when the ultrasonic 2D image is captured in real time during surgery, the ultrasonic 2D image and the ultrasonic 3D image can be aligned using the position information of the ultrasonic probe, and the positions and names of the recognized features and their distance relationships to the ultrasonic 2D image can be displayed on the ultrasonic 2D image.
The above-described invention can provide an ultrasound imaging apparatus that estimates the positions and identifies the names of predetermined anatomical features from an ultrasound 3D image, aligns the ultrasound 3D image with the two-dimensional image of the ultrasound scanning plane, i.e., the ultrasound 2D image captured in real time, projects the feature positions onto the ultrasound 2D image, calculates the distance relationship between each feature and the ultrasound 2D image, and displays the positions and names of the features and their distance relationships to the ultrasound 2D image on the ultrasound 2D image.
The present invention is not limited to the above embodiments, and various modifications are included. For example, the above embodiments are described in detail for better understanding of the present invention, and the invention is not necessarily limited to having all of the described configurations. As described above, the present invention is not limited to the ultrasonic imaging apparatus; it can also be realized as an image processing apparatus connected to an ultrasonic imaging apparatus via a network, and as an image processing method thereof. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, and the configuration of another embodiment may be added to that of one embodiment. For part of the configuration of each embodiment, other configurations may be added, deleted, or substituted.
Although the above-described configurations, functions, image processing apparatuses, and the like have been explained as being realized in part or in whole by programs, part or all of them may instead be realized by hardware, for example by designing integrated circuits.
Description of the symbols
1 CPU
2 ROM
3 RAM
4 storage device
5 bus
6 position detecting unit
7 ultrasonic probe
8 position sensor
10 image capturing device
11 medium input part
12 storage medium
13 input control part
14 input device
15 display control part
16 display
17A, 18A names
17B, 18B marks
19 touch screen operation key
20A pedal switch
20B USB line
21 ultrasonic 3D image acquisition unit
22 characteristic part position estimating/identifying unit for ultrasonic 3D image
23 ultrasonic characteristic region information
24 ultrasonic 2D image acquisition unit
25 ultrasonic 2D-3D image position alignment part
26 image display unit
27 feature part recognition and position alignment result correction unit
28 check box
50 cube
90 position
100 ultrasonic imaging apparatus
101 transmission/reception switching unit
102 sending part
105 receiving part
106 control part
107 image generation unit
108 image processing apparatus
120 subject
121 User Interface (UI)

Claims (8)

1. An ultrasonic imaging apparatus is characterized by comprising:
an ultrasonic probe that transmits ultrasonic waves to a subject and receives the ultrasonic waves from the subject;
an image generation unit that generates an ultrasonic 2D image, which is an ultrasonic two-dimensional image, from a reception signal of the ultrasonic probe, and generates an ultrasonic 3D image, which is an ultrasonic three-dimensional image, by transmitting and receiving ultrasonic waves a plurality of times; and
an image processing device that receives and processes the ultrasonic 2D image and the ultrasonic 3D image,
the image processing apparatus performs the following processing:
estimating and recognizing a characteristic portion of the subject from the ultrasonic 3D image,
performing a positional alignment of the ultrasound 2D image and the ultrasound 3D image,
displaying information of the characteristic part on the ultrasonic 2D image;
the information of the characteristic part being the position and name of the characteristic part and its distance relationship to the ultrasonic 2D image;
the image processing apparatus includes a feature position estimation/recognition unit that estimates and recognizes a position and a name of the feature from the ultrasonic 3D image;
the image processing apparatus includes an image position alignment unit that performs position alignment of the ultrasonic 2D image and the ultrasonic 3D image,
the image position alignment unit calculates a position of a 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image;
the image processing apparatus further includes a correction unit that corrects the position and name of the feature portion estimated and recognized by the feature portion position estimation/recognition unit;
the correction unit corrects the position of the 2D cross-sectional image of the ultrasound 3D image corresponding to the ultrasound 2D image calculated by the image position alignment unit.
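For illustration only — the claim language does not prescribe any particular algorithm — the "estimate and recognize" step for a characteristic portion could be sketched as 3D template matching in Python; the function names, the normalized cross-correlation criterion, and the template input are all assumptions of this sketch, not the patented method:

    import numpy as np

    def estimate_feature_position(volume, template):
        # Exhaustive normalized cross-correlation of a small 3D template
        # against the ultrasonic 3D image; returns the voxel index of the
        # best-matching position and its score.
        vz, vy, vx = volume.shape
        tz, ty, tx = template.shape
        t = (template - template.mean()) / (template.std() + 1e-8)
        best_score, best_pos = -np.inf, (0, 0, 0)
        for z in range(vz - tz + 1):
            for y in range(vy - ty + 1):
                for x in range(vx - tx + 1):
                    patch = volume[z:z + tz, y:y + ty, x:x + tx]
                    p = (patch - patch.mean()) / (patch.std() + 1e-8)
                    score = float((p * t).mean())
                    if score > best_score:
                        best_score, best_pos = score, (z, y, x)
        return best_pos, best_score

A practical implementation would use a much faster detector; this brute-force form is only meant to make the estimation/recognition step concrete.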
2. The ultrasonic imaging apparatus according to claim 1,
the ultrasonic imaging apparatus includes an image display unit for displaying the ultrasonic 2D image,
the image display unit calculates the distance relationship to the ultrasonic 2D image using the position of the characteristic portion and the position of the 2D cross-sectional image, and displays the position and name of the characteristic portion and the distance relationship on the ultrasonic 2D image.
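The distance relationship of claim 2 reduces to a point-to-plane computation; a minimal sketch, assuming the characteristic portion position and the cross-sectional plane are expressed in a common 3D coordinate frame (all names hypothetical):

    import numpy as np

    def feature_to_plane_distance(feature_pos, plane_point, plane_normal):
        # Signed distance (same unit as the inputs, e.g. mm) from the 3D
        # position of a characteristic portion to the plane of the 2D
        # cross-sectional image; the sign indicates on which side of the
        # current scan plane the characteristic portion lies.
        n = np.asarray(plane_normal, dtype=float)
        n = n / np.linalg.norm(n)
        d = np.dot(np.asarray(feature_pos, dtype=float)
                   - np.asarray(plane_point, dtype=float), n)
        return float(d)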
3. The ultrasonic imaging apparatus according to claim 2,
the image display unit displays an image in which the ultrasonic 2D image and the 2D cross-sectional image are superimposed.
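The superimposed display of claim 3 can be illustrated by simple alpha blending; a sketch assuming both images are co-registered, equally sized grayscale arrays with intensities in [0, 1] (the blending weight is an assumption):

    import numpy as np

    def superimpose(us_2d, cross_section, alpha=0.5):
        # Alpha-blend the real-time ultrasonic 2D image with the matched
        # 2D cross-sectional image of the 3D volume.
        return np.clip(alpha * us_2d + (1.0 - alpha) * cross_section,
                       0.0, 1.0)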
4. The ultrasonic imaging apparatus according to claim 1,
the image position alignment unit selects, from the ultrasonic 3D image, a candidate three-dimensional position of the 2D cross-sectional image corresponding to the ultrasonic 2D image as an initial position, performs position alignment of the ultrasonic 2D image and the ultrasonic 3D image, and calculates the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
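As a rough illustration of claim 4's initial-position selection (not the patented algorithm), a coarse search over axial slices could supply the starting point handed to the fine alignment; this sketch assumes the volume's axial slices have the same shape as the 2D image:

    import numpy as np

    def ncc(a, b):
        # Normalized cross-correlation of two equally sized grayscale images.
        a = (a - a.mean()) / (a.std() + 1e-8)
        b = (b - b.mean()) / (b.std() + 1e-8)
        return float((a * b).mean())

    def select_initial_position(volume, us_2d, stride=4):
        # Score every stride-th axial slice of the ultrasonic 3D image
        # against the live 2D image; the best slice index serves as the
        # initial position for the subsequent fine 2D-3D alignment.
        best_idx, best_score = 0, -np.inf
        for k in range(0, volume.shape[0], stride):
            score = ncc(volume[k], us_2d)
            if score > best_score:
                best_idx, best_score = k, score
        return best_idx, best_score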
5. The ultrasonic imaging apparatus according to claim 1,
the image processing apparatus performs the following processing:
generating a second ultrasonic 3D image by combining the ultrasonic 3D image with another ultrasonic 3D image captured by the ultrasonic probe,
performing position alignment of the ultrasonic 2D image and the second ultrasonic 3D image, and calculating the position of a 2D cross-sectional image of the second ultrasonic 3D image corresponding to the ultrasonic 2D image, and
displaying the position and name of the characteristic portion and the distance relationship on the ultrasonic 2D image using the position of the characteristic portion and the position of the 2D cross-sectional image of the second ultrasonic 3D image.
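Claim 5's combination of volumes can be pictured as intensity compounding on a common grid; a minimal sketch, assuming both volumes are already co-registered and resampled to the same voxel grid, with validity masks marking scanned voxels (masks and averaging are assumptions of the sketch):

    import numpy as np

    def compound_volumes(vol_a, vol_b, mask_a, mask_b):
        # Average voxels where both volumes have data; where only one
        # volume was scanned, its value is kept; empty voxels stay zero.
        weight = mask_a.astype(float) + mask_b.astype(float)
        return (vol_a * mask_a + vol_b * mask_b) / np.maximum(weight, 1.0)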
6. The ultrasonic imaging apparatus according to claim 1,
the ultrasonic probe is provided with a position sensor,
the image generation unit generates the ultrasonic 3D image from ultrasonic 2D images and positional information of the ultrasonic probe obtained by the position sensor.
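Claim 6 describes freehand 3D reconstruction from position-tracked 2D frames; a pixel-scattering sketch, assuming a 1 mm pixel pitch, an isotropic voxel size, and 4x4 homogeneous probe poses supplied by the position sensor (all hypothetical):

    import numpy as np

    def reconstruct_volume(frames, poses, out_shape, voxel_size_mm):
        # Scatter the pixels of tracked ultrasonic 2D frames into a voxel
        # grid. Each pose maps image coordinates (x, y, 0, 1) in mm to the
        # volume frame; overlapping contributions are averaged.
        # out_shape is (z, y, x).
        vol = np.zeros(out_shape, dtype=float)
        hits = np.zeros(out_shape, dtype=float)
        lim = np.array([out_shape[2], out_shape[1], out_shape[0]])[:, None]
        for frame, pose in zip(frames, poses):
            h, w = frame.shape
            ys, xs = np.mgrid[0:h, 0:w]
            pts = np.stack([xs.ravel(), ys.ravel(),
                            np.zeros(h * w), np.ones(h * w)])
            world = (pose @ pts)[:3] / voxel_size_mm  # x, y, z in voxels
            idx = np.round(world).astype(int)
            ok = np.all((idx >= 0) & (idx < lim), axis=0)
            ix, iy, iz = idx[0, ok], idx[1, ok], idx[2, ok]
            np.add.at(vol, (iz, iy, ix), frame.ravel()[ok])
            np.add.at(hits, (iz, iy, ix), 1.0)
        return vol / np.maximum(hits, 1.0)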
7. An image processing apparatus is characterized by comprising:
a characteristic portion position estimation/recognition unit that estimates and recognizes a characteristic portion of a subject from an ultrasonic 3D image of the subject; and
an image position alignment unit that performs position alignment of an ultrasonic 2D image of the subject and the ultrasonic 3D image, and calculates the position of a 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image;
wherein information of the characteristic portion is displayed on the ultrasonic 2D image, the information comprising the position and name of the characteristic portion and its distance relationship to the ultrasonic 2D image;
the characteristic portion position estimation/recognition unit estimates and recognizes the position and name of the characteristic portion from the ultrasonic 3D image;
the image processing apparatus further includes a correction unit that corrects the position and name of the characteristic portion estimated and recognized by the characteristic portion position estimation/recognition unit; and
the correction unit corrects the position of the 2D cross-sectional image of the ultrasonic 3D image, corresponding to the ultrasonic 2D image, calculated by the image position alignment unit.
8. An image processing method of an image processing apparatus, characterized in that,
the image processing apparatus performs the following processing:
estimating and recognizing a characteristic portion of a subject from an ultrasonic 3D image of the subject,
performing position alignment of an ultrasonic 2D image of the subject and the ultrasonic 3D image, and calculating the position of a 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image,
combining the ultrasonic 3D image with another ultrasonic 3D image to generate a second ultrasonic 3D image,
estimating and recognizing a characteristic portion of the subject from the second ultrasonic 3D image,
wherein information of the characteristic portion comprises the position and name of the characteristic portion and its distance relationship to the ultrasonic 2D image,
estimating and recognizing the position and name of the characteristic portion from the ultrasonic 3D image,
performing position alignment of the ultrasonic 2D image and the ultrasonic 3D image, and calculating the position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image,
correcting the estimated and recognized position and name of the characteristic portion, and
correcting the calculated position of the 2D cross-sectional image of the ultrasonic 3D image corresponding to the ultrasonic 2D image.
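Finally, the method of claim 8 can be read as a pipeline over the sketches above; this hypothetical composition simply chains those helper functions (estimate_feature_position, select_initial_position, compound_volumes, feature_to_plane_distance) and is not the patented implementation:

    def guide_with_features(us_3d, other_3d, mask_a, mask_b, us_2d, template):
        # Compound two volumes into a second ultrasonic 3D image, locate a
        # characteristic portion in it, align the live 2D image to an axial
        # slice, and report the feature-to-plane distance in slice units.
        vol2 = compound_volumes(us_3d, other_3d, mask_a, mask_b)
        (fz, fy, fx), _ = estimate_feature_position(vol2, template)
        k, _ = select_initial_position(vol2, us_2d)
        dist = feature_to_plane_distance((fx, fy, fz),
                                         (0.0, 0.0, float(k)),
                                         (0.0, 0.0, 1.0))
        return vol2, (fz, fy, fx), k, dist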
CN201780013492.XA 2016-05-12 2017-04-18 Ultrasonic imaging apparatus, image processing apparatus, and method thereof Active CN108697410B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-095761 2016-05-12
JP2016095761A JP6689666B2 (en) 2016-05-12 2016-05-12 Ultrasonic imaging device
PCT/JP2017/015573 WO2017195540A1 (en) 2016-05-12 2017-04-18 Ultrasound imaging device, image processing device and method therefor

Publications (2)

Publication Number Publication Date
CN108697410A CN108697410A (en) 2018-10-23
CN108697410B true CN108697410B (en) 2021-06-04

Family

ID=60266475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780013492.XA Active CN108697410B (en) 2016-05-12 2017-04-18 Ultrasonic imaging apparatus, image processing apparatus, and method thereof

Country Status (3)

Country Link
JP (1) JP6689666B2 (en)
CN (1) CN108697410B (en)
WO (1) WO2017195540A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3932323A4 (en) * 2019-02-28 2022-04-13 FUJIFILM Corporation Ultrasonic endoscopic system and operating method of ultrasonic endoscopic system
JP7158596B2 (en) * 2019-08-27 2022-10-21 富士フイルム株式会社 Endoscopic Ultrasound System and Method of Operating Endoscopic Ultrasound System
KR102512104B1 (en) * 2020-05-07 2023-03-22 한국과학기술연구원 Apparatus and method for generating 3d ultrasound image
JP2021186211A (en) * 2020-05-28 2021-12-13 株式会社日立製作所 Ultrasonic imaging apparatus, and surgery support system and method using the same

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101271526A (en) * 2008-04-22 2008-09-24 深圳先进技术研究院 Method for object automatic recognition and three-dimensional reconstruction in image processing
CN101653381A (en) * 2008-08-18 2010-02-24 株式会社东芝 Medical image processing apparatus, ultrasound imaging apparatus, x-ray ct apparatus
CN101669831A (en) * 2003-05-08 2010-03-17 株式会社日立医药 Reference image display method
CN102300505A (en) * 2009-06-30 2011-12-28 株式会社东芝 Ultrasonic diagnostic device and control program for displaying image data
CN101259026B (en) * 2007-03-06 2012-11-14 通用电气公司 Method and apparatus for tracking points in an ultrasound image
US8435181B2 (en) * 2002-06-07 2013-05-07 Verathon Inc. System and method to identify and measure organ wall boundaries
CN103315769A (en) * 2012-03-21 2013-09-25 株式会社东芝 Ultrasonic diagnostic apparatus, image processing apparatus, and image processing method
CN103460245A (en) * 2011-04-06 2013-12-18 佳能株式会社 Information processing apparatus
CN104093362A (en) * 2011-12-21 2014-10-08 柯尼卡美能达株式会社 Ultrasound diagnostic apparatus and contour extraction method
CN104398272A (en) * 2014-10-21 2015-03-11 无锡海斯凯尔医学技术有限公司 Method and device for selecting detection area and flexible detection system
CN105051783A (en) * 2013-03-15 2015-11-11 S·利特尔 Evaluating electromagnetic imagery by comparing to other individuals' imagery
US9262685B2 (en) * 2013-02-06 2016-02-16 Samsung Electronics Co., Ltd. Method and apparatus for representing changes in shape and location of organ in respiration cycle
CN105407811A (en) * 2013-05-28 2016-03-16 伯尔尼大学 Method and system for 3D acquisition of ultrasound images

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8824762B2 (en) * 2010-10-22 2014-09-02 The Johns Hopkins University Method and system for processing ultrasound data
JP6058290B2 (en) * 2011-07-19 2017-01-11 東芝メディカルシステムズ株式会社 Image processing system, apparatus, method, and medical image diagnostic apparatus
US9786040B2 (en) * 2012-09-26 2017-10-10 Hitachi, Ltd. Ultrasound diagnostic apparatus and ultrasound two-dimensional cross-section image generation method
CN105433977B (en) * 2014-07-31 2020-02-07 东芝医疗系统株式会社 Medical imaging system, surgical guidance system, and medical imaging method

Also Published As

Publication number Publication date
JP6689666B2 (en) 2020-04-28
JP2017202125A (en) 2017-11-16
WO2017195540A1 (en) 2017-11-16
CN108697410A (en) 2018-10-23

Similar Documents

Publication Publication Date Title
CN108697410B (en) Ultrasonic imaging apparatus, image processing apparatus, and method thereof
JP6490820B2 (en) Ultrasonic imaging apparatus, image processing apparatus, and method
KR101495528B1 (en) Ultrasound system and method for providing direction information of a target object
CN107106144B (en) Ultrasonic imaging apparatus and image processing apparatus
EP3832599A1 (en) Device for providing 3d image registration and method therefor
CN107106128B (en) Ultrasound imaging apparatus and method for segmenting an anatomical target
JP7321836B2 (en) Information processing device, inspection system and information processing method
US11847730B2 (en) Orientation detection in fluoroscopic images
US20230062672A1 (en) Ultrasonic diagnostic apparatus and method for operating same
US20160367221A1 (en) Ultrasound diagnosis apparatus
US20120065513A1 (en) 3d ultrasound system for extending view of image and method for operating the 3d ultrasound system
JP6887942B2 (en) Ultrasound imaging equipment, image processing equipment, and methods
CN112545551A (en) Method and system for medical imaging device
US20220249174A1 (en) Surgical navigation system, information processing device and information processing method
US11974883B2 (en) Ultrasound imaging apparatus, method of controlling the same, and computer program
JP6382031B2 (en) Ultrasonic diagnostic apparatus and control program therefor
CN112336375B (en) Ultrasonic diagnostic apparatus and ultrasonic image processing method
JP7027029B2 (en) Ultrasound diagnostic equipment and medical image processing equipment
JP6731369B2 (en) Ultrasonic diagnostic device and program
US11452495B2 (en) Apparatus and method for detecting a tool
JP2014212904A (en) Medical projection system
WO2024047143A1 (en) Ultrasound exam tracking
US20190271771A1 (en) Segmented common anatomical structure based navigation in ultrasound imaging
CN116543029A (en) Image registration method and related device based on ultrasonic probe
KR20200140683A (en) Apparatus and method for aligning ultrasound image and 3D medical image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211122

Address after: Chiba County, Japan

Patentee after: Fujifilm medical health Co.,Ltd.

Address before: Tokyo, Japan

Patentee before: Hitachi, Ltd.
