WO2005104955A1 - Ultrasonic Diagnostic Apparatus - Google Patents
Ultrasonic Diagnostic Apparatus
- Publication number
- WO2005104955A1 (PCT/JP2005/007926)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image data
- dimensional
- ultrasonic
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52074—Composite displays, e.g. split-screen displays; Combination of multiple images or of images and alphanumeric tabular information
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4444—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
- A61B8/445—Details of catheter construction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/56—Details of data transmission or power supply
- A61B8/565—Details of data transmission or power supply involving data transmission via a network
Definitions
- The present invention relates to an ultrasonic diagnostic apparatus that radially irradiates ultrasonic waves into a living body, receives the echoes of those ultrasonic waves to perform a scan, and displays and outputs an ultrasonic image of the body based on the ultrasonic image data obtained by the scan. Background art
- Conventionally, an ultrasonic diagnostic apparatus is known that irradiates ultrasonic waves into a living body and performs a sector scan, receiving the ultrasonic echoes from the living tissue to generate and output an ultrasonic image of the living body.
- With such an apparatus, the surgeon considers in advance the known anatomical positional relationships of the organs and tissues in the living body, estimates the position in the living body that is currently being observed, and observes the ultrasonic image to make a medical diagnosis of the living body.
- an ultrasonic diagnostic apparatus has been proposed which displays and outputs a guide image for guiding an anatomical position in a living body observed by an ultrasonic image.
- This ultrasonic diagnostic apparatus includes an extracorporeal ultrasonic probe for irradiating ultrasonic waves into the living body of an examinee and an anatomical chart database storing anatomical illustration images. It detects the position and posture of the ultrasonic transducer provided in the ultrasonic probe, selects from the anatomical chart database an illustration image that matches the anatomical position and posture based on the detected position and posture, and displays and outputs it. Further, before starting the diagnosis, the operator adjusts the contact position and contact posture of the ultrasonic probe so that the scanning plane of the ultrasonic probe matches a predetermined cross section (reference plane) of the living body, whereby an illustration image can be automatically selected from the anatomical chart database according to the position and posture of the ultrasonic transducer (see Patent Document 1).
- As one method of matching the reference plane with the scanning plane, the ultrasonic diagnostic apparatus displays and outputs a skeletal map composed on the basis of the physique information of the patient, and the operator adjusts the contact position and contact posture of the ultrasonic probe accordingly. Alternatively, the ultrasonic diagnostic apparatus outputs and displays a skeleton diagram corresponding to the current contact state of the ultrasonic probe, and the operator designates on the skeleton diagram the coordinates of the reference plane that matches the scanning plane of the ultrasonic probe.
- Based on these methods, the ultrasonic diagnostic apparatus matches the scanning plane of the ultrasonic probe with the reference plane on the skeletal diagram, and thereby associates the coordinate system of the ultrasonic probe (the coordinate system recognized by the ultrasonic diagnostic apparatus) with the coordinate system on the living body.
- Patent Document 1 JP-A-2002-263101
- However, the ultrasonic diagnostic apparatus described in Patent Document 1 has the following problems.
- First, this ultrasonic diagnostic apparatus is configured to automatically select and display, from a database containing multiple illustration images, the image closest to the observation position. Unless an illustration image that exactly matches the position of the scanning plane of the ultrasonic transducer has been stored in the anatomical chart database in advance, the selected illustration image is not a guide image that accurately represents the observation position.
- Second, as a first specific method for matching the scanning plane to a specific cross section (reference plane) of the living body, a skeleton diagram composed according to the physique information of the subject is displayed on the screen, a reference plane is arbitrarily designated on the skeleton diagram, and the contact of the probe is adjusted so that the scanning plane matches this reference plane.
- As a second specific method, the coordinates that match the scanning plane in the current contact state of the probe are designated on the skeleton diagram.
- However, Patent Document 1 does not disclose what the coordinate system of the probe and the coordinate system on the living body are, nor does it disclose a specific calculation method for matching the probe coordinate system with the living-body coordinate system. In other words, this ultrasonic diagnostic apparatus cannot be said to eliminate the inaccuracy in exactly matching the scanning plane to the reference plane or in specifying coordinates that match the scanning plane. As a result, similarly to the first problem, the selected illustration image cannot be said to be a guide image that accurately represents the observation position.
- The present invention has been made in view of the above circumstances, and its purpose is to provide an ultrasonic diagnostic apparatus capable of displaying and outputting a guide image that anatomically accurately corresponds to an ultrasonic image for observing the inside of a subject.
- To achieve this object, the present invention provides an ultrasonic diagnostic apparatus that performs a scan within the body of a subject to obtain two-dimensional image data of the body, detects the position and orientation of the scan plane of the scan, and generates and outputs a two-dimensional ultrasonic image of the body based on the detected position and orientation and the two-dimensional image data. The apparatus comprises:
- Image processing control means for generating a guide image corresponding to the anatomical position and orientation of the two-dimensional ultrasound image based on anatomical image data stored in advance as anatomical image data of a human body
- display means for simultaneously displaying and outputting a plurality of various images including the guide image and the two-dimensional ultrasonic image.
- input means for instructing and inputting a feature point indicating an anatomically characteristic position on the anatomical image data
- sample point detecting means for detecting, from positions on the subject, sample points anatomically corresponding to the feature points.
- The image processing control means generates, based on the feature points, the sample points, and the detected position and orientation, the guide image corresponding to the anatomical position and orientation of the two-dimensional ultrasonic image.
- Further, in the above invention, the image processing control means calculates a cutting plane of the anatomical image data based on the feature points, the sample points, and the detected position and orientation, generates cross-sectional image data corresponding to a cross-sectional image at the cutting plane based on the anatomical image data, and generates the guide image based on the cutting plane and the cross-sectional image data.
- Further, in the above invention, the input means inputs at least four of the feature points, and the sample point detecting means detects at least four sample points anatomically corresponding to the at least four feature points.
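The requirement of at least four point pairs can be motivated with a small sketch: four non-coplanar correspondences are exactly enough to determine a unique affine map between the subject (sample-point) space and the anatomical (feature-point) space. The landmark coordinates below are hypothetical, and the least-squares solve is only one possible way to recover such a map; the patent does not prescribe this particular computation.

```python
import numpy as np

# Hypothetical landmark coordinates: four points given both on the
# reference anatomical data (feature points) and as measured positions
# on the subject (sample points).
feature_pts = np.array([[0.0, 0.0, 0.0],
                        [10.0, 0.0, 0.0],
                        [0.0, 8.0, 0.0],
                        [0.0, 0.0, 6.0]])
sample_pts = np.array([[1.0, 2.0, 3.0],
                       [1.0, 2.0, 13.0],
                       [1.0, 10.0, 3.0],
                       [7.0, 2.0, 3.0]])

def affine_from_pairs(src, dst):
    """Solve dst = A @ src + t exactly from four non-coplanar pairs."""
    src_h = np.hstack([src, np.ones((4, 1))])      # 4x4 homogeneous source
    sol, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    A, t = sol[:3].T, sol[3]
    return A, t

A, t = affine_from_pairs(sample_pts, feature_pts)
# Any point measured in subject (sample) space can now be expressed in
# the anatomical (feature) space:
mapped = A @ sample_pts[2] + t
```

With fewer than four points, or four coplanar ones, the 4x4 system is singular and the map is not unique, which is consistent with the claim's minimum of four anatomically distinct landmarks.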
- Further, in the above invention, the apparatus comprises a probe inserted into a body cavity, and the sample point detecting means is provided at the tip of the probe and detects the sample points from inside the body cavity of the subject.
- Further, the probe includes optical observation means for obtaining an optical image of the inside of the body cavity of the subject; the display means displays the optical image obtained by the optical observation means; and the sample points are detected from inside the body cavity of the subject while the display means is displaying the optical image.
- Further, the image processing control means sets a feature-point three-axis coordinate system on the anatomical image data based on the feature points, sets on the two-dimensional image data a sample-point three-axis coordinate system anatomically corresponding to the feature-point three-axis coordinate system, converts the detected position and orientation into a position and orientation on the sample-point three-axis coordinate system, further converts the converted position and orientation into a position and orientation on the feature-point three-axis coordinate system, and thereby calculates the cutting plane.
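A minimal sketch (not the patent's disclosed formulas) of the two three-axis coordinate systems described above: one orthonormal frame is built from landmark points in each space, and a position measured in the sample-point frame is re-expressed in the feature-point frame. All coordinates are hypothetical.

```python
import numpy as np

def frame_from_points(p0, p1, p2):
    """Orthonormal frame: origin p0, x-axis toward p1, z = x × (p2 - p0)."""
    x = p1 - p0
    x = x / np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return p0, np.column_stack([x, y, z])   # origin, 3x3 rotation matrix

def to_local(origin, R, p):
    """World coordinates -> coordinates in the (origin, R) frame."""
    return R.T @ (p - origin)

def from_local(origin, R, q):
    """Frame coordinates -> world coordinates."""
    return origin + R @ q

# feature-point frame on the anatomical image data (hypothetical)
f0, fR = frame_from_points(np.array([0., 0., 0.]),
                           np.array([1., 0., 0.]),
                           np.array([0., 1., 0.]))
# anatomically corresponding sample-point frame on the subject
s0, sR = frame_from_points(np.array([5., 5., 0.]),
                           np.array([5., 6., 0.]),
                           np.array([4., 5., 0.]))

detected = np.array([5., 7., 0.])            # detected scan-plane center
local = to_local(s0, sR, detected)           # into sample-point frame
guide_pos = from_local(f0, fR, local)        # out into feature-point frame
```

The same two conversions applied to the scan plane's normal and orientation vectors yield the cutting plane of the anatomical image data.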
- Further, in the above invention, the anatomical image data is a plurality of slice image data corresponding to slice images of human-body transverse sections perpendicular to the body axis; the feature points are instructed and input via the input means on a slice image displayed as an anatomical image; and the image processing control means creates the cross-sectional image data by interpolating the intersection lines between the cutting plane and the plurality of slice image data.
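A simplified sketch of building cross-sectional image data from a stack of transverse slices: the cutting plane is sampled on a regular grid, and each sampled point is filled by linear interpolation between the two neighbouring slices. The interpolation scheme here is an assumption for illustration; the patent's own interpolation details are not reproduced.

```python
import numpy as np

def cross_section(slices, origin, u, v, shape):
    """slices: (nz, ny, nx) stack; plane point = origin + i*u + j*v."""
    nz, ny, nx = slices.shape
    out = np.zeros(shape)
    for i in range(shape[0]):
        for j in range(shape[1]):
            x, y, z = origin + i * u + j * v
            if not (0 <= x <= nx - 1 and 0 <= y <= ny - 1 and 0 <= z <= nz - 1):
                continue                      # point lies outside the stack
            z0 = int(z)
            z1 = min(z0 + 1, nz - 1)
            w = z - z0                        # weight between adjacent slices
            xi, yi = int(round(x)), int(round(y))
            out[i, j] = (1 - w) * slices[z0, yi, xi] + w * slices[z1, yi, xi]
    return out

# toy stack whose voxel value equals its slice index, so the
# interpolated value along an oblique plane is easy to predict
vol = np.arange(4)[:, None, None] * np.ones((4, 8, 8))
# oblique cutting plane rising half a slice per step in i
img = cross_section(vol, origin=np.array([0., 0., 0.]),
                    u=np.array([1., 0., 0.5]), v=np.array([0., 1., 0.]),
                    shape=(7, 8))
```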
- Further, the sample point detecting means includes reference sample point detecting means which is arranged on a body surface of the subject and detects a sample point corresponding to an anatomically characteristic position near the body surface, and the image processing control means sets the sample-point three-axis coordinate system with the reference sample point detected by the reference sample point detecting means as its origin.
- Further, the reference sample point detecting means also detects the orientation of the detected reference sample point, and the image processing control means corrects, using the detected position and orientation of the reference sample point, the coordinates of the four sample points, which fluctuate due to changes in the body position of the subject.
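A hedged sketch of the correction described above: if the sample points are stored relative to the reference sample point's position and orientation, then when the subject's body position changes (moving the body-surface reference marker with it), the sample points can be re-derived in the new pose. The rotation and offsets below are invented for illustration.

```python
import numpy as np

def store_relative(points, ref_pos, ref_R):
    """Express each point in the reference marker's local frame."""
    return [ref_R.T @ (p - ref_pos) for p in points]

def restore(rel_points, ref_pos, ref_R):
    """Re-derive world coordinates from the marker's new pose."""
    return [ref_pos + ref_R @ q for q in rel_points]

pts = [np.array([1., 0., 0.]), np.array([0., 2., 0.])]
R0 = np.eye(3)                               # initial marker orientation
rel = store_relative(pts, np.array([0., 0., 0.]), R0)

# subject rolls 90 degrees about the z axis and shifts by (0, 0, 5)
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R1 = np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])
corrected = restore(rel, np.array([0., 0., 5.]), R1)
```

Because the marker rides on the body surface, its detected pose change stands in for the body-position change, and the stored relative coordinates remain valid.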
- Further, the apparatus comprises image generating means for generating the anatomical image data from a desired human body, and the anatomical image data is transmitted from the image generating means to the image processing control means.
- Further, the anatomical image data is anatomical three-dimensional image data of a human body; the input means designates a cutting position of the three-dimensional image data; and the image processing control means creates the cross-sectional image data by cutting the three-dimensional image data at the cutting plane.
- the present invention is characterized in that, in the above invention, the image generating means includes an X-ray CT device, an MRI device, or a PET device.
- The present invention is characterized in that, in the above invention, each sample point anatomically corresponds to any one of the xiphoid process, the right end of the pelvis, the pylorus, the duodenal papilla, and the cardia.
- the present invention is characterized in that, in the above invention, the anatomical image data is classified in advance by region.
- the present invention is characterized in that, in the above invention, the anatomical image data is classified by color in advance for each region.
- Further, the input means inputs a rotation angle of the two-dimensional ultrasonic image or the guide image about the image center, and the image processing control means sequentially generates and outputs two-dimensional ultrasonic images rotated by the rotation angle within the plane perpendicular to the normal direction of the two-dimensional ultrasonic image, without changing that normal direction.
- Alternatively, guide images rotated by the rotation angle are sequentially generated and output within the plane perpendicular to the normal direction of the guide image, without changing that normal direction.
- Alternatively, the image processing control means sequentially generates and outputs both: guide images rotated by the rotation angle within the plane perpendicular to the normal direction of the guide image, and two-dimensional ultrasonic images rotated by the rotation angle within the plane perpendicular to the normal direction of the two-dimensional ultrasonic image, without changing either normal direction.
- the present invention is characterized in that, in the above invention, the input means inputs a rotation angle that changes according to an input amount of the input means.
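The in-plane rotation described above can be sketched as follows: the image is resampled about its centre by the input angle, while the plane's normal (and hence the anatomy being imaged) is unchanged. Nearest-neighbour resampling is an assumption chosen to keep the example short.

```python
import numpy as np

def rotate_in_plane(img, angle_rad):
    """Rotate a 2D image about its centre; the image plane itself is fixed."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img)
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    for y in range(h):
        for x in range(w):
            # inverse mapping: where did this output pixel come from?
            sx = c * (x - cx) + s * (y - cy) + cx
            sy = -s * (x - cx) + c * (y - cy) + cy
            xi, yi = int(round(sx)), int(round(sy))
            if 0 <= xi < w and 0 <= yi < h:
                out[y, x] = img[yi, xi]
    return out

img = np.zeros((5, 5))
img[0, 2] = 1.0                     # mark the 12 o'clock direction
quarter = rotate_in_plane(img, np.pi / 2)
```

Applying the same angle to both the two-dimensional ultrasonic image and the guide image keeps the two displays anatomically aligned as the operator rotates them.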
- Further, the scan is a radial scan; the center position and normal direction of the scanning plane of the radial scan are detected, together with default orientation data regarding a direction perpendicular to the normal direction; and the image processing control means anatomically matches the orientation of the two-dimensional ultrasonic image with the orientation of the guide image based on the rotation angle, the detected center position and normal direction, and the default orientation data.
- Further, the input means further inputs specific information for specifying a two-dimensional ultrasonic image; the image processing control means associates each piece of input specific information with the corresponding two-dimensional ultrasonic image and guide image, searches for a two-dimensional ultrasonic image based on newly input specific information, and displays on the display means the retrieved two-dimensional ultrasonic image and the guide image associated with it.
- Further, the apparatus comprises insertion shape detecting means for detecting the insertion shape of the insertion portion of the probe that performs the scan in the body, and the image processing control means displays an insertion shape image indicating the insertion shape on the display means together with the two-dimensional ultrasonic image and the guide image.
- Further, the apparatus comprises an electronic radial scan type probe in which a plurality of ultrasonic transducers are arranged in a ring and transmit and receive ultrasonic waves in a predetermined order in the body to perform the scan.
- According to the present invention, two-dimensional image data obtained by imaging a cross section of the inside of a subject, and guide image data that anatomically accurately corresponds to the position and orientation of the two-dimensional image data, can be easily created in real time, so that it is possible to realize an ultrasonic diagnostic apparatus capable of sequentially displaying and outputting, in real time, the two-dimensional ultrasonic image corresponding to the two-dimensional image data together with the corresponding guide image.
- Thus, the operator can check the two-dimensional ultrasonic image and the guide image simultaneously and, by referring to, for example, the color-coded organ images indicated by the guide image, can accurately and easily recognize which anatomical position of the subject the current two-dimensional ultrasonic image shows. This makes it possible to easily find a region of interest, such as a lesion, in the subject and to observe it accurately, so that medical diagnosis of the subject can be performed accurately and efficiently. This is far more medically useful than an ultrasonic diagnostic apparatus that irradiates ultrasonic waves from outside the body of the subject, particularly in shortening the examination time and greatly reducing the training time for beginner operators.
- FIG. 1 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a schematic diagram schematically illustrating one embodiment of a marker coil and one embodiment of a plate.
- FIG. 3 is a schematic diagram schematically illustrating a setting state of a rectangular coordinate system in a receiving coil.
- FIG. 4 is a flowchart illustrating processing steps until a two-dimensional ultrasonic image and a guide image are arranged and displayed on the same screen and output.
- FIG. 5 is a schematic diagram schematically showing a display example in which a two-dimensional ultrasonic image and a guide image are displayed and output side by side on the same screen.
- FIG. 6 is a flowchart exemplifying processing steps until a feature point setting process is achieved.
- FIG. 7 is a schematic diagram illustrating an operation of setting a feature point in slice image data.
- FIG. 8 is a flowchart illustrating processing steps until a sample point setting process is achieved.
- FIG. 9 is a flowchart illustrating processing steps up to achieving guide image creation processing.
- FIG. 10 is a schematic diagram illustrating a relationship between a two-dimensional image plane and a three-axis coordinate system based on sample points.
- FIG. 11 is a schematic diagram illustrating an operation of calculating a guide image plane and its position data.
- FIG. 12 is a block diagram showing a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 2 of the present invention.
- FIG. 13 is a flowchart exemplifying a processing flow up to achieving feature point setting processing using volume data.
- FIG. 14 is a schematic diagram illustrating an operation of setting a cross section of volume data.
- FIG. 15 is a schematic diagram schematically illustrating a state in which volume data and a cross-sectional image are displayed and output side by side on the same screen.
- FIG. 16 is a block diagram showing a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 3 of the present invention.
- FIG. 17 is a schematic diagram schematically illustrating a state in which a guide image and a two-dimensional ultrasonic image subjected to rotation processing are displayed and output on the same screen.
- FIG. 18 is a schematic diagram schematically illustrating a state in which a two-dimensional ultrasonic image and a guide image subjected to a rotation process are output and displayed on the same screen.
- FIG. 19 is a schematic diagram schematically illustrating a state in which a two-dimensional ultrasound image subjected to rotation processing and a guide image subjected to rotation processing are displayed and output on the same screen.
- FIG. 20 is a block diagram showing a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 4 of the present invention.
- FIG. 21 is a block diagram showing a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 5 of the present invention.
- FIG. 22 is a block diagram showing a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 6 of the present invention.
- FIG. 23 is a schematic diagram schematically illustrating a screen display example of a two-dimensional ultrasonic image, a guide image, and an insertion shape image at the same timing.
- FIG. 24 is a block diagram illustrating an example of a configuration of an ultrasonic diagnostic apparatus according to Embodiment 7 of the present invention.
- FIG. 25 is a schematic diagram schematically illustrating one configuration of the tip of an electronic radial scan probe.
- FIG. 1 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 1 of the present invention.
- As shown in FIG. 1, an ultrasonic diagnostic apparatus 1 includes a probe 2 having an insertion section 3 inserted into a subject and an operation section 4 for operating the insertion section 3, an ultrasonic observation device 5, a position data calculation device 6, a transmission coil 7 having a plurality of coils, a marker coil 8, a plate 9 having a plurality of coils, a reception coil 10, an input device 11, a display device 12, an image processing device 13, and an optical observation device 17.
- An ultrasonic transducer 3a is rotatably incorporated at the insertion-side end of the insertion section 3, and an operation section 4 is disposed at the rear end of the insertion section 3.
- a transmission coil 7 is detachably disposed near the ultrasonic transducer 3a.
- the insertion section 3 has a shaft 3b serving as a rotation axis of the ultrasonic transducer 3a, and the operation section 4 has a motor 4a and a rotary encoder 4b.
- The motor 4a is connected to the ultrasonic transducer 3a via the shaft 3b.
- the rotary encoder 4b is connected to the motor 4a.
- the ultrasonic observation device 5 is electrically connected to the ultrasonic transducer 3a, the motor 4a, and the rotary encoder 4b via a power switch (not shown) provided on the operation unit 4, a cable, and the like.
- The position data calculation device 6 is electrically connected to the transmission coil 7, the marker coil 8, the plate 9, and the reception coil 10 via cables or the like. Furthermore, the probe 2 is provided with an optical observation window 3c realized using a cover glass, a lens 3d, a CCD (Charge Coupled Device) camera 3e, an illumination light irradiation window (not shown) for irradiating illumination light into the body cavity, and the like.
- the CCD camera 3e is electrically connected to the optical observation device 17 via a cable or the like.
- the image processing device 13 is electrically connected to the ultrasonic observation device 5, the position data calculation device 6, the input device 11, the display device 12, and the optical observation device 17 via a cable or the like.
- The probe 2 performs a radial scan inside the subject by rotating the ultrasonic transducer 3a and repeatedly transmitting and receiving ultrasonic waves while the insertion section 3 is inserted into the subject.
- the insertion section 3 is realized using a flexible member, and has an elongated cylindrical shape suitable for insertion into a subject.
- The ultrasonic transducer 3a is realized using a piezoelectric ceramic such as barium titanate or lead zirconate titanate, and converts the pulse voltage applied from the ultrasonic observation device 5 into ultrasonic waves by the inverse piezoelectric effect.
- The shaft 3b is a flexible shaft and functions as a flexible rotating shaft that transmits the rotational drive of the motor 4a to the ultrasonic transducer 3a. That is, the ultrasonic transducer 3a rotates about an axis substantially coaxial with the direction in which the insertion section 3 is inserted into the subject.
- The operation section 4 has the function of bending the distal end of the insertion section 3, including the portion where the ultrasonic transducer 3a and the transmission coil 7 are arranged, according to the operation of the operator, that is, the surgeon who performs in-vivo observation or diagnosis of the subject.
- When the operator operates the power switch of the operation section 4 to turn on the power, or when a predetermined command is input from the input device 11 via the image processing device 13, the ultrasonic transducer 3a, the motor 4a, and the rotary encoder 4b are electrically connected to the ultrasonic observation device 5.
- In this state, the ultrasonic observation device 5 can apply a pulse voltage of about 100 V to the ultrasonic transducer 3a, apply a DC drive voltage of about 12 V to the motor 4a, and receive an electric signal from the rotary encoder 4b.
- the motor 4a performs a rotational drive using the DC drive voltage applied from the ultrasonic observation device 5, and transmits the rotational drive to the ultrasonic transducer 3a via the shaft 3b.
- the motor 4a rotates the ultrasonic transducer 3a around the shaft 3b as a rotation axis.
- the rotary encoder 4b detects the rotation angle of the rotation drive by the motor 4a and outputs an electric signal (angle detection signal) corresponding to the detected rotation angle to the ultrasonic observation device 5.
- the optical observation window 3c is arranged near the ultrasonic transducer 3a or the transmission coil 7, for example, at a position about 0.5 cm away from the transmission coil 7. Illumination light is emitted from the above-described illumination light irradiation window (not shown) to illuminate the inside of the subject.
- the image of the lumen surface in the body of the subject is formed on the CCD camera 3e from the optical observation window 3c via the lens 3d.
- The CCD camera 3e outputs an electric signal (CCD signal) corresponding to the formed image to the optical observation device 17.
- The optical observation device 17 creates image data of the lumen surface inside the body cavity of the subject based on the CCD signal from the CCD camera 3e, and outputs this data as optical image data to the control section 16 in the image processing device 13.
- The ultrasonic observation device 5 is configured using a detection circuit, an amplification circuit, an A/D conversion circuit, a coordinate conversion circuit, and the like.
- The ultrasonic observation device 5 uses the scan signals sequentially received from the ultrasonic transducer 3a and the angle detection signal received from the rotary encoder 4b, and performs well-known processes such as envelope detection, logarithmic amplification, A/D conversion, and coordinate conversion from the polar coordinate system to the orthogonal coordinate system. Thereby, the ultrasonic observation device 5 creates one frame of two-dimensional image data for each sequentially received scan signal, that is, for each of the above-described radial scans.
- the ultrasonic observation device 5 sequentially transmits the created two-dimensional image data to the image processing device 13.
- The two-dimensional image data is digital image data corresponding to a two-dimensional ultrasonic image of the inside of the subject obtained by this radial scan, and a reference direction based on the above-described angle detection signal is set in a direction parallel to the two-dimensional image plane. Hereinafter, the reference direction will be described as the 12 o'clock direction, that is, the upward direction of the two-dimensional ultrasonic image.
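The polar-to-orthogonal coordinate conversion performed per radial scan can be sketched as follows: echo amplitudes sampled along each beam angle (taken from the rotary encoder) are resampled onto a Cartesian image, with angle 0 mapped to the 12 o'clock (upward) reference direction. The nearest-neighbour lookup and array sizes are assumptions for illustration, not the device's actual circuit implementation.

```python
import numpy as np

def scan_convert(polar, size):
    """polar: (n_angles, n_samples) echo data; returns a size x size image."""
    n_ang, n_smp = polar.shape
    img = np.zeros((size, size))
    c = (size - 1) / 2.0
    for y in range(size):
        for x in range(size):
            dx, dy = x - c, y - c
            r = np.hypot(dx, dy) / c * (n_smp - 1)   # radius -> sample index
            # angle measured clockwise from 12 o'clock (upward on screen)
            theta = np.arctan2(dx, -dy) % (2 * np.pi)
            a = int(theta / (2 * np.pi) * n_ang) % n_ang
            ri = int(round(r))
            if ri < n_smp:
                img[y, x] = polar[a, ri]
    return img

# toy data: only beam 0 (the 12 o'clock beam) carries amplitude 1
polar = np.zeros((64, 32))
polar[0, :] = 1.0
img = scan_convert(polar, 65)
```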
- The position data calculation device 6 excites each coil constituting the transmission coil 7, the marker coil 8, and the plate 9 at a different frequency, and then receives from the reception coil 10 an electric signal corresponding to each alternating magnetic field generated by each of those coils.
- The reception coil 10 detects each alternating magnetic field generated from each coil constituting the transmission coil 7, the marker coil 8, and the plate 9, converts the detected alternating magnetic fields into electric signals, and transmits the signals to the position data calculation device 6 as position detection signals.
- The position data calculation device 6 decomposes each position detection signal received from the reception coil 10 by frequency, thereby separating the contribution of each alternating magnetic field. That is, from the received position detection signals, the position data calculation device 6 obtains the position detection signal resulting from the alternating magnetic field of each of the coils constituting the transmission coil 7, the marker coil 8, and the plate 9. Then, the position data calculation device 6 calculates data (position data) on the position and orientation of each of the transmission coil 7, the marker coil 8, and the plate 9 based on the obtained position detection signals, and transmits the position data to the image processing device 13.
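The frequency-division principle above can be illustrated with a small simulation: because each source coil is excited at a distinct known frequency, the amplitude of each coil's field can be recovered from the single receive-coil signal by correlating it with a matching reference sinusoid (in practice this would be lock-in detection or an FFT). The frequencies and amplitudes below are invented for the example.

```python
import numpy as np

fs = 10_000.0                                # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)                # 0.1 s observation window
coil_freqs = {"transmission": 400.0, "marker": 700.0, "plate": 1100.0}
true_amps = {"transmission": 0.8, "marker": 0.3, "plate": 0.5}

# combined signal seen by the receiving coil: sum of all coil fields
signal = sum(a * np.sin(2 * np.pi * coil_freqs[k] * t)
             for k, a in true_amps.items())

def amplitude_at(signal, t, f):
    """Recover the amplitude of the component at frequency f (lock-in style)."""
    i = 2 / len(t) * np.sum(signal * np.sin(2 * np.pi * f * t))
    q = 2 / len(t) * np.sum(signal * np.cos(2 * np.pi * f * t))
    return np.hypot(i, q)

recovered = {k: amplitude_at(signal, t, f) for k, f in coil_freqs.items()}
```

Since the chosen frequencies complete whole numbers of cycles within the window, the references are orthogonal and each coil's amplitude is recovered independently of the others.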
- the transmission coil 7 is realized using a first coil whose coil winding axis (coil axis) is fixed in the direction of the insertion axis of the insertion section 3 into the subject, that is, the direction of the rotation axis of the ultrasonic transducer 3a, and a second coil whose coil axis is fixed, in a direction perpendicular to the insertion axis direction, to the reference direction based on the angle detection signal from the rotary encoder 4b, that is, the 12 o'clock direction of the two-dimensional ultrasonic image.
- since the transmitting coil 7 is fixed so that its distance and orientation with respect to the ultrasonic transducer 3a are substantially constant, the positions and orientations of the first coil and the second coil are substantially fixed with respect to the ultrasonic transducer 3a.
- the transmission coil 7 generates an alternating magnetic field when the position data calculation device 6 supplies a current to the first coil and the second coil.
- therefore, the position data calculation device 6 can obtain position data on the position and orientation of the ultrasonic transducer 3a based on the position detection signals corresponding to the alternating magnetic fields from the first coil and the second coil.
- when the transmitting coil 7 is disposed near the ultrasonic transducer 3a, it may be removably attached to the outer wall of the insertion section 3, but is preferably removably inserted into the insertion section 3.
- the marker coil 8 has one built-in coil for converting a current supplied from the position data calculation device 6 into a predetermined alternating magnetic field, and has a stick shape. This coil is provided on the tip side of the stick shape of the marker coil 8.
- the marker coil 8 generates an alternating magnetic field indicating a position near the body surface when a current is supplied from the position data calculation device 6 in a state of being in contact with the body surface of the subject.
- the position data calculation device 6 can obtain position data relating to the contact position of the body surface of the subject based on the position detection signal corresponding to the alternating magnetic field from the marker coil 8.
- the plate 9 has three built-in coils for converting a current supplied from the position data calculating device 6 into predetermined alternating magnetic fields, and has a plate shape, such as an elliptical shape, that easily adheres to the body surface of the subject.
- when a current is supplied from the position data calculating device 6 while the plate 9 is in contact with the body surface of the subject, the plate 9 generates alternating magnetic fields indicating a position near the body surface.
- the three coils of the plate 9 are arranged in the plate 9 such that their coil axes do not coincide with one another.
- for the plate 9, an orthogonal coordinate system x"y"z" is set with an origin O" at which the x" axis, the y" axis, and the z" axis are mutually orthogonal.
- the orthogonal coordinate system x"y"z" is fixed to the plate 9, and when the plate 9 itself moves, the orthogonal coordinate system x"y"z" moves with it. The setting of the orthogonal coordinate system x"y"z" will be described later.
- the receiving coil 10 is realized using a plurality of coils; as described above, it detects the alternating magnetic fields generated from the transmitting coil 7, the marker coil 8, and the plate 9, converts the detected alternating magnetic fields into position detection signals, and transmits them to the position data calculation device 6.
- for the receiving coil 10, an orthogonal coordinate system xyz is set with an origin O at which the three axes of the x axis, the y axis, and the z axis are orthogonal.
- this orthogonal coordinate system xyz is fixed to the receiving coil 10. Since the receiving coil 10 does not move in the subsequent operation, this orthogonal coordinate system xyz is a coordinate system whose origin and orientation are fixed in space.
- the orthogonal coordinate system xyz is the coordinate system used to express the position data calculated by the position data calculation device 6, that is, each position in the subject's body detected using the transmission coil 7, the marker coil 8, and the plate 9, and the position and orientation of the ultrasonic transducer 3a.
- the setting of the orthogonal coordinate system xyz for the receiving coil 10 will be described later.
- the input device 11 is realized by using a keyboard, a touch panel, a trackball, a mouse, a joystick, or the like, singly or in combination, and is used to input to the image processing device 13 instruction information concerning the above-described radial scan or the various image displays on the display device 12.
- for example, when a keyboard or a touch panel is used, the operator inputs desired instruction information or coordinate information by typing it in, or by directly selecting an item from an information menu or a coordinate position displayed on the screen of the display device 12 or on the touch panel.
- when a trackball, a mouse, or a joystick is used, desired instruction information or coordinate information is input by moving a cursor or the like displayed on the screen of the display device 12 to the desired menu option or coordinate position and performing a click operation.
- the image processing device 13 is realized by using a well-known computer, and has an image storage unit 14, a display circuit 15, and a control unit 16.
- the image storage unit 14 is realized using various types of IC memories such as an EEPROM or a flash memory, a hard disk drive, a magneto-optical disk drive, or other storage devices capable of writing and reading data.
- under the control of the control unit 16, the image storage unit 14 stores various image data, such as two-dimensional image data input from the control unit 16 or guide image data to be described later. At this time, the image storage unit 14 can store the position data associated with each piece of image data together with that image data.
- the image storage unit 14 also transmits the stored image data and the like to the control unit 16 under the control of the control unit 16.
- the image storage unit 14 stores in advance a slice image data group including a plurality of slice image data, which are anatomical image data of a biological section.
- for this slice image data group, an orthogonal coordinate system x'y'z' with an origin O', at which the three axes of the x' axis, the y' axis, and the z' axis are orthogonal, is set in advance. That is, the slice image data group is stored in the image storage unit 14 in a state of being arranged on the orthogonal coordinate system x'y'z'.
- the control unit 16 can read out the slice image data or the slice image data group associated with the rectangular coordinate system x'y'z '.
- the slice image data are, for example, square photograph data about 40 cm on a side, obtained by slicing a frozen human body other than the subject in parallel at a 1 mm pitch, classifying the pixels of the photograph data by organ, and then color-coding them organ by organ. One side of this photograph data is set to about 40 cm because this size is large enough to cover an entire cross section of the human body perpendicular to the body axis.
- under the control of the control unit 16, the display circuit 15 performs D/A conversion processing and the like on the various image data input from the control unit 16, converting them into an image signal that can be displayed on the screen of the display device 12. The display circuit 15 then transmits this image signal to the display device 12.
- the display device 12 displays and outputs one or more of various images corresponding to the various image data by arranging or switching the images based on the image signal received from the display circuit 15. For example, the display device 12 displays the two-dimensional ultrasonic image corresponding to the two-dimensional image data by receiving an image signal corresponding to the two-dimensional image data created by the ultrasonic observation device 5 from the display circuit 15. Output.
- the display device 12 receives from the display circuit 15 an image signal corresponding to the optical image data created by the optical observation device 17, and thereby displays and outputs, for example, an optical image of a luminal surface in the subject corresponding to the optical image data. Under the control of the control unit 16, these images are displayed and output singly or in plurality, arranged side by side or switched.
- the control unit 16 is realized by using a ROM in which various data such as a processing program are stored in advance, a RAM (storage unit 16a) that temporarily stores operation parameters and data, and a CPU that executes this processing program.
- the control unit 16 controls various operations of the ultrasonic diagnostic apparatus 1 and the image display operation of the display device 12 regarding the radial scan described above.
- the control unit 16 performs input/output control of the various types of information exchanged with the ultrasonic observation device 5, the position data calculation device 6, and the input device 11, and controls the operation of the ultrasonic observation device 5 and the operation of each component of the image processing device 13.
- the storage unit 16a temporarily stores, under the control of the control unit 16, various image data, various types of information input from the input device 11, and various types of position data input from the position data calculation device 6.
- the control unit 16 further includes a timer 16b, an image composition unit 16c, a mixing unit 16d, and a correction unit 16e.
- the timer 16b functions to notify the control unit 16 of a time t at a predetermined timing under the control of the control unit 16.
- for example, the timer 16b notifies the control unit 16 of the time at which the control unit 16 receives the sample point setting instruction information described below from the input device 11 or transmits it to the ultrasonic observation device 5, and of the time at which the control unit 16 receives two-dimensional image data.
- the image forming unit 16c functions to set points (hereinafter, feature points) on the orthogonal coordinate system x'y'z' set for the slice image data group described above, based on the coordinate information input from the input device 11. Specifically, the operator inputs feature point coordinate information using the input device 11 while checking the slice image on the display device 12, and the control unit 16 detects the input feature point coordinate information. Under the control of the control unit 16, the image forming unit 16c then sets a feature point at the coordinates on the orthogonal coordinate system x'y'z' indicated by the feature point coordinate information.
- the image forming unit 16c sets the three-axis coordinate system P1'P2'P3' using four points (feature points P0', P1', P2', P3') among the feature points, of which at least four are set.
- the feature points are desirably set at anatomically distinctive sites, such as the xiphoid process, the right or left end of the pelvis, the pylorus, the duodenal papilla (the opening of the common bile duct into the duodenum), or the cardia.
- the feature point coordinate information is coordinate information for setting such a feature point as a point on the orthogonal coordinate system x'y'z'.
- the three-axis coordinate system P1'P2'P3' takes one of the four feature points (for example, the feature point P0') as its origin, and its three axes are determined by the vectors connecting that origin to the remaining feature points P1', P2', P3'. Therefore, the three axes are not necessarily orthogonal to one another.
- the image forming unit 16c also functions to set points on the orthogonal coordinate system xyz set for the receiving coil 10 described above, based on information input from the input device 11. Specifically, while touching the marker coil 8 or the plate 9 to the body surface of the subject, or while operating the probe 2 and checking the optical image on the display device 12, the operator inputs instruction information (sample point setting instruction information) instructing the setting of a point using the input device 11, and the control unit 16 detects the input sample point setting instruction information. Under the control of the control unit 16, the image forming unit 16c then sets this point (hereinafter, sample point) on the orthogonal coordinate system xyz based on the sample point setting instruction information, using the position data input from the position data calculation device 6. Thereafter, the image forming unit 16c sets the three-axis coordinate system P1P2P3 using at least four of the set sample points (sample points P0, P1, P2, P3).
- the sample points P0, P1, P2, P3 correspond to the feature points P0', P1', P2', P3' described above. For example, if the sample point P1 is the right end of the subject's pelvis, the feature point P1' is the point indicating the right end of the pelvis in the slice image data group; if the sample point P2 is the subject's pylorus, the feature point P2' is the point indicating the pylorus in the slice image data group.
- each sample point has, on the orthogonal coordinate system xyz, coordinate components corresponding to the position data from the transmission coil 7 (that is, the position data of the ultrasonic transducer 3a), the position data from the marker coil 8, or the position data from the plate 9. The three-axis coordinate system P1P2P3 takes one of the sample points (for example, the sample point P0) as its origin, and its three axes are determined by the vectors connecting the origin (that is, the sample point P0) to the remaining sample points P1, P2, P3. That is, the three axes are not necessarily orthogonal to one another.
- when the control unit 16 receives two-dimensional image data from the ultrasonic observation apparatus 5, this reception triggers the image forming unit 16c, under the control of the control unit 16, to associate the two-dimensional image data with the position data from the position data calculation device 6. For example, the image forming unit 16c associates with the two-dimensional image data the position data received from the position data calculation device 6 at substantially the same timing as the control unit 16 received the two-dimensional image data. In this way, the image forming unit 16c can set the position and orientation of the two-dimensional image data.
- using the position detection signals based on the alternating magnetic fields from the transmission coil 7, the position data calculation device 6 calculates, with respect to the orthogonal coordinate system xyz, the direction components of the position vector OC(t) of the position C(t) of the first coil, of the direction vector V(t) corresponding to the coil axis direction of the first coil, and of the direction vector V12(t) corresponding to the coil axis direction of the second coil fixed in the 12 o'clock direction described above.
- the position data calculation device 6 transmits the obtained direction components of the position vector OC(t), the direction vector V(t), and the direction vector V12(t) to the control unit 16 as position data.
- the image forming unit 16c associates the two-dimensional image data received by the control unit 16 at substantially the same timing with the position data.
- the two-dimensional image plane of the two-dimensional image data corresponds to the above-described radial scan running surface.
- the position vector OC (t) can be considered as a position vector of the rotation center of the ultrasonic transducer 3a.
- since the coil axis of the first coil is fixed in the insertion axis direction, the direction vector V(t) can be regarded as the vector in the direction perpendicular to the two-dimensional ultrasonic image, that is, its normal vector. Likewise, since the coil axis of the second coil is fixed in the 12 o'clock direction, the direction vector V12(t) can be regarded as the 12 o'clock direction vector of the two-dimensional ultrasonic image.
- the center position C(t), the position vector OC(t), the direction vector V(t), and the direction vector V12(t) are, respectively, the coordinates, position vector, normal vector, and 12 o'clock direction vector at the time t, and their coordinate or direction components change as the position and orientation of the tip of the insertion section 3 change with the passage of the time t.
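The geometry above admits a compact numeric sketch: given the center position C(t), the normal vector V(t), and the 12 o'clock vector of the image plane, any in-plane offset can be mapped into the fixed orthogonal coordinate system xyz. This is an illustrative sketch only, not the patent's implementation; the function name, the millimetre units, and the handedness chosen for the 3 o'clock axis are assumptions.

```python
import numpy as np

def pixel_to_world(oc, v_normal, v12, u_mm, r_mm):
    """Map an in-plane point of the 2D ultrasonic image into the fixed
    orthogonal coordinate system xyz.

    oc       : position vector OC(t) of the image-plane center C(t)
    v_normal : normal vector V(t) of the image plane
    v12      : 12 o'clock direction vector of the image plane
    u_mm     : offset along the 12 o'clock direction [mm]
    r_mm     : offset along the (assumed) 3 o'clock direction [mm]
    """
    v_normal = v_normal / np.linalg.norm(v_normal)
    v12 = v12 / np.linalg.norm(v12)
    # The 3 o'clock direction lies in the image plane, perpendicular to
    # the 12 o'clock direction; the sign convention here is an assumption.
    v3 = np.cross(v12, v_normal)
    return oc + u_mm * v12 + r_mm * v3

# Example: plane centered at the origin, normal along z, 12 o'clock along y.
p = pixel_to_world(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                   np.array([0.0, 1.0, 0.0]), 10.0, 5.0)
```

With these example vectors the offset of 10 mm toward 12 o'clock and 5 mm toward 3 o'clock lands at (5, 10, 0) in the fixed frame.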
- the image construction unit 16c also functions to create guide image data using the slice image data group read from the image storage unit 14 and the various position data input from the position data calculation device 6. Specifically, the image construction unit 16c determines the plane (hereinafter, guide image plane) that has, with respect to the three-axis coordinate system P1'P2'P3', the same position and orientation as the two-dimensional image plane, uniquely determined by the position vector OC(t), the direction vector V(t), and the direction vector V12(t), has with respect to the three-axis coordinate system P1P2P3.
- the control unit 16 reads a part of the slice image data group corresponding to the guide image plane from the image storage unit 14.
- the image forming unit 16c performs an interpolation process or a coordinate conversion process on the read data, and creates guide image data corresponding to a cross-sectional image obtained by cutting the slice image data group of the image storage unit 14 along the guide image plane. The guide image data is then input to the mixing unit 16d.
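The interpolation and coordinate conversion step can be pictured as resampling the stacked slice image data group along the guide image plane. The following is a minimal sketch under assumed conventions: a voxel volume stacked at 1 mm pitch, nearest-neighbour sampling rather than whatever interpolation kernel the device actually uses, and invented function and variable names.

```python
import numpy as np

def extract_oblique_slice(volume, origin, e_u, e_v, out_shape):
    """Sample a 3D voxel volume (the stacked slice image data group)
    on an arbitrary plane defined by an origin and two in-plane axes.

    volume    : ndarray (nz, ny, nx), slice images stacked at 1 mm pitch
    origin    : plane origin in voxel coordinates (z, y, x)
    e_u, e_v  : in-plane direction vectors in voxel units
    out_shape : (rows, cols) of the output guide image
    """
    rows, cols = out_shape
    u = np.arange(rows)[:, None, None]
    v = np.arange(cols)[None, :, None]
    coords = origin + u * np.asarray(e_u) + v * np.asarray(e_v)
    idx = np.rint(coords).astype(int)            # nearest-neighbour sampling
    for axis, size in enumerate(volume.shape):   # clamp to the volume bounds
        idx[..., axis] = np.clip(idx[..., axis], 0, size - 1)
    return volume[idx[..., 0], idx[..., 1], idx[..., 2]]

# Example: an axial plane through z = 2 reproduces slice 2 exactly.
vol = np.arange(4 * 5 * 6).reshape(4, 5, 6)
sl = extract_oblique_slice(vol, np.array([2.0, 0.0, 0.0]),
                           (0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (5, 6))
```

A production version would use trilinear interpolation (for example `scipy.ndimage.map_coordinates`) instead of rounding to the nearest voxel.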
- the guide image data is image data of a section of a living body anatomically corresponding to the two-dimensional image data created by the ultrasound observation apparatus 5.
- the guide image displayed and output on the display device 12 based on the guide image data almost matches, in the positions and orientations of the displayed organs, the two-dimensional ultrasonic image corresponding to the two-dimensional image data.
- for example, if the two-dimensional image data shows a cross section of the pancreatic head and the duodenum, the image forming unit 16c creates, as the anatomically corresponding image data, guide image data showing a cross section of the pancreatic head and the duodenum.
- the reason for this match is as follows. First, the anatomical structure and the arrangement and shapes of the organs of the human body are largely common to all bodies, despite individual and gender differences. Second, the four sample points P0, P1, P2, P3 taken on the actual body surface or body cavity surface of the subject correspond one-to-one to the four feature points P0', P1', P2', P3' set in the slice image data group, so the relationship of position and orientation of the two-dimensional image plane with respect to the three-axis coordinate system P1P2P3 is the same as that of the guide image plane with respect to the three-axis coordinate system P1'P2'P3'.
- for an arbitrary point R on the two-dimensional image data, the corresponding point R' having the same position address on the guide image data is anatomically the same organ or the same biological tissue. Therefore, it can be said that the two-dimensional image data and the guide image data correspond anatomically.
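This correspondence can be sketched as follows: a point R is expressed in the three-axis coordinate system built from the sample points, and the same coefficients are reapplied in the three-axis coordinate system built from the feature points. A minimal sketch with hypothetical coordinates; the function name and the use of `numpy.linalg.solve` (which handles the generally non-orthogonal axes) are illustrative assumptions, not the patent's stated procedure.

```python
import numpy as np

def corresponding_point(r, samples, features):
    """Map a point R given in the fixed coordinate system xyz to the
    anatomically corresponding point R' on the slice image data group.

    samples  : sample points P0..P3 measured on the subject (4 x 3)
    features : feature points P0'..P3' set on the slice images (4 x 3)
    """
    samples = np.asarray(samples, float)
    features = np.asarray(features, float)
    # Axes of the (generally non-orthogonal) three-axis coordinate systems.
    a = (samples[1:] - samples[0]).T      # columns: P0->P1, P0->P2, P0->P3
    a_prime = (features[1:] - features[0]).T
    # Coefficients of R in the sample-point system ...
    coeff = np.linalg.solve(a, np.asarray(r, float) - samples[0])
    # ... reused unchanged in the feature-point system.
    return features[0] + a_prime @ coeff

# Hypothetical example: the two frames differ by a pure translation.
s = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]
f = [[10, 0, 0], [11, 0, 0], [10, 1, 0], [10, 0, 1]]
rp = corresponding_point([0.5, 0.25, 0.25], s, f)
```

In this translated example the point (0.5, 0.25, 0.25) maps to (10.5, 0.25, 0.25).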
- the slice image data group used by the image forming unit 16c to generate the guide image data is color-coded in advance for each organ as described above. Therefore, the guide image data is color-coded for each organ similarly to the slice image data.
- the mixing unit 16d uses the two-dimensional image data input from the ultrasonic observation device 5 and the guide image data created by the image forming unit 16c to create image data (mixed image data) for displaying and outputting, side by side on the same screen of the display device 12, the two-dimensional ultrasonic image corresponding to the two-dimensional image data and the guide image corresponding to the guide image data.
- the mixed image data created by the mixing unit 16d is output to the display circuit 15 under the control of the control unit 16.
- the display circuit 15 converts and outputs an image signal corresponding to the mixed image data under the control of the control unit 16 as described above.
- the display device 12 displays and outputs a two-dimensional ultrasonic image and a guide image corresponding to the mixed image data on the same screen based on the image signal received from the display circuit 15.
- the correction unit 16e functions to correct the coordinate data of the sample points that have changed with the passage of time t.
- by this correction processing, the correction unit 16e converts the coordinate data of a sample point at the time t into the coordinate data at a subsequent time, that is, obtains the coordinate data of the current sample point changed from the previous sample point.
- the image forming unit 16c then sets the above-described three-axis coordinate system P1P2P3 again using the corrected coordinate data of the current sample points.
- FIG. 2 is a schematic diagram schematically illustrating one embodiment of the marker coil 8 and one embodiment of the plate 9.
- the marker coil 8 has a stick shape as shown in FIG. 2. As described above, the marker coil 8 incorporates one coil on the tip side of the stick shape.
- the plate 9 has a plate shape, such as an ellipse, that easily adheres to the body surface of the subject, and as shown in FIG. 2, has a body surface contact surface that adheres to the body surface of the subject. Further, the above-described orthogonal coordinate system x"y"z" is set for the plate 9, as shown in FIG. 2.
- the origin O" of the orthogonal coordinate system x"y"z" is set at a position on the plate 9 whose positional relationship with the plate 9 is fixed, for example, at a reference position L on the plate 9.
- this reference position L is set, for example, at the center of gravity of the three coil positions in the plate 9, at the midpoint of a straight line connecting the midpoint of two of the coil positions and the remaining coil position, near the center of the body surface contact surface of the plate 9, or at the position where one of the coils is provided.
- a unit vector i" is set on the x" axis, a unit vector j" on the y" axis, and a unit vector k" on the z" axis.
- FIG. 3 is a schematic diagram schematically illustrating a setting state of the orthogonal coordinate system xyz in the receiving coil 10.
- the origin O of the orthogonal coordinate system xyz is set at a position whose positional relationship with the receiving coil 10 is fixed, for example, near the center of the alternating magnetic field receiving surface 10a of the receiving coil 10.
- the z axis is set in the normal direction of the alternating magnetic field receiving surface 10a, and the x axis and the y axis are set parallel to the alternating magnetic field receiving surface 10a. In this way, the orthogonal coordinate system xyz is set for the receiving coil 10.
- the orthogonal coordinate system xyz is set as a spatial coordinate system in the actual space where the operator examines the subject.
- a unit vector i is set on the x axis, a unit vector j on the y axis, and a unit vector k on the z axis.
- the receiving coil 10, for which the orthogonal coordinate system xyz is set, detects the alternating magnetic fields from the transmitting coil 7, the marker coil 8, and the plate 9 as described above, and outputs the position detection signals to the position data calculation device 6.
- based on these position detection signals, the position data calculation device 6 calculates, as direction components on the orthogonal coordinate system xyz, the position vector OC(t) of the center position C(t) of the two-dimensional image plane of the two-dimensional image data, the direction vectors V(t) and V12(t) of the two-dimensional image plane, the position vector OL(t) of the reference position L(t) of the plate 9, the rotation matrix T(t) indicating the orientation of the plate 9, and the position vector OM(t) of the position M(t) of the marker coil 8.
- the reference position L(t), the position vector OL(t), the rotation matrix T(t), the position M(t), and the position vector OM(t) are, respectively, the position, vector, or rotation matrix at the time t, and change with the passage of the time t as the corresponding positions and orientations change.
- the rotation matrix T(t) is a rotation matrix indicating the orientation of the plate 9 in the orthogonal coordinate system xyz, and its (f, g) component is denoted t_fg(t), where the integers f and g each take a value from 1 to 3.
- here, the unit vectors e1, e2, e3 denote the unit vectors i, j, k described above, and the unit vectors e1", e2", e3" denote the unit vectors i", j", k" described above.
- the rotation matrix T(t) is expressed using the so-called Euler angles ψ, θ, and φ: it is the matrix obtained on the assumption that, when a rotation by the angle ψ about the z axis, a rotation by the angle θ about the y axis, and a rotation by the angle φ about the z axis are applied to the orthogonal coordinate system xyz in this order, the result coincides with the orthogonal coordinate system x"y"z" set on the plate 9. If the subject changes his or her body position with the passage of the time t, the Euler angles ψ, θ, and φ change with this change in posture.
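A rotation matrix of this kind can be sketched numerically. The sketch below assumes a z–y–z Euler convention composed as intrinsic rotations; the exact composition order and axis convention used by the device are assumptions, and the function names are invented for illustration.

```python
import numpy as np

def rot_z(a):
    """Rotation by angle a about the z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    """Rotation by angle a about the y axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def euler_zyz(psi, theta, phi):
    """Matrix assumed to take the fixed xyz frame onto the plate frame
    x"y"z": rotate psi about z, then theta about y, then phi about z,
    composed as intrinsic rotations."""
    return rot_z(psi) @ rot_y(theta) @ rot_z(phi)

# With theta = 90 degrees and psi = phi = 0, the z axis is carried onto
# the x axis.
T = euler_zyz(0.0, np.pi / 2, 0.0)
ez_image = T @ np.array([0.0, 0.0, 1.0])
```

Any matrix built this way is orthogonal, as a rotation matrix must be.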
- control unit 16 sets the three-axis coordinate system P′P′P ′ based on the feature points on the rectangular coordinate system x′y′z ′.
- in the following, an example will be described in which the coordinates of four points in the slice image data group, namely the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla, are acquired as the feature points, and the coordinates of the corresponding four points of the subject are acquired as the sample points; however, the present invention is not limited to this.
- FIG. 4 is a flowchart illustrating the processing performed by the control unit 16 until the two-dimensional ultrasonic image corresponding to the two-dimensional image data and the guide image corresponding to the guide image data are displayed side by side on the same screen of the display device 12.
- the operator operates the input device 11 at the positions of the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla on the slice image displayed on the display device 12, and inputs the feature point coordinate information.
- the control unit 16 detects the feature point coordinate information for each of the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla, and controls the image construction unit 16c. Under the control of the control unit 16, the image construction unit 16c performs a feature point setting process of setting, on the orthogonal coordinate system x'y'z', each feature point based on the input feature point coordinate information (step S101). Thereafter, the control unit 16 stores the coordinate data of each feature point set by the image construction unit 16c in the image storage unit 14 in association with the above-described slice image data group.
- for example, the image forming unit 16c sets, on the orthogonal coordinate system x'y'z', a feature point P0' based on the feature point coordinate information corresponding to the xiphoid process on the slice image, a feature point P1' based on the feature point coordinate information corresponding to the right end of the pelvis, a feature point P2' based on the feature point coordinate information corresponding to the pylorus, and a feature point P3' based on the feature point coordinate information corresponding to the duodenal papilla.
- the operator uses the probe 2, the marker coil 8, or the plate 9, together with the input device 11, to input sample point setting instruction information for each of the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla of the subject.
- the control unit 16 detects the sample point setting instruction information for each of the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla, recognizes the position data received from the position data calculation device 6 at the detected timing as the position data of each sample point, and controls the image forming unit 16c. The control unit 16 also detects, from the timer 16b, the time t at which each piece of sample point setting instruction information was detected. Under the control of the control unit 16, the image forming unit 16c performs a sample point setting process of setting, on the orthogonal coordinate system xyz, each sample point based on the input sample point setting instruction information, using the position data recognized for each sample point (step S102).
- for example, the image forming unit 16c sets, on the orthogonal coordinate system xyz, a sample point P0 corresponding to the xiphoid process of the subject and a sample point P1 corresponding to the right end of the pelvis, based on the respective sample point setting instruction information and position data; the control unit 16 detects, from the timer 16b, the time t at which this sample point setting instruction information was detected as the time t1 at which the sample points P0 and P1 were set.
- similarly, the image forming unit 16c sets a sample point P2 corresponding to the pylorus of the subject and a sample point P3 corresponding to the duodenal papilla, based on the respective sample point setting instruction information and position data, and the control unit 16 detects the times at which these pieces of sample point setting instruction information were detected as the times t2 and t3 at which the sample points P2 and P3 were set.
- thereafter, the control unit 16 stores the coordinate data of the sample points P0, P1, P2, P3 set by the image forming unit 16c in the storage unit 16a.
- if the control unit 16 does not detect scan start instruction information (step S103, No), step S103 is repeated. That is, the control unit 16 constantly monitors whether or not scan start instruction information has been input from the input device 11.
- if the control unit 16 detects the scan start instruction information (step S103, Yes), it instructs the ultrasonic observation device 5 to start a radial scan based on the scan start instruction information (step S104).
- the ultrasonic observation apparatus 5 drives and controls the ultrasonic vibrator 3a and the motor 4a to start radial scanning.
- the control unit 16 acquires two-dimensional image data from the ultrasonic observation apparatus 5 (step S105), and detects the time ts from the timer 16b as the acquisition time.
- the image construction unit 16c associates the two-dimensional image data with the position data received from the position data calculation device 6 at substantially the same timing as the acquisition timing (time ts).
- this position data is the position data based on the alternating magnetic field from the transmission coil 7, namely the coordinate data of the position vector OC(ts) of the center position C(ts) and the coordinate data of the two direction vectors V(ts).
- that is, the two-dimensional image data is associated with the position vector OC(ts) of the center position C(ts) of its image plane, the direction vector V(ts) indicating the normal direction of the image plane, and the direction vector V(ts) indicating the 12 o'clock direction.
- the control unit 16 stores in the storage unit 16a each coordinate data, on the orthogonal coordinate system xyz, of the center position C(ts) and the direction vectors.
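The association performed in step S106 amounts to tagging each 2-D frame with the position data received at substantially the same time ts. A minimal sketch in Python, assuming an illustrative record layout (the field names are ours, not the patent's):

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class TaggedFrame:
    """A 2-D image frame tagged with the position data received at
    substantially the same timing (time ts), as in step S106."""
    ts: float              # acquisition time detected from the timer
    pixels: np.ndarray     # the 2-D image data
    oc: np.ndarray         # position vector OC(ts) of the centre C(ts)
    v_normal: np.ndarray   # direction vector V(ts), normal to the image plane
    v_12: np.ndarray       # direction vector V(ts), the 12 o'clock direction


frame = TaggedFrame(
    ts=1.25,
    pixels=np.zeros((8, 8)),
    oc=np.array([10.0, 20.0, 30.0]),
    v_normal=np.array([0.0, 0.0, 1.0]),
    v_12=np.array([0.0, 1.0, 0.0]),
)
```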
- the reference position L(t) of the plate 9 is set so as to always overlap the position of the xiphoid process of the subject.
- triggered by the time t detected from the timer 16b by the control unit 16 in steps S102 to S106, the correction unit 16e performs a correction process that corrects the difference between each coordinate component of the sample points P1 to P4 set at the above-described times t1 to t3 and each coordinate component of the sample points P1 to P4 at time ts (step S107).
- that is, using the rotation matrix T(ts), the correction unit 16e calculates the coordinate components of the sample points P1 to P4 at time ts, and replaces the sample points P1 to P4 set at the times t1 to t3 with the sample points P1 to P4 at time ts.
- specifically, for the sample point P1 at time t1, since the plate 9 remains on the xiphoid process, the position vector OP1(ts) corresponding to the sample point P1 is regarded as the same as the position vector OL(ts).
- next, using the coordinate components of the sample point P2 at times t1 and ts, the correction unit 16e converts the position vector OP2(t1) on the orthogonal coordinate system xyz into the position vector of the sample point P2 at time ts on the orthogonal coordinate system xyz.
- the position vector OP2(ts) is expressed by the following equation (6).
- the transposed matrix tT (ts) is a transposed matrix of the rotation matrix T (ts) and is calculated based on the rotation matrix T (ts).
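One plausible reading of this correction, sketched below under the assumption that T(t) rotates world coordinates into the plate's body frame (so the transposed matrix tT(ts) rotates back): the offset of a sample point from the reference position L is frozen in the plate's frame at t1 and re-expressed in world coordinates at ts. The function names and the rotation convention are assumptions, not the patent's own formulation.

```python
import numpy as np


def rot_z(theta):
    """Rotation about the z axis, standing in for the plate's rotation matrix T(t)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])


def correct_sample_point(op_t1, ol_t1, t_t1, ol_ts, t_ts):
    """Re-express a sample point recorded at t1 at the later time ts.

    The world-frame offset OP(t1) - OL(t1) is rotated into the plate's
    body frame with T(t1), then back to world coordinates at ts with the
    transposed matrix tT(ts) (a rotation's inverse is its transpose).
    """
    body_offset = t_t1 @ (op_t1 - ol_t1)   # offset fixed in the plate's frame
    return ol_ts + t_ts.T @ body_offset    # world coordinates at time ts


# If the plate only translates (no rotation), the offset is carried along unchanged.
op = correct_sample_point(
    np.array([1.0, 0.0, 0.0]), np.zeros(3), rot_z(0.0),
    np.array([0.0, 5.0, 0.0]), rot_z(0.0),
)
```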
- similarly, the correction unit 16e corrects the coordinate components of the sample point P3 at time t2 to those at time ts.
- the direction components x(ts), y(ts), and z(ts) are defined by the following equation (7).
- the position vector OP3(ts) is expressed by the following equation (8).
- similarly, the correction unit 16e corrects the coordinate components of the sample point P4 at time t3 to those at time ts, converting the position vector OP4(t3) on the orthogonal coordinate system xyz into the position vector of the sample point P4 at time ts.
- in this way, the correction unit 16e replaces the sample point P2 set at time t1 (corresponding to the right end of the subject's pelvis at time t1) and the sample point P3 set at time t2 (corresponding to the pylorus of the subject at time t2) with the corresponding sample points at time ts.
- thereafter, using the position data of the two-dimensional image data acquired by the control unit 16 at time ts, for example the position vector OC(ts) and the two direction vectors V(ts), the image construction unit 16c performs guide image creation processing for creating guide image data anatomically corresponding to the two-dimensional image data at time ts (step S108).
- this guide image data is created as image data at time ts anatomically corresponding to the two-dimensional image data at time ts, and its position data is represented on the above-described orthogonal coordinate system x'y'z'.
- that is, the guide image data is associated with the position vector O'C'(ts) of the center position C'(ts) of the guide image plane and the direction vectors V'(ts), which correspond anatomically to the position vector OC(ts) and the direction vectors V(ts).
- using the two-dimensional image data associated with the position data at time ts in step S106 and the guide image data at time ts created in step S108, the mixing unit 16d creates mixed image data for displaying the two-dimensional image data and the guide image data at time ts side by side on the same screen of the display device 12.
- the mixed image data created by the mixing unit 16d is output to the display circuit 15 under the control of the control unit 16.
- as described above, the display circuit 15 converts the mixed image data into a corresponding image signal and outputs it under the control of the control unit 16.
- on the basis of the image signal received from the display circuit 15, the display device 12 arranges and outputs, on the same screen, the two-dimensional ultrasonic image at time ts and the guide image at time ts corresponding to the mixed image data. That is, by transmitting the mixed image data to the display circuit 15, the control unit 16 causes the display device 12 to display the two-dimensional ultrasonic image at time ts and the guide image at time ts side by side on the same screen (step S109).
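The side-by-side arrangement produced by the mixing unit 16d can be imitated with a plain array concatenation; this is an illustrative sketch of the layout only, not the device's actual mixing circuitry.

```python
import numpy as np


def mix_side_by_side(us_img, guide_img, fill=0):
    """Place a 2-D ultrasound frame and its guide image on one 'screen'
    by padding both to a common height and concatenating horizontally."""
    h = max(us_img.shape[0], guide_img.shape[0])

    def pad(img):
        out = np.full((h, img.shape[1]), fill, dtype=img.dtype)
        out[: img.shape[0], :] = img
        return out

    return np.hstack([pad(us_img), pad(guide_img)])


mixed = mix_side_by_side(np.ones((4, 6), dtype=np.uint8),
                         np.full((3, 5), 2, dtype=np.uint8))
```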
- when the control unit 16 detects the scan end instruction information (step S110, Yes), it instructs the ultrasonic observation apparatus 5 to end the radial scan based on this scan end instruction information.
- the ultrasonic observation apparatus 5 performs drive control for terminating the radial scan on the ultrasonic transducer 3a and the motor 4a under the control of the control unit 16.
- if the control unit 16 does not detect the scan end instruction information (step S110, No), it repeats the above-described processing steps from step S103.
- FIG. 5 is a schematic diagram showing a display example in which a two-dimensional ultrasonic image and a guide image are displayed side by side on the same screen of the display device 12.
- the two-dimensional ultrasound image UG corresponds to the two-dimensional image data at time ts described above
- the guide image GG corresponds to the guide image data at time ts described above.
- the two-dimensional ultrasound image UG shows the vicinity of the confluence of the subject's pancreatic duct and bile duct, and depicts indicator organs such as the pancreatic duct (PD).
- the center of the image corresponds to the rotation center of the ultrasonic transducer 3a, that is, the center position C (ts), and the normal direction corresponds to the direction vector V (ts).
- the 12 o'clock direction of the two-dimensional ultrasonic image UG, that is, the upward direction in FIG. 5, corresponds to the direction vector V(ts), and the 3 o'clock direction of the two-dimensional ultrasonic image UG is the right direction in FIG. 5.
- the center of the image corresponds to the center position C ′ (ts), and the normal direction corresponds to the direction vector V ′ (ts).
- the 12 o'clock direction of the guide image GG, that is, the upward direction in FIG. 5, corresponds to the direction vector V'(ts), and the 3 o'clock direction of the guide image GG is the right direction in FIG. 5.
- the control unit 16 anatomically aligns the image direction based on the direction vector V(ts) with the image direction based on the direction vector V'(ts). By aligning the image directions of the two-dimensional ultrasound image UG and the guide image GG anatomically in this way, the two-dimensional ultrasound image UG and the guide image GG, in which the positions and orientations of the organs and the like correspond anatomically and accurately, can be displayed side by side on the same screen.
- the guide image GG is displayed on the display device 12 with the guide image data color-coded for each organ as described above.
- the guide image GG shows the confluence of the pancreatic duct and bile duct as illustrated in FIG. 5, and indicator organs such as the pancreatic duct PD, common bile duct CBD, and portal vein PV are each color-coded.
- by controlling the mixing unit 16d, the control unit 16 can superimpose an abbreviation for each organ, such as PD, CBD, or PV, on the guide image GG as an annotation.
- for example, the guide image GG on which the annotations PD, CBD, and PV illustrated in FIG. 5 are superimposed can be displayed on the display device 12.
- the annotation information on the abbreviation for each organ is stored in advance in the image storage unit 14 in association with the slice image data group.
- by repeating the processing steps of steps S103 to S110 described above, the control unit 16 sequentially acquires the two-dimensional image data and its position data together with the guide image data anatomically corresponding to the two-dimensional image data and its position data, and sequentially updates and displays, side by side on the same screen of the display device 12, the two-dimensional ultrasonic image and the guide image corresponding to the acquired two-dimensional image data and guide image data.
- that is, by the control unit 16 repeating the processing steps of steps S103 to S110 described above, the guide image and the two-dimensional ultrasonic image are sequentially updated in real time and displayed on the display device 12.
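The repeated steps S103 to S110 amount to an acquire, guide, mix, and display loop that runs until the scan end instruction arrives. A schematic sketch (all function names are ours, not the patent's):

```python
def scan_loop(acquire_frame, make_guide, mix, display, stop):
    """Schematic of steps S103-S110: acquire a positioned 2-D frame,
    build the anatomically corresponding guide image, and show both
    side by side, until the scan end instruction is detected."""
    shown = 0
    while not stop():                    # step S110: scan end instruction?
        frame = acquire_frame()          # steps S105-S106
        guide = make_guide(frame)        # steps S107-S108
        display(mix(frame, guide))       # step S109
        shown += 1
    return shown


# Two iterations before the simulated scan end instruction arrives.
n = scan_loop(
    acquire_frame=lambda: "frame",
    make_guide=lambda f: "guide",
    mix=lambda f, g: (f, g),
    display=lambda img: None,
    stop=iter([False, False, True]).__next__,
)
```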
- by checking the two-dimensional ultrasound image and the guide image together on the display device 12 and referring to the color-coded organ images and the like of the guide image, the operator can accurately and easily recognize which position of the subject is anatomically being observed in the currently displayed two-dimensional ultrasound image, so that medical diagnosis of the subject can be performed accurately and efficiently.
- for example, the surgeon can easily recognize the yellow area as the pancreas while observing the two-dimensional ultrasound image, or can move the distal end of the insertion section 3 to change the scanning plane of the ultrasonic transducer 3a and search for the pancreas.
- FIG. 6 is a flowchart illustrating processing steps until the control unit 16 achieves the above-described feature point setting processing.
- FIG. 7 is a schematic diagram illustrating an operation of setting a feature point in slice image data stored in the image storage unit 14 in advance.
- the operator selects, from the slice image data group SDG stored in the image storage unit 14, slice image data showing an anatomically distinctive site. That is, when the operator performs an input operation of the image display instruction information for instructing display output of a slice image using the input device 11, the control unit 16 detects the input image display instruction information.
- (step S201, Yes), and, on the basis of the detected image display instruction information, reads out one slice image data from the slice image data group SDG in the image storage unit 14 and performs a slice image display process for displaying and outputting the slice image corresponding to the read slice image data on the display device 12 (step S202).
- if the control unit 16 does not detect the image display instruction information (step S201, No), it repeats the process of step S201. That is, the control unit 16 constantly monitors whether or not the image display instruction information has been input from the input device 11.
- this slice image data group SDG is composed of N (N: integer) slice image data SD1 to SDN arranged on the above-described orthogonal coordinate system x'y'z'.
- the orthogonal coordinate system x'y'z' is set so that its origin O' is at a corner of the first slice image data SD1 and the image planes of the slice image data SD1 to SDN are parallel to the plane spanned by the x' axis and the y' axis. Further, in the orthogonal coordinate system x'y'z', as shown in Fig. 7, a unit vector i' is set on the x' axis, a unit vector j' on the y' axis, and a unit vector k' on the z' axis.
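Under this convention (origin O' at a corner of SD1, image planes parallel to the x'y' plane, unit vectors i', j', k' on the axes), the x'y'z' position of a point in the n-th slice follows directly; the slice spacing `pitch` below is an assumed parameter, as the text does not give the inter-slice distance here.

```python
import numpy as np

# Unit vectors i', j', k' of the orthogonal coordinate system x'y'z'.
I_P, J_P, K_P = np.eye(3)


def slice_point(n, u, v, pitch):
    """Position vector on x'y'z' of the point (u, v) in slice SDn,
    assuming slices SD1..SDN stacked along the z' axis at spacing
    `pitch` (n counted from 1, so SD1 lies in the x'y' plane)."""
    return u * I_P + v * J_P + (n - 1) * pitch * K_P


p = slice_point(3, 1.0, 2.0, 0.5)
```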
- the control unit 16 reads out the slice image data SD1 from the image storage unit 14 based on the image display instruction information detected in step S201, and outputs the slice image data SD1 to the display circuit 15.
- as described above, the display circuit 15 converts the slice image data SD1 into a corresponding image signal and outputs it.
- the display device 12 receives the image signal from the display circuit 15 and displays and outputs the slice image corresponding to the slice image data SD1.
- thereafter, when the operator performs an input operation of the image display instruction information using the input device 11, the control unit 16 detects the input image display instruction information (step S203, Yes) and repeats the above-described processing steps from step S202. In this case, based on the image display instruction information detected in step S203, the control unit 16 sequentially reads the slice image data SD1 to SDN from the image storage unit 14 for each piece of detected instruction information, and the slice images corresponding to the slice image data SD1 to SDN are sequentially updated and displayed on the display device 12.
- by the control unit 16 repeating the processing steps of steps S201 to S203 described above, the surgeon can sequentially confirm the N slice images corresponding to the slice image data SD1 to SDN on the display device 12.
- the surgeon then finds, on the slice image SGn corresponding to the n-th (n: an integer from 1 to N) slice image data SDn of the slice image data group SDG, an anatomically distinctive site such as the xiphoid process, the right end of the pelvis, the pylorus, or the duodenal papilla.
- the operator uses the input device 11 to designate a characteristic point at this site and performs an input operation of the characteristic point coordinate information of this characteristic point.
- when the control unit 16 detects the input feature point coordinate information (step S204, Yes) without detecting the image display instruction information (step S203, No), the image construction unit 16c sets the coordinate data based on the detected feature point coordinate information as the coordinate data of the feature point on the orthogonal coordinate system x'y'z' of the slice image data SDn (step S205).
- for example, the slice image SGn displays the xiphoid process of the sternum H.
- when the operator designates this xiphoid process, the control unit 16 detects the input feature point coordinate information, and under the control of the control unit 16, the image construction unit 16c sets the coordinate data based on the detected feature point coordinate information as the feature point P' of the xiphoid process.
- if the feature point end instruction information is not detected (step S206, No), the processing steps after step S203 described above are repeated. This allows the surgeon to designate and input characteristic point coordinate information for each of the other characteristic sites in substantially the same manner as for the feature point P' of the xiphoid process.
- in this case, the image construction unit 16c sequentially sets the feature points on the orthogonal coordinate system x'y'z' based on each piece of feature point coordinate information input for each characteristic site.
- for example, based on the characteristic point coordinate information on the right end of the pelvis, the pylorus, and the duodenal papilla, the image construction unit 16c sets the feature points respectively corresponding to the right end of the pelvis, the pylorus, and the duodenal papilla on the orthogonal coordinate system x'y'z'.
- the control unit 16 detects the input feature point end instruction information (Step S206, Yes). Then, the processing steps after step S102 described above are performed. When the feature point coordinate information is not detected in step S204 described above (step S204, No), the control unit 16 repeats the processing steps from step S203 described above.
- the image construction unit 16c expresses the feature points P1' to P4' on the orthogonal coordinate system x'y'z' as position vectors of the form of the following equation (14):
O'P' = x'·i' + y'·j' + z'·k' … (14)
- where the direction components x', y', and z' are the coordinate components of the position vector O'P' in the x'-axis direction, the y'-axis direction, and the z'-axis direction, respectively.
- each image plane of the slice image data SD1 to SDN has a side of 40 cm.
- in this manner, the image construction unit 16c calculates the direction components of the position vectors O'P1', O'P2', O'P3', and O'P4'.
- the control unit 16 stores the coordinate data of the position vectors O'P1' to O'P4' in the image storage unit 14.
- FIG. 8 is a flowchart illustrating processing steps until the control unit 16 achieves the above-described sample point setting processing.
- the surgeon inputs the sample point setting instruction information using the input device 11 while touching the marker coil 8 and the plate 9 to the body surface of the subject, or while operating the probe 2 and checking the optical image on the display device 12.
- the content of this sample point setting instruction information is instruction information concerning the position data acquisition source, for example whether to simultaneously acquire the position data of the marker coil 8 and the plate 9 from the position data calculation device 6, or whether to simultaneously acquire the position data of the transmission coil 7 and the plate 9 from the position data calculation device 6.
- the operator uses the input device 11 while the marker coil 8 and the plate 9 are in contact with the body surface near the right end of the subject's pelvis and the body surface near the xiphoid process, respectively.
- without detecting the optical image display instruction information (step S301, No), the control unit 16 detects this sample point setting instruction information (step S304, Yes) and detects the current time from the timer 16b.
- using the position data input from the position data calculation device 6, the image construction unit 16c sets a sample point based on the sample point setting instruction information on the orthogonal coordinate system xyz (step S305).
- the control unit 16 stores the position data of the sample points set by the image forming unit 16c, that is, each coordinate data on the orthogonal coordinate system xyz in the storage unit 16a.
- for example, if the content of the sample point setting instruction information is the instruction information "set a sample point" and "simultaneously acquire the position data of the marker coil 8 and the plate 9 from the position data calculation device 6",
- the control unit 16 detects the time t1 from the timer 16b, and receives from the position data calculation device 6 the position data based on the alternating magnetic field from the plate 9 (plate position data) and the position data based on the alternating magnetic field from the marker coil 8 (marker coil position data).
- the plate position data is the coordinate data of the position vector OL (tl) of the above-described reference position L (tl) and the rotation matrix T (tl).
- the marker coil position data is the coordinate data of the position vector OM(t1) of the position M(t1) described above.
- the image construction unit 16c sets the coordinate data based on the plate position data at time t1 as the coordinate data of the sample point P1 on the orthogonal coordinate system xyz.
- likewise, the coordinate data based on the marker coil position data at time t1 is set as the coordinate data of the sample point P2 on the orthogonal coordinate system xyz.
- that is, the coordinate data of the sample point P1 at time t1 is the coordinate data of the position vector OL(t1), and the coordinate data of the sample point P2 at time t1 is the coordinate data of the position vector OM(t1).
- on the other hand, when the control unit 16 detects the input optical image display instruction information (step S301, Yes), it acquires optical image data from the optical observation device 17 based on the detected optical image display instruction information (step S302).
- the control section 16 outputs the obtained optical image data to the display circuit 15.
- the display circuit 15 converts and outputs an image signal corresponding to the optical image data input from the control unit 16 under the control of the control unit 16 as described above.
- the display device 12 displays and outputs an optical image corresponding to the optical image data based on the image signal received from the display circuit 15. That is, the control unit 16 causes the display device 12 to display and output the optical image by transmitting the optical image data to the display circuit 15 (step S303).
- the operator checks the optical image on the display device 12 and brings the distal end of the insertion portion 3, that is, the ultrasonic transducer 3a, the transmission coil 7, and the optical observation window, to an anatomically characteristic site in the subject.
- when the control unit 16 detects the sample point setting instruction information (step S304, Yes), the current time is detected from the timer 16b.
- using the position data input from the position data calculation device 6, the image construction unit 16c sets the sample point based on the sample point setting instruction information on the orthogonal coordinate system xyz.
- the control unit 16 stores the position data of the set sample point, that is, each coordinate data on the orthogonal coordinate system xyz, in the storage unit 16a.
- the control unit 16 repeats the processing steps of steps S301 to S306 described above until detecting the sample point end instruction information input from the input device 11.
- for example, if the content of the sample point setting instruction information is instruction information such as "set a sample point" and "simultaneously acquire the position data of the transmission coil 7 and the plate 9 from the position data calculation device 6", the control unit 16 detects the time t2 from the timer 16b and receives, from the position data calculation device 6, the position data based on the alternating magnetic field from the transmission coil 7 (transmission coil position data) and the plate position data. At time t2, if the distal end of the insertion section 3 is in contact with the vicinity of the pylorus in the subject, the control section 16 receives the transmission coil position data as coordinate data corresponding to the pylorus.
- the transmission coil position data at time t2 comprises the coordinate data of the position vector OC(t2) of the center position C(t2), the coordinate data of the direction vector V(t2) indicating the normal direction, and the coordinate data of the direction vector V(t2) indicating the 12 o'clock direction.
- the plate position data at the time t2 is the coordinate data of the position vector OL (t2) and the rotation matrix T (t2) of the reference position L (t2) described above.
- the image construction unit 16c sets the coordinate data based on the transmission coil position data at time t2 as the coordinate data of the sample point P3 on the orthogonal coordinate system xyz.
- in this manner, the image construction unit 16c sets the sample point P3 corresponding to the pylorus of the subject on the orthogonal coordinate system xyz.
- at the same time, the image construction unit 16c calculates the position vector OP3(t2) of the sample point P3 at time t2.
- this position vector OP3(t2) is represented by the following equation (17).
- in equation (17), the direction components x(t2), y(t2), and z(t2) are the coordinate components of the position vector OP3(t2) in the respective axis directions.
- the position vector OP3(t2) and the rotation matrix T(t2) described above are used in the calculation of the above-described step S107.
- the coordinate data of the sample point P3 at time t2 is the coordinate data of the position vector OC(t2); that is, the position vector OP3(t2) of the sample point P3 at time t2 on the orthogonal coordinate system xyz is regarded as the same as the position vector OC(t2), and is represented by the following equation (18).
- thereafter, the control unit 16 detects the time t3 from the timer 16b, and receives the transmission coil position data and the plate position data at time t3 from the position data calculation device 6. At time t3, if the distal end of the insertion section 3 is in contact with the vicinity of the duodenal papilla in the subject, the control section 16 receives the transmission coil position data at time t3 as coordinate data corresponding to the duodenal papilla.
- the transmission coil position data at time t3 comprises the coordinate data of the position vector OC(t3) of the center position C(t3), the coordinate data of the direction vector V(t3) indicating the normal direction, and the coordinate data of the direction vector V(t3) indicating the 12 o'clock direction. Also, the plate position data at time t3 is the coordinate data of the position vector OL(t3) of the above-described reference position L(t3) and the rotation matrix T(t3).
- the image construction unit 16c sets the coordinate data based on the transmission coil position data at time t3 as the coordinate data of the sample point P4 on the orthogonal coordinate system xyz.
- in this manner, the image construction unit 16c sets the sample point P4 corresponding to the duodenal papilla of the subject on the orthogonal coordinate system xyz. At the same time, the image construction unit 16c calculates the position vector OP4(t3) of the sample point P4 at time t3.
- this position vector OP4(t3) is represented by the following equation (19).
- in equation (19), the direction components x(t3), y(t3), and z(t3) are the coordinate components of the position vector OP4(t3) in the respective axis directions.
- the coordinate data of the sample point P4 at time t3 is the coordinate data of the position vector OC(t3); that is, the position vector OP4(t3) of the sample point P4 at time t3 on the orthogonal coordinate system xyz is regarded as the same as the position vector OC(t3), and is expressed by the following equation (20).
- thereafter, the control unit 16 detects the sample point end instruction information (step S306, Yes) and performs the processing steps after step S103 described above. If the control unit 16 does not detect the sample point setting instruction information in step S304 (step S304, No), it repeats the processing steps from step S306.
- FIG. 9 is a flowchart illustrating processing steps until the control unit 16 achieves the guide image creation processing described above.
- FIG. 10 is a schematic diagram illustrating the relationship between the two-dimensional image plane of the two-dimensional image data at time ts and the three-axis coordinate system P1P2P3 defined by the sample points P1 to P4.
- FIG. 11 is a schematic diagram illustrating an operation in which the image forming unit 16c calculates a guide image plane at time ts and position data of the guide image plane.
- in FIG. 9, after the control unit 16 performs the above-described step S107, the image construction unit 16c calculates the guide image plane GF of the guide image data at time ts based on the four feature points P1' to P4' set in the above-described step S101, the sample points set in step S102, and the two-dimensional image data at time ts and the position data associated with it (step S401).
- the guide image plane GF in step S401 is conceptually calculated as follows.
- first, using the feature points P1' to P4', the image construction unit 16c sets the above-described three-axis coordinate system P1'P2'P3' on the orthogonal coordinate system x'y'z'.
- the image construction unit 16c then uses the sample points P1 to P4 to set, on the orthogonal coordinate system xyz, a three-axis coordinate system P1P2P3 whose axis directions anatomically correspond to those of the three-axis coordinate system P1'P2'P3'.
- the image construction unit 16c sets the sample point P1 based on the above-described plate position data as the origin of the three-axis coordinate system P1P2P3.
- the image construction unit 16c determines the guide image plane GF as follows. First, using the position data associated with the two-dimensional image data in step S106 described above, it finds the coordinates of an arbitrary point R(ts) on the two-dimensional image plane UF shown in FIG. 10.
- at this point, the image construction unit 16c has calculated the relationship of position and orientation between the two-dimensional image plane UF and the three-axis coordinate system P1P2P3.
- next, a corresponding point R'(ts) on the three-axis coordinate system P1'P2'P3' shown in FIG. 11, corresponding to the arbitrary point R(ts) represented by these coordinates, is obtained, and the set of corresponding points R'(ts) is defined as the guide image plane GF.
- the image construction unit 16c determines the corresponding point R'(ts) so that its relationship of position and orientation with the three-axis coordinate system P1'P2'P3' is the same as the relationship of position and orientation between the arbitrary point R(ts) on the two-dimensional image plane UF and the three-axis coordinate system P1P2P3.
- therefore, the two-dimensional image plane UF and the guide image plane GF are anatomically identical. This is because, although there are differences in body size, the anatomical structure and organ shapes of the human abdomen are considered to be almost the same regardless of individual or gender differences.
- the image forming unit 16c obtains the guide image plane GF as follows.
- to calculate the guide image plane GF means to obtain the position vector O'C'(ts) of the center position C'(ts) of the guide image plane GF, the direction vector V'(ts) indicating the normal direction of the guide image plane GF, and the direction vector V'(ts) indicating the 12 o'clock direction of the guide image plane GF.
- the image construction unit 16c obtains each of these three from the center position C(ts) of the two-dimensional image plane UF and its direction vectors.
- note that the image construction unit 16c does not actually find the corresponding point R'(ts) as described above for every arbitrary point R(ts) on the two-dimensional image plane UF.
- in the following, the concept of the method of obtaining the guide image plane GF is described first, and then the method of calculating the center position, the normal direction, and the 12 o'clock direction is described.
- the description of the concept of the method of obtaining the guide image plane GF includes a plurality of mathematical expressions; the image construction unit 16c performs the numerical operations for obtaining the guide image plane GF based on the mathematical expressions described in the actual method of obtaining the guide image plane GF.
- the position vector P1R(ts) from the sample point P1 (based on the plate 9) to the arbitrary point R(ts) can be expressed on the three-axis coordinate system P1P2P3, using appropriate real numbers a, b, and c, as in the following equation (21):
P1R(ts) = a·P1P2(ts) + b·P1P3(ts) + c·P1P4(ts) … (21)
- since the feature points P1', P2', P3', and P4' anatomically correspond to the sample points P1, P2, P3, and P4, the corresponding point R'(ts) can be expressed with the same coefficients:
P1'R'(ts) = a·P1'P2' + b·P1'P3' + c·P1'P4'
- each direction component on the orthogonal coordinate system xyz of the position vector OR(ts) of the arbitrary point R(ts) is represented by x(ts), y(ts), and z(ts).
O'R'(ts) − O'P1' = a(O'P2' − O'P1') + b(O'P3' − O'P1') + c(O'P4' − O'P1')
- in this way, based on equations (24) and (35), the position vector O'R'(ts) of the anatomical corresponding point R'(ts) of the arbitrary point R(ts) on the two-dimensional image plane UF, and its direction components x'(ts), y'(ts), and z'(ts) on the orthogonal coordinate system x'y'z', are obtained. That is, for the arbitrary point R(ts) on the two-dimensional image plane UF, the control unit 16 can use the coordinate data of the feature points stored in the image storage unit 14 in the above-described step S101 and the coordinate data of the sample points set in the above-described step S102, and the set of corresponding points R'(ts) calculated by equations (24) and (35) is the guide image plane GF.
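The idea of reusing the same coefficients a, b, and c in the sample-point frame and the feature-point frame can be sketched as a small linear solve. This does not reproduce the exact matrix form of the patent's equations (24) and (35); it only illustrates the correspondence, taking P1 as the frame origin.

```python
import numpy as np


def corresponding_point(r, p, p_prime):
    """Map a point r on the scan plane to its anatomical corresponding
    point on x'y'z'.

    p       : (4, 3) sample points P1..P4 measured on the subject
    p_prime : (4, 3) feature points P1'..P4' picked on the slice images
    r is written as P1 + a*P1P2 + b*P1P3 + c*P1P4; the same (a, b, c)
    are then reused with the primed basis vectors.
    """
    basis = np.column_stack([p[1] - p[0], p[2] - p[0], p[3] - p[0]])
    abc = np.linalg.solve(basis, r - p[0])        # coefficients a, b, c
    basis_p = np.column_stack([p_prime[1] - p_prime[0],
                               p_prime[2] - p_prime[0],
                               p_prime[3] - p_prime[0]])
    return p_prime[0] + basis_p @ abc


# With identical frames the mapping is the identity.
pts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
r_p = corresponding_point(np.array([0.3, 0.5, 0.1]), pts, pts)
```

Applying the same mapping to the centre position C(ts) yields the centre C'(ts) of the guide image plane, in the spirit of equations (36) to (38).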
- thereafter, using the position data associated with the two-dimensional image data at time ts, that is, the position vector OC(ts) of the center position C(ts) of the two-dimensional image plane UF and the direction vectors V(ts), the image construction unit 16c calculates the position data of the guide image plane GF using the above-described three-axis coordinate systems P1P2P3 and P1'P2'P3'.
- specifically, a coordinate transformation substantially similar to the above equation (35) is performed on the position vector OC(ts) to calculate the position vector O'C'(ts). Since the center position C(ts) is set on the orthogonal coordinate system xyz, the position vector OC(ts) is expressed by the following equation (36). In addition, since the center position C'(ts) is set on the orthogonal coordinate system x'y'z', the position vector O'C'(ts) is expressed by the following equation (37).
- In this manner, the directional components xc'(ts), yc'(ts), zc'(ts) of the position vector O'C'(ts) are obtained, and the following equation (38) is obtained.
- The image forming unit 16c then sets the center positions C(ts) and C'(ts) on the two-dimensional image plane UF and the guide image plane GF, respectively, as shown in FIGS. The center positions are determined so that the two-dimensional image plane UF and the guide image plane GF anatomically correspond to each other.
- Next, a direction vector V'(ts) of the guide image plane GF is derived using the position vector O'C'(ts).
- this position vector OR (ts) is expressed by the following equation.
- A unit point R'(ts) anatomically corresponding to the unit point R(ts) is defined on the orthogonal coordinate system x'y'z'. The position vector OR(ts) of the unit point R(ts) is equal to the sum of the position vector OC(ts) and the direction vector V(ts).
- The direction vector V(ts) is obtained by using the position vectors OR(ts) and OC(ts) as follows:
- V(ts) = OR(ts) - OC(ts) ... (43)
- Similarly, the image forming unit 16c normalizes the difference between the position vector O'R'(ts) and the position vector O'C'(ts) to unit length to obtain the direction vector V'(ts).
- V'(ts) = (O'R'(ts) - O'C'(ts)) / |O'R'(ts) - O'C'(ts)|
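The normalization of the difference vector to unit length can be sketched as follows (NumPy assumed; the function name is illustrative, not from the patent):

```python
import numpy as np

def unit_direction(o_c, o_r):
    """Direction vector of the guide image plane: the difference between the
    position vectors O'R'(ts) and O'C'(ts), normalized to unit length."""
    d = o_r - o_c
    return d / np.linalg.norm(d)

# Example: C' at (1, 2, 3) and R' at (5, 2, 3) give the unit vector along +x.
v = unit_direction(np.array([1.0, 2.0, 3.0]), np.array([5.0, 2.0, 3.0]))
```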
- In this way, the image forming unit 16c obtains the direction vectors V(ts) and V'(ts). The direction vector V'(ts) is, as described above, derived from the position vector O'R'(ts) and the position vector O'C'(ts).
- The direction vector V(ts) is expressed by the following equation (46) using the directional components xv(ts), yv(ts), and zv(ts).
- V(ts) = xv(ts)i + yv(ts)j + zv(ts)k ... (46)
- Similarly, the direction vector V'(ts) is obtained from the above-described position vectors O'R'(ts) and O'C'(ts).
- V'(ts) = xv'(ts)i' + yv'(ts)j' + zv'(ts)k'
- Alternatively, the image forming unit 16c calculates the direction vector V'(ts) as a normal vector, that is, a vector orthogonal to all vectors formed by arbitrary points on the guide image plane GF anatomically corresponding to the two-dimensional image plane UF. Consider two arbitrary points R1(ts), R2(ts) on the two-dimensional image plane UF; their position vectors OR1(ts) and OR2(ts) are represented by the following equations (50) and (51), respectively.
- Each position vector O'R1'(ts), O'R2'(ts) of the corresponding points R1'(ts), R2'(ts) is obtained in the same manner. The directional components x'(ts), y'(ts), z'(ts) are the coordinate components of O'R'(ts) in the x'-axis direction, the y'-axis direction, and the z'-axis direction. Further, based on the above equation (35), each coordinate data of the position vectors OR1(ts) and OR2(ts) is converted accordingly.
- The direction vector V(ts) of the two-dimensional image plane UF is expressed using the directional components xv(ts), yv(ts), and zv(ts) in the respective axial directions of the orthogonal coordinate system xyz, as in the following equation (58).
- V(ts) = xv(ts)i + yv(ts)j + zv(ts)k ... (58)
- This direction vector V(ts) is the normal vector of the two-dimensional image data at time ts, that is, of the two-dimensional image plane UF. Therefore, the direction vector V(ts) is orthogonal to the vector R1R2(ts) connecting the arbitrary points R1(ts) and R2(ts), which gives equation (59). In equation (59), V(ts)·R1R2(ts) denotes the inner product of the direction vector V(ts) and the vector R1R2(ts).
- Similarly, the direction vector V'(ts) of the guide image plane GF is defined by the directional components xv'(ts), yv'(ts), zv'(ts) in the respective axial directions of the orthogonal coordinate system x'y'z', and is represented by the following equation (61).
- V'(ts) = xv'(ts)i' + yv'(ts)j' + zv'(ts)k' ... (61)
- V'(ts) · R'1R'2(ts) = 0 ... (64)
- This equation (64) means that the direction vector V'(ts) is orthogonal to all vectors connecting arbitrary points on the guide image plane GF. That is, the direction vector V'(ts) based on the above-described equations (61) and (62) is a normal vector that determines the normal direction of the guide image plane GF. Therefore, the image forming unit 16c can calculate the direction vector V'(ts) that determines the normal direction of the guide image plane GF based on the above-described equations (61) and (62). Accordingly, the image forming unit 16c sets the direction vectors V(ts) and V'(ts) on the two-dimensional image plane UF and the guide image plane GF, respectively, as shown in FIGS. Each normal direction is determined such that the two-dimensional image plane UF and the guide image plane GF anatomically correspond to each other.
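The orthogonality condition that defines the normal vector can be illustrated as follows. The sketch uses a cross product of two in-plane vectors, an equivalent way of producing a vector orthogonal to every in-plane vector as required by equations (59) and (64); the function name is an assumption.

```python
import numpy as np

def plane_normal(c, r1, r2):
    """Unit normal of the plane containing C, R1, R2: orthogonal to every
    vector lying in that plane. Computed via a cross product of two
    in-plane edge vectors and normalized to unit length."""
    n = np.cross(r1 - c, r2 - c)
    return n / np.linalg.norm(n)

# For a plane through the origin spanned by the x and y axes, the normal
# is the z axis; any in-plane vector has zero inner product with it.
n = plane_normal(np.zeros(3), np.array([1., 0., 0.]), np.array([0., 1., 0.]))
in_plane = np.array([0.7, -0.3, 0.0])
```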
- The image forming unit 16c adjusts the orientation (center position, normal direction, 12 o'clock direction) of the guide image plane GF calculated in step S401 described above.
- the image forming unit 16c can set the guide image plane GF in which the orientation is associated as the image plane anatomically corresponding to the two-dimensional image plane UF.
- the control unit 16 notifies the image composition unit 16c that this guide image plane GF has been set.
- Next, a slice image data group corresponding to the guide image plane GF is read from the image storage unit 14 (step S402). Specifically, based on the position data regarding the position and orientation of the guide image plane GF obtained in step S401 described above, the control unit 16 reads the image data on the line of intersection between each slice image data of the slice image data group and the guide image plane GF. The image forming unit 16c performs interpolation processing and coordinate conversion processing on the read image data on the intersection lines, and creates guide image data corresponding to a cut of the slice image data group in the image storage unit 14 along the guide image plane GF (step S403). After that, the control unit 16 performs the processing step of step S109 described above.
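The step of cutting the stored slice image data group along the guide image plane GF can be sketched as follows. This is a simplified nearest-neighbor sampler standing in for the interpolation and coordinate conversion processing described above; all names and the voxel indexing convention are assumptions.

```python
import numpy as np

def sample_guide_image(volume, origin, u_axis, v_axis, shape):
    """Sample guide-image pixels from a slice-image stack: each output pixel
    (i, j) is taken from the voxel nearest the point origin + i*v + j*u
    on the guide image plane GF."""
    h, w = shape
    img = np.zeros(shape)
    for i in range(h):
        for j in range(w):
            p = origin + i * v_axis + j * u_axis   # point on the plane GF
            z, y, x = np.round(p).astype(int)      # nearest voxel index
            if (0 <= z < volume.shape[0] and 0 <= y < volume.shape[1]
                    and 0 <= x < volume.shape[2]):
                img[i, j] = volume[z, y, x]
    return img

# Sanity check: an axis-aligned plane at z = 1 reproduces the middle slice.
vol = np.arange(27, dtype=float).reshape(3, 3, 3)
g = sample_guide_image(vol, np.array([1., 0., 0.]),
                       np.array([0., 0., 1.]), np.array([0., 1., 0.]), (3, 3))
```

A production implementation would use trilinear interpolation rather than nearest-neighbor lookup, but the plane-sampling structure is the same.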
- the control unit 16 sets the feature points on the slice image data based on the coordinate information of the feature points input by the operator using the input device 11.
- The present invention is not limited to this. If the region of interest of the examination is determined in advance (that is, the protocol has been determined), sets of several types of feature points may be stored as default coordinate data together with the slice image data group at the time of factory shipment. The control unit 16 may then read out the default coordinate data of the feature points from the image storage unit 14 and set them as feature points based on instruction information on the selection of feature points input from the input device 11.
- In the first embodiment described above, when setting the feature points, the control unit 16, based on the image display instruction information sequentially input from the input device 11 by the operator's input operation, sequentially reads out the slice image data from the front of the slice image data group toward the rear, and sequentially displays each slice image on the screen of the display device 12. However, the present invention is not limited to this. The slice image data group may be read out collectively based on the image display instruction information, and a list of the slice images may be displayed on the screen of the display device 12.
- In the first embodiment described above, the plate 9 or the marker coil 8 is attached to a plurality of predetermined positions on the subject, such as the xiphoid process or the pelvis, and after differences in the various position data caused by changes in the subject's body position or by differences in physique are compensated for, the plate 9 is left on the subject's body surface and the marker coil 8 is removed; the difference in the coordinate data of the sample points caused by a change in the body position of the subject under examination is then corrected based on the position data obtained from the alternating magnetic field of the remaining plate 9.
- the present invention is not limited to this.
- The sample points may be sequentially set based on the position data obtained from the alternating magnetic field of the marker coil 8, without correcting the coordinate data of the sample points. Alternatively, the coordinate data of the sample points may be corrected for changes in the body position of the subject based on the alternating magnetic fields output from the plate 9 and the marker coil 8 attached to the body surface of the subject during the examination. With such a configuration, by attaching the marker coil 8 to an appropriate place, the accuracy of the guide image anatomically corresponding to the two-dimensional ultrasonic image is increased.
- In the first embodiment described above, the guide image plane is set by calculating the position and orientation (center position, normal direction, 12 o'clock direction) of the guide image plane.
- the present invention is not limited to this.
- The coordinate points at the four corners of the acquired two-dimensional image data may be detected and, based on the above equation (35), the four coordinate points anatomically corresponding to these four corner points may be obtained; the guide image plane may then be calculated based on these four corresponding coordinate points.
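The alternative of deriving the guide image plane from the four mapped corner points can be sketched as follows. The center is taken as the mean of the corners and the normal from two edge vectors; the function name and this particular construction are illustrative assumptions, not the patent's prescription.

```python
import numpy as np

def plane_from_corners(corners):
    """Derive a guide image plane (center point and unit normal) from the
    four anatomically mapped corner points, given as a 4x3 array in
    order around the rectangle."""
    center = corners.mean(axis=0)
    n = np.cross(corners[1] - corners[0], corners[3] - corners[0])
    return center, n / np.linalg.norm(n)

# A 2x2 square in the z = 0 plane has its center at (1, 1, 0)
# and the z axis as its normal.
corners = np.array([[0., 0., 0.], [2., 0., 0.], [2., 2., 0.], [0., 2., 0.]])
center, normal = plane_from_corners(corners)
```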
- Further, designation information relating to the size of the guide image, for example numerical information or selection information in which the display size and display magnification are set in advance, may be input or selected from the input device 11, and the size of the guide image plane may be determined based on this input designation information.
- In the first embodiment described above, the first coil and the second coil of the transmission coil 7 are provided orthogonally to the ultrasonic transducer 3a, with one coil axis direction oriented in the normal direction of the two-dimensional image data and the other coil axis direction oriented in the 12 o'clock direction of the two-dimensional image data. However, the present invention is not limited to this. As long as the transmission coil 7 is in a fixed positional relationship with respect to the ultrasonic transducer 3a and the coil axis direction of the first coil or the second coil is oriented in a known direction, the normal direction and the 12 o'clock direction of the two-dimensional image data may be corrected and calculated based on this known direction and positional relationship. For example, the first coil and the second coil may be provided obliquely. With such a configuration, in which the normal direction and the 12 o'clock direction of the image data can be corrected and calculated, the insertion portion 3 of the probe 2 can be made thinner.
- the reception coil 10 receives the alternating magnetic field from the transmission coil 7 provided in the probe 2 to detect the position data on the ultrasonic transducer 3a.
- Alternatively, the reception coil may be provided in the probe 2 and the transmission coil in the position data calculating device 6, and the position data of the ultrasonic transducer 3a may be detected by the reception, at the coil provided in the probe 2, of the alternating magnetic field generated by the transmission coil.
- Alternatively, the position and orientation of the two-dimensional image data may be detected based on acceleration corresponding to the relative position change between the ultrasonic transducer 3a and the subject.
- the origin O of the orthogonal coordinate system xyz is set at a predetermined position of the receiving coil 10, for example, near the center axis of the alternating magnetic field receiving surface.
- the origin O may be set at a desired position where the relative positional relationship with the receiving coil 10 does not change.
- a guide image is created by using slice image data created in advance based on photographic data that is a cross-sectional image of a frozen human body other than the subject.
- the present invention is not limited to this.
- a two-dimensional image data group is obtained by a radial scan performed by the ultrasonic diagnostic apparatus 1 or a similar ultrasonic diagnostic apparatus on a subject or a human body other than the subject.
- a color-coded image data group obtained by color-coding the two-dimensional image data group by organ is stored in the image storage unit 14 in advance, and this color-coded image data group may be used instead of the above-described slice image data group.
- Alternatively, three-dimensional image data obtained in advance using another modality, such as PET (Positron Emission Tomography) or an extracorporeal ultrasonic diagnostic apparatus that irradiates ultrasonic waves from outside the body, may be stored in the image storage unit 14 in advance, and feature points may be set and a guide image created using this three-dimensional image data.
- As described above, in the first embodiment, a slice image data group, which is anatomical image data color-coded for each organ, is stored in advance; feature points are set on the orthogonal coordinate system x'y'z' of the slice image data group; sample points anatomically corresponding to these feature points are set on the subject's orthogonal coordinate system xyz; and guide image data is created using the two-dimensional image data inside the subject obtained by performing a radial scan, the position data relating to the position and orientation of the two-dimensional image plane, the coordinate data of at least four feature points, and the coordinate data of the at least four sample points anatomically corresponding to these feature points. Furthermore, since the past sample points are corrected and updated to the current sample points, guide image data anatomically corresponding to the two-dimensional image data sequentially acquired in real time can be obtained accurately, and the accuracy of the anatomical correspondence of the guide image displayed and output together with the two-dimensional ultrasonic image can be improved.
- Further, two of the sample points are set at coordinate points near the body surface of the subject, for example at the xiphoid process and at the right end of the pelvis, respectively, and the remaining sample points are set, based on the position data from the transmission coil located in the subject, at positions inside the subject, for example near the pylorus and near the duodenal papilla. Accordingly, the time required to clean contamination of the probe before the operation can be reduced.
- In addition, the sample points can be set at positions in the body that are displaced in accordance with the movement of the target region of interest, and at positions in the body near the target region of interest, so that the accuracy of the anatomical correspondence of the guide image to the two-dimensional ultrasonic image can be improved. In particular, since the pylorus and the duodenal papilla may move together with the pancreas head as the insertion portion of the probe moves, setting these points as sample points can enhance the anatomical accuracy of the guide image.
- As a result, the operator can simultaneously check the two-dimensional ultrasonic image and the guide image. For example, by referring to the color-coded organ image indicated by the guide image, the operator can accurately and easily recognize which anatomical position of the subject the current two-dimensional ultrasonic image indicates. This makes it easy to find a region of interest, such as a lesion, in the subject and to observe this region of interest accurately, so that medical diagnosis of the subject can be performed accurately and efficiently. This is medically far more useful than an ultrasonic diagnostic apparatus that irradiates ultrasonic waves from outside the subject's body, contributing significantly to reducing the examination time for the subject and the training time required by a beginning operator.
- In the first embodiment described above, the guide image data is created using the slice image data group previously stored in the image storage unit 14. In the second embodiment, anatomical image data created by an image diagnostic apparatus such as a three-dimensional MRI (Magnetic Resonance Imaging) apparatus or an X-ray three-dimensional helical CT (Computed Tomography) apparatus is acquired via a network, and guide image data is created using the acquired anatomical image data.
- FIG. 12 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 2 of the present invention.
- This ultrasonic diagnostic apparatus 21 includes an image processing device 22 in place of the image processing device 13.
- the image processing device 22 is provided with a control unit 23 instead of the control unit 16, and further includes a communication circuit 24.
- the communication circuit 24 is electrically connected to the control unit 23, and is further communicably connected to a 3D MRI apparatus 25 and an X-ray 3D helical CT apparatus 26 via a network 27.
- Other configurations are the same as those of the first embodiment, and the same components are denoted by the same reference numerals.
- The control unit 23 has the same configuration and functions as the control unit 16 described above, and further controls the communication circuit 24 so as to perform information communication processing with the three-dimensional MRI apparatus 25 or the X-ray three-dimensional helical CT apparatus 26 via the network 27.
- When the operator uses the input device 11 to input acquisition instruction information for instructing the acquisition of anatomical image data and selection instruction information for selecting the transmission source of the anatomical image data, the control unit 23 detects the input acquisition instruction information and selection instruction information and, based on them, performs the information communication processing described above, thereby acquiring anatomical image data from the three-dimensional MRI apparatus 25 or the X-ray three-dimensional helical CT apparatus 26 via the communication circuit 24 and the network 27.
- Specifically, the control unit 23 acquires three-dimensional volume data as the anatomical image data from the three-dimensional MRI apparatus 25, and acquires a two-dimensional CT image data group including a plurality of two-dimensional CT image data as the anatomical image data from the X-ray three-dimensional helical CT apparatus 26.
- This volume data is a set of voxels, each identified by a monochrome or color luminance value or the like, covering the entire three-dimensional region of the subject or another subject.
- the two-dimensional CT image data group is a two-dimensional tomographic image data group of the subject or another subject, and is arranged in substantially the same manner as the above-described slice image data group.
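The relationship between the two representations can be illustrated as follows: the volume data is a single voxel array, while the two-dimensional CT image data group is an ordered set of parallel tomographic slices that, when stacked, yields an equivalent array. The shapes and dtype below are arbitrary choices for the sketch.

```python
import numpy as np

# Volume data: one 3D array of voxels, each holding a luminance value.
volume = np.zeros((64, 64, 64), dtype=np.uint8)

# Two-dimensional CT image data group: parallel tomographic slices arranged
# in acquisition order; stacking them produces the same kind of 3D array.
ct_group = [np.zeros((64, 64), dtype=np.uint8) for _ in range(64)]
stacked = np.stack(ct_group)
```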
- The communication circuit 24 is realized by using a large-capacity, high-speed communication modem or the like. Under the control of the control unit 23, the communication circuit 24 receives the volume data from the three-dimensional MRI apparatus 25 via the network 27, using predetermined optical communication or high-speed telephone line communication, and sends this volume data to the control unit 23; or it receives the two-dimensional CT image data group from the X-ray three-dimensional helical CT apparatus 26 and sends the two-dimensional CT image data group to the control unit 23. [0206] When receiving the two-dimensional CT image data group from the communication circuit 24, the control unit 23 associates the two-dimensional CT image data group with the orthogonal coordinate system x'y'z' in substantially the same manner as the association of the orthogonal coordinate system x'y'z' with the slice image data group described above, and stores the associated two-dimensional CT image data group in the image storage unit 14.
- the control unit 23 performs the above-described steps S101 to S110 using the two-dimensional CT image data group instead of the slice image data group.
- That is, the control unit 23 uses the two-dimensional CT image data group instead of the slice image data group described above, performs the processing steps of steps S201 to S206 described above to set the feature points described above, and performs the processing steps of steps S401 to S403 described above to create the guide image data described above.
- On the other hand, when receiving the volume data from the communication circuit 24, the control unit 23 sets the orthogonal coordinate system x'y'z' at a predetermined position of the volume data, for example so that a corner of the frame indicating the outer frame of the volume data coincides with the origin O', and stores the volume data associated with the orthogonal coordinate system x'y'z' in the image storage unit 14. Thereafter, the control unit 23 performs the above-described processing steps of steps S101 to S110 using the volume data instead of the above-described slice image data group.
- FIG. 13 is a flowchart illustrating a processing flow until the control unit 23 sets the above-described feature points using the volume data.
- FIG. 14 is a schematic diagram illustrating the operation of the control unit 23 for setting the cross section of the volume data. 13 and 14, this volume data VD has a frame VF corresponding to its outer frame and an image information plane marker FM corresponding to its cross-sectional position and translating in the volume data VD. As described above, the rectangular coordinate system x'y'z 'is set in the volume data VD such that the predetermined corner of the frame VF matches the origin O'.
- the control unit 23 causes the display device 12 to display and output the volume data VD based on the instruction information input from the input device 11, and in this state, uses the volume data VD and replaces the above-described steps S201 to S206 with FIG. The following feature point setting process is performed.
- In step S501, when the operator performs an input operation of the cross-section designation information for designating a cross section of the volume data VD using the input device 11, the control unit 23 detects the input cross-section designation information (step S501, Yes), sets the cross section of the volume data VD based on the detected cross-section designation information, reads the cross-sectional image data of the volume data VD at the set cross section from the image storage unit 14, and performs image display processing to display the cross-sectional image corresponding to the read cross-sectional image data on the display device 12 (step S502).
- Specifically, the operator operates the input device 11 to move the cursor K1 displayed on the display device 12 to a desired position in the volume data VD, for example as shown in FIG. 14, thereby inputting the cross-section designation information and designating the position of the image information plane marker FM to the control unit 23. The control unit 23 sets, as the cross section of the volume data VD, a cross section whose position and orientation match the image information plane marker FM based on this cross-section designation information.
- the control unit 23 reads out the cross-sectional image data of the volume data VD corresponding to the set cross-section from the image storage unit 14, and then transmits the read-out cross-sectional image data to the display circuit 15.
- the display circuit 15 outputs an image signal corresponding to the received cross-sectional image data to the display device 12, similarly to the above-described slice image data.
- the control unit 23 achieves a cross-sectional image display process using the cross-sectional image data, and arranges and outputs the volume data VD and the cross-sectional image on the same screen of the display device 12.
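Reading a cross-sectional image out of the volume data at the marker position can be sketched, for the axis-aligned case, as follows (the function name and the axis convention are assumptions for the sketch):

```python
import numpy as np

def cross_section(volume_data, marker_index, axis=0):
    """Extract the cross-sectional image of the volume data at the position
    of the image information plane marker FM, assuming the marker lies on
    an axis-aligned plane of the voxel grid."""
    return np.take(volume_data, marker_index, axis=axis)

# A 2x2x2 toy volume: the marker at index 1 selects the second z slice.
vd = np.arange(8.0).reshape(2, 2, 2)
img = cross_section(vd, 1)
```

An obliquely oriented marker would require resampling along the tilted plane instead of a simple index lookup.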
- If the control unit 23 does not detect the cross-section designation information (step S501, No), it repeats the processing step of step S501.
- When new cross-section designation information is input, the control unit 23 detects the input cross-section designation information (step S503, Yes) and repeats the above-described processing steps from step S502.
- In this case, based on the cross-section designation information detected in step S503, the control unit 23 sequentially reads out the cross-sectional image data of the volume data VD for each piece of detected cross-section designation information, sequentially transmits the read cross-sectional image data to the display circuit 15, and causes the display device 12 to sequentially update and output each cross-sectional image.
- By observing the cross-sectional images sequentially displayed in this way, the operator can find anatomically distinctive sites, such as the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla, as in the case of the slice image data described above.
- Thereafter, if the control unit 23 detects input feature point coordinate information (step S504, Yes) without detecting cross-section designation information (step S503, No), it sets the coordinate data based on the detected feature point coordinate information as the coordinate data of a feature point on the orthogonal coordinate system x'y'z' (step S505).
- In this case, triggered by the detection of the feature point coordinate information, the control unit 23 switches from the mode for setting a cross section of the volume data VD (cross-section setting mode) to the mode for setting a feature point (feature point setting mode).
- Thereafter, if the control unit 23 does not detect the feature point end instruction information (step S506, No), it repeats the processing from step S503 described above.
- By repeating these steps, the operator can sequentially designate and input the feature point coordinate information of the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla, in substantially the same manner as with the slice image data described above. Based on the sequentially input feature point coordinate information, the control unit 23 sets the feature points corresponding to the xiphoid process, the right end of the pelvis, the pylorus, and the duodenal papilla on the orthogonal coordinate system x'y'z' in substantially the same manner as in the first embodiment.
- If the control unit 23 detects the cross-section designation information in step S503, it uses this detection as a trigger to switch from the above-described feature point setting mode back to the above-described cross-section setting mode, and repeats the above-described processing steps from step S502.
- In step S506, when the operator performs an input operation of the feature point end instruction information using the input device 11, the control unit 23 detects the input feature point end instruction information (step S506, Yes) and performs the processing steps after step S102 described above, using the volume data VD as necessary instead of the slice image data group SDG described above. If the control unit 23 does not detect the feature point coordinate information in step S504 described above (step S504, No), it repeats the processing steps after step S503 described above.
- FIG. 15 is a schematic diagram schematically illustrating a state in which the volume data VD and the cross-sectional image of the cross section corresponding to the image information plane marker FM are displayed side by side on the same screen and output.
- In FIG. 15, a cross-sectional image DG showing the xiphoid process of the rib H is shown.
- As described above, the control unit 23 causes the volume data VD and the cross-sectional image of the cross section corresponding to the image information plane marker FM (for example, the cross-sectional image DG) to be displayed side by side on the same screen of the display device 12. That is, as shown in FIG. 15, the control unit 23 arranges and outputs the volume data VD having the frame VF and the image information plane marker FM, together with the cross-sectional image DG, on the same screen. Further, as shown in FIG. 15, the control unit 23 superimposes the movable cursor K1 on the volume data VD in the above-described cross-section setting mode, and superimposes the movable cursor K2 on the cross-sectional image DG in the above-described feature point setting mode.
- In the cross-section setting mode, the control unit 23 moves the cursor K1 to the position based on the above-described cross-section designation information, and moves the image information plane marker FM to the position to which the cursor K1 has moved. That is, the control unit 23 matches the position based on the cross-section designation information with the position of the image information plane marker FM, and displays and outputs both the volume data VD and the cross-sectional image DG.
- In the feature point setting mode, the control unit 23 moves the cursor K2 to the position based on the above-described feature point coordinate information, and sets a feature point at the position to which the cursor K2 has moved. Accordingly, the control unit 23 sets the feature point P' at the position based on the feature point coordinate information, for example the position of the xiphoid process of the rib H, as shown in FIG. 15.
- In the second embodiment, the two-dimensional CT image data group acquired as anatomical image data from the X-ray three-dimensional helical CT apparatus 26 is used in the same manner as the slice image data group described above. However, the present invention is not limited to this. Three-dimensional CT image data may be created from this two-dimensional CT image data group by superimposing the two-dimensional CT image data or by interpolating between the two-dimensional CT image data, and feature points may then be set and a guide image created in the same manner as in the first embodiment.
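The interpolation between adjacent two-dimensional CT images to build three-dimensional CT data can be sketched as follows. Linear interpolation is one simple choice; the names and the `factor` parameter are illustrative assumptions.

```python
import numpy as np

def interpolate_slices(ct_group, factor=2):
    """Build three-dimensional CT data from a two-dimensional CT image data
    group by inserting linearly interpolated slices between each pair of
    adjacent slices (factor interpolation steps per gap)."""
    slices = [ct_group[0]]
    for prev, nxt in zip(ct_group, ct_group[1:]):
        for k in range(1, factor + 1):
            t = k / factor
            slices.append((1 - t) * prev + t * nxt)
    return np.stack(slices)

# Two toy slices with values 0 and 2: the inserted middle slice averages them.
group = [np.zeros((2, 2)), np.full((2, 2), 2.0)]
vol3d = interpolate_slices(group)
```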
- In the second embodiment, the volume data acquired from the three-dimensional MRI apparatus 25 or the two-dimensional CT image data group acquired from the X-ray three-dimensional helical CT apparatus 26 is used as the anatomical image information to set the feature points described above and create a guide image. However, the present invention is not limited to this. A two-dimensional image data group acquired by an ultrasonic diagnostic apparatus having the same functions as the ultrasonic diagnostic apparatus 1 of the first embodiment may be color-coded in advance for each organ, the color-coded two-dimensional image data group may be acquired from that apparatus via the network 27, and feature points may be set and a guide image created using the acquired two-dimensional image data group.
- alternatively, three-dimensional image data acquired in advance using another modality, such as PET (Positron Emission Tomography) or an extracorporeal ultrasonic diagnostic apparatus that irradiates ultrasonic waves from outside the body, may be acquired via the network 27, and a guide image may be created by setting feature points using the acquired three-dimensional image data.
- further, the control unit 23 may color-code, for each organ, the volume data acquired from the three-dimensional MRI apparatus 25 or the two-dimensional CT image data group acquired from the X-ray three-dimensional helical CT apparatus 26 via the network 27 on the basis of the luminance values of the data, store the color-coded data in the image storage unit 14, and create a color-coded guide image in the same manner as with the slice image data group described above.
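The organ-by-organ colour separation described above can be sketched as a simple classification of voxel luminance values into ranges. This is a minimal illustration only; the function name, luminance ranges, and colour labels are hypothetical assumptions, not values taken from this specification.

```python
def color_code_by_luminance(voxels, organ_ranges):
    """Assign each voxel a display colour from its luminance value.

    organ_ranges maps an organ name to (low, high, colour): voxels whose
    luminance falls in [low, high) are given that organ's colour. Voxels
    matching no range are labelled "unclassified".
    """
    colored = []
    for v in voxels:
        label = "unclassified"
        for organ, (low, high, color) in organ_ranges.items():
            if low <= v < high:
                label = color
                break
        colored.append(label)
    return colored


# Hypothetical luminance ranges (lower bound, upper bound, display colour):
RANGES = {
    "liver": (40, 80, "red"),
    "pancreas": (80, 120, "green"),
    "bone": (200, 256, "white"),
}
```

Applied to a volume data set, such a mapping yields a colour label per voxel, which can then be carried through to the guide image created from that data.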
- as described above, in the second embodiment, anatomical image data, for example two-dimensional CT image data or volume data as three-dimensional image data, is acquired from the outside by optical communication or high-speed telephone line communication, and a guide image is created using the acquired anatomical image data.
- such anatomical image data can therefore be used as the original data of the guide image data, and anatomical image data of a desired range according to the target region of interest can be obtained while enjoying the operation and effect of the first embodiment; thus a guide image indicating the anatomical correspondence to the displayed and output two-dimensional ultrasonic image of the subject can be created and output more accurately.
- in Embodiment 3, as in Embodiment 1 described above, the two-dimensional ultrasonic image and the guide image anatomically corresponding to the two-dimensional ultrasonic image are displayed and output side by side on the same screen; in addition, at least one of the two-dimensional ultrasonic image and the guide image can be rotated.
- FIG. 16 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 3 of the present invention.
- the ultrasonic diagnostic apparatus 31 includes an image processing device 32 instead of the image processing device 13.
- the image processing device 32 is provided with a control unit 33 instead of the control unit 16.
- the control unit 33 has the configuration and functions of the control unit 16 described above, and further includes a rotation processing unit 33a.
- Other configurations are the same as those of the first embodiment, and the same components are denoted by the same reference numerals.
- when the operator performs an input operation using the input device 11, the control unit 33 switches the operation mode to the two-dimensional image rotation mode based on the mode switching instruction information input by the input operation. If the operation mode of the control unit 33 is the two-dimensional image rotation mode, the rotation processing unit 33a, under the control of the control unit 33 and based on the angle information input from the input device 11, rotates each coordinate point of the two-dimensional image plane of the two-dimensional image data of the two-dimensional ultrasonic image displayed and output on the display device 12, with the center position C of the two-dimensional image plane as the rotation center.
- This angle information is information relating to the angle of rotation of the two-dimensional ultrasonic image around the center of the image.
- the control unit 33 transmits the rotated two-dimensional image data to the display circuit 15, and the display circuit 15 transmits an image signal corresponding to the two-dimensional image data to the display device 12. Accordingly, the control unit 33 can cause the display device 12 to display and output a two-dimensional ultrasonic image corresponding to the two-dimensional image data on which the rotation process has been performed.
- the rotation processing unit 33a rotates each coordinate point of the two-dimensional image data according to the direction and the angle based on the angle information input from the input device 11.
- the input device 11 inputs the angle information to the control unit 33 in addition to the above-described various information such as the instruction information and the coordinate information. For example, when a keyboard or a touch panel is used, this angle information is input by entering or selecting a numerical value corresponding to the angle information, or by moving a cursor or the like superimposed on the two-dimensional ultrasonic image of the display device 12 in a predetermined direction by a key operation.
- for example, when the cursor moves in a predetermined direction, the input device 11 inputs to the control unit 33 angle information for rotating the two-dimensional ultrasonic image in the positive direction by an angle corresponding to the amount of movement, and when the cursor moves in the opposite direction, angle information for rotating the two-dimensional ultrasonic image in the negative direction by an angle corresponding to the amount of movement.
- specifically, when the operator performs a drag operation or a key operation and the cursor moves rightward on the screen, the input device 11 inputs to the control unit 33 angle information for rotating the two-dimensional ultrasonic image in the positive direction by an angle corresponding to the amount of movement; when the cursor moves in the opposite direction, angle information for rotating the two-dimensional ultrasonic image in the negative direction by an angle corresponding to the amount of movement is input.
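The rotation processing described above, in which each coordinate point of the two-dimensional image plane is rotated about the center position C by an angle derived from the cursor movement, can be sketched as follows. The function names and the degrees-per-pixel scale are illustrative assumptions, not part of this specification.

```python
import math


def rotate_image_points(points, center, angle_deg):
    """Rotate 2-D image coordinate points about a center position.

    Positive angles rotate counter-clockwise; negative angles clockwise.
    """
    cx, cy = center
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    rotated = []
    for x, y in points:
        dx, dy = x - cx, y - cy  # translate so the rotation center is the origin
        rotated.append((cx + dx * cos_t - dy * sin_t,
                        cy + dx * sin_t + dy * cos_t))
    return rotated


def cursor_drag_to_angle(dx_pixels, degrees_per_pixel=0.5):
    """Map a signed cursor movement (positive = rightward drag) to a signed
    rotation angle, as in the drag-operation input described above. The
    scale factor is a hypothetical choice."""
    return dx_pixels * degrees_per_pixel
```

A rightward drag of the cursor thus yields a positive angle and a positive (counter-clockwise) rotation of every coordinate point, while a drag in the opposite direction yields a negative angle.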
- FIG. 17 is a schematic diagram schematically illustrating a state in which a guide image and a two-dimensional ultrasonic image that has been subjected to a rotation process are displayed and output side by side on the same screen.
- FIG. 17 illustrates the above-described two-dimensional ultrasonic image UG as the two-dimensional ultrasonic image.
- the guide image GG described above is exemplified as the guide image. Note that the two-dimensional ultrasonic image UG and the guide image GG anatomically correspond to each other as described above.
- the cursor K is superimposed on the two-dimensional ultrasonic image UG.
- the control unit 33 moves, for example, a cursor K on a screen in a predetermined direction based on the angle information input from the input device 11 in the two-dimensional image rotation mode described above.
- then, the rotation processing unit 33a uses the angle information to rotate each coordinate point of the two-dimensional image data of the two-dimensional ultrasonic image UG in the rotation direction corresponding to the movement direction of the cursor K by, for example, an angle corresponding to the amount of movement of the cursor K.
- thereafter, as shown in FIG. 17, the control unit 33 uses the two-dimensional image data subjected to the rotation processing to display and output the two-dimensional ultrasonic image UG rotated in the rotation direction corresponding to the movement direction of the cursor K by an angle corresponding to the amount of movement of the cursor K.
- control unit 33 can substantially match the actual body orientation of the subject visually recognized by the operator with the up, down, left, and right directions of the two-dimensional ultrasonic image of the subject. This allows the operator to easily understand the association between the two-dimensional ultrasound image on the screen and the actual subject, and promotes the efficiency of medical diagnosis for the subject.
- further, when the operator performs an input operation using the input device 11, the control unit 33 switches the operation mode to the guide image rotation mode based on the mode switching instruction information input by the input operation.
- if the operation mode of the control unit 33 is the guide image rotation mode, the rotation processing unit 33a, under the control of the control unit 33 and based on the angle information input from the input device 11, rotates each coordinate point on the guide image plane of the guide image data of the guide image displayed and output on the display device 12, with the center position C′ of the guide image plane as the rotation center.
- the angle information input in the guide image rotation mode is information on an angle at which the guide image is rotated around the center of the image.
- the angle information of the guide image rotation mode is input to the control unit 33 by the operator performing an input operation using the input device 11 in the same manner as in the two-dimensional image rotation mode described above.
- control unit 33 transmits the guide image data on which the rotation processing has been performed to the display circuit 15, and the display circuit 15 transmits an image signal corresponding to the guide image data to the display device 12. Accordingly, the control unit 33 can cause the display device 12 to display and output a guide image corresponding to the guide image data on which the rotation process has been performed.
- FIG. 18 is a schematic diagram schematically illustrating a state in which a two-dimensional ultrasonic image and a guide image on which a rotation process has been performed are output side by side on the same screen.
- FIG. 18 illustrates the above-described two-dimensional ultrasonic image UG as the two-dimensional ultrasonic image.
- the guide image GG described above is exemplified as the guide image.
- the cursor K is superimposed on the guide image GG.
- the control unit 33 moves, for example, the cursor K on the screen in a predetermined direction based on the angle information input from the input device 11.
- then, the rotation processing unit 33a uses the angle information to rotate each coordinate point of the guide image data of the guide image GG in the rotation direction corresponding to the movement direction of the cursor K by, for example, an angle corresponding to the amount of movement of the cursor K.
- thereafter, as shown in FIG. 18, the control unit 33 uses the rotated guide image data to display and output the guide image GG rotated in the rotation direction corresponding to the movement direction of the cursor K by an angle corresponding to the amount of movement of the cursor K. Accordingly, the control unit 33 can substantially match the actual body orientation of the subject observed by the operator with the vertical and horizontal directions of the guide image. This allows the operator to easily understand the association between the guide image on the screen and the actual subject.
- further, when the operator performs an input operation using the input device 11, the control unit 33 switches the operation mode to the image-linked rotation mode based on the mode switching instruction information input by the input operation.
- if the operation mode of the control unit 33 is the image-linked rotation mode, the rotation processing unit 33a, under the control of the control unit 33 and based on the angle information input from the input device 11, rotates each coordinate point of the two-dimensional image plane of the two-dimensional image data of the two-dimensional ultrasonic image displayed and output on the display device 12, using the center position C of the two-dimensional image plane as the center of rotation, and at the same time rotates each coordinate point of the guide image plane of the guide image data of the guide image displayed and output on the display device 12, using the center position C′ of the guide image plane as the center of rotation.
- note that the angle information input in the image-linked rotation mode is information on the angle at which the two-dimensional ultrasonic image is rotated about its image center and on the angle at which the guide image is rotated about its image center.
- the angle information of the image-linked rotation mode is input to the control unit 33 by the operator performing the same input operation as the two-dimensional image rotation mode or the guide image rotation mode using the input device 11.
- FIG. 19 is a schematic diagram schematically illustrating a state in which a two-dimensional ultrasound image on which rotation processing has been performed and a guide image on which rotation processing has been performed are displayed side by side on the same screen and output.
- the above-described two-dimensional ultrasonic image UG is illustrated as the two-dimensional ultrasonic image
- the above-described guide image GG is illustrated as the guide image.
- in FIG. 19, the cursor K is superimposed on the guide image GG, the cursor K moves according to the operator's input operation of angle information, and the two-dimensional ultrasonic image UG rotates together with the guide image GG; however, the present invention is not limited to this.
- the cursor K may instead be superimposed on the two-dimensional ultrasonic image UG, the cursor K moving according to the operator's input operation of angle information, and the two-dimensional ultrasonic image UG may be rotated together with the guide image GG.
- the control unit 33 moves the cursor K on the screen, for example, in a predetermined direction based on the angle information input from the input device 11 in the image-linked rotation mode described above.
- then, the rotation processing unit 33a uses the angle information to rotate each coordinate point of the guide image data of the guide image GG in the rotation direction corresponding to the movement direction of the cursor K by, for example, an angle corresponding to the amount of movement of the cursor K.
- at the same time, based on the angle information, the rotation processing unit 33a rotates each coordinate point of the two-dimensional image data of the two-dimensional ultrasonic image UG by the same angle, and in the same rotation direction, as each coordinate point of the guide image data.
- thereafter, as shown in FIG. 19, the control unit 33 can display and output the guide image GG rotated in the rotation direction corresponding to the movement direction of the cursor K by an angle corresponding to the amount of movement of the cursor K, and the two-dimensional ultrasonic image UG rotated by the same angle in the same rotation direction as the guide image GG. Accordingly, the control unit 33 can substantially match the actual body orientation of the subject visually recognized by the operator with the vertical and horizontal directions of both the two-dimensional ultrasonic image of the subject and the guide image.
- in the third embodiment, an input operation of angle information using the input device 11, for example a drag operation, a key operation, or an operation of inputting or selecting a numerical value corresponding to an angle and a rotation direction, has been exemplified; however, the present invention is not limited to this.
- likewise, although the apparatus has been described as configured to input information on the angle and direction by which each coordinate point of at least one of the two-dimensional image data and the guide image data is rotated, the present invention is not limited to this; a unit angle and a rotation direction for rotating each coordinate point of at least one of the two-dimensional image data and the guide image data may be input, and each coordinate point of at least one of the two-dimensional image data and the guide image data may be sequentially rotated by the unit angle in that rotation direction.
- as described above, in the third embodiment, in addition to the configuration and functions of the first embodiment described above, each coordinate point of at least one of the two-dimensional image data corresponding to the two-dimensional ultrasonic image and the guide image data corresponding to the guide image is rotated based on angle information corresponding to a rotation direction and an angle of at least one of the two-dimensional ultrasonic image and the guide image, and a rotation-processed two-dimensional ultrasonic image, a rotation-processed guide image, or a two-dimensional ultrasonic image and a guide image rotated in the same rotation direction by the same angle are displayed and output. An ultrasonic diagnostic apparatus having these functions can thereby be realized.
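The image-linked rotation mode summarized above, in which the two-dimensional ultrasonic image and the guide image rotate in the same direction by the same angle, each about its own image center, might be sketched as follows; the function names are illustrative assumptions, not from this specification.

```python
import math


def rotate_points(points, center, angle_deg):
    """Rotate coordinate points about a given center (counter-clockwise for
    positive angles)."""
    cx, cy = center
    t = math.radians(angle_deg)
    c, s = math.cos(t), math.sin(t)
    return [(cx + (x - cx) * c - (y - cy) * s,
             cy + (x - cx) * s + (y - cy) * c) for x, y in points]


def rotate_linked(us_points, us_center, guide_points, guide_center, angle_deg):
    """Image-linked rotation: apply the same rotation direction and the same
    angle to both the 2-D ultrasound image data and the guide image data,
    each rotated about its own image center."""
    return (rotate_points(us_points, us_center, angle_deg),
            rotate_points(guide_points, guide_center, angle_deg))
```

Because a single angle value drives both calls, the two displayed images stay in the same relative orientation however far the operator rotates them.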
- as a result, even if there is an angular deviation between the 12 o'clock direction of the ultrasonic transducer and the 12 o'clock direction detected by the ultrasonic observation apparatus, caused by torsion of the shaft that flexibly connects the ultrasonic transducer and the motor, the operator can observe at least one of the two-dimensional ultrasonic image and the guide image in accordance with the body position of the subject. For example, when the right edge of the pelvis of the subject under examination is located at the top in the subject's body position, the operator positions the right edge of the pelvis at the top of the image on at least one of the two-dimensional ultrasonic image and the guide image; thus the orientation of at least one of the two-dimensional ultrasonic image and the guide image can be visually matched with the actual orientation of the subject.
- in Embodiments 1 to 3 described above, position data on the two-dimensional image plane of the two-dimensional image data is detected based on the alternating magnetic field from the transmission coil 7 provided with the first coil and the second coil.
- in Embodiment 4, position data on the rotation axis direction and the rotation center position of the ultrasonic transducer 3a is detected, and the two-dimensional image plane and the guide image plane are initially set using the detected position data and default position data set in advance.
- FIG. 20 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 4 of the present invention.
- the ultrasonic diagnostic apparatus 31 includes an image processing device 42 in place of the image processing device 32, a transmission coil 44 in place of the transmission coil 7, and a position data calculation device 45 in place of the position data calculation device 6.
- the image processing device 42 is provided with a control unit 43 instead of the control unit 33.
- the control unit 43 has substantially the same configuration and function as the control unit 33 described above. Other configurations are the same as those of the third embodiment, and the same components are denoted by the same reference numerals.
- the transmission coil 44 is realized using a first coil whose coil axis is fixed in the direction of the insertion axis of the insertion section 3 into the subject, that is, in the direction of the rotation axis of the ultrasonic transducer 3a. In substantially the same manner as the transmission coil 7, it is detachably disposed near the ultrasonic transducer 3a.
- the transmission coil 44 generates an alternating magnetic field indicating the position of the ultrasonic transducer 3a and the direction of the rotation axis when the position data calculation device 45 supplies a current to the first coil.
- the position data calculation device 45 has substantially the same configuration and function as the position data calculation device 6 described above, and default position data 45a is set in it in advance.
- this default position data 45a is vector data set in advance on the orthogonal coordinate system xyz as default position data indicating the initial 12 o'clock direction on the two-dimensional image plane.
- the position data calculation device 45 receives, from the reception coil 10, a position detection signal based on the alternating magnetic field from the transmission coil 44, and calculates position data on the rotation center and the rotation axis direction of the ultrasonic transducer 3a based on the received position detection signal. After that, the position data calculation device 45 transmits the calculated position data and the default position data 45a to the control unit 43.
- the control unit 43 performs the processing steps of steps S101 to S110 described above and, based on the calculated position data and the default position data 45a, causes the display device 12 to display a two-dimensional ultrasonic image (default two-dimensional ultrasonic image) and a guide image (default guide image).
- in this case, under the control of the control unit 43, the image forming unit 16c associates the two-dimensional image data acquired from the ultrasonic observation device 5 with the position data and the default position data 45a received from the position data calculation device 45 at the same timing.
- specifically, as information for determining the position and orientation of the two-dimensional image data, the image forming unit 16c uses the center position C(t) and direction vector V(t) based on the position data, and the direction vector V(t) based on the default position data 45a. Based on these, the control unit 43 creates two-dimensional image data (default two-dimensional image data) having the center position C(t) and direction vector V(t) based on the position data and the direction vector V(t) based on the default position data 45a, and guide image data (default guide image data) having the center position C′(t) and direction vector V′(t) based on the position data and the direction vector V′(t) based on the default position data 45a.
- in the fourth embodiment, when the operator performs an input operation using the input device 11, the control unit 43 switches the operation mode to the two-dimensional image rotation mode or the guide image rotation mode based on the input mode switching instruction information. If the control unit 43 is in the two-dimensional image rotation mode or the guide image rotation mode, the rotation processing unit 33a, based on the angle information input from the input device 11 and as in the third embodiment, rotates each coordinate point on the two-dimensional image plane of the default two-dimensional image data or each coordinate point on the guide image plane of the default guide image data. Thereafter, as in the third embodiment described above, the control unit 43 causes the display device 12 to update and display the two-dimensional ultrasonic image obtained by rotating the default two-dimensional ultrasonic image by this rotation processing, or the guide image obtained by rotating the default guide image by this rotation processing.
- further, based on the input update instruction information, the control unit 43 updates the 12 o'clock direction vector of the default two-dimensional image data to the 12 o'clock direction vector of the two-dimensional image data obtained by this rotation processing.
- the control unit 43 updates the 12:00 direction vector of the default guide image data to the 12:00 direction vector of the guide image data obtained by the rotation processing based on the update instruction information.
- thereafter, under the control of the control unit 43, the image forming unit 16c associates the updated 12 o'clock direction vector, together with the center position and the normal-direction vector (normal vector) based on the above-described position data, with the sequentially acquired two-dimensional image data. Further, under the control of the control unit 43, the image forming unit 16c creates guide image data using the updated 12 o'clock direction vector, the center position based on the above-described position data, and the normal vector. Alternatively, under the control of the control unit 43, the image forming unit 16c may convert the 12 o'clock direction vector of the updated guide image data into a 12 o'clock direction vector on the orthogonal coordinate system xyz, and associate the obtained 12 o'clock direction vector, the center position and normal vector based on the above-described position data, and the sequentially acquired two-dimensional image data with one another.
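Updating the 12 o'clock direction vector after a rotation operation, as described above, amounts to rotating that direction vector about the image-plane normal and storing the result as the new default for subsequent frames. A minimal sketch under that reading; the class and function names are illustrative, not from this specification.

```python
import math


def rotate_about_normal(v, k, angle_deg):
    """Rodrigues' rotation: rotate 3-D vector v about unit normal vector k."""
    t = math.radians(angle_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    cross = (k[1] * v[2] - k[2] * v[1],
             k[2] * v[0] - k[0] * v[2],
             k[0] * v[1] - k[1] * v[0])
    dot = sum(a * b for a, b in zip(k, v))
    return tuple(v[i] * cos_t + cross[i] * sin_t + k[i] * dot * (1 - cos_t)
                 for i in range(3))


class ImagePlane:
    """Holds the default 12 o'clock direction vector of an image plane; an
    update instruction replaces it with the rotated vector so that
    subsequently acquired frames use the updated default."""

    def __init__(self, twelve_oclock, normal):
        self.twelve_oclock = twelve_oclock  # default 12 o'clock direction
        self.normal = normal                # image-plane normal vector

    def rotated(self, angle_deg):
        return rotate_about_normal(self.twelve_oclock, self.normal, angle_deg)

    def apply_update(self, angle_deg):
        # Corresponds to the update instruction information described above.
        self.twelve_oclock = self.rotated(angle_deg)
```

After `apply_update`, each newly acquired frame is oriented with the stored vector rather than the original factory default.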
- as described above, in the fourth embodiment, the default two-dimensional ultrasonic image and the default guide image are displayed and output, and each coordinate point of the default two-dimensional ultrasonic image or the default guide image is rotated to update the 12 o'clock direction given by the default position data. The insertion section to be inserted into the body of the subject can therefore be made thinner, and a two-dimensional ultrasonic image of the subject and a guide image anatomically corresponding to the two-dimensional ultrasonic image can be displayed and output on the same screen.
- an ultrasonic diagnostic apparatus suitable for ultrasonic examination that enjoys the operation and effect of the third embodiment described above and reduces the pain of the subject when the probe is inserted into the subject can thus be realized.
- in Embodiments 1 to 4 described above, each time a radial scan is performed to acquire two-dimensional image data of the subject, a two-dimensional ultrasonic image corresponding to the two-dimensional image data and a guide image anatomically corresponding to that two-dimensional ultrasonic image are displayed and output.
- Embodiment 5 is further configured to associate specific information for identifying each acquired two-dimensional ultrasonic image with its two-dimensional image data, so that a two-dimensional ultrasonic image can be searched for, displayed, and output based on this specific information.
- FIG. 21 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 5 of the present invention.
- an image processing device 52 is provided instead of the image processing device 32.
- the image processing device 52 is provided with a control unit 53 instead of the control unit 33.
- the control unit 53 has substantially the same configuration and function as the above-described control unit 33, and further includes an image search unit 53a.
- Other configurations are the same as those of the third embodiment, and the same components are denoted by the same reference numerals.
- the control unit 53 performs the processing steps of steps S101 to S110 described above to create two-dimensional image data to be sequentially acquired and guide image data anatomically corresponding to each two-dimensional image data, Each two-dimensional ultrasonic image corresponding to each two-dimensional image data and each guide image corresponding to each guide image data are sequentially displayed and output on the display device 12. At this time, the control unit 53 stores the position data corresponding to the position and orientation of each two-dimensional image data in the image storage unit 14 in association with each two-dimensional image data, and Each position data corresponding to the position and the orientation is stored in the image storage unit 14 in association with each guide image data.
- in the fifth embodiment, when the operator performs an input operation using the input device 11, the control unit 53 switches the operation mode to the two-dimensional image rotation mode, the guide image rotation mode, or the image-linked rotation mode based on the input mode switching instruction information.
- then, as in the third embodiment, the rotation processing unit 33a rotates each coordinate point of at least one of the two-dimensional image plane of the two-dimensional image data and the guide image plane of the guide image data based on the angle information input from the input device 11.
- thereafter, the control unit 53 causes the display device 12 to update and display the two-dimensional ultrasonic image subjected to the rotation processing, the guide image subjected to the rotation processing, or the two-dimensional ultrasonic image and the guide image rotated in the same rotation direction by the same angle.
- further, based on the input update instruction information, the control unit 53 updates the position data of the two-dimensional image data to the position data of the two-dimensional image data subjected to the rotation processing, or updates the position data of the guide image data to the position data of the guide image data subjected to the rotation processing.
- then, the control unit 53 stores the updated position data of the two-dimensional image data in the image storage unit 14 in association with the two-dimensional image data, or stores the updated position data of the guide image data in the image storage unit 14 in association with the guide image data.
- when specific information is input from the input device 11, the control unit 53 detects the input specific information and stores the two-dimensional image data and the specific information in the image storage unit 14 in association with each other. That is, when specific information is sequentially input for each two-dimensional ultrasonic image from the input device 11, the control unit 53 sequentially detects each piece of input specific information and stores the two-dimensional image data of each two-dimensional ultrasonic image and the sequentially input specific information in the storage unit 16a or the image storage unit 14 in association with each other.
- in this manner, for each two-dimensional ultrasonic image for which specific information has been input, the control unit 53 can associate the respective two-dimensional image data and position data, the respective guide image data anatomically corresponding to that two-dimensional image data and its position data, and the respective specific information with one another. Further, if the above-described rotation processing has been performed, the control unit 53 further associates, for each two-dimensional ultrasonic image, the angle information for each performed rotation processing or the updated position data of each two-dimensional image data or each guide image data.
- note that the specific information is information for specifying each of the displayed and output two-dimensional ultrasonic images, for example, a name or identification symbol of the two-dimensional ultrasonic image, or information related to the subject in the two-dimensional ultrasonic image.
- the control unit 53 may cause the display device 12 to display and output the input specific information in association with each two-dimensional ultrasonic image.
- thereafter, when specific information to be searched for (associated specific information) is input from the input device 11, the control unit 53 detects the input specific information and controls the image search unit 53a.
- the image search unit 53a, under the control of the control unit 53, uses the associated specific information input from the input device 11 to retrieve, from the image storage unit 14 or the storage unit 16a, the two-dimensional image data associated with that specific information and its position data, and the corresponding guide image data and its position data.
- if the above-described rotation processing has been performed, the image search unit 53a further retrieves, from the image storage unit 14 or the storage unit 16a, the angle information of each performed rotation processing, or the updated position data of each two-dimensional image data or each guide image data.
- control unit 53 uses the various types of information retrieved by the image retrieval unit 53a to display the two-dimensional ultrasonic image and the guide image associated with the associated specific information on the display device 12. Display output. At this time, the control unit 53 controls the image processing apparatus 52 so that the current two-dimensional ultrasonic image acquired by the radial scan and the retrieved two-dimensional ultrasonic image are displayed and output on the same screen.
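The storage and search behaviour described above, in which each two-dimensional image data, its position data, and its guide image data are keyed by the operator-entered specific information, can be sketched with a simple keyed store. The class and method names are illustrative assumptions, not from this specification.

```python
class ImageStore:
    """Associates each record of 2-D image data, position data, and guide
    image data with its specific information (e.g. an image name), and
    retrieves the record again when matching specific information is input."""

    def __init__(self):
        self._records = {}

    def store(self, specific_info, image_data, position_data, guide_data):
        # Corresponds to storing the associated data in the image storage unit.
        self._records[specific_info] = {
            "image": image_data,
            "position": position_data,
            "guide": guide_data,
        }

    def search(self, specific_info):
        # Returns the associated record, or None if nothing matches.
        return self._records.get(specific_info)
```

Retrieval by key makes it straightforward to display a stored two-dimensional ultrasonic image next to the current one on the same screen, as described above.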
- although an ultrasonic diagnostic apparatus in which the configuration and functions of searching for, displaying, and outputting a two-dimensional ultrasonic image and a guide image are added to the configuration and functions of the third embodiment has been exemplified, the present invention is not limited to this; an ultrasonic diagnostic apparatus in which such search, display, and output functions are added to the configuration and functions of the first embodiment described above may be used. That is, the rotation processing unit 33a need not be provided.
- alternatively, the ultrasonic diagnostic apparatus may further include the configuration and functions of searching for, displaying, and outputting a two-dimensional ultrasonic image and a guide image in addition to the configuration and functions of the second embodiment, or in addition to the configuration and functions of the fourth embodiment.
- As described above, in Embodiment 5 of the present invention, each two-dimensional image data with its position data, and each guide image data anatomically corresponding to that two-dimensional image data with its position data and its specific information, are stored in association with each other; where necessary, the angle information of each rotation process and the position data of each updated two-dimensional image data or guide image data are also stored. Based on the input specific information, the two-dimensional image data and its position data and the corresponding guide image data and its position data are then retrieved, together with the position data of the updated two-dimensional image data or guide image data as necessary.
- As a result, the target two-dimensional ultrasonic image and guide image can be retrieved, displayed, and output, and an ultrasonic diagnostic apparatus that provides the operation and effect of any one of Embodiments 1 to 4 described above can be realized.
- In this case, the surgeon can compare the two-dimensional ultrasonic image displayed after the image search with the specific information that was input for the search, and thereby easily understand which part of the subject is shown in the two-dimensional ultrasonic image and from which direction it is viewed. Furthermore, the surgeon can easily compare the target two-dimensional ultrasound image retrieved by the image search with the current two-dimensional ultrasound image, and thus efficiently check the progress of a disease or the state of recovery.
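Purely as an illustrative sketch, outside the patent disclosure, the retrieval of a stored two-dimensional ultrasonic image and its guide image by the input specific information might look like the following; the record layout and the `search_images` helper are assumptions, not part of the described apparatus:

```python
from dataclasses import dataclass

@dataclass
class StoredFrame:
    """One stored acquisition: 2-D image data, its position data, and the
    anatomically corresponding guide image data with its specific information."""
    two_d_image: str
    position: tuple
    guide_image: str
    specific_info: str      # e.g. an organ name attached to the guide image

def search_images(storage, specific_info):
    """Return every stored frame whose guide image carries the given info."""
    return [f for f in storage if f.specific_info == specific_info]

storage = [
    StoredFrame("us-001", (0.0, 0.0, 1.0), "guide-001", "pancreas"),
    StoredFrame("us-002", (0.1, 0.0, 1.0), "guide-002", "aorta"),
    StoredFrame("us-003", (0.2, 0.0, 1.0), "guide-003", "pancreas"),
]
hits = search_images(storage, "pancreas")
```

Each hit could then be displayed next to the current radial-scan image for comparison, as the embodiment describes.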
- In the embodiments described above, the position data concerning the ultrasonic transducer 3a is calculated using the position detection signal based on the alternating magnetic field from the transmission coil 7. Embodiment 6 is further configured to detect the insertion shape of the insertion section 3 inserted into the body of the subject and to display and output it.
- FIG. 22 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 6 of the present invention.
- an image processing apparatus 62 is provided instead of the image processing apparatus 52 of the ultrasonic diagnostic apparatus 51 described above, and a transmission coil 64 is further provided.
- the image processing device 62 is provided with a control unit 63 instead of the control unit 53.
- the control section 63 has substantially the same configuration and function as the control section 53 described above, and is provided with an image forming section 63a instead of the image forming section 16c.
- the transmission coil 64 is electrically connected to the position data calculation device 6 and is also electrically connected to the transmission coil 7. That is, the position data calculation device 6 and the transmission coil 7 are electrically connected via the transmission coil 64.
- Other configurations are the same as those of the fifth embodiment, and the same components are denoted by the same reference numerals.
- The transmission coil 64 is realized using a plurality of coils and is arranged on the rear end side of the insertion section 3 relative to the transmission coil 7, that is, on the side where the operation section 4 is arranged. The plurality of coils W1, W2, ..., Wm (m: an integer of 2 or more) contained in the transmission coil 64 are arranged substantially linearly from the transmission coil 7 side toward the rear end of the insertion section 3. Accordingly, the coils W1, W2, ..., Wm are arranged sequentially in the longitudinal direction of the insertion section 3, following the first coil of the transmission coil 7 described above. To express the insertion shape accurately, it is desirable that the number of coils be large (for example, about 10 or more).
- When current is supplied from the position data calculation device 6, the transmission coil 64 outputs an alternating magnetic field at a different frequency from each of its built-in coils.
- the receiving coil 10 receives the alternating magnetic field from the transmitting coil 64 in the same manner as the alternating magnetic field of the transmitting coil 7 and the like, and transmits a position detection signal based on the alternating magnetic field to the position data calculating device 6.
- When the position data calculation device 6 receives the position detection signal from the receiving coil 10, it calculates, based on the received signal, the position of each of the coils W1, W2, ..., Wm, and collects the calculated positions as insertion section position data. The position data calculation device 6 then transmits to the control unit 63 the transmission coil position data obtained from the transmission coil 7, the marker coil position data obtained from the marker coil 8, the plate position data obtained from the plate 9, and the insertion section position data.
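The patent does not describe how the position data calculation device separates the simultaneously received coil fields; purely as an illustrative sketch, one frequency-selective approach (the Goertzel algorithm, with hypothetical frequencies and amplitudes) is shown below. Every numeric value here is an assumption for demonstration:

```python
import math

def goertzel_magnitude(samples, sample_rate, target_freq):
    """Magnitude of one DFT bin via the Goertzel algorithm (pure stdlib)."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)      # nearest DFT bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
    return math.sqrt(max(power, 0.0))

# Hypothetical setup: two coils radiating at distinct frequencies (exact DFT bins).
fs, n = 8000, 800
coil_freqs = {"W1": 500.0, "W2": 1200.0}          # Hz (assumed)
coil_amps = {"W1": 0.8, "W2": 0.3}                # "true" received amplitudes

# The receiving coil sees the sum of both alternating fields.
signal = [sum(coil_amps[c] * math.sin(2.0 * math.pi * coil_freqs[c] * t / fs)
              for c in coil_freqs) for t in range(n)]

# Demultiplex: per-coil amplitude = 2 * |X[k]| / N at that coil's frequency.
recovered = {c: 2.0 * goertzel_magnitude(signal, fs, f) / n
             for c, f in coil_freqs.items()}
```

Because each coil transmits at its own frequency, the contribution of every coil can be recovered independently from a single received signal, which is consistent with the per-coil position calculation described above.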
- The control unit 63 performs the processing of steps S101 to S110 described above to create the sequentially acquired two-dimensional image data and the guide image data anatomically corresponding to each two-dimensional image data, and sequentially displays and outputs on the display device 12 each two-dimensional ultrasonic image corresponding to each two-dimensional image data and each guide image corresponding to each guide image data. Further, when the insertion section position data is input from the position data calculation device 6, the control unit 63 detects the insertion section position data, obtains from the timer 16b the time t at which the insertion section position data was detected, and controls the image construction unit 63a accordingly.
- The image construction unit 63a has substantially the same configuration and function as the image construction unit 16c described above, and additionally creates insertion shape image data corresponding to an insertion shape image indicating the insertion shape of the insertion section 3 in the subject. Under the control of the control unit 63, the image construction unit 63a acquires the insertion section position data input from the position data calculation device 6, that is, the directional components of the position vectors OU1(t), OU2(t), ..., OUm(t), as coordinate data of the insertion shape image data at time t. In addition, the image construction unit 63a extracts from the transmission coil position data at time t only the directional component of the position vector OC(t) of the center position C(t), and obtains it as further insertion section position data at time t. Starting with the coordinate point based on the directional component of the position vector OC(t), the image construction unit 63a sequentially connects the coordinate points based on the directional components of the position vectors OU1(t), OU2(t), ..., OUm(t), and thereby creates the insertion shape image data at time t.
- The insertion shape image data at time t is represented by marks corresponding to the coordinate points based on the position vectors acquired at time t, and by a line connecting these marks sequentially; that is, it corresponds to an insertion shape image expressing an insertion shape line indicating the insertion shape of the insertion section 3 at time t.
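The construction just described, marks at each coil's coordinate point joined sequentially into a line, can be sketched as follows. This is an illustration only; the coordinates are hypothetical and the patent does not specify the data representation:

```python
import math

def insertion_shape_polyline(coil_positions):
    """Connect coil coordinate points in order into an insertion-shape line.

    coil_positions: [OC, OU1, OU2, ...] as (x, y, z) tuples, probe tip first.
    Returns the list of line segments and the total line length.
    """
    segments = list(zip(coil_positions, coil_positions[1:]))
    length = sum(math.dist(a, b) for a, b in segments)
    return segments, length

# Hypothetical positions (in mm) for the tip coil OC and three following coils.
points = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (20.0, 5.0, 0.0), (30.0, 5.0, 0.0)]
segments, length = insertion_shape_polyline(points)
```

Each tuple pair in `segments` would be drawn as one stretch of the insertion shape line, with a mark at every vertex.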
- Thereafter, under the control of the control unit 63, the mixing unit 16d uses the insertion shape image data at time t, the two-dimensional image data acquired at the same timing (that is, at time t), and the guide image data at time t anatomically corresponding to that two-dimensional image data to create mixed image data for displaying and outputting the corresponding insertion shape image, two-dimensional ultrasonic image, and guide image on the same screen of the display device 12. The mixed image data is then output to the display circuit 15 under the control of the control unit 63.
- the display circuit 15 converts and outputs an image signal corresponding to the mixed image data under the control of the control unit 63 as described above.
- the display device 12 arranges, on the same screen, a two-dimensional ultrasound image, a guide image, and an insertion shape image corresponding to the mixed image data at the same timing based on the image signal received from the display circuit 15. Display output.
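As a rough illustration of what the mixing unit's same-screen arrangement amounts to (the actual pixel format and layout are not disclosed; the row-major list representation is an assumption), three same-height image buffers can be composed side by side onto one canvas:

```python
def mix_side_by_side(images, gap=1, fill=0):
    """Place same-height 2-D images side by side on one canvas (row-major lists)."""
    height = len(images[0])
    assert all(len(img) == height for img in images)
    canvas = []
    for r in range(height):
        row = []
        for i, img in enumerate(images):
            if i:
                row.extend([fill] * gap)    # thin separator between panes
            row.extend(img[r])
        canvas.append(row)
    return canvas

# Tiny stand-ins for the ultrasound, guide, and insertion-shape image buffers.
ultrasound = [[1, 1], [1, 1]]
guide = [[2, 2], [2, 2]]
shape = [[3, 3], [3, 3]]
frame = mix_side_by_side([ultrasound, guide, shape])
```

The single `frame` plays the role of the mixed image data handed to the display circuit.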
- FIG. 23 is a schematic diagram schematically illustrating a screen display example in which a two-dimensional ultrasonic image, a guide image, and an insertion shape image at the same timing are displayed side by side on the same screen.
- The control unit 63 outputs to the display circuit 15 mixed image data based on the same timing, for example, the two-dimensional image data at time ts, the guide image data at time ts, and the insertion shape image data at time ts.
- a two-dimensional ultrasonic image UG at time ts, a guide image GG at time ts, and an insertion shape image IG at time ts can be displayed side by side on the same screen of the display device 12. Note that the two-dimensional ultrasound image UG and the guide image GG anatomically correspond to each other as described above.
- The insertion shape image IG shows the insertion shape of the insertion section 3 of the probe 2 at the moment the two-dimensional image data corresponding to the two-dimensional ultrasonic image UG was detected, that is, the shape of the insertion section 3 inserted into the subject at time ts. As shown in FIG. 23, it is formed from the marks d1, d2, ..., dm corresponding to the coordinate points based on the position vectors OC(ts), OU1(ts), OU2(ts), ..., OUm(ts) (marks d1 to d8 are shown), and an insertion shape line connecting these marks d1, d2, ..., dm sequentially.
- The mark d1 indicates the position of the probe tip at the time t at which the insertion section position data was acquired, for example time ts; it corresponds to the coordinate point based on the directional component of the position vector OC(ts). The marks d2, d3, ..., dm successively following the mark d1 correspond to the coordinate points based on the directional components of the position vectors OU1(ts), OU2(ts), ..., OUm(ts), respectively.
- When the marks d1 to dm are displayed in the insertion shape image, the mark d1 may be displayed in a different mode from the marks d2 to dm. This makes the position of the probe tip in the insertion shape image easier to understand.
- Every time the control unit 63 acquires two-dimensional image data, it sequentially creates the guide image data anatomically corresponding to the acquired two-dimensional image data, and further sequentially creates the insertion shape image data based on the insertion section position data acquired at the same timing as each two-dimensional image data.
- The control unit 63 sequentially updates the two-dimensional ultrasonic image of the display device 12 based on the sequentially acquired two-dimensional image data, sequentially updates the guide image of the display device 12 based on the sequentially created guide image data, and at the same time updates the insertion shape image of the display device 12 based on the sequentially created insertion shape image data. That is, when the operator searches for the region of interest of the subject while repeatedly causing the ultrasound diagnostic apparatus 61 to perform the radial scan described above, the control unit 63 sequentially updates and displays the two-dimensional ultrasound image, the guide image, and the insertion shape image on the display device 12 in real time.
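The per-frame update cycle above can be sketched as a simple loop; this is an assumed abstraction (the `Display` stand-in and the callback names are invented for illustration), not the disclosed control logic:

```python
from dataclasses import dataclass, field

@dataclass
class Display:
    """Stand-in for the display device: records each composed frame."""
    frames: list = field(default_factory=list)

    def show(self, ultrasound, guide, shape):
        self.frames.append((ultrasound, guide, shape))

def run_realtime_loop(scan_frames, make_guide, make_shape, display):
    """For each radial-scan frame, rebuild the guide and shape images and refresh."""
    for two_d, coil_positions in scan_frames:
        display.show(two_d, make_guide(two_d), make_shape(coil_positions))

display = Display()
run_realtime_loop(
    scan_frames=[("us0", [(0, 0)]), ("us1", [(1, 0)]), ("us2", [(2, 0)])],
    make_guide=lambda img: f"guide-for-{img}",
    make_shape=lambda pts: list(pts),
    display=display,
)
```

Each iteration corresponds to one radial scan: all three images on screen are refreshed together, which is the real-time behavior the embodiment describes.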
- In Embodiment 6, an ultrasonic diagnostic apparatus has been described in which the configuration and function of displaying and outputting an insertion shape image are added to the configuration and function of Embodiment 5 described above. However, the present invention is not limited to this; the insertion shape display may instead be added to the configurations and functions of Embodiments 3 and 4 described above, in which case the image search unit 53a need not be provided. Likewise, the insertion shape display may be added to the configurations and functions of Embodiments 1 and 2, in which case neither the rotation processing unit 33a nor the image search unit 53a need be provided.
- In Embodiment 6 of the present invention, every time two-dimensional image data is acquired, the two-dimensional ultrasonic image is sequentially updated based on the sequentially acquired two-dimensional image data, the guide image is sequentially updated based on the sequentially created guide image data, and the insertion shape image is updated based on the sequentially created insertion shape image data; however, the present invention is not limited to this update procedure.
- As described above, in Embodiment 6 of the present invention, insertion section position data corresponding to the insertion shape of the probe insertion section inserted into the subject is sequentially acquired; insertion shape image data indicating this insertion shape is sequentially created based on the insertion section position data acquired at the same timing as each two-dimensional image data; the two-dimensional ultrasound image is sequentially updated based on the sequentially acquired two-dimensional image data; the guide image is sequentially updated based on the sequentially created guide image data; and the insertion shape image is updated based on the insertion shape image data.
- Therefore, a two-dimensional ultrasonic image of the inside of the subject, a guide image anatomically corresponding to the two-dimensional ultrasonic image, and an insertion shape image of the probe insertion section that detected the two-dimensional image data can be displayed and output on the same screen in real time, so the operator can easily understand the current insertion shape of the probe insertion section inserted into the subject. Thus, an ultrasonic diagnostic apparatus can be realized that provides the operation and effect of any one of Embodiments 1 to 4 described above and improves the operation efficiency up to displaying and outputting a two-dimensional ultrasonic image of the target region of interest.
- In addition, since the operator can easily understand the current insertion shape of the probe insertion section inserted into the body of the subject, the scanning plane of the radial scan can be positioned on the target region of interest accurately and easily, and a two-dimensional ultrasonic image of the target region of interest can be observed efficiently.
- In the embodiments described above, radial scanning of the body of the subject is performed by rotationally driving the ultrasonic transducer 3a provided near the tip of the insertion section 3 and repeatedly transmitting and receiving ultrasonic waves radially. In Embodiment 7, an electronic radial scan of the body of the subject is performed using an ultrasonic transducer group in which a plurality of ultrasonic transducers are arranged in a ring.
- FIG. 24 is a block diagram illustrating a configuration example of an ultrasonic diagnostic apparatus according to Embodiment 7 of the present invention.
- a probe 72 is provided instead of the probe 2 of the ultrasonic diagnostic apparatus 51 described above.
- the probe 72 is provided with an insertion section 73 instead of the insertion section 3, an operation section 74 instead of the operation section 4, and a transmission coil 75 instead of the transmission coil 7.
- an ultrasonic transducer group 73a is provided instead of the ultrasonic transducer 3a.
- Other configurations are the same as those of the fifth embodiment, and the same components are denoted by the same reference numerals.
- FIG. 25 is a schematic diagram schematically illustrating one configuration of the distal end of the insertion portion 73 of the probe 72.
- the probe 72 is realized by using an electronic radial scan type ultrasonic endoscope, and has the insertion section 73 and the operation section 74 to be inserted into the subject as described above.
- An ultrasonic transducer group 73a and a transmission coil 75 are provided at the tip of the insertion section 73.
- The ultrasonic transducer group 73a is composed of a plurality of ultrasonic transducers, each finely cut into a strip shape, arranged in a ring around the insertion axis direction into the subject, for example over the entire 360° circumference. Each of the ultrasonic transducers constituting the ultrasonic transducer group 73a is electrically connected to the ultrasonic observation device 5 via the signal line 73b and the operation unit 74.
- The operation unit 74 has the function of bending the distal end of the insertion section 73, including the portion where the ultrasonic transducer group 73a and the transmission coil 75 are provided, in accordance with the operation of the operator, in substantially the same manner as the operation unit 4 described above. Further, the operation unit 74 electrically connects the ultrasonic transducer group 73a and the ultrasonic observation device 5 when the operator turns on the power switch of the operation unit 74.
- The transmission coil 75 has a structure in which two coils having coil axes in two orthogonal directions are integrated, and both coils are electrically connected to the position data calculation device 6. Further, as shown in FIG. 25, the transmission coil 75 is provided on the distal end side of the ultrasonic transducer group 73a in the insertion section 73. It is desirable that the transmission coil 75 be provided so that its outer shape does not protrude beyond the cross section of the ultrasonic transducer group 73a; as a result, the diameter of the insertion section 73 can be made smaller.
- Of the transmission coil 75, one coil axis direction corresponds to the insertion axis direction of the insertion section 73 into the subject, and the other coil axis direction corresponds to the scanning plane of the electronic radial scan by the ultrasonic transducer group 73a, that is, the 12 o'clock direction of the radial scan plane RH shown in FIG. 25. In other words, one coil axis direction of the transmission coil 75 corresponds to the normal direction of the two-dimensional image data, that is, the direction vector V(t) described above, and the other coil axis direction corresponds to the 12 o'clock direction of the two-dimensional image data, that is, the above-described direction vector indicating the 12 o'clock direction.
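Two orthogonal coil axes are enough to fix the full orientation of the scan plane, because the third axis follows from a cross product. The sketch below is an illustration under assumed unit vectors, not a disclosed computation:

```python
def cross(a, b):
    """3-D cross product of two vectors given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Hypothetical unit vectors measured from the two orthogonal coil axes:
normal = (0.0, 0.0, 1.0)    # insertion-axis coil -> scan-plane normal V(t)
twelve = (0.0, 1.0, 0.0)    # second coil -> 12 o'clock direction of the plane

# The 3 o'clock direction completes a right-handed frame for the scan plane.
three_oclock = cross(twelve, normal)
```

With `normal` and `twelve` known, every pixel direction of the radial image can be expressed in the fixed coordinate system, which is what lets the guide image be oriented to match.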
- the control unit 53 sends a control signal to the ultrasonic observation device 5 based on the input start instruction information.
- Based on the control signal received from the control unit 53, the ultrasonic observation device 5 transmits an excitation signal to each ultrasonic transducer of the ultrasonic transducer group 73a via the operation unit 74 and the signal line 73b; that is, a pulse voltage of about 100 [V] is applied to each ultrasonic transducer of the ultrasonic transducer group 73a.
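One common way an electronic radial scan sequences its firings is to excite successive overlapping groups of ring elements, stepping the beam direction around the full circle. The aperture size, angles, and grouping below are illustrative assumptions; the patent only states that the transducers are arranged in a ring and excited electronically:

```python
def electronic_radial_scan(num_elements, aperture=4):
    """One electronic radial scan: fire overlapping groups of ring elements.

    Each firing uses `aperture` adjacent elements; the beam direction is the
    angular center of the active group, sweeping the full 360 degrees.
    """
    beams = []
    for start in range(num_elements):
        group = [(start + k) % num_elements for k in range(aperture)]
        center = (start + (aperture - 1) / 2) % num_elements
        angle = 360.0 * center / num_elements
        beams.append((group, angle))
    return beams

# Hypothetical ring of 64 strip transducers around the insertion axis.
beams = electronic_radial_scan(num_elements=64, aperture=4)
```

Each `(group, angle)` entry stands for one transmit/receive event; collecting the echoes over all 64 firings yields one radial frame with no mechanical rotation.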
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Image Processing (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05737168.4A EP1741390B8 (en) | 2004-04-30 | 2005-04-26 | Ultrasonic diagnosis device |
US11/589,601 US7736316B2 (en) | 2004-04-30 | 2006-10-30 | Ultrasonic diagnosis apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004135896A JP4537756B2 (ja) | 2004-04-30 | 2004-04-30 | 超音波診断装置 |
JP2004-135896 | 2004-04-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/589,601 Continuation US7736316B2 (en) | 2004-04-30 | 2006-10-30 | Ultrasonic diagnosis apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005104955A1 true WO2005104955A1 (ja) | 2005-11-10 |
Family
ID=35241396
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/007926 WO2005104955A1 (ja) | 2004-04-30 | 2005-04-26 | 超音波診断装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7736316B2 (ja) |
EP (1) | EP1741390B8 (ja) |
JP (1) | JP4537756B2 (ja) |
WO (1) | WO2005104955A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7774045B2 (en) | 2006-06-27 | 2010-08-10 | Olympus Medical Systems Corp. | Medical guiding system, medical guiding program, and medical guiding method |
Families Citing this family (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8784336B2 (en) | 2005-08-24 | 2014-07-22 | C. R. Bard, Inc. | Stylet apparatuses and methods of manufacture |
CN101662980B (zh) * | 2007-01-19 | 2013-02-27 | 桑尼布鲁克健康科学中心 | 用于成像探头的扫描机构 |
EP2036494A3 (en) * | 2007-05-07 | 2009-04-15 | Olympus Medical Systems Corp. | Medical guiding system |
JP5226244B2 (ja) * | 2007-05-07 | 2013-07-03 | オリンパスメディカルシステムズ株式会社 | 医用ガイドシステム |
JP2008301969A (ja) * | 2007-06-06 | 2008-12-18 | Olympus Medical Systems Corp | 超音波診断装置 |
JP5191167B2 (ja) * | 2007-06-06 | 2013-04-24 | オリンパスメディカルシステムズ株式会社 | 医用ガイドシステム |
JP4869189B2 (ja) * | 2007-09-13 | 2012-02-08 | オリンパスメディカルシステムズ株式会社 | 医用ガイドシステム |
JP2009082402A (ja) | 2007-09-28 | 2009-04-23 | Fujifilm Corp | 医用画像診断システム、医用撮像装置、医用画像格納装置、及び、医用画像表示装置 |
KR100971418B1 (ko) | 2007-10-29 | 2010-07-21 | 주식회사 메디슨 | 조이스틱을 가지는 초음파 진단장치 |
ES2557084T3 (es) | 2007-11-26 | 2016-01-21 | C. R. Bard, Inc. | Sistema integrado para la colocación intravascular de un catéter |
US9649048B2 (en) | 2007-11-26 | 2017-05-16 | C. R. Bard, Inc. | Systems and methods for breaching a sterile field for intravascular placement of a catheter |
US10524691B2 (en) | 2007-11-26 | 2020-01-07 | C. R. Bard, Inc. | Needle assembly including an aligned magnetic element |
US10449330B2 (en) | 2007-11-26 | 2019-10-22 | C. R. Bard, Inc. | Magnetic element-equipped needle assemblies |
US10751509B2 (en) | 2007-11-26 | 2020-08-25 | C. R. Bard, Inc. | Iconic representations for guidance of an indwelling medical device |
US8781555B2 (en) | 2007-11-26 | 2014-07-15 | C. R. Bard, Inc. | System for placement of a catheter including a signal-generating stylet |
US9521961B2 (en) * | 2007-11-26 | 2016-12-20 | C. R. Bard, Inc. | Systems and methods for guiding a medical instrument |
JP5301197B2 (ja) * | 2008-04-11 | 2013-09-25 | 富士フイルム株式会社 | 断面画像表示装置および方法ならびにプログラム |
KR20100008217A (ko) * | 2008-07-15 | 2010-01-25 | 주식회사 메디슨 | 초음파 영상과 외부 영상을 디스플레이하고 조작하는초음파 시스템 및 그 조작 방법 |
WO2010022370A1 (en) | 2008-08-22 | 2010-02-25 | C.R. Bard, Inc. | Catheter assembly including ecg sensor and magnetic assemblies |
US8437833B2 (en) | 2008-10-07 | 2013-05-07 | Bard Access Systems, Inc. | Percutaneous magnetic gastrostomy |
US9532724B2 (en) | 2009-06-12 | 2017-01-03 | Bard Access Systems, Inc. | Apparatus and method for catheter navigation using endovascular energy mapping |
DE102009030171A1 (de) * | 2009-06-24 | 2010-12-30 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Bildgebungsvorrichtung und Bildgebungsverfahren zur Auffindung von Unregelmäßigkeiten in Hohlräumen von Objekten |
EP2464407A4 (en) | 2009-08-10 | 2014-04-02 | Bard Access Systems Inc | DEVICES AND METHODS FOR ENDOVASCULAR ELECTROGRAPHY |
CN102573992B (zh) * | 2009-09-24 | 2016-04-27 | 皇家飞利浦电子股份有限公司 | 高强度聚焦超声定位机构 |
WO2011097312A1 (en) | 2010-02-02 | 2011-08-11 | C.R. Bard, Inc. | Apparatus and method for catheter navigation and tip location |
JP5314614B2 (ja) | 2010-02-05 | 2013-10-16 | 富士フイルム株式会社 | 医用画像表示装置及び医用画像表示方法並びにプログラム |
EP2913000B1 (en) | 2010-05-28 | 2020-02-12 | C.R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
EP2912999B1 (en) | 2010-05-28 | 2022-06-29 | C. R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
WO2012024577A2 (en) | 2010-08-20 | 2012-02-23 | C.R. Bard, Inc. | Reconfirmation of ecg-assisted catheter tip placement |
EP2700351A4 (en) | 2012-03-06 | 2015-07-29 | Olympus Medical Systems Corp | ENDOSCOPIC SYSTEM |
KR20130113775A (ko) * | 2012-04-06 | 2013-10-16 | 삼성메디슨 주식회사 | 초음파 프로브 및 초음파 진단 시스템 |
US9786040B2 (en) * | 2012-09-26 | 2017-10-10 | Hitachi, Ltd. | Ultrasound diagnostic apparatus and ultrasound two-dimensional cross-section image generation method |
JP6125615B2 (ja) * | 2013-04-05 | 2017-05-10 | テルモ株式会社 | 画像診断装置及びプログラム |
ES2811323T3 (es) | 2014-02-06 | 2021-03-11 | Bard Inc C R | Sistemas para el guiado y la colocación de un dispositivo intravascular |
JP6640738B2 (ja) * | 2014-05-02 | 2020-02-05 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 医用イメージングシステム及び医用イメージングシステムにおいてデータをリンクする方法 |
US10319090B2 (en) * | 2014-05-14 | 2019-06-11 | Koninklijke Philips N.V. | Acquisition-orientation-dependent features for model-based segmentation of ultrasound images |
WO2015190186A1 (ja) | 2014-06-10 | 2015-12-17 | オリンパス株式会社 | 内視鏡システム、内視鏡システムの作動方法 |
US20150369909A1 (en) * | 2014-06-19 | 2015-12-24 | Imperium, Inc. | Image sensor for large area ultrasound mapping |
WO2016009701A1 (ja) | 2014-07-15 | 2016-01-21 | オリンパス株式会社 | ナビゲーションシステム、ナビゲーションシステムの作動方法 |
KR20160064399A (ko) | 2014-11-28 | 2016-06-08 | 삼성전자주식회사 | 자기공명영상장치 |
KR102367446B1 (ko) | 2014-12-11 | 2022-02-25 | 삼성메디슨 주식회사 | 초음파 진단 장치 및 그 동작 방법 |
US10973584B2 (en) | 2015-01-19 | 2021-04-13 | Bard Access Systems, Inc. | Device and method for vascular access |
US10824315B2 (en) * | 2015-05-29 | 2020-11-03 | Canon Medical Systems Corporation | Medical image processing apparatus, magnetic resonance imaging apparatus and medical image processing method |
WO2016210325A1 (en) | 2015-06-26 | 2016-12-29 | C.R. Bard, Inc. | Connector interface for ecg-based catheter positioning system |
US11000207B2 (en) | 2016-01-29 | 2021-05-11 | C. R. Bard, Inc. | Multiple coil system for tracking a medical device |
JP6738631B2 (ja) * | 2016-03-29 | 2020-08-12 | ザイオソフト株式会社 | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム |
JP6810587B2 (ja) * | 2016-11-30 | 2021-01-06 | オリンパス株式会社 | 内視鏡装置、内視鏡装置の作動方法 |
CN106898286B (zh) * | 2017-03-15 | 2020-07-03 | 武汉精测电子集团股份有限公司 | 基于指定位置的Mura缺陷修复方法及装置 |
JP2021164490A (ja) | 2018-04-10 | 2021-10-14 | オリンパス株式会社 | 医療システム |
JP7171291B2 (ja) * | 2018-07-26 | 2022-11-15 | キヤノンメディカルシステムズ株式会社 | 超音波診断装置及び画像処理プログラム |
CN112867443B (zh) | 2018-10-16 | 2024-04-26 | 巴德阿克塞斯系统股份有限公司 | 用于建立电连接的安全装备连接系统及其方法 |
CN114667093A (zh) * | 2019-11-05 | 2022-06-24 | 奥林巴斯株式会社 | 内窥镜装置、显示用图像输出方法以及程序 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000023979A (ja) * | 1998-07-13 | 2000-01-25 | Olympus Optical Co Ltd | 超音波診断装置 |
JP2000081303A (ja) * | 1998-09-04 | 2000-03-21 | Olympus Optical Co Ltd | 位置検出装置 |
JP2001340335A (ja) * | 2000-06-01 | 2001-12-11 | Fukuda Denshi Co Ltd | 超音波診断装置 |
JP2002263101A (ja) | 2001-03-06 | 2002-09-17 | Aloka Co Ltd | 超音波診断装置 |
EP1354557A1 (en) | 2002-04-17 | 2003-10-22 | Olympus Optical Co., Ltd. | Ultrasonic diagnostic apparatus and method |
US20030231789A1 (en) | 2002-06-18 | 2003-12-18 | Scimed Life Systems, Inc. | Computer generated representation of the imaging pattern of an imaging device |
EP1504721A1 (en) | 2002-09-27 | 2005-02-09 | Olympus Corporation | Ultrasonograph |
EP1543776A1 (en) | 2002-09-27 | 2005-06-22 | Olympus Corporation | Ultrasonograph |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5474072A (en) * | 1993-10-29 | 1995-12-12 | Neovision Corporation | Methods and apparatus for performing sonomammography |
JP3871747B2 (ja) * | 1996-11-25 | 2007-01-24 | 株式会社日立メディコ | 超音波診断装置 |
US6248074B1 (en) * | 1997-09-30 | 2001-06-19 | Olympus Optical Co., Ltd. | Ultrasonic diagnosis system in which periphery of magnetic sensor included in distal part of ultrasonic endoscope is made of non-conductive material |
JP2000051217A (ja) * | 1998-08-06 | 2000-02-22 | Olympus Optical Co Ltd | 超音波診断装置 |
JP4614548B2 (ja) * | 2001-01-31 | 2011-01-19 | パナソニック株式会社 | 超音波診断装置 |
JP2002305044A (ja) * | 2001-04-03 | 2002-10-18 | Mitsuboshi Denki Seisakusho:Kk | 端子台 |
JP2002363101A (ja) | 2001-04-03 | 2002-12-18 | Taisho Pharmaceut Co Ltd | 外用剤 |
JP2003116869A (ja) * | 2001-10-18 | 2003-04-22 | Honda Seiki Kk | 超音波治療装置および超音波診断装置 |
- 2004-04-30 JP JP2004135896A patent/JP4537756B2/ja not_active Expired - Fee Related
- 2005-04-26 EP EP05737168.4A patent/EP1741390B8/en not_active Ceased
- 2005-04-26 WO PCT/JP2005/007926 patent/WO2005104955A1/ja active Application Filing
- 2006-10-30 US US11/589,601 patent/US7736316B2/en active Active
Non-Patent Citations (4)
Title |
---|
ARAI O. ET.AL.: "Real-time virtual sonography.", J.MED.ULTRASONICS., vol. 30, 15 April 2003 (2003-04-15), pages 151, XP002990183 * |
JEANNETTE L. HERRING ET AL.: "Surface-Based Registration of CT Images to Physical Space for Image-Guided Surgery of the Spine: A Sensitivity Study", IEEE TRANSACTIONS ON MEDICAL IMAGING, vol. 17, no. 5, October 1998 (1998-10-01), pages 743 - 752, XP011035776 |
OHKUMA K. ET AL.: "Real-time virtual sonography.", J. MED ULTRASONICS., vol. 30, 15 April 2003 (2003-04-15), pages 277, XP002990182 * |
ROCH M. COMEAU ET AL.: "Intraoperative US in Interactive Image-guided Neurosurgery", RADIOGRAPHICS: A REVIEW PUBLICATION OF THE RADIOLOGICAL SOCIETY OF. NORTH AMERICA, vol. 18, no. 4, July 1998 (1998-07-01), pages 1019 - 1027 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7774045B2 (en) | 2006-06-27 | 2010-08-10 | Olympus Medical Systems Corp. | Medical guiding system, medical guiding program, and medical guiding method |
Also Published As
Publication number | Publication date |
---|---|
US7736316B2 (en) | 2010-06-15 |
JP4537756B2 (ja) | 2010-09-08 |
EP1741390A4 (en) | 2009-08-05 |
EP1741390B1 (en) | 2018-02-21 |
EP1741390B8 (en) | 2018-04-04 |
US20070078343A1 (en) | 2007-04-05 |
EP1741390A1 (en) | 2007-01-10 |
JP2005312770A (ja) | 2005-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005104955A1 (ja) | | Ultrasonic diagnostic apparatus |
JP4868959B2 (ja) | | Intra-body-cavity probe device |
JP5208495B2 (ja) | | Medical system |
EP2036494A2 (en) | | Medical guiding system |
WO2006057296A1 (ja) | | Ultrasonic diagnostic apparatus |
JP4875416B2 (ja) | | Medical guide system |
EP2000087A2 (en) | | Medical guiding system |
JP2005312770A5 (ja) | | |
JPH10192A (ja) | | Ultrasonic image diagnostic apparatus |
JP5601684B2 (ja) | | Medical imaging apparatus |
JP2007125179A (ja) | | Ultrasonic diagnostic apparatus |
KR20170047873A (ko) | | Ultrasonic imaging apparatus and control method therefor |
JP4869197B2 (ja) | | Medical guide device |
JP5226244B2 (ja) | | Medical guide system |
JP4869189B2 (ja) | | Medical guide system |
JP2006087599A (ja) | | Ultrasonic diagnostic apparatus |
JP4530799B2 (ja) | | Ultrasonic diagnostic apparatus |
JP4700434B2 (ja) | | Ultrasonic diagnostic apparatus |
JP4647899B2 (ja) | | Ultrasonic diagnostic apparatus |
JP4668592B2 (ja) | | Intra-body-cavity probe device |
JP5307357B2 (ja) | | Ultrasonic diagnostic apparatus |
JP2008301969A (ja) | | Ultrasonic diagnostic apparatus |
JP4843728B2 (ja) | | Ultrasonic diagnostic apparatus |
JP4707853B6 (ja) | | Ultrasonic diagnostic apparatus |
JP4707853B2 (ja) | | Ultrasonic diagnostic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11589601 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 2005737168 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 2005737168 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 11589601 Country of ref document: US |