WO2014076931A1 - Image processing apparatus, image processing method, and program - Google Patents
- Publication number
- WO2014076931A1 (PCT/JP2013/006625)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display state
- display
- assist
- position information
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0891—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4483—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5207—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of raw data to produce diagnostic data, e.g. for generating an image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5246—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
Definitions
- The present invention relates to an image processing apparatus and an image processing method for generating an image for guiding an instrument to a target location in a subject.
- As biological image diagnostic apparatuses, X-ray diagnostic apparatuses, MR (magnetic resonance) diagnostic apparatuses, ultrasonic diagnostic apparatuses, and the like are widely used. Among them, the ultrasonic diagnostic apparatus has advantages such as non-invasiveness and real-time operation, and is widely used for diagnosis and screening. Sites diagnosed with an ultrasonic diagnostic apparatus are wide-ranging, including the heart, blood vessels, liver, and breast. In recent years, blood vessel diagnosis of the carotid artery for determining the risk of arteriosclerosis has attracted attention. However, since this blood vessel diagnosis requires an advanced technique, an ultrasonic apparatus that displays an image for guiding the examiner has been proposed, as in Patent Document 1.
- An intraoperative navigation system that displays the positional relationship between the patient's position during surgery and a surgical instrument has also been proposed.
- This intraoperative navigation system is intended to improve the recognizability of, for example, the position of a tumor or a blood vessel, and is used to display the position of a surgical instrument relative to a surgical target site such as a bone or an organ, thereby improving safety during surgery.
- The present invention provides an image processing apparatus and the like that can display an image for guiding an instrument to a target location in a subject in a manner that is easy for the user to understand.
- An image processing apparatus according to an aspect of the present invention is an image processing apparatus that generates an assist image, which is an image for guiding movement of an instrument to a target location in a subject, and includes: a 3D image analysis unit that determines, based on a three-dimensional image including the target location, target position information indicating a three-dimensional position of the target location; a position information acquisition unit that acquires instrument position information indicating a three-dimensional position of the instrument; a display state determination unit that selects one display state from two or more display states based on the positional relationship between the target location and the instrument; an assist image generation unit that generates the assist image in the selected display state using the target position information and the instrument position information; and a display control unit that performs control for outputting the assist image to a display device.
- According to the present invention, an image for guiding the instrument so that it can be moved to a target location in the subject can be displayed in a manner that is easy to understand.
- FIG. 1A is a schematic diagram showing a probe and a scan surface.
- FIG. 1B is a diagram illustrating two directions in which the carotid artery is scanned with a probe.
- FIG. 1C is a diagram illustrating an example of how an ultrasonic image acquired by a long-axis scan looks.
- FIG. 1D is a diagram illustrating an example of how an ultrasonic image acquired by a short-axis scan looks.
- FIG. 2A is a cross-sectional view showing the structure of an arterial blood vessel having a short-axis cross section.
- FIG. 2B is a cross-sectional view showing the structure of an arterial blood vessel having a long-axis cross section.
- FIG. 3 is a block diagram illustrating a configuration of an assumed ultrasonic diagnostic apparatus.
- FIG. 4 is a flowchart showing the operation of the assumed ultrasonic diagnostic apparatus.
- FIG. 5 is a diagram illustrating a screen configuration example including an assist image and a live image.
- FIG. 6 is a block diagram illustrating a configuration of the ultrasonic diagnostic apparatus according to the first embodiment.
- FIG. 7 is a flowchart showing the operation of the ultrasonic diagnostic apparatus according to the first embodiment.
- FIG. 8A is a diagram illustrating an example of a flow of generating a 3D image.
- FIG. 8B is a diagram illustrating an example of a flow of generating a 3D image.
- FIG. 8C is a diagram illustrating an example of a 3D image generation flow.
- FIG. 8D is a diagram illustrating an example of a 3D image generation flow.
- FIG. 9A is a diagram illustrating the position and orientation of the measurement target in the 3D image.
- FIG. 9B is a diagram illustrating the position of the measurement target in the long-axis cross section.
- FIG. 9C is a diagram illustrating a position in the short-axis cross section of the measurement target.
- FIG. 10 is a flowchart illustrating an example of a screen display switching operation.
- FIG. 11A is a diagram illustrating an example of a carotid artery that is a measurement target in 3D space.
- FIG. 11B is a diagram illustrating an example of the second display state.
- FIG. 11C is a diagram illustrating an example of the first display state.
- FIG. 12 is a flowchart illustrating an example of an operation in which hysteresis is applied to screen display switching.
- FIG. 13A is a diagram illustrating an example of a carotid artery that is a measurement target in 3D space.
- FIG. 13B is a diagram illustrating an example of the carotid artery in the long axis direction in the 3D space.
- FIG. 13C is a diagram illustrating an example of the carotid artery in the short axis direction in the 3D space.
- FIG. 13D is a diagram illustrating an example of a combined display of the live image in the long axis direction and the assist image in the short axis direction after switching.
- FIG. 14A is a diagram illustrating an example of an assist image at the long-axis direction viewpoint before switching.
- FIG. 14B is a diagram illustrating an example of the assist image at the short-axis-direction viewpoint after switching.
- FIG. 15A is a diagram illustrating an example of an assist image at the long-axis direction viewpoint before switching.
- FIG. 15B is a diagram illustrating an example of an assist image with the zoom magnification increased at the short-axis-direction viewpoint after switching.
- FIG. 16 is a flowchart illustrating an example of an operation for switching the setting of the assist image.
- FIG. 17A is a diagram illustrating another example of the second display state.
- FIG. 17B is a diagram illustrating another example of the first display state.
- FIG. 18 is a flowchart illustrating the operation of the ultrasonic diagnostic apparatus according to the second embodiment.
- FIG. 19A is a diagram illustrating a system for acquiring probe position information using a camera.
- FIG. 19B is a diagram illustrating a specific example 1 in which position information of the probe cannot be acquired.
- FIG. 19C is a diagram illustrating a specific example 1 of a screen displaying warning information.
- FIG. 19D is a diagram illustrating a specific example 2 in which position information of the probe cannot be acquired.
- FIG. 19E is a diagram illustrating a specific example 2 of a screen displaying warning information.
- FIG. 20A is a diagram illustrating a display example 1 in which the posture of the subject is associated with the orientation of the 3D image.
- FIG. 20B is a diagram illustrating a display example 2 in which the posture of the subject is associated with the orientation of the 3D image.
- FIG. 21 is a diagram illustrating a screen configuration example using an assist image including images from two viewpoints.
- FIG. 22 is a flowchart illustrating the operation of the ultrasonic diagnostic apparatus according to the third embodiment.
- FIG. 23 is a schematic diagram showing an installation example of an intraoperative navigation system.
- FIG. 24 is a diagram showing an outline of information import into the virtual three-dimensional space.
- FIG. 25 is a block diagram illustrating a configuration of the image processing apparatus according to the fourth embodiment.
- FIG. 26 is a flowchart illustrating the operation of the image processing apparatus according to the fourth embodiment.
- FIG. 27A is a diagram illustrating an example of an assist image displayed in the second display state.
- FIG. 27B is a diagram illustrating an example of the assist image displayed in the first display state.
- FIG. 28A is a diagram illustrating an example of a physical format of a flexible disk which is a recording medium body.
- FIG. 28B is a diagram showing the appearance of the flexible disk as seen from the front, its cross-sectional structure, and the flexible disk itself.
- FIG. 28C is a diagram showing a configuration for recording and reproducing a program on a flexible disk.
- FIGS. 1A to 1D are explanatory diagrams of how images look when the carotid artery is scanned with ultrasound: FIG. 1A is a schematic diagram showing the probe and the scan plane, FIG. 1B shows the two directions in which the carotid artery is scanned with the probe, and FIGS. 1C and 1D show examples of how ultrasonic images acquired by a long-axis scan and a short-axis scan look, respectively.
- Ultrasonic transducers (not shown) are arranged on the probe 10; for example, when the transducers are arranged one-dimensionally, an ultrasonic image is obtained for the scan plane 11 shown in FIG. 1A.
- When the carotid artery is scanned in the two directions shown in FIG. 1B, a blood vessel image in the long-axis direction as in FIG. 1C and a blood vessel image in the short-axis direction as in FIG. 1D are obtained, respectively.
- FIGS. 2A to 2D are cross-sectional views showing the structure of an arterial blood vessel: FIG. 2A shows a short-axis cross section, FIG. 2B shows a long-axis cross section, and FIG. 2D shows an example of thickening of the intima-media in the long-axis cross section.
- The arterial blood vessel wall 20 is composed of three layers: an intima 22, a media 23, and an adventitia 24.
- As arteriosclerosis progresses, the intima 22 and the media 23 are mainly thickened, as shown in FIGS. 2C and 2D. Therefore, in carotid artery diagnosis using ultrasound, the intima-media thickness, which combines the intima 22 and the media 23, is measured by detecting the intima boundary 25 and the adventitia boundary 26 shown in FIG. 2C.
- A portion where the intima-media thickness exceeds a certain value is called a plaque 27, which appears in the long-axis image as a change in the structure of the blood vessel wall, as shown in FIG. 2D. In examining the plaque 27, both the short-axis image and the long-axis image are generally checked.
- Here, an ultrasonic diagnostic apparatus 30 for guiding the examiner is proposed.
- FIG. 3 is a block diagram showing a configuration of the ultrasonic diagnostic apparatus 30.
- the ultrasonic diagnostic apparatus 30 includes a 3D image analysis unit 31, a position information acquisition unit 32, an assist image generation unit 33, a live image acquisition unit 34, and a display control unit 35 as shown in FIG.
- The 3D image analysis unit 31 analyzes a previously acquired three-dimensional image (hereinafter, 3D image), determines target position information tgtInf including the three-dimensional position (hereinafter, simply position) and orientation of the target portion to be measured (hereinafter, measurement target) within the subject, and outputs the determined target position information tgtInf to the assist image generation unit 33.
- the position information acquisition unit 32 acquires instrument position information indicating the scanning position (scan position) and orientation of the probe 10 currently being scanned using, for example, a magnetic sensor or an optical camera.
- Based on the 3D image, the target position information tgtInf, and the instrument position information, the assist image generation unit 33 generates the assist image asis0, in which information indicating the measurement plane of the measurement target and the position and orientation of the current scan plane is superimposed on the 3D image.
- The display control unit 35 displays, on the display device 150, the live image, which is the ultrasonic image at the current scan position, together with the assist image.
- FIG. 4 is a flowchart showing the operation of the ultrasonic diagnostic apparatus 30. Here, it is assumed that a 3D image indicating an organ shape to be diagnosed is generated in advance.
- First, the 3D image analysis unit 31 analyzes the 3D image and determines target position information including the position and orientation of the measurement target (step S001). Subsequently, the position information acquisition unit 32 acquires instrument position information indicating the current scanning position and orientation of the probe 10 (step S002). Next, the assist image generation unit 33 calculates the difference between the measurement target and the current scan position, and generates path information Z that changes the color or shape of the displayed image according to the difference (step S003). Then, the assist image generation unit 33 generates an assist image including the path information Z in addition to the 3D image, the position of the measurement target, and the current scan position (step S004).
- Finally, as shown in FIG. 5, for example, the display control unit 35 displays on the display device 150 a screen 40 in which the live image 48, an ultrasonic image at the current scan position, and the assist image 41 are combined (step S005).
- The assist image 41 includes a 3D image 42 indicating the shape of the organ including the target portion, an image 43 indicating the current position of the probe 10, an image 44 indicating the current scan plane, an image 46 indicating the scan plane of the measurement target, an image 45 indicating the position to which the probe 10 should be moved to scan the measurement target, and an arrow 47 indicating the direction in which to move the probe 10.
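The "path information Z" step described above (step S003) can be sketched as follows. This is an illustrative assumption, not taken from the patent: the function name, thresholds, and colour grading are hypothetical stand-ins for however the apparatus actually varies the colour or shape of the guidance display with the difference.

```python
# Hypothetical sketch of the path-information step: derive a guidance
# direction and a colour from the probe-to-target difference, so the
# display changes as the probe approaches the measurement target.
def path_info(target_pos, probe_pos):
    """Return direction, distance, and a colour grading for the guidance."""
    dx, dy, dz = (t - p for t, p in zip(target_pos, probe_pos))
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    # Assumed grading: near -> green, mid -> yellow, far -> red.
    colour = "green" if dist < 5.0 else ("yellow" if dist < 20.0 else "red")
    return {"direction": (dx, dy, dz), "distance": dist, "colour": colour}
```

The assist image generator would then draw something like the arrow 47 along `direction` and tint it with `colour`.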
- When the examiner moves the probe to align the scan plane with the measurement target, the examiner follows a two-step process of rough alignment followed by fine adjustment. The rough alignment proceeds smoothly by mainly referring to the assist image, and the fine adjustment by mainly viewing the live image.
- However, if the assist image 41 and the live image 48 are always displayed with the same screen configuration, it is difficult for the examiner to know which image to focus on while moving the probe.
- To address this, an image processing apparatus according to an aspect of the present invention is an image processing apparatus that generates an assist image, which is an image for guiding movement of an instrument to a target location in a subject, and includes: a 3D image analysis unit that determines, based on a three-dimensional image including the target location, target position information indicating a three-dimensional position of the target location; a position information acquisition unit that acquires instrument position information indicating a three-dimensional position of the instrument; a display state determination unit that selects one display state from two or more display states based on the positional relationship between the target location and the instrument; an assist image generation unit that generates the assist image in the selected display state using the target position information and the instrument position information; and a display control unit that performs control for outputting the assist image to a display device.
- The two or more display states may include a first display state in which the display in the assist image is shown at a first zoom magnification, and a second display state in which the display in the assist image is shown at a second zoom magnification larger than the first. The display state determination unit may select the first display state when the positional relationship does not satisfy a first predetermined condition, and the second display state when the positional relationship satisfies the first predetermined condition.
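A minimal sketch of this first/second display-state selection, assuming the "first predetermined condition" is a simple distance threshold (the names, magnifications, and threshold below are illustrative, not from the patent):

```python
# Hypothetical sketch: zoom the assist image in only once the probe is
# close enough to the measurement target.
from dataclasses import dataclass

@dataclass
class DisplayState:
    zoom: float  # zoom magnification of the display in the assist image

FIRST_STATE = DisplayState(zoom=1.0)   # first magnification (overview)
SECOND_STATE = DisplayState(zoom=2.5)  # second, larger magnification

DISTANCE_THRESHOLD_MM = 20.0  # assumed stand-in for the first condition

def select_display_state(distance_mm: float) -> DisplayState:
    """Select the enlarged state when the probe-to-target distance is small."""
    if distance_mm <= DISTANCE_THRESHOLD_MM:
        return SECOND_STATE  # positional relationship satisfies the condition
    return FIRST_STATE
```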
- With this, the assist image can be switched to an enlarged display and presented in a manner that is easy for the user to understand.
- Further, the 3D image analysis unit may determine, based on the three-dimensional image, the orientation of the target location in addition to its three-dimensional position as the target position information, and the position information acquisition unit may acquire the orientation of the instrument in addition to its three-dimensional position as the instrument position information.
- Further, the instrument may be a probe for acquiring an ultrasonic image of the subject in an ultrasonic diagnostic apparatus; the position information acquisition unit may acquire the scanning position and orientation of the probe as the instrument position information, and the assist image generation unit may generate an assist image for guiding movement of the probe to the target location.
- an assist image which is an image for guiding the movement of the probe to the target location, can be displayed in an easy-to-understand manner for the user.
- Further, the image processing apparatus may include a live image acquisition unit that acquires, from the probe, an ultrasonic image of the subject as a live image, and the display control unit may output the assist image and the live image for display.
- Further, the two or more display states may include a third display state in which the display device displays the assist image as a main image and the live image as a sub-image smaller than the main image, and a fourth display state in which the display device displays the live image as the main image and the assist image as the sub-image. The display state determination unit may select the third display state when the positional relationship does not satisfy a second predetermined condition, and the fourth display state when it does, and the display control unit may output the assist image and the live image to the display device in the selected display state.
- Further, the display control unit may switch between the main image and the sub-image by switching the relative display sizes of the assist image and the live image according to the selected display state, and output the assist image and the live image to the display device.
- Further, when the third display state is selected, the display state determination unit may select a display state based on whether the positional relationship satisfies a third predetermined condition, and when the fourth display state is selected, it may select a display state based on whether the positional relationship satisfies a fourth predetermined condition.
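Making the condition depend on the currently selected state amounts to hysteresis (cf. the operation of FIG. 12). A minimal sketch, with hypothetical distance thresholds standing in for the third and fourth predetermined conditions:

```python
# Hypothetical hysteresis sketch: the threshold checked depends on the
# current display state, so the screen does not flip back and forth when
# the probe hovers near a single boundary.
ENTER_MM = 15.0  # assist-main -> live-main once closer than this
EXIT_MM = 25.0   # live-main -> assist-main once farther than this

def next_state(current: str, distance_mm: float) -> str:
    """Return the next display state given the probe-to-target distance."""
    if current == "assist_main":  # third display state is selected
        return "live_main" if distance_mm < ENTER_MM else "assist_main"
    # fourth display state is selected
    return "assist_main" if distance_mm > EXIT_MM else "live_main"
```

A distance inside the 15-25 mm band leaves the current state unchanged, which is exactly the flicker-suppression the two separate conditions provide.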
- Further, the target location may be a blood vessel, and the display state determination unit may determine the positional relationship depending on whether or not a cross section substantially parallel to the traveling direction of the blood vessel is depicted in the live image, and may select one display state from the two or more display states based on the determined positional relationship.
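One way to decide whether the live image depicts a cross section substantially parallel to the vessel's traveling direction is to compare the scan plane's normal with the vessel direction; a sketch under assumed names and an assumed angular tolerance:

```python
import math

# Hypothetical sketch: the scan plane is parallel to the vessel when the
# plane's normal is (nearly) perpendicular to the vessel direction.
def is_long_axis_view(vessel_dir, plane_normal, tol_deg=10.0):
    """True when the scan plane is within tol_deg of parallel to the vessel."""
    dot = sum(a * b for a, b in zip(vessel_dir, plane_normal))
    norm = math.hypot(*vessel_dir) * math.hypot(*plane_normal)
    # Angle between the vessel direction and the plane normal, in degrees.
    ang = math.degrees(math.acos(max(-1.0, min(1.0, abs(dot) / norm))))
    return ang >= 90.0 - tol_deg
```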
- an assist image which is an image for guiding the movement of the probe to the target location, can be displayed in an easy-to-understand manner for the user.
- Further, the image processing apparatus may include a 3D image generation unit that generates the three-dimensional image from data acquired in advance. The 3D image generation unit may generate the three-dimensional image by extracting, as the data, the contour of the organ including the target location from an ultrasonic image obtained by scanning a region including the target location with the probe in advance, and may associate the position and orientation of the three-dimensional image in the three-dimensional space with the scanning position and orientation of the probe acquired by the position information acquisition unit.
- the position and orientation of the 3D image in the 3D space can be associated with the scanning position and orientation of the probe.
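This association can be sketched as a rigid transform between the 3D image's coordinates and the probe pose reported by the position information acquisition unit. The single-axis rotation below is a simplified stand-in for the full orientation that a magnetic sensor or optical camera would provide; all names are hypothetical:

```python
import math

def rotate_z(p, deg):
    """Rotate a 3D point about the z axis (stand-in for the full rotation
    recorded by the magnetic sensor or optical camera)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    x, y, z = p
    return (c * x - s * y, s * x + c * y, z)

def to_sensor_frame(point_img, probe_pos, probe_yaw_deg):
    """Map a point from 3D-image coordinates into sensor coordinates using
    the probe pose recorded at acquisition time."""
    r = rotate_z(point_img, probe_yaw_deg)
    return tuple(a + b for a, b in zip(r, probe_pos))
```

Once the 3D image lives in the sensor frame, the current probe pose and the target pose can be compared directly when generating the assist image.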
- Further, the assist image generation unit may generate navigation information based on the relative relationship between the current scanning position and orientation of the probe and the position and orientation of the target location, and may generate, as the assist image, an image in which a probe image indicating the current scanning position and orientation of the probe and the navigation information are superimposed on the three-dimensional image.
- With this, the assist image, which guides the movement of the probe to the target location, can be displayed in a manner that is even easier for the user to understand.
- Further, when the fourth display state is selected, the assist image generation unit may generate a plurality of cross-sectional images respectively indicating cross-sectional shapes of the target location from a plurality of directions, and may generate, as the assist image, an image in which a probe image indicating the current scanning position and orientation of the probe is superimposed on the generated plurality of cross-sectional images.
- Further, the target location may be a blood vessel, and the plurality of cross-sectional images may include two cross-sectional images showing cross-sectional shapes from the long-axis direction, which is the traveling direction of the blood vessel, and from a short-axis direction substantially orthogonal to the long-axis direction.
- For the two cross-sectional images, the assist image generation unit may generate, as the assist image, an image on which a straight line or a rectangle for guiding the movement of the probe to the target location is superimposed, based on the relative relationship between the current scanning position and orientation of the probe and the position and orientation of the target location.
- With this, the assist image, which guides the movement of the probe to the target location, can be displayed in a manner that is even easier for the user to understand.
- Further, the display state determination unit may calculate, as the positional relationship, the difference between the position and orientation of the target location and the position and orientation of the instrument using the target position information and the instrument position information, and may select one display state according to the calculated difference.
- Further, the display state determination unit may calculate the difference between the position and orientation of the target location and the position and orientation of the instrument using the target position information and the instrument position information, hold the calculated difference, calculate the displacement of the difference over time as the positional relationship, and select one display state according to the calculated displacement.
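A sketch of both variants: computing the position/orientation difference, and holding the previous difference to obtain its displacement over time. The function names and units are illustrative assumptions:

```python
import math

def pose_difference(target_pos, target_dir, probe_pos, probe_dir):
    """Positional and angular difference between target and instrument poses."""
    d_pos = math.dist(target_pos, probe_pos)
    dot = sum(a * b for a, b in zip(target_dir, probe_dir))
    norm = math.hypot(*target_dir) * math.hypot(*probe_dir)
    d_ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return d_pos, d_ang

class DifferenceTracker:
    """Holds the previously calculated difference so its displacement over
    time can be used as the positional relationship."""
    def __init__(self):
        self.prev = None
    def displacement(self, diff):
        disp = None if self.prev is None else diff - self.prev
        self.prev = diff
        return disp
```

A negative displacement (difference shrinking) would indicate the probe is approaching the target, which could trigger a state change even before the absolute difference crosses a threshold.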
- Further, the target location may be a surgical target site in the subject, the instrument may be a surgical instrument used for surgery on the subject, and the assist image generation unit may generate an assist image for guiding the movement of the surgical instrument to the surgical target site.
- the practitioner can confirm the movement of the operated surgical instrument, and can easily adjust the distance between the surgical instrument and the target location and the direction of excision or cutting.
- the image processing apparatus may further include a 3D image generation unit that generates the three-dimensional image from data acquired in advance.
- Further, the display state determination unit may calculate, as the positional relationship, the difference between the position of the target location and the position of the instrument using the target position information and the instrument position information, and may select one display state according to the calculated difference.
- Further, the display state determination unit may calculate the difference between the position of the target location and the position of the instrument using the target position information and the instrument position information, hold the calculated difference, calculate the displacement of the difference over time as the positional relationship, and select one display state according to the calculated displacement.
- Further, the two or more display states may include two or more display states that differ in at least one of the zoom magnification and the viewpoint of the assist image, and the display state determination unit may select, based on the positional relationship, one display state from those display states.
- an assist image can be generated in various display modes and displayed in an easy-to-understand manner for the user.
- the measurement target is not particularly limited as long as it is an organ that can be imaged by ultrasound, and includes a blood vessel, a heart, a liver, a breast, and the like.
- a carotid artery will be described as an example.
- FIG. 6 is a block diagram showing a configuration of the ultrasonic diagnostic apparatus 100 according to the first embodiment.
- the ultrasonic diagnostic apparatus 100 includes a 3D image analysis unit 101, a position information acquisition unit 102, a display state determination unit 103, an assist image generation unit 104, a transmission / reception unit 105, a live image acquisition unit 106, a display control unit 107, and a control unit 108.
- the ultrasonic diagnostic apparatus 100 is configured to be connectable to the probe 10, the display apparatus 150, and the input apparatus 160.
- the probe 10 has, for example, a plurality of transducers (not shown) arranged in a one-dimensional direction (hereinafter referred to as transducer arrangement direction).
- the probe 10 converts a pulsed or continuous wave electrical signal (hereinafter referred to as a transmission electrical signal) supplied from the transmission / reception unit 105 into a pulsed or continuous wave ultrasonic wave, and transmits the ultrasonic wave into the subject from the skin surface with which the probe 10 is in contact.
- the probe 10 receives a plurality of reflected ultrasonic waves from the subject, converts the reflected ultrasonic waves into electrical signals (hereinafter referred to as received electrical signals) with the plurality of transducers, and supplies these received electrical signals to the transmission / reception unit 105.
- an example of the probe 10 having a plurality of transducers arranged in a one-dimensional direction is shown, but the present invention is not limited to this.
- a two-dimensional array probe with transducers arranged in two dimensions, or an oscillating ultrasonic probe that constructs a three-dimensional tomographic image by mechanically oscillating a plurality of transducers arranged in a one-dimensional direction, may also be used, selected appropriately according to the measurement.
- the probe 10 may be provided with a part of the function of the transmission / reception unit 105 on the ultrasonic probe side.
- for example, a configuration may be used in which the probe 10 generates the transmission electrical signal based on a control signal (hereinafter referred to as a transmission control signal) for generating a transmission electrical signal output from the transmission / reception unit 105, converts the transmission electrical signal into an ultrasonic wave, converts the received reflected ultrasonic wave into a received electrical signal, and generates a later-described reception signal based on the received electrical signal within the probe 10.
- the display device 150 is a so-called monitor, and displays the output from the display control unit 107 as a display screen.
- the input device 160 includes various input keys, and is used by the operator for various settings of the ultrasonic diagnostic apparatus 100.
- the configuration illustrated in FIG. 6 illustrates an example of a configuration in which the display device 150 and the input device 160 are provided separately from the ultrasonic diagnostic apparatus 100, but is not limited to such a configuration.
- when the input device 160 is configured as a touch panel operated on the display device 150, the display device 150 and the input device 160 (and the ultrasonic diagnostic apparatus 100) may be integrated.
- the 3D image analysis unit 101 analyzes a 3D image acquired in advance, for example, by scanning the measurement target along the short axis, determines position information (target position information) tgtInf1 including the three-dimensional position and orientation of the measurement target, and outputs the determined target position information tgtInf1 to the display state determination unit 103.
- the position information acquisition unit 102 acquires position information (instrument position information) indicating the scan position and orientation of the probe 10 currently being scanned using, for example, a magnetic sensor or an optical camera.
- the display state determination unit 103 selects one display state from the two display states based on the positional relationship between the measurement target and the probe 10. Specifically, the display state determination unit 103 selects either the first display state or the second display state based on the position and orientation difference between the measurement target and the current scan position, and outputs mode information mode indicating the selected display state.
- the assist image generation unit 104 acquires the assist image generation information tgtInf2, which includes the 3D image data and the target position information of the measurement target, from the 3D image analysis unit 101, and generates an assist image in the display state indicated by the mode information mode.
- the assist image is an image for guiding the movement of the probe 10 to the measurement target, in which information indicating the measurement surface of the measurement target and the position and orientation of the current scan surface is superimposed on the 3D image.
- the transmission / reception unit 105 is connected to the probe 10, generates a transmission control signal related to transmission control of the ultrasonic beam of the probe 10, and transmits to the probe 10 a pulsed or continuous wave transmission electrical signal generated based on the transmission control signal.
- the transmission processing performed by the transmission / reception unit 105 means processing for generating a transmission control signal at least by the transmission / reception unit 105 and causing the probe 10 to transmit an ultrasonic wave (beam).
- the transmission / reception unit 105 amplifies the electrical signal received from the probe 10 and performs A / D conversion, performs reception processing for generating a reception signal, and supplies the reception signal to the live image acquisition unit 106.
- this received signal is composed of, for example, a plurality of signals arranged along the transducer arrangement direction and along the ultrasonic transmission direction perpendicular to the transducer arrangement (hereinafter referred to as the depth direction), and is a digital signal obtained by A / D converting the electrical signals converted from the amplitude of the reflected ultrasonic waves.
- the transmission process and the reception process are repeatedly performed continuously to construct a plurality of frames made up of a plurality of received signals.
- the reception process performed by the transmission / reception unit 105 means a process in which at least the transmission / reception unit 105 acquires a reception signal based on the reflected ultrasound.
- the frame referred to here means a single set of received signals necessary for constructing one tomographic image, a signal processed for constructing tomographic image data based on this set of received signals, or one piece of tomographic image data or one tomographic image constructed based on it.
- the live image acquisition unit 106 converts each received signal in the frame into a luminance signal corresponding to the intensity, and generates tomographic image data by performing coordinate conversion of the luminance signal on the orthogonal coordinate system.
- the live image acquisition unit 106 sequentially performs this process for each frame, and outputs the generated tomographic image data to the display control unit 107.
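The live-image construction described above (conversion of each received signal to a luminance value, then coordinate conversion onto an orthogonal system) can be sketched as follows. This is an illustrative simplification, not the apparatus's actual signal chain; the log-compression dynamic range, grid size, and nearest-neighbour mapping are assumptions.

```python
import math

def to_luminance(amplitude, dynamic_range_db=60.0, max_amplitude=1.0):
    """Map echo amplitude to an 8-bit luminance via log compression."""
    if amplitude <= 0:
        return 0
    db = 20.0 * math.log10(amplitude / max_amplitude)   # dB relative to max
    level = (db + dynamic_range_db) / dynamic_range_db  # normalize to [0, 1]
    return round(255 * min(1.0, max(0.0, level)))

def scan_convert(frame, angles_rad, depths, grid, cell_size):
    """Nearest-neighbour conversion of sector data onto a Cartesian grid.

    frame[line][sample] holds luminance for the beam at angles_rad[line]
    and range depths[sample]; grid is (width, height) in pixels.
    """
    w, h = grid
    image = [[0] * w for _ in range(h)]
    for li, theta in enumerate(angles_rad):
        for si, r in enumerate(depths):
            x = int(w / 2 + (r * math.sin(theta)) / cell_size)
            y = int((r * math.cos(theta)) / cell_size)
            if 0 <= x < w and 0 <= y < h:
                image[y][x] = frame[li][si]
    return image
```

A real scan converter would interpolate between beams rather than scatter samples, but the coordinate-conversion step is the same in principle.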
- the display control unit 107 displays the live image, which is the ultrasonic image (tomographic image data) at the current scan position acquired by the live image acquisition unit 106, and the assist image on the display device 150 according to the screen configuration specified by the mode information mode.
- the control unit 108 controls each unit in the ultrasonic diagnostic apparatus 100 based on an instruction from the input device 160.
- FIG. 7 is a flowchart showing the operation of the ultrasonic diagnostic apparatus 100 according to the first embodiment.
- the 3D image analysis unit 101 analyzes a 3D image acquired in advance to determine target position information including the position and orientation of the cross section serving as the measurement target, and also sets, as the measurement range, the range in which the difference in position or orientation from the measurement target is equal to or less than a threshold (step S101).
- 8A to 8D are diagrams illustrating an example of a flow of generating a 3D image using an ultrasonic image.
- the entire carotid artery is scanned with the probe 10 to acquire tomographic image data of short-axis images over a plurality of frames 51 as shown in FIG. 8A, and the blood vessel contour 52 is extracted from each frame 51 of the short-axis images as shown in FIG. 8B.
- then, as shown in FIG. 8C, the blood vessel contour 52 of each frame 51 is placed in the 3D space, and a 3D image 53 of the carotid artery is constructed as shown in FIG. 8D.
- when the 3D image is constructed, position information including the position and orientation of the probe 10 at the time each frame was acquired is used, and the blood vessel contour 52 of each frame 51 is arranged in the 3D space based on this position information.
- the position information can be calculated based on, for example, an optical marker attached to the probe 10 captured by a camera and a change in the shape of the optical marker in the captured image.
- position information may be acquired using a magnetic sensor, a gyroscope, an acceleration sensor, or the like.
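The placement of each frame's contour into 3D space using the probe's recorded pose can be sketched as follows. The pose representation (a translation plus a single yaw rotation about the z axis) is a simplifying assumption for illustration; the patent does not specify the transform.

```python
import math

def place_contour_in_3d(contour_2d, probe_position, yaw_rad):
    """Transform in-plane contour points (u, v) into 3D points.

    contour_2d:     list of (u, v) points in the scan plane (mm)
    probe_position: (x, y, z) of the scan-plane origin (mm)
    yaw_rad:        rotation of the scan plane about the z axis
    """
    px, py, pz = probe_position
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    points_3d = []
    for u, v in contour_2d:
        # rotate the in-plane lateral axis u, keep v as the depth (z) offset
        points_3d.append((px + c * u, py + s * u, pz + v))
    return points_3d
```

Applying this per frame, with the pose obtained from the optical marker or magnetic sensor, yields the stack of contours from which the 3D image 53 is built.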
- as the probe, not only a probe that acquires a two-dimensional image but also a probe that can acquire a three-dimensional image without being moved may be used.
- This is, for example, an oscillating probe whose scanning surface mechanically oscillates within the probe, or a matrix probe in which ultrasonic transducers are two-dimensionally arranged on the probe surface.
- the 3D image may be acquired by a method other than ultrasound such as CT (Computer Tomography) or MRI (Magnetic Resonance Imaging).
- the ultrasound diagnostic apparatus 100 may be configured to generate a 3D image.
- FIG. 9A is a diagram showing the position and orientation of the measurement target in the 3D image
- FIG. 9B is a diagram showing the position of the measurement target in the long-axis cross section
- FIG. 9C is a diagram showing the position of the measurement target in the short-axis cross section.
- the position and orientation of the measurement target vary depending on the organ being measured and the purpose of diagnosis.
- when the measurement organ is the carotid artery under examination, the measurement target in the 3D image 53 generally has the position and orientation shown in FIG. 9A. Therefore, in the long-axis cross section along the traveling direction of the blood vessel, the 3D image analysis unit 101 determines a part at a predetermined distance 62 from the measurement reference position 61, which is set based on the shape of the carotid artery as shown in FIG. 9B, as the measurement target 63.
- in the short-axis direction, the 3D image analysis unit 101 determines the position of the measurement target 63 so that it lies on a plane (hereinafter referred to as the maximum active surface 66) passing through the line (hereinafter referred to as the center line 65) connecting the centers of the contours 64 of the short-axis images in the frames constituting the 3D image.
- when the blood vessel branches, the 3D image analysis unit 101 determines the maximum active surface 66 to be a plane passing through the centers of the contours before and after the branching, or a plane inclined from that plane by a predetermined angle.
- the 3D image analysis unit 101 may store the position information of the measurement target at the time of the previous diagnosis, and determine the measurement target 63 so that the next measurement can be performed at the same position and orientation as the previous one.
- the 3D image analysis unit 101 calculates the thickness of the intima by extracting the intima boundary and the outer membrane boundary of the blood vessel from the short-axis images acquired when the 3D image was generated, and can detect a part whose thickness is equal to or greater than a threshold as a plaque.
- the 3D image analysis unit 101 may determine, as the measurement target 63, the cross section in the long-axis direction at the position where the thickness is maximum within a plaque detected in this way. In the present embodiment, the 3D image analysis unit 101 determines the measurement target 63, but the examiner may set the measurement target 63 manually.
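The plaque-detection rule described above (wall thickness from the two extracted boundaries, thresholded, then the thickest position chosen as the measurement target) can be sketched as follows. The boundary representation as per-position depths and the 1.1 mm threshold are illustrative assumptions, not values from the patent.

```python
def detect_plaque(intima_depths, outer_depths, threshold_mm=1.1):
    """Return (index, thickness) pairs where wall thickness >= threshold.

    intima_depths[i] / outer_depths[i]: extracted boundary depths (mm)
    at longitudinal position i along the vessel.
    """
    plaques = []
    for i, (a, b) in enumerate(zip(intima_depths, outer_depths)):
        thickness = b - a
        if thickness >= threshold_mm:
            plaques.append((i, thickness))
    return plaques

def thickest_position(plaques):
    """Position of maximum thickness, used to place the measurement target."""
    return max(plaques, key=lambda p: p[1])[0] if plaques else None
```

The long-axis cross section through `thickest_position(...)` would then be taken as the measurement target 63.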
- the position information acquisition unit 102 acquires position information (instrument position information) indicating the current scan position and orientation of the probe 10 (step S102).
- the position information acquisition unit 102 acquires the position information using various sensors such as a camera or a magnetic sensor as described above.
- when a camera is used, for example, an optical marker composed of four markers is attached to the probe 10, and the position and orientation of the marker are estimated from the center coordinates and sizes of the four markers in the image acquired by the camera, whereby the scan position and orientation of the probe 10 can be estimated.
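The marker-based estimation above can be sketched with a pinhole-scaling approximation: depth from the apparent spacing of two markers, lateral position from the back-projected centroid, and in-plane rotation from the marker axis. A real implementation would use a full perspective-n-point solver; the focal length, marker spacing, and marker layout below are assumed values.

```python
import math

FOCAL_PX = 800.0       # assumed camera focal length in pixels
MARKER_SPAN_MM = 40.0  # assumed physical distance between markers 0 and 1

def estimate_probe_pose(marker_centers_px):
    """Estimate (x_mm, y_mm, z_mm, roll_rad) from four marker centres.

    marker_centers_px: four (u, v) pixel coordinates, where markers 0 and 1
    are assumed horizontally adjacent on the physical marker jig.
    """
    (u0, v0), (u1, v1), _, _ = marker_centers_px
    span_px = math.hypot(u1 - u0, v1 - v0)
    z = FOCAL_PX * MARKER_SPAN_MM / span_px          # depth from apparent size
    cu = sum(u for u, _ in marker_centers_px) / 4.0  # centroid in the image
    cv = sum(v for _, v in marker_centers_px) / 4.0
    x = cu * z / FOCAL_PX                            # back-project centroid
    y = cv * z / FOCAL_PX
    roll = math.atan2(v1 - v0, u1 - u0)              # in-plane rotation
    return x, y, z, roll
```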
- the display state determination unit 103 determines whether or not the current scan position is within the measurement range of the measurement target (step S103). If it is within the measurement range (Yes in step S103), the display state determination unit 103 selects the first display state (step S104). Next, the assist image generation unit 104 generates an assist image in the first display state using the assist image generation information tgtInf2, which includes the 3D image data and the target position information of the measurement target (step S105). Then, the display control unit 107 displays the assist image and the live image, which is the ultrasonic image at the current scan position acquired by the live image acquisition unit 106, on the display device 150 in the first display state (step S106).
- the display state determination unit 103 selects the second display state (step S107).
- the assist image generation unit 104 generates an assist image in the second display state using the assist image generation information tgtInf2 including the 3D image data and the target position information of the measurement target (step S108).
- the display control unit 107 displays the assist image and the live image on the display device 150 in the second display state (step S109).
- in step S110, it is determined whether or not the process is complete. If the process is not complete (No in step S110), the acquisition of the current position information (step S102) is repeated.
- FIG. 10 is a flowchart illustrating an example of a screen display switching operation. Note that the flowchart shown in FIG. 10 describes only a part that replaces step S103 to step S109 shown in FIG.
- the display state determination unit 103 calculates the difference in position and orientation between the measurement target and the current scan position (step S1101). Subsequently, the display state determination unit 103 determines whether or not the difference between the position and the orientation with respect to the specific direction of the 3D image is equal to or less than a threshold value (step S1102).
- the specific direction may be set taking into consideration all three mutually orthogonal axes of the three-dimensional space coordinates, or may be set based on the shape of the organ being measured. For example, when the measurement target is parallel to the center line of the blood vessel, it can be determined that the difference in position and orientation is equal to or less than the threshold when the distance between the center of the measurement target and the center of the scan plane at the current scan position is equal to or less than a threshold and the scan plane at the current scan position is close to parallel to the center line.
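The example condition above (centre-to-centre distance under a threshold, and the scan plane nearly parallel to the vessel centre line) can be sketched as follows. The 10 mm and 10 degree thresholds are illustrative assumptions; the plane is taken as parallel to the line when its normal is perpendicular to the line direction.

```python
import math

def angle_between(v1, v2):
    """Angle in radians between two 3D vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def within_measurement_range(target_center, scan_center,
                             center_line_dir, scan_plane_normal,
                             dist_threshold_mm=10.0,
                             angle_threshold_rad=math.radians(10)):
    """True when the scan plane is close to, and nearly parallel to, the target."""
    dist_ok = math.dist(target_center, scan_center) <= dist_threshold_mm
    # the plane is parallel to the centre line when its normal is
    # perpendicular to the line direction
    tilt = abs(angle_between(center_line_dir, scan_plane_normal) - math.pi / 2)
    return dist_ok and tilt <= angle_threshold_rad
```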
- the display state determination unit 103 selects the first display state (step S104).
- the assist image generation unit 104 generates an assist image in the first display state using the assist image generation information tgtInf2 (step S105).
- the display control unit 107 displays on the display device 150 in the first display state (fourth display state) in which the main display is an ultrasonic live image and the sub display is an assist image (step S1103).
- the display state determination unit 103 selects the second display state (step S107).
- the assist image generation unit 104 generates an assist image in the second display state using the assist image generation information tgtInf2 (step S108).
- the display control unit 107 displays on the display device 150 in the second display state (third display state) in which the main display is the assist image and the sub display is the live image (step S109).
- the main display means the display at the center of the screen of the display device 150 on which the ultrasonic image is displayed, or the portion occupying the largest area on the screen, and the sub display refers to the display of information on the screen other than the main display.
- FIG. 11A is a diagram illustrating an example of the carotid artery, the measurement organ, in 3D space, FIG. 11B is a diagram illustrating an example of the second display state, and FIG. 11C is a diagram illustrating an example of the first display state.
- in step S1102, it is determined whether the current scan position is within the measurement range, that is, whether the positional difference between the current scan position and the measurement target in 3D space and the difference in rotation angle around the z axis are each equal to or less than a preset threshold.
- by determining in this way, it is possible to roughly judge whether a long-axis image can be drawn by rotating the probe about the z axis, and the display state can be switched when a long-axis image can be drawn.
- the second display state shown in FIG. 11B is the screen display when the current scan position is out of the measurement range, that is, when the long-axis image is not yet drawn, and allows the scan position to be moved to the target position while mainly referring to the assist image.
- in this state, the assist image 73 is displayed on the screen 70 as the main display 71 and the live image 74 as the sub display 72.
- the first display state shown in FIG. 11C is the screen display when the current scan position is within the measurement range. Since the scan position is in the vicinity of the target position, the live image 75 is displayed as the main display 71 and the assist image 73 as the sub display 72 on the screen 70, so that positioning can be performed while mainly referring to the ultrasonic live image 75.
- in the assist image 73, a 3D image 42 indicating the organ shape including the target portion, an image 43 indicating the current position of the probe 10, an image 44 indicating the current scan plane, an image 46 indicating the scan plane of the measurement target, an image 45 indicating the position to which the probe 10 should be moved for scanning the measurement target, and an arrow 47 indicating the direction in which the probe 10 should be moved are displayed.
- the left-right relationship between the assist image and the live image is reversed before and after the switching, but the present invention is not limited to this.
- the display area may be enlarged while the live image 74 is displayed on the right side of the screen so that the left-right relationship is not switched.
- the screen display switching is not limited to two patterns, and the screen display may be changed continuously by enlarging or reducing the display area of each image based on the difference in position and orientation.
- in step S1102 of FIG. 10, the display state is switched based on whether the difference between the position and orientation of the measurement target and the current scan position is equal to or smaller than the threshold. Therefore, if the probe is moved near the threshold boundary, the display state switches frequently, and the visibility of the assist image and the live image may be lowered.
- FIG. 12 is a flowchart showing an operation for stably switching the display state by introducing hysteresis in the display state switching determination.
- here, steps S1105 to S1107, which are added to the flowchart of FIG. 10, will be described.
- the display state determination unit 103 determines whether or not the current display state is the second display state (step S1105). If it is the second display state (Yes in step S1105), the display state determination unit 103 sets the threshold used for the display state switching determination to T1 for each of the position and orientation (step S1106). On the other hand, if it is not the second display state (No in step S1105), the display state determination unit 103 sets a threshold T2 different from T1 (step S1107). For example, the position threshold T1 used in the second display state is set to 8 mm, and the position threshold T2 used in the first display state is set to 10 mm.
- in this case, from the second display state, the first display state is entered when the position difference becomes 8 mm or less.
- since the threshold in the first display state is 10 mm, the first display state is then maintained as long as the difference is less than 10 mm. Therefore, even if the probe moves about 2 mm around the point where the difference is 8 mm, the display state does not oscillate and can be kept stable.
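A minimal sketch of this hysteresis, assuming (consistent with the transition behaviour described) that the tighter 8 mm threshold applies while in the second display state and the looser 10 mm threshold while in the first:

```python
# Two thresholds give the switching hysteresis: small probe movements
# near 8 mm no longer toggle the screen back and forth.
T1_MM = 8.0   # threshold applied while in the second display state
T2_MM = 10.0  # threshold applied while in the first display state

def next_display_state(current_state, diff_mm):
    """current_state: "first" or "second"; diff_mm: position difference (mm)."""
    threshold = T2_MM if current_state == "first" else T1_MM
    return "first" if diff_mm <= threshold else "second"
```

Starting in the second display state, a 9 mm difference keeps the second state (9 > 8); once the first state has been entered, the same 9 mm difference keeps the first state (9 <= 10), so a roughly 2 mm jitter around 8 mm does not cause oscillation.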
- the elements switched based on the difference between the measurement target and the current scan position are not limited to the display state, such as the main display and the sub display; they may also be parameters related to the appearance of the assist image itself, such as the viewpoint direction and zoom magnification in the assist image.
- FIG. 13A is a diagram illustrating an example of a carotid artery that is a measurement target in 3D space
- FIG. 13B is a diagram illustrating an example of a carotid artery in the long axis direction in 3D space
- FIG. 13C is a diagram illustrating an example of the carotid artery in the short-axis direction in 3D space, and FIG. 13D is a diagram illustrating an example of the combined display, after switching, of the live image in the long-axis direction and the assist image in the short-axis direction.
- in FIG. 13A, the three-dimensional shape of the carotid artery has its short-axis cross section parallel to the xz plane and its traveling direction parallel to the y axis, and a long-axis image is drawn by rotating the probe about the z axis.
- the positional relationship between the current scan position 82 and the measurement target 81 is easy to understand when viewed in the viewpoint direction (z-axis direction in the figure) from which the long axis image can be seen.
- a viewpoint direction (y-axis direction in the drawing) in which the scan position and inclination in the short-axis section 84 can be grasped is desirable.
- the blood vessel image is drawn only on a part of the screen as the shift of the rotation angle around the z-axis between them increases.
- strictly speaking, blood vessels meander slightly, but at least the run from the common carotid artery to the bifurcation is nearly straight, and this assumption is practically useful. Under it, the rotation around the x and z axes and the position in the y-axis direction can be grasped from the live image, while the rotation around the y axis and the positions in the x- and z-axis directions can be grasped from the assist image, so all positional relationships can be grasped by combining the two.
- the traveling direction of the blood vessel can be determined based on the center line of the 3D image.
- FIG. 14A is a diagram showing an example of an assist image at the long-axis direction viewpoint before switching
- FIG. 14B is a diagram showing an example of the assist image at the short-axis direction viewpoint after switching.
- when the scan position is out of the measurement range, the assist image 85 at the long-axis viewpoint shown in FIG. 14A is displayed, and when it is within the measurement range, the assist image at the short-axis viewpoint shown in FIG. 14B is displayed.
- FIG. 15A is a diagram illustrating an example of the assist image at the long-axis direction viewpoint before switching
- FIG. 15B is a diagram illustrating an example of the assist image at the short-axis direction viewpoint, after switching and with the zoom magnification increased.
- when the scan position is out of the measurement range, the zoom magnification is lowered as shown in FIG. 15A so that the whole can be seen, and when the scan position is within the measurement range, the zoom magnification is increased so that the vicinity of the measurement target can be seen in detail as shown in FIG. 15B.
- FIG. 16 is a flowchart showing an example of an operation for switching the setting of the assist image. Steps S201 to S203 are substantially the same as steps S101, S102, and S103 in FIG. Here, the processing of steps S204 and S205 will be described.
- the assist image generation unit 104 switches the setting of each element such as the viewpoint direction of the assist image and the zoom magnification (step S204). Then, the assist image generation unit 104 generates an assist image reflecting the switching (step S205). The switching of each element such as the viewpoint direction and the zoom magnification may be used together with the switching of the screen display.
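The setting switch of steps S204 and S205 can be sketched as follows: the viewpoint direction and zoom magnification of the assist image are chosen according to whether the current scan position is within the measurement range. The concrete viewpoint labels and the 1.0 / 2.0 magnifications are illustrative assumptions.

```python
def assist_image_settings(in_measurement_range):
    """Return assist-image rendering parameters for step S204."""
    if in_measurement_range:
        # near the target: short-axis viewpoint, zoomed-in detail view
        return {"viewpoint": "short_axis", "zoom": 2.0}
    # far from the target: long-axis viewpoint, zoomed-out overview
    return {"viewpoint": "long_axis", "zoom": 1.0}
```

The assist image generation unit 104 would then render the 3D image with these parameters in step S205, possibly together with the main/sub display switch.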
- the display state of the screen is dynamically switched based on whether or not the current scan position is within the measurement range of the measurement target.
- the probe can be guided more easily for the examiner.
- furthermore, if the viewpoint direction of the 3D space in the assist image is changed in accordance with the current scan position and orientation, the examiner can also be guided so that the measurement target and the scan position can be aligned easily.
- configurations other than the screen configurations illustrated in FIGS. 11B and 11C may be used. For example, instead of displaying the main display 71 and the sub display 72 separately on the screen 70, a screen configuration in which the sub display 77 is contained within the main display 76, as shown in FIGS. 17A and 17B, may be used.
- in the above description, the display state determination unit 103 selects the first display state or the second display state based on the difference between the position and orientation of the measurement target and the current scan position; however, the present invention is not limited to this.
- the display state determination unit 103 may select the first display state or the second display state based on the difference in position between the measurement target and the current scan position.
- further, the display state determination unit 103 may hold the difference in position and orientation (or position alone) between the measurement target and the current scan position, and select the first display state or the second display state based on the displacement of the difference over time.
- the second embodiment is different from the first embodiment in that the position information acquisition unit 102 of the ultrasonic diagnostic apparatus 100 determines whether or not the probe position information can be acquired. Since the configuration is the same as that of the ultrasound diagnostic apparatus 100 of the first embodiment shown in FIG. 6, the position information acquisition unit 102 will be described using the same reference numerals.
- for example, when a camera is used, position information cannot be acquired correctly if the optical marker is hidden from the camera, and when a magnetic sensor is used, position information cannot be acquired correctly if the probe goes out of the magnetic field range or approaches a device or metal object that disturbs the magnetic field.
- the position information acquisition unit 102 determines whether the position information of the probe 10 has been acquired.
- FIG. 18 is a flowchart showing the operation of the ultrasonic diagnostic apparatus 100 according to the second embodiment. Since the steps other than step S108 and step S109 are the same as those in FIG. 7, description thereof is omitted.
- in step S108, the position information acquisition unit 102 determines whether the position information of the probe 10 has been acquired. If it has been acquired (Yes in step S108), the process proceeds to step S103.
- if the position information has not been acquired (No in step S108), the position information acquisition unit 102 instructs the display control unit 107 to display warning information indicating that the position information cannot be acquired, and the warning information is displayed on the display device 150 (step S109).
- if position information becomes acquirable again after step S103, information indicating that may also be displayed.
- furthermore, not only whether or not position information can be acquired, but also a display based on the reliability of the position information may be performed. For example, if the gain, exposure, white balance, etc. of the camera are not appropriate and the detection accuracy of the optical marker position in the image acquired by the camera is lowered, the reliability is lowered. In this case, a numerical value based on the reliability may be displayed in step S109 or step S103, or a figure whose shape, pattern, or color changes based on the reliability may be displayed.
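One way the reliability display described above could be realised is to fold the camera conditions into a single score and map it to a sign colour. The inputs (marker count, a sharpness score, an exposure flag), the weighting, and the colour cut-offs below are all illustrative assumptions, not values from the patent.

```python
def reliability_score(markers_detected, marker_sharpness, exposure_ok):
    """Combine camera conditions into a reliability score in [0, 1].

    markers_detected: how many of the four markers were found (0..4)
    marker_sharpness: detection-sharpness score in [0, 1]
    exposure_ok:      whether gain/exposure/white balance are appropriate
    """
    score = (markers_detected / 4.0) * 0.6 + marker_sharpness * 0.3
    score += 0.1 if exposure_ok else 0.0
    return min(1.0, score)

def reliability_color(score):
    """Map the score to the colour of the sign shown on the screen."""
    if score >= 0.8:
        return "green"   # position information trustworthy
    if score >= 0.4:
        return "yellow"  # degraded accuracy
    return "red"         # warning: position information unreliable
```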
- FIG. 19A is a diagram illustrating a configuration example of a system that acquires position information by photographing an optical marker attached to a probe with a camera.
- the optical marker is composed of four markers 15a to 15d, and the position information acquisition unit 102 uses the center coordinates and sizes of the four markers in the image acquired by the camera 90. Based on the above, the position and orientation of the marker are estimated.
- FIG. 19B is a diagram showing specific example 1, in which position information cannot be acquired because the marker 15c cannot be detected due to the shadow of the probe itself, and FIG. 19C is a diagram showing specific example 1 of a screen displaying warning information.
- when position information cannot be acquired, a red circle sign 91 indicating that fact is displayed on the screen 70 as shown in FIG. 19C. When position information can be acquired, a green circle sign 91, differing from the red circle that serves as the warning information, may be displayed.
- FIG. 19D is a diagram showing specific example 2, in which position information cannot be acquired because the probe 10 is out of the field of view of the camera 90, and FIG. 19E is a diagram showing specific example 2 of a screen displaying warning information.
- in this case, the current position of the probe 10 is indicated by a cross mark 93 as shown in FIG. 19E, and an arrow 94 pointing from the current position of the probe toward the measurement target 92 is displayed on the assist screen.
- FIG. 20A is a diagram showing a display example 1 in which the posture of the subject is associated with the orientation of the 3D image
- FIG. 20B is a diagram of display example 2 in which the posture of the subject is associated with the orientation of the 3D image.
- information relating the 3D image of the carotid artery and the orientation of the subject's body may be displayed on the assist image.
- For example, as shown in display example 1 in FIG. 20A, the direction of the subject's head may be indicated; display example 2 in FIG. 20B shows another way of presenting this correspondence.
- the direction of the head can be determined by, for example, detecting the face of the subject from the camera image or detecting the silhouette of the head or shoulder of the subject.
- the carotid artery branches from one vessel into two, and the side on which the two branched vessels lie may be taken as the head direction.
- the direction of the head can also be determined by constraining the scanning direction in advance, for example, by setting the short axis scan for constructing the 3D image as the direction from the bottom to the top of the neck.
- the assist image may always display information from a plurality of viewpoint directions instead of switching the viewpoint direction when switching between the main display and the sub display.
- FIG. 21 is a diagram illustrating a screen configuration example using an assist image including images (cross-sectional images) in two viewpoint directions, ie, a long-axis viewpoint and a short-axis viewpoint, in carotid artery diagnosis.
- the viewpoint directions in the assist image 71 are always the two directions of the long axis and the short axis, and the assist image 71 displays an image 78 from the long-axis viewpoint and an image 79 from the short-axis viewpoint. Further, by combining the live image 72 and the assist image 71, position and orientation information about all three axes x, y, and z can be obtained, so there is no need to switch the viewpoint direction.
- the main display may always be a live image and the sub display may be an assist image without switching the screen configuration.
- information indicating the current scan position may be superimposed on the assist image only when the current scan position is measured within the measurement range.
- Information indicating whether the current scan position is within the measurement range may be displayed.
- the third embodiment differs from the first embodiment in that the display state determination unit 103 of the ultrasonic diagnostic apparatus 100 switches the display state depending on whether a long-axis image is depicted in the ultrasonic image. Since the configuration is the same as that of the ultrasonic diagnostic apparatus 100 according to Embodiment 1 shown in FIG. 6, the display state determination unit 103 is described using the same reference numeral.
- the display state determination unit 103 determines whether or not the ultrasound image at the current scan position acquired from the live image acquisition unit 106 depicts a long-axis image. Then, when a long-axis image is drawn, the display state determination unit 103 selects the first display state in which the main display is an ultrasonic live image and the sub display is an assist image. In addition, when the long-axis image is not drawn, the display state determination unit 103 selects the second display state in which the main display is the assist image and the sub display is the live image.
- the inner membrane boundary and outer membrane boundary of the long-axis image of the blood vessel can be extracted based on the ultrasonic B image, color flow, or power Doppler image.
- in the case of a B-mode image, an edge near the boundary may be searched for based on luminance values; in the case of a color flow or power Doppler image, the blood flow region is assumed to correspond to the lumen of the blood vessel.
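As a minimal sketch of the luminance-based edge search mentioned above: along one scan line of a B-mode image, the lumen-to-wall boundary can be located as the steepest positive luminance rise after the dark lumen. The function name and threshold-free gradient criterion are assumptions for illustration; a real detector would add smoothing and validity checks.

```python
import numpy as np

def find_edge(scan_line, lumen_start=0):
    """Return the index just past the largest luminance rise after lumen_start."""
    line = np.asarray(scan_line, dtype=float)
    grad = np.diff(line[lumen_start:])          # forward differences
    return lumen_start + int(np.argmax(grad)) + 1
```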
- FIG. 22 is a flowchart showing the operation of the ultrasonic diagnostic apparatus 100 according to the third embodiment. Since the steps other than step S301 are the same as those in the flowchart of Embodiment 1, their description is omitted.
- the display state determination unit 103 determines whether the ultrasound image at the current scan position acquired from the live image acquisition unit 106 depicts a long-axis image (step S301). If it is determined that a long-axis image is depicted (Yes in step S301), the display state determination unit 103 selects the first display state, in which the main display is the ultrasonic live image and the sub display is the assist image (step S104). On the other hand, if it is determined that a long-axis image is not depicted (No in step S301), the display state determination unit 103 selects the second display state, in which the main display is the assist image and the sub display is the live image (step S107).
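The branch in step S301 can be sketched as follows. The concrete long-axis detector is abstracted into a hypothetical predicate based on how much of the image width a traced vessel boundary covers; the threshold and all names are illustrative assumptions, not details from the patent.

```python
FIRST_DISPLAY_STATE = "main=live, sub=assist"    # long-axis image depicted
SECOND_DISPLAY_STATE = "main=assist, sub=live"   # long-axis image not depicted

def select_display_state(boundary_coverage, threshold=0.8):
    """boundary_coverage: fraction of image columns where a vessel wall was traced."""
    depicts_long_axis = boundary_coverage >= threshold
    return FIRST_DISPLAY_STATE if depicts_long_axis else SECOND_DISPLAY_STATE
```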
- the display state of the screen is dynamically switched based on whether or not the ultrasound image depicts a long axis image. As a result, the probe can be guided more easily for the examiner.
- the operation for diagnosing carotid artery plaque has been mainly described.
- the assist image is effective not only for plaque but also for Doppler measurement which is important in vascular diagnosis.
- the position information of the sample gate for Doppler measurement is determined by the 3D image analysis unit 101 or set manually, and the examiner is guided so that the set position of the sample gate can be scanned.
- the position of the sample gate can be set so as to be close to the boundary between the common carotid artery and the carotid sinus, to a predetermined distance from the carotid artery bifurcation, or to the plaque site.
- it can also be used to observe other blood vessels such as the abdominal aorta and subclavian artery, and tumors of the liver and breast.
- An intraoperative navigation system is a system that displays the positional relationship between the patient's position during surgery and the surgical instruments. It is used, for example, to improve the recognizability of tumor or blood vessel positions, and to display the position of surgical instruments relative to surgical targets such as bones or organs in order to improve safety during surgery.
- FIG. 23 is a schematic diagram showing an installation example of an intraoperative navigation system
- FIG. 24 is a diagram showing an outline of information import into a virtual three-dimensional space.
- a surgical instrument 203 such as an endoscope may be inserted from an incision 202 of a patient 201 to be operated, and a desired part may be excised or cut. If the desired site is not visible, a surgical navigation system is used to indicate to the practitioner where the tip of the surgical instrument 203 is in the body.
- the surgical navigation system includes an optical marker 213 installed on the surgical instrument 203, a tracking system comprising one or more imaging devices 511 such as CCD cameras and an image processing device 500 at the bedside where the patient is laid, and a display device (monitor) 250 on which navigation information (the assist image) is displayed.
- the tracking system images the optical marker 213 with the imaging device 511, calculates the position and orientation information 223 of the optical marker 213 in space, and converts that information into the position and orientation of the distal end portion of the surgical instrument 203. Based on the acquired position and orientation information, an object simulating the surgical instrument 203 is placed in the three-dimensional space 520 virtually set in the tracking system.
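The marker-to-tip conversion described above amounts to applying a fixed rigid offset in the marker's local frame. A hedged sketch, assuming the tip offset is a prior calibration value (the offset, names, and frame convention are illustrative, not from the patent):

```python
import numpy as np

# Assumed calibration: the instrument tip sits at a fixed offset along the
# optical marker's local z axis.
TIP_OFFSET = np.array([0.0, 0.0, 150.0])  # mm, hypothetical value

def tip_position(marker_pos, marker_rot):
    """marker_pos: (3,) marker origin in camera frame;
    marker_rot: (3, 3) marker-to-camera rotation matrix."""
    return np.asarray(marker_pos, float) + np.asarray(marker_rot, float) @ TIP_OFFSET
```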
- the position of a desired part of a patient to be operated is generally confirmed in advance in terms of its three-dimensional shape and size by a preoperative simulation.
- a region to be excised or cut is determined in advance using the three-dimensional volume data 510 of a surgical target site (target location) acquired by a modality such as CT, MRI, PET, or an ultrasonic diagnostic apparatus.
- the alignment 222 between the actual region to be operated on and the three-dimensional volume data 510 is performed when the patient is fixed to the bed 204 before the operation is started. That is, on the assumption that the positional relationship between the imaging device 511 and the surgical target patient 201, or the bed 204 to which the patient 201 is fixed, does not change, the position, posture, and size of the surgical target region are taken into the tracking system.
- optical markers 214 and 211 are placed at predetermined positions (on the bed and at physical feature points of the patient such as bones), and the tracking system measures their spatial position and posture information.
- FIG. 25 is a block diagram illustrating a configuration of the image processing apparatus 500 according to the fourth embodiment.
- the image processing apparatus 500 includes a 3D image generation unit 501, a position information acquisition unit 502, a display state determination unit 503, an assist image generation unit 504, and a display control unit 505 as illustrated in FIG.
- the image processing apparatus 500 is connected to a database that stores volume data 510, an imaging apparatus 511, and a display apparatus 250.
- the imaging device 511 is a photographing unit such as a CCD camera, and acquires images of a surgical target patient and a surgical instrument including an optical marker.
- the volume data 510 is three-dimensional image data of a region to be operated, and is generally acquired by a modality such as CT or MRI before surgery. It is also possible to perform navigation while updating volume data at any time by acquiring data in real time using an ultrasonic diagnostic apparatus.
- the display device 250 is a so-called monitor, and displays an output from the display control unit 505 as a display screen.
- the 3D image generation unit 501 generates a 3D image of the surgical target site by rendering the volume data 510.
- the 3D image generation unit 501 may determine a region to be excised or cut and reflect information on the region or the like in the 3D image.
- the position information acquisition unit 502 acquires, based on the image acquired by the imaging device 511 and the image of the optical marker placed on the surgical target patient or the surgical instrument, position information including the three-dimensional position and posture of the surgical target region (target position information) and position information indicating the three-dimensional position and posture (orientation) of the surgical instrument (instrument position information).
- the display state determination unit 503 selects one display state from the two display states based on the positional relationship between the surgical target site (target location) and the surgical instrument. Specifically, the display state determination unit 503 selects either the first display state or the second display state based on the position difference (distance) between the surgical target site and the surgical instrument. At this time, the display state determination unit 503 calculates the distance between the surgical target site and the surgical instrument from the position of the surgical target site and the surgical instrument in the virtual three-dimensional space.
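The selection logic described here can be sketched as a Euclidean-distance test in the virtual three-dimensional space. The 50 mm threshold and the function name are illustrative assumptions; the patent only states that a predetermined range is used.

```python
import math

def choose_state(target_xyz, tool_xyz, threshold_mm=50.0):
    """Return ('second', d) for the close-up state when the instrument is
    within the predetermined range of the target, else ('first', d)."""
    dist = math.dist(target_xyz, tool_xyz)
    return ("second", dist) if dist <= threshold_mm else ("first", dist)
```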
- the assist image generation unit 504 generates an assist image so as to be displayed in the display state selected by the display state determination unit 503.
- the display control unit 505 controls the position and size when displaying the assist screen on the display device 250, and displays an assist image on the display device 250.
- FIG. 26 is a flowchart showing the operation of the image processing apparatus 500 according to the fourth embodiment.
- the 3D image generation unit 501 generates a 3D image to be displayed in the assist image by acquiring 3D volume data, obtained in advance, that includes the surgical target region of the patient, and rendering the 3D volume data (step S501).
- the 3D image generation unit 501 may perform resection or cutting site designation corresponding to preoperative simulation (generally, resection or cutting site setting is performed separately before operation).
- the position information acquisition unit 502 acquires target position information, such as the three-dimensional position, posture, and size of the surgical target site, based on an image acquired by the imaging device 511 in an environment where the geometrical positional relationship between the imaging device 511 and the bed or the surgical target patient in the operating room is fixed (step S502).
- the 3D image generation unit 501 performs alignment by calibrating the 3D image, the 3D position, posture, size, and the like of the surgical target site (step S503).
- the position information acquisition unit 502 acquires information on the position and posture of the surgical instrument based on the image acquired by the imaging device 511. Furthermore, these pieces of information are converted into information on the position and posture of the surgical instrument tip (step S504).
- the 3D image generation unit 501 arranges the surgical target part and the surgical instrument in the virtual three-dimensional space from the information on the position and posture of the surgical target part, the surgical instrument, and the surgical instrument tip (step S505).
- the display state determination unit 503 calculates the distance between the surgical target site and the surgical instrument in the virtual three-dimensional space (step S506).
- the display state determining unit 503 determines whether the distance between the surgical target site and the surgical instrument in the virtual three-dimensional space is within a predetermined range (step S507).
- if the distance is within the range, the display state determination unit 503 selects the second display state (step S508).
- the display state determination unit 503 then changes settings such as the zoom magnification and the line-of-sight direction of the assist image (step S509).
- if the distance is not within the range, the display state determination unit 503 selects the first display state (step S510).
- the assist image generation unit 504 generates an assist image to be displayed in the first display state or the second display state selected by the display state determination unit 503 (step S511).
- the display control unit 505 displays an assist image on the display device 250 (step S512).
- FIGS. 27A and 27B are diagrams illustrating an example of an assist image displayed by the image processing apparatus 500
- FIG. 27A is a diagram illustrating an example of an assist image displayed in the first display state, and FIG. 27B is a diagram illustrating an example of an assist image displayed in the second display state.
- the assist image displayed in the first display state is used when the surgical target site and the surgical instrument are outside a predetermined range (more than a certain distance apart); as shown in FIG. 27A, the viewpoint is placed at a point away from the 3D volume data (or the cut-out angle of view is set wide) so that the positional relationship between the surgical target site and the surgical instrument can be confirmed from a bird's-eye view.
- the assist image displayed in the second display state is used when the distance between the surgical target site and the surgical instrument is within the predetermined range (no more than a certain distance); as shown in FIG. 27B, the viewpoint is placed close to the 3D volume data (or the cut-out angle of view is set narrow) so that the positional relationship and the movement of the surgical instrument can be confirmed in more detail.
- next, it is determined whether the process is complete (step S513).
- if the process is not complete (No in step S513), the assist image should reflect the latest positional relationship between the surgical target region and the surgical instrument.
- therefore, information on the position and posture of the surgical instrument is acquired (step S514). Then, it is determined whether the position or posture of the surgical instrument has changed (step S515). If there is a change (Yes in step S515), the processing from step S506 is repeated. If there is no change (No in step S515), the processing from step S513 is repeated.
- although the procedure described here updates only the instrument position information of the surgical instrument, the target position information of the surgical target part may also be updated as necessary. In this case as well, if the positional relationship between the surgical target region and the surgical instrument changes, the processing from step S506 is executed.
- the instrument position information of the surgical instrument is updated in real time, and the assist image displayed on the display device 250 is updated accordingly. Therefore, the practitioner can confirm the movement of the operated surgical instrument on the display device 250, and can easily adjust the distance between the surgical instrument and the target location and the direction of resection or cutting.
- in the initial state, the assist image generation unit 504 generates an assist image in the first display state, which allows an overview of the whole, on the assumption that the surgical target site and the surgical instrument are separated.
- when switching to the second display state, the assist image settings, such as the zoom magnification and the line-of-sight direction, are changed from their initial values in the first display state.
- the display state determination unit 503 sets the zoom magnification, the line-of-sight direction, and the like in the initial state.
- the distance calculated by the display state determination unit 503 may be taken between the center of gravity of the resection or cutting region in the surgical target site and the distal end of the surgical instrument, but is not limited thereto.
- step S501 the 3D image generation unit 501 may perform the resection or cutting site designation corresponding to the preoperative simulation.
- the simulation result (the resection or cutting region) may be superimposed on the 3D image displayed in step S511.
- a step or means for determining whether the surgical instrument has reached the resection or cutting region may be added; when it is determined that it has, a 3D image from which that region has been deleted may be regenerated and the display updated. In this way, the practitioner can more easily grasp the progress of the operation.
- the method of capturing an optical marker with a camera has been described as a method for acquiring position information.
- a magnetic sensor or a multi-joint arm may be used.
- in step S507, m types of image display states (m is a natural number) may be prepared, and the n-th display state (n is a natural number smaller than m) selected.
- in step S507, it may also be determined whether the absolute difference between the distance at a certain time t and the distance at time t−1 is greater than or equal to a predetermined magnitude, and whether the change is positive or negative, and the display state is then stepped from the n-th to the (n+1)-th or (n−1)-th state. In this way, the region to be resected or cut is enlarged as the surgical instrument approaches the surgical target site; that is, an image that changes smoothly from FIG. 27A to FIG. 27B is obtained.
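The m-state stepping just described can be sketched as follows. The state count, dead band, and distance-to-state mapping are illustrative assumptions; the sketch only shows the one-level-per-update stepping that produces the smooth zoom between FIG. 27A and FIG. 27B.

```python
def step_state(n, d_now, d_prev, m=8, d_max=200.0, dead_band=5.0):
    """Step the display state index n toward the state implied by the current
    distance d_now, ignoring frame-to-frame changes smaller than dead_band."""
    if abs(d_now - d_prev) < dead_band:
        return n                       # jitter: keep the current state
    # Map distance to a target state: closer instrument -> higher zoom level.
    target = min(m - 1, max(0, int((d_max - d_now) / d_max * m)))
    if target > n:
        return n + 1                   # zoom in one level
    if target < n:
        return n - 1                   # zoom out one level
    return n
```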
- 28A to 28C are explanatory diagrams when the image processing method of each of the above embodiments is executed by a computer system using a program recorded on a recording medium such as a flexible disk.
- FIG. 28A shows an example of the physical format of the flexible disk that is the recording medium body, and FIG. 28B shows the appearance of the flexible disk seen from the front, its cross-sectional structure, and the flexible disk itself.
- the flexible disk FD is built into the case F; on the surface of the disk, a plurality of tracks Tr are formed concentrically from the outer periphery toward the inner periphery, and each track is divided into 16 sectors Se in the angular direction. Therefore, on a flexible disk storing the program, the program is recorded in an area allocated on the flexible disk FD.
- FIG. 28C shows a configuration for recording and reproducing the program on the flexible disk FD.
- the program is written from the computer system Cs via the flexible disk drive.
- when the image processing method is to be implemented in the computer system by the program on the flexible disk, the program is read from the flexible disk by the flexible disk drive and transferred to the computer system.
- the recording medium is not limited to this, and any recording medium such as an IC card or a ROM cassette capable of recording a program can be similarly implemented.
- the blocks of the ultrasonic diagnostic apparatus of FIG. 6 and the image processing apparatus of FIG. 25 are typically realized as an LSI (Large Scale Integration), which is an integrated circuit. Each block may be made into an individual chip, or a single chip may include some or all of the blocks.
- the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- a dedicated circuit for graphics processing such as a GPU (Graphics Processing Unit) can also be used.
- An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- the units of the ultrasonic diagnostic apparatus of FIG. 6 and the image processing apparatus of FIG. 25 may be connected via a network such as the Internet or a LAN (Local Area Network).
- functions may also be added to each unit via a network.
- according to the image processing apparatus and method of the present invention, the time required to align the scan position with the target can be reduced, and improved examination efficiency in screening for arteriosclerosis can be expected.
Abstract
Description
The present inventor found that the following problems arise with image processing apparatuses such as the ultrasonic diagnostic apparatus and the intraoperative navigation system described in the "Background Art" section.
In this embodiment, a case where the image processing apparatus according to one aspect of the present invention is applied to an ultrasonic diagnostic apparatus is described with reference to the drawings. The measurement target is not particularly limited as long as it is an organ that can be imaged with ultrasound, and includes blood vessels, the heart, the liver, and the breast; here, the carotid artery is used as an example.
Embodiment 2 differs from Embodiment 1 in that the position information acquisition unit 102 of the ultrasonic diagnostic apparatus 100 determines whether position information of the probe can be acquired. Since the configuration is the same as that of the ultrasonic diagnostic apparatus 100 of Embodiment 1 shown in FIG. 6, the position information acquisition unit 102 is described using the same reference numeral.
Embodiment 3 differs from Embodiment 1 in that the display state determination unit 103 of the ultrasonic diagnostic apparatus 100 switches the display state depending on whether a long-axis image is depicted in the ultrasonic image. Since the configuration is the same as that of the ultrasonic diagnostic apparatus 100 of Embodiment 1 shown in FIG. 6, the display state determination unit 103 is described using the same reference numeral.
In this embodiment, a case where the image processing apparatus according to one aspect of the present invention is applied to an intraoperative navigation system is described with reference to the drawings. An intraoperative navigation system is a system that displays the positional relationship between the patient's position during surgery and the surgical instruments. It is used, for example, to improve the recognizability of tumor or blood vessel positions, and to display the position of surgical instruments relative to surgical targets such as bones or organs in order to improve safety during surgery.
By recording a program for realizing the image processing method described in each of the above embodiments on a recording medium such as a flexible disk, the processing described in the above embodiments can easily be carried out in an independent computer system.
30, 100 Ultrasonic diagnostic apparatus
31, 101 3D image analysis unit
32, 102, 502 Position information acquisition unit
33, 104, 504 Assist image generation unit
34, 106 Live image acquisition unit
35, 107, 505 Display control unit
103, 503 Display state determination unit
105 Transmission/reception unit
108 Control unit
150, 250 Display device
160 Input device
500 Image processing apparatus
501 3D image generation unit
510 Volume data
511 Imaging device
Claims (22)
- An image processing apparatus that generates an assist image, which is an image for guiding movement of an instrument to a target location in a subject, the apparatus comprising: a 3D image analysis unit that determines target position information indicating the three-dimensional position of the target location based on a three-dimensional image including the target location; a position information acquisition unit that acquires instrument position information indicating the three-dimensional position of the instrument; a display state determination unit that selects one display state from among two or more display states based on the positional relationship between the target location and the instrument; an assist image generation unit that generates the assist image using the target position information and the instrument position information so that the assist image is displayed in the selected display state; and a display control unit that performs control for outputting the assist image to a display device.
- The image processing apparatus according to claim 1, wherein the two or more display states include a first display state in which the assist image is displayed at a first zoom magnification, and a second display state in which the assist image is displayed at a second zoom magnification larger than the first magnification, and the display state determination unit selects the first display state when the positional relationship does not satisfy a first predetermined condition, and selects the second display state when the positional relationship satisfies the first predetermined condition.
- The image processing apparatus according to claim 1 or 2, wherein the 3D image analysis unit determines, based on the three-dimensional image, the orientation of the target location in addition to its three-dimensional position as the target position information, and the position information acquisition unit acquires the orientation of the instrument in addition to its three-dimensional position as the instrument position information.
- The image processing apparatus according to claim 3, wherein the instrument is a probe for acquiring an ultrasonic image of the subject in an ultrasonic diagnostic apparatus, the position information acquisition unit acquires the scan position and orientation of the probe as the instrument position information, and the assist image generation unit generates an assist image, which is an image for guiding movement of the probe to the target location.
- The image processing apparatus according to claim 4, further comprising a live image acquisition unit that acquires an ultrasonic image of the subject, which is a live image, from the probe, wherein the display control unit outputs the assist image and the live image to the display device.
- The image processing apparatus according to claim 5, wherein the two or more display states include a third display state in which the display device displays the assist image as a main image and displays the live image as a sub image smaller than the main image, and a fourth display state in which the display device displays the live image as the main image and displays the assist image as the sub image; the display state determination unit selects the third display state when the positional relationship does not satisfy a second predetermined condition and selects the fourth display state when the positional relationship satisfies the second predetermined condition; and the display control unit outputs the assist image and the live image to the display device in the selected display state.
- The image processing apparatus according to claim 6, wherein the display control unit switches between the main image and the sub image by switching the relative display sizes of the assist image and the live image according to the selected display state, and outputs the assist image and the live image to the display device.
- The image processing apparatus according to claim 6 or 7, wherein, when the third display state is selected, the display state determination unit selects a display state based on whether the positional relationship satisfies a third predetermined condition, and when the fourth display state is selected, selects a display state based on whether the positional relationship satisfies a fourth predetermined condition.
- The image processing apparatus according to any one of claims 5 to 7, wherein the target location is a blood vessel, and the display state determination unit determines the positional relationship according to whether a cross section substantially parallel to the running direction of the blood vessel is depicted in the live image, and selects one display state from among the two or more display states based on the determined positional relationship.
- The image processing apparatus according to any one of claims 4 to 9, further comprising a 3D image generation unit that generates the three-dimensional image from data acquired in advance, wherein the 3D image generation unit generates the three-dimensional image by extracting the contour of an organ including the target location from ultrasonic images, which are the data, acquired by scanning a region including the target location with the probe in advance, and associates the position and orientation of the three-dimensional image in a three-dimensional space with the scan position and orientation of the probe acquired by the position information acquisition unit.
- The image processing apparatus according to any one of claims 4 to 10, wherein the assist image generation unit generates navigation information based on the relative relationship between the current scan position and orientation of the probe and the position and orientation of the target location, and generates, as the assist image, an image in which a probe image indicating the current scan position and orientation of the probe and the navigation information are superimposed on the three-dimensional image.
- [Correction under Rule 91 26.12.2013] The image processing apparatus according to any one of claims 6 to 8, wherein, when the fourth display state is selected, the assist image generation unit generates a plurality of cross-sectional images each showing the cross-sectional shape of the target location from one of a plurality of directions, and generates, as the assist image, an image in which a probe image indicating the current scan position and orientation of the probe is superimposed on the generated cross-sectional images.
- The image processing apparatus according to claim 12, wherein the target location is a blood vessel; the plurality of cross-sectional images include two cross-sectional images respectively showing cross-sectional shapes from the long-axis direction, which is the running direction of the blood vessel, and from the short-axis direction substantially orthogonal to the long-axis direction; and the assist image generation unit generates, as the assist image, an image in which a straight line or a rectangle for guiding movement of the probe to the target location is superimposed on the two cross-sectional images, based on the relative relationship between the current scan position and orientation of the probe and the position and orientation of the target location.
- The image processing apparatus according to claim 3, wherein the display state determination unit calculates, as the positional relationship, the respective differences between the position and orientation of the target location and the position and orientation of the instrument using the target position information and the instrument position information, and selects one display state according to the calculated differences.
- The image processing apparatus according to claim 3, wherein the display state determination unit calculates the respective differences between the position and orientation of the target location and the position and orientation of the instrument using the target position information and the instrument position information, and, by holding the calculated differences, calculates the displacement of the differences over time as the positional relationship and selects one display state according to the calculated displacement of the differences.
- The image processing apparatus according to any one of claims 1 to 3, wherein the target location is a surgical target site in the subject, the instrument is a surgical instrument used in surgery on the subject, and the assist image generation unit generates an assist image, which is an image for guiding movement of the surgical instrument to the surgical target site.
- The image processing apparatus according to claim 16, further comprising a 3D image generation unit that generates the three-dimensional image from data acquired in advance.
- The image processing apparatus according to claim 1 or 2, wherein the display state determination unit calculates, as the positional relationship, the difference between the positions of the target location and the instrument using the target position information and the instrument position information, and selects one display state according to the calculated difference.
- The image processing apparatus according to claim 1 or 2, wherein the display state determination unit calculates the difference between the positions of the target location and the instrument using the target position information and the instrument position information, and, by holding the calculated difference, calculates the displacement of the difference over time as the positional relationship and selects one display state according to the calculated displacement of the difference.
- The image processing apparatus according to any one of claims 1 to 19, wherein the two or more display states include two or more display states that differ in at least one of the zoom magnification and the viewpoint of the assist image, and the display state determination unit selects, based on the positional relationship, one display state from among the two or more display states that differ in at least one of the zoom magnification and the viewpoint of the assist image.
- An image processing method for generating an assist image, which is an image for guiding movement of an instrument to a target location in a subject, the method including: a 3D image analysis step of determining target position information indicating the three-dimensional position of the target location based on a three-dimensional image including the target location; a position information acquisition step of acquiring instrument position information indicating the three-dimensional position of the instrument; a display state determination step of selecting one display state from among two or more display states based on the positional relationship between the target location and the instrument; an assist image generation step of generating the assist image using the target position information and the instrument position information so that the assist image is displayed in the selected display state; and a display control step of performing control for outputting the assist image to a display device.
- A program for generating an assist image, which is an image for guiding movement of an instrument to a target location in a subject, the program causing a computer to execute: a 3D image analysis step of determining target position information indicating the three-dimensional position of the target location based on a three-dimensional image including the target location; a position information acquisition step of acquiring instrument position information indicating the three-dimensional position of the instrument; a display state determination step of selecting one display state from among two or more display states based on the positional relationship between the target location and the instrument; an assist image generation step of generating the assist image using the target position information and the instrument position information so that the assist image is displayed in the selected display state; and a display control step of performing control for outputting the assist image to a display device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014546867A JP6323335B2 (ja) | 2012-11-15 | 2013-11-11 | 画像処理装置、画像処理方法、およびプログラム |
US14/442,281 US20160270757A1 (en) | 2012-11-15 | 2013-11-11 | Image-processing apparatus, image-processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012251583 | 2012-11-15 | ||
JP2012-251583 | 2012-11-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014076931A1 true WO2014076931A1 (ja) | 2014-05-22 |
Family
ID=50730865
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/006625 WO2014076931A1 (ja) | 2012-11-15 | 2013-11-11 | 画像処理装置、画像処理方法、およびプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160270757A1 (ja) |
JP (1) | JP6323335B2 (ja) |
WO (1) | WO2014076931A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020519369A (ja) * | 2017-05-11 | 2020-07-02 | ベラソン インコーポレイテッドVerathon Inc. | 確率マップに基づいた超音波検査 |
JP2021079124A (ja) * | 2016-05-06 | 2021-05-27 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 簡素化された3dイメージング制御を有する超音波イメージングシステム |
JP7538705B2 (ja) | 2020-12-08 | 2024-08-22 | 富士フイルムヘルスケア株式会社 | 超音波診断システム及び操作支援方法 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106028930B (zh) | 2014-02-21 | 2021-10-22 | 3D集成公司 | 包括手术器械的套件 |
CN107072591B (zh) * | 2014-09-05 | 2021-10-26 | 普罗赛普特生物机器人公司 | 与靶器官图像的治疗映射结合的医师控制的组织切除 |
KR20160046670A (ko) * | 2014-10-21 | 2016-04-29 | 삼성전자주식회사 | 영상 진단 보조 장치 및 방법 |
EP3145419B1 (en) | 2015-07-21 | 2019-11-27 | 3dintegrated ApS | Cannula assembly kit, trocar assembly kit and minimally invasive surgery system |
DK178899B1 (en) * | 2015-10-09 | 2017-05-08 | 3Dintegrated Aps | A depiction system |
US11191524B2 (en) * | 2017-09-28 | 2021-12-07 | Canon Medical Systems Corporation | Ultrasonic diagnostic apparatus and non-transitory computer readable medium |
US10521916B2 (en) * | 2018-02-21 | 2019-12-31 | Covidien Lp | Locating tumors using structured light scanning |
JP6856816B2 (ja) * | 2018-02-23 | 2021-04-14 | 富士フイルム株式会社 | 超音波診断装置および超音波診断装置の制御方法 |
WO2020050017A1 (ja) * | 2018-09-04 | 2020-03-12 | 富士フイルム株式会社 | 超音波診断装置および超音波診断装置の制御方法 |
US20220015741A1 (en) * | 2019-01-09 | 2022-01-20 | Koninklijke Philips N.V. | Ultrasound system and method for shear wave characterization of anisotropic tissue |
WO2020246238A1 (ja) | 2019-06-06 | 2020-12-10 | 富士フイルム株式会社 | 3次元超音波撮像支援装置、方法、及びプログラム |
CN112991166B (zh) * | 2019-12-16 | 2024-09-06 | 无锡祥生医疗科技股份有限公司 | 智能辅助导引方法、超声设备及存储介质 |
CN113143168A (zh) * | 2020-01-07 | 2021-07-23 | 日本电气株式会社 | 医疗辅助操作方法、装置、设备和计算机存储介质 |
JP7447692B2 (ja) * | 2020-06-16 | 2024-03-12 | コニカミノルタ株式会社 | 超音波診断装置、超音波診断装置の制御方法、及び、超音波診断装置の制御プログラム |
JP2022074392A (ja) * | 2020-11-04 | 2022-05-18 | コニカミノルタ株式会社 | 超音波診断装置、超音波診断装置の制御方法、及び、超音波診断装置の制御プログラム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8808164B2 (en) * | 2008-03-28 | 2014-08-19 | Intuitive Surgical Operations, Inc. | Controlling a robotic surgical tool with a display monitor |
WO2010005571A2 (en) * | 2008-07-09 | 2010-01-14 | Innurvation, Inc. | Displaying image data from a scanner capsule |
2013
- 2013-11-11 JP JP2014546867A patent/JP6323335B2/ja not_active Expired - Fee Related
- 2013-11-11 WO PCT/JP2013/006625 patent/WO2014076931A1/ja active Application Filing
- 2013-11-11 US US14/442,281 patent/US20160270757A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005124712A (ja) * | 2003-10-22 | 2005-05-19 | Aloka Co Ltd | Ultrasound diagnostic apparatus |
JP2010200894A (ja) * | 2009-03-02 | 2010-09-16 | Tadashi Ukimura | Surgery support system and surgical robot system |
JP2010201049A (ja) * | 2009-03-05 | 2010-09-16 | Aloka Co Ltd | Ultrasound diagnostic apparatus |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021079124A (ja) * | 2016-05-06 | 2021-05-27 | Koninklijke Philips N.V. | Ultrasound imaging system with simplified 3D imaging controls |
JP7177870B2 (ja) | 2016-05-06 | 2022-11-24 | Koninklijke Philips N.V. | Ultrasound imaging system with simplified 3D imaging controls |
JP2020519369A (ja) * | 2017-05-11 | 2020-07-02 | Verathon Inc. | Ultrasound examination based on probability maps |
JP7538705B2 (ja) | 2020-12-08 | 2024-08-22 | Fujifilm Healthcare Corporation | Ultrasound diagnostic system and operation support method |
Also Published As
Publication number | Publication date |
---|---|
JP6323335B2 (ja) | 2018-05-16 |
US20160270757A1 (en) | 2016-09-22 |
JPWO2014076931A1 (ja) | 2017-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6323335B2 (ja) | Image processing device, image processing method, and program | |
EP2460473B1 (en) | Reference image display method for ultrasonography and ultrasonic diagnosis apparatus | |
JP4758355B2 (ja) | System for guiding a medical instrument in a patient's body | |
JP5208495B2 (ja) | Medical system | |
JP5433240B2 (ja) | Ultrasound diagnostic apparatus and image display apparatus | |
JP5395538B2 (ja) | Ultrasound diagnostic apparatus and control program for image data display | |
JP5830576B1 (ja) | Medical system | |
JP6873647B2 (ja) | Ultrasound diagnostic apparatus and ultrasound diagnosis support program | |
JP2013059609A (ja) | Medical image display apparatus and X-ray diagnostic apparatus | |
US20100286526A1 (en) | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus and ultrasonic image processing method | |
JP2008005923A (ja) | Medical guide system | |
JP5253893B2 (ja) | Medical image processing apparatus, ultrasound diagnostic apparatus, and ultrasound image acquisition program | |
US9990725B2 (en) | Medical image processing apparatus and medical image registration method using virtual reference point for registering images | |
US20230181148A1 (en) | Vascular system visualization | |
JP5942217B2 (ja) | Ultrasound diagnostic apparatus, ultrasound image processing apparatus, and ultrasound image processing program | |
JP2002253480A (ja) | Medical procedure assistance device | |
JP2009247641A (ja) | Ultrasound image processing apparatus and surgery support apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13855345 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014546867 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14442281 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13855345 Country of ref document: EP Kind code of ref document: A1 |