WO2023054001A1 - Image processing device, image processing system, image display method, and image processing program - Google Patents
Image processing device, image processing system, image display method, and image processing program
- Publication number
- WO2023054001A1 (PCT/JP2022/034644)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image processing
- image
- control unit
- display
- sensor
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
Description
- The present disclosure relates to an image processing device, an image processing system, an image display method, and an image processing program.
- Patent Documents 1 to 3 describe techniques for generating three-dimensional images of heart chambers or blood vessels using a US imaging system.
- US is an abbreviation for ultrasound.
- IVUS is an abbreviation for intravascular ultrasound.
- IVUS is a device or method that provides two-dimensional images in a plane perpendicular to the longitudinal axis of the catheter.
- IVUS is often used for procedures that use a catheter other than an IVUS catheter, such as ablation.
- For example, in ablation treatment, a septal puncture needle inserted into the right atrium punctures the fossa ovalis to form a pathway from the right atrium to the left atrium, the so-called Brockenbrough technique. Since there is a risk of complications such as perforation or cardiac tamponade during the puncture, it is desirable to fully confirm the puncture position. In that respect, an IVUS catheter, which can obtain 360-degree information, is well suited to confirming the puncture position within a given plane.
- With IVUS, however, images are acquired only intermittently along the catheter axis, making it difficult to grasp three-dimensional structures. As a result, confirmation of the puncture position in the axial direction may be insufficient.
- The purpose of the present disclosure is to make it easier to confirm a position within living tissue.
- An image processing apparatus as one aspect of the present disclosure includes a control unit that causes a display to show an image representing biological tissue based on tomographic data acquired by a sensor moving in a lumen of the biological tissue; causes the display to show, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves; and, when a user operation requesting marking of the position of the sensor is received, causes the display to show, together with the first element, a second element fixed at the position the first element occupied when the user operation was performed.
- As one embodiment, the control unit sets the color of the second element to a color different from that of the first element.
- As one embodiment, when receiving an operation requesting movement of the sensor to a position corresponding to the position of the second element, the control unit moves the sensor to that position.
- As one embodiment, when the user operation is received again, the control unit causes the display to show, together with the first element and the second element, a third element fixed at the position the first element occupied when the user operation was performed again.
- As one embodiment, the control unit sets the color of the third element to a color different from that of the second element.
- As one embodiment, the control unit causes the display to show a fourth element, fixed at a position between the second element and the third element, together with the first element, the second element, and the third element.
- As one embodiment, the control unit calculates the intermediate position between the second element and the third element as the position of the fourth element.
- As one embodiment, the control unit sets the color of the fourth element to a color different from those of the second element and the third element.
- As one embodiment, when receiving an operation requesting movement of the sensor to a position corresponding to the position of the fourth element, the control unit moves the sensor to that position.
- As one embodiment, the control unit sets the color of the region of the three-dimensional image between the cross section corresponding to the position of the second element and the cross section corresponding to the position of the third element to a color different from that of the adjacent regions.
- As one embodiment, the control unit causes the display to show a combination of a graphic element group including the first element and the second element and an elongated graphic element representing the movement range of the sensor.
- As one embodiment, the control unit displays the elongated graphic element in an orientation such that the longitudinal direction of the lumen in the three-dimensional image is parallel to the longitudinal direction of the elongated graphic element.
- As one embodiment, in the three-dimensional image, the control unit takes as the first element at least the voxels representing the inner surface of the biological tissue, or the voxels adjacent to those voxels and representing the lumen, within a first voxel group corresponding to the position of the sensor; takes as the second element at least the voxels representing the inner surface, or the voxels adjacent to those voxels and representing the lumen, within a second voxel group corresponding to the position of the sensor at the time the user operation was performed; and colors the second element so that it is distinguished from the first element.
- As one embodiment, the control unit accepts an operation of pressing one or more predetermined keys as the user operation.
- An image processing system as one aspect of the present disclosure includes the image processing device and a probe having the sensor.
- As one embodiment, the image processing system further includes the display.
- An image display method as one aspect of the present disclosure comprises: displaying, on a display, an image representing biological tissue based on tomographic data acquired by a sensor moving in a lumen of the biological tissue; displaying, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves; receiving a user operation requesting marking of the position of the sensor; and displaying on the display, together with the first element, a second element fixed at the position the first element occupied when the user operation was performed.
- An image processing program as one aspect of the present disclosure causes a computer to execute: a process of displaying, on a display, an image representing biological tissue based on tomographic data acquired by a sensor moving in a lumen of the biological tissue; a process of displaying, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves; and a process of, when a user operation requesting marking of the position of the sensor is received, displaying on the display, together with the first element, a second element fixed at the position the first element occupied when the user operation was performed.
- FIG. 1 is a perspective view of an image processing system according to the embodiments of the present disclosure.
- FIG. 2 is a diagram showing an example of a screen displayed on a display by the image processing system according to the first embodiment of the present disclosure.
- FIG. 3 is a diagram showing an example of a two-dimensional image displayed on a display by the image processing system according to the first embodiment of the present disclosure.
- FIG. 4 is a diagram showing an example of a cutting region formed by the image processing system according to each embodiment of the present disclosure.
- FIG. 5 is a block diagram showing the configuration of an image processing device according to each embodiment of the present disclosure.
- FIG. 6 is a diagram showing an example of a screen displayed on a display by an image processing system according to a modified example of the first embodiment of the present disclosure.
- FIG. 7 is a perspective view of a probe and drive unit according to the embodiments of the present disclosure.
- FIG. 8 is a flow chart showing the operation of the image processing system according to each embodiment of the present disclosure.
- FIG. 9 is a flow chart showing the operation of the image processing system according to each embodiment of the present disclosure.
- FIG. 10 is a diagram showing the result of binarizing a cross-sectional image of living tissue in each embodiment of the present disclosure.
- FIG. 11 is a diagram showing the result of extracting a point cloud of the inner surface of living tissue in each embodiment of the present disclosure.
- FIG. 12 is a diagram showing the result of calculating the center-of-gravity position of a cross section of living tissue in each embodiment of the present disclosure.
- FIG. 13 is a diagram showing the result of calculating the center-of-gravity positions of multiple cross sections of living tissue in each embodiment of the present disclosure.
- FIG. 14 is a diagram showing the result of smoothing the result of FIG. 13.
- FIG. 15 is a flowchart showing the operation of the image processing system according to the first embodiment of the present disclosure.
- FIG. 16 is a diagram showing an example of a screen displayed on a display by an image processing system according to a modified example of the first embodiment of the present disclosure.
- FIG. 17 is a block diagram showing the configuration of an image processing device according to a modified example of the first embodiment of the present disclosure.
- FIG. 18 is a diagram showing an example of a screen displayed on a display by an image processing system according to the second embodiment of the present disclosure.
- FIG. 19 is a diagram showing an example of a screen displayed on a display by an image processing system according to the second embodiment of the present disclosure.
- FIG. 20 is a diagram showing an example of a two-dimensional image displayed on a display by an image processing system according to the second embodiment of the present disclosure.
- FIG. 21 is a flow chart showing the operation of the image processing system according to the second embodiment of the present disclosure.
- An outline of the present embodiment will be described with reference to FIGS. 1 to 5.
- The image processing apparatus 11 is a computer that causes the display 16 to display an image representing the living tissue 60 based on tomographic data 51 acquired by a sensor moving in the lumen 63 of the living tissue 60, and that causes the display 16 to display, on the same screen 80 as the image, a first element that represents the position of the sensor and is displaced as the sensor moves.
- In this embodiment, the image representing the living tissue 60 is the three-dimensional image 53 shown on the right side of FIG. 2, but it may be a two-dimensional image 56, such as the cross-sectional image shown on the left side of FIG. 2.
- In this embodiment, the first element is a graphic element such as a slider knob, shown in FIG. 2 as the first graphic element 87a, but it may instead be voxels of the first voxel group 54a corresponding to the position of the sensor, colored with a first color such as white.
- When the image processing apparatus 11 receives a marking operation, that is, a user operation requesting marking of the position of the sensor, it causes the display 16 to display, together with the first element, a second element fixed at the position the first element occupied when the marking operation was performed.
- In this embodiment, the second element is a graphic element such as a rectangular mark, shown in FIG. 2 as the second graphic element 87b, but it may instead be voxels of the second voxel group 54b colored with a second color such as green.
- When the marking operation is performed again, the display 16 displays, together with the first element and the second element, a third element fixed at the position the first element occupied when the marking operation was performed again.
- In this embodiment, the third element is a graphic element such as a rectangular mark, shown in FIG. 2 as the third graphic element 87c, but it may instead be voxels of the third voxel group 54c, corresponding to the position of the sensor when the marking operation was performed again, colored with a third color such as red.
- The image processing device 11 causes the display 16 to display a fourth element, fixed at a position between the second element and the third element, together with the first element, the second element, and the third element.
- In this embodiment, the fourth element is a graphic element such as a rectangular mark, shown in FIG. 2 as the fourth graphic element 87d, but it may instead be voxels of the fourth voxel group 54d colored with a fourth color such as yellow.
- In this embodiment, an axis synchronized with the linear scaler of the IVUS pullback unit is displayed on the screen 80 as the fifth graphic element 86, specifically beside the two-dimensional image 56.
- The current position of the ultrasound element, which is the IVUS sensor, is always displayed on this axis as the first graphic element 87a.
- A user, such as a physician performing a procedure while manipulating a catheter, or a clinical engineer operating the IVUS system while watching the display 16, can mark the current position of the ultrasound element on the axis by pressing one or more predetermined keys, such as Ctrl and "B" simultaneously, as the marking operation. For example, when a septal puncture is performed, performing the marking operation while IVUS is detecting the upper end of the fossa ovalis 66 marks the position of the upper end of the fossa ovalis 66 on the axis as the second graphic element 87b. Performing the marking operation again while IVUS is detecting the lower end of the fossa ovalis 66 marks the position of the lower end on the axis as the third graphic element 87c.
- Because the intermediate position between the upper and lower ends of the fossa ovalis 66 is then calculated automatically, that intermediate position can further be marked on the axis as the fourth graphic element 87d. This makes it easier to confirm the central position of the fossa ovalis 66 as the puncture position, and the risk of complications can be reduced accordingly.
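- The marking flow described above can be summarized in a short sketch. This is an illustrative reading of the behavior, not code from the patent; the class, the method names, and the millimeter units are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MarkingState:
    # Marked sensor positions along the pullback axis, in mm (units assumed).
    marks: list[float] = field(default_factory=list)

    def on_marking_operation(self, sensor_position_mm: float) -> None:
        # The new element is fixed at the sensor position current at the
        # moment the key combination (e.g. Ctrl+B) is pressed.
        self.marks.append(sensor_position_mm)

    def midpoint(self) -> float | None:
        # Once two positions are marked (e.g. upper and lower ends of the
        # fossa ovalis), their intermediate position is derived automatically.
        if len(self.marks) < 2:
            return None
        return (self.marks[0] + self.marks[1]) / 2.0

state = MarkingState()
state.on_marking_operation(42.0)   # marking operation at the upper end
state.on_marking_operation(30.5)   # marking operation again at the lower end
print(state.midpoint())            # 36.25 -> position of the fourth element
```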
- The three-dimensional image 53 can also be regarded as equivalent to this axis. That is, in the three-dimensional image 53, the first voxel group 54a is the voxel group on the same plane as the current position of the ultrasound element, and the voxels colored with the first color can be considered to be displayed on the axis as a colored line indicating the current position of the ultrasound element.
- Since the second voxel group 54b is the voxel group on the same plane as the ultrasound element at the time the upper end of the fossa ovalis 66 was detected by IVUS and the marking operation was performed, the voxels colored with the second color can be considered to be displayed on the axis as a colored line indicating the position of the upper end of the fossa ovalis 66.
- Since the third voxel group 54c is the voxel group on the same plane as the ultrasound element at the time the lower end of the fossa ovalis 66 was detected by IVUS and the marking operation was performed again, the voxels colored with the third color can be considered to be displayed on the axis as a colored line indicating the position of the lower end of the fossa ovalis 66.
- Since the fourth voxel group 54d is the voxel group that lies between the second voxel group 54b and the third voxel group 54c, the voxels colored with the fourth color can be considered to be displayed on the axis as a colored line indicating the intermediate position between the upper and lower ends of the fossa ovalis 66.
- The image processing device 11 causes the display 16 to display the three-dimensional data 52 representing the living tissue 60 as the three-dimensional image 53. As shown in FIG. 4, the image processing device 11 forms, in the three-dimensional data 52, a cutting region 62 that exposes the lumen 63 of the living tissue 60 in the three-dimensional image 53. As shown in FIG. 2, the image processing device 11 causes the display 16 to display, together with the three-dimensional image 53, a two-dimensional image 56 representing a cross section 64 of the living tissue 60 and a region 65 corresponding to the cutting region 62 in that cross section.
- The user can therefore see from the two-dimensional image 56 what kind of structure the portion of the living tissue 60 that has been cut away and is not displayed in the three-dimensional image 53 has. For example, if the user is an operator, it becomes easier to perform procedures inside the living tissue 60.
- The image processing device 11 causes the display 16 to display the three-dimensional image 53 including a three-dimensional object representing the elongated medical device 67 inserted into the lumen 63.
- In this embodiment, the elongated medical device 67 is a catheter with a puncture needle attached to its tip, but it may be another type of elongated medical device, such as a guide wire.
- The image processing device 11 places a mark 72 representing the tip of the elongated medical device 67 on the two-dimensional image 56.
- The mark 72 may have any color and shape; in the illustrated example it is a solid yellow triangle. The mark 72 points, in the direction of ultrasound radiation, from the center of gravity of the cross section 64 of the living tissue 60 into which the catheter serving as the elongated medical device 67 is inserted. The user can therefore easily recognize that the mark 72 represents the puncture needle.
- The image processing device 11 generates and updates the three-dimensional data 52 based on the tomographic data 51. As shown in FIG. 2, in the three-dimensional image 53, the image processing apparatus 11 takes, from the first voxel group 54a corresponding to the position of the sensor, at least the voxels representing the inner surface 61 of the living tissue 60, or the voxels adjacent to those voxels and representing the lumen 63, as the first element, and colors them so that they are distinguished from the other voxel groups 55.
- Similarly, the image processing device 11 takes, from the second voxel group 54b, which corresponds to the position of the sensor, that is, to the first voxel group 54a at the time the marking operation was performed, at least the voxels representing the inner surface 61, or the voxels adjacent to those voxels and representing the lumen 63, as the second element, and colors them so that they are distinguished from the other voxel groups, including the first element.
- The image processing device 11 takes, from the third voxel group 54c corresponding to the position of the sensor when the marking operation is performed again, at least the voxels representing the inner surface 61, or the voxels adjacent to those voxels and representing the lumen 63, as the third element, and colors them so that they are distinguished from the other voxel groups, including the first and second elements.
- The image processing apparatus 11 takes, from the fourth voxel group 54d between the second voxel group 54b and the third voxel group 54c, at least the voxels representing the inner surface 61, or the voxels adjacent to those voxels and representing the lumen 63, as the fourth element, and colors them so that they are distinguished from the other voxel groups, including the first, second, and third elements.
- This makes it easier for a user observing the lumen 63 of the living tissue 60 in the three-dimensional image 53 to understand which part of the three-dimensional image 53 the information currently being obtained by the sensor, that is, the latest information, corresponds to.
- It also becomes possible to indicate which parts of the three-dimensional image 53 correspond to a first marked location, such as the upper end of the fossa ovalis 66, and a second marked location, such as the lower end of the fossa ovalis 66.
- In the voxel groups corresponding to the cross sections adjacent to the position of the sensor, the voxels representing the inner surface 61, or the voxels adjacent to those voxels and representing the lumen 63, may also be colored so as to be distinguished from the voxel groups corresponding to the other cross sections of the living tissue 60. This widens, in the moving direction of the sensor, the band of voxels colored so as to be distinguished from the voxel groups corresponding to the other cross sections, making the band easier for the user to recognize in the three-dimensional image 53.
- As a modification, all the voxels representing the living tissue 60 in the first voxel group 54a may be taken as the first element and colored so as to be distinguished from the other voxel groups 55. Likewise, all the voxels representing the living tissue 60 in the second voxel group 54b may be taken as the second element and colored so as to be distinguished from the other voxel groups, including the first voxel group 54a; all the voxels representing the living tissue 60 in the third voxel group 54c may be taken as the third element and colored so as to be distinguished from the other voxel groups, including the first voxel group 54a and the second voxel group 54b; and all the voxels representing the living tissue 60 in the fourth voxel group 54d may be taken as the fourth element and colored so as to be distinguished from the other voxel groups, including the first voxel group 54a, the second voxel group 54b, and the third voxel group 54c.
- In this modification, the first voxel group 54a is colored so as to be distinguished from the other voxel groups 55 even on the cut surface of the living tissue 60 formed for observing the lumen 63, which makes it easier for the user to understand which part of the three-dimensional image 53 the latest information corresponds to. Because the second, third, and fourth voxel groups are likewise colored so as to be distinguished from the other voxel groups, it also becomes easier to confirm positions within the living tissue 60.
- The image processing apparatus 11 causes the display 16 to display the two-dimensional image 56 representing the cross section 64 together with the three-dimensional image 53 in which at least the voxels representing the inner surface 61 of the living tissue 60, or the voxels adjacent to those voxels and representing the lumen 63, in the first voxel group 54a corresponding to the cross section 64 are colored. The correspondence between the two-dimensional image 56 and the three-dimensional image 53 can therefore be shown.
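- As a rough sketch of this coloring scheme, assume the three-dimensional data is held as a labeled voxel array with the sensor axis along axis 0. The array layout, names, and the crude surface test are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

# Toy label volume: 0 = lumen, 1 = tissue; axis 0 is the sensor's moving direction.
labels = np.zeros((200, 256, 256), dtype=np.uint8)
labels[:, 100:156, 100:156] = 1
labels[:, 110:146, 110:146] = 0  # carve out the lumen

def inner_surface_mask(labels: np.ndarray) -> np.ndarray:
    """Tissue voxels that touch a lumen voxel in-plane (a crude inner-surface test;
    wrap-around at the borders is ignored for brevity)."""
    tissue = labels == 1
    lumen = labels == 0
    neighbor_lumen = np.zeros_like(lumen)
    for axis in (1, 2):            # in-plane 4-neighborhood
        for shift in (1, -1):
            neighbor_lumen |= np.roll(lumen, shift, axis=axis)
    return tissue & neighbor_lumen

def color_sensor_band(colors, labels, z, rgba, half_width=2):
    # Color the inner-surface voxels of the slices around sensor position z.
    # half_width widens the band in the moving direction of the sensor so the
    # colored line remains easy to recognize in the rendered 3-D image.
    surface = inner_surface_mask(labels)
    lo, hi = max(z - half_width, 0), min(z + half_width + 1, labels.shape[0])
    colors[lo:hi][surface[lo:hi]] = rgba

colors = np.zeros(labels.shape + (4,), dtype=np.float32)     # RGBA per voxel
color_sensor_band(colors, labels, z=57, rgba=(1, 1, 1, 1))   # first element: white
color_sensor_band(colors, labels, z=80, rgba=(0, 1, 0, 1))   # second element: green
```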
- The living tissue 60 includes, for example, blood vessels or organs such as the heart. The living tissue 60 is not limited to a single anatomical organ or a part of one; it also includes tissue that spans a plurality of organs and has a lumen. A specific example of such tissue is the part of the vascular system extending from the upper portion of the inferior vena cava through the right atrium to the lower portion of the superior vena cava.
- The screen 80 displays the operation panel 81, the two-dimensional image 56, the three-dimensional image 53, the first graphic element 87a, the second graphic element 87b, the third graphic element 87c, the fourth graphic element 87d, and the fifth graphic element 86.
- The operation panel 81 is a GUI component for setting the cutting region 62. "GUI" is an abbreviation for graphical user interface.
- The operation panel 81 has a check box 82 for selecting whether to activate the setting of the cutting region 62, a slider 83 for setting the base angle, a slider 84 for setting the opening angle, and a check box 85 for selecting whether to use the center of gravity.
- The base angle is the rotation angle of one of the two straight lines L1 and L2 that extend from a single point M in the cross-sectional image representing the cross section 64 of the living tissue 60; setting the base angle therefore corresponds to setting the direction of the straight line L1.
- The opening angle is the angle between the two straight lines L1 and L2; setting the opening angle therefore corresponds to setting the angle formed by the two straight lines.
- Point M is the center of gravity of the cross section 64. If not using the center of gravity is selected, point M may be set at a point on the cross section 64 other than the center of gravity.
- The two-dimensional image 56 is an image obtained by processing a cross-sectional image. In the two-dimensional image 56, the color of the region 65 corresponding to the cutting region 62 is changed to clearly indicate which portion of the cross section 64 is cut away.
- The viewpoint used when displaying the three-dimensional image 53 on the screen 80 is adjusted according to the position of the cutting region 62. The viewpoint is the position of a virtual camera 71 placed in the three-dimensional space, and the two-dimensional image 56 shows the position of the camera 71 relative to the cross section 64.
- The two-dimensional image 56 can be used to determine the cutting region 62.
- Specifically, the position or size of the cutting region 62 can be set by manipulating the base angle or the opening angle. For example, if the base angle is changed so that the straight line L1 rotates counterclockwise by approximately 90 degrees, a region 65a that has moved according to the change in the base angle is obtained in the two-dimensional image 56a, and the position of the cutting region 62 is adjusted according to the position of the region 65a. If the opening angle is changed so that the angle between the two straight lines L1 and L2 increases, a region 65b enlarged according to the change in the opening angle is obtained in the two-dimensional image 56b, and the size of the cutting region 62 is adjusted according to the size of the region 65b. It is also possible to set both the position and the size of the cutting region 62 by adjusting both the base angle and the opening angle, thereby setting both the position and the size of the region 65 in the two-dimensional image 56. The position of the camera 71 may be adjusted as appropriate according to the position or size of the cutting region 62.
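- A minimal sketch of how the base angle and the opening angle can determine the region 65 in a cross-sectional image follows. The angle conventions and function names are assumptions for illustration, since the patent only defines the two slider values.

```python
import numpy as np

def region_between_lines(shape, m, base_angle_deg, opening_angle_deg):
    """Boolean mask of the region 65 between the two straight lines L1 and L2.

    L1 is taken base_angle_deg counterclockwise from the x-axis, and L2 a
    further opening_angle_deg from L1 (assumed conventions)."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.degrees(np.arctan2(yy - m[1], xx - m[0])) % 360.0
    return (theta - base_angle_deg) % 360.0 <= opening_angle_deg

# Region 65 of a 512 x 512 cross-sectional image about point M (e.g. the centroid):
mask = region_between_lines((512, 512), m=(256.0, 256.0),
                            base_angle_deg=30, opening_angle_deg=90)
```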
- In this embodiment, the image corresponding to the current position of the sensor, that is, the latest image, is always displayed as the two-dimensional image 56.
- The base angle may be set by dragging the straight line L1 or by entering a numerical value, instead of by operating the slider 83.
- Similarly, the opening angle may be set by dragging the straight line L2 or by entering a numerical value, instead of by operating the slider 84.
- In the three-dimensional image 53, the cutting region 62 determined using the two-dimensional image 56 is hidden or made transparent.
- To represent, in real time, the position in the longitudinal direction of the lumen 63 where the sensor is currently located, that is, the position currently being updated, the color of the first voxel group 54a corresponding to the current position of the sensor is changed.
- In this embodiment, the voxels representing the inner surface 61 of the living tissue 60 in the first voxel group 54a are set to a color different from that of the other voxel groups 55, so that they are colored so as to be distinguished from the other voxel groups 55. As a modification of the present embodiment, all the voxels representing the living tissue 60 in the first voxel group 54a may be set to a different color, as in the modification shown in FIG. 6. As a further modification, instead of setting the first voxel group 54a and the other voxel groups 55 to different colors, the contrast between them may be adjusted so that the first voxel group 54a is distinguished from the other voxel groups 55.
- The same applies to the second voxel group 54b, the third voxel group 54c, and the fourth voxel group 54d.
- The first graphic element 87a is a graphic element representing the position of the sensor.
- The fifth graphic element 86 is an elongated graphic element representing the movement range of the sensor.
- The combination of the first graphic element 87a and the fifth graphic element 86 is configured as a slider.
- The first graphic element 87a and the fifth graphic element 86 may be displayed at arbitrary positions, but in this embodiment they are displayed between the two-dimensional image 56 and the three-dimensional image 53.
- The second graphic element 87b is a graphic element fixed at the position the first graphic element 87a occupied when the marking operation was performed.
- The third graphic element 87c is a graphic element fixed at the position the first graphic element 87a occupied when the marking operation was performed again.
- The fourth graphic element 87d is a graphic element fixed at a position between the second graphic element 87b and the third graphic element 87c.
- The relative position of the first graphic element 87a changes as the sensor moves. For example, while no pullback operation is being performed, the voxels representing the inner surface 61 of the living tissue 60 in the first voxel group 54a corresponding to the current position of the sensor are colored with the first color, as shown in FIG. 2. The first graphic element 87a is also displayed in the first color, and represents the current position of the sensor by being positioned at the same height as the first voxel group 54a.
- When the marking operation is performed while the upper end of the fossa ovalis 66 is being detected, the second graphic element 87b is newly displayed. In the second voxel group 54b, the voxels representing the inner surface 61 of the living tissue 60 are colored with the second color, such as green. The second graphic element 87b is also displayed in the second color, and represents the position of the upper end of the fossa ovalis 66 by being positioned at the same height as the second voxel group 54b.
- When the marking operation is performed again while the lower end of the fossa ovalis 66 is being detected, the third graphic element 87c is newly displayed. In the third voxel group 54c, the voxels representing the inner surface 61 of the living tissue 60 are colored with the third color, such as red. The third graphic element 87c is also displayed in the third color, and represents the position of the lower end of the fossa ovalis 66 by being positioned at the same height as the third voxel group 54c.
- The fourth graphic element 87d is then newly displayed. In the fourth voxel group 54d, the voxels representing the inner surface 61 of the living tissue 60 are colored with the fourth color, such as yellow. The fourth graphic element 87d is also displayed in the fourth color, and represents the intermediate position between the upper and lower ends of the fossa ovalis 66, that is, the target puncture position, by being positioned at the same height as the fourth voxel group 54d.
- The X direction and the Y direction, which is orthogonal to the X direction, each correspond to the lateral directions of the lumen 63 of the living tissue 60. The Z direction, which is orthogonal to the X and Y directions, corresponds to the longitudinal direction of the lumen 63 of the living tissue 60.
- The image processing device 11 uses the three-dimensional data 52 to calculate the positions of the centers of gravity B1, B2, B3, and B4 of the cross sections C1, C2, C3, and C4 of the living tissue 60, respectively.
- The image processing apparatus 11 sets, as cutting planes P1 and P2, two planes that intersect along a line Lb passing through the positions of the centers of gravity B1, B2, B3, and B4 and that contain the straight lines L1 and L2, respectively.
- The image processing device 11 then forms, in the three-dimensional data 52, the region of the three-dimensional image 53 that is sandwiched between the cutting planes P1 and P2 and exposes the lumen 63 of the living tissue 60 as the cutting region 62.
- For convenience, four cross sections C1, C2, C3, and C4 are shown as the multiple lateral cross sections of the lumen 63 of the living tissue 60, but the number of cross sections for which the center-of-gravity position is calculated is not limited to four; it is preferably the same as the number of cross-sectional images acquired by IVUS.
- If the check box 85 on the operation panel 81 is unchecked, that is, if not using the center of gravity is selected, the image processing device 11 sets, as the cutting planes P1 and P2, two planes that intersect along an arbitrary line passing through the point M, such as a straight line extending in the Z direction through the point M, and that contain the straight lines L1 and L2, respectively.
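- A compact sketch of the center-of-gravity computation behind this is given below; binarization, point-cloud extraction, and smoothing correspond to FIGS. 10 to 14, while the array representation and the smoothing window size are assumptions.

```python
import numpy as np

def cross_section_centroid(lumen_mask: np.ndarray) -> tuple[float, float]:
    # Center of gravity of one binarized cross section (nonzero = lumen pixels,
    # e.g. the area bounded by the extracted inner-surface point cloud).
    ys, xs = np.nonzero(lumen_mask)
    return float(xs.mean()), float(ys.mean())

def centroid_line(lumen_volume: np.ndarray, window: int = 5) -> np.ndarray:
    # Per-slice centers of gravity along the sensor axis, smoothed with a
    # moving average (cf. FIG. 14; the window size is an illustrative choice).
    pts = np.array([cross_section_centroid(s) for s in lumen_volume])
    kernel = np.ones(window) / window
    return np.stack(
        [np.convolve(pts[:, i], kernel, mode="same") for i in (0, 1)], axis=1
    )

# The cutting planes P1 and P2 can then be taken as the two planes that contain
# this (smoothed) line Lb and the directions of L1 and L2, respectively.
```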
- The image processing system 10 includes the image processing device 11, a cable 12, a drive unit 13, a keyboard 14, a mouse 15, and the display 16.
- The image processing apparatus 11 is a dedicated computer specialized for diagnostic imaging in this embodiment, but it may be a general-purpose computer such as a PC. "PC" is an abbreviation for personal computer.
- The cable 12 connects the image processing device 11 and the drive unit 13.
- The drive unit 13 is a device that is used while connected to the probe 20 shown in FIG. 7 and that drives the probe 20. The drive unit 13 is also called an MDU. "MDU" is an abbreviation for motor drive unit.
- The probe 20 is used for IVUS and is also called an IVUS catheter or a diagnostic imaging catheter.
- The keyboard 14, the mouse 15, and the display 16 are connected to the image processing device 11 via a cable or wirelessly.
- The display 16 is, for example, an LCD, an organic EL display, or an HMD.
- LCD is an abbreviation for liquid crystal display.
- EL is an abbreviation for electroluminescence.
- HMD is an abbreviation for head-mounted display.
- The image processing system 10 further includes a connection terminal 17 and a cart unit 18 as optional components.
- The connection terminal 17 is used to connect the image processing device 11 to an external device. The connection terminal 17 is, for example, a USB terminal. USB is an abbreviation for Universal Serial Bus.
- The external device is, for example, a recording-medium drive such as a magnetic disk drive, a magneto-optical disk drive, or an optical disk drive.
- The cart unit 18 is a cart with casters for movement. The image processing device 11, the cable 12, and the drive unit 13 are installed in the cart body of the cart unit 18, and the keyboard 14, the mouse 15, and the display 16 are installed on the top table of the cart unit 18.
- As shown in FIG. 7, the probe 20 includes a drive shaft 21, a hub 22, a sheath 23, an outer tube 24, an ultrasonic transducer 25, and a relay connector 26.
- The drive shaft 21 passes through the sheath 23, which is inserted into a body cavity of a living body, and the outer tube 24, which is connected to the proximal end of the sheath 23, and extends into the hub 22 provided at the proximal end of the probe 20.
- The drive shaft 21 has at its tip the ultrasonic transducer 25, which transmits and receives signals, and is rotatably provided within the sheath 23 and the outer tube 24. The relay connector 26 connects the sheath 23 and the outer tube 24.
- The hub 22, the drive shaft 21, and the ultrasonic transducer 25 are connected to one another so as to move back and forth integrally in the axial direction. Therefore, when the hub 22 is pushed toward the distal side, the drive shaft 21 and the ultrasonic transducer 25 move toward the distal side inside the sheath 23. When the hub 22 is pulled toward the proximal side, the drive shaft 21 and the ultrasonic transducer 25 move toward the proximal side inside the sheath 23, as indicated by the arrows.
- The drive unit 13 includes a scanner unit 31, a slide unit 32, and a bottom cover 33.
- The scanner unit 31 is also called a pullback unit and is connected to the image processing device 11 via the cable 12.
- The scanner unit 31 includes a probe connection section 34 that connects to the probe 20 and a scanner motor 35 that serves as the drive source for rotating the drive shaft 21.
- The probe connection section 34 is detachably connected to the probe 20 through an insertion port 36 of the hub 22 provided at the proximal end of the probe 20. Inside the hub 22, the proximal end of the drive shaft 21 is rotatably supported, and the rotational force of the scanner motor 35 is transmitted to the drive shaft 21.
- Signals are also transmitted and received between the drive shaft 21 and the image processing device 11 via the cable 12. Based on the signals transmitted from the drive shaft 21, the image processing device 11 generates tomographic images of the body lumen and performs image processing.
- The slide unit 32 carries the scanner unit 31 so that it can move back and forth, and is mechanically and electrically connected to the scanner unit 31.
- The slide unit 32 includes a probe clamp section 37, a slide motor 38, and a switch group 39.
- The probe clamp section 37 is arranged coaxially with the probe connection section 34, on the distal side of it, and supports the probe 20 connected to the probe connection section 34.
- The slide motor 38 is the drive source that generates the driving force in the axial direction. The scanner unit 31 advances and retreats when the slide motor 38 is driven, and the drive shaft 21 advances and retreats in the axial direction accordingly. The slide motor 38 is, for example, a servomotor.
- The switch group 39 includes, for example, a forward switch and a pullback switch that are pressed when moving the scanner unit 31 forward and backward, and a scan switch that is pressed to start and end image rendering. The switch group 39 is not limited to this example and may include various switches as needed.
- When the scan switch is pressed, image rendering starts, the scanner motor 35 is driven, and the slide motor 38 is driven to move the scanner unit 31 backward. A user such as an operator connects the probe 20 to the scanner unit 31 in advance, so that when image rendering starts, the drive shaft 21 rotates and moves toward the proximal side in the axial direction. When the scan switch is pressed again, the scanner motor 35 and the slide motor 38 stop, and image rendering ends.
- The bottom cover 33 covers the bottom surface of the slide unit 32 and the entire circumference of its side surfaces on the bottom side, and can move toward and away from the bottom surface of the slide unit 32.
- The image processing device 11 includes a control unit 41, a storage unit 42, a communication unit 43, an input unit 44, and an output unit 45.
- The control unit 41 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof. A processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for particular processing. CPU is an abbreviation for central processing unit. GPU is an abbreviation for graphics processing unit. A programmable circuit is, for example, an FPGA. FPGA is an abbreviation for field-programmable gate array. A dedicated circuit is, for example, an ASIC. ASIC is an abbreviation for application specific integrated circuit.
- The control unit 41 executes processing related to the operation of the image processing device 11 while controlling each part of the image processing system 10, including the image processing device 11.
- The storage unit 42 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof. A semiconductor memory is, for example, a RAM or a ROM.
- RAM is an abbreviation for random access memory.
- ROM is an abbreviation for read only memory.
- RAM is, for example, SRAM or DRAM.
- SRAM is an abbreviation for static random access memory.
- DRAM is an abbreviation for dynamic random access memory.
- ROM is, for example, EEPROM.
- EEPROM is an abbreviation for electrically erasable programmable read only memory.
- The storage unit 42 functions, for example, as a main storage device, an auxiliary storage device, or a cache memory.
- The storage unit 42 stores data used for the operation of the image processing device 11, such as the tomographic data 51, and data obtained by the operation of the image processing device 11, such as the three-dimensional data 52 and the three-dimensional image 53.
- The communication unit 43 includes at least one communication interface. The communication interface is, for example, a wired LAN interface, a wireless LAN interface, or a diagnostic imaging interface that receives IVUS signals and A/D converts them.
- LAN is an abbreviation for local area network.
- A/D is an abbreviation for analog to digital.
- The communication unit 43 receives data used for the operation of the image processing device 11 and transmits data obtained by the operation of the image processing device 11.
- The drive unit 13 is connected to the diagnostic imaging interface included in the communication unit 43.
- The input unit 44 includes at least one input interface. The input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- The output unit 45 includes at least one output interface. The output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- The output unit 45 outputs data obtained by the operation of the image processing device 11.
- In this embodiment, the display 16 is connected to a USB interface or HDMI (registered trademark) interface included in the output unit 45.
- The functions of the image processing device 11 are realized by executing the image processing program according to the present embodiment on a processor serving as the control unit 41; that is, the functions of the image processing device 11 are realized by software.
- The image processing program causes a computer to function as the image processing device 11 by causing the computer to execute the operation of the image processing device 11. That is, the computer functions as the image processing device 11 by executing the operation of the image processing device 11 according to the image processing program.
- The program can be stored on a non-transitory computer-readable medium. A non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disc, a magneto-optical recording medium, or a ROM.
- Program distribution is performed, for example, by selling, assigning, or lending a portable medium such as an SD card, DVD, or CD-ROM storing the program.
- SD is an abbreviation for Secure Digital.
- DVD is an abbreviation for digital versatile disc.
- CD-ROM is an abbreviation for compact disc read only memory.
- The program may also be distributed by storing it in the storage of a server and transferring it from the server to another computer.
- The program may be provided as a program product.
- A computer, for example, temporarily stores in its main storage device a program stored on a portable medium or transferred from a server. The computer then reads the program from the main storage device with its processor and executes processing according to the program it has read.
- The computer may instead read the program directly from the portable medium and execute processing according to it, or it may execute processing according to the received program each time a program is transferred to it from the server.
- Processing may also be executed by a so-called ASP type service that realizes the functions only through execution instructions and acquisition of results, without transferring the program from the server to the computer.
- "ASP" is an abbreviation for application service provider.
- The program here includes information that is provided for processing by a computer and is equivalent to a program. For example, data that is not a direct instruction to a computer but that has the property of prescribing the computer's processing corresponds to something "equivalent to a program."
- Part or all of the functions of the image processing device 11 may be realized by a programmable circuit or a dedicated circuit serving as the control unit 41; that is, part or all of the functions of the image processing device 11 may be realized by hardware.
- The operation of the image processing system 10 according to the present embodiment will be described with reference to FIGS. 8 and 9. The operation of the image processing system 10 corresponds to the image display method according to the present embodiment.
- Before the flow of FIG. 8 starts, the probe 20 is primed by the user. The probe 20 is then fitted into the probe connection section 34 and the probe clamp section 37 of the drive unit 13, so that it is connected and fixed to the drive unit 13. The probe 20 is then inserted to a target site in the living tissue 60, such as a blood vessel or the heart.
- In step S101, the scan switch included in the switch group 39 is pressed, and the pullback switch, also included in the switch group 39, is further pressed, so that a so-called pullback operation is performed. Inside the living tissue 60, the probe 20 transmits ultrasonic waves by means of the ultrasonic transducer 25, which is retracted in the axial direction by the pullback operation. The ultrasonic transducer 25 transmits the ultrasonic waves radially while moving inside the living tissue 60 and receives the reflected waves of the transmitted ultrasonic waves. The probe 20 inputs the signals of the reflected waves received by the ultrasonic transducer 25 to the image processing device 11. The control unit 41 of the image processing apparatus 11 processes the input signals to sequentially generate cross-sectional images of the living tissue 60, thereby acquiring the tomographic data 51, which includes a plurality of cross-sectional images.
- Specifically, while rotating the ultrasonic transducer 25 in the circumferential direction and moving it in the axial direction inside the living tissue 60, the probe 20 transmits ultrasonic waves from the center of rotation toward the outside. The probe 20 receives, with the ultrasonic transducer 25, the waves reflected from reflecting objects present in each of a plurality of directions inside the living tissue 60. The probe 20 transmits the received reflected-wave signals to the image processing device 11 via the drive unit 13 and the cable 12. The communication unit 43 of the image processing device 11 receives the signals transmitted from the probe 20, A/D converts them, and inputs the converted signals to the control unit 41. The control unit 41 processes the input signals to calculate the intensity-value distribution of the reflected waves from the reflecting objects present in the transmission direction of the ultrasonic waves from the ultrasonic transducer 25. The control unit 41 sequentially generates two-dimensional images whose luminance-value distribution corresponds to the calculated intensity-value distribution as cross-sectional images of the living tissue 60, thereby acquiring the tomographic data 51, which is a data set of cross-sectional images. The control unit 41 stores the acquired tomographic data 51 in the storage unit 42.
- The reflected-wave signals received by the ultrasonic transducer 25 correspond to the raw data of the tomographic data 51, and the cross-sectional images generated when the image processing device 11 processes the reflected-wave signals correspond to the processed data of the tomographic data 51.
- As a modification of the present embodiment, the control unit 41 of the image processing device 11 may store the signals input from the probe 20 in the storage unit 42 as the tomographic data 51 as they are. Alternatively, the control unit 41 may store, as the tomographic data 51, data indicating the intensity-value distribution of the reflected waves calculated by processing the signals input from the probe 20. That is, the tomographic data 51 is not limited to a data set of cross-sectional images of the living tissue 60 and may be data representing the cross section of the living tissue 60 at each movement position of the ultrasonic transducer 25 in any format.
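- As an illustration of how per-angle reflected-wave intensities can become a cross-sectional image, a common polar-to-Cartesian scan conversion is sketched below. This is a generic sketch, not the patent's disclosed processing, and all names and sizes are assumptions.

```python
import numpy as np

def scan_convert(intensity: np.ndarray, size: int = 512) -> np.ndarray:
    # Map polar IVUS data (rows = transmission angles, columns = samples along
    # the radius) to a Cartesian cross-sectional image by nearest-neighbor lookup.
    n_angles, n_samples = intensity.shape
    c = (size - 1) / 2.0
    yy, xx = np.mgrid[0:size, 0:size]
    r = np.hypot(xx - c, yy - c) * (n_samples / c)
    theta = (np.arctan2(yy - c, xx - c) % (2 * np.pi)) * (n_angles / (2 * np.pi))
    image = np.zeros((size, size), dtype=intensity.dtype)
    inside = r < n_samples
    image[inside] = intensity[theta.astype(int)[inside] % n_angles,
                              r.astype(int)[inside]]
    return image

# One frame: intensity-value distribution over 360 angles x 256 radial samples,
# turned into a luminance image that serves as one cross-sectional image.
frame = scan_convert(np.random.rand(360, 256).astype(np.float32))
```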
- As a modification of the present embodiment, an ultrasonic transducer that transmits ultrasonic waves in multiple directions without rotating may be used instead of the ultrasonic transducer 25, which transmits ultrasonic waves in multiple directions while rotating in the circumferential direction.
- As a modification of the present embodiment, the tomographic data 51 may be acquired using OFDI or OCT instead of IVUS.
- OFDI is an abbreviation for optical frequency domain imaging.
- OCT is an abbreviation for optical coherence tomography.
- As a modification of the present embodiment, instead of the image processing device 11 generating the data set of cross-sectional images of the living tissue 60, another device may generate a similar data set, and the image processing device 11 may acquire it from that device. That is, instead of the control unit 41 of the image processing device 11 processing the IVUS signals to generate the cross-sectional images of the living tissue 60, another device may process the IVUS signals and generate the cross-sectional images.
- In step S102, the control unit 41 of the image processing apparatus 11 generates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S101. That is, the control unit 41 generates the three-dimensional data 52 based on the tomographic data acquired by the sensor.
- If generated three-dimensional data 52 already exists, it is preferable to update only the data at the location corresponding to the updated tomographic data 51, rather than regenerating all of the three-dimensional data 52 from scratch. In that case, the amount of data processing when generating the three-dimensional data 52 is reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S103 is improved.
- Specifically, the control unit 41 of the image processing device 11 generates the three-dimensional data 52 of the living tissue 60 by stacking the cross-sectional images of the living tissue 60 included in the tomographic data 51 stored in the storage unit 42 and converting them into three dimensions. As the conversion method, any rendering method, such as surface rendering or volume rendering, may be used, together with associated processing such as texture mapping, including environment mapping, and bump mapping. The control unit 41 stores the generated three-dimensional data 52 in the storage unit 42.
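- The incremental update described above might look like the following sketch; the array layout and the class are assumptions.

```python
import numpy as np

class VolumeBuilder:
    """Stacks cross-sectional images along the sensor axis and rewrites only
    the slices whose tomographic data changed (illustrative helper)."""

    def __init__(self, n_slices: int, height: int, width: int):
        self.volume = np.zeros((n_slices, height, width), dtype=np.float32)

    def update_slice(self, index: int, cross_section: np.ndarray) -> None:
        # Updating one slice in place is much cheaper than regenerating the
        # whole volume, which helps keep the three-dimensional image real-time.
        self.volume[index] = cross_section

builder = VolumeBuilder(n_slices=200, height=512, width=512)
builder.update_slice(57, np.zeros((512, 512), dtype=np.float32))  # newest cross section
```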
- The tomographic data 51 includes data of the elongated medical device 67 in the same way as the data of the living tissue 60. Therefore, in step S102, the three-dimensional data 52 generated by the control unit 41 also includes the data of the elongated medical device 67 in the same way as the data of the living tissue 60.
- Specifically, the control unit 41 of the image processing apparatus 11 classifies the pixel group of each cross-sectional image included in the tomographic data 51 acquired in step S101 into two or more classes. These two or more classes include at least a "tissue" class to which the living tissue 60 belongs and a "medical instrument" class to which the elongated medical device 67 belongs, and may also include a "blood cell" class, an "indwelling object" class for objects such as an indwelling stent, and a "lesion" class for lesions such as calcification or plaque.
- Any classification method may be used, but in this embodiment the pixel group of each cross-sectional image is classified using a trained model. The trained model is trained in advance by machine learning so that it can detect the region corresponding to each class from a sample IVUS cross-sectional image.
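- The patent does not specify a model architecture. As one plausible shape of this step, a per-pixel segmentation model could be applied as sketched below; the interface, the ordering of the class list, and the dummy stand-in model are all assumptions.

```python
import numpy as np

CLASSES = ("tissue", "medical instrument", "blood cell", "indwelling object", "lesion")

def classify_pixels(cross_section: np.ndarray, model) -> np.ndarray:
    # `model` stands in for the trained model: any callable returning per-class
    # scores of shape (n_classes, H, W) for one cross-sectional image.
    scores = model(cross_section)
    return np.argmax(scores, axis=0)   # (H, W) map of class indices into CLASSES

# Dummy stand-in model for demonstration: random scores per class.
rng = np.random.default_rng(0)
dummy_model = lambda img: rng.random((len(CLASSES),) + img.shape)
label_map = classify_pixels(np.zeros((512, 512), dtype=np.float32), dummy_model)
```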
- In step S103, the control unit 41 of the image processing device 11 causes the display 16 to display the three-dimensional data 52 generated in step S102 as the three-dimensional image 53. The control unit 41 may set the angle at which the three-dimensional image 53 is displayed to any angle. The control unit 41 causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S101 together with the three-dimensional image 53.
- The control unit 41 of the image processing device 11 generates the three-dimensional image 53 from the three-dimensional data 52 stored in the storage unit 42.
- The three-dimensional image 53 includes a group of three-dimensional objects, such as a three-dimensional object representing the living tissue 60 and a three-dimensional object representing the elongated medical device 67. That is, the control unit 41 generates the three-dimensional object of the living tissue 60 from the data of the living tissue 60 stored in the storage unit 42, and generates the three-dimensional object of the elongated medical device 67 from the data of the elongated medical device 67 stored in the storage unit 42.
- The control unit 41 displays, via the output unit 45, the latest cross-sectional image among the cross-sectional images of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42, together with the generated three-dimensional image 53, on the display 16.
- In the three-dimensional image 53, the control unit 41 of the image processing apparatus 11 colors the first voxel group 54a, which corresponds to the cross section 64 indicated by the tomographic data 51 newly acquired by the sensor, so as to distinguish it from the voxel groups 55 corresponding to the other cross sections of the biological tissue 60. Specifically, as shown in FIG. 2, the control unit 41 sets the color of the voxels representing the inner surface 61 of the biological tissue 60 in the first voxel group 54a to a color different from any color of the other voxel groups 55, thereby coloring those voxels so as to distinguish them from the other voxel groups 55. For example, the control unit 41 sets the color of the voxels representing the inner surface 61 of the biological tissue 60 in the first voxel group 54a to white.
- As a modification of this embodiment, the control unit 41 of the image processing device 11 may color all voxels representing the living tissue 60 in the first voxel group 54a so as to distinguish them from the other voxel groups 55, as shown in FIG. Specifically, the control unit 41 may set the colors of all the voxels representing the biological tissue 60 in the first voxel group 54a to colors different from the colors of the other voxel groups 55, thereby coloring all voxels representing the living tissue 60 in the first voxel group 54a so as to distinguish them from the other voxel groups 55.
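- One way to realize this coloring, sketched below under the assumption that voxel colors are held in an RGB array alongside the volume (an assumed data layout, not taken from the text), is to overwrite the color of the masked voxels of the current cross section.

```python
import numpy as np

def color_current_section(colors: np.ndarray, inner_mask: np.ndarray,
                          section_index: int) -> None:
    """Color the first voxel group so it stands out from the other voxels.

    colors     : (N, H, W, 3) RGB color per voxel
    inner_mask : (N, H, W) True where a voxel represents the inner surface 61
    """
    white = np.array([255, 255, 255], dtype=colors.dtype)
    colors[section_index][inner_mask[section_index]] = white
```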
- The control unit 41 of the image processing device 11 combines the first graphic element 87a and the fifth graphic element 86 and causes the display 16 to display them together with the three-dimensional image 53. Specifically, as shown in FIG. 2, the control unit 41 causes the slider configured by combining the first graphic element 87a and the fifth graphic element 86 to be displayed on the left side of the three-dimensional image 53 via the output unit 45. For example, the control unit 41 sets the color of the first graphic element 87a to white.
- The control unit 41 of the image processing device 11 displays the fifth graphic element 86 on the display 16 so that the longitudinal direction of the lumen 63 in the three-dimensional image 53 and the longitudinal direction of the fifth graphic element 86 are parallel. Specifically, as shown in FIG. 2, the control unit 41 matches the movement range of the sensor indicated by the fifth graphic element 86 with the display range of the three-dimensional image 53 in the vertical direction of the screen 80, and matches the position of the sensor indicated by the first graphic element 87a with the position of the first voxel group 54a.
- In step S104, if the user performs a change operation to set the angle for displaying the three-dimensional image 53, the process of step S105 is executed. If there is no change operation by the user, the process of step S106 is executed.
- In step S105, the control unit 41 of the image processing device 11 receives, via the input unit 44, the operation to set the angle for displaying the three-dimensional image 53.
- the control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed to the set angle.
- Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional image 53 at the angle set in step S105.
- For example, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation in which the user rotates the three-dimensional image 53 displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
- the control unit 41 interactively adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 according to the user's operation.
- Alternatively, the control unit 41 receives, via the input unit 44, an operation in which the user inputs a numerical value of the angle for displaying the three-dimensional image 53 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. The control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 according to the input numerical value.
- In step S106, if the tomographic data 51 is updated, the processes of steps S107 and S108 are executed. If the tomographic data 51 has not been updated, it is confirmed again in step S104 whether or not the user has performed a change operation.
- In step S107, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the biological tissue 60, similarly to the processing of step S101, thereby acquiring tomographic data 51 including at least one new cross-sectional image.
- In step S108, the control unit 41 of the image processing apparatus 11 updates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S107. That is, the control unit 41 updates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor. Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional data 52 updated in step S108 as the three-dimensional image 53.
- At this time, the control unit 41 causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S107 together with the three-dimensional image 53.
- In step S111, if the user performs a setting operation to set the cutting area 62, the process of step S112 is executed.
- In step S112, the control unit 41 of the image processing device 11 receives the operation for setting the cutting area 62 via the input unit 44.
- Specifically, the control unit 41 of the image processing apparatus 11 receives, via the input unit 44, an operation of setting a region 65 corresponding to the cutting area 62 on the cross-sectional image displayed on the display 16 in step S103. That is, the control unit 41 receives an operation of setting two straight lines L1 and L2 extending from one point M in the cross-sectional image as the operation of setting the region 65 corresponding to the cutting area 62.
- For example, the control unit 41 of the image processing apparatus 11 receives, via the input unit 44, an operation in which the user designates the base angle and the opening angle on the operation panel 81 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, as the operation of setting the two straight lines L1 and L2, the control unit 41 receives an operation of designating the direction of one of the two straight lines L1 and L2 and the angle formed by the two straight lines L1 and L2. Here, it is assumed that the check box 85 on the operation panel 81 is checked, that is, that the use of the center of gravity is selected.
- Alternatively, the control unit 41 of the image processing apparatus 11 may receive, via the input unit 44, an operation in which the user draws the two straight lines L1 and L2 on the cross-sectional image displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, the control unit 41 may receive an operation of drawing the two straight lines L1 and L2 on the cross-sectional image as the operation of setting the two straight lines L1 and L2.
- In step S113, the control unit 41 of the image processing apparatus 11 uses the latest three-dimensional data 52 stored in the storage unit 42 to calculate the positions of the centers of gravity of multiple lateral cross sections of the lumen 63 of the biological tissue 60.
- Here, the latest three-dimensional data 52 means the three-dimensional data 52 generated in step S102 if the process of step S108 has not been executed, or the three-dimensional data 52 updated in step S108 if the process of step S108 has been executed.
- If generated three-dimensional data 52 already exists, it is preferable to update only the data at the location corresponding to the updated tomographic data 51, instead of regenerating all the three-dimensional data 52 from scratch. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S117 can be improved.
- Specifically, if the control unit 41 of the image processing apparatus 11 has generated, in step S107, new cross-sectional images corresponding to any of the plurality of cross-sectional images generated in step S101, it replaces those cross-sectional images with the new cross-sectional images and then binarizes each cross-sectional image.
- the control unit 41 extracts a point group of the inner surface of the biological tissue 60 from the binarized cross-sectional image.
- For example, the control unit 41 extracts a point cloud of the inner surface of the main blood vessel by extracting, one by one along the vertical direction of a cross-sectional image having the r axis as the horizontal axis and the θ axis as the vertical axis, the points corresponding to the inner surface of the main blood vessel.
- Point Cn is the center of the cross-sectional image.
- Point Bp is the center of gravity of the point cloud on the inner surface.
- Point Bv is the centroid of the vertices of the polygon.
- Point Bx is the centroid of the polygon as a convex hull.
- As a method for calculating the center-of-gravity position of a blood vessel, a method different from calculating the centroid of the polygon as a convex hull may be used.
- For example, a method of calculating, as the center-of-gravity position, the center position of the largest circle that fits within the main blood vessel may be used.
- Alternatively, a method of calculating the center-of-gravity position from a binarized cross-sectional image having the r axis as the horizontal axis and the θ axis as the vertical axis may be used. Techniques similar to these can also be used when the biological tissue 60 is not a blood vessel.
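- A minimal sketch of one such centroid computation, combining binarization, point-cloud extraction, and the centroid of the convex hull (point Bx), is given below; the threshold value and the use of scipy are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import ConvexHull

def lumen_centroid(section: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Estimate the center of gravity of the lumen in one cross-sectional image
    as the centroid of the convex hull of the binarized point cloud."""
    binary = section >= threshold               # binarize the cross-sectional image
    ys, xs = np.nonzero(binary)                 # extract a point cloud
    pts = np.column_stack([xs, ys]).astype(float)
    hull = ConvexHull(pts)
    v = pts[hull.vertices]                      # hull vertices in order
    x, y = v[:, 0], v[:, 1]
    cross = x * np.roll(y, -1) - np.roll(x, -1) * y
    area = cross.sum() / 2.0
    cx = ((x + np.roll(x, -1)) * cross).sum() / (6.0 * area)
    cy = ((y + np.roll(y, -1)) * cross).sum() / (6.0 * area)
    return np.array([cx, cy])                   # centroid via the shoelace formula
```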
- In step S114, the control unit 41 of the image processing device 11 smoothes the calculation result of the center-of-gravity positions obtained in step S113.
- the control unit 41 of the image processing apparatus 11 smoothes the calculation result of the center-of-gravity position by using a moving average, as indicated by the dashed line in FIG. 14 .
- a method other than the moving average may be used as a smoothing method.
- exponential smoothing, kernel method, local regression, Ramer-Douglas-Peucker algorithm, Savitzky-Golay method, smoothing spline, or SGM may be used.
- a technique of performing a fast Fourier transform and then removing high frequency components may be used.
- a Kalman filter or a low pass filter such as a Butterworth filter, a Chebyshev filter, a digital filter, an elliptic filter, or a KZ filter may be used.
- SGM is an abbreviation for stretched grid method.
- KZ is an abbreviation for Kolmogorov-Zurbenko.
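- The moving-average variant of step S114 can be sketched as below, applied independently to each coordinate of the centroid curve; the window length is an assumed parameter, and a real implementation might pad the ends instead of letting them attenuate.

```python
import numpy as np

def smooth_centroids(centroids: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth an (N, 2) curve of per-cross-section centroid positions
    with a simple moving average."""
    kernel = np.ones(window) / window
    out = np.empty_like(centroids, dtype=float)
    for axis in range(centroids.shape[1]):
        out[:, axis] = np.convolve(centroids[:, axis], kernel, mode="same")
    return out
```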
- The control unit 41 may divide the calculation result of the center-of-gravity positions into sections according to the positions, in the longitudinal direction of the lumen 63 of the biological tissue 60, of the multiple cross sections, and perform smoothing for each divided section. That is, when the curve of the center-of-gravity positions indicated by the dashed line in FIG. 14 overlaps the tissue region, the control unit 41 may divide the curve into a plurality of sections and perform individual smoothing for each section.
- Alternatively, the control unit 41 may perform smoothing on the calculation result of the center-of-gravity positions according to the positions, in the longitudinal direction of the lumen 63 of the biological tissue 60, of the multiple lateral cross sections of the lumen 63.
- In step S115, the control unit 41 of the image processing apparatus 11 sets, as cutting planes P1 and P2, two planes that intersect at one line Lb passing through the center-of-gravity positions calculated in step S113.
- In this embodiment, the control unit 41 sets the cutting planes P1 and P2 after smoothing the calculation result of the center-of-gravity positions in step S114, but the process of step S114 may be omitted.
- Specifically, the control unit 41 of the image processing device 11 sets the curve of the center-of-gravity positions obtained as a result of the smoothing in step S114 as the line Lb.
- Then, the control unit 41 sets, as the cutting planes P1 and P2, two planes that intersect at the set line Lb and respectively include the two straight lines L1 and L2 set in step S112.
- The control unit 41 identifies the three-dimensional coordinates at which the cutting planes P1 and P2 intersect the living tissue 60 in the latest three-dimensional data 52 stored in the storage unit 42 as the three-dimensional coordinates of the edges of the opening that exposes the lumen 63 of the living tissue 60 in the three-dimensional image 53.
- the control unit 41 causes the storage unit 42 to store the identified three-dimensional coordinates.
- In step S116, the control unit 41 of the image processing apparatus 11 forms, in the three-dimensional data 52, the region that is sandwiched between the cutting planes P1 and P2 in the three-dimensional image 53 and that exposes the lumen 63 of the biological tissue 60, as the cutting area 62.
- Specifically, the control unit 41 of the image processing device 11 sets the portion of the latest three-dimensional data 52 stored in the storage unit 42 that is specified by the three-dimensional coordinates stored in the storage unit 42 to be hidden or transparent when the three-dimensional image 53 is displayed on the display 16. That is, the control unit 41 forms the cutting area 62 in accordance with the region 65 set in step S112.
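- Geometrically, the cutting region can be described per cross section as the wedge between the two straight lines L1 and L2 around the centroid on line Lb. The sketch below computes a hide mask for one cross section under that reading; the angle conventions and parameter names are assumptions.

```python
import numpy as np

def cutting_mask(shape: tuple[int, int], center: np.ndarray,
                 base_angle: float, opening_angle: float) -> np.ndarray:
    """Return True for pixels of one cross section that fall inside the wedge
    between L1 and L2, i.e. the part to hide or make transparent."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.arctan2(yy - center[1], xx - center[0])
    rel = np.mod(theta - base_angle, 2 * np.pi)  # angle relative to L1
    return rel <= opening_angle
```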
- In step S117, the control unit 41 of the image processing apparatus 11 causes the display 16 to display the three-dimensional data 52, in which the cutting area 62 was formed in step S116, as the three-dimensional image 53.
- At this time, the control unit 41 causes the display 16 to display, together with the three-dimensional image 53, the cross section 64 indicated by the tomographic data 51 newly acquired by the sensor and represented by the cross-sectional image displayed on the display 16 in step S103, and the region 65 corresponding to the cutting area 62 in the cross section 64.
- Specifically, the control unit 41 of the image processing device 11 processes the latest cross-sectional image among the cross-sectional images of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42 to generate a two-dimensional image 56 as shown in FIG.
- the control unit 41 generates a three-dimensional image 53 as shown in FIG. 2 in which the portion specified by the three-dimensional coordinates stored in the storage unit 42 is hidden or transparent.
- the control unit 41 displays the generated two-dimensional image 56 and three-dimensional image 53 on the display 16 via the output unit 45 .
- For example, the control unit 41 of the image processing apparatus 11 generates, as the two-dimensional image 56, an image in which the region 65 corresponding to the cutting area 62 is expressed in a color different from that of the rest of the image. For example, areas that would be white in a typical IVUS image may be changed to red within the region 65.
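- A sketch of that recoloring is given below, assuming an 8-bit grayscale cross-sectional image and an assumed brightness threshold for "white" pixels.

```python
import numpy as np

def mark_cut_region(gray: np.ndarray, region_mask: np.ndarray) -> np.ndarray:
    """Render a grayscale cross-sectional image as RGB, recoloring the bright
    pixels inside region 65 in red (cf. the two-dimensional image 56)."""
    rgb = np.stack([gray] * 3, axis=-1)
    bright = gray >= 200          # assumed threshold for "white" pixels
    rgb[region_mask & bright] = [255, 0, 0]
    return rgb
```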
- In step S118, if the user performs a change operation to set the cutting area 62, the process of step S119 is executed. If there is no change operation by the user, the process of step S120 is executed.
- In step S119, the control unit 41 of the image processing apparatus 11 receives the operation for setting the cutting area 62 via the input unit 44, as in the process of step S112. Then, the processes from step S115 onward are executed.
- In step S120, if the tomographic data 51 is updated, the processes of steps S121 and S122 are executed. If the tomographic data 51 has not been updated, it is confirmed again in step S118 whether or not the user has performed a change operation.
- In step S121, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the biological tissue 60, similarly to the processing in step S101 or step S107.
- In step S122, the control unit 41 of the image processing apparatus 11 updates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S121. After that, the processes from step S113 onward are executed. In step S122, it is preferable to update only the data at the location corresponding to the updated tomographic data 51. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the data processing after step S113 can be improved.
- In step S201, when a marking operation, which is a user operation requesting marking of the sensor position, is performed, the control unit 41 of the image processing device 11 receives the marking operation via the input unit 44.
- The marking operation may be, for example, an operation of simultaneously pressing the Ctrl key and the "B" key of the keyboard 14, an operation of clicking the first graphic element 87a with the mouse 15 while pressing the Ctrl key of the keyboard 14, or an operation of tapping the first graphic element 87a on the touch screen provided integrally with the display 16 while pressing the Ctrl key of the keyboard 14.
- In step S202, the control unit 41 of the image processing device 11 causes the display 16 to display, together with the first element, the second element fixed at the position of the first element at the time the marking operation was performed.
- Specifically, the control unit 41 combines a graphic element group, which is a group of elements including the first element and the second element, with the elongated graphic element representing the movement range of the sensor, and causes the display 16 to display the combination via the output unit 45. More specifically, on the slider, which is a combination of the position-variable first graphic element 87a as the first element and the elongated fifth graphic element 86, the control unit 41 places the second graphic element 87b, whose position is fixed, as the second element.
- The control unit 41 matches the position of the upper end of the fossa ovalis 66 indicated by the second graphic element 87b with the position of the second voxel group 54b.
- The control unit 41 of the image processing device 11 sets the color of the second element to a color different from that of the first element. For example, the control unit 41 sets the color of the second graphic element 87b to green. In the three-dimensional image 53, the control unit 41 also sets the color of the voxels representing the inner surface 61 of the biological tissue 60 in the second voxel group 54b to green.
- In step S203, when the marking operation is performed again, the control unit 41 of the image processing device 11 receives the marking operation via the input unit 44 again.
- In step S204, the control unit 41 of the image processing apparatus 11 causes the display 16 to display, together with the first element and the second element, the third element fixed at the position of the first element at the time the marking operation was performed again. Specifically, the control unit 41 adds the third element to the graphic element group displayed on the display 16 in combination with the elongated graphic element representing the movement range of the sensor. More specifically, on the slider, which is a combination of the position-variable first graphic element 87a as the first element and the elongated fifth graphic element 86, the control unit 41 places the third graphic element 87c, whose position is fixed, as the third element.
- The control unit 41 matches the position of the lower end of the fossa ovalis 66 indicated by the third graphic element 87c with the position of the third voxel group 54c.
- The control unit 41 of the image processing device 11 sets the color of the third element to a color different from that of the second element. For example, the control unit 41 sets the color of the third graphic element 87c to red. In the three-dimensional image 53, the control unit 41 also sets the color of the voxels representing the inner surface 61 of the biological tissue 60 in the third voxel group 54c to red.
- In step S205, the control unit 41 of the image processing device 11 calculates the intermediate position between the second element and the third element. Specifically, the control unit 41 calculates the intermediate position between the second graphic element 87b and the third graphic element 87c.
- In step S206, the control unit 41 of the image processing device 11 causes the display 16 to display, together with the first, second, and third elements, the fourth element fixed at the position calculated in step S205. Specifically, the control unit 41 adds the fourth element to the graphic element group displayed on the display 16 in combination with the elongated graphic element representing the movement range of the sensor. More specifically, on the slider, which is a combination of the position-variable first graphic element 87a as the first element and the elongated fifth graphic element 86, the control unit 41 places the fourth graphic element 87d, whose position is fixed, as the fourth element.
- the control unit 41 matches the intermediate position between the upper and lower ends of the fossa ovalis 66 indicated by the fourth graphic element 87d with the position of the fourth voxel group 54d.
- The control unit 41 of the image processing device 11 sets the color of the fourth element to a color different from those of the second and third elements. For example, the control unit 41 sets the color of the fourth graphic element 87d to yellow. In the three-dimensional image 53, the control unit 41 also sets the color of the voxels representing the inner surface 61 of the biological tissue 60 in the fourth voxel group 54d to yellow.
- As a modification of this embodiment, any position between the second element and the third element, other than the intermediate position, may be calculated instead.
- For example, a position a predetermined distance away from the second graphic element 87b, or a position a predetermined distance away from the third graphic element 87c, may be calculated.
- The three graphic elements additionally displayed in steps S201 to S206 may be erased by performing an erase operation, which is an operation different from the marking operation.
- The erase operation may be, for example, an operation of simultaneously pressing the Ctrl key and the "D" key of the keyboard 14, an operation of clicking each graphic element with the mouse 15 while pressing the Ctrl key of the keyboard 14, or an operation of tapping each graphic element on the touch screen provided integrally with the display 16 while pressing the Ctrl key of the keyboard 14. If the Ctrl key and the "D" key are pressed simultaneously, the three graphic elements may be deleted at once.
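- The marking and erase behavior can be summarized as a small bookmark container, sketched below; the class and its attributes are hypothetical, and key bindings and widget drawing are omitted.

```python
class SensorBookmarks:
    """Each marking operation pins a fixed element at the sensor's current
    position; the erase operation removes all pinned elements at once."""

    def __init__(self) -> None:
        self.marks: list[float] = []        # fixed positions (e.g. 87b, 87c)
        self.midpoint: float | None = None  # fourth element (e.g. 87d)

    def on_marking_operation(self, sensor_position: float) -> None:
        self.marks.append(sensor_position)
        if len(self.marks) == 2:
            # Fourth element: intermediate position between the two marks
            self.midpoint = (self.marks[0] + self.marks[1]) / 2.0

    def on_erase_operation(self) -> None:
        self.marks.clear()
        self.midpoint = None
```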
- As a modification of this embodiment, the plane to be aimed at may be predefined and marked.
- In this embodiment, up to two marking operations are accepted, but as a modification of this embodiment, only one marking operation may be accepted. That is, the marking operation may not be accepted while the second element is displayed on the display 16.
- Alternatively, marking operations may be accepted three or more times. That is, even when the second element and the third element are displayed on the display 16, the third and subsequent marking operations may be accepted, and additional elements fixed at the position of the first element at the time each marking operation was performed may be displayed on the display 16.
- steps S205 and S206 may be omitted. That is, the fourth element does not have to be displayed.
- the control unit 41 of the image processing device 11 may calculate the distance from a certain reference position to the position of the upper end of the fossa ovalis 66 .
- the calculated distance may be displayed within the screen 80, for example in the vicinity of the second graphic element 87b.
- the control unit 41 may further calculate the distance from the reference position to the position of the lower end of the fossa ovalis 66 .
- the calculated distance may be displayed within the screen 80, for example, in the vicinity of the third graphic element 87c.
- the control unit 41 may calculate the distance from the reference position to the position calculated in step S205.
- the calculated distance may be displayed within the screen 80, for example, in the vicinity of the fourth graphic element 87d.
- As a modification of this embodiment, the control unit 41 of the image processing device 11 may set the color of the area between the cross section corresponding to the position of the second element and the cross section corresponding to the position of the third element in the three-dimensional image 53 to a color different from that of the adjacent regions. That is, the control unit 41 may set the color of the voxel group existing between the second voxel group 54b and the third voxel group 54c in the three-dimensional image 53 to a color different from the default color.
- The voxel group set to a different color may be limited to the voxel group representing the inner surface 61 that exists between the second voxel group 54b and the third voxel group 54c, that is, the voxel group adjacent to the lumen 63.
- As a modification of this embodiment, the pullback unit may automatically move to each bookmark position when the user clicks a button or presses a shortcut key.
- For example, move buttons 88b, 88c, and 88d, which are not shown in the example of FIG., may be displayed on the screen 80.
- the move button 88b is a button for requesting movement of the sensor to a position corresponding to the position of the second graphic element 87b, that is, to the upper end of the fossa ovalis 66.
- the move button 88c is a button for requesting movement of the sensor to a position corresponding to the position of the third graphic element 87c, that is, to the lower end of the fossa ovalis 66.
- the move button 88d is a button for requesting movement of the sensor to a position corresponding to the position of the fourth graphic element 87d, that is, to an intermediate position between the upper and lower ends of the fossa ovalis 66.
- the control unit 41 of the image processing apparatus 11 includes a movement control function 46 that receives an operation requesting movement of the sensor via the input unit 44 and moves the sensor, as shown in FIG.
- the movement control function 46 is a function of controlling movement of the pullback unit via the communication section 43 of the image processing device 11 .
- the control unit 41 uses the movement control function 46 to move the pullback unit, thereby moving the sensor to the position requested by the user.
- The control unit 41 moves the sensor to the upper end of the fossa ovalis 66 when the user clicks the move button 88b. That is, upon receiving an operation requesting movement of the sensor to the position corresponding to the position of the second element, the control unit 41 moves the sensor to the position corresponding to the position of the second element.
- The control unit 41 moves the sensor to the lower end of the fossa ovalis 66 when the user clicks the move button 88c. That is, upon receiving an operation requesting movement of the sensor to the position corresponding to the position of the third element, the control unit 41 moves the sensor to the position corresponding to the position of the third element.
- The control unit 41 moves the sensor to the intermediate position between the upper and lower ends of the fossa ovalis 66 when the user clicks the move button 88d. That is, upon receiving an operation requesting movement of the sensor to the position corresponding to the position of the fourth element, the control unit 41 moves the sensor to the position corresponding to the position of the fourth element.
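- A sketch of the movement control function 46 is shown below; `pullback_unit.translate` is an assumed interface standing in for the actual command sent to the drive unit, not an API taken from the text.

```python
def move_sensor_to(pullback_unit, current_pos: float, target_pos: float) -> None:
    """Drive the pullback unit so the sensor reaches the position bound to a
    move button (88b, 88c, or 88d)."""
    delta = target_pos - current_pos
    pullback_unit.translate(delta)  # positive: advance, negative: pull back
```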
- An overview of the present embodiment will be described with reference to FIGS. 1, 4, 5, and 18 to 20.
- the image processing apparatus 11 is a computer that causes the display 16 to display three-dimensional data 52 representing the biological tissue 60 as a three-dimensional image 53 . As shown in FIG. 4, the image processing device 11 forms a cut region 62 in the three-dimensional data 52 that exposes the lumen 63 of the biological tissue 60 in the three-dimensional image 53 . The image processing device 11 adjusts the viewpoint when displaying the three-dimensional image 53 on the display 16 according to the position of the cutting area 62 . A viewpoint is the position of a virtual camera 71 arranged in a three-dimensional space.
- When the image processing device 11 receives a user operation requesting rotation of the viewpoint, it changes the position of the cutting area 62 from the first position, which is its position at the time the user operation is performed, to a second position rotated around a rotation axis. The rotation axis passes through a reference point located within the lumen 63 on a reference plane that extends horizontally in the three-dimensional image 53 and includes the viewpoint, and extends in the direction perpendicular to the reference plane.
- the horizontal direction means the XY direction shown in FIG.
- the direction perpendicular to the reference plane is the Z direction shown in FIG.
- the reference point can be any point located within the lumen 63 on the reference plane, such as the center point of the IVUS catheter, but in this embodiment is the centroid of the lumen 63 on the reference plane.
- For example, when the reference plane corresponds to the cross section C1, the corresponding center of gravity B1 becomes the reference point.
- the image processing device 11 rotates the viewpoint around the rotation axis according to the second position, as shown in FIGS. 18 and 19 .
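- The viewpoint rotation can be expressed as rotating the camera position about a vertical axis through the reference point. The following sketch shows that computation; the 90-degree default mirrors the button 89 behavior described later, and the coordinate convention (Z vertical) follows the text.

```python
import numpy as np

def rotate_viewpoint(camera_pos: np.ndarray, reference_point: np.ndarray,
                     angle_deg: float = 90.0) -> np.ndarray:
    """Rotate the virtual camera 71 around the rotation axis: a Z-direction
    line through the reference point within the lumen."""
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
    return reference_point + rot @ (camera_pos - reference_point)
```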
- According to the present embodiment, usability for confirming a position within the living tissue 60 is improved.
- When a procedure such as ablation is performed using IVUS, a user such as a doctor performing the procedure while operating a catheter, or a clinical engineer operating the IVUS system while looking at the display 16, can rotate the viewpoint around the rotation axis by performing a specific user operation.
- When septal puncture is performed, the transition to a state in which the fossa ovalis 66 can be viewed from the side, as shown in FIG., can be performed instantaneously by one user operation. Therefore, sufficient usefulness is obtained for confirming the puncture position.
- In this embodiment, the image processing device 11 causes the display 16 to display, together with the three-dimensional image 53, a two-dimensional image 56 representing the cross section 64 of the living tissue 60 and the region 65 corresponding to the cutting region 62 in the cross section 64.
- The two-dimensional image 56 also shows the position of the camera 71 with respect to the cross section 64.
- The user can understand from the two-dimensional image 56 what kind of structure the portion of the biological tissue 60 that is cut off and not displayed in the three-dimensional image 53 has. For example, if the user is an operator, it becomes easier to perform procedures inside the living tissue 60.
- the image processing device 11 causes the display 16 to display a three-dimensional image 53 including a three-dimensional object representing the elongated medical instrument 67 inserted into the lumen 63 .
- the elongated medical device 67 is a catheter with a puncture needle attached to its tip in this embodiment, but may be another type of medical device such as a guide wire as long as it is elongated.
- the biological tissue 60 includes, for example, blood vessels or organs such as the heart.
- the biological tissue 60 is not limited to an anatomical single organ or a part thereof, but also includes a tissue that straddles a plurality of organs and has a lumen.
- a specific example of such tissue is a portion of the vascular system extending from the upper portion of the inferior vena cava through the right atrium to the lower portion of the superior vena cava.
- As shown in FIGS. 18 and 19, an operation panel 81, a two-dimensional image 56, a three-dimensional image 53, and a button 89 are displayed on the screen 80.
- the operation panel 81 is a GUI component for setting the cutting area 62. "GUI" is an abbreviation for graphical user interface.
- The operation panel 81 is provided with a check box 82 for selecting whether to activate the setting of the cutting area 62, a slider 83 for setting the base angle, a slider 84 for setting the opening angle, and a check box 85 for selecting whether or not to use the center of gravity.
- the base angle is the rotation angle of one of the two straight lines L1 and L2 extending from one point M in the cross-sectional image representing the cross-section 64 of the living tissue 60 . Therefore, setting the base angle corresponds to setting the direction of the straight line L1.
- the opening angle is the angle between the two straight lines L1 and L2. Therefore, setting the opening angle corresponds to setting the angle formed by the two straight lines L1 and L2.
- Point M is the center of gravity of cross section 64 . Point M may be set at a point other than the center of gravity on cross-section 64 if it is selected not to use the center of gravity.
- a two-dimensional image 56 is an image obtained by processing a cross-sectional image.
- the color of the area 65 corresponding to the cut area 62 is changed to clearly indicate which part of the cross section 64 is cut.
- the viewpoint when displaying the three-dimensional image 53 on the screen 80 is adjusted according to the position of the cutting area 62 .
- the two-dimensional image 56 can be used to determine the cutting area 62 .
- By setting the base angle and the opening angle, the position or size of the cutting area 62 can be set. For example, if the base angle is changed such that the straight line L1 is rotated counterclockwise by approximately 90 degrees, a region 65a that has moved according to the change in the base angle is obtained in the two-dimensional image 56a. Then, the position of the cutting area 62 is adjusted according to the position of the region 65a.
- If the opening angle is changed such that the angle between the two straight lines L1 and L2 increases, a region 65b enlarged according to the change in the opening angle is obtained in the two-dimensional image 56b. Then, the size of the cutting area 62 is adjusted according to the size of the region 65b. It is also possible to set both the position and the size of the cutting area 62 by adjusting both the base angle and the opening angle to set both the position and the size of the region 65 in the two-dimensional image 56. The position of the camera 71 may be appropriately adjusted according to the position or size of the cutting area 62.
- In this embodiment, the image corresponding to the current position of the sensor, that is, the latest image, is always displayed as the two-dimensional image 56.
- As a modification of this embodiment, the base angle may be set by dragging the straight line L1, or by entering a numerical value, instead of by operating the slider 83. Similarly, the opening angle may be set by dragging the straight line L2, or by entering a numerical value.
- the cutting area 62 determined using the two-dimensional image 56 is hidden or transparent.
- a button 89 is a graphic element that is pressed to request rotation of the viewpoint.
- the button 89 may be displayed at any position, but is displayed on the right side of the three-dimensional image 53 in this embodiment.
- the button 89 indicates the rotation angle and rotation direction of the viewpoint.
- When the button 89 is pressed, the viewpoint rotates 90 degrees in the circumferential direction.
- The three-dimensional image 53 is then switched to an image as shown in FIG., and the notation of the button 89 is also changed.
- the circumferential direction refers to the direction of rotation about the IVUS catheter axis, but may also refer to the direction of rotation about the center of gravity of each section of the living tissue 60 .
- the positional relationship between the fossa ovalis 66 and the elongated medical device 67 can be clearly grasped from a viewpoint where the fossa ovalis 66 can be confirmed from the front.
- When the viewpoint is rotated 90 degrees, the fossa ovalis 66 can be seen from the side, as shown in FIG., so that the positions where the right atrial structure, the left atrial structure, and the elongated medical device 67 are in contact can be grasped.
- The three-dimensional object can be instantly rotated any number of times by pressing the button 89. Therefore, both the positional relationship between the fossa ovalis 66 and the elongated medical device 67, and the positions where the right atrial structure, the left atrial structure, and the elongated medical device 67 are in contact, can be easily confirmed.
- the rotation angle of the viewpoint is not limited to 90 degrees, and may be any predetermined angle.
- the viewpoint rotation angle may be variable. That is, the rotation angle of the viewpoint may be arbitrarily adjusted by the user.
- As a modification of this embodiment, the button 89 may be divided into a first button for rotating the viewpoint 90 degrees clockwise about the IVUS catheter axis and a second button for rotating the viewpoint 90 degrees counterclockwise about the IVUS catheter axis, with both buttons displayed simultaneously.
- The user operation requesting rotation of the viewpoint may be an operation of pressing one or more predetermined keys, such as simultaneously pressing the Ctrl key and the "R" key, instead of pressing the button 89.
- the X direction and the Y direction perpendicular to the X direction respectively correspond to the lateral direction of the lumen 63 of the living tissue 60 .
- a Z direction orthogonal to the X and Y directions corresponds to the longitudinal direction of the lumen 63 of the living tissue 60 .
- the image processing device 11 uses the three-dimensional data 52 to calculate the positions of the centers of gravity B1, B2, B3 and B4 of the cross sections C1, C2, C3 and C4 of the biological tissue 60, respectively.
- The image processing apparatus 11 sets, as the cutting planes P1 and P2, two planes that intersect at a line Lb passing through the positions of the centers of gravity B1, B2, B3, and B4 and that respectively include the two straight lines L1 and L2.
- the image processing device 11 forms an area sandwiched between the cut planes P1 and P2 in the three-dimensional image 53 and exposing the lumen 63 of the biological tissue 60 as the cut area 62 in the three-dimensional data 52 .
- Cross sections C1, C2, C3, and C4 are shown for convenience as the multiple lateral cross sections of the lumen 63 of the biological tissue 60, but the number of cross sections for which the center-of-gravity position is calculated is not limited to four, and is preferably the same as the number of cross-sectional images acquired by IVUS.
- When the check box 85 on the operation panel 81 is unchecked, that is, when not using the center of gravity is selected, the image processing device 11 sets, as the cutting planes P1 and P2, two planes that intersect at an arbitrary line passing through the point M, such as a straight line extending in the Z direction through the point M, and that respectively include the two straight lines L1 and L2.
- the image processing system 10 includes an image processing device 11, a cable 12, a drive unit 13, a keyboard 14, a mouse 15, and a display 16.
- the image processing apparatus 11 is a dedicated computer specialized for image diagnosis in this embodiment, but may be a general-purpose computer such as a PC. "PC” is an abbreviation for personal computer.
- the cable 12 is used to connect the image processing device 11 and the drive unit 13.
- the drive unit 13 is a device that is used by being connected to the probe 20 shown in FIG. 7 and drives the probe 20 .
- the drive unit 13 is also called MDU. "MDU” is an abbreviation for motor drive unit.
- The probe 20 is used for IVUS. The probe 20 is also referred to as an IVUS catheter or a diagnostic imaging catheter.
- the keyboard 14, mouse 15, and display 16 are connected to the image processing device 11 via any cable or wirelessly.
- the display 16 is, for example, an LCD, organic EL display, or HMD.
- LCD is an abbreviation for liquid crystal display.
- EL is an abbreviation for electro luminescence.
- HMD is an abbreviation for head-mounted display.
- the image processing system 10 further comprises a connection terminal 17 and a cart unit 18 as options.
- connection terminal 17 is used to connect the image processing device 11 and an external device.
- the connection terminal 17 is, for example, a USB terminal.
- USB is an abbreviation for Universal Serial Bus.
- the external device is, for example, a recording medium such as a magnetic disk drive, a magneto-optical disk drive, or an optical disk drive.
- the cart unit 18 is a cart with casters for movement.
- An image processing device 11 , a cable 12 and a drive unit 13 are installed in the cart body of the cart unit 18 .
- a keyboard 14 , a mouse 15 and a display 16 are installed on the top table of the cart unit 18 .
- the probe 20 includes a drive shaft 21, a hub 22, a sheath 23, an outer tube 24, an ultrasonic transducer 25, and a relay connector 26.
- the drive shaft 21 passes through a sheath 23 inserted into the body cavity of a living body, an outer tube 24 connected to the proximal end of the sheath 23, and extends to the inside of a hub 22 provided at the proximal end of the probe 20.
- the driving shaft 21 has an ultrasonic transducer 25 for transmitting and receiving signals at its tip and is rotatably provided within the sheath 23 and the outer tube 24 .
- a relay connector 26 connects the sheath 23 and the outer tube 24 .
- the hub 22, the drive shaft 21, and the ultrasonic transducer 25 are connected to each other so as to integrally move back and forth in the axial direction. Therefore, for example, when the hub 22 is pushed toward the distal side, the drive shaft 21 and the ultrasonic transducer 25 move inside the sheath 23 toward the distal side. For example, when the hub 22 is pulled proximally, the drive shaft 21 and the ultrasonic transducer 25 move proximally inside the sheath 23 as indicated by the arrows.
- the drive unit 13 includes a scanner unit 31, a slide unit 32, and a bottom cover 33.
- the scanner unit 31 is also called a pullback unit.
- the scanner unit 31 is connected to the image processing device 11 via the cable 12 .
- the scanner unit 31 includes a probe connection section 34 that connects to the probe 20 and a scanner motor 35 that is a drive source that rotates the drive shaft 21 .
- the probe connecting portion 34 is detachably connected to the probe 20 through an insertion port 36 of the hub 22 provided at the proximal end of the probe 20 .
- the proximal end of the drive shaft 21 is rotatably supported, and the rotational force of the scanner motor 35 is transmitted to the drive shaft 21 .
- Signals are also transmitted and received between the drive shaft 21 and the image processing device 11 via the cable 12 .
- the image processing device 11 generates a tomographic image of the body lumen and performs image processing based on the signal transmitted from the drive shaft 21 .
- the slide unit 32 mounts the scanner unit 31 so as to move back and forth, and is mechanically and electrically connected to the scanner unit 31 .
- the slide unit 32 includes a probe clamp section 37 , a slide motor 38 and a switch group 39 .
- the probe clamping part 37 is arranged coaxially with the probe connecting part 34 on the tip side of the probe connecting part 34 and supports the probe 20 connected to the probe connecting part 34 .
- the slide motor 38 is a driving source that generates axial driving force.
- the scanner unit 31 advances and retreats by driving the slide motor 38, and the drive shaft 21 advances and retreats in the axial direction accordingly.
- the slide motor 38 is, for example, a servomotor.
- the switch group 39 includes, for example, a forward switch and a pullback switch that are pressed when moving the scanner unit 31 back and forth, and a scan switch that is pressed when image rendering is started and ended.
- Various switches are included in the switch group 39 as needed, without being limited to the example here.
- When the scan switch is pressed, image rendering is started, the scanner motor 35 is driven, and the slide motor 38 is driven to move the scanner unit 31 backward.
- a user such as an operator connects the probe 20 to the scanner unit 31 in advance, and causes the drive shaft 21 to rotate and move to the proximal end side in the axial direction when image rendering is started.
- When the scan switch is pressed again, the scanner motor 35 and the slide motor 38 are stopped, and image rendering is completed.
- the bottom cover 33 covers the bottom surface of the slide unit 32 and the entire circumference of the side surface on the bottom surface side, and can move toward and away from the bottom surface of the slide unit 32 .
- The image processing device 11 includes a control unit 41, a storage unit 42, a communication unit 43, an input unit 44, and an output unit 45.
- the control unit 41 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof.
- a processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for a particular process.
- CPU is an abbreviation for central processing unit.
- GPU is an abbreviation for graphics processing unit.
- a programmable circuit is, for example, an FPGA.
- FPGA is an abbreviation for field-programmable gate array.
- a dedicated circuit is, for example, an ASIC.
- ASIC is an abbreviation for application specific integrated circuit.
- the control unit 41 executes processing related to the operation of the image processing device 11 while controlling each unit of the image processing system 10 including the image processing device 11 .
- the storage unit 42 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
- a semiconductor memory is, for example, a RAM or a ROM.
- RAM is an abbreviation for random access memory.
- ROM is an abbreviation for read only memory.
- RAM is, for example, SRAM or DRAM.
- SRAM is an abbreviation for static random access memory.
- DRAM is an abbreviation for dynamic random access memory.
- ROM is, for example, EEPROM.
- EEPROM is an abbreviation for electrically erasable programmable read only memory.
- the storage unit 42 functions, for example, as a main memory device, an auxiliary memory device, or a cache memory.
- the storage unit 42 stores data used for the operation of the image processing apparatus 11, such as the tomographic data 51, and data obtained by the operation of the image processing apparatus 11, such as the three-dimensional data 52 and the three-dimensional image 53. .
- the communication unit 43 includes at least one communication interface.
- the communication interface is, for example, a wired LAN interface, a wireless LAN interface, or an image diagnosis interface that receives and A/D converts IVUS signals.
- LAN is an abbreviation for local area network.
- A/D is an abbreviation for analog to digital.
- the communication unit 43 receives data used for the operation of the image processing device 11 and transmits data obtained by the operation of the image processing device 11 .
- the drive unit 13 is connected to an image diagnosis interface included in the communication section 43 .
- the input unit 44 includes at least one input interface.
- the input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- "HDMI (registered trademark)" is an abbreviation for High-Definition Multimedia Interface.
- the output unit 45 includes at least one output interface.
- the output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- the output unit 45 outputs data obtained by the operation of the image processing device 11 .
- the display 16 is connected to a USB interface or HDMI (registered trademark) interface included in the output unit 45 .
- the functions of the image processing device 11 are realized by executing the image processing program according to the present embodiment with a processor as the control unit 41 . That is, the functions of the image processing device 11 are realized by software.
- the image processing program causes the computer to function as the image processing device 11 by causing the computer to execute the operation of the image processing device 11 . That is, the computer functions as the image processing device 11 by executing the operation of the image processing device 11 according to the image processing program.
- the program can be stored on a non-transitory computer-readable medium.
- a non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM.
- Program distribution is performed, for example, by selling, assigning, or lending a portable medium such as an SD card, DVD, or CD-ROM storing the program.
- SD is an abbreviation for Secure Digital.
- DVD is an abbreviation for digital versatile disc.
- CD-ROM is an abbreviation for compact disc read only memory.
- the program may be distributed by storing the program in the storage of the server and transferring the program from the server to another computer.
- a program may be provided as a program product.
- For example, a computer first temporarily stores, in a main storage device, the program stored in a portable medium or the program transferred from a server. Then, the computer reads the program stored in the main storage device with its processor, and executes processing according to the read program with the processor.
- the computer may read the program directly from the portable medium and execute processing according to the program.
- the computer may execute processing according to the received program every time the program is transferred from the server to the computer.
- Alternatively, the processing may be executed by a so-called ASP-type service that realizes the function only through an execution instruction and the acquisition of the result, without transferring the program from the server to the computer.
- "ASP" is an abbreviation for application service provider.
- The "program" includes information that is used for processing by a computer and is equivalent to a program. For example, data that is not a direct instruction to a computer but has the property of prescribing the processing of the computer corresponds to "something equivalent to a program."
- a part or all of the functions of the image processing device 11 may be realized by a programmable circuit or a dedicated circuit as the control unit 41. That is, part or all of the functions of the image processing device 11 may be realized by hardware.
- The operation of the image processing system 10 according to the present embodiment will be described with reference to FIGS. 8 and 9. The operation of the image processing system 10 corresponds to the image display method according to the present embodiment.
- the probe 20 is primed by the user before the flow of FIG. 8 starts. After that, the probe 20 is fitted into the probe connection portion 34 and the probe clamp portion 37 of the drive unit 13 and connected and fixed to the drive unit 13 . Then, the probe 20 is inserted to a target site in a living tissue 60 such as a blood vessel or heart.
- In step S101, the scan switch included in the switch group 39 is pressed, and the pullback switch included in the switch group 39 is further pressed, so that a so-called pullback operation is performed.
- The probe 20 transmits ultrasonic waves inside the biological tissue 60 by means of the ultrasonic transducer 25, which is retracted in the axial direction by the pullback operation.
- the ultrasonic transducer 25 radially transmits ultrasonic waves while moving inside the living tissue 60 .
- the ultrasonic transducer 25 receives reflected waves of the transmitted ultrasonic waves.
- the probe 20 inputs the signal of the reflected wave received by the ultrasonic transducer 25 to the image processing device 11 .
- the control unit 41 of the image processing apparatus 11 processes the input signal to sequentially generate cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51 including a plurality of cross-sectional images.
- Specifically, the probe 20 rotates the ultrasonic transducer 25 in the circumferential direction while moving it in the axial direction inside the living tissue 60, and the ultrasonic transducer 25 transmits ultrasonic waves outward from the center of rotation.
- the probe 20 receives reflected waves from reflecting objects present in each of a plurality of directions inside the living tissue 60 by the ultrasonic transducer 25 .
- the probe 20 transmits the received reflected wave signal to the image processing device 11 via the drive unit 13 and the cable 12 .
- the communication unit 43 of the image processing device 11 receives the signal transmitted from the probe 20 .
- the communication unit 43 A/D converts the received signal.
- the communication unit 43 inputs the A/D converted signal to the control unit 41 .
- the control unit 41 processes the input signal and calculates the intensity value distribution of the reflected waves from the reflectors present in the transmission direction of the ultrasonic waves from the ultrasonic transducer 25 .
- the control unit 41 sequentially generates two-dimensional images having a luminance value distribution corresponding to the calculated intensity value distribution as cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51, which is a data set of cross-sectional images.
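- Converting the intensity distribution along each transmission direction into the displayed cross-sectional image is a standard scan conversion from an r-theta map to a Cartesian grid; a minimal nearest-neighbour sketch follows, where the output size and indexing scheme are illustrative assumptions (a real system would interpolate).

```python
import numpy as np

def scan_convert(polar: np.ndarray, out_size: int = 512) -> np.ndarray:
    """Map an r-theta luminance image (rows: angle, cols: depth) onto a
    Cartesian grid centered on the ultrasonic transducer."""
    n_theta, n_r = polar.shape
    c = out_size / 2.0
    yy, xx = np.mgrid[0:out_size, 0:out_size]
    dx, dy = xx - c, yy - c
    r = np.sqrt(dx ** 2 + dy ** 2) * (n_r / c)
    t = np.mod(np.arctan2(dy, dx), 2 * np.pi) * (n_theta / (2 * np.pi))
    ri = np.clip(r.astype(int), 0, n_r - 1)
    ti = np.clip(t.astype(int), 0, n_theta - 1)
    img = polar[ti, ri]
    img[r >= n_r] = 0  # outside the scanned radius
    return img
```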
- the control unit 41 causes the storage unit 42 to store the obtained tomographic data 51 .
- The signal of the reflected waves received by the ultrasonic transducer 25 corresponds to the raw data of the tomographic data 51, and the cross-sectional images generated by the image processing device 11 processing that signal correspond to the processed data of the tomographic data 51.
- As a modification of this embodiment, the control unit 41 of the image processing device 11 may store the signal input from the probe 20 in the storage unit 42 as it is, as the tomographic data 51.
- the control unit 41 may store, as the tomographic data 51 , data indicating the intensity value distribution of the reflected wave calculated by processing the signal input from the probe 20 in the storage unit 42 .
- The tomographic data 51 is not limited to a data set of cross-sectional images of the living tissue 60, and may be data representing a cross section of the living tissue 60 at each movement position of the ultrasonic transducer 25 in some format.
- An ultrasonic transducer that transmits ultrasonic waves in multiple directions without rotating may be used instead of the ultrasonic transducer 25, which transmits ultrasonic waves in multiple directions while rotating in the circumferential direction.
- The tomographic data 51 may be acquired using OFDI or OCT instead of being acquired using IVUS.
- OFDI is an abbreviation for optical frequency domain imaging.
- OCT is an abbreviation for optical coherence tomography.
- Instead of the image processing device 11 generating the data set of cross-sectional images of the biological tissue 60, another device may generate a similar data set, and the image processing device 11 may acquire that data set from the other device. That is, instead of the control unit 41 of the image processing device 11 processing the IVUS signal to generate cross-sectional images of the biological tissue 60, another device may process the IVUS signal to generate the cross-sectional images and input them to the image processing device 11.
- In step S102, the control unit 41 of the image processing device 11 generates three-dimensional data 52 of the biological tissue 60 based on the tomographic data 51 acquired in step S101. That is, the control unit 41 generates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor.
- If generated three-dimensional data 52 already exists, it is preferable to update only the data at the location corresponding to the updated tomographic data 51 instead of regenerating all of the three-dimensional data 52 from scratch. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S103 can be improved.
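- A minimal sketch of this incremental update is shown below: only the axial slices whose tomographic data changed are overwritten in the existing voxel volume, rather than restacking every cross-sectional image. The slice-index keying and array layout are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def update_voxel_volume(volume: np.ndarray, new_slices: dict[int, np.ndarray]) -> None:
    """Overwrite in place only the slices of an existing voxel volume whose
    tomographic data 51 changed (illustrative sketch).

    volume     -- (depth, height, width) array built by stacking cross sections
    new_slices -- maps an axial slice index to its updated cross-sectional image
    """
    for z, image in new_slices.items():
        if 0 <= z < volume.shape[0]:
            volume[z] = image  # update just the stale slice, not the whole volume
```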
- Specifically, the control unit 41 of the image processing device 11 stacks the cross-sectional images of the living tissue 60 included in the tomographic data 51 stored in the storage unit 42, thereby converting the living tissue 60 into three dimensions and generating the three-dimensional data 52.
- As the rendering method, any method such as surface rendering or volume rendering may be used, together with associated processing such as texture mapping, including environment mapping, and bump mapping.
- The control unit 41 causes the storage unit 42 to store the generated three-dimensional data 52.
- The tomographic data 51 may include data of the elongated medical instrument 67 as well as data of the biological tissue 60. In that case, the three-dimensional data 52 generated by the control unit 41 in step S102 also includes the data of the elongated medical instrument 67 in the same way as the data of the living tissue 60.
- The control unit 41 of the image processing device 11 classifies the pixel groups of the cross-sectional images included in the tomographic data 51 acquired in step S101 into two or more classes.
- These two or more classes include at least a "tissue" class to which the biological tissue 60 belongs and a "medical instrument" class to which the elongated medical instrument 67 belongs, and may further include a "blood cell" class, an "indwelling object" class for objects such as an indwelling stent, and a "lesion" class for findings such as calcification or plaque.
- Any method may be used as the classification method, but in this embodiment, a method of classifying pixel groups of cross-sectional images using a trained model is used.
- The trained model is prepared by performing machine learning in advance so that it can detect the region corresponding to each class from sample IVUS cross-sectional images.
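- The sketch below shows how such a per-pixel classification might be invoked. The `model.predict` interface returning a per-class score map is an assumed interface for an already-trained segmentation model; the class list and input shape are likewise illustrative, not specified by the patent.

```python
import numpy as np

CLASS_NAMES = ("tissue", "medical_instrument", "blood_cell", "indwelling", "lesion")

def classify_pixels(cross_section: np.ndarray, model) -> np.ndarray:
    """Assign each pixel of an IVUS cross-sectional image to a class index
    using a trained segmentation model (illustrative sketch)."""
    # Assumed interface: (1, H, W, 1) input -> (1, H, W, n_classes) score map.
    scores = model.predict(cross_section[np.newaxis, ..., np.newaxis])[0]
    return np.argmax(scores, axis=-1)  # (H, W) array of per-pixel class indices
```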
- In step S103, the control unit 41 of the image processing device 11 causes the display 16 to display the three-dimensional data 52 generated in step S102 as a three-dimensional image 53.
- The control unit 41 may set the angle at which the three-dimensional image 53 is displayed to any angle.
- The control unit 41 causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S101 together with the three-dimensional image 53.
- Specifically, the control unit 41 of the image processing device 11 generates the three-dimensional image 53 from the three-dimensional data 52 stored in the storage unit 42.
- The three-dimensional image 53 includes a group of three-dimensional objects, such as a three-dimensional object representing the living tissue 60 and a three-dimensional object representing the elongated medical instrument 67. That is, the control unit 41 generates the three-dimensional object of the living tissue 60 from the data of the living tissue 60 stored in the storage unit 42, and generates the three-dimensional object of the elongated medical instrument 67 from the data of the elongated medical instrument 67 stored in the storage unit 42.
- The control unit 41 causes the display 16 to display, via the output unit 45, the latest cross-sectional image among the cross-sectional images of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42, together with the generated three-dimensional image 53.
- The control unit 41 of the image processing device 11 identifies, in the three-dimensional data 52, the point of the biological tissue 60 with which the distal end of the elongated medical instrument 67 is in contact as the contact point Pi. Then, the control unit 41 sets the color of the voxel corresponding to the contact point Pi in the three-dimensional image 53 to a predetermined color.
- The "predetermined color" is red in this embodiment, but any color may be used as long as the voxel corresponding to the contact point Pi can be distinguished from the other voxel groups.
- The control unit 41 of the image processing device 11 identifies, in the three-dimensional data 52, a certain range of the living tissue 60 centered on the contact point Pi as the contact spot 68. Then, the control unit 41 sets the color of the voxel group corresponding to the contact spot 68 in the three-dimensional image 53 to a predetermined color.
- Ideally, the center of the contact spot 68 is seen straight ahead from the viewpoint; suppose, however, that the center of the contact spot 68 is shifted to the left or right as viewed from the viewpoint.
- In that case, the cross section of the contact spot 68, i.e., the cross section at the tip of the septal puncture needle, cannot be seen simply by rotating the viewpoint by 90 degrees in the circumferential direction as described later. Therefore, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation to adjust the viewpoint so that the center of the contact spot 68 moves to the center in the horizontal direction as viewed from the viewpoint.
- After the center of the contact spot 68 has been moved to the horizontal center as viewed from the viewpoint, the position of the viewpoint can be adjusted by rotating it 90 degrees in the circumferential direction so that the center of the contact spot 68 is positioned on the cutting plane.
- As a result, the center of the contact spot 68 can be viewed from the side, and the structure of the left atrium toward which the tip of the septal puncture needle is directed can be confirmed from the side.
- The contact point Pi may be identified by any procedure; in this embodiment, it is identified by the following procedure.
- Specifically, the control unit 41 of the image processing device 11 analyzes the tomographic data 51 stored in the storage unit 42 and detects the position of the biological tissue 60 with which the distal end of the elongated medical instrument 67 is in contact. Any method may be used to analyze the tomographic data 51; in this embodiment, the living tissue 60 and the tip of the elongated medical instrument 67 are detected in the cross-sectional images included in the tomographic data 51, and whether the living tissue 60 and the tip of the elongated medical instrument 67 are in contact is determined by measuring the distance between them. The control unit 41 identifies the point in the three-dimensional data 52 corresponding to the detected position as the contact point Pi.
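- A minimal sketch of this distance-based contact determination is shown below, assuming the tissue region and the instrument tip have already been detected in a cross-sectional image; the mask input and the pixel threshold are illustrative assumptions.

```python
import numpy as np

def is_tip_in_contact(tissue_mask: np.ndarray,
                      tip_xy: tuple[float, float],
                      threshold_px: float = 2.0) -> bool:
    """Decide contact by measuring the distance from the detected instrument
    tip to the nearest tissue pixel (illustrative sketch)."""
    ys, xs = np.nonzero(tissue_mask)  # coordinates of pixels classified as tissue
    if ys.size == 0:
        return False
    distances = np.hypot(xs - tip_xy[0], ys - tip_xy[1])
    return bool(distances.min() <= threshold_px)  # contact if the tip touches tissue
```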
- Alternatively, the control unit 41 of the image processing device 11 may analyze the three-dimensional data 52 to identify the contact point Pi. Any method may be used to analyze the three-dimensional data 52; for example, the tip of the elongated medical instrument 67 included in the three-dimensional data 52 is detected, and whether the living tissue 60 and the tip of the elongated medical instrument 67 are in contact is determined by measuring the distance between them.
- Alternatively, the control unit 41 of the image processing device 11 may receive input of position data indicating the position of the biological tissue 60 with which the distal end of the elongated medical instrument 67 is in contact. Specifically, the control unit 41 may receive, via the communication unit 43 or the input unit 44, position data from an external system that determines, using a sensor such as an electrode provided at the distal end of the elongated medical instrument 67, whether the distal end of the elongated medical instrument 67 is in contact with the inner wall of the living tissue 60. The control unit 41 may then refer to the input position data and correct the analysis result of the three-dimensional data 52.
- Alternatively, without analyzing the three-dimensional data 52, the control unit 41 may identify the point in the three-dimensional data 52 corresponding to the position indicated by the position data input from the external system as the contact point Pi.
- In step S104, if the user performs a change operation, specifically an operation to set the angle at which the three-dimensional image 53 is displayed, the process of step S105 is executed. If there is no change operation by the user, the process of step S106 is executed.
- In step S105, the control unit 41 of the image processing device 11 receives, via the input unit 44, the operation to set the angle at which the three-dimensional image 53 is displayed.
- The control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed to the set angle.
- In step S103, the control unit 41 then causes the display 16 to display the three-dimensional image 53 at the angle set in step S105.
- Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation in which the user rotates the three-dimensional image 53 displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
- The control unit 41 interactively adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 according to the user's operation.
- Alternatively, the control unit 41 may receive, via the input unit 44, an input in which the user enters a numerical value for the angle at which the three-dimensional image 53 is displayed, using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. The control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 according to the input numerical value.
- In step S106, if the tomographic data 51 has been updated, the processes of steps S107 and S108 are executed. If the tomographic data 51 has not been updated, it is confirmed again in step S104 whether the user has performed a change operation.
- In step S107, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the biological tissue 60, similarly to the processing of step S101, thereby acquiring tomographic data 51 including at least one new cross-sectional image.
- In step S108, the control unit 41 of the image processing device 11 updates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S107. That is, the control unit 41 updates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor. Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional data 52 updated in step S108 as the three-dimensional image 53.
- The control unit 41 causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S107 together with the three-dimensional image 53.
- In step S111, if the user performs a setting operation, specifically an operation to set the cutting region 62, the process of step S112 is executed.
- In step S112, the control unit 41 of the image processing device 11 receives the operation to set the cutting region 62 via the input unit 44.
- Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation to set a region 65 corresponding to the cutting region 62 on the cross-sectional image displayed on the display 16 in step S103.
- In this embodiment, the control unit 41 receives, as the operation to set the region 65 corresponding to the cutting region 62, an operation to set two straight lines L1 and L2 extending from one point M in the cross-sectional image.
- For example, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation in which the user specifies a base angle and an opening angle on the operation panel 81 shown in the figures, using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, as the operation to set the two straight lines L1 and L2, the control unit 41 receives an operation to designate the direction of one of the two straight lines L1 and L2 and the angle formed between them. Here, it is assumed that the check box 85 on the operation panel 81 is checked, that is, that use of the center of gravity is selected.
- Alternatively, the control unit 41 of the image processing device 11 may receive, via the input unit 44, an operation in which the user draws the two straight lines L1 and L2 on the cross-sectional image displayed on the display 16, using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, the control unit 41 may receive, as the operation to set the two straight lines L1 and L2, an operation of drawing the two straight lines L1 and L2 on the cross-sectional image.
- In step S113, the control unit 41 of the image processing device 11 uses the latest three-dimensional data 52 stored in the storage unit 42 to calculate the center-of-gravity positions of multiple transverse cross sections of the lumen 63 of the biological tissue 60.
- Here, the latest three-dimensional data 52 means the three-dimensional data 52 generated in step S102 if the process of step S108 has not been executed, or the three-dimensional data 52 updated in step S108 if that process has been executed.
- If generated three-dimensional data 52 already exists, it is preferable to update only the data at the location corresponding to the updated tomographic data 51 instead of regenerating all of the three-dimensional data 52 from scratch. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S117 can be improved.
- Specifically, if a new cross-sectional image corresponding to any of the plurality of cross-sectional images generated in step S101 has been generated in step S107, the control unit 41 of the image processing device 11 replaces the old cross-sectional image with the new cross-sectional image and then binarizes it.
- The control unit 41 extracts a point cloud of the inner surface of the biological tissue 60 from the binarized cross-sectional image.
- For example, the control unit 41 extracts the point cloud of the inner surface by extracting, one by one along the vertical direction of a cross-sectional image whose horizontal axis is the r axis and whose vertical axis is the θ axis, the points corresponding to the inner surface of the main blood vessel.
- Point Cn is the center of the cross-sectional image.
- Point Bp is the center of gravity of the point cloud on the inner surface.
- Point Bv is the centroid of the vertices of the polygon.
- Point Bx is the centroid of the polygon as a convex hull.
- As the method for calculating the barycentric position of the blood vessel, a method different from calculating the centroid of the polygon as a convex hull may be used.
- For example, a method of calculating, as the center-of-gravity position, the center of the largest circle that fits within the main blood vessel may be used.
- Techniques similar to these can also be used when the biological tissue 60 is not a blood vessel.
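- As a worked example of the polygon-based candidates, the sketch below computes the centroid of a polygon from its ordered vertices with the standard shoelace-based formula; for the convex-hull variant, the hull vertices could be computed first (for instance with scipy.spatial.ConvexHull) and fed to the same function. The input layout is an illustrative assumption.

```python
import numpy as np

def polygon_centroid(points: np.ndarray) -> np.ndarray:
    """Centroid of a simple polygon given its (x, y) vertices in order,
    via the shoelace-based formula (illustrative sketch)."""
    x, y = points[:, 0], points[:, 1]
    xn, yn = np.roll(x, -1), np.roll(y, -1)
    cross = x * yn - xn * y      # signed parallelogram areas per edge
    area = cross.sum() / 2.0     # signed polygon area (must be nonzero)
    cx = ((x + xn) * cross).sum() / (6.0 * area)
    cy = ((y + yn) * cross).sum() / (6.0 * area)
    return np.array([cx, cy])
```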
- In step S114, the control unit 41 of the image processing device 11 smooths the calculation result of the center-of-gravity positions obtained in step S113.
- Specifically, the control unit 41 of the image processing device 11 smooths the calculation result of the center-of-gravity positions using a moving average, as indicated by the dashed line in FIG. 14.
- A method other than the moving average may be used as the smoothing method.
- For example, exponential smoothing, a kernel method, local regression, the Ramer-Douglas-Peucker algorithm, the Savitzky-Golay method, smoothing splines, or SGM may be used.
- Alternatively, a technique of performing a fast Fourier transform and then removing high-frequency components may be used.
- Alternatively, a Kalman filter or a low-pass filter such as a Butterworth filter, a Chebyshev filter, a digital filter, an elliptic filter, or a KZ filter may be used.
- SGM is an abbreviation for stretched grid method.
- KZ is an abbreviation for Kolmogorov-Zurbenko.
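- A minimal sketch of the moving-average option applied to the per-cross-section centroid sequence is shown below; the window length is an illustrative assumption, and edge padding keeps the smoothed curve the same length as the input.

```python
import numpy as np

def smooth_centroids(centroids: np.ndarray, window: int = 5) -> np.ndarray:
    """Smooth an (n_sections, 2) sequence of center-of-gravity positions with
    a simple moving average (illustrative sketch; window must be odd)."""
    assert window % 2 == 1, "use an odd window so the output keeps its length"
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(centroids, ((pad, pad), (0, 0)), mode="edge")
    out = np.empty_like(centroids, dtype=float)
    for axis in range(centroids.shape[1]):
        out[:, axis] = np.convolve(padded[:, axis], kernel, mode="valid")
    return out
```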
- The control unit 41 may divide the calculation result of the center-of-gravity positions according to the positions of the multiple cross sections in the longitudinal direction of the lumen 63 of the biological tissue 60 and perform smoothing separately for each divided portion. That is, when the curve of the center-of-gravity positions indicated by the dashed line in FIG. 14 overlaps a tissue region, the control unit 41 may divide the curve into a plurality of sections and perform individual smoothing for each section.
- Alternatively, the control unit 41 may perform smoothing on the calculation result of the center-of-gravity positions while varying it according to the positions of the multiple cross sections in the longitudinal direction of the lumen 63 of the biological tissue 60.
- In step S115, the control unit 41 of the image processing device 11 sets, as the cutting planes P1 and P2, two planes that intersect at one line Lb passing through the center-of-gravity positions calculated in step S113.
- In this embodiment, the control unit 41 sets the cutting planes P1 and P2 after smoothing the calculation result of the center-of-gravity positions in step S114, but the process of step S114 may be omitted.
- Specifically, the control unit 41 of the image processing device 11 sets the curve of the center-of-gravity positions obtained as a result of the smoothing in step S114 as the line Lb.
- Then, the control unit 41 sets, as the cutting planes P1 and P2, two planes that intersect at the set line Lb and that respectively include the two straight lines L1 and L2 set in step S112.
- In the latest three-dimensional data 52 stored in the storage unit 42, the control unit 41 obtains the three-dimensional coordinates at which the living tissue 60 intersects the cutting planes P1 and P2, and specifies them as the three-dimensional coordinates of the edges of the opening that exposes the lumen 63 of the living tissue 60 in the three-dimensional image 53.
- The control unit 41 causes the storage unit 42 to store the specified three-dimensional coordinates.
- In step S116, the control unit 41 of the image processing device 11 forms, in the three-dimensional data 52, the region that is sandwiched between the cutting planes P1 and P2 in the three-dimensional image 53 and that exposes the lumen 63 of the biological tissue 60, as the cutting region 62.
- Specifically, the control unit 41 of the image processing device 11 sets the portion of the latest three-dimensional data 52 stored in the storage unit 42 that is specified by the three-dimensional coordinates stored in the storage unit 42 to be hidden or transparent when the three-dimensional image 53 is displayed on the display 16. That is, the control unit 41 forms the cutting region 62 in accordance with the region 65 set in step S112.
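- The sketch below hides the wedge of voxels between the two cutting half-planes, hinged per slice on the smoothed centroid positions along the line Lb. It is a simplified sketch: selecting the wedge by angle in each slice, and assuming the angular span does not wrap past zero, are illustrative choices rather than the patent's method.

```python
import numpy as np

def mask_cutting_region(volume: np.ndarray, centroid_xy: np.ndarray,
                        theta1: float, theta2: float) -> np.ndarray:
    """Return a copy of an opacity volume in which voxels inside the wedge
    between two half-planes are hidden (illustrative sketch).

    volume      -- (depth, H, W) opacity volume; 0 marks hidden/transparent voxels
    centroid_xy -- (depth, 2) smoothed per-slice centroid positions (x, y)
    theta1/2    -- angles of the two cutting half-planes, in radians
    """
    depth, h, w = volume.shape
    y, x = np.mgrid[0:h, 0:w]
    lo, hi = sorted((theta1 % (2 * np.pi), theta2 % (2 * np.pi)))
    out = volume.copy()
    for z in range(depth):
        cx, cy = centroid_xy[z]
        ang = np.arctan2(y - cy, x - cx) % (2 * np.pi)
        wedge = (ang >= lo) & (ang <= hi)  # voxels between the two half-planes
        out[z][wedge] = 0                  # hidden or fully transparent
    return out
```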
- In step S117, the control unit 41 of the image processing device 11 causes the display 16 to display, as the three-dimensional image 53, the three-dimensional data 52 in which the cutting region 62 was formed in step S116.
- At this time, the control unit 41 causes the display 16 to display, together with the three-dimensional image 53, the cross section 64 indicated by the tomographic data 51 newly acquired by the sensor, which is represented by the cross-sectional image displayed on the display 16 in step S103, and the region 65 corresponding to the cutting region 62 in the cross section 64.
- Specifically, the control unit 41 of the image processing device 11 processes the latest cross-sectional image among the cross-sectional images of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42, and generates a two-dimensional image 56 as shown in FIGS. 18 and 19.
- The control unit 41 generates the three-dimensional image 53 as shown in FIGS. 18 and 19, in which the portion specified by the three-dimensional coordinates stored in the storage unit 42 is hidden or transparent.
- The control unit 41 displays the generated two-dimensional image 56 and three-dimensional image 53 on the display 16 via the output unit 45.
- Specifically, the control unit 41 of the image processing device 11 generates, as the two-dimensional image 56, an image in which the region 65 corresponding to the cutting region 62 is represented in a color different from that of the rest of the image. For example, areas that would appear white in a typical IVUS image may be changed to red within the region 65.
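- A minimal sketch of this recoloring is shown below: the grayscale cross section is promoted to RGB and the bright pixels inside the region 65 mask are repainted red. The brightness threshold and the color are illustrative assumptions.

```python
import numpy as np

def colorize_cut_region(gray: np.ndarray, region_mask: np.ndarray) -> np.ndarray:
    """Render a grayscale IVUS cross section as RGB, tinting the bright pixels
    inside the cutting-region mask red (illustrative sketch)."""
    rgb = np.stack([gray, gray, gray], axis=-1)
    bright = region_mask & (gray > 128)  # white-ish pixels inside region 65
    rgb[bright] = (255, 0, 0)            # repaint them red
    return rgb
```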
- In step S118, if the user performs a change operation, specifically an operation to reset the cutting region 62, the process of step S119 is executed. If there is no change operation by the user, the process of step S120 is executed.
- In step S119, the control unit 41 of the image processing device 11 receives the operation to set the cutting region 62 via the input unit 44, as in the processing of step S112. Then, the processes from step S115 onward are executed.
- In step S120, if the tomographic data 51 has been updated, the processes of steps S121 and S122 are executed. If the tomographic data 51 has not been updated, it is confirmed again in step S118 whether the user has performed a change operation.
- In step S121, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate a cross-sectional image of the biological tissue 60, similarly to the processing in step S101 or step S107.
- In step S122, the control unit 41 of the image processing device 11 updates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S121. After that, the processes from step S113 onward are executed. In step S122, it is preferable to update only the data at the location corresponding to the updated tomographic data 51. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the data processing from step S113 onward can be improved.
- In step S211, when the button 89 displayed on the display 16 is pressed as a user operation requesting rotation of the viewpoint for displaying the three-dimensional image 53 on the display 16, the control unit 41 of the image processing device 11 accepts the operation via the input unit 44.
- In step S212, the control unit 41 of the image processing device 11 rotates the viewpoint according to the user operation performed in step S211. Specifically, the control unit 41 changes the position of the cutting region 62 from the first position, which is the position when the user operation was performed in step S211, to a second position rotated around a rotation axis that passes through a reference point located within the lumen 63 on a reference plane extending horizontally in the three-dimensional image 53 and including the viewpoint, and that extends in a direction perpendicular to the reference plane. Then, the control unit 41 rotates the viewpoint around the rotation axis according to the second position. In this embodiment, the control unit 41 rotates the viewpoint by 90 degrees around the rotation axis.
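- Geometrically, rotating the viewpoint around such an axis can be sketched as below, assuming the y axis of the scene is the vertical (axis) direction, which the patent does not state explicitly; the 90-degree default matches this embodiment.

```python
import numpy as np

def rotate_viewpoint(viewpoint: np.ndarray, reference_point: np.ndarray,
                     degrees: float = 90.0) -> np.ndarray:
    """Rotate the camera position around a vertical rotation axis passing
    through the reference point (illustrative sketch; y is assumed vertical)."""
    t = np.radians(degrees)
    c, s = np.cos(t), np.sin(t)
    rot_y = np.array([[c, 0.0, s],
                      [0.0, 1.0, 0.0],
                      [-s, 0.0, c]])
    return reference_point + rot_y @ (viewpoint - reference_point)
```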
- In the example of FIGS. 18 and 19, the control unit 41 switches the three-dimensional image 53 from an image in which the fossa ovalis 66 is viewed from the front, as shown in FIG. 18, to an image in which it is viewed from the side, as shown in FIG. 19.
- Accordingly, the control unit 41 switches the notation of the button 89 from "+90" with a right-pointing triangle, as shown in FIG. 18, to "-90" with a left-pointing triangle, as shown in FIG. 19.
- The control unit 41 may set the second position so that a portion of the contact spot 68 is located on the cutting surface formed by the cutting region 62 when the cutting region 62 is at the second position. In other words, the control unit 41 may adjust the position of the cutting region 62 so that the contact spot 68 is positioned on the cutting surface, that is, so that the contact spot 68 can easily be seen.
- This alignment may only be done upon explicit request from the user.
- The control unit 41 may adjust the viewpoint so as to position it on the plane that includes the contact point Pi and the rotation axis, and then change the position of the cutting region 62 to the second position. This viewpoint adjustment may only be performed upon explicit request from the user.
- The control unit 41 may also set the reference point so that it is positioned on the projection line obtained by projecting the straight line connecting the contact point Pi and the viewpoint onto the reference plane. That is, the control unit 41 may adjust the rotation axis so that the contact point Pi, the viewpoint, and one point on the rotation axis are aligned. This rotation-axis adjustment may only be performed upon explicit request from the user.
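- A sketch of this rotation-axis alignment is shown below: the line joining the contact point Pi and the viewpoint is projected onto the horizontal reference plane through the viewpoint, and a reference point is chosen on that projection line. Picking the point nearest an in-lumen candidate (such as the lumen centroid) is an illustrative policy, not specified by the patent, and y is again assumed to be the vertical axis.

```python
import numpy as np

def align_rotation_axis(contact: np.ndarray, viewpoint: np.ndarray,
                        candidate: np.ndarray) -> np.ndarray:
    """Return a reference point on the projection, onto the horizontal plane
    through the viewpoint, of the contact-point-to-viewpoint line
    (illustrative sketch; y is assumed vertical)."""
    plane_y = viewpoint[1]
    p0 = np.array([contact[0], plane_y, contact[2]])      # projected contact point
    p1 = np.array([viewpoint[0], plane_y, viewpoint[2]])  # viewpoint lies on the plane
    d = p1 - p0
    denom = float(d @ d)
    t = 0.0 if denom == 0.0 else float((candidate - p0) @ d) / denom
    return p0 + t * d  # point on the projection line closest to the candidate
```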
- In step S213, when the operation of pressing the button 89 displayed on the display 16 is performed again as a user operation requesting rotation of the viewpoint, the control unit 41 of the image processing device 11 accepts the operation via the input unit 44.
- In step S214, the control unit 41 of the image processing device 11 reversely rotates the viewpoint according to the user operation performed in step S213. Specifically, the control unit 41 changes the position of the cutting region 62 from the second position back to the first position around the rotation axis. Then, the control unit 41 reversely rotates the viewpoint around the rotation axis according to the first position; in this embodiment, the control unit 41 reversely rotates the viewpoint by 90 degrees around the rotation axis. In the example of FIGS. 18 and 19, the control unit 41 switches the three-dimensional image 53 from the image in which the fossa ovalis 66 is viewed from the side, as shown in FIG. 19, back to the image in which it is viewed from the front, as shown in FIG. 18. The control unit 41 also switches the notation of the button 89 from "-90" with a left-pointing triangle, as shown in FIG. 19, back to "+90" with a right-pointing triangle, as shown in FIG. 18.
- The flow of FIG. 21 may be repeated any number of times.
- The image processing device, the image processing system, the image display method, and the image processing program according to the present embodiment correspond to the image processing device, the image processing system, the image display method, and the image processing program described in the following appendices.
- Appendix 1 An image processing device that displays three-dimensional data representing a living tissue as a three-dimensional image on a display, forms in the three-dimensional data a cutting region exposing a lumen of the living tissue in the three-dimensional image, and adjusts, according to the position of the cutting region, a viewpoint for displaying the three-dimensional image on the display, the image processing device comprising a control unit that, upon receiving a user operation requesting rotation of the viewpoint, changes the position of the cutting region from a first position, which is the position when the user operation is performed, to a second position rotated around a rotation axis that passes through a reference point located within the lumen on a reference plane extending horizontally in the three-dimensional image and including the viewpoint and that extends in a direction perpendicular to the reference plane, and rotates the viewpoint around the rotation axis according to the second position.
- Appendix 2 The image processing device according to appendix 1, wherein the reference point is the center of gravity of the lumen on the reference plane.
- Appendix 3 The image processing device according to appendix 1 or appendix 2, wherein the control unit rotates the viewpoint by 90 degrees around the rotation axis when the user operation is received.
- Appendix 4 The image processing device according to any one of appendices 1 to 3, wherein the control unit receives, as the user operation, an operation of pressing a button displayed on the display.
- Appendix 5 The image processing device according to any one of appendices 1 to 4, wherein the control unit includes, in the three-dimensional image, a three-dimensional object representing an elongated medical instrument inserted into the lumen, and causes the display to display it.
- Appendix 6 The image processing device according to appendix 5, wherein the control unit adjusts the viewpoint so as to position the viewpoint on a plane that includes the rotation axis and the contact point of the biological tissue with which the distal end of the elongated medical instrument is in contact, and then changes the position of the cutting region to the second position.
- Appendix 7 The image processing device according to appendix 5, wherein the reference point is located on a projection line obtained by projecting, onto the reference plane, a straight line connecting the viewpoint and the contact point of the living tissue with which the tip of the elongated medical instrument is in contact.
- Appendix 8 The image processing device according to appendix 5, wherein, when the position of the cutting region is the second position, a portion of a contact spot, which is a certain range centered on the contact point of the biological tissue with which the tip of the elongated medical instrument is in contact, is positioned on the cutting surface formed by the cutting region.
- Appendix 9 The image processing device according to any one of appendices 1 to 8, wherein the control unit causes the display to further display a cross-sectional image of the biological tissue arranged on the same screen as the three-dimensional image.
- Appendix 10 An image processing system comprising the image processing device according to any one of appendices 1 to 9, and the display.
- Appendix 11 An image display method for displaying three-dimensional data representing a living tissue as a three-dimensional image on a display, forming in the three-dimensional data a cutting region exposing a lumen of the living tissue in the three-dimensional image, and adjusting, according to the position of the cutting region, a viewpoint for displaying the three-dimensional image on the display, the method comprising: receiving a user operation requesting rotation of the viewpoint; changing the position of the cutting region, around a rotation axis that passes through a reference point located within the lumen on a reference plane extending horizontally in the three-dimensional image and including the viewpoint and that extends in a direction perpendicular to the reference plane, from a first position, which is the position when the user operation is performed, to a rotated second position; and rotating the viewpoint around the rotation axis according to the second position.
- Appendix 12 An image processing program that causes a computer, which displays three-dimensional data representing a living tissue as a three-dimensional image on a display, forms in the three-dimensional data a cutting region exposing a lumen of the living tissue in the three-dimensional image, and adjusts, according to the position of the cutting region, a viewpoint for displaying the three-dimensional image on the display, to execute a process of, upon receiving a user operation requesting rotation of the viewpoint, changing the position of the cutting region from a first position, which is the position when the user operation is performed, to a second position rotated around a rotation axis that passes through a reference point located within the lumen on a reference plane extending horizontally in the three-dimensional image and including the viewpoint and that extends in a direction perpendicular to the reference plane, and rotating the viewpoint around the rotation axis according to the second position.
Description
1. Operate the pullback unit to search for the fossa ovalis 66.
2. Operate the pullback unit to mark the upper end and the lower end of the fossa ovalis 66. Determining the upper and lower ends with reference to the fossa ovalis 66 in the three-dimensional image 53 allows the position to be identified more accurately.
3. Based on the information on the upper end and the lower end, move the pullback unit to an arbitrary position between them.
4. Operate the septal puncture needle so that its tip contacts the wall in the plane where the ultrasonic element is currently located. At that time, an X-ray apparatus or two-dimensional ultrasound information may be used in combination.
5. Check the three-dimensional model of the elongated medical instrument 67 included in the three-dimensional image 53 to confirm that the tip is in contact with the wall surface in the correctly targeted plane, and then puncture. Before this step, the plane to be targeted may be defined in advance and marked.
11 image processing device
12 cable
13 drive unit
14 keyboard
15 mouse
16 display
17 connection terminal
18 cart unit
20 probe
21 drive shaft
22 hub
23 sheath
24 outer tube
25 ultrasonic transducer
26 relay connector
31 scanner unit
32 slide unit
33 bottom cover
34 probe connection portion
35 scanner motor
36 insertion slot
37 probe clamp portion
38 slide motor
39 switch group
41 control unit
42 storage unit
43 communication unit
44 input unit
45 output unit
46 movement control function
51 tomographic data
52 three-dimensional data
53 three-dimensional image
54a first voxel group
54b second voxel group
54c third voxel group
54d fourth voxel group
55 voxel group
56, 56a, 56b two-dimensional image
60 living tissue
61 inner surface
62 cutting region
63 lumen
64 cross section
65, 65a, 65b region
66 fossa ovalis
67 elongated medical instrument
68 contact spot
71 camera
72 mark
80 screen
81 operation panel
82 check box
83 slider
84 slider
85 check box
86 fifth graphic element
87a first graphic element
87b second graphic element
87c third graphic element
87d fourth graphic element
88b, 88c, 88d move buttons
89 button
Claims (18)
- An image processing device that, based on tomographic data acquired by a sensor moving through a lumen of a living tissue, causes a display to display an image representing the living tissue and, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves, the image processing device comprising a control unit that, upon receiving a user operation requesting marking of the position of the sensor, causes the display to display, together with the first element, a second element fixed at the same position as the position of the first element when the user operation was performed.
- The image processing device according to claim 1, wherein the control unit sets the color of the second element to a color different from that of the first element.
- The image processing device according to claim 1 or claim 2, wherein, upon receiving an operation requesting movement of the sensor to a position corresponding to the position of the second element, the control unit moves the sensor to the position corresponding to the position of the second element.
- The image processing device according to any one of claims 1 to 3, wherein, upon receiving the user operation again, the control unit causes the display to display, together with the first element and the second element, a third element fixed at the same position as the position of the first element when the user operation was performed again.
- The image processing device according to claim 4, wherein the control unit sets the color of the third element to a color different from that of the second element.
- The image processing device according to claim 4 or claim 5, wherein the control unit causes the display to display, together with the first element, the second element, and the third element, a fourth element fixed at a position between the second element and the third element.
- The image processing device according to claim 6, wherein the control unit calculates, as the position between the second element and the third element, an intermediate position between the second element and the third element.
- The image processing device according to claim 6 or claim 7, wherein the control unit sets the color of the fourth element to a color different from those of the second element and the third element.
- The image processing device according to any one of claims 6 to 8, wherein, upon receiving an operation requesting movement of the sensor to a position corresponding to the position of the fourth element, the control unit moves the sensor to the position corresponding to the position of the fourth element.
- The image processing device according to any one of claims 4 to 9, wherein, in a three-dimensional image serving as the image, the control unit sets the color of a region between the cross section corresponding to the position of the second element and the cross section corresponding to the position of the third element to a color different from that of the adjacent regions.
- The image processing device according to any one of claims 1 to 10, wherein the control unit causes the display to display a graphic element group, which is a group of elements including the first element and the second element, in combination with an elongated graphic element representing the movement range of the sensor.
- The image processing device according to claim 11, wherein the control unit causes the display to display the elongated graphic element in an orientation in which the longitudinal direction of the lumen in a three-dimensional image serving as the image is parallel to the long-axis direction of the elongated graphic element.
- The image processing device according to claim 1, wherein, in a three-dimensional image serving as the image, the control unit takes, as the first element, at least the voxels representing the inner surface of the living tissue, or the voxels adjacent to those voxels and representing the lumen, among a first voxel group corresponding to the position of the sensor, and takes, as the second element, at least the voxels representing the inner surface, or the voxels adjacent to those voxels and representing the lumen, among a second voxel group corresponding to the position of the sensor when the user operation was performed, and colors the second element so as to distinguish it from the first element.
- The image processing device according to any one of claims 1 to 13, wherein the control unit accepts, as the user operation, an operation of pressing one or more predetermined keys.
- An image processing system comprising the image processing device according to any one of claims 1 to 14, and a probe having the sensor.
- The image processing system according to claim 15, further comprising the display.
- An image display method for displaying, based on tomographic data acquired by a sensor moving through a lumen of a living tissue, an image representing the living tissue on a display and, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves, the method comprising: receiving a user operation requesting marking of the position of the sensor; and displaying on the display, together with the first element, a second element fixed at the same position as the position of the first element when the user operation was performed.
- An image processing program that causes a computer, which, based on tomographic data acquired by a sensor moving through a lumen of a living tissue, causes a display to display an image representing the living tissue and, on the same screen as the image, a first element that represents the position of the sensor and is displaced as the sensor moves, to execute a process of, upon receiving a user operation requesting marking of the position of the sensor, causing the display to display, together with the first element, a second element fixed at the same position as the position of the first element when the user operation was performed.