WO2023176741A1 - Image processing device, image processing system, image display method, and image processing program
- Publication number
- WO2023176741A1 (application PCT/JP2023/009449)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image processing
- point
- intersection
- screen
- processing device
- Prior art date
Links
Images
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/12—Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
Definitions
- the present disclosure relates to an image processing device, an image processing system, an image display method, and an image processing program.
- Patent Documents 1 to 3 describe techniques for generating three-dimensional images of heart chambers or blood vessels using a US imaging system.
- US is an abbreviation for ultrasound.
- Patent Document 4 discloses a method for calculating the distance between two points specified on a screen on which a three-dimensional image is displayed: the two-dimensional coordinates and the density of the two points are determined, and the density difference is added to the distance between the two-dimensional coordinates.
- IVUS is an abbreviation for intravascular ultrasound.
- IVUS is a device or method that provides two-dimensional images in a plane perpendicular to the longitudinal axis of a catheter.
- IVUS is often used for procedures that use a catheter separate from the IVUS catheter, such as ablation.
- In such procedures, a method is used in which a septal puncture needle inserted into the right atrium punctures the fossa ovalis, creating a path from the right atrium to the left atrium.
- During the puncture, it is desirable to carefully confirm the puncture position because there is a risk of complications such as perforation or cardiac tamponade.
- an IVUS catheter that can obtain 360-degree information is excellent in confirming the puncture position within the same plane.
- images are acquired intermittently along the IVUS catheter axis, making it difficult to image the three-dimensional structure.
- An object of the present disclosure is to improve the accuracy of distance calculation between two points specified on a screen on which a three-dimensional image is displayed.
- An image processing device as one aspect of the present disclosure renders an object of biological tissue based on the positional relationship between a viewpoint set in a virtual three-dimensional space and the object arranged in the three-dimensional space, and displays the object on a screen as a three-dimensional image of the living tissue.
- The image processing device includes a control unit that, in response to a position specifying operation that specifies two positions on the screen, identifies, on a plane corresponding to the screen in the three-dimensional space, a first corresponding point corresponding to one of the two specified positions and a second corresponding point corresponding to the other, calculates the distance between a first intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the first corresponding point in the three-dimensional space, and a second intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the second corresponding point in the three-dimensional space, and outputs the obtained calculation result.
- the control unit outputs a numerical value representing the distance on the screen as the calculation result.
- the position specifying operation includes, as a first operation, an operation of pressing a push button of an input device, and the control unit specifies, as the first corresponding point, a point on the plane that corresponds to the position of the pointer on the screen when the first operation is performed.
- the first operation is an operation of pressing the push button while pressing a predetermined first key.
- the position specifying operation includes, as a second operation, an operation of releasing the push button, which is performed following the first operation and a drag operation of moving the pointer while holding the push button;
- the control unit specifies, as the second corresponding point, a point on the plane that corresponds to the position of the pointer when the second operation is performed.
- the second operation is an operation of releasing the push button while pressing a predetermined second key.
- the control unit displays a mark at each of a first corresponding position on the screen corresponding to the intersection between the plane and a straight line connecting the viewpoint and the first intersection in the three-dimensional space, and a second corresponding position on the screen corresponding to the intersection between the plane and a straight line connecting the viewpoint and the second intersection in the three-dimensional space.
- the control unit specifies, in response to a range specification operation that specifies a range on the screen, a corresponding range on the plane that corresponds to the specified range, and changes the appearance of the mark displayed at the position corresponding to whichever of the first intersection and the second intersection exists in a three-dimensional area extending from the viewpoint in a conical shape through the outer edge of the corresponding range in the three-dimensional space.
- the control unit receives an operation to collectively delete the marks whose appearance has been changed.
- An image processing system as one aspect of the present disclosure includes the image processing device and a display that displays the screen.
- An image display method as one aspect of the present disclosure renders an object of biological tissue based on the positional relationship between a viewpoint set in a virtual three-dimensional space and the object arranged in the three-dimensional space, and displays the object on a screen as a three-dimensional image of the living tissue.
- In response to a position specifying operation that specifies two positions on the screen, the method identifies, on a plane corresponding to the screen in the three-dimensional space, a first corresponding point corresponding to one of the two specified positions and a second corresponding point corresponding to the other, calculates the distance between a first intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the first corresponding point in the three-dimensional space, and a second intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the second corresponding point in the three-dimensional space, and outputs the obtained calculation result.
- An image processing program as one aspect of the present disclosure causes a computer to render an object of biological tissue based on the positional relationship between a viewpoint set in a virtual three-dimensional space and the object arranged in the three-dimensional space, and to display the object on a screen as a three-dimensional image of the living tissue.
- The image processing program further causes the computer to execute, in response to a position designation operation that specifies two positions on the screen, a process of identifying, on a plane corresponding to the screen in the three-dimensional space, a first corresponding point corresponding to one of the two specified positions and a second corresponding point corresponding to the other, a process of calculating the distance between a first intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the first corresponding point, and a second intersection, which is the intersection of the object and an extension of the straight line connecting the viewpoint and the second corresponding point, and a process of outputting the obtained calculation result.
- the accuracy of distance calculation between two points specified on a screen on which a three-dimensional image is displayed is improved.
- FIG. 1 is a perspective view of an image processing system according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 4 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example of distance calculation performed by the image processing system according to the embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 8 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating an example of region calculation performed by the image processing system according to the embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of a screen displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 13 is a diagram showing an example of a two-dimensional image displayed on a display by the image processing system according to the embodiment of the present disclosure.
- FIG. 14 is a diagram illustrating an example of a cutting area formed by the image processing system according to the embodiment of the present disclosure.
- FIG. 15 is a block diagram showing the configuration of an image processing device according to the embodiment of the present disclosure.
- FIG. 16 is a perspective view of a probe and a drive unit according to the embodiment of the present disclosure.
- FIG. 17 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
- FIG. 18 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
- FIG. 19 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
- FIG. 20 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
- FIG. 21 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
- the image processing device 11 is a computer that displays three-dimensional data 52 representing a biological tissue 60 as a three-dimensional image 53 on a display 16. As shown in FIGS. 2 to 5, the image processing device 11 renders the object 54 of the biological tissue 60 based on the positional relationship between the viewpoint V0 set in the virtual three-dimensional space and the object 54 arranged in the three-dimensional space, and displays the object 54 on the screen 80 as the three-dimensional image 53 of the biological tissue 60. The viewpoint V0 corresponds to the position of a virtual camera 71 arranged in the three-dimensional space.
- the image processing device 11 specifies a first corresponding point Q1 and a second corresponding point Q2 on the plane 55 corresponding to the screen 80 in the three-dimensional space in response to the position specifying operation.
- the position designation operation is an operation to designate two positions on the screen 80.
- the first corresponding point Q1 is a point on the plane 55 that corresponds to one of the two positions designated by the position designation operation.
- the second corresponding point Q2 is a point on the plane 55 that corresponds to the other of the two positions designated by the position designation operation.
- the image processing device 11 calculates the distance between a first intersection R1 and a second intersection R2.
- the first intersection R1 is the intersection of the object 54 and an extension of the straight line connecting the viewpoint V0 and the first corresponding point Q1 in the three-dimensional space.
- the second intersection R2 is the intersection of the object 54 and an extension of the straight line connecting the viewpoint V0 and the second corresponding point Q2 in the three-dimensional space.
- the image processing device 11 outputs the obtained calculation results. Specifically, the image processing device 11 outputs a numerical value representing the distance on the screen 80 as the calculation result. Alternatively, the image processing device 11 may output the calculation result in another format such as audio.
- FIG. 4 shows an example in which the numerical value "10 mm" is displayed on the screen 80 as the calculation result.
- According to this embodiment, the accuracy of distance calculation between two points specified on the screen 80 on which the three-dimensional image 53 is displayed is improved. The distance between the two designated points is calculated not as the distance between the coordinates on the plane 55 corresponding to the screen 80, but as the distance between the intersections R1 and R2 on the object 54 in the three-dimensional space. Therefore, even if the position of the viewpoint V0 with respect to the object 54 is changed and the way the three-dimensional image 53 is displayed changes, accurate distance calculation is possible.
- the position designation operation includes an operation of pressing a push button on the input device as a first operation.
- the first operation is an operation of pressing a button on the mouse 15 as a push button of the input device, but it may also be an operation of pressing a specific key of the keyboard 14 as a push button of the input device.
- the image processing device 11 identifies a point on the plane 55 that corresponds to the position of the pointer 86 on the screen 80 when the first operation was performed as the first corresponding point Q1.
- FIG. 2 shows an example in which the pointer 86 is located at a position corresponding to the upper end of the fossa ovalis 66 represented in the three-dimensional image 53 when the first operation is performed.
- the pointer 86 has an arrow shape in this example, it may have another shape such as a cross shape.
- the shape of the pointer 86 may be changed so that the user can easily recognize the operation mode. Then, when the user switches the operation mode from the position specifying operation mode to another mode, the shape of the pointer 86 may be changed again.
- the image processing device 11 displays the mark 87 at the first corresponding position on the screen 80.
- the first corresponding position is a position on the screen 80 that corresponds to the intersection of the plane 55 and a straight line connecting the viewpoint V0 and the first intersection R1 in the three-dimensional space. After the first operation is performed, the first corresponding position remains the same as the position specified by the first operation until the viewpoint V0 is moved. However, when the viewpoint V0 is moved, the first corresponding position also moves away from the position specified by the first operation.
- FIG. 3 shows an example in which the mark 87 is displayed at a position corresponding to the upper end of the fossa ovalis 66 as the first corresponding position.
- the first operation may be an operation of pressing a push button while pressing a predetermined first key.
- the first key is, for example, the Ctrl key or the Shift key on the keyboard 14.
- the position specifying operation includes, as a second operation, an operation of releasing the push button, which is performed following the first operation and a drag operation of moving the pointer 86 while holding down the push button.
- the image processing device 11 identifies a point on the plane 55 that corresponds to the position of the pointer 86 when the second operation is performed as a second corresponding point Q2.
- FIG. 3 shows an example in which the pointer 86 is at a position corresponding to the lower end of the fossa ovalis 66 when the second operation is performed.
- the image processing device 11 displays the mark 88 at the second corresponding position on the screen 80.
- the second corresponding position is a position on the screen 80 that corresponds to the intersection of the plane 55 and a straight line connecting the viewpoint V0 and the second intersection R2 in the three-dimensional space. After the second operation is performed, the second corresponding position remains the same as the position specified by the second operation until the viewpoint V0 is moved. However, when the viewpoint V0 is moved, the second corresponding position also moves away from the position specified by the second operation.
- FIG. 4 shows an example in which the mark 88 is displayed at a position corresponding to the lower end of the fossa ovalis 66 as the second corresponding position.
- the second operation may be an operation of releasing a push button while pressing a predetermined second key.
- the second key is, for example, the Ctrl key or the Shift key on the keyboard 14.
- the second key may be the same key as the first key, or may be a different key from the first key.
- the user moves the pointer 86 to a desired position using the mouse 15 and presses a button on the mouse 15 to specify the starting point position; the user then moves the pointer 86 to another desired position while holding down the button on the mouse 15 and releases the button to specify the end point position.
- the image processing device 11 specifies the three-dimensional coordinates (xq1, yq1, dq) corresponding to the specified starting point position as the coordinates of the first corresponding point Q1.
- the image processing device 11 specifies the three-dimensional coordinates (xq2, yq2, dq) corresponding to the designated end point position as the coordinates of the second corresponding point Q2.
- the image processing device 11 calculates the three-dimensional coordinates (xr1, yr1, dr1) at which a straight line passing through the coordinates (xv, yv, dv) of the viewpoint V0 and the coordinates (xq1, yq1, dq) of the first corresponding point Q1 reaches the object 54, as the coordinates of the first intersection R1.
- the image processing device 11 calculates the three-dimensional coordinates (xr2, yr2, dr2) at which a straight line passing through the coordinates (xv, yv, dv) of the viewpoint V0 and the coordinates (xq2, yq2, dq) of the second corresponding point Q2 reaches the object 54, as the coordinates of the second intersection R2.
- the image processing device 11 calculates the Euclidean distance √((xr2 − xr1)² + (yr2 − yr1)² + (dr2 − dr1)²) between the first intersection R1 and the second intersection R2.
- the image processing device 11 outputs a numerical value representing the calculated Euclidean distance on the screen 80. Therefore, the user can easily and accurately measure any distance in the three-dimensional image 53, such as the length of the fossa ovalis 66.
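- The following is a minimal sketch of the ray-based measurement described above, written by way of illustration rather than as the patent's implementation. The voxel representation of the object 54, the step size of the ray march, and all coordinate values are assumptions of this sketch.

```python
# Sketch of measuring the distance between the intersections R1 and R2 on the object,
# given the viewpoint V0 and the corresponding points Q1 and Q2 on the plane 55.
import numpy as np

def ray_object_intersection(viewpoint, corresponding_point, voxels, step=0.5, max_t=1000.0):
    """March along the extension of the line V0 -> Q until the object is hit.

    `voxels` is a 3D boolean array whose indices are used directly as
    three-dimensional coordinates (an assumption of this sketch).
    Returns the first point inside the object, i.e. the intersection R, or None.
    """
    direction = corresponding_point - viewpoint
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    while t < max_t:
        p = viewpoint + t * direction
        idx = np.round(p).astype(int)
        if np.all(idx >= 0) and np.all(idx < voxels.shape) and voxels[tuple(idx)]:
            return p  # first voxel of the object reached by the ray
        t += step
    return None

def measure(viewpoint, q1, q2, voxels):
    """Distance between the intersections R1 and R2, not between Q1 and Q2."""
    r1 = ray_object_intersection(viewpoint, q1, voxels)
    r2 = ray_object_intersection(viewpoint, q2, voxels)
    if r1 is None or r2 is None:
        return None
    return float(np.linalg.norm(r2 - r1))  # Euclidean distance in 3D space

# Usage with toy data: a small voxel block standing in for the object 54.
voxels = np.zeros((64, 64, 64), dtype=bool)
voxels[:, :, 40:] = True                      # "tissue" occupies z >= 40
v0 = np.array([32.0, 32.0, 0.0])              # viewpoint V0
q1 = np.array([28.0, 32.0, 10.0])             # first corresponding point Q1
q2 = np.array([36.0, 32.0, 10.0])             # second corresponding point Q2
print(measure(v0, q1, q2, voxels))            # roughly 32 voxel units between R1 and R2
```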
- the image processing device 11 further specifies a corresponding range 56 on the plane 55 corresponding to the screen 80 in the three-dimensional space in response to the range specification operation, as shown in FIGS. 6 to 9.
- the range specification operation is an operation for specifying a range 89 on the screen 80.
- the corresponding range 56 is a range on the plane 55 that corresponds to the designated range 89.
- the image processing device 11 changes the appearance, such as the color or shape, of the mark displayed on the screen 80 at the position corresponding to whichever of the first intersection R1 and the second intersection R2 exists in the three-dimensional area 57 shown in FIG. 9.
- the three-dimensional area 57 is an area that extends in a conical shape from the viewpoint V0 through the outer edge of the corresponding range 56 in the three-dimensional space.
- FIG. 9 shows an example in which both the first intersection point R1 and the second intersection point R2 exist within the three-dimensional area 57.
- FIG. 8 shows an example in which the colors of marks 87 and 88 displayed on the screen 80 at positions corresponding to the first intersection point R1 and the second intersection point R2 are changed.
- the image processing device 11 accepts batch operations on marks existing within the specified range 89, such as an operation to collectively delete marks whose appearance has been changed. Therefore, the user can perform efficient operations such as selecting arbitrary marks on the screen 80 and erasing them all at once.
- the range specification operation includes an operation of pressing a push button on the input device as a third operation.
- the third operation is an operation of pressing a button on the mouse 15 as a push button of the input device, but it may also be an operation of pressing a specific key of the keyboard 14 as a push button of the input device.
- the image processing device 11 identifies a point on the plane 55 that corresponds to the position of the pointer 86 on the screen 80 when the third operation was performed.
- FIG. 6 shows an example in which the pointer 86 is at a position corresponding to a point away from the fossa ovalis 66 to the upper left when the third operation is performed.
- the pointer 86 has an arrow shape in this example, it may have another shape such as a cross shape.
- the shape of the pointer 86 may be changed so that the user can easily recognize the operation mode. Then, when the user switches the operation mode from the range specification operation mode to another mode, the shape of the pointer 86 may be changed again.
- the third operation may be an operation of pressing a push button while pressing a predetermined third key.
- the third key is, for example, the Ctrl key or the Shift key on the keyboard 14.
- the third key may be the same key as the first key, or may be a different key from the first key.
- the third key may be the same key as the second key, or may be a different key from the second key.
- the range specifying operation includes, as a fourth operation, an operation of releasing the push button, which is performed following the third operation and the drag operation of moving the pointer 86 while holding down the push button.
- the image processing device 11 identifies a point on the plane 55 that corresponds to the position of the pointer 86 when the fourth operation was performed.
- FIG. 7 shows an example in which the pointer 86 is located at a position corresponding to a point farther to the lower right from the fossa ovalis 66 when the fourth operation is performed.
- the fourth operation may be an operation of releasing a push button while pressing a predetermined fourth key.
- the fourth key is, for example, the Ctrl key or the Shift key on the keyboard 14.
- the fourth key may be the same key as the first key, or may be a different key from the first key.
- the fourth key may be the same key as the second key, or may be a different key from the second key.
- the fourth key may be the same key as the third key, or may be a different key from the third key.
- the image processing device 11 specifies, as the corresponding range 56, a rectangular range whose diagonal vertices are the point on the plane 55 corresponding to the position of the pointer 86 on the screen 80 when the third operation is performed and the point corresponding to the position of the pointer 86 when the fourth operation is performed.
- FIG. 8 shows an example in which a rectangular range having diagonal vertices at a position corresponding to a point away to the upper left of the fossa ovalis 66 and at a position corresponding to a point away to the lower right of the fossa ovalis 66 is specified as the range 89.
- a circular range may be specified as the range 89 instead of a rectangular range. That is, the image processing device 11 may specify, as the corresponding range 56, a circular range whose center point is the point on the plane 55 corresponding to the position of the pointer 86 on the screen 80 when the third operation was performed and whose circumferential point is the point on the plane 55 corresponding to the position of the pointer 86 when the fourth operation was performed.
- FIG. 10 shows an example in which the pointer 86 is located at a position corresponding to the center point of the fossa ovalis 66 when the third operation is performed.
- FIG. 11 shows an example in which the pointer 86 is at a position corresponding to a point downwardly away from the fossa ovalis 66 when the fourth operation is performed.
- a circular range is designated as a range 89, with a center point at a position corresponding to the center point of the fossa ovalis 66 and a circumferential point at a position corresponding to a point downwardly away from the fossa ovalis 66.
- FIG. 12 shows an example in which the colors of the marks 87 and 88 displayed on the screen 80 at positions corresponding to the first intersection point R1 and the second intersection point R2 are changed.
- To specify a rectangular range, the user moves the pointer 86 to a desired position with the mouse 15, presses the button of the mouse 15 to specify one vertex position, moves the pointer 86 to another desired position while holding down the button, and releases the button to designate the rectangular range as the range 89.
- To specify a circular range, the user moves the pointer 86 to a desired position with the mouse 15, presses the button of the mouse 15 to specify the center point position, moves the pointer 86 to another desired position while holding down the button, and releases the button to designate the circular range as the range 89.
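- As an illustration of the circular alternative described above, the sketch below assumes that the point specified by the third operation is the center, that the point specified by the fourth operation lies on the circumference, and that both are given as plain (x, y) coordinates on the plane 55.

```python
# Sketch: a circular corresponding range 56 defined by a center point and a circumference point.
import math

def circular_range(center, circumference_point):
    """Return (center, radius) of the circular corresponding range on the plane 55."""
    radius = math.dist(center, circumference_point)
    return center, radius

def contains(center, radius, point):
    """True if `point` on the plane lies inside the circular corresponding range."""
    return math.dist(center, point) <= radius

c, r = circular_range((120.0, 120.0), (120.0, 160.0))   # e.g. centre of the fossa ovalis, point below it
print(r, contains(c, r, (130.0, 130.0)))                 # 40.0 True
```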
- the image processing device 11 specifies a two-dimensional range corresponding to the designated range 89 as the corresponding range 56, as shown in FIG. 9.
- the image processing device 11 specifies, as the three-dimensional area 57, a cone-shaped area extending from the coordinates (xv, yv, dv) of the viewpoint V0 through the outer edge of the corresponding range 56.
- the image processing device 11 determines whether the coordinates (xr1, yr1, dr1) of the first intersection R1 and the coordinates (xr2, yr2, dr2) of the second intersection R2 are within the three-dimensional area 57.
- the image processing device 11 changes the colors of the marks 87 and 88 displayed on the screen 80 at positions corresponding to the first intersection R1 and the second intersection R2, respectively. Therefore, the user can easily select the marks 87 and 88.
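- The selection of marks by the range specification operation can be pictured with the following sketch. It is an illustration under assumptions (the plane 55 is taken parallel to the XY plane at a fixed depth, and the corresponding range 56 is an axis-aligned rectangle), not the patent's actual code.

```python
# Sketch: a mark is selected when its intersection point lies inside the cone-shaped
# area 57 spanned from the viewpoint through the outer edge of the corresponding range 56.
import numpy as np

def in_rectangular_cone(viewpoint, rect_min, rect_max, plane_depth, point):
    """True if `point` lies in the cone from `viewpoint` through the rectangle
    [rect_min, rect_max] (x, y limits) placed on the plane at depth `plane_depth`.

    The test projects the point back onto the plane along the line through the
    viewpoint and checks whether the projection falls inside the rectangle.
    """
    direction = point - viewpoint
    dz = direction[2]
    if dz == 0:
        return False
    t = (plane_depth - viewpoint[2]) / dz          # parameter where the line crosses the plane
    if t <= 0:
        return False                               # point is behind the viewpoint
    projected = viewpoint + t * direction          # intersection with the plane 55
    return (rect_min[0] <= projected[0] <= rect_max[0]
            and rect_min[1] <= projected[1] <= rect_max[1])

# Usage: decide which marks change appearance (e.g. colour) for a batch operation.
v0 = np.array([32.0, 32.0, 0.0])
marks = {"mark87": np.array([16.0, 32.0, 40.0]),   # first intersection R1
         "mark88": np.array([48.0, 32.0, 40.0])}   # second intersection R2
selected = [name for name, r in marks.items()
            if in_rectangular_cone(v0, rect_min=(10.0, 20.0), rect_max=(54.0, 44.0),
                                   plane_depth=10.0, point=r)]
print(selected)   # both marks are inside the cone, so both are selected
```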
- the image processing device 11 forms a cutting region 62 in the three-dimensional data 52 that exposes the inner cavity 63 of the biological tissue 60 in the three-dimensional image 53.
- the image processing device 11 adjusts the viewpoint V0 when displaying the three-dimensional image 53 on the display 16 according to the position of the cutting area 62.
- when the image processing device 11 receives a user operation requesting rotation of the viewpoint V0, it changes the position of the cutting area 62 from the first position, which is the position when the user operation was performed, to a second position rotated around a rotation axis that passes through a reference point located in the lumen 63 on a reference plane extending horizontally in the three-dimensional image 53 and including the viewpoint V0, and that extends in the direction perpendicular to the reference plane.
- the horizontal direction refers to the XY directions shown in FIG.
- the direction perpendicular to the reference plane is the Z direction shown in FIG.
- the reference point may be any point located within the lumen 63 on the reference plane, such as the center point of the IVUS catheter, but in this embodiment, it is the center of gravity of the lumen 63 on the reference plane.
- the corresponding center of gravity B1 becomes the reference point.
- the image processing device 11 rotates the viewpoint V0 around the rotation axis according to the second position.
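- The rotation of the viewpoint V0 about a rotation axis through the reference point can be sketched as follows; the Z-parallel axis, the use of the center of gravity B1 as the reference point, and the numeric example are assumptions of this illustration, not the patent's implementation.

```python
# Sketch: rotate the viewpoint V0 about an axis through the centre of gravity B1,
# parallel to the Z direction, by the same angle as the cutting area.
import numpy as np

def rotate_viewpoint_about_z_axis(viewpoint, center_of_gravity, angle_deg):
    """Rotate `viewpoint` around a Z-parallel axis through `center_of_gravity`."""
    angle = np.radians(angle_deg)
    c, s = np.cos(angle), np.sin(angle)
    rz = np.array([[c, -s, 0.0],
                   [s,  c, 0.0],
                   [0.0, 0.0, 1.0]])          # rotation about the Z direction
    return center_of_gravity + rz @ (viewpoint - center_of_gravity)

# Example: rotate the camera 30 degrees when the cutting area moves to its second position.
v0 = np.array([100.0, 0.0, 50.0])
b1 = np.array([20.0, 20.0, 50.0])
print(rotate_viewpoint_about_z_axis(v0, b1, 30.0))
```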
- According to this embodiment, the usefulness of the three-dimensional image 53 for confirming a position within the living tissue 60 is improved.
- In a procedure such as ablation using IVUS, a user such as a doctor performing the procedure while operating a catheter, or a clinical engineer operating the IVUS system while looking at the display 16, can rotate the viewpoint V0 around the rotation axis.
- the image processing device 11 identifies a cross section 64 of the living tissue 60 and a region 65 corresponding to the cutting region 62 in the cross section 64.
- the two-dimensional image 58 and the three-dimensional image 53 are displayed on the display 16.
- the position of the camera 71 with respect to the cross section 64 is displayed.
- the user can understand from the two-dimensional image 58 what kind of structure the portion of the biological tissue 60 that is cut out and not displayed in the three-dimensional image 53 has. For example, if the user is a surgeon, it becomes easier to perform surgery on the inside of the living tissue 60.
- the biological tissue 60 includes, for example, blood vessels or organs such as the heart.
- the biological tissue 60 is not limited to anatomically a single organ or a part thereof, but also includes a tissue having a lumen spanning multiple organs.
- An example of such a tissue is, specifically, a part of the vascular tissue that extends from the upper part of the inferior vena cava, passes through the right atrium, and reaches the lower part of the superior vena cava.
- In FIGS. 2 to 4, 6 to 8, and 10 to 12, an operation panel 81, a two-dimensional image 58, a three-dimensional image 53, and a pointer 86 are displayed on a screen 80.
- the operation panel 81 is a GUI component for setting the cutting area 62. “GUI” is an abbreviation for graphical user interface.
- the operation panel 81 includes a check box 82 for selecting whether to activate the setting of the cutting area 62, a slider 83 for setting the base angle, a slider 84 for setting the opening angle, and a check box 85 for selecting whether or not to use the center of gravity.
- the base angle is the rotation angle of one of the two straight lines L1 and L2 extending from one point M in the cross-sectional image representing the cross-section 64 of the biological tissue 60. Therefore, setting the base angle corresponds to setting the direction of the straight line L1.
- the opening angle is the angle between the two straight lines L1 and L2. Therefore, setting the opening angle corresponds to setting the angle formed by the two straight lines L1 and L2.
- Point M is the center of gravity of cross section 64. Point M may be set at a point other than the center of gravity on the cross section 64 if it is selected not to use the center of gravity.
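- The relationship between the base angle, the opening angle, and the two straight lines L1 and L2 can be illustrated with the short sketch below, assuming that angles are measured counterclockwise from the positive x axis of the cross-sectional image; this convention is an assumption of the illustration.

```python
# Sketch: the base angle sets the direction of L1; the opening angle is the angle
# between L1 and L2; both lines extend from point M in the cross-sectional image.
import math

def cutting_lines(point_m, base_angle_deg, opening_angle_deg, length=100.0):
    """Return end points of L1 and L2 as ((M, end1), (M, end2)) segments from M."""
    a1 = math.radians(base_angle_deg)                       # direction of L1
    a2 = math.radians(base_angle_deg + opening_angle_deg)   # direction of L2
    mx, my = point_m
    l1_end = (mx + length * math.cos(a1), my + length * math.sin(a1))
    l2_end = (mx + length * math.cos(a2), my + length * math.sin(a2))
    return ((point_m, l1_end), (point_m, l2_end))

# Example: base angle 30 degrees and opening angle 90 degrees around the centre of gravity.
l1, l2 = cutting_lines((256.0, 256.0), base_angle_deg=30.0, opening_angle_deg=90.0)
print(l1, l2)
```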
- the two-dimensional image 58 is an image obtained by processing a cross-sectional image.
- the color of an area 65 corresponding to the cutting area 62 is changed to clearly indicate which part of the cross section 64 has been cut.
- the viewpoint V0 when displaying the three-dimensional image 53 on the screen 80 is adjusted according to the position of the cutting region 62.
- the cutting area 62 can be determined using the two-dimensional image 58. Specifically, as shown in FIG. 13, the position or size of the cutting area 62 can be set by adjusting the base angle or the opening angle and thereby setting the position or size of the area 65 separated by the two straight lines L1 and L2 in the two-dimensional image 58. For example, if the base angle is changed so that the straight line L1 is rotated approximately 90 degrees counterclockwise, a region 65a that is moved in accordance with the change in the base angle is obtained in the two-dimensional image 58a. Then, the position of the cutting area 62 is adjusted according to the position of the area 65a.
- the opening angle is changed so that the angle between the two straight lines L1 and L2 becomes larger, a region 65b that is enlarged according to the change in the opening angle is obtained in the two-dimensional image 58b.
- the size of the cutting area 62 is adjusted according to the size of the area 65b.
- the position of the camera 71 may be adjusted as appropriate depending on the position or size of the cutting area 62.
- the image corresponding to the current position of the sensor, that is, the latest image, is always displayed as the two-dimensional image 58.
- the base angle may be set by dragging the straight line L1 or by inputting a numerical value, instead of being set by operating the slider 83.
- the opening angle may be set by dragging the straight line L2 or by inputting a numerical value.
- the cutting area 62 determined using the two-dimensional image 58 is hidden or transparent.
- the X direction and the Y direction perpendicular to the X direction each correspond to the lateral direction of the lumen 63 of the living tissue 60.
- the Z direction orthogonal to the X direction and the Y direction corresponds to the longitudinal direction of the lumen 63 of the living tissue 60.
- the image processing device 11 uses the three-dimensional data 52 to calculate the positions of the centers of gravity B1, B2, B3, and B4 of the cross sections C1, C2, C3, and C4 of the living tissue 60, respectively.
- the image processing device 11 sets, as cutting planes P1 and P2, two planes that intersect at a line Lb passing through the positions of the centers of gravity B1, B2, B3, and B4 and that include the two straight lines L1 and L2, respectively.
- the image processing device 11 forms, as the cutting region 62 in the three-dimensional data 52, a region that lies between the cutting planes P1 and P2 and that exposes the inner cavity 63 of the living tissue 60 in the three-dimensional image 53.
- cross sections C1, C2, C3, and C4 are shown as examples of multiple cross sections in the transverse direction of the inner cavity 63 of the living tissue 60, but the number of cross sections for which the center-of-gravity position is calculated is not limited to four; preferably, it is the same as the number of cross-sectional images obtained by IVUS.
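- One possible way to obtain the center-of-gravity positions B1 to B4 used for the line Lb is sketched below; the binary lumen masks and the per-slice averaging are assumptions of this illustration, as the text above does not prescribe a specific method.

```python
# Sketch: centre of gravity of the lumen in each cross section, slice by slice.
import numpy as np

def lumen_centroid(lumen_mask):
    """Centre of gravity (x, y) of a 2D boolean mask of the lumen 63."""
    ys, xs = np.nonzero(lumen_mask)
    return float(xs.mean()), float(ys.mean())

# Stack of cross sections C1..C4 -> list of centroids B1..B4 along the Z direction.
cross_sections = [np.zeros((64, 64), dtype=bool) for _ in range(4)]
for i, cs in enumerate(cross_sections):
    cs[20 + i:40 + i, 20:40] = True          # toy lumen drifting slightly per slice
centroids = [lumen_centroid(cs) for cs in cross_sections]
print(centroids)
```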
- If the check box 85 on the operation panel 81 is not checked, that is, if it is selected not to use the center of gravity, the image processing device 11 sets, as the cutting planes P1 and P2, two planes that intersect at an arbitrary line passing through point M, such as a straight line extending in the Z direction through point M, and that include the two straight lines L1 and L2, respectively.
- the image processing system 10 includes an image processing device 11, a cable 12, a drive unit 13, a keyboard 14, a mouse 15, and a display 16.
- the image processing device 11 is a dedicated computer specialized for image diagnosis in this embodiment, it may be a general-purpose computer such as a PC. "PC” is an abbreviation for personal computer.
- the cable 12 is used to connect the image processing device 11 and the drive unit 13.
- the drive unit 13 is a device that is used by being connected to the probe 20 shown in FIG. 16 and drives the probe 20.
- Drive unit 13 is also called MDU.
- MDU is an abbreviation for motor drive unit.
- the probe 20 is applied to IVUS.
- the probe 20 is also called an IVUS catheter or an imaging catheter.
- the keyboard 14, mouse 15, and display 16 are connected to the image processing device 11 via any cable or wirelessly.
- the display 16 is, for example, an LCD, an organic EL display, or an HMD.
- LCD is an abbreviation for liquid crystal display.
- EL is an abbreviation for electro luminescence.
- HMD is an abbreviation for head-mounted display.
- the image processing system 10 further includes a connection terminal 17 and a cart unit 18 as options.
- connection terminal 17 is used to connect the image processing device 11 and external equipment.
- the connection terminal 17 is, for example, a USB terminal.
- USB is an abbreviation for Universal Serial Bus.
- the external device is, for example, a recording medium such as a magnetic disk drive, a magneto-optical disk drive, or an optical disk drive.
- the cart unit 18 is a cart with casters for movement.
- An image processing device 11, a cable 12, and a drive unit 13 are installed in the cart body of the cart unit 18.
- a keyboard 14, a mouse 15, and a display 16 are installed on the top table of the cart unit 18.
- the probe 20 includes a drive shaft 21, a hub 22, a sheath 23, an outer tube 24, an ultrasonic transducer 25, and a relay connector 26.
- the drive shaft 21 passes through a sheath 23 inserted into the body cavity of a living body and an outer tube 24 connected to the proximal end of the sheath 23, and extends to the inside of the hub 22 provided at the proximal end of the probe 20.
- the drive shaft 21 has an ultrasonic transducer 25 at its tip that transmits and receives signals, and is rotatably provided within the sheath 23 and the outer tube 24 .
- Relay connector 26 connects sheath 23 and outer tube 24.
- the hub 22, the drive shaft 21, and the ultrasonic transducer 25 are connected to each other so that they each move forward and backward in the axial direction. Therefore, for example, when the hub 22 is pushed toward the distal end, the drive shaft 21 and the ultrasonic transducer 25 move inside the sheath 23 toward the distal end. For example, when the hub 22 is pulled toward the proximal end, the drive shaft 21 and the ultrasonic transducer 25 move inside the sheath 23 toward the proximal end, as shown by the arrows.
- the drive unit 13 includes a scanner unit 31, a slide unit 32, and a bottom cover 33.
- the scanner unit 31 is also called a pullback unit. Scanner unit 31 is connected to image processing device 11 via cable 12 .
- the scanner unit 31 includes a probe connection section 34 that connects to the probe 20 and a scanner motor 35 that is a drive source that rotates the drive shaft 21 .
- the probe connecting portion 34 is detachably connected to the probe 20 via the insertion port 36 of the hub 22 provided at the base end of the probe 20. Inside the hub 22, the base end of the drive shaft 21 is rotatably supported, and the rotational force of the scanner motor 35 is transmitted to the drive shaft 21. Further, signals are transmitted and received between the drive shaft 21 and the image processing device 11 via the cable 12.
- the image processing device 11 generates a tomographic image of a living body lumen and performs image processing based on signals transmitted from the drive shaft 21 .
- the slide unit 32 carries the scanner unit 31 so that it can move forward and backward, and is mechanically and electrically connected to the scanner unit 31.
- the slide unit 32 includes a probe clamp section 37, a slide motor 38, and a switch group 39.
- the probe clamp section 37 is disposed coaxially with the probe connection section 34 on the distal side thereof, and supports the probe 20 connected to the probe connection section 34 .
- the slide motor 38 is a drive source that generates axial driving force.
- the scanner unit 31 is moved forward and backward by the drive of the slide motor 38, and the drive shaft 21 is accordingly moved forward and backward in the axial direction.
- the slide motor 38 is, for example, a servo motor.
- the switch group 39 includes, for example, a forward switch and a pullback switch that are pressed when moving the scanner unit 31 forward or backward, and a scan switch that is pressed when starting and ending image depiction.
- the switch group 39 is not limited to this example, and various switches may be included in the switch group 39 as necessary.
- the bottom cover 33 covers the bottom surface of the slide unit 32 and the entire circumference of the side surface on the bottom side, and is movable toward and away from the bottom surface of the slide unit 32.
- the configuration of the image processing device 11 will be described with reference to FIG. 15.
- the image processing device 11 includes a control section 41, a storage section 42, a communication section 43, an input section 44, and an output section 45.
- the control unit 41 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof.
- the processor is a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for specific processing.
- CPU is an abbreviation for central processing unit.
- GPU is an abbreviation for graphics processing unit.
- the programmable circuit is, for example, an FPGA.
- FPGA is an abbreviation for field-programmable gate array.
- the dedicated circuit is, for example, an ASIC.
- ASIC is an abbreviation for application specific integrated circuit.
- the control unit 41 executes processing related to the operation of the image processing device 11 while controlling each part of the image processing system 10 including the image processing device 11.
- the storage unit 42 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
- the semiconductor memory is, for example, RAM or ROM.
- RAM is an abbreviation for random access memory.
- ROM is an abbreviation for read only memory.
- the RAM is, for example, SRAM or DRAM.
- SRAM is an abbreviation for static random access memory.
- DRAM is an abbreviation for dynamic random access memory.
- the ROM is, for example, an EEPROM.
- EEPROM is an abbreviation for electrically erasable programmable read only memory.
- the storage unit 42 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory.
- the storage unit 42 stores data used for the operation of the image processing device 11, such as the tomographic data 51, and data obtained by the operation of the image processing device 11, such as the three-dimensional data 52 and the three-dimensional image 53.
- the communication unit 43 includes at least one communication interface.
- the communication interface is, for example, a wired LAN interface, a wireless LAN interface, or an image diagnosis interface that receives and A/D converts IVUS signals.
- LAN is an abbreviation for local area network.
- A/D is an abbreviation for analog to digital.
- the communication unit 43 receives data used for the operation of the image processing device 11 and transmits data obtained by the operation of the image processing device 11.
- the drive unit 13 is connected to an image diagnosis interface included in the communication section 43.
- the input unit 44 includes at least one input interface.
- the input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- HDMI (registered trademark) is an abbreviation for High-Definition Multimedia Interface.
- the input unit 44 accepts user operations such as inputting data used for the operation of the image processing device 11 .
- the keyboard 14 and mouse 15 are connected to a USB interface included in the input unit 44 or to an interface compatible with near field communication. If a touch screen is provided integrally with the display 16, the display 16 may be connected to a USB interface or an HDMI (registered trademark) interface included in the input unit 44.
- the output unit 45 includes at least one output interface.
- the output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
- the output unit 45 outputs data obtained by the operation of the image processing device 11.
- the display 16 is connected to a USB interface or an HDMI (registered trademark) interface included in the output unit 45.
- the functions of the image processing device 11 are realized by executing the image processing program according to the present embodiment by a processor serving as the control unit 41. That is, the functions of the image processing device 11 are realized by software.
- the image processing program causes the computer to function as the image processing apparatus 11 by causing the computer to execute the operations of the image processing apparatus 11 . That is, the computer functions as the image processing device 11 by executing the operations of the image processing device 11 according to the image processing program.
- the program may be stored on a non-transitory computer-readable medium.
- the non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM.
- Distribution of the program is performed, for example, by selling, transferring, or lending a portable medium such as an SD card, DVD, or CD-ROM that stores the program.
- SD is an abbreviation for Secure Digital.
- DVD is an abbreviation for digital versatile disc.
- CD-ROM is an abbreviation for compact disc read only memory.
- the program may be distributed by storing the program in the storage of a server and transferring the program from the server to another computer.
- the program may be provided as a program product.
- a computer temporarily stores a program stored on a portable medium or a program transferred from a server in its main storage device. Then, the computer uses a processor to read a program stored in the main memory, and causes the processor to execute processing according to the read program.
- a computer may read a program directly from a portable medium and execute processing according to the program. The computer may sequentially execute processing according to the received program each time the program is transferred to the computer from the server. Processing may be performed using a so-called ASP type service that implements functions only by issuing execution instructions and obtaining results without transferring programs from the server to the computer. “ASP” is an abbreviation for application service provider.
- the program includes information that is used for processing by an electronic computer and is equivalent to a program. For example, data that is not a direct command to a computer but has the property of regulating computer processing falls under "something similar to a program.”
- a part or all of the functions of the image processing device 11 may be realized by a programmable circuit or a dedicated circuit as the control unit 41. That is, some or all of the functions of the image processing device 11 may be realized by hardware.
- the operation of the image processing system 10 according to this embodiment will be described with reference to FIGS. 17 and 18.
- the operation of the image processing system 10 corresponds to the image display method according to this embodiment.
- the probe 20 is primed by the user. Thereafter, the probe 20 is fitted into the probe connection part 34 and probe clamp part 37 of the drive unit 13, and is connected and fixed to the drive unit 13. The probe 20 is then inserted to a target site within the living tissue 60 such as a blood vessel or heart.
- In step S101, the scan switch included in the switch group 39 is pressed, and the pullback switch included in the switch group 39 is further pressed, thereby performing a so-called pullback operation.
- the probe 20 transmits ultrasonic waves inside the living tissue 60 by using the ultrasonic transducer 25 that retreats in the axial direction by a pullback operation.
- the ultrasonic transducer 25 transmits ultrasonic waves in a radial manner while moving inside the living tissue 60 .
- the ultrasonic transducer 25 receives reflected waves of the transmitted ultrasonic waves.
- the probe 20 inputs the signal of the reflected wave received by the ultrasound transducer 25 to the image processing device 11 .
- the control unit 41 of the image processing device 11 acquires tomographic data 51 including a plurality of cross-sectional images by processing the input signals and sequentially generating cross-sectional images of the biological tissue 60.
- the probe 20 rotates the ultrasonic transducer 25 in the circumferential direction and moves it in the axial direction inside the living tissue 60, and the ultrasonic transducer 25 transmits ultrasonic waves in a plurality of directions outward from the center of rotation.
- the probe 20 uses the ultrasonic transducer 25 to receive reflected waves from reflective objects existing in multiple directions inside the living tissue 60 .
- the probe 20 transmits the received reflected wave signal to the image processing device 11 via the drive unit 13 and the cable 12.
- the communication unit 43 of the image processing device 11 receives the signal transmitted from the probe 20.
- the communication unit 43 performs A/D conversion on the received signal.
- the communication section 43 inputs the A/D converted signal to the control section 41 .
- the control unit 41 processes the input signal and calculates the intensity value distribution of reflected waves from a reflecting object existing in the ultrasonic wave transmission direction of the ultrasonic transducer 25 .
- the control unit 41 acquires tomographic data 51, which is a data set of cross-sectional images, by sequentially generating two-dimensional images having a luminance value distribution corresponding to the calculated intensity value distribution as cross-sectional images of the biological tissue 60.
- the control unit 41 causes the storage unit 42 to store the acquired tomographic data 51.
- the reflected wave signal received by the ultrasound transducer 25 corresponds to the raw data of the tomographic data 51, and the cross-sectional images that the image processing device 11 generates by processing the reflected wave signal correspond to the processed data of the tomographic data 51.
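- The conversion from the intensity value distribution per transmission direction to a cross-sectional image with a corresponding luminance value distribution can be sketched as follows. The log compression and nearest-neighbour resampling are assumptions of this illustration, not a description of the device's actual signal processing.

```python
# Sketch: turning per-angle echo intensity profiles into a Cartesian cross-sectional image.
import numpy as np

def polar_to_cross_section(intensity, image_size=256):
    """`intensity` has shape (num_angles, num_samples): one echo line per direction.

    Returns an 8-bit Cartesian cross-sectional image centred on the transducer.
    """
    num_angles, num_samples = intensity.shape
    half = image_size / 2.0
    ys, xs = np.mgrid[0:image_size, 0:image_size]
    dx, dy = xs - half, ys - half
    radius = np.sqrt(dx ** 2 + dy ** 2) * (num_samples / half)    # sample index along the echo line
    angle = (np.arctan2(dy, dx) % (2 * np.pi)) * (num_angles / (2 * np.pi))
    r_idx = np.clip(radius.astype(int), 0, num_samples - 1)
    a_idx = np.clip(angle.astype(int), 0, num_angles - 1)
    image = intensity[a_idx, r_idx]
    image[radius >= num_samples] = 0                              # outside the scanned radius
    img = 20 * np.log10(image + 1.0)                              # simple log compression
    return (255 * img / img.max()).astype(np.uint8)

# Toy example: stronger echoes at a fixed depth mimic a vessel wall.
echo = np.random.rand(360, 512) * 10.0
echo[:, 300:320] += 200.0
cross_section = polar_to_cross_section(echo)
print(cross_section.shape, cross_section.dtype)   # (256, 256) uint8
```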
- the control unit 41 of the image processing device 11 may store the signal input from the probe 20 as it is in the storage unit 42 as the tomographic data 51.
- the control unit 41 may cause the storage unit 42 to store data indicating the intensity value distribution of reflected waves calculated by processing the signal input from the probe 20 as the tomographic data 51.
- the tomographic data 51 is not limited to a data set of cross-sectional images of the living tissue 60, but may be any data that represents the cross-section of the living tissue 60 at each movement position of the ultrasound transducer 25 in some format.
- an ultrasonic transducer that transmits ultrasonic waves in multiple directions without rotating may also be used.
- the tomographic data 51 may be acquired using OFDI or OCT instead of being acquired using IVUS.
- OFDI is an abbreviation for optical frequency domain imaging.
- OCT is an abbreviation for optical coherence tomography.
- When IVUS is used, an ultrasound sensor that acquires the tomographic data 51 by transmitting ultrasound in the lumen 63 of the living tissue 60 is used as the sensor that acquires the tomographic data 51 while moving through the lumen 63 of the living tissue 60.
- When OFDI or OCT is used, a sensor that emits light in the lumen 63 of the living tissue 60 is used to acquire the tomographic data 51.
- Instead of the image processing device 11 generating a data set of cross-sectional images of the living tissue 60, another device may generate a similar data set, and the image processing device 11 may acquire the data set from that other device. That is, instead of the control unit 41 of the image processing device 11 processing the IVUS signal to generate a cross-sectional image of the biological tissue 60, another device may process the IVUS signal to generate a cross-sectional image of the biological tissue 60 and input the generated cross-sectional image to the image processing device 11.
- In step S102, the control unit 41 of the image processing device 11 generates three-dimensional data 52 of the biological tissue 60 based on the tomographic data 51 acquired in step S101. That is, the control unit 41 generates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor.
- When the tomographic data 51 is updated, it is preferable that the control unit 41, instead of regenerating all of the three-dimensional data 52 from scratch, updates only the data at the location to which the updated tomographic data 51 corresponds. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S103 can be improved.
- Specifically, the control unit 41 of the image processing device 11 generates the three-dimensional data 52 of the biological tissue 60 by stacking the cross-sectional images of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42 and converting them into three dimensions. As the method of conversion, any one of various processing methods may be used, such as rendering methods including surface rendering or volume rendering, together with accompanying texture mapping including environment mapping, and bump mapping.
- the control unit 41 causes the storage unit 42 to store the generated three-dimensional data 52.
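- A minimal sketch of stacking cross-sectional images into the three-dimensional data 52, including the partial update mentioned above, is shown below; the equal slice spacing and the array layout are assumptions of this illustration.

```python
# Sketch: stack 2D cross-sectional images along the catheter axis into a voxel volume,
# and update only the slice to which newly acquired tomographic data corresponds.
import numpy as np

def build_volume(cross_sectional_images):
    """Stack 2D slices along the catheter axis (Z) into a 3D voxel array."""
    return np.stack(cross_sectional_images, axis=0)      # shape: (num_slices, H, W)

def update_volume(volume, slice_index, new_image):
    """Replace a single slice instead of regenerating the whole volume."""
    volume[slice_index] = new_image
    return volume

slices = [np.zeros((64, 64), dtype=np.uint8) for _ in range(128)]
volume = build_volume(slices)                             # three-dimensional data 52
volume = update_volume(volume, 5, np.ones((64, 64), dtype=np.uint8))
print(volume.shape)                                       # (128, 64, 64)
```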
- the tomographic data 51 includes data on the long medical instrument in the same way as the data on the living tissue 60. Therefore, in step S102, the three-dimensional data 52 generated by the control unit 41 also includes data on the long medical instrument, similar to the data on the living tissue 60.
- the control unit 41 of the image processing device 11 classifies the pixel groups of the cross-sectional images included in the tomographic data 51 acquired in step S101 into two or more classes.
- these two or more classes include at least a "tissue" class to which the biological tissue 60 belongs, and may further include a "medical device" class to which long medical instruments belong, a "blood cell" class, a class of "indwelling objects" such as indwelling stents, or a class of "lesions" such as calcification or plaque.
- a method of classifying pixel groups of a cross-sectional image using a trained model is used.
- the trained model is trained to detect regions corresponding to each class from sample IVUS cross-sectional images by performing machine learning in advance.
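- The per-pixel classification can be pictured with the sketch below. The `model` callable and the class list are hypothetical stand-ins; the text above only states that a trained model detects the class regions, without specifying the model.

```python
# Sketch: per-pixel classification of an IVUS cross-sectional image into classes.
import numpy as np

CLASSES = ["tissue", "medical_device", "blood_cell", "indwelling", "lesion"]

def classify_pixels(cross_sectional_image, model):
    """Return a label map (H, W) with one class index per pixel."""
    scores = model(cross_sectional_image)        # assumed shape: (num_classes, H, W)
    return np.argmax(scores, axis=0)

def tissue_mask(label_map):
    """Boolean mask of the pixels that belong to the biological tissue 60."""
    return label_map == CLASSES.index("tissue")

# Usage with a dummy model that scores every pixel randomly.
dummy_model = lambda img: np.random.rand(len(CLASSES), *img.shape)
image = np.zeros((512, 512), dtype=np.uint8)
labels = classify_pixels(image, dummy_model)
print(tissue_mask(labels).sum(), "pixels classified as tissue")
```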
- In step S103, the control unit 41 of the image processing device 11 displays the three-dimensional data 52 generated in step S102 on the display 16 as the three-dimensional image 53.
- the control unit 41 may set the angle at which the three-dimensional image 53 is displayed to an arbitrary angle.
- the control unit 41 causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S101 together with the three-dimensional image 53.
- the control unit 41 of the image processing device 11 generates a three-dimensional image 53 from the three-dimensional data 52 stored in the storage unit 42.
- the three-dimensional image 53 includes a group of three-dimensional objects such as a three-dimensional object representing the biological tissue 60 and a three-dimensional object representing a long medical instrument. That is, the control unit 41 generates the three-dimensional object of the biological tissue 60 from the data of the biological tissue 60 stored in the storage unit 42, and generates the three-dimensional object of the long medical instrument from the data of the long medical instrument stored in the storage unit 42.
- The control unit 41 causes the display 16 to display, via the output unit 45, the latest cross-sectional image of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42 together with the generated three-dimensional image 53.
- In step S104, if the user performs a change operation to set the angle at which the three-dimensional image 53 is displayed, the process of step S105 is executed. If there is no change operation by the user, the process of step S106 is executed.
- In step S105, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation to set the angle at which the three-dimensional image 53 is displayed.
- the control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed to the set angle.
- Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional image 53 at the angle set in step S105.
- Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation by the user to rotate the three-dimensional image 53 displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
- the control unit 41 interactively adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 in accordance with a user's operation.
- Alternatively, the control unit 41 may receive, via the input unit 44, an operation by the user to input a numerical value of the angle at which the three-dimensional image 53 is to be displayed, using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
- the control unit 41 adjusts the angle at which the three-dimensional image 53 is displayed on the display 16 in accordance with the input numerical value.
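- As an illustration of adjusting the display angle, the following is a minimal sketch that rotates the viewpoint around a vertical axis passing through the object by the entered angle; the choice of rotation axis and the names are assumptions.

```python
# Minimal sketch (assumed rotation axis and names): rotating the viewpoint V0 around
# the z axis of the three-dimensional space by the angle entered by the user, so that
# the three-dimensional image 53 is displayed at the set angle.
import numpy as np

def rotate_viewpoint(viewpoint, angle_deg, center):
    """Rotate the viewpoint around a vertical (z) axis passing through `center`."""
    t = np.radians(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t), 0.0],
                    [np.sin(t),  np.cos(t), 0.0],
                    [0.0,        0.0,       1.0]])
    return np.asarray(center) + rot @ (np.asarray(viewpoint) - np.asarray(center))
```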
- In step S106, if the tomographic data 51 has been updated, the processes of step S107 and step S108 are executed. If the tomographic data 51 has not been updated, the presence or absence of the user's change operation is checked again in step S104.
- In step S107, similarly to the process in step S101, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51 including at least one new cross-sectional image.
- In step S108, the control unit 41 of the image processing device 11 updates the three-dimensional data 52 of the biological tissue 60 based on the tomographic data 51 acquired in step S107. That is, the control unit 41 updates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor. Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional data 52 updated in step S108 as the three-dimensional image 53. The control unit 41 also causes the display 16 to display the latest cross-sectional image included in the tomographic data 51 acquired in step S107 together with the three-dimensional image 53. In step S108, it is preferable to update only the data corresponding to the updated tomographic data 51. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S103 can be improved.
- In step S111, if the user performs a setting operation to set the cutting area 62, the process of step S112 is executed.
- In step S112, the control unit 41 of the image processing device 11 receives an operation to set the cutting area 62 via the input unit 44.
- Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation to set an area 65 corresponding to the cutting area 62 on the cross-sectional image displayed on the display 16 in step S103.
- For example, the control unit 41 receives, as the operation for setting the area 65 corresponding to the cutting area 62, an operation for setting two straight lines L1 and L2 extending from one point M in the cross-sectional image.
- Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation in which the user specifies the base angle and the opening angle on the operation panel 81, as shown in FIGS. 2 to 4, 6 to 8, and 10 to 12, using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, as the operation for setting the two straight lines L1 and L2, the control unit 41 receives an operation for specifying the direction of one of the two straight lines L1 and L2 and the angle formed by the two straight lines L1 and L2.
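- As an illustration of how the two straight lines can be derived from these two values, the following is a minimal sketch that interprets the base angle as the direction of L1 and the opening angle as the angle between L1 and L2; the angle conventions and names are assumptions.

```python
# Minimal sketch (assumed angle conventions): deriving unit direction vectors of the
# two straight lines L1 and L2 in the cross-sectional plane from the base angle and
# the opening angle specified on the operation panel 81.
import numpy as np

def lines_from_angles(base_angle_deg, opening_angle_deg):
    """Return unit direction vectors of L1 and L2 extending from the point M."""
    a1 = np.radians(base_angle_deg)
    a2 = np.radians(base_angle_deg + opening_angle_deg)
    l1 = np.array([np.cos(a1), np.sin(a1)])
    l2 = np.array([np.cos(a2), np.sin(a2)])
    return l1, l2
```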
- Here, it is assumed that the check box 85 on the operation panel 81 is checked, that is, that the use of the center of gravity is selected.
- Alternatively, the control unit 41 of the image processing device 11 may receive, via the input unit 44, an operation in which the user draws the two straight lines L1 and L2 on the cross-sectional image displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16. That is, the control unit 41 may receive, as the operation for setting the two straight lines L1 and L2, an operation to draw the two straight lines L1 and L2 on the cross-sectional image.
- In step S113, the control unit 41 of the image processing device 11 uses the latest three-dimensional data 52 stored in the storage unit 42 to calculate the positions of the centers of gravity of a plurality of transverse cross sections of the lumen 63 of the biological tissue 60.
- Here, the latest three-dimensional data 52 refers to the three-dimensional data 52 generated in step S102 if the process of step S108 has not been executed, and refers to the three-dimensional data 52 updated in step S108 if the process of step S108 has been executed.
- Specifically, the process in step S113 can be executed using a procedure similar to that disclosed in International Publication No. 2021/200294.
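- As an illustration of one way to obtain such center-of-gravity positions (not the procedure of International Publication No. 2021/200294), the following minimal sketch assumes that a binary lumen mask is available for each cross section and takes the mean of the lumen pixel coordinates.

```python
# Minimal sketch (assumed binary lumen masks): the center of gravity of the lumen 63
# in each transverse cross section is computed as the mean of the lumen pixel
# coordinates of that slice.
import numpy as np

def lumen_centroids(lumen_masks):
    """lumen_masks: (N, H, W) boolean array, True where a slice shows the lumen.

    Returns an (N, 2) array of (row, column) centroids; NaN for empty slices.
    """
    centroids = np.full((lumen_masks.shape[0], 2), np.nan)
    for i, mask in enumerate(lumen_masks):
        ys, xs = np.nonzero(mask)
        if ys.size:
            centroids[i] = (ys.mean(), xs.mean())
    return centroids
```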
- In step S114, the control unit 41 of the image processing device 11 performs smoothing on the calculation results of the center-of-gravity positions obtained in step S113. Specifically, the process in step S114 can be executed using a procedure similar to that disclosed in International Publication No. 2021/200294.
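- As an illustration of such smoothing (again not the procedure of International Publication No. 2021/200294), the following minimal sketch applies a simple moving average along the catheter axis; the window size is an assumption.

```python
# Minimal sketch (simple moving average): smoothing the per-slice centroid positions
# so that the line Lb set later does not zigzag between neighbouring cross sections.
# Edge slices are averaged with zero padding; a real implementation would treat the
# boundaries more carefully.
import numpy as np

def smooth_centroids(centroids, window=5):
    """Moving average over an (N, 2) array of per-slice centroid positions."""
    kernel = np.ones(window) / window
    return np.column_stack([
        np.convolve(centroids[:, k], kernel, mode="same") for k in range(2)
    ])
```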
- In step S115, the control unit 41 of the image processing device 11 sets, as cutting planes P1 and P2, two planes that intersect along a line Lb passing through the center-of-gravity positions calculated in step S113, as shown in FIG.
- Here, the control unit 41 sets the cutting planes P1 and P2 after performing smoothing on the calculation results of the center-of-gravity positions in step S114, but the process in step S114 may be omitted.
- Specifically, the control unit 41 of the image processing device 11 sets, as the line Lb, the curve of center-of-gravity positions obtained as a result of the smoothing in step S114.
- the control unit 41 sets two planes that intersect at the set line Lb and include the two straight lines L1 and L2 set in step S112, respectively, as cutting planes P1 and P2.
- Then, the control unit 41 identifies, in the latest three-dimensional data 52 stored in the storage unit 42, the three-dimensional coordinates at which the biological tissue 60 intersects the cutting planes P1 and P2 as the three-dimensional coordinates of the edge of the opening that exposes the lumen 63 of the biological tissue 60 in the three-dimensional image 53.
- the control unit 41 causes the storage unit 42 to store the specified three-dimensional coordinates.
- In step S116, the control unit 41 of the image processing device 11 forms, in the three-dimensional data 52, the cutting area 62 as the region that is sandwiched between the cutting planes P1 and P2 in the three-dimensional image 53 and that exposes the lumen 63 of the biological tissue 60.
- Specifically, the control unit 41 of the image processing device 11 sets the portion of the latest three-dimensional data 52 stored in the storage unit 42 that is specified by the three-dimensional coordinates stored in the storage unit 42 to be hidden or transparent when displayed on the display 16 as the three-dimensional image 53. That is, the control unit 41 forms the cutting area 62 in accordance with the area 65 set in step S112.
- In step S117, the control unit 41 of the image processing device 11 causes the display 16 to display the three-dimensional data 52 in which the cutting area 62 was formed in step S116 as the three-dimensional image 53.
- The control unit 41 also causes the display 16 to display, together with the three-dimensional image 53, a two-dimensional image 58 representing the cross section 64, which is indicated by the tomographic data 51 newly acquired by the sensor and is represented by the cross-sectional image displayed on the display 16 in step S103, and the area 65 corresponding to the cutting area 62 in the cross section 64.
- Specifically, the control unit 41 of the image processing device 11 processes the latest cross-sectional image of the biological tissue 60 included in the tomographic data 51 stored in the storage unit 42 to generate a two-dimensional image 58 as shown in FIGS. 2 to 4, 6 to 8, and 10 to 12.
- The control unit 41 also generates a three-dimensional image 53, as shown in FIGS. 2 to 4, 6 to 8, and 10 to 12, in which the portion specified by the three-dimensional coordinates stored in the storage unit 42 is hidden or transparent.
- the control unit 41 displays the generated two-dimensional image 58 and three-dimensional image 53 on the display 16 via the output unit 45.
- As the two-dimensional image 58 corresponding to the cutting area 62, the control unit 41 of the image processing device 11 generates an image in which the color of the area 65 differs from that of the remaining regions, as shown in FIGS. 2 to 4, 6 to 8, and 10 to 12. For example, it is conceivable to change the parts that are white in a general IVUS image to red within the area 65.
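- As an illustration of such recoloring, the following minimal sketch converts the grayscale cross-sectional image to RGB and repaints the bright pixels inside the area 65 in red; the mask, the threshold, and the names are assumptions.

```python
# Minimal sketch (assumed mask and threshold): generating a two-dimensional image 58
# in which the parts that would appear white in a normal IVUS image are shown in red
# inside the area 65 corresponding to the cutting area 62.
import numpy as np

def colorize_cut_area(gray_image, area_mask, white_threshold=200):
    """gray_image: (H, W) uint8 IVUS image; area_mask: (H, W) bool marking the area 65."""
    rgb = np.stack([gray_image] * 3, axis=-1)       # grayscale -> RGB
    target = area_mask & (gray_image >= white_threshold)
    rgb[target] = (255, 0, 0)                       # repaint bright pixels red
    return rgb
```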
- In step S118, if the user performs a change operation to set the cutting area 62, the process of step S119 is executed. If there is no change operation by the user, the process of step S120 is executed.
- In step S119, the control unit 41 of the image processing device 11 receives an operation to set the cutting area 62 via the input unit 44, similarly to the process in step S112. Then, the processing from step S115 onwards is executed.
- In step S120, if the tomographic data 51 has been updated, the processes of step S121 and step S122 are executed. If the tomographic data 51 has not been updated, the presence or absence of the user's change operation is checked again in step S118.
- In step S121, similarly to the process in step S101 or step S107, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51 including at least one new cross-sectional image.
- In step S122, the control unit 41 of the image processing device 11 updates the three-dimensional data 52 of the biological tissue 60 based on the tomographic data 51 acquired in step S121. After that, the processing from step S113 onwards is executed. In step S122, it is preferable to update only the data corresponding to the updated tomographic data 51. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the data processing after step S113 can be improved.
- The flow in FIG. 19 is performed in a state in which, in step S103 or step S117, the control unit 41 of the image processing device 11 has rendered the object 54 based on the positional relationship between the viewpoint V0 set in the virtual three-dimensional space and the object 54 of the biological tissue 60 arranged in the three-dimensional space, and the object 54 is already displayed on the screen 80 as the three-dimensional image 53 of the biological tissue 60.
- In step S201, when a position specifying operation is performed, the control unit 41 of the image processing device 11 receives the position specifying operation via the input unit 44.
- The position specifying operation is an operation to specify two positions on the screen 80.
- The position specifying operation includes, as the first operation, an operation of pressing a button of the mouse 15, and includes, as the second operation, an operation of releasing the button of the mouse 15 that is performed after the first operation, following a drag operation of moving the pointer 86 while holding down the button of the mouse 15.
- the first operation may be an operation of pressing a button on the mouse 15 while holding down a first key such as the Ctrl key or the Shift key on the keyboard 14.
- the second operation may be an operation of releasing a button on the mouse 15 while pressing a second key such as the Ctrl key or the Shift key on the keyboard 14.
- In step S202, in response to the position specifying operation performed in step S201, the control unit 41 of the image processing device 11 identifies a first corresponding point Q1 and a second corresponding point Q2 on the plane 55 corresponding to the screen 80 in the three-dimensional space.
- The first corresponding point Q1 is a point on the plane 55 that corresponds to one of the two positions specified by the position specifying operation.
- The second corresponding point Q2 is a point on the plane 55 that corresponds to the other of the two positions specified by the position specifying operation.
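- As an illustration of how a specified screen position can be mapped onto the plane 55, the following minimal sketch assumes a simple pinhole-camera convention in which the plane 55 is placed at a distance dq in front of the viewpoint V0; the camera basis vectors, the viewport convention, and the names are assumptions.

```python
# Minimal sketch (assumed pinhole camera and viewport convention): mapping a screen
# position to the corresponding point on the plane 55 placed at distance dq in front
# of the viewpoint V0 in the three-dimensional space.
import numpy as np

def corresponding_point(screen_xy, screen_size, viewpoint, right, up, forward,
                        plane_distance, plane_height):
    """Return the 3-D point on the plane 55 corresponding to a screen position."""
    sx, sy = screen_xy
    width, height = screen_size
    aspect = width / height
    ndc_x = 2.0 * sx / width - 1.0       # normalized device coordinates in [-1, 1]
    ndc_y = 1.0 - 2.0 * sy / height      # screen y grows downward, so flip it
    half_h = plane_height / 2.0
    offset = ndc_x * half_h * aspect * np.asarray(right) + ndc_y * half_h * np.asarray(up)
    return np.asarray(viewpoint) + plane_distance * np.asarray(forward) + offset
```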
- The control unit 41 then calculates the distance between a first intersection R1 and a second intersection R2.
- the first intersection R1 is the intersection of the object 54 and an extension of the straight line connecting the viewpoint V0 and the first corresponding point Q1 in the three-dimensional space.
- the second intersection R2 is the intersection of the object 54 and an extension of the straight line connecting the viewpoint V0 and the second corresponding point Q2 in the three-dimensional space.
- the control unit 41 specifies the three-dimensional coordinates (xq1, yq1, dq) corresponding to the position specified by the first operation as the coordinates of the first corresponding point Q1.
- the control unit 41 specifies the three-dimensional coordinates (xq2, yq2, dq) corresponding to the position specified by the second operation as the coordinates of the second corresponding point Q2.
- The control unit 41 identifies, as the coordinates of the first intersection R1, the three-dimensional coordinates (xr1, yr1, dr1) of the point at which the extension of the straight line connecting the viewpoint V0 and the first corresponding point Q1 intersects the object 54.
- The control unit 41 identifies, as the coordinates of the second intersection R2, the three-dimensional coordinates (xr2, yr2, dr2) of the point at which the extension of the straight line connecting the viewpoint V0 and the second corresponding point Q2 intersects the object 54.
- The control unit 41 calculates the Euclidean distance √((xr2 − xr1)² + (yr2 − yr1)² + (dr2 − dr1)²) between the coordinates (xr1, yr1, dr1) of the first intersection R1 and the coordinates (xr2, yr2, dr2) of the second intersection R2.
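- As an illustration of the intersection search and distance calculation in step S202, the following is a minimal sketch (not the disclosed implementation) that represents the object 54 as a triangle mesh, casts a ray from the viewpoint V0 through each corresponding point using the Möller–Trumbore test, keeps the nearest hit as the intersection, and returns the Euclidean distance between the two hits; the mesh representation and all names are assumptions.

```python
# Minimal sketch (assumed triangle-mesh representation of the object 54): casting rays
# from the viewpoint V0 through the corresponding points Q1 and Q2, taking the nearest
# intersection of each ray with the mesh as R1 and R2, and computing their distance.
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore test: return the ray parameter t of the hit, or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:
        return None                       # ray parallel to the triangle
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return t if t > eps else None         # hit must lie in front of the viewpoint

def nearest_hit(viewpoint, corresponding_point, vertices, faces):
    """Nearest intersection of the ray V0 -> Q with the mesh, or None if it misses."""
    origin = np.asarray(viewpoint, dtype=float)
    direction = np.asarray(corresponding_point, dtype=float) - origin
    direction /= np.linalg.norm(direction)
    hits = [t for f in faces
            if (t := ray_triangle(origin, direction, *vertices[f])) is not None]
    return origin + min(hits) * direction if hits else None

def measure_distance(viewpoint, q1, q2, vertices, faces):
    r1 = nearest_hit(viewpoint, q1, vertices, faces)   # first intersection R1
    r2 = nearest_hit(viewpoint, q2, vertices, faces)   # second intersection R2
    if r1 is None or r2 is None:
        return None
    return float(np.linalg.norm(r2 - r1))              # Euclidean distance
```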
- In step S203, the control unit 41 of the image processing device 11 outputs the calculation result obtained in step S202. Specifically, the control unit 41 displays the numerical value representing the Euclidean distance calculated in step S202 on the screen 80, as shown in FIG.
- In step S204, the control unit 41 of the image processing device 11 displays marks 87 and 88 at the first corresponding position and the second corresponding position on the screen 80, respectively.
- the first corresponding position is a position on the screen 80 that corresponds to the intersection of the plane 55 and a straight line connecting the viewpoint V0 and the first intersection R1 in the three-dimensional space.
- the second corresponding position is a position on the screen 80 that corresponds to the intersection of the plane 55 and a straight line connecting the viewpoint V0 and the second intersection R2 in the three-dimensional space.
- the control unit 41 displays marks 87 and 88 at the positions designated by the first operation and the second operation, respectively. For example, if the position of the viewpoint V0 is changed as a result of a subsequent operation accepted in step S105, step S112, or step S119, the control unit 41 also changes the positions of the marks 87 and 88.
- The flow in FIG. 19 may be repeated any number of times. For example, when N is an integer greater than or equal to 2, 2N marks may be displayed on the screen 80 as a result of the position specifying operation being performed N times.
- the flow in FIG. 20 is executed after the flow in FIG. 19 is executed at least once.
- In step S211, when a range specifying operation is performed, the control unit 41 of the image processing device 11 receives the range specifying operation via the input unit 44.
- The range specifying operation is an operation for specifying a range 89 on the screen 80.
- The range specifying operation includes, as the third operation, an operation of pressing a button of the mouse 15, and includes, as the fourth operation, an operation of releasing the button of the mouse 15 that is performed after the third operation, following a drag operation of moving the pointer 86 while holding down the button of the mouse 15.
- the third operation may be an operation of pressing a button on the mouse 15 while holding down a third key such as the Ctrl key or the Shift key on the keyboard 14.
- the fourth operation may be an operation of releasing a button on the mouse 15 while pressing a fourth key such as the Ctrl key or the Shift key on the keyboard 14.
- In step S212, in response to the range specifying operation performed in step S211, the control unit 41 of the image processing device 11 identifies the corresponding range 56 on the plane 55 corresponding to the screen 80 in the three-dimensional space.
- The corresponding range 56 is a range on the plane 55 that corresponds to the range 89 specified by the range specifying operation.
- Among the marks displayed in step S204, the control unit 41 changes the appearance of any mark displayed on the screen 80 at a position corresponding to an intersection existing in the three-dimensional area 57, as shown in FIG. 9.
- the three-dimensional area 57 is an area that extends in a conical shape from the viewpoint V0 through the outer edge of the corresponding range 56 in the three-dimensional space.
- Specifically, the control unit 41 identifies, as the corresponding range 56, the two-dimensional range on the plane 55 that corresponds to a regular range, such as a rectangular range or a circular range, extending from the position specified by the third operation to the position specified by the fourth operation.
- The control unit 41 then identifies, as the three-dimensional area 57, the area extending in a conical shape from the coordinates (xv, yv, dv) of the viewpoint V0 through the outer edge of the corresponding range 56.
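- As an illustration of testing whether an intersection lies inside the three-dimensional area 57, the following minimal sketch assumes a circular corresponding range 56: a point is inside the conical area exactly when the ray from the viewpoint V0 to that point passes through the corresponding range 56 on the plane 55, so the point is projected back onto the plane and checked against the range; the range shape and names are assumptions.

```python
# Minimal sketch (assumed circular corresponding range 56): a point is inside the
# conical three-dimensional area 57 when the ray from the viewpoint V0 to the point
# crosses the plane 55 inside the corresponding range 56.
import numpy as np

def in_selection_cone(point, viewpoint, plane_point, plane_normal,
                      range_center, range_radius):
    """True if `point` lies inside the cone spanned by V0 and the circular range."""
    direction = np.asarray(point) - np.asarray(viewpoint)
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return False                       # ray parallel to the plane 55
    t = np.dot(np.asarray(plane_point) - np.asarray(viewpoint), plane_normal) / denom
    if t <= 0:
        return False                       # plane 55 lies behind the viewpoint
    hit_on_plane = np.asarray(viewpoint) + t * direction
    return np.linalg.norm(hit_on_plane - np.asarray(range_center)) <= range_radius
```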
- the control unit 41 changes the color of the mark displayed on the screen 80 at a position corresponding to an intersection located within the three-dimensional area 57 among the intersections identified in step S202.
- For example, the control unit 41 changes the colors of the marks 87 and 88 linked to the first intersection R1 and the second intersection R2, respectively, as shown in FIG. 8 or FIG.
- That is, when one or more intersections are located within the three-dimensional area 57, the control unit 41 may change the color of the one or more marks associated with those intersections.
- Image processing system 11 Image processing device 12 Cable 13 Drive unit 14 Keyboard 15 Mouse 16 Display 17 Connection terminal 18 Cart unit 20 Probe 21 Drive shaft 22 Hub 23 Sheath 24 Outer tube 25 Ultrasonic transducer 26 Relay connector 31 Scanner unit 32 Slide unit 33 Bottom cover 34 Probe connection section 35 Scanner motor 36 Inlet 37 Probe clamp section 38 Slide motor 39 Switch group 41 Control unit 42 Storage unit 43 Communication unit 44 Input unit 45 Output unit 51 Tomographic data 52 Three-dimensional data 53 Three-dimensional image 54 Object 55 Plane 56 Corresponding range 57 Three-dimensional area 58, 58a, 58b Two-dimensional image 60 Biological tissue 61 Inner surface 62 Cutting area 63 Lumen 64 Cross section 65, 65a, 65b Area 66 Fossa ovalis 71 Camera 80 Screen 81 Operation panel 82 Check box 83 Slider 84 Slider 85 Check box 86 Pointer 87, 88 Mark 89 Range
Abstract
This image processing device, which renders an object of biological tissue based on the positional relationship between a viewpoint set in a virtual three-dimensional space and the object arranged in the three-dimensional space and displays the object on a screen as a three-dimensional image of the biological tissue, comprises a control unit that: in response to a position specifying operation for specifying two positions on the screen, identifies a first corresponding point corresponding to one of the two specified positions and a second corresponding point corresponding to the other on a plane corresponding to the screen in the three-dimensional space; calculates the distance between a first intersection, namely the intersection at which an extension of the straight line connecting the viewpoint and the first corresponding point crosses the object, and a second intersection, namely the intersection at which an extension of the straight line connecting the viewpoint and the second corresponding point crosses the object; and outputs the obtained calculation result.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022041892A JP2023136332A (ja) | 2022-03-16 | 2022-03-16 | 画像処理装置、画像処理システム、画像表示方法、及び画像処理プログラム |
JP2022-041892 | 2022-03-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023176741A1 true WO2023176741A1 (fr) | 2023-09-21 |
Family
ID=88023688
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/009449 WO2023176741A1 (fr) | 2022-03-16 | 2023-03-10 | Dispositif de traitement d'image, système de traitement d'image, méthode d'affichage d'image et programme de traitement d'image |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2023136332A (fr) |
WO (1) | WO2023176741A1 (fr) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10201755A (ja) * | 1997-01-24 | 1998-08-04 | Hitachi Medical Corp | 擬似三次元画像における三次元量計測方法及びその装置 |
JP2000105838A (ja) * | 1998-09-29 | 2000-04-11 | Toshiba Corp | 画像表示方法及び画像処理装置 |
JP2002063564A (ja) * | 2000-08-17 | 2002-02-28 | Aloka Co Ltd | 画像処理装置及び記憶媒体 |
Also Published As
Publication number | Publication date |
---|---|
JP2023136332A (ja) | 2023-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7300352B2 (ja) | 診断支援装置、診断支援システム、及び診断支援方法 | |
US20220218309A1 (en) | Diagnostic assistance device, diagnostic assistance system, and diagnostic assistance method | |
WO2023176741A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, méthode d'affichage d'image et programme de traitement d'image | |
JP2020140716A (ja) | 体腔のマップ | |
JP5498090B2 (ja) | 画像処理装置及び超音波診断装置 | |
WO2020217860A1 (fr) | Dispositif d'aide au diagnostic et méthode d'aide au diagnostic | |
WO2023054001A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2023013601A1 (fr) | Dispositif de traitement d'images, système de traitement d'images, procédé de traitement d'images et programme de traitement d'images | |
WO2022202200A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
CN114502079B (zh) | 诊断支援装置、诊断支援系统及诊断支援方法 | |
WO2022202203A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2022202202A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2021065746A1 (fr) | Dispositif d'aide au diagnostic, système d'aide au diagnostic et procédé d'aide au diagnostic | |
WO2022202201A1 (fr) | Dispositif de traitement d'images, système de traitement d'images, procédé d'affichage d'image et programme de traitement d'images | |
WO2022071251A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2022071250A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2024071054A1 (fr) | Dispositif de traitement d'image, système d'affichage d'image, méthode d'affichage d'image et programme de traitement d'image | |
JP2023024072A (ja) | 画像処理装置、画像処理システム、画像表示方法、及び画像処理プログラム | |
WO2021200294A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2021200296A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2022085373A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
WO2021200295A1 (fr) | Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image | |
JP2024051695A (ja) | 画像処理装置、画像表示システム、画像処理方法、及び画像処理プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23770696 Country of ref document: EP Kind code of ref document: A1 |