WO2022202200A1 - Image processing device, image processing system, image display method, and image processing program

Image processing device, image processing system, image display method, and image processing program

Info

Publication number
WO2022202200A1
Authority
WO
WIPO (PCT)
Prior art keywords
tissue, image, data, dimensional, image processing
Application number
PCT/JP2022/009239
Other languages
English (en)
Japanese (ja)
Inventor
泰一 坂本
克彦 清水
弘之 石原
俊祐 吉澤
クレモン ジャケ
ステフェン チェン
トマ エン
亮介 佐賀
Original Assignee
テルモ株式会社
株式会社ロッケン
Application filed by テルモ株式会社 and 株式会社ロッケン
Priority to JP2023508893A (JPWO2022202200A1)
Publication of WO2022202200A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters

Description

  • the present disclosure relates to an image processing device, an image processing system, an image display method, and an image processing program.
  • Patent Documents 1 to 3 describe techniques for generating three-dimensional images of heart chambers or blood vessels using a US imaging system.
  • US is an abbreviation for ultrasound.
  • IVUS is an abbreviation for intravascular ultrasound.
  • IVUS is a device or method that provides two-dimensional images in a plane perpendicular to the longitudinal axis of the catheter.
  • It is conceivable to automatically generate, from two-dimensional IVUS images, a three-dimensional image that expresses the structure of a living tissue such as a heart chamber or a blood vessel, and to display the generated three-dimensional image to the operator. If the generated three-dimensional image is displayed as it is, the operator can only see the outer wall of the tissue. Therefore, it is conceivable to cut out part of the structure of the living tissue in the three-dimensional image so that the lumen can be seen. If a catheter other than the IVUS catheter, such as an ablation catheter or a catheter for atrial septal puncture, is inserted into the living tissue, it is conceivable to further display a three-dimensional image representing that catheter.
  • At present, a 3D mapping system, in which a position sensor is mounted on a catheter and a three-dimensional image is drawn from the position information obtained when the position sensor touches the myocardial tissue, is mainly used in the procedure. This is very time-consuming because the catheter must be brought into full contact with the myocardial tissue surface in the heart chamber. Circumferential isolation of the PV or SVC requires marking the sites to be ablated, and if IVUS can be used to complete such an operation, the procedure time may be reduced.
  • PV is an abbreviation for pulmonary vein.
  • SVC is an abbreviation for superior vena cava. It is conceivable to mark at least one location of the living tissue, such as a cauterized location, and to display a three-dimensional image that expresses the mark.
  • An object of the present disclosure is to make it possible to confirm the position of an object located in the lumen of a living tissue, or of a mark associated with the living tissue, even when the object or the mark is behind or within a portion of the living tissue.
  • An image processing device as one aspect of the present disclosure is an image processing device that displays, on a display as a three-dimensional image, three-dimensional data including first data representing a living tissue and second data representing an object located in a lumen of the living tissue or third data representing a mark associated with the living tissue. The image processing device comprises a control unit that refers to the three-dimensional data to identify the positional relationship between the living tissue and the object or the mark, determines, according to the identified positional relationship, whether there is an intervening tissue, which is a portion of the living tissue interposed between the object or the mark and a viewpoint set when the three-dimensional image is displayed, and, when it is determined that the intervening tissue exists, performs control to display an image representing the object or the mark at a position corresponding to the intervening tissue in the three-dimensional image.
  • As one embodiment, the control unit performs control to display an image representing the object or the mark on the surface of the intervening tissue in the three-dimensional image when it is determined that the intervening tissue exists.
  • As one embodiment, the control unit applies, as the image representing the object or the mark, a texture different from the texture applied to the surface of the portion of the living tissue adjacent to the intervening tissue in the three-dimensional image to the surface of the intervening tissue.
  • As one embodiment, the control unit performs control to display an image representing the object or the mark on the cross section of the intervening tissue in the three-dimensional image when it is determined that the intervening tissue exists.
  • As one embodiment, the control unit applies, as the image representing the object or the mark, a texture different from the texture applied to the cross section of the portion of the living tissue adjacent to the intervening tissue in the three-dimensional image to the cross section of the intervening tissue.
  • As one embodiment, when the control unit determines that the intervening tissue exists, it performs control to make the intervening tissue transparent in the three-dimensional image and display an image representing the object or the mark through the intervening tissue.
  • As one embodiment, the control unit arranges, as the image representing the object or the mark, a three-dimensional image representing the object or the mark on the side of the intervening tissue opposite to the viewpoint in the three-dimensional image.
  • As one embodiment, the three-dimensional data includes, as the first data and the second data, data constructed based on data obtained by a sensor that is inserted into the lumen of the living tissue and observes the living tissue and the object, and the object is a catheter inserted into the lumen of the living tissue.
  • As one embodiment, the three-dimensional data is constructed based on data obtained by a sensor that is inserted into the lumen of the living tissue and observes the living tissue, and is updated each time new data is obtained by the sensor.
  • As one embodiment, the control unit acquires designation data that designates the mark, constructs the third data based on the acquired designation data, and includes the third data in the three-dimensional data.
  • An image processing system as one aspect of the present disclosure includes the image processing device, and a sensor that observes the living tissue and the object.
  • the image processing system further includes the display.
  • An image display method as one aspect of the present disclosure is an image display method for displaying, on a display as a three-dimensional image, three-dimensional data including first data representing a living tissue and second data representing an object located in a lumen of the living tissue or third data representing a mark associated with the living tissue. In the method, a computer refers to the three-dimensional data to identify the positional relationship between the living tissue and the object or the mark, the computer determines, according to the identified positional relationship, whether there is an intervening tissue, which is a portion of the living tissue interposed between the object or the mark and a viewpoint set when the three-dimensional image is displayed, and, when the computer determines that the intervening tissue exists, the computer performs control to display an image representing the object or the mark at a position corresponding to the intervening tissue in the three-dimensional image.
  • An image processing program as one aspect of the present disclosure causes a computer that displays, on a display as a three-dimensional image, three-dimensional data including first data representing a living tissue and second data representing an object located in a lumen of the living tissue or third data representing a mark associated with the living tissue to execute: a process of referring to the three-dimensional data to identify the positional relationship between the living tissue and the object or the mark; a process of determining, according to the identified positional relationship, whether there is an intervening tissue, which is a portion of the living tissue interposed between the object or the mark and a viewpoint set when the three-dimensional image is displayed; and a process of performing control to display an image representing the object or the mark at a position corresponding to the intervening tissue in the three-dimensional image when it is determined that the intervening tissue exists.
  • FIG. 1 is a perspective view of an image processing system according to an embodiment of the present disclosure.
  • FIG. 2 is a cross-sectional view showing an example in which an object exists behind an intervening tissue.
  • FIG. 3 is a block diagram showing the configuration of an image processing device according to an embodiment of the present disclosure.
  • FIG. 4 is a perspective view of a probe and a drive unit according to an embodiment of the present disclosure.
  • FIG. 5 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
  • FIG. 6 is a flowchart showing the operation of the image processing system according to the embodiment of the present disclosure.
  • FIG. 7 is a cross-sectional view showing an example of the positional relationship between a living tissue, an object, and a viewpoint.
  • FIG. 8 is a diagram showing an example screen of a display according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram showing an example screen of a display according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram showing an example of the positional relationship between a living tissue, a mark, and a viewpoint.
  • FIG. 11 is a schematic diagram showing another example of the positional relationship between a living tissue, a mark, and a viewpoint.
  • FIG. 12 is a schematic diagram showing an example of displaying an image representing a mark on the surface of the intervening tissue in the example of FIG. 11.
  • An outline of the present embodiment will be described with reference to FIGS. 1 to 3.
  • The image processing device 11 is a computer that causes the display 16 to display, as a three-dimensional image 53, three-dimensional data 52 including first data representing the living tissue 60 and second data representing an object located in the lumen 61 of the living tissue 60.
  • the image processing device 11 refers to the three-dimensional data 52 to identify the positional relationship between the living tissue 60 and the object.
  • According to the identified positional relationship, the image processing device 11 determines whether there is an intervening tissue, which is a portion of the living tissue 60 interposed between the object and the viewpoint set when the three-dimensional image 53 is displayed. When the image processing device 11 determines that an intervening tissue exists, it performs control to display an image representing the object at a position corresponding to the intervening tissue in the three-dimensional image 53. Therefore, according to this embodiment, it is possible to confirm the position of the object even when there is an intervening tissue.
  • the biological tissue 60 includes, for example, blood vessels or organs such as the heart.
  • the biological tissue 60 is not limited to an anatomical single organ or a part thereof, but also includes a tissue that straddles a plurality of organs and has a lumen.
  • a specific example of such tissue is a portion of the vascular system extending from the upper portion of the inferior vena cava through the right atrium to the lower portion of the superior vena cava.
  • the living tissue 60 is the right atrium.
  • the portion of the right atrium adjacent to the fossa ovalis 65 is raised inward to form a ridge 64 .
  • a catheter 63 such as an ablation catheter or a catheter for atrial septal puncture is inserted into the right atrium.
  • an image expressing the structure of the living tissue 60 is automatically generated as the three-dimensional image 53 and the generated image is displayed to the operator.
  • a part of the structure of the living tissue 60 is cut off in the generated image so that the lumen 61 can be seen.
  • an image representing catheter 63 is also displayed.
  • the catheter 63 is behind the ridge 64 and is not visible to the operator depending on the direction in which the lumen 61 is viewed.
  • an image representing the catheter 63 is displayed on the surface of the ridge 64 in this embodiment. That is, at least the portion of the ridge 64 hiding the catheter 63 appears transparent, allowing the catheter 63 to be seen through that portion. Therefore, the operator can smoothly perform an operation such as ablation or atrial septal puncture.
  • the portion of the ridge 64 that hides the catheter 63 corresponds to the "intervening tissue".
  • This embodiment can be used not only when the catheter 63 is behind the ridge 64, but also when, for example, the fossa ovalis 65 is tented during an atrial septal puncture operation and the catheter 63 becomes embedded in the tissue; that is, whenever the catheter 63 is behind or inside some tissue.
  • the catheter 63 corresponds to the "object".
  • the object is not limited to the catheter 63, but may be another object located in the lumen 61 of the biological tissue 60, such as a stent.
  • The X direction and the Y direction, which is perpendicular to the X direction, each correspond to a lateral direction of the lumen 61 of the living tissue 60.
  • a Z direction perpendicular to the X and Y directions corresponds to the longitudinal direction of the lumen 61 of the biological tissue 60 .
  • the image processing system 10 includes an image processing device 11, a cable 12, a drive unit 13, a keyboard 14, a mouse 15, and a display 16.
  • the image processing apparatus 11 is a dedicated computer specialized for image diagnosis in this embodiment, but may be a general-purpose computer such as a PC. "PC” is an abbreviation for personal computer.
  • the cable 12 is used to connect the image processing device 11 and the drive unit 13.
  • The drive unit 13 is a device that is used while connected to the probe 20 shown in FIG. 4.
  • the drive unit 13 is also called MDU.
  • MDU is an abbreviation for motor drive unit.
  • The probe 20 is used for IVUS. The probe 20 is also referred to as an IVUS catheter or a diagnostic imaging catheter.
  • the keyboard 14, mouse 15, and display 16 are connected to the image processing device 11 via any cable or wirelessly.
  • the display 16 is, for example, an LCD, organic EL display, or HMD.
  • LCD is an abbreviation for liquid crystal display.
  • EL is an abbreviation for electro luminescence.
  • HMD is an abbreviation for head-mounted display.
  • the image processing system 10 further comprises a connection terminal 17 and a cart unit 18 as options.
  • connection terminal 17 is used to connect the image processing device 11 and an external device.
  • the connection terminal 17 is, for example, a USB terminal.
  • USB is an abbreviation for Universal Serial Bus.
  • the external device is, for example, a recording medium such as a magnetic disk drive, a magneto-optical disk drive, or an optical disk drive.
  • the cart unit 18 is a cart with casters for movement.
  • An image processing device 11 , a cable 12 and a drive unit 13 are installed in the cart body of the cart unit 18 .
  • a keyboard 14 , a mouse 15 and a display 16 are installed on the top table of the cart unit 18 .
  • the probe 20 includes a drive shaft 21, a hub 22, a sheath 23, an outer tube 24, an ultrasonic transducer 25, and a relay connector 26.
  • the drive shaft 21 passes through a sheath 23 inserted into the body cavity of a living body, an outer tube 24 connected to the proximal end of the sheath 23, and extends to the inside of a hub 22 provided at the proximal end of the probe 20.
  • the driving shaft 21 has an ultrasonic transducer 25 for transmitting and receiving signals at its tip and is rotatably provided within the sheath 23 and the outer tube 24 .
  • a relay connector 26 connects the sheath 23 and the outer tube 24 .
  • the hub 22, the drive shaft 21, and the ultrasonic transducer 25 are connected to each other so as to integrally move back and forth in the axial direction. Therefore, for example, when the hub 22 is pushed toward the distal side, the drive shaft 21 and the ultrasonic transducer 25 move inside the sheath 23 toward the distal side. For example, when the hub 22 is pulled proximally, the drive shaft 21 and the ultrasonic transducer 25 move proximally inside the sheath 23 as indicated by the arrows.
  • the drive unit 13 includes a scanner unit 31, a slide unit 32, and a bottom cover 33.
  • the scanner unit 31 is connected to the image processing device 11 via the cable 12 .
  • the scanner unit 31 includes a probe connection section 34 that connects to the probe 20 and a scanner motor 35 that is a drive source that rotates the drive shaft 21 .
  • the probe connecting portion 34 is detachably connected to the probe 20 through an insertion port 36 of the hub 22 provided at the proximal end of the probe 20 .
  • the proximal end of the drive shaft 21 is rotatably supported, and the rotational force of the scanner motor 35 is transmitted to the drive shaft 21 .
  • Signals are also transmitted and received between the drive shaft 21 and the image processing device 11 via the cable 12 .
  • the image processing device 11 generates a tomographic image of the body lumen and performs image processing based on the signal transmitted from the drive shaft 21 .
  • the slide unit 32 mounts the scanner unit 31 so as to move back and forth, and is mechanically and electrically connected to the scanner unit 31 .
  • the slide unit 32 includes a probe clamp section 37 , a slide motor 38 and a switch group 39 .
  • the probe clamping part 37 is arranged coaxially with the probe connecting part 34 on the tip side of the probe connecting part 34 and supports the probe 20 connected to the probe connecting part 34 .
  • the slide motor 38 is a driving source that generates axial driving force.
  • the scanner unit 31 advances and retreats by driving the slide motor 38, and the drive shaft 21 advances and retreats in the axial direction accordingly.
  • the slide motor 38 is, for example, a servomotor.
  • the switch group 39 includes, for example, a forward switch and a pullback switch that are pressed when moving the scanner unit 31 back and forth, and a scan switch that is pressed when image rendering is started and ended.
  • Various switches are included in the switch group 39 as needed, without being limited to the example here.
  • When the scan switch is pressed, image rendering is started, the scanner motor 35 is driven, and the slide motor 38 is driven to move the scanner unit 31 backward.
  • a user such as an operator connects the probe 20 to the scanner unit 31 in advance, and causes the drive shaft 21 to rotate and move to the proximal end side in the axial direction when image rendering is started.
  • the scanner motor 35 and the slide motor 38 are stopped when the scan switch is pressed again, and image rendering is completed.
  • the bottom cover 33 covers the bottom surface of the slide unit 32 and the entire circumference of the side surface on the bottom surface side, and can move toward and away from the bottom surface of the slide unit 32 .
  • the image processing device 11 includes a control section 41 , a storage section 42 , a communication section 43 , an input section 44 and an output section 45 .
  • the control unit 41 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof.
  • a processor may be a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for a particular process.
  • CPU is an abbreviation for central processing unit.
  • GPU is an abbreviation for graphics processing unit.
  • a programmable circuit is, for example, an FPGA.
  • FPGA is an abbreviation for field-programmable gate array.
  • a dedicated circuit is, for example, an ASIC.
  • ASIC is an abbreviation for application specific integrated circuit.
  • the control unit 41 executes processing related to the operation of the image processing device 11 while controlling each unit of the image processing system 10 including the image processing device 11 .
  • the storage unit 42 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
  • a semiconductor memory is, for example, a RAM or a ROM.
  • RAM is an abbreviation for random access memory.
  • ROM is an abbreviation for read only memory.
  • RAM is, for example, SRAM or DRAM.
  • SRAM is an abbreviation for static random access memory.
  • DRAM is an abbreviation for dynamic random access memory.
  • ROM is, for example, EEPROM.
  • EEPROM is an abbreviation for electrically erasable programmable read only memory.
  • the storage unit 42 functions, for example, as a main memory device, an auxiliary memory device, or a cache memory.
  • the storage unit 42 stores data used for the operation of the image processing apparatus 11, such as the tomographic data 51, and data obtained by the operation of the image processing apparatus 11, such as the three-dimensional data 52 and the three-dimensional image 53.
  • the communication unit 43 includes at least one communication interface.
  • the communication interface is, for example, a wired LAN interface, a wireless LAN interface, or an image diagnosis interface that receives and A/D converts IVUS signals.
  • LAN is an abbreviation for local area network.
  • A/D is an abbreviation for analog to digital.
  • the communication unit 43 receives data used for the operation of the image processing device 11 and transmits data obtained by the operation of the image processing device 11 .
  • the drive unit 13 is connected to an image diagnosis interface included in the communication section 43 .
  • the input unit 44 includes at least one input interface.
  • the input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
  • the output unit 45 includes at least one output interface.
  • the output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
  • the output unit 45 outputs data obtained by the operation of the image processing device 11 .
  • the display 16 is connected to a USB interface or HDMI (registered trademark) interface included in the output unit 45 .
  • the functions of the image processing device 11 are realized by executing the image processing program according to the present embodiment with a processor as the control unit 41 . That is, the functions of the image processing device 11 are realized by software.
  • the image processing program causes the computer to function as the image processing device 11 by causing the computer to execute the operation of the image processing device 11 . That is, the computer functions as the image processing device 11 by executing the operation of the image processing device 11 according to the image processing program.
  • the program can be stored on a non-transitory computer-readable medium.
  • a non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM.
  • Program distribution is performed, for example, by selling, assigning, or lending a portable medium such as an SD card, DVD, or CD-ROM storing the program.
  • SD is an abbreviation for Secure Digital.
  • DVD is an abbreviation for digital versatile disc.
  • CD-ROM is an abbreviation for compact disc read only memory.
  • the program may be distributed by storing the program in the storage of the server and transferring the program from the server to another computer.
  • a program may be provided as a program product.
  • a computer for example, temporarily stores a program stored in a portable medium or a program transferred from a server in a main storage device. Then, the computer reads the program stored in the main storage device with the processor, and executes processing according to the read program with the processor.
  • the computer may read the program directly from the portable medium and execute processing according to the program.
  • the computer may execute processing according to the received program every time the program is transferred from the server to the computer.
  • the processing may be executed by a so-called ASP type service that realizes the function only by executing the execution instruction and obtaining the result without transferring the program from the server to the computer.
  • "ASP" is an abbreviation for application service provider.
  • The program also encompasses things equivalent to a program, that is, information that is used for processing by a computer and conforms to the program. For example, data that is not a direct instruction to a computer but that has the property of prescribing the processing of the computer corresponds to "things equivalent to a program."
  • a part or all of the functions of the image processing device 11 may be realized by a programmable circuit or a dedicated circuit as the control unit 41. That is, part or all of the functions of the image processing device 11 may be realized by hardware.
  • The operation of the image processing system 10 according to this embodiment will be described with reference to FIG. 5.
  • the operation of the image processing system 10 corresponds to the image display method according to this embodiment.
  • the probe 20 is primed by the user before the flow of FIG. 5 starts. After that, the probe 20 is fitted into the probe connection portion 34 and the probe clamp portion 37 of the drive unit 13 and connected and fixed to the drive unit 13 . Then, the probe 20 is inserted to a target site in a living tissue 60 such as a blood vessel or heart.
  • In step S101, the scan switch included in the switch group 39 is pressed, and further the pullback switch included in the switch group 39 is pressed, so that a so-called pullback operation is performed.
  • the probe 20 transmits ultrasonic waves by means of the ultrasonic transducer 25 retracted in the axial direction by a pullback operation inside the biological tissue 60 .
  • the ultrasonic transducer 25 radially transmits ultrasonic waves while moving inside the living tissue 60 .
  • the ultrasonic transducer 25 receives reflected waves of the transmitted ultrasonic waves.
  • the probe 20 inputs the signal of the reflected wave received by the ultrasonic transducer 25 to the image processing device 11 .
  • the control unit 41 of the image processing apparatus 11 processes the input signal to sequentially generate cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51 including a plurality of cross-sectional images.
  • Specifically, the probe 20 rotates the ultrasonic transducer 25 in the circumferential direction and moves it in the axial direction inside the living tissue 60, and the ultrasonic transducer 25 transmits ultrasonic waves outward from the center of rotation.
  • the probe 20 receives reflected waves from reflecting objects present in each of a plurality of directions inside the living tissue 60 by the ultrasonic transducer 25 .
  • the probe 20 transmits the received reflected wave signal to the image processing device 11 via the drive unit 13 and the cable 12 .
  • the communication unit 43 of the image processing device 11 receives the signal transmitted from the probe 20 .
  • the communication unit 43 A/D converts the received signal.
  • the communication unit 43 inputs the A/D converted signal to the control unit 41 .
  • the control unit 41 processes the input signal and calculates the intensity value distribution of the reflected waves from the reflectors present in the transmission direction of the ultrasonic waves from the ultrasonic transducer 25 .
  • the control unit 41 sequentially generates two-dimensional images having a luminance value distribution corresponding to the calculated intensity value distribution as cross-sectional images of the biological tissue 60, thereby acquiring tomographic data 51, which is a data set of cross-sectional images.
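  • As a concrete illustration of this step, the following is a minimal sketch that maps the calculated per-direction intensity distribution onto a Cartesian cross-sectional image whose luminance follows that distribution. The array layout, function name, and image size are assumptions for illustration, not the embodiment's actual processing.

```python
import numpy as np

def scan_lines_to_cross_section(scan_lines: np.ndarray, size: int = 512) -> np.ndarray:
    """Map polar IVUS scan lines (n_angles x n_samples intensity values) to a
    Cartesian cross-sectional image. Pixels outside the imaging radius are zeroed."""
    n_angles, n_samples = scan_lines.shape
    y, x = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    dx, dy = x - cx, y - cy
    radius = np.hypot(dx, dy)                      # distance from the transducer axis
    angle = np.mod(np.arctan2(dy, dx), 2 * np.pi)  # transmission direction of each pixel
    angle_idx = np.minimum((angle / (2 * np.pi) * n_angles).astype(int), n_angles - 1)
    sample_idx = np.minimum((radius / (size / 2.0) * n_samples).astype(int), n_samples - 1)
    image = scan_lines[angle_idx, sample_idx]      # luminance follows the intensity distribution
    image[radius > size / 2.0] = 0                 # outside the imaging radius
    return image
```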
  • the control unit 41 causes the storage unit 42 to store the obtained tomographic data 51 .
  • The signal of the reflected wave received by the ultrasonic transducer 25 corresponds to the raw data of the tomographic data 51, and the cross-sectional images generated by the image processing device 11 processing the signal of the reflected wave correspond to the processed data of the tomographic data 51.
  • the control unit 41 of the image processing device 11 may store the signal input from the probe 20 as the tomographic data 51 in the storage unit 42 as it is.
  • the control unit 41 may store, as the tomographic data 51 , data indicating the intensity value distribution of the reflected wave calculated by processing the signal input from the probe 20 in the storage unit 42 .
  • the tomographic data 51 is not limited to a data set of cross-sectional images of the living tissue 60, and may be data representing cross-sections of the living tissue 60 at each movement position of the ultrasonic transducer 25 in some format.
  • As a modification, an ultrasonic transducer that transmits ultrasonic waves in multiple directions without rotating may be used instead of the ultrasonic transducer 25, which transmits ultrasonic waves in multiple directions while rotating in the circumferential direction.
  • the tomographic data 51 may be acquired using OFDI or OCT instead of being acquired using IVUS.
  • OFDI is an abbreviation for optical frequency domain imaging.
  • OCT is an abbreviation for optical coherence tomography.
  • As a modification, instead of the image processing device 11 generating the data set of cross-sectional images of the living tissue 60, another device may generate a similar data set, and the image processing device 11 may acquire that data set from the other device. That is, instead of the control unit 41 of the image processing device 11 processing the IVUS signal to generate cross-sectional images of the living tissue 60, another device may process the IVUS signal to generate the cross-sectional images of the living tissue 60.
  • In step S102, the control unit 41 of the image processing device 11 generates three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S101. That is, the control unit 41 generates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor.
  • If generated three-dimensional data 52 already exists, it is preferable to update only the data at the location corresponding to the updated tomographic data 51 instead of regenerating all of the three-dimensional data 52 from scratch. In that case, the amount of data processing when generating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in the subsequent step S103 can be improved.
  • Specifically, the control unit 41 of the image processing device 11 stacks the cross-sectional images of the living tissue 60 included in the tomographic data 51 stored in the storage unit 42 to make the living tissue 60 three-dimensional, thereby generating the three-dimensional data 52.
  • As the method of making the tissue three-dimensional, any rendering method such as surface rendering or volume rendering, together with associated processing such as texture mapping (including environment mapping), bump mapping, and the like, may be used.
  • the control unit 41 causes the storage unit 42 to store the generated three-dimensional data 52 .
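  • The following is a minimal sketch of the stacking and of the incremental update mentioned above, assuming the three-dimensional data 52 is held as a simple voxel volume; this is an illustrative assumption, and the embodiment may instead use surface- or volume-rendered structures with texture mapping.

```python
import numpy as np

def build_volume(cross_sections: list[np.ndarray]) -> np.ndarray:
    """Stack the 2D cross-sectional images along the pullback (Z) axis to obtain
    a voxel volume of shape (n_slices, height, width)."""
    return np.stack(cross_sections, axis=0)

def update_slice(volume: np.ndarray, index: int, new_section: np.ndarray) -> None:
    """Replace only the slice corresponding to newly acquired tomographic data,
    instead of regenerating the whole volume, so that the three-dimensional
    image 53 can be kept close to real time."""
    volume[index] = new_section
```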
  • If the catheter 63 is inserted into the living tissue 60, the tomographic data 51 includes data of the catheter 63 as well as data of the living tissue 60. Therefore, in step S102, the three-dimensional data 52 generated by the control unit 41 also includes the data of the catheter 63 as the second data, in the same way as it includes the data of the living tissue 60 as the first data.
  • Details of the processing performed in step S102 when the catheter 63 is inserted into the living tissue 60 will be described with reference to FIG. 6.
  • the control unit 41 of the image processing apparatus 11 classifies the pixel groups of the cross-sectional image included in the tomographic data 51 acquired in step S101 into two or more classes.
  • These two or more classes include at least a "living tissue" class and a "catheter" class; a "blood cell" class, a "medical device" class other than "catheter" such as a guide wire, an "indwelling object" class such as a stent, or a "lesion" class such as calcification or plaque may also be included.
  • Any method may be used as the classification method, but in this embodiment, a method of classifying pixel groups of cross-sectional images using a trained model is used.
  • The trained model is prepared by performing machine learning in advance so that it can detect regions corresponding to each class from sample IVUS cross-sectional images.
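  • A minimal sketch of this classification step follows, assuming the trained model is any callable that returns a per-pixel class-index map; the label map, function names, and model interface are illustrative assumptions rather than the embodiment's actual implementation.

```python
import numpy as np

# Illustrative label map; the embodiment only requires "living tissue" and "catheter".
CLASS_IDS = {"blood cell": 0, "living tissue": 1, "catheter": 2}

def classify_pixels(cross_section: np.ndarray, trained_model) -> np.ndarray:
    """Classify each pixel of an IVUS cross-sectional image into two or more classes.
    'trained_model' is assumed to be a callable prepared by prior machine learning
    (e.g. a segmentation network trained on labelled sample IVUS images) that returns
    a class-index map with the same height and width as the input."""
    return trained_model(cross_section)

def class_region(class_map: np.ndarray, class_name: str) -> np.ndarray:
    """Boolean mask of the region classified into one class, e.g. the "catheter" class,
    which is then stacked over the slices to build the corresponding 3D object."""
    return class_map == CLASS_IDS[class_name]
```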
  • In step S200a, the control unit 41 of the image processing device 11 builds a three-dimensional object of the living tissue 60 by stacking the regions classified into the "living tissue" class.
  • the control unit 41 reflects the constructed three-dimensional object of the biological tissue 60 in the three-dimensional space.
  • In step S200b, the control unit 41 builds a three-dimensional object of the catheter 63 by stacking the regions classified into the "catheter" class.
  • the catheter 63 is extracted by segmentation in this embodiment, but may be extracted by other techniques such as object detection. For example, a method of extracting only the catheter position may be used to construct an object considering the position as an object of the catheter 63 .
  • the control unit 41 reflects the constructed three-dimensional object of the catheter 63 in the three-dimensional space.
  • In step S200c, the control unit 41 executes the processing of steps S201 to S205 for each tissue voxel, that is, each voxel of the living tissue 60.
  • the control unit 41 may place the virtual camera 71 and the virtual light source 72 as shown in FIG. 7 at arbitrary positions in the three-dimensional space.
  • the position of the camera 71 corresponds to the “viewpoint” when displaying the three-dimensional image 53 on the display 16 .
  • the number and relative positions of the light sources 72 are not limited to those illustrated, and can be changed as appropriate.
  • the three-dimensional object of the living tissue 60 may be cut at any cutting plane.
  • In step S201, the control unit 41 of the image processing device 11 determines whether or not there is a catheter voxel, that is, a voxel of the catheter 63, on the straight line that includes the line segment connecting the viewpoint and the tissue voxel and its extension. If there is no catheter voxel on that straight line, in step S202 the control unit 41 applies, as the color of the tissue voxel, the first color, which is the color pre-assigned to the "living tissue" class.
  • If there is a catheter voxel on that straight line, in step S203 the control unit 41 calculates a first distance, which is the distance between the viewpoint and the tissue voxel, and a second distance, which is the distance between the viewpoint and the catheter voxel on the straight line. If the first distance is longer than the second distance, that is, if the catheter 63 exists in front of the living tissue 60 as seen from the viewpoint, in step S204 the control unit 41 applies, as the color of the tissue voxel, the second color, which is the color pre-assigned to the "catheter" class.
  • If the first distance is shorter than the second distance, that is, if the catheter 63 exists behind the living tissue 60 as seen from the viewpoint, in step S205 the control unit 41 applies, as the color of the tissue voxel, a third color that is different from the first color and the second color, such as an intermediate color between the two colors. In this embodiment, the control unit 41 applies, as the intermediate color corresponding to the third color, a color containing 70% of the first color and 30% of the second color.
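  • A minimal sketch of steps S201 to S205 follows, assuming the viewpoint, the tissue voxels, and a boolean catheter-voxel mask are given in a common voxel coordinate system; the ray-marching helper, the step size, and the RGB values are illustrative assumptions, not the embodiment's actual data structures.

```python
import numpy as np

FIRST_COLOR  = np.array([0.78, 0.47, 0.47])   # color pre-assigned to the "living tissue" class (illustrative)
SECOND_COLOR = np.array([0.31, 0.78, 0.31])   # color pre-assigned to the "catheter" class (illustrative)

def distance_to_catheter_on_ray(viewpoint, tissue_voxel, catheter_mask, step=0.5):
    """Walk along the ray from the viewpoint through the tissue voxel and beyond,
    and return the distance from the viewpoint to the first catheter voxel found
    on that line (the 'second distance'), or None if there is none."""
    viewpoint = np.asarray(viewpoint, dtype=float)
    direction = np.asarray(tissue_voxel, dtype=float) - viewpoint
    direction /= np.linalg.norm(direction)
    shape = catheter_mask.shape
    max_steps = int(np.linalg.norm(np.asarray(shape, dtype=float)) / step)
    pos = viewpoint.copy()
    for _ in range(max_steps):
        pos += direction * step
        idx = tuple(int(round(c)) for c in pos)
        if any(i < 0 or i >= s for i, s in zip(idx, shape)):
            break                                   # left the volume
        if catheter_mask[idx]:
            return float(np.linalg.norm(pos - viewpoint))
    return None

def tissue_voxel_color(viewpoint, tissue_voxel, catheter_mask):
    """Steps S201 to S205 for one tissue voxel."""
    second_distance = distance_to_catheter_on_ray(viewpoint, tissue_voxel, catheter_mask)
    if second_distance is None:
        return FIRST_COLOR                          # S202: no catheter voxel on the line
    first_distance = float(np.linalg.norm(np.asarray(tissue_voxel, float) - np.asarray(viewpoint, float)))
    if first_distance > second_distance:
        return SECOND_COLOR                         # S204: catheter in front of the tissue voxel
    return 0.7 * FIRST_COLOR + 0.3 * SECOND_COLOR   # S205: catheter behind the tissue (intervening tissue)
```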
  • the data regarding the coloring of the voxel group performed in the flow of FIG. 6 is stored in the storage unit 42 as part of the three-dimensional data 52.
  • In step S103, the control unit 41 of the image processing device 11 causes the display 16 to display the three-dimensional data 52 generated in step S102 as the three-dimensional image 53.
  • control unit 41 of the image processing device 11 generates a 3D image 53 from the 3D data 52 stored in the storage unit 42 .
  • the control unit 41 causes the display 16 to display the generated three-dimensional image 53 via the output unit 45 .
  • In step S104, if there is a user operation, the processing from step S105 to step S108 is performed. If there is no user operation, the processing from step S105 to step S108 is skipped.
  • In step S105, the control unit 41 of the image processing device 11 receives an operation for setting the position of the opening 62 shown in FIG. 7.
  • the position of the opening 62 is set such that the lumen 61 of the living tissue 60 is exposed through the opening 62 in the three-dimensional image 53 displayed in step S103.
  • Specifically, the control unit 41 of the image processing device 11 receives, via the input unit 44, an operation in which the user uses the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16 to cut off a portion of the living tissue 60 in the three-dimensional image 53 displayed on the display 16.
  • the control unit 41 receives an operation of cutting off a portion of the living tissue 60 so that the cross section of the living tissue 60 has an open shape.
  • The "cross section of the living tissue 60" may be a transverse section of the living tissue 60, a longitudinal section of the living tissue 60, or another section of the living tissue 60.
  • the “transverse section of the biological tissue 60 ” is a cross section obtained by cutting the biological tissue 60 perpendicularly to the direction in which the ultrasonic transducer 25 moves in the biological tissue 60 .
  • the “longitudinal section of the biological tissue 60 ” is a cut plane obtained by cutting the biological tissue 60 along the direction in which the ultrasonic transducer 25 moves in the biological tissue 60 .
  • “Another cross section of the biological tissue 60 ” is a cross section obtained by cutting the biological tissue 60 obliquely with respect to the direction in which the ultrasonic transducer 25 moves in the biological tissue 60 .
  • The "open shape" is, for example, a substantially C-shape, a substantially U-shape, a shape like the numeral 3, or any of these shapes partially missing due to the presence of a hole originally present in the living tissue 60, such as a bifurcation of a blood vessel or a pulmonary vein ostium. In the example of FIG. 7, the cross section of the living tissue 60 is substantially C-shaped.
  • In step S106, the control unit 41 of the image processing device 11 determines the position of the opening 62 as the position set by the operation received in step S105.
  • Specifically, the control unit 41 of the image processing device 11 identifies, in the three-dimensional data 52 stored in the storage unit 42, the three-dimensional coordinates of the boundary of the portion of the living tissue 60 cut off by the user's operation as the three-dimensional coordinates of the edge of the opening 62. The control unit 41 causes the storage unit 42 to store the identified three-dimensional coordinates.
  • In step S107, the control unit 41 of the image processing device 11 forms, in the three-dimensional data 52, the opening 62 that exposes the lumen 61 of the living tissue 60 in the three-dimensional image 53.
  • Specifically, the control unit 41 of the image processing device 11 sets, in the three-dimensional data 52 stored in the storage unit 42, the portion specified by the stored three-dimensional coordinates to be hidden or transparent when the three-dimensional image 53 is displayed on the display 16.
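  • As a sketch only, and assuming the three-dimensional data 52 carries per-voxel visibility and opacity arrays (an assumption made for illustration), forming the opening 62 can amount to marking the voxels of the cut-off portion as hidden or lowering their opacity:

```python
import numpy as np

def form_opening(visibility: np.ndarray, opacity: np.ndarray,
                 cut_mask: np.ndarray, transparent: bool = False) -> None:
    """Hide (or make transparent) the voxels of the cut-off portion so that the
    opening 62 exposes the lumen 61 in the displayed three-dimensional image 53.
    'visibility', 'opacity', and 'cut_mask' are assumed voxel arrays of the same
    shape; the embodiment instead keeps the edge coordinates of the opening in
    the storage unit 42."""
    if transparent:
        opacity[cut_mask] = 0.0        # render the cut-off portion fully transparent
    else:
        visibility[cut_mask] = False   # or simply do not render it at all
```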
  • In step S108, the control unit 41 of the image processing device 11 adjusts the viewpoint used when displaying the three-dimensional image 53 on the display 16 according to the position of the opening 62 formed in step S107.
  • the control unit 41 arranges the viewpoint on a straight line extending from the inner surface of the living tissue 60 to the outside of the living tissue 60 through the opening 62 . Therefore, the user can look into the interior of the living tissue 60 through the opening 62 and virtually observe the lumen 61 of the living tissue 60 .
  • Specifically, the control unit 41 of the image processing device 11 arranges the virtual camera 71 at a position from which the lumen 61 of the living tissue 60 can be seen through the portion set to be hidden or transparent in the three-dimensional image 53 displayed on the display 16.
  • For example, in the cross section of the living tissue 60, the control unit 41 arranges the virtual camera 71 in an area AF sandwiched between a first straight line L1, which extends from the inner surface of the living tissue 60 through the first edge E1 of the opening 62 to the outside of the living tissue 60, and a second straight line L2, which extends from the inner surface of the living tissue 60 through the second edge E2 of the opening 62 to the outside of the living tissue 60.
  • the point where the first straight line L1 intersects the inner surface of the living tissue 60 is the same point Pt as the point where the second straight line L2 intersects the inner surface of the living tissue 60 . Therefore, the user can observe the point Pt on the inner surface of the living tissue 60 regardless of the position of the virtual camera 71 in the area AF.
  • The point Pt is the same as the point where a fourth straight line L4, which is drawn from the middle point Pc of a third straight line L3 connecting the first edge E1 and the second edge E2 of the opening 62 perpendicularly to the third straight line L3, intersects the inner surface of the living tissue 60. Therefore, the user can easily observe the point Pt on the inner surface of the living tissue 60 through the opening 62.
  • placing the virtual camera 71 on the extension of the fourth straight line L4 as shown in FIG. 7 makes it easier for the user to observe the point Pt on the inner surface of the living tissue 60 .
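  • The following 2D sketch computes a camera position on the extension of the fourth straight line L4 from the edges E1 and E2 of the opening 62; the function name, the 'lumen_center' argument used only to orient the perpendicular, and the camera distance are assumptions for illustration.

```python
import numpy as np

def camera_on_fourth_line(edge1, edge2, lumen_center, distance):
    """Place the virtual camera 71 on the extension of the fourth straight line L4:
    the line through the midpoint Pc of the third straight line L3 (the chord
    connecting the edges E1 and E2 of the opening 62), perpendicular to that chord.
    Works in 2D cross-section coordinates."""
    e1, e2 = np.asarray(edge1, float), np.asarray(edge2, float)
    pc = (e1 + e2) / 2.0                              # midpoint Pc of L3
    chord = e2 - e1
    normal = np.array([-chord[1], chord[0]])          # direction perpendicular to L3 (along L4)
    normal /= np.linalg.norm(normal)
    if np.dot(normal, pc - np.asarray(lumen_center, float)) < 0:
        normal = -normal                              # point out of the lumen, toward the camera side
    return pc + distance * normal                     # camera position on the extension of L4
```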
  • The position of the virtual camera 71 may be any position from which the lumen 61 of the living tissue 60 can be observed through the opening 62, but in the present embodiment it is preferably set within the range facing the opening 62, for example at an intermediate position facing the central portion of the opening 62.
  • In step S108, if the catheter 63 is inserted into the living tissue 60, the processing from step S201 to step S205 is executed for each tissue voxel.
  • an intervening portion 66 that is part of the living tissue 60 exists between the camera 71 and the catheter 63 . That is, the catheter 63 exists behind the intervening portion 66 . Therefore, the control unit 41 applies a third color, such as an intermediate color between the first color and the second color, as the color of the voxels corresponding to the intervening portion 66 .
  • the control unit 41 applies the first color as the color of the voxels corresponding to the portion of the living tissue 60 visible from the camera 71 excluding the intervening portion 66 .
  • the cross section 69 faces the camera 71 in the intervening portion 66 .
  • the area of the cross section 69 corresponding to the intervening portion 66 is colored with the third color, and the remaining area is colored with the first color.
  • The control unit 41 of the image processing device 11 may switch the display mode between a first mode, in which the processing from step S201 to step S205 is not executed even when the catheter 63 is inserted into the living tissue 60, and a second mode, in which the processing from step S201 to step S205 is executed.
  • In the first mode, as shown in FIG. 8, even if the catheter 63 is behind the intervening portion 66, the area of the cross section 69 corresponding to the intervening portion 66 is colored with the first color in the same way as the rest of the area.
  • That is, the same texture as that applied to the cross section of the portion of the living tissue 60 adjacent to the intervening portion 66 in the three-dimensional image 53 is applied to the cross section of the intervening portion 66.
  • In the second mode, on the other hand, if the catheter 63 is behind the intervening portion 66, the area of the cross section 69 corresponding to the intervening portion 66 is colored with the third color. That is, as the image 67 representing the catheter 63, a texture different from the texture applied to the cross section of the portion of the living tissue 60 adjacent to the intervening portion 66 in the three-dimensional image 53 is applied to the cross section of the intervening portion 66.
  • the switching of the display mode may be performed manually by user operation, or may be performed automatically with an arbitrary event as a trigger.
  • In step S109, if the tomographic data 51 has been updated, the processing of steps S110 and S111 is performed. If the tomographic data 51 has not been updated, the presence or absence of a user operation is checked again in step S104.
  • In step S110, the control unit 41 of the image processing device 11 processes the signal input from the probe 20 to newly generate cross-sectional images of the living tissue 60, similarly to the processing of step S101, thereby acquiring tomographic data 51 including at least one new cross-sectional image.
  • In step S111, the control unit 41 of the image processing device 11 updates the three-dimensional data 52 of the living tissue 60 based on the tomographic data 51 acquired in step S110. That is, the control unit 41 updates the three-dimensional data 52 based on the tomographic data 51 acquired by the sensor. Then, in step S103, the control unit 41 causes the display 16 to display the three-dimensional data 52 updated in step S111 as the three-dimensional image 53.
  • In step S111, it is preferable to update only the data at the location corresponding to the updated tomographic data 51. In that case, the amount of data processing when updating the three-dimensional data 52 can be reduced, and the real-time performance of the three-dimensional image 53 in step S103 can be improved.
  • In step S111, if the catheter 63 is inserted into the living tissue 60, the processing from step S201 to step S205 is executed for each tissue voxel.
  • In steps S105 to S108 from the second time onward, when the position of the opening 62 is changed from a first position to a second position, the control unit 41 of the image processing device 11 moves the viewpoint from a third position corresponding to the first position to a fourth position corresponding to the second position.
  • the control unit 41 moves the virtual light source 72 when displaying the three-dimensional image 53 on the display 16 in accordance with the movement of the viewpoint from the third position to the fourth position.
  • the control unit 41 moves the virtual light source 72 using the rotation matrix used for moving the virtual camera 71 when changing the circumferential position of the opening 62 in the cross section of the living tissue 60 .
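  • A minimal sketch of this shared rotation follows, assuming the circumferential change of the opening 62 corresponds to a rotation about the longitudinal (Z) axis of the lumen; the function name and the axis choice are illustrative assumptions.

```python
import numpy as np

def rotate_about_z(point: np.ndarray, angle_rad: float, center: np.ndarray) -> np.ndarray:
    """Rotate a 3D point about the longitudinal (Z) axis passing through 'center'.
    Applying the same rotation to the virtual camera 71 and to the virtual light
    source 72 keeps the lighting consistent with the new viewpoint."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return center + rot @ (point - center)

# Usage sketch: apply the identical rotation matrix to camera and light source.
# camera_pos = rotate_about_z(camera_pos, delta_angle, lumen_axis_point)
# light_pos  = rotate_about_z(light_pos,  delta_angle, lumen_axis_point)
```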
  • the control unit 41 may instantaneously switch the viewpoint from the third position to the fourth position.
  • Alternatively, the control unit 41 may display on the display 16, as the three-dimensional image 53, a moving image in which the viewpoint gradually moves from the third position to the fourth position. This makes it easy for the user to recognize that the viewpoint has moved.
  • In step S105, the control unit 41 of the image processing device 11 may accept, via the input unit 44, both an operation of setting the position of the opening 62 and an operation of setting the position of a target point that the user wants to see.
  • Specifically, the control unit 41 of the image processing device 11 may accept, via the input unit 44, an operation in which the user designates the position of the target point in the three-dimensional image 53 displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
  • For example, the control unit 41 may accept, via the input unit 44, an operation of setting the position of the point Pt as the position of the point where the first straight line L1 and the second straight line L2 intersect the inner surface of the living tissue 60.
  • Alternatively, in step S105, the control unit 41 of the image processing device 11 may accept, via the input unit 44, an operation of setting the position of a target point that the user wants to see instead of the operation of setting the position of the opening 62. Then, in step S106, the control unit 41 may determine the position of the opening 62 according to the position set by the operation received in step S105.
  • Specifically, the control unit 41 of the image processing device 11 may accept, via the input unit 44, an operation in which the user designates the position of the target point in the three-dimensional image 53 displayed on the display 16 using the keyboard 14, the mouse 15, or the touch screen provided integrally with the display 16.
  • the control section 41 may determine the position of the opening 62 according to the position of the target point.
  • For example, the control unit 41 may accept, via the input unit 44, an operation of setting the position of the point Pt as the position of the point where the first straight line L1 and the second straight line L2 intersect the inner surface of the living tissue 60.
  • the control unit 41 may determine, as the area AF, a fan-shaped area centered at the point Pt and having a center angle preset or specified by the user in the cross section of the biological tissue 60 .
  • the control unit 41 may determine the position of the living tissue 60 overlapping the area AF as the position of the opening 62 .
  • the control unit 41 may determine a normal line perpendicular to a tangent line passing through the point Pt of the inner surface of the living tissue 60 as the fourth straight line L4.
  • the area AF may be set narrower than the width of the opening 62 . That is, the area AF may be set so as not to include at least one of the first edge E1 of the opening 62 and the second edge E2 of the opening 62 .
  • the point at which the first straight line L1 intersects the inner surface of the living tissue 60 may not be the same as the point at which the second straight line L2 intersects the inner surface of the living tissue 60.
  • For example, a point P1 at which the first straight line L1 intersects the inner surface of the living tissue 60 and a point P2 at which the second straight line L2 intersects the inner surface of the living tissue 60 may be on the circumference of a circle centered at the point Pt. That is, the points P1 and P2 may be substantially equidistant from the point Pt.
  • As described above, the control unit 41 of the image processing device 11 causes the display 16 to display, as the three-dimensional image 53, the three-dimensional data 52 including the first data representing the living tissue 60 and the second data representing the object located in the lumen 61 of the living tissue 60.
  • The control unit 41 refers to the three-dimensional data 52 to identify the positional relationship between the living tissue 60 and the object. According to the identified positional relationship, the control unit 41 determines whether there is an intervening tissue, which is a portion of the living tissue 60 interposed between the object and the viewpoint set when the three-dimensional image 53 is displayed. When determining that an intervening tissue exists, the control unit 41 performs control to display an image representing the object at a position corresponding to the intervening tissue in the three-dimensional image 53. Therefore, according to this embodiment, it is possible to confirm the position of the object even when there is an intervening tissue.
  • the control unit 41 of the image processing device 11 refers to data obtained by a sensor that observes the living tissue 60 and the object, and identifies the positional relationship between the living tissue 60 and the object.
  • the sensor is not limited to the ultrasonic transducer 25 used for IVUS, and may be any sensor such as a sensor used for OFDI, OCT, CT examination, extracorporeal echo examination, or X-ray examination. "CT” is an abbreviation for computed tomography.
  • the control unit 41 detects, as an intervening tissue, a portion of the living tissue 60 interposed between the object and the viewpoint set when the three-dimensional image 53 is displayed, according to the specified positional relationship.
  • In the example described above, the catheter 63 corresponds to the object, and the intervening portion 66 corresponds to the intervening tissue.
  • In this embodiment, when the control unit 41 of the image processing device 11 determines that an intervening tissue exists, it performs control to display an image representing the object on the cross section of the intervening tissue in the three-dimensional image 53. As a modification, control may be performed to display an image representing the object on the surface of the intervening tissue in the three-dimensional image 53.
  • Specifically, the control unit 41 of the image processing device 11 applies, as the image 67 representing the object, a texture different from the texture applied to the cross section of the portion of the living tissue 60 adjacent to the intervening tissue in the three-dimensional image 53 to the cross section of the intervening tissue. If the surface of the intervening tissue faces the camera 71 in the three-dimensional space, the control unit 41 may apply, as the image representing the object, a texture different from the texture applied to the surface of the portion of the living tissue 60 adjacent to the intervening tissue in the three-dimensional image 53 to the surface of the intervening tissue.
  • the control unit 41 of the image processing device 11 performs control to display an image representing the target object with the intervening tissue transparent in the three-dimensional image 53 when it is determined that the intervening tissue exists. you can go In the example of FIG. 7, since the catheter 63 exists behind the intervening portion 66, the control unit 41 changes the color of the voxels corresponding to the intervening portion 66, but instead changes the transparency of the intervening portion 66. You can raise it.
  • as the image representing the object, the control unit 41 may arrange a three-dimensional image representing the object on the side of the intervening tissue opposite the viewpoint in the three-dimensional image 53.
  • This embodiment may be applied not only to an object such as the catheter 63 as shown in FIG. 7, but also to marks 73 associated with the living tissue 60 as shown in FIGS. 10 to 12.
  • the control unit 41 of the image processing device 11 causes the display 16 to display the three-dimensional data 52 including the first data representing the living tissue 60 and the third data representing the mark 73 as the three-dimensional image 53 .
  • the control unit 41 refers to the three-dimensional data 52 to identify the positional relationship between the living tissue 60 and the mark 73. According to the identified positional relationship, the control unit 41 determines whether there is an intervening tissue, which is a portion of the living tissue 60 interposed between the mark 73 and the viewpoint set when the three-dimensional image 53 is displayed. When determining that an intervening tissue exists, the control unit 41 performs control to display an image 74 representing the mark 73 at a position corresponding to the intervening tissue in the three-dimensional image 53. Therefore, even if an intervening tissue exists, the position of the mark 73 can be confirmed.
  • the first data included in the three-dimensional data 52 is constructed based on data obtained by a sensor that is inserted into the lumen 61 of the biological tissue 60 and observes the biological tissue 60, and is updated from time to time as new data is obtained by the sensor. That is, the first data is configured to be sequentially updated by the sensor, as sketched below.
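  • A very small sketch of what sequential updating could look like if the first data were kept as a stack of cross-sectional slices; the array layout, function name, and slice indexing are assumptions, not details taken from this application.

      import numpy as np

      def update_first_data(first_data, new_frame, slice_index):
          # first_data: voxel volume of shape (num_slices, height, width).
          # new_frame:  binarized cross-sectional image of shape (height, width)
          #             obtained by the sensor at its current axial position.
          # Overwrite the slice at that position so the volume stays up to date.
          if 0 <= slice_index < first_data.shape[0]:
              first_data[slice_index] = new_frame
          return first_data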
  • the control unit 41 of the image processing device 11 acquires designation data designating the mark 73 .
  • the control unit 41 receives a user operation that designates at least one place in the three-dimensional space as the mark 73, thereby acquiring data designating the mark 73 as designation data.
  • the control unit 41 causes the storage unit 42 to store the acquired designation data.
  • the control unit 41 constructs the third data based on the designated data stored in the storage unit 42 .
  • the control unit 41 includes the constructed third data in the three-dimensional data 52 and causes the display 16 to display the three-dimensional data 52 as a three-dimensional image 53 . That is, the control unit 41 causes the display 16 to display the third data as part of the three-dimensional image 53 .
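  • The designation, storage, and display flow in the preceding items could be organized roughly as follows; this is a sketch only, and the class name, its methods, and the use of plain coordinate tuples are assumptions rather than details of this application.

      from dataclasses import dataclass, field
      from typing import List, Tuple

      Point = Tuple[float, float, float]

      @dataclass
      class MarkDesignation:
          # Holds the designation data: fixed points in the three-dimensional
          # space chosen by a user operation designating marks.
          points: List[Point] = field(default_factory=list)

          def designate(self, point: Point) -> None:
              # Called when a user operation designates a location as a mark;
              # storing the point corresponds to saving the designation data.
              self.points.append(point)

          def build_third_data(self) -> List[Point]:
              # Here the third data is simply the list of stored fixed points;
              # a fuller implementation would rasterize them into the volume
              # that is displayed together with the first data.
              return list(self.points)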
  • the mark 73 is associated with the living tissue 60 by being attached to at least one location in the lumen 61 of the living tissue 60 or around it, such as an ablated portion of the living tissue 60, the start and end points for measuring a distance in the three-dimensional space, or the position of a nerve that should be avoided from being ablated. When the mark 73 is attached, the probe 20 is moved forward and backward to update the three-dimensional data 52, and the coordinates of the location marked with the mark 73 are recorded as a fixed point in the three-dimensional space.
  • the living tissue 60 is myocardium.
  • the marks 73 are applied to the surface of the myocardium.
  • the mark 73 may become embedded in the myocardial tissue, as shown in FIG. That is, an intervening portion 66 that is part of the myocardial tissue may exist between the camera 71 and the mark 73. In that case, the mark 73 cannot be confirmed on the three-dimensional image 53. Therefore, when the control unit 41 of the image processing device 11 determines that the intervening portion 66 exists, it performs control to display the image 74 representing the mark 73 at a position corresponding to the intervening portion 66 in the three-dimensional image 53, as shown in FIG.
  • specifically, the control unit 41 changes the color of the voxels corresponding to the intervening portion 66 to a third color that is intermediate between a first color, which is the color of the myocardial tissue, and a second color, which is the color of the mark 73, and that is different from both the first color and the second color.
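  • One simple way to derive such a third color is linear interpolation between the two colors in RGB space; the sketch below, including the example color values, is an illustrative assumption rather than a scheme specified in this application.

      def intermediate_color(first_color, second_color, weight=0.5):
          # Linearly blend the myocardium color (first color) and the mark color
          # (second color); for any weight strictly between 0 and 1 the result
          # lies between them and differs from both.
          return tuple((1.0 - weight) * a + weight * b
                       for a, b in zip(first_color, second_color))

      # Example with purely illustrative RGB values:
      third_color = intermediate_color((0.80, 0.25, 0.25), (0.15, 0.25, 0.90))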
  • as the image 74 representing the mark 73, the control unit 41 of the image processing device 11 applies, to the surface of the intervening portion 66, a texture different from the texture applied to the surface of the portion of the biological tissue 60 adjacent to the intervening portion 66 in the three-dimensional image 53.
  • alternatively, as the image representing the mark 73, the control unit 41 may apply, to the cross section of the intervening portion 66, a texture different from the texture applied to the cross section of the portion of the biological tissue 60 adjacent to the intervening portion 66 in the three-dimensional image 53.
  • the control unit 41 of the image processing device 11 changes the color of the voxels corresponding to the intervening portion 66. If the mark 73 exists behind the intervening portion 66, the control unit 41 may instead increase the transparency of the intervening portion 66. That is, when the control unit 41 determines that the intervening portion 66 exists, it may perform control to display an image representing the mark 73 through the intervening portion 66 in the three-dimensional image 53. In that case, as the image representing the mark 73, the control unit 41 may arrange a three-dimensional image representing the mark 73 on the side of the intervening tissue opposite the viewpoint in the three-dimensional image 53.
  • image processing system 11 image processing device 12 cable 13 drive unit 14 keyboard 15 mouse 16 display 17 connection terminal 18 cart unit 20 probe 21 drive shaft 22 hub 23 sheath 24 outer tube 25 ultrasonic transducer 26 relay connector 31 scanner unit 32 slide unit 33 bottom cover 34 probe connection part 35 scanner motor 36 insertion port 37 probe clamp part 38 slide motor 39 switch group 41 control unit 42 storage unit 43 communication unit 44 input unit 45 output unit 51 tomographic data 52 three-dimensional data 53 three-dimensional image 60 living tissue 61 lumen 62 aperture 63 catheter 64 ridge 65 fossa ovalis 66 intervening portion 67 image 68 surface 69 cross section 71 camera 72 light source 73 mark 74 image 80 screen

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The present invention is an image processing device for displaying, on a display as a three-dimensional image, three-dimensional data including first data representing a living tissue and second data representing an object or third data representing a mark, the image processing device comprising a control unit that: refers to the three-dimensional data and identifies the positional relationship between the living tissue and the object or the mark; determines, according to the identified positional relationship, whether there is an intervening tissue, which is a portion of the living tissue interposed between the object or the mark and a viewpoint set when the three-dimensional image is displayed; and performs, when it is determined that the intervening tissue exists, control to display an image representing the mark or the object at a position on the three-dimensional image corresponding to the intervening tissue.
PCT/JP2022/009239 2021-03-26 2022-03-03 Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image WO2022202200A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023508893A JPWO2022202200A1 (fr) 2021-03-26 2022-03-03

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021054083 2021-03-26
JP2021-054083 2021-03-26

Publications (1)

Publication Number Publication Date
WO2022202200A1 true WO2022202200A1 (fr) 2022-09-29

Family

ID=83397108

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/009239 WO2022202200A1 (fr) 2021-03-26 2022-03-03 Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image

Country Status (2)

Country Link
JP (1) JPWO2022202200A1 (fr)
WO (1) WO2022202200A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001299756A (ja) * 2000-04-25 2001-10-30 Toshiba Corp カテーテルまたは細径プローブの位置を検出可能な超音波診断装置
JP2002522106A (ja) * 1998-08-03 2002-07-23 カーディアック・パスウェイズ・コーポレーション ダイナミックに変更可能な人体の3次元グラフィックモデル

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002522106A (ja) * 1998-08-03 2002-07-23 カーディアック・パスウェイズ・コーポレーション ダイナミックに変更可能な人体の3次元グラフィックモデル
JP2001299756A (ja) * 2000-04-25 2001-10-30 Toshiba Corp カテーテルまたは細径プローブの位置を検出可能な超音波診断装置

Also Published As

Publication number Publication date
JPWO2022202200A1 (fr) 2022-09-29

Similar Documents

Publication Publication Date Title
JP7300352B2 (ja) 診断支援装置、診断支援システム、及び診断支援方法
US20220218309A1 (en) Diagnostic assistance device, diagnostic assistance system, and diagnostic assistance method
WO2022202200A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
JP5498090B2 (ja) 画像処理装置及び超音波診断装置
WO2023013601A1 (fr) Dispositif de traitement d'images, système de traitement d'images, procédé de traitement d'images et programme de traitement d'images
CN114502079B (zh) 诊断支援装置、诊断支援系统及诊断支援方法
WO2022202202A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2023176741A1 (fr) Dispositif de traitement d'image, système de traitement d'image, méthode d'affichage d'image et programme de traitement d'image
WO2022202201A1 (fr) Dispositif de traitement d'images, système de traitement d'images, procédé d'affichage d'image et programme de traitement d'images
WO2022071251A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2022071250A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2022202203A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2023054001A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2021200296A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
WO2021065746A1 (fr) Dispositif d'aide au diagnostic, système d'aide au diagnostic et procédé d'aide au diagnostic
JP2023024072A (ja) 画像処理装置、画像処理システム、画像表示方法、及び画像処理プログラム
WO2024071054A1 (fr) Dispositif de traitement d'image, système d'affichage d'image, méthode d'affichage d'image et programme de traitement d'image
WO2022085373A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
US20220039778A1 (en) Diagnostic assistance device and diagnostic assistance method
WO2021200294A1 (fr) Dispositif de traitement d'image, système de traitement d'image, procédé d'affichage d'image et programme de traitement d'image
US20240108313A1 (en) Image processing device, image display system, image processing method, and image processing program
WO2022202302A1 (fr) Programme informatique, procédé de traitement d'informations et dispositif de traitement d'informations
WO2020203873A1 (fr) Dispositif d'aide au diagnostic, système d'aide au diagnostic et procédé d'aide au diagnostic
CN115484872A (zh) 图像处理装置、图像处理系统、图像显示方法及图像处理程序

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22774996

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023508893

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22774996

Country of ref document: EP

Kind code of ref document: A1