WO2023189510A1 - Image processing device, image processing system, image display method, and image processing program - Google Patents

Image processing device, image processing system, image display method, and image processing program

Info

Publication number
WO2023189510A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
area
screen
user operation
image
Prior art date
Application number
PCT/JP2023/009725
Other languages
French (fr)
Japanese (ja)
Inventor
泰一 坂本
俊祐 吉澤
クレモン ジャケ
ステフェン チェン
トマ エン
亮介 佐賀
Original Assignee
テルモ株式会社
株式会社ロッケン
Priority date
Filing date
Publication date
Application filed by テルモ株式会社 and 株式会社ロッケン
Publication of WO2023189510A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13 Tomography
    • A61B8/14 Echo-tomography
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms

Definitions

  • The present disclosure relates to an image processing device, an image processing system, an image display method, and an image processing program.
  • Patent Document 1 discloses an image diagnostic apparatus in which image information of a subject is displayed on an operator's monitor and a surgeon's monitor, and image information independent of the image information displayed on the surgeon's monitor is displayed on the operator's monitor.
  • HD is an abbreviation for high definition.
  • NTSC is an abbreviation for National Television System Committee.
  • WXGA is an abbreviation for Wide Extended Graphics Array.
  • The HD screen ratio is 16 in width to 9 in height, that is, 16:9.
  • The NTSC screen ratio is 4 in width to 3 in height, that is, 4:3.
  • The WXGA screen ratio is 16 in width to 10 in height, that is, 16:10.
  • The CinemaScope screen ratio is 2.35 in width to 1 in height, that is, 2.35:1.
  • A screen ratio of 4:3 is often used in rooms where surgeons perform catheter treatment, such as angiography rooms.
  • On the operator's side, a screen with a wider ratio such as 16:9 is suitable for menu operations and detailed settings.
  • However, if the operator's monitor is wider than the surgeon's monitor, the drawing size changes, so when the operator's screen is mirrored on the surgeon's monitor, a vertically compressed, distorted screen is displayed.
  • As a countermeasure to this problem, it is conceivable to display only information important to the surgeon on the surgeon's monitor.
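  • The distortion described above can be illustrated with a short calculation. The resolutions below are hypothetical examples of a wide 16:9 operator monitor and a 4:3 surgeon monitor; mirroring one screen full-frame onto the other requires different horizontal and vertical scale factors, so the image is deformed, whereas uniform scaling with letterboxing avoids the deformation at the cost of unused screen area.

```python
# Hypothetical resolutions: a 16:9 operator monitor and a 4:3 surgeon monitor.
OPERATOR_W, OPERATOR_H = 1920, 1080  # 16:9
SURGEON_W, SURGEON_H = 1024, 768     # 4:3

# Stretching the operator screen to fill the surgeon's monitor uses a
# different scale factor per axis, which deforms the mirrored image.
sx = SURGEON_W / OPERATOR_W
sy = SURGEON_H / OPERATOR_H
print(f"horizontal scale: {sx:.3f}, vertical scale: {sy:.3f}")

# A common countermeasure is uniform scaling with letterboxing.
s = min(sx, sy)
drawn_w, drawn_h = round(OPERATOR_W * s), round(OPERATOR_H * s)
print(f"undistorted letterboxed size: {drawn_w}x{drawn_h}")
```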
  • If the surgeon can check what operations are being performed on the operator's screen, the surgeon can communicate smoothly with the operator.
  • However, if only selected information is displayed, the surgeon cannot see what operations are being performed on the operator's screen, and it may become difficult to communicate smoothly.
  • The purpose of the present disclosure is to facilitate communication between a surgeon and an operator.
  • An image processing device as one aspect of the present disclosure displays, on a first display, an operator screen including a first area that displays a surgical support image including a living tissue image formed based on information obtained by sensing the structure of living tissue in the body of a surgical subject, and a second area that displays at least one operating element for controlling the display of the surgical support image in the first area.
  • The image processing device displays, on a second display different from the first display, a viewer screen that displays the surgical support image in accordance with its display in the first area, and includes a control unit that accepts a user operation performed on the operator screen using an input device, reflects the display of the user operation on the viewer screen when the user operation is performed in the first area, and does not reflect the display of the user operation on the viewer screen when the user operation is performed in the second area.
  • In one embodiment, when the user operation is performed in the second area, the control unit displays, on the viewer screen, notification information indicating that the user operation is being performed in the second area.
  • In one embodiment, the control unit displays a setting menu including two or more setting items on the operator screen.
  • When an operation to select one of the two or more setting items from the setting menu using the input device is received, the control unit displays one or more operating elements corresponding to the selected setting item in the second area, and when an operation of the one or more operating elements is accepted as the user operation, displays information indicating the selected setting item on the viewer screen as the notification information.
  • In one embodiment, the control unit displays two or more operating elements in the second area as the at least one operating element, and when an operation of any one of the two or more operating elements is accepted as the user operation, displays information indicating the operated element on the viewer screen as the notification information.
  • In one embodiment, the input device is a pointing device, and the control unit displays a first pointer movable by the user operation on the operator screen.
  • While the first pointer is in the first area, the control unit reflects the display of the user operation on the viewer screen by displaying, on the viewer screen, a second pointer that moves in accordance with the movement of the first pointer.
  • While the first pointer is in the second area, the control unit stops reflecting the display of the user operation on the viewer screen by hiding the second pointer on the viewer screen.
  • In one embodiment, when the control unit determines that the position of the first pointer has not changed for a certain period of time, the control unit hides the second pointer on the viewer screen even while the first pointer is in the first area.
  • In one embodiment, the input device is a pointing device, and the control unit displays a first pointer movable by the user operation on the operator screen.
  • While the first pointer is in the first area, the control unit reflects the display of the user operation on the viewer screen by displaying, on the viewer screen, a second pointer that moves in accordance with the movement of the first pointer.
  • While the first pointer is in the second area, the control unit stops reflecting the display of the user operation on the viewer screen by displaying the second pointer on the viewer screen with its position fixed.
  • In one embodiment, when the control unit determines that the position of the first pointer has not changed for a certain period of time, the control unit hides the second pointer on the viewer screen even while the first pointer is in the first area.
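  • The pointer behavior in the preceding items can be sketched as follows. This is a minimal illustration only, not the claimed implementation; the class names, the rectangle hit test, and the idle timeout value are assumptions.

```python
from dataclasses import dataclass

IDLE_TIMEOUT = 3.0  # hypothetical "certain period of time", in seconds

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class PointerMirror:
    """Mirrors the first pointer (operator screen) as a second pointer on the
    viewer screen, covering the hide variant and the fixed-position variant."""

    def __init__(self, first_area: Rect, freeze_instead_of_hide: bool = False):
        self.first_area = first_area
        self.freeze = freeze_instead_of_hide
        self.second_pointer = None       # None means hidden on the viewer screen
        self.last_move_time = 0.0
        self.last_pos_in_first_area = None

    def on_first_pointer_moved(self, x: float, y: float, now: float) -> None:
        self.last_move_time = now
        if self.first_area.contains(x, y):
            # Reflect the user operation: the second pointer follows the first.
            self.second_pointer = (x, y)
            self.last_pos_in_first_area = (x, y)
        elif self.freeze:
            # Fixed-position variant: keep the second pointer at the position
            # just before the first pointer left the first area.
            self.second_pointer = self.last_pos_in_first_area
        else:
            # Hide variant: remove the second pointer from the viewer screen.
            self.second_pointer = None

    def on_tick(self, now: float) -> None:
        # Hide the second pointer when the first pointer has been idle for a
        # certain period, even while it is inside the first area.
        if now - self.last_move_time >= IDLE_TIMEOUT:
            self.second_pointer = None
```

For example, with `PointerMirror(Rect(0, 0, 800, 600))`, a move to (100, 100) mirrors the pointer, a move to (900, 100) hides it, and three idle seconds also hide it.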
  • In one embodiment, the second area includes a floating area that is movable using the input device and is allowed to overlap the first area. When the floating area overlaps at least a part of the surgical support image on the operator screen, the control unit hides that part of the surgical support image on the operator screen while continuing to display it on the viewer screen.
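  • The floating-area rule in this item amounts to clipping the overlapped part of the support image on the operator screen only. A minimal sketch, assuming simple (x, y, w, h) rectangles that are not part of the original disclosure:

```python
def intersect(a, b):
    """Return the overlap of rectangles a and b as (x, y, w, h), or None."""
    x, y = max(a[0], b[0]), max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    if x2 <= x or y2 <= y:
        return None
    return (x, y, x2 - x, y2 - y)

def occlusion(support_image_rect, floating_rect):
    """The region hidden on the operator screen; the viewer screen hides nothing."""
    return {
        "operator_hidden": intersect(support_image_rect, floating_rect),
        "viewer_hidden": None,  # the viewer always sees the full support image
    }
```

For instance, `occlusion((0, 0, 600, 600), (400, 400, 300, 300))` reports the region (400, 400, 200, 200) as hidden for the operator only.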
  • In one embodiment, the control unit separately renders the image displayed in the first area as the surgical support image and the image displayed on the viewer screen, based on the same information obtained by the sensing.
  • An image processing system as one aspect of the present disclosure includes the image processing device, the first display, and the second display.
  • the second display is installed in a radiation room where surgery is performed, and the first display is installed in a separate room where operations using the input device are performed.
  • An image display method as one aspect of the present disclosure includes displaying, on a first display, an operator screen including a first area that displays a surgical support image including a living tissue image formed based on information obtained by sensing the structure of living tissue in the body of a surgical subject, and a second area that displays at least one operating element for controlling the display of the surgical support image in the first area.
  • The image display method further includes displaying, on a second display different from the first display, a viewer screen that displays the surgical support image in accordance with its display in the first area, accepting a user operation performed on the operator screen using an input device, reflecting the display of the user operation on the viewer screen when the user operation is performed in the first area, and not reflecting the display of the user operation on the viewer screen when the user operation is performed in the second area.
  • In one embodiment, the image display method includes displaying, on the viewer screen, notification information indicating that the user operation is being performed in the second area when the user operation is performed in the second area.
  • An image processing program as one aspect of the present disclosure causes a computer to display, on a first display, an operator screen including a first area that displays a surgical support image including a living tissue image formed based on information obtained by sensing the structure of living tissue in the body of a surgical subject, and a second area that displays at least one operating element for controlling the display of the surgical support image in the first area.
  • The image processing program further causes the computer to display, on a second display different from the first display, a viewer screen that displays the surgical support image in accordance with its display in the first area, to accept a user operation performed on the operator screen using an input device, and to execute a process of reflecting the display of the user operation on the viewer screen when the user operation is performed in the first area and not reflecting the display of the user operation on the viewer screen when the user operation is performed in the second area.
  • FIG. 1 is a block diagram showing the configuration of an image processing system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating an example of use of a first display and a second display according to an embodiment of the present disclosure.
  • FIG. 3 is a diagram illustrating an example of an operator screen displayed on a first display by the image processing device according to an embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of a viewer screen displayed on a second display by the image processing device according to the embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of an operator screen displayed on a first display by the image processing device according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an example of a viewer screen displayed on a second display by the image processing device according to the embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating an example of an operator screen displayed on a first display by the image processing device according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating an example of a viewer screen displayed on a second display by the image processing device according to the embodiment of the present disclosure.
  • FIG. 1 is a block diagram showing the configuration of an image processing device according to an embodiment of the present disclosure.
  • 3 is a flowchart showing the operation of the image processing device according to the embodiment of the present disclosure.
  • the image processing system 10 includes an image processing device 20, a first display 30, a second display 40 different from the first display 30, and an input device 50.
  • the image processing device 20 is connected to the first display 30, the second display 40, and the input device 50 via a cable or a network, or wirelessly.
  • the image processing device 20 is, for example, a general-purpose computer such as a PC, a server computer such as a cloud server, or a dedicated computer. "PC" is an abbreviation for personal computer.
  • the image processing device 20 may be installed in a medical facility such as a hospital, or may be installed in a facility other than the medical facility such as a data center.
  • the first display 30 and the second display 40 are, for example, LCDs or organic EL displays.
  • LCD is an abbreviation for liquid crystal display.
  • EL is an abbreviation for electro luminescent.
  • the screen size of the first display 30 may be any screen size, but in this embodiment it is NTSC. That is, the screen ratio of the first display 30 may be any screen ratio, but in this embodiment, it is 4:3.
  • the screen size of the second display 40 may be any screen size, but in this embodiment it is HD. That is, the screen ratio of the second display 40 may be any screen ratio, but in this embodiment it is 16:9.
  • the input device 50 is, for example, a pointing device such as a mouse, a keyboard, or a touch screen provided integrally with the first display 30.
  • the second display 40 is installed in the radiation room 11 of a medical facility.
  • the first display 30 and the input device 50 are installed in a separate room 12 of the medical facility.
  • the radiation room 11 is visible from a separate room 12 through a window 13.
  • In the radiation room 11, a surgeon 14 such as a doctor performs catheter treatment.
  • In the separate room 12, an operator 15 such as a clinical engineer performs operations using the input device 50.
  • the first display 30 and the second display 40 display surgical support images 16 such as IVUS images, OFDI images, OCT images, CT images, MRI images, echo images, or X-ray images.
  • IVUS is an abbreviation for intravascular ultrasound.
  • OFDI is an abbreviation for optical frequency domain imaging.
  • OCT is an abbreviation for optical coherence tomography.
  • CT is an abbreviation for computed tomography.
  • MRI is an abbreviation for magnetic resonance imaging.
  • The image processing device 20 displays an operator screen 31 as shown in FIGS. 3, 5, 7, and 9 on the first display 30, and displays a viewer screen 41 as shown in FIGS. 4, 6, 8, and 10 on the second display 40.
  • the operator screen 31 includes a first area 32 that displays the surgical support image 16 and a second area 33 that displays at least one operating element for controlling the display of the surgical support image 16 in the first area 32.
  • the surgical support image 16 includes a living tissue image 17 that is an image formed based on information obtained by sensing the structure of living tissue within the body of the person undergoing surgery.
  • the viewer screen 41 displays the surgical support image 16 in accordance with the display of the surgical support image 16 in the first area 32 .
  • the image processing device 20 receives user operations performed on the operator screen 31 using the input device 50 .
  • When a user operation is performed in the first area 32, the image processing device 20 reflects the display of the user operation on the viewer screen 41. When a user operation is performed in the second area 33, the image processing device 20 does not reflect the display of the user operation on the viewer screen 41.
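  • The behavior of the image processing device 20 described here, together with the notification information 42 described later, can be sketched as a small dispatcher. This is an illustrative sketch under assumed names (`OperationRouter`, `Rect`), not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class OperationRouter:
    """Reflects operations in the first area on the viewer screen; operations
    in the second area are not mirrored but announced as notification info."""

    def __init__(self, first_area: Rect, second_area: Rect):
        self.first_area = first_area
        self.second_area = second_area
        self.viewer_events = []   # operations mirrored to the viewer screen
        self.notification = None  # notification information shown instead

    def on_user_operation(self, x: float, y: float, element: str = None) -> None:
        if self.first_area.contains(x, y):
            # Reflect the display of the user operation on the viewer screen.
            self.viewer_events.append((x, y))
        elif self.second_area.contains(x, y):
            # Do not mirror the operation itself; only announce which
            # operating element is being operated (e.g. the rotation button).
            self.notification = element or "operation in second area"
```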
  • Therefore, the surgical support image 16 displayed in the first area 32 of the operator 15 side screen is shown on the surgeon 14 side monitor, while the operating elements displayed in the second area 33 of the operator 15 side screen can be hidden.
  • The surgical support image 16 is important information for the surgeon 14, but the operating elements are unnecessary information for the surgeon 14. That is, according to this embodiment, only clinically important information can be displayed on the surgeon 14 side monitor.
  • the surgeon 14 can confirm at least what kind of operation is being performed in the first area 32 of the screen on the operator 15 side. As a result, the surgeon 14 can communicate smoothly with the operator 15. That is, according to this embodiment, communication between the surgeon 14 and the operator 15 can be facilitated.
  • the image processing device 20 displays a first pointer 51 that can be moved by user operation on the operator screen 31.
  • When the first pointer 51 is in the first area 32, the image processing device 20 displays, on the viewer screen 41, a second pointer 52 that moves in accordance with the movement of the first pointer 51, thereby reflecting the display of user operations on the viewer screen 41.
  • the image processing device 20 hides the second pointer 52 on the viewer screen 41 when the first pointer 51 is in the second area 33, thereby preventing the display of the user's operation from being reflected on the viewer screen 41. Stop.
  • In the example of FIG. 3, the first pointer 51 is located in the second area 33. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, the second pointer 52 is hidden on the viewer screen 41.
  • In the example of FIG. 5, the first pointer 51 is located in the first area 32. Therefore, in the example of FIG. 6 corresponding to the example of FIG. 5, the second pointer 52 is displayed on the viewer screen 41 so as to move in accordance with the movement of the first pointer 51.
  • Alternatively, the image processing device 20 may stop reflecting the display of user operations on the viewer screen 41 by displaying the second pointer 52 at a fixed position on the viewer screen 41 instead of hiding it.
  • In the example of FIG. 7, the first pointer 51 is located in the second area 33. Therefore, in the example of FIG. 8 corresponding to the example of FIG. 7, the second pointer 52 is displayed on the viewer screen 41 with its position fixed. The second pointer 52 is fixed at the position on the viewer screen 41 corresponding to the position immediately before the first pointer 51 left the first area 32. Therefore, the surgeon 14 can confirm at least in which direction the first pointer 51 has been moved on the operator 15 side screen.
  • For example, if the surgeon 14 knows that the second area 33 is adjacent to the right side of the first area 32 on the operator screen 31 and that the setting menu 35 is arranged at the bottom of the second area 33, then when the second pointer 52 is fixed at the lower right corner of the viewer screen 41, the surgeon 14 can infer that the operator 15 is operating the setting menu 35.
  • In the second area 33, a 2D button 34a, a 2D/3D button 34b, an enlargement button 34c, a reduction button 34d, and a rotation button 34e are displayed as operating elements.
  • the 2D button 34a is an operation element for selecting a mode in which a two-dimensional image is displayed in the first area 32 as the surgical support image 16. For example, by clicking the 2D button 34a, an IVUS image can be displayed in the first area 32 as the surgical support image 16.
  • An IVUS image is a living tissue image 17 that includes a two-dimensional image representing a cross-sectional structure of living tissue, such as a blood vessel or the heart of the person undergoing surgery, formed based on information obtained by transmitting ultrasonic waves from an ultrasonic transducer installed at the tip of a catheter inserted into the body, receiving the reflected waves, and processing the reflected-wave signals.
  • The 2D/3D button 34b is an operation element for selecting a mode in which a two-dimensional image and a three-dimensional image are displayed in the first area 32 as the surgical support image 16. For example, by clicking the 2D/3D button 34b, the latest IVUS image and a three-dimensional image obtained by stacking multiple IVUS images can be displayed in the first area 32 as the surgical support image 16.
  • This three-dimensional image is generated as a living tissue image 17 by sequentially imaging the cross-sectional structure of the patient's living tissue while moving the ultrasonic transducer with a pull-back operation, and stacking the resulting plurality of two-dimensional images into a three-dimensional image.
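  • The stacking step can be shown in a few lines. The frame size and count below are hypothetical; the point is only that a pull-back sequence of 2D cross sections becomes a depth-indexed 3D volume.

```python
import numpy as np

# Hypothetical: each pull-back step yields one 2D cross section (H x W).
frames = [np.zeros((4, 4), dtype=np.uint8) for _ in range(10)]

# Stacking along a new leading axis gives a 3D volume (depth x H x W),
# which can then be rendered as the three-dimensional living tissue image.
volume = np.stack(frames, axis=0)
print(volume.shape)  # (10, 4, 4)
```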
  • the enlarge button 34c is an operation element for enlarging and displaying the surgical support image 16 in the first region 32. For example, by clicking the enlarge button 34c once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be enlarged and displayed.
  • the reduction button 34d is an operation element for displaying the surgical support image 16 in a reduced size in the first area 32. For example, by clicking the reduction button 34d once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be displayed in a reduced size.
  • the rotation button 34e is an operation element for rotating the surgical support image 16 in the first region 32. For example, by clicking the rotation button 34e once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be rotated 90 degrees clockwise.
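  • The 90-degree clockwise rotation performed by the rotation button 34e can be expressed compactly; the helper below is a hypothetical illustration operating on an image stored as a list of rows.

```python
def rotate_90_clockwise(image):
    """Rotate a 2D image (list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

frame = [[1, 2],
         [3, 4]]
print(rotate_90_clockwise(frame))  # [[3, 1], [4, 2]]
```

Clicking the button four times returns the image to its original orientation, since four 90-degree rotations compose to the identity.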
  • When the user operation is performed in the second area 33, the image processing device 20 displays, on the viewer screen 41, notification information 42 indicating that the user operation is being performed in the second area 33.
  • In the example of FIG. 3, the rotation button 34e is clicked in the second area 33. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, an icon indicating that the rotation button 34e has been clicked in the second area 33 is displayed on the viewer screen 41 as the notification information 42.
  • In the example of FIG. 9, while the IVUS image is displayed in the first area 32 as the surgical support image 16, a calibration operation is performed in the floating area 37 included in the second area 33. Therefore, in the example of FIG. 10 corresponding to the example of FIG. 9, a label indicating that a calibration operation is being performed in the second area 33 is highlighted on the viewer screen 41 as the notification information 42.
  • the image processing device 20 displays a setting menu 35 on the operator screen 31.
  • the image processing device 20 displays the setting menu 35 only in the second area 33 and does not display it on the viewer screen 41.
  • Alternatively, the image processing device 20 may display the setting menu 35 not only in the second area 33 but also on the viewer screen 41.
  • Setting menu 35 includes two or more setting items 36.
  • When the image processing device 20 receives an operation of the one or more operating elements as a user operation, it displays information indicating the selected setting item on the viewer screen 41 as notification information 42.
  • In the example of FIG. 5, the setting menu 35 is closed.
  • When the setting menu 35 is opened and two or more setting items 36 are displayed, the setting item corresponding to the calibration operation is clicked. Therefore, in the example of FIG. 9, a window for performing the calibration operation is displayed in the floating area 37 included in the second area 33 as an operating element corresponding to the clicked setting item.
  • In the example of FIG. 9, the window displayed in the floating area 37 is being operated. Therefore, in the example of FIG. 10 corresponding to the example of FIG. 9, a label indicating that the calibration operation has been selected is highlighted on the viewer screen 41 as the notification information 42.
  • the image processing device 20 displays two or more operation elements in the second area 33.
  • When the image processing device 20 receives an operation of any one of the two or more operating elements as a user operation, it displays information indicating the operated element on the viewer screen 41 as notification information 42.
  • In the example of FIG. 3, in which the enlargement button 34c, the reduction button 34d, and the rotation button 34e are displayed in the second area 33, the rotation button 34e is clicked. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, an icon indicating that the rotation button 34e has been clicked is displayed on the viewer screen 41 as the notification information 42.
  • the floating area 37 is an area that can be moved using the input device 50 and is allowed to overlap the first area 32 .
  • When the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the image processing device 20 hides that portion of the surgical support image 16 on the operator screen 31 and continues to display it on the viewer screen 41.
  • In the example of FIG. 9, the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, so that portion of the surgical support image 16 is hidden on the operator screen 31. In the example of FIG. 10, however, that portion of the surgical support image 16 remains displayed on the viewer screen 41.
  • the image processing device 20 includes a control section 21, a storage section 22, a communication section 23, an input section 24, and an output section 25.
  • the control unit 21 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof.
  • the processor is a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for specific processing.
  • CPU is an abbreviation for central processing unit.
  • GPU is an abbreviation for graphics processing unit.
  • the programmable circuit is, for example, an FPGA.
  • FPGA is an abbreviation for field-programmable gate array.
  • the dedicated circuit is, for example, an ASIC.
  • ASIC is an abbreviation for application specific integrated circuit.
  • the control unit 21 executes processing related to the operation of the image processing device 20 while controlling each part of the image processing system 10 including the image processing device 20.
  • the storage unit 22 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof.
  • the semiconductor memory is, for example, RAM, ROM, or flash memory.
  • RAM is an abbreviation for random access memory.
  • ROM is an abbreviation for read only memory.
  • the RAM is, for example, SRAM or DRAM.
  • SRAM is an abbreviation for static random access memory.
  • DRAM is an abbreviation for dynamic random access memory.
  • the ROM is, for example, an EEPROM.
  • EEPROM is an abbreviation for electrically erasable programmable read only memory.
  • the flash memory is, for example, an SSD.
  • SSD is an abbreviation for solid-state drive.
  • the magnetic memory is, for example, an HDD.
  • HDD is an abbreviation for hard disk drive.
  • the storage unit 22 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory.
  • the storage unit 22 stores data used for the operation of the image processing device 20 and data obtained by the operation of the image processing device 20.
  • the communication unit 23 includes at least one communication interface.
  • the communication interface is, for example, an interface compatible with a wireless LAN communication standard such as IEEE802.11, or an interface compatible with a wired LAN communication standard such as Ethernet (registered trademark). "IEEE” is an abbreviation for Institute of Electrical and Electronics Engineers.
  • the communication unit 23 receives data used for the operation of the image processing device 20 and transmits data obtained by the operation of the image processing device 20.
  • the input unit 24 includes at least one input interface.
  • the input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
  • USB is an abbreviation for Universal Serial Bus.
  • HDMI is an abbreviation for High-Definition Multimedia Interface.
  • the output unit 25 includes at least one output interface.
  • the output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark).
  • the output unit 25 outputs data obtained by the operation of the image processing device 20.
  • the output unit 25 is connected to the first display 30 and the second display 40.
  • the functions of the image processing device 20 are realized by executing the image processing program according to the present embodiment by a processor serving as the control unit 21. That is, the functions of the image processing device 20 are realized by software.
  • the image processing program causes the computer to function as the image processing device 20 by causing the computer to execute the operations of the image processing device 20 . That is, the computer functions as the image processing device 20 by executing the operations of the image processing device 20 according to the image processing program.
  • the program may be stored on a non-transitory computer-readable medium.
  • the non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM.
  • Distribution of the program is performed, for example, by selling, transferring, or lending a portable medium such as an SD card, DVD, or CD-ROM that stores the program.
  • SD is an abbreviation for Secure Digital.
  • DVD is an abbreviation for digital versatile disc.
  • CD-ROM is an abbreviation for compact disc read only memory.
  • the program may be distributed by storing the program in the storage of a server and transferring the program from the server to another computer.
  • the program may be provided as a program product.
  • a computer first stores, temporarily in its main storage device, a program recorded on a portable medium or a program transferred from a server. The processor of the computer then reads the program from the main storage device and executes processing according to the read program.
  • a computer may read a program directly from a portable medium and execute processing according to the program, or may sequentially execute processing according to a received program each time the program is transferred to the computer from the server. Processing may also be performed by a so-called ASP-type service that implements functions only through execution instructions and acquisition of results, without transferring the program from the server to the computer. "ASP" is an abbreviation for application service provider.
  • the program includes information that is equivalent to a program and is used for processing by an electronic computer. For example, data that is not a direct command to a computer but has the property of defining computer processing falls under "something equivalent to a program."
  • a part or all of the functions of the image processing device 20 may be realized by a programmable circuit or a dedicated circuit as the control unit 21. That is, some or all of the functions of the image processing device 20 may be realized by hardware.
  • the operation of the image processing device 20 according to this embodiment will be described with reference to FIG. 12. This operation corresponds to the image display method according to this embodiment.
  • in step S1, the control unit 21 receives a user operation performed on the operator screen 31 using the input device 50.
  • the operator screen 31 is displayed on the first display 30.
  • a surgical support image 16 including a biological tissue image 17 is displayed in the first area 32 of the operator screen 31.
  • at least one operating element for controlling the display of the surgical support image 16 in the first area 32 is displayed in the second area 33 of the operator screen 31.
  • the control unit 21 receives user operations via the input unit 24.
  • as a user operation, the control unit 21 accepts an operation of using the mouse to move the first pointer 51 to the rotation button 34e among the two or more operation elements displayed in the second area 33 and clicking the rotation button 34e with the mouse.
  • as a user operation, the control unit 21 accepts an operation of using the mouse to move the first pointer 51 onto the surgical support image 16 displayed in the first area 32 and clicking the mouse to mark a position on the surgical support image 16.
  • the control unit 21 displays a setting menu 35 including two or more setting items 36 on the operator screen 31 via the output unit 25.
  • as an operation of selecting any one of the two or more setting items 36 from the setting menu 35 using the input device 50, the control unit 21 accepts an operation of using the mouse to move the first pointer 51 to the setting menu 35 displayed in the second area 33, opening the setting menu 35, and clicking, with the mouse, the setting item corresponding to the calibration operation among the two or more setting items 36.
  • as one or more operation elements corresponding to the selected setting item, the control unit 21 displays, in the floating area 37 of the second area 33, a window including a group of GUI components for performing a calibration operation.
  • GUI is an abbreviation for graphical user interface.
  • in the illustrated example, the control unit 21 accepts, as a user operation, a calibration operation performed by using the mouse to move the first pointer 51 to the floating area 37 of the second area 33 and operating, with the mouse, the GUI components in the window displayed in the floating area 37. Because the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the control unit 21 hides that portion of the surgical support image 16 on the operator screen 31.
  • when a user operation is performed in the first area 32, the process of step S2 is executed.
  • when a user operation is performed in the second area 33, the processes of step S3 and step S4 are executed.
  • in step S2, the control unit 21 reflects the display of the user operation on the viewer screen 41.
  • a viewer screen 41 is displayed on the second display 40.
  • the viewer screen 41 displays the surgical support image 16 in accordance with the display of the surgical support image 16 in the first area 32 .
  • control unit 21 reflects the display of the user operation on the viewer screen 41 via the output unit 25.
  • the control unit 21 reflects the display of the user operation on the viewer screen 41 by displaying the second pointer 52 on the viewer screen 41 so as to move in accordance with the movement of the first pointer 51.
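The mirroring of the first pointer by the second pointer amounts to a coordinate mapping between two differently sized screens. The following is a minimal sketch of such a mapping; the function and parameter names are illustrative assumptions, not part of the disclosed embodiment:

```python
def map_pointer(p, first_area, viewer):
    """Map a pointer position inside the operator screen's first area
    to the corresponding position on the viewer screen.

    p          -- (x, y) of the first pointer, in operator-screen pixels
    first_area -- (x, y, width, height) of the first area
    viewer     -- (width, height) of the viewer screen
    Returns the (x, y) of the second pointer, or None when the first
    pointer is outside the first area (mirroring is then stopped).
    """
    ax, ay, aw, ah = first_area
    px, py = p
    if not (ax <= px < ax + aw and ay <= py < ay + ah):
        return None  # pointer is in the second area: hide the second pointer
    # Normalize within the first area, then scale to the viewer screen.
    u = (px - ax) / aw
    v = (py - ay) / ah
    vw, vh = viewer
    return (u * vw, v * vh)
```

For example, the center of a 4:3 first area maps to the center of a 16:9 viewer screen, regardless of the differing screen ratios.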
  • in step S3, the control unit 21 stops reflecting the display of the user operation on the viewer screen 41.
  • the control unit 21 stops reflecting the display of the user operation on the viewer screen 41 via the output unit 25.
  • the control unit 21 stops reflecting the display of the user operation on the viewer screen 41 by hiding the second pointer 52 on the viewer screen 41.
  • in the example of FIG. 8, which corresponds to the operator-screen example described above, the control unit 21 stops the reflection by hiding the second pointer 52 on the viewer screen 41.
  • in step S4, the control unit 21 displays the notification information 42 on the viewer screen 41.
  • the control unit 21 displays the notification information 42 on the viewer screen 41 via the output unit 25.
  • the control unit 21 displays, as notification information 42, an icon indicating that the rotation button 34e has been clicked on the viewer screen 41. That is, the control unit 21 displays information indicating the operating element being operated on the viewer screen 41 as the notification information 42.
  • the control unit 21 highlights, as the notification information 42, a label indicating that the calibration operation has been selected on the viewer screen 41.
  • the control unit 21 displays, at the top of the viewer screen 41, a group of labels corresponding to the two or more setting items 36 included in the setting menu 35, and displays the label corresponding to the selected setting item in a color tone different from that of the other labels. That is, the control unit 21 displays, as the notification information 42, information indicating the selected setting item on the viewer screen 41. Although the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the control unit 21 continues to display that portion of the surgical support image 16 on the viewer screen 41.
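The handling of steps S1 to S4 above can be summarized as a dispatch on the area in which the user operation occurred. A hedged sketch of that dispatch (all names are illustrative, not from the embodiment):

```python
def handle_user_operation(region, operation, viewer):
    """Dispatch a received user operation (step S1).

    region    -- "first" or "second", the area where the operation occurred
    operation -- a short label for the operation (e.g. an element name)
    viewer    -- mutable dict standing in for the viewer-screen state
    """
    if region == "first":
        # Step S2: mirror the operation's display on the viewer screen.
        viewer["mirrored"] = True
        viewer["notification"] = None
    else:
        # Step S3: stop mirroring; step S4: show notification information
        # (e.g. an icon for the operated element or the selected item).
        viewer["mirrored"] = False
        viewer["notification"] = operation
    return viewer
```

The surgeon-side screen thus never shows second-area interaction directly, only a compact notification of it.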
  • as a method for drawing the surgical support image 16 on both the operator screen 31 and the viewer screen 41, it is conceivable, for example, to store information obtained by sensing in the storage unit 22 as 2D memory information, set the positions of the light source and the camera based on this 2D memory information, perform rendering for the operator screen 31, draw the two-dimensional image generated as the surgical support image 16 in the first area 32 of the operator screen 31, and then draw the same image, with its scale adjusted, on the viewer screen 41 as well. In this embodiment, however, the screen ratios of the first area 32 of the operator screen 31 and the viewer screen 41 differ. It is therefore desirable to render the operator screen 31 and the viewer screen 41 separately.
  • specifically, it is conceivable to set the positions of the light source and the camera based on the 2D memory information, perform rendering for the operator screen 31 according to its screen ratio and draw the resulting two-dimensional image as the surgical support image 16 in the first area 32 of the operator screen 31, and separately perform rendering for the viewer screen 41 according to its screen ratio and draw another two-dimensional image generated as the surgical support image 16 on the viewer screen 41. That is, the control unit 21 of the image processing device 20 may separately render the image displayed in the first area 32 and the image displayed on the viewer screen 41 as the surgical support image 16, based on the same information obtained by sensing. Each rendering is repeated at regular intervals, so if the operator 15 changes the position of the light source or the camera, the change is automatically reflected on both the operator screen 31 and the viewer screen 41.
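Rendering the first area 32 and the viewer screen 41 separately, according to their respective screen ratios, essentially means building a separate projection for each render target from the same scene data. As an illustrative sketch using a standard perspective projection (none of these names come from the embodiment):

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """Standard perspective projection matrix (row-major, OpenGL-style).
    aspect = width / height of the render target."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# Same camera and scene data, but one projection per screen ratio.
proj_operator_area = perspective(45.0, 4 / 3, 0.1, 100.0)   # 4:3 first area
proj_viewer_screen = perspective(45.0, 16 / 9, 0.1, 100.0)  # 16:9 viewer screen
```

The vertical field of view stays the same for both targets, while the horizontal scale differs, so neither image is stretched or compressed.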
  • the case where a three-dimensional image is displayed as the surgical support image 16 is similar to the case where a two-dimensional image is displayed. That is, as a method for drawing the surgical support image 16 on both the operator screen 31 and the viewer screen 41, it is conceivable to store 3D space information in the storage unit 22, set the positions of the light source and the camera based on this 3D space information, perform volume rendering for the operator screen 31, draw the three-dimensional image generated as the surgical support image 16 in the first area 32 of the operator screen 31, and then draw the same image, with its scale adjusted, on the viewer screen 41 as well. In this embodiment, however, the screen ratios of the first area 32 of the operator screen 31 and the viewer screen 41 differ.
  • a more desirable method is therefore to set the positions of the light source and the camera based on the 3D space information, perform volume rendering for the operator screen 31 according to its screen ratio and draw the three-dimensional image generated as the surgical support image 16 in the first area 32 of the operator screen 31, and separately perform volume rendering for the viewer screen 41 according to its screen ratio and draw another three-dimensional image generated as the surgical support image 16 on the viewer screen 41.
  • Each volume rendering is repeated at regular intervals, so if the operator 15 changes the position of the light source or camera, the change is automatically reflected on both the operator screen 31 and the viewer screen 41.
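The repetition of each rendering at regular intervals can be pictured as a fixed-interval loop over both render targets, so that a light-source or camera change made by the operator 15 is picked up on the next tick. A sketch under assumed names:

```python
import time

def render_loop(scene, targets, render, interval_s=1 / 30, ticks=3):
    """Re-render every target from the shared scene state each tick.

    scene   -- shared state (e.g. camera/light positions set by the operator)
    targets -- render targets, e.g. ["operator", "viewer"]
    render  -- callback render(scene, target) producing one frame
    """
    frames = []
    for _ in range(ticks):
        for target in targets:   # both screens use the same scene state
            frames.append(render(scene, target))
        time.sleep(interval_s)   # repeat at a regular interval
    return frames

# Toy usage: two ticks over both screens with a shared camera position.
frames = render_loop({"camera": (0, 0, 5)}, ["operator", "viewer"],
                     lambda scene, target: (target, scene["camera"]),
                     interval_s=0.0, ticks=2)
```

Because each frame is derived from the shared scene state, neither screen needs to be told explicitly about a change; it simply re-renders from the updated state.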
  • information unnecessary for the surgeon 14, such as the operation elements for controlling the display of the surgical support image 16, is displayed only on the operator screen 31, and such unnecessary information can be omitted from the viewer screen 41. The visibility of the viewer screen 41 is therefore improved.
  • when the mouse cursor position on the operator screen 31 is within the first area 32, which displays clinical information, the position is identified and the mouse cursor can be displayed at the corresponding position on the viewer screen 41. The operation status of the operator 15 can therefore be communicated to the surgeon 14.
  • communication between the surgeon 14 in the radiation room 11 and the operator 15 in the separate room 12 can be made smoother via the notification information 42 while minimizing the number of people exposed to radiation in the radiation room 11.
  • as a modification, even when the first pointer 51 is in the first area 32, the control unit 21 of the image processing device 20 may hide the second pointer 52 on the viewer screen 41 upon determining that the position of the first pointer 51 has not changed for a certain period of time. For example, if the first pointer 51 does not move in the first area 32 for three seconds, the control unit 21 may hide the second pointer 52 on the viewer screen 41. According to this modification, the second pointer 52 can be prevented from interfering with the surgeon 14 checking the structure of the living tissue.
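The idle rule of this modification can be sketched as a timestamp check against a monotonic clock; the class name and the three-second threshold constant are assumptions for illustration:

```python
import time

IDLE_HIDE_SECONDS = 3.0  # example threshold from the modification above

class PointerMirror:
    """Hide the second pointer when the first pointer stops moving."""

    def __init__(self, now=time.monotonic):
        self._now = now            # injectable clock, eases testing
        self._last_pos = None
        self._last_move = now()

    def update(self, pos):
        """Feed the first pointer's current position; returns True while
        the second pointer should stay visible."""
        if pos != self._last_pos:  # any movement resets the idle timer
            self._last_pos = pos
            self._last_move = self._now()
        return (self._now() - self._last_move) < IDLE_HIDE_SECONDS
```

Injecting the clock rather than calling `time.monotonic` directly makes the timeout behavior deterministic to test.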
  • when the control unit 21 determines that the position of the first pointer 51 has not changed for a certain period of time, it may hide not only the second pointer 52 on the viewer screen 41 but also the first pointer 51 on the operator screen 31.
  • 10 Image processing system, 11 Radiation room, 12 Separate room, 13 Window, 14 Surgeon, 15 Operator, 16 Surgical support image, 17 Biological tissue image, 20 Image processing device, 21 Control unit, 22 Storage unit, 23 Communication unit, 24 Input unit, 25 Output unit, 30 First display, 31 Operator screen, 32 First area, 33 Second area, 34a 2D button, 34b 2D/3D button, 34c Enlarge button, 34d Reduce button, 34e Rotation button, 35 Setting menu, 36 Setting item, 37 Floating area, 40 Second display, 41 Viewer screen, 42 Notification information, 50 Input device, 51 First pointer, 52 Second pointer

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This image processing device: causes a first display to display an operator screen that includes a first region for displaying a surgical assistance image including a biological tissue image which has been imaged on the basis of information obtained by sensing the structure of biological tissue in the body of a surgical subject, and a second region for displaying at least one operation element for controlling the display of the surgical assistance image in the first region; causes a second display, different from the first display, to display a viewer screen for displaying the surgical assistance image so as to match the display of the surgical assistance image in the first region; receives a user operation performed via the operator screen by using an input device; when the user operation is performed in the first region, mirrors the display of said user operation on the viewer screen; and when the user operation is performed in the second region, does not mirror the display of the user operation on the viewer screen.

Description

Image processing device, image processing system, image display method, and image processing program
The present disclosure relates to an image processing device, an image processing system, an image display method, and an image processing program.
Patent Document 1 discloses a diagnostic imaging apparatus that displays image information of a subject on a monitor for the operator and a monitor for the surgeon, and that displays, on the surgeon's monitor, image information independent of the image information displayed on the operator's monitor.
Japanese Patent Application Publication No. 2007-275506
There are various monitor screen sizes, such as HD, NTSC, WXGA, and CinemaScope. "HD" is an abbreviation for high definition. "NTSC" is an abbreviation for National Television System Committee. "WXGA" is an abbreviation for Wide Extended Graphics Array. The HD screen ratio is 16 in width to 9 in height, that is, 16:9. The NTSC screen ratio is 4 in width to 3 in height, that is, 4:3. The WXGA screen ratio is 16 in width to 10 in height, that is, 16:10. The CinemaScope screen ratio is 2.35 in width to 1 in height, that is, 2.35:1.
In rooms where surgeons perform catheter treatment, such as angiography rooms, a 4:3 screen size is often used. On the other hand, in the room where the operator operates the software, it is desirable to adopt a wider screen size, such as 16:9, which is suited to menu operations and detailed settings. However, if the operator-side monitor uses a wider screen size than the surgeon-side monitor, the drawing size changes, so displaying the operator-side screen on the surgeon-side monitor results in a vertically compressed screen. As a countermeasure to this problem, it is conceivable to display only information important to the surgeon on the surgeon-side monitor.
When the operator-side screen is displayed on the surgeon-side monitor, the surgeon can communicate smoothly with the operator while checking what operations are being performed on the operator-side screen. In contrast, when only clinically important information is displayed on the surgeon-side monitor, the surgeon cannot see what operations are being performed on the operator-side screen, and smooth communication with the operator may become difficult.
The purpose of the present disclosure is to facilitate communication between a surgeon and an operator.
An image processing device according to one aspect of the present disclosure causes a first display to display an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, the structure of biological tissue in the body of a surgical subject, and a second area that displays at least one operation element for controlling the display of the surgical support image in the first area, and causes a second display different from the first display to display a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area. The image processing device comprises a control unit that receives a user operation performed on the operator screen using an input device, reflects the display of the user operation on the viewer screen when the user operation is performed in the first area, and does not reflect the display of the user operation on the viewer screen when the user operation is performed in the second area.
In one embodiment, when the user operation is performed in the second area, the control unit displays, on the viewer screen, notification information indicating that the user operation is being performed in the second area.
In one embodiment, the control unit displays a setting menu including two or more setting items on the operator screen; upon receiving an operation of selecting any one of the two or more setting items from the setting menu using the input device, displays one or more operation elements corresponding to the selected setting item in the second area; and upon receiving an operation of the one or more operation elements as the user operation, displays information indicating the selected setting item on the viewer screen as the notification information.
In one embodiment, the control unit displays two or more operation elements in the second area as the at least one operation element, and upon receiving an operation of any one of the two or more operation elements as the user operation, displays information indicating the operation element being operated on the viewer screen as the notification information.
In one embodiment, the input device is a pointing device, and the control unit displays, on the operator screen, a first pointer movable by the user operation; when the first pointer is in the first area, reflects the display of the user operation on the viewer screen by displaying, on the viewer screen, a second pointer that moves in accordance with the movement of the first pointer; and when the first pointer is in the second area, stops reflecting the display of the user operation on the viewer screen by hiding the second pointer on the viewer screen.
In one embodiment, even when the first pointer is in the first area, the control unit hides the second pointer on the viewer screen upon determining that the position of the first pointer has not changed for a certain period of time.
In one embodiment, the input device is a pointing device, and the control unit displays, on the operator screen, a first pointer movable by the user operation; when the first pointer is in the first area, reflects the display of the user operation on the viewer screen by displaying, on the viewer screen, a second pointer that moves in accordance with the movement of the first pointer; and when the first pointer is in the second area, stops reflecting the display of the user operation on the viewer screen by displaying the second pointer on the viewer screen with its position fixed.
In one embodiment, even when the first pointer is in the first area, the control unit hides the second pointer on the viewer screen upon determining that the position of the first pointer has not changed for a certain period of time.
In one embodiment, the second area includes a floating area that is movable using the input device and is allowed to overlap the first area, and when the floating area overlaps at least a portion of the surgical support image on the operator screen, the control unit hides that at least a portion of the surgical support image on the operator screen while continuing to display that at least a portion of the surgical support image on the viewer screen.
In one embodiment, the control unit separately renders the image displayed in the first area as the surgical support image and the image displayed on the viewer screen, based on the same information obtained by the sensing.
An image processing system according to one aspect of the present disclosure comprises the image processing device, the first display, and the second display.
In one embodiment, the second display is installed in a radiation room where surgery is performed, and the first display is installed in a separate room where operations using the input device are performed.
An image display method according to one aspect of the present disclosure displays, on a first display, an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, the structure of biological tissue in the body of a surgical subject, and a second area that displays at least one operation element for controlling the display of the surgical support image in the first area, and displays, on a second display different from the first display, a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area. The method receives a user operation performed on the operator screen using an input device, reflects the display of the user operation on the viewer screen when the user operation is performed in the first area, and does not reflect the display of the user operation on the viewer screen when the user operation is performed in the second area.
In one embodiment, when the user operation is performed in the second area, the image display method displays, on the viewer screen, notification information indicating that the user operation is being performed in the second area.
An image processing program according to one aspect of the present disclosure causes a computer, which causes a first display to display an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, the structure of biological tissue in the body of a surgical subject, and a second area that displays at least one operation element for controlling the display of the surgical support image in the first area, and causes a second display different from the first display to display a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area, to execute: a process of receiving a user operation performed on the operator screen using an input device; and a process of reflecting the display of the user operation on the viewer screen when the user operation is performed in the first area, and not reflecting the display of the user operation on the viewer screen when the user operation is performed in the second area.
According to the present disclosure, communication between the surgeon and the operator can be facilitated.
FIG. 1 is a block diagram showing the configuration of an image processing system according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating an example of use of a first display and a second display according to an embodiment of the present disclosure.
FIG. 3 is a diagram illustrating an example of an operator screen displayed on the first display by the image processing device according to an embodiment of the present disclosure.
FIG. 4 is a diagram illustrating an example of a viewer screen displayed on the second display by the image processing device according to an embodiment of the present disclosure.
FIG. 5 is a diagram illustrating an example of the operator screen displayed on the first display by the image processing device according to an embodiment of the present disclosure.
FIG. 6 is a diagram illustrating an example of the viewer screen displayed on the second display by the image processing device according to an embodiment of the present disclosure.
FIG. 7 is a diagram illustrating an example of the operator screen displayed on the first display by the image processing device according to an embodiment of the present disclosure.
FIG. 8 is a diagram illustrating an example of the viewer screen displayed on the second display by the image processing device according to an embodiment of the present disclosure.
FIG. 9 is a diagram illustrating an example of the operator screen displayed on the first display by the image processing device according to an embodiment of the present disclosure.
FIG. 10 is a diagram illustrating an example of the viewer screen displayed on the second display by the image processing device according to an embodiment of the present disclosure.
FIG. 11 is a block diagram showing the configuration of the image processing device according to an embodiment of the present disclosure.
FIG. 12 is a flowchart showing the operation of the image processing device according to an embodiment of the present disclosure.
 以下、本開示の一実施形態について、図を参照して説明する。 Hereinafter, one embodiment of the present disclosure will be described with reference to the drawings.
 各図中、同一又は相当する部分には、同一符号を付している。本実施形態の説明において、同一又は相当する部分については、説明を適宜省略又は簡略化する。 In each figure, the same or corresponding parts are given the same reference numerals. In the description of this embodiment, the description of the same or corresponding parts will be omitted or simplified as appropriate.
 図1を参照して、本実施形態に係る画像処理システム10の構成を説明する。 With reference to FIG. 1, the configuration of an image processing system 10 according to the present embodiment will be described.
 画像処理システム10は、画像処理装置20と、第1ディスプレイ30と、第1ディスプレイ30とは異なる第2ディスプレイ40と、入力機器50とを備える。画像処理装置20は、第1ディスプレイ30、第2ディスプレイ40、及び入力機器50とケーブル若しくはネットワークを介して、又は無線で接続される。 The image processing system 10 includes an image processing device 20, a first display 30, a second display 40 different from the first display 30, and an input device 50. The image processing device 20 is connected to the first display 30, the second display 40, and the input device 50 via a cable or a network, or wirelessly.
 画像処理装置20は、例えば、PCなどの汎用コンピュータ、クラウドサーバなどのサーバコンピュータ、又は専用コンピュータである。「PC」は、personal computerの略語である。画像処理装置20は、病院などの医療施設に設置されてもよいし、又はデータセンタなど、医療施設とは別の施設に設置されてもよい。 The image processing device 20 is, for example, a general-purpose computer such as a PC, a server computer such as a cloud server, or a dedicated computer. "PC" is an abbreviation for personal computer. The image processing device 20 may be installed in a medical facility such as a hospital, or may be installed in a facility other than the medical facility such as a data center.
 第1ディスプレイ30及び第2ディスプレイ40は、例えば、LCD又は有機ELディスプレイである。「LCD」は、liquid crystal displayの略語である。「EL」は、electro luminescentの略語である。第1ディスプレイ30のスクリーンサイズは、任意のスクリーンサイズでよいが、本実施形態ではNTSCである。すなわち、第1ディスプレイ30の画面比率は、任意の画面比率でよいが、本実施形態では4:3である。第2ディスプレイ40のスクリーンサイズは、任意のスクリーンサイズでよいが、本実施形態ではHDである。すなわち、第2ディスプレイ40の画面比率は、任意の画面比率でよいが、本実施形態では16:9である。入力機器50は、例えば、マウスなどのポインティングデバイス、キーボード、又は第1ディスプレイ30と一体的に設けられたタッチスクリーンである。図2に示すように、第2ディスプレイ40は、医療施設の放射線室11に設置される。第1ディスプレイ30及び入力機器50は、医療施設の別室12に設置される。放射線室11は、別室12から窓13を通して見えるようになっている。放射線室11では、医師などの術者14によって手術が実施される。別室12では、臨床工学技士などの操作者15によって、入力機器50を用いた操作が行われる。第1ディスプレイ30及び第2ディスプレイ40には、IVUS画像、OFDI画像、OCT画像、CT画像、MRI画像、エコー画像、又はX線画像などの手術支援画像16が表示される。「IVUS」は、intravascular ultrasoundの略語である。「OFDI」は、optical frequency domain imagingの略語である。「OCT」は、optical coherence tomographyの略語である。「CT」は、computed tomographyの略語である。「MRI」は、magnetic resonance imagingの略語である。 The first display 30 and the second display 40 are, for example, LCDs or organic EL displays. "LCD" is an abbreviation for liquid crystal display. "EL" is an abbreviation for electro luminescent. The screen size of the first display 30 may be any screen size, but in this embodiment it is NTSC. That is, the screen ratio of the first display 30 may be any screen ratio, but in this embodiment it is 4:3. The screen size of the second display 40 may be any screen size, but in this embodiment it is HD. That is, the screen ratio of the second display 40 may be any screen ratio, but in this embodiment it is 16:9. The input device 50 is, for example, a pointing device such as a mouse, a keyboard, or a touch screen provided integrally with the first display 30. As shown in FIG. 2, the second display 40 is installed in the radiation room 11 of a medical facility. The first display 30 and the input device 50 are installed in a separate room 12 of the medical facility. The radiation room 11 is visible from the separate room 12 through a window 13. In the radiation room 11, surgery is performed by a surgeon 14 such as a doctor. In the separate room 12, an operator 15 such as a clinical engineer performs operations using the input device 50. The first display 30 and the second display 40 display surgical support images 16 such as IVUS images, OFDI images, OCT images, CT images, MRI images, echo images, or X-ray images. "IVUS" is an abbreviation for intravascular ultrasound. "OFDI" is an abbreviation for optical frequency domain imaging. "OCT" is an abbreviation for optical coherence tomography. "CT" is an abbreviation for computed tomography. "MRI" is an abbreviation for magnetic resonance imaging.
 図3から図10を参照して、本実施形態の概要を説明する。 An overview of this embodiment will be described with reference to FIGS. 3 to 10.
 画像処理装置20は、図3、図5、図7、及び図9に示すような操作者画面31を第1ディスプレイ30に表示させるとともに、図4、図6、図8、及び図10に示すような閲覧者画面41を第2ディスプレイ40に表示させる。操作者画面31は、手術支援画像16を表示する第1領域32と、第1領域32における手術支援画像16の表示を制御するための少なくとも1つの操作要素を表示する第2領域33とを含む。手術支援画像16は、手術対象者の体内にある生体組織の構造をセンシングにより得られた情報を基に画像化した生体組織画像17を含む。閲覧者画面41は、第1領域32における手術支援画像16の表示に合わせて手術支援画像16を表示する。画像処理装置20は、入力機器50を用いて操作者画面31で行われるユーザ操作を受け付ける。画像処理装置20は、ユーザ操作が第1領域32で行われているときは、ユーザ操作の表示を閲覧者画面41に反映する。画像処理装置20は、ユーザ操作が第2領域33で行われているときは、ユーザ操作の表示を閲覧者画面41に反映しない。 The image processing device 20 causes the first display 30 to display an operator screen 31 as shown in FIGS. 3, 5, 7, and 9, and causes the second display 40 to display a viewer screen 41 as shown in FIGS. 4, 6, 8, and 10. The operator screen 31 includes a first area 32 that displays the surgical support image 16, and a second area 33 that displays at least one operating element for controlling the display of the surgical support image 16 in the first area 32. The surgical support image 16 includes a living tissue image 17 obtained by imaging the structure of living tissue in the body of the person undergoing surgery based on information obtained by sensing. The viewer screen 41 displays the surgical support image 16 in accordance with the display of the surgical support image 16 in the first area 32. The image processing device 20 receives user operations performed on the operator screen 31 using the input device 50. When a user operation is performed in the first area 32, the image processing device 20 reflects the display of the user operation on the viewer screen 41. When a user operation is performed in the second area 33, the image processing device 20 does not reflect the display of the user operation on the viewer screen 41.
 本実施形態では、術者14側のモニタに、操作者15側の画面の第1領域32に表示される手術支援画像16を表示させつつ、操作者15側の画面の第2領域33に表示される操作要素を非表示にすることができる。ここで、手術支援画像16は、術者14にとって重要な情報であるが、操作要素は、術者14にとっては不要な情報である。すなわち、本実施形態によれば、術者14側のモニタに臨床上重要な情報のみを表示させることができる。 In this embodiment, the surgical support image 16 displayed in the first area 32 of the screen on the operator 15 side can be displayed on the monitor on the surgeon 14 side, while the operation elements displayed in the second area 33 of the screen on the operator 15 side can be hidden. Here, the surgical support image 16 is important information for the surgeon 14, whereas the operation elements are information unnecessary for the surgeon 14. That is, according to this embodiment, only clinically important information can be displayed on the monitor on the surgeon 14 side.
 本実施形態では、ユーザ操作が第1領域32で行われているときは、術者14側のモニタにユーザ操作の表示が反映されるが、ユーザ操作が第2領域33で行われているときは、術者14側のモニタにはユーザ操作の表示が反映されない。よって、術者14は、少なくとも操作者15側の画面の第1領域32でどのような操作が行われているかを確認することができる。その結果、術者14は、操作者15と円滑にコミュニケーションをとることができる。すなわち、本実施形態によれば、術者14と操作者15との間のコミュニケーションを円滑にすることができる。 In this embodiment, when a user operation is performed in the first area 32, the display of the user operation is reflected on the monitor on the surgeon 14 side, but when a user operation is performed in the second area 33, the display of the user operation is not reflected on the monitor on the surgeon 14 side. Therefore, the surgeon 14 can confirm at least what kind of operation is being performed in the first area 32 of the screen on the operator 15 side. As a result, the surgeon 14 can communicate smoothly with the operator 15. That is, according to this embodiment, communication between the surgeon 14 and the operator 15 can be facilitated.
 入力機器50がポインティングデバイスである場合、画像処理装置20は、ユーザ操作により移動可能な第1ポインタ51を操作者画面31に表示する。画像処理装置20は、第1ポインタ51が第1領域32にあるときは、第1ポインタ51の移動に合わせて移動する第2ポインタ52を閲覧者画面41に表示することで、ユーザ操作の表示を閲覧者画面41に反映する。画像処理装置20は、第1ポインタ51が第2領域33にあるときは、閲覧者画面41において第2ポインタ52を非表示にすることで、閲覧者画面41へのユーザ操作の表示の反映を停止する。図3の例では、第1ポインタ51が第2領域33にある。そのため、図3の例に対応する図4の例では、閲覧者画面41において第2ポインタ52が非表示になっている。一方、図5の例では、第1ポインタ51が第1領域32にある。そのため、図5の例に対応する図6の例では、第2ポインタ52が第1ポインタ51の移動に合わせて移動するように閲覧者画面41に表示されている。 When the input device 50 is a pointing device, the image processing device 20 displays, on the operator screen 31, a first pointer 51 that can be moved by user operation. When the first pointer 51 is in the first area 32, the image processing device 20 reflects the display of the user operation on the viewer screen 41 by displaying on the viewer screen 41 a second pointer 52 that moves in accordance with the movement of the first pointer 51. When the first pointer 51 is in the second area 33, the image processing device 20 stops reflecting the display of the user operation on the viewer screen 41 by hiding the second pointer 52 on the viewer screen 41. In the example of FIG. 3, the first pointer 51 is in the second area 33. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, the second pointer 52 is hidden on the viewer screen 41. On the other hand, in the example of FIG. 5, the first pointer 51 is in the first area 32. Therefore, in the example of FIG. 6 corresponding to the example of FIG. 5, the second pointer 52 is displayed on the viewer screen 41 so as to move in accordance with the movement of the first pointer 51.
 画像処理装置20は、第1ポインタ51が第2領域33にあるときは、第2ポインタ52を非表示にする代わりに、第2ポインタ52を、位置を固定して閲覧者画面41に表示することで、閲覧者画面41へのユーザ操作の表示の反映を停止してもよい。図7の例では、第1ポインタ51が第2領域33にある。そのため、図7の例に対応する図8の例では、閲覧者画面41において第2ポインタ52が位置を固定して閲覧者画面41に表示されている。第2ポインタ52は、閲覧者画面41において、第1ポインタ51の第1領域32から出る直前の位置に対応する位置に固定されている。よって、術者14は、少なくとも操作者15側の画面でどの方向に第1ポインタ51が動かされたかを確認することができる。例えば、術者14は、操作者画面31において第2領域33が第1領域32の右側に隣接していて、設定メニュー35が第2領域33の下部に配置されていることを知っていたとすると、第2ポインタ52が閲覧者画面41の右下部にあることから、操作者15が設定メニュー35を操作していると推測することができる。 When the first pointer 51 is in the second area 33, the image processing device 20 may stop reflecting the display of the user operation on the viewer screen 41 by displaying the second pointer 52 at a fixed position on the viewer screen 41 instead of hiding it. In the example of FIG. 7, the first pointer 51 is in the second area 33. Therefore, in the example of FIG. 8 corresponding to the example of FIG. 7, the second pointer 52 is displayed at a fixed position on the viewer screen 41. The second pointer 52 is fixed at the position on the viewer screen 41 corresponding to the position of the first pointer 51 immediately before it left the first area 32. Therefore, the surgeon 14 can confirm at least in which direction the first pointer 51 has been moved on the screen on the operator 15 side. For example, if the surgeon 14 knows that the second area 33 is adjacent to the right side of the first area 32 on the operator screen 31 and that the setting menu 35 is arranged at the bottom of the second area 33, the surgeon 14 can infer, from the fact that the second pointer 52 is at the lower right of the viewer screen 41, that the operator 15 is operating the setting menu 35.
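 上記のポインタの表示制御は、例えば次のように整理できる。The pointer handling described above (follow the first pointer 51 inside the first area 32; hide the second pointer 52, or fix it at the last in-area position, while the first pointer 51 is in the second area 33) can be sketched as follows. This is an illustrative sketch only; the function name, parameter names, and return format are hypothetical and do not appear in the disclosure.

```python
# Illustrative sketch of the second-pointer behavior; names are hypothetical.

def mirror_pointer(pos, first_area, mode="hide", last_in_area=None):
    """Decide how the second pointer is shown on the viewer screen.

    pos:          (x, y) of the first pointer on the operator screen.
    first_area:   (x, y, width, height) of the first area.
    mode:         "hide" hides the second pointer while the first pointer
                  is outside the first area; "freeze" keeps it at the last
                  position observed inside the first area.
    last_in_area: last (x, y) seen inside the first area, if any.
    """
    x, y = pos
    ax, ay, aw, ah = first_area
    inside = ax <= x < ax + aw and ay <= y < ay + ah
    if inside:
        return {"visible": True, "pos": pos}  # follow the first pointer
    if mode == "freeze" and last_in_area is not None:
        return {"visible": True, "pos": last_in_area}  # fixed position
    return {"visible": False, "pos": None}  # hidden on the viewer screen
```

With `mode="hide"` this corresponds to the behavior of FIGS. 3 and 4, and with `mode="freeze"` to that of FIGS. 7 and 8.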
 第2領域33には、任意の操作要素が表示されてよいが、本実施形態では、2Dボタン34a、2D/3Dボタン34b、拡大ボタン34c、縮小ボタン34d、及び回転ボタン34eが表示される。 Although any operating elements may be displayed in the second area 33, in this embodiment, a 2D button 34a, a 2D/3D button 34b, an enlargement button 34c, a reduction button 34d, and a rotation button 34e are displayed.
 2Dボタン34aは、手術支援画像16として、2次元画像を第1領域32に表示するモードを選択するための操作要素である。例えば、2Dボタン34aをクリックすることで、手術支援画像16として、IVUS画像を第1領域32に表示することができる。IVUS画像は、生体組織画像17として、手術対象者の体内に挿入されたカテーテルの先端に設けられている超音波振動子から超音波を送信し、反射波を受信し、反射波の信号を処理することにより得られた情報を基に手術対象者の血管又は心臓などの生体組織の断面構造を画像化した2次元画像を含む。 The 2D button 34a is an operation element for selecting a mode in which a two-dimensional image is displayed in the first area 32 as the surgical support image 16. For example, by clicking the 2D button 34a, an IVUS image can be displayed in the first area 32 as the surgical support image 16. The IVUS image includes, as the living tissue image 17, a two-dimensional image obtained by imaging the cross-sectional structure of living tissue, such as a blood vessel or the heart of the person undergoing surgery, based on information obtained by transmitting ultrasound from an ultrasound transducer provided at the tip of a catheter inserted into the body of the person undergoing surgery, receiving the reflected waves, and processing the reflected wave signals.
 2D/3Dボタン34bは、手術支援画像16として、2次元画像及び3次元画像を第1領域32に表示するモードを選択するための操作要素である。例えば、2D/3Dボタン34bをクリックすることで、手術支援画像16として、最新のIVUS画像と、複数のIVUS画像を積層して3次元化した3次元画像を第1領域32に表示することができる。この3次元画像は、生体組織画像17として、プルバック操作によって超音波振動子を移動させながら、手術対象者の生体組織の断面構造を順次画像化して複数の2次元画像を生成し、これら複数の2次元画像を積層して3次元化した3次元画像を含む。 The 2D/3D button 34b is an operation element for selecting a mode in which a two-dimensional image and a three-dimensional image are displayed in the first area 32 as the surgical support image 16. For example, by clicking the 2D/3D button 34b, the latest IVUS image and a three-dimensional image obtained by stacking a plurality of IVUS images can be displayed in the first area 32 as the surgical support image 16. This three-dimensional image includes, as the living tissue image 17, a three-dimensional image obtained by sequentially imaging the cross-sectional structure of the living tissue of the person undergoing surgery while moving the ultrasound transducer by a pullback operation to generate a plurality of two-dimensional images, and stacking these two-dimensional images.
 拡大ボタン34cは、第1領域32において、手術支援画像16を拡大表示するための操作要素である。例えば、拡大ボタン34cを1回クリックすることで、手術支援画像16として第1領域32に表示されているIVUS画像を拡大表示することができる。 The enlarge button 34c is an operation element for enlarging and displaying the surgical support image 16 in the first region 32. For example, by clicking the enlarge button 34c once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be enlarged and displayed.
 縮小ボタン34dは、第1領域32において、手術支援画像16を縮小表示するための操作要素である。例えば、縮小ボタン34dを1回クリックすることで、手術支援画像16として第1領域32に表示されているIVUS画像を縮小表示することができる。 The reduction button 34d is an operation element for displaying the surgical support image 16 in a reduced size in the first area 32. For example, by clicking the reduction button 34d once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be displayed in a reduced size.
 回転ボタン34eは、第1領域32において、手術支援画像16を回転させるための操作要素である。例えば、回転ボタン34eを1回クリックすることで、手術支援画像16として第1領域32に表示されているIVUS画像を時計回りに90度回転させることができる。 The rotation button 34e is an operation element for rotating the surgical support image 16 in the first region 32. For example, by clicking the rotation button 34e once, the IVUS image displayed in the first area 32 as the surgical support image 16 can be rotated 90 degrees clockwise.
 画像処理装置20は、ユーザ操作が第2領域33で行われているときは、ユーザ操作が第2領域33で行われていることを示す報知情報42を閲覧者画面41に表示する。図3の例では、手術支援画像16として、IVUS画像が第1領域32に表示されているときに、第2領域33で回転ボタン34eがクリックされている。そのため、図3の例に対応する図4の例では、報知情報42として、第2領域33で回転ボタン34eがクリックされていることを示すアイコンが閲覧者画面41に表示されている。図9の例では、手術支援画像16として、IVUS画像が第1領域32に表示されているときに、第2領域33に含まれるフローティング領域37でキャリブレーション操作が行われている。そのため、図9の例に対応する図10の例では、報知情報42として、第2領域33でキャリブレーション操作が行われていることを示すラベルが閲覧者画面41において強調表示されている。 When the user operation is performed in the second area 33, the image processing device 20 displays notification information 42 indicating that the user operation is performed in the second area 33 on the viewer screen 41. In the example of FIG. 3, when the IVUS image is displayed in the first area 32 as the surgical support image 16, the rotation button 34e is clicked in the second area 33. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, an icon indicating that the rotation button 34e has been clicked in the second area 33 is displayed on the viewer screen 41 as the notification information 42. In the example of FIG. 9, when the IVUS image is displayed in the first area 32 as the surgical support image 16, a calibration operation is performed in the floating area 37 included in the second area 33. Therefore, in the example of FIG. 10 corresponding to the example of FIG. 9, a label indicating that a calibration operation is being performed in the second area 33 is highlighted on the viewer screen 41 as the notification information 42.
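 上記の報知情報42の表示の振り分けは、例えば次のように整理できる。The mapping from an operation in the second area 33 to the notification information 42 described above can be sketched as follows. This is an illustrative sketch only; the function name, the event format, and the element names ("rotate", "calibration") are hypothetical labels for the rotation button 34e and the calibration operation, not identifiers from the disclosure.

```python
# Illustrative sketch of the notification information 42; names are hypothetical.

def notification_for(event):
    """Map a user operation to what the viewer screen should announce.

    event: {"area": 1 or 2, "element": name of the operated element or None}.
    Returns None for operations in the first area (those are mirrored
    directly); otherwise returns a description of the notification.
    """
    if event["area"] == 1:
        return None
    element = event.get("element")
    if element == "rotate":
        # e.g. an icon showing that the rotation button is clicked (FIG. 4)
        return {"kind": "icon", "label": "rotate"}
    if element == "calibration":
        # e.g. a highlighted label for the calibration operation (FIG. 10)
        return {"kind": "highlighted_label", "label": "calibration"}
    return {"kind": "label", "label": "operation in second area"}
```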
 画像処理装置20は、設定メニュー35を操作者画面31に表示する。本実施形態では、画像処理装置20は、設定メニュー35を第2領域33のみに表示し、閲覧者画面41には表示しない。本実施形態の一変形例として、画像処理装置20は、設定メニュー35を第1領域32だけでなく、閲覧者画面41にも表示してよい。設定メニュー35は、2つ以上の設定項目36を含む。画像処理装置20は、入力機器50を用いて設定メニュー35から2つ以上の設定項目36のいずれか1つを選択する操作を受け付けると、選択された設定項目に応じた1つ以上の操作要素を第2領域33に表示する。画像処理装置20は、ユーザ操作として、当該1つ以上の操作要素の操作を受け付けると、報知情報42として、選択された設定項目を示す情報を閲覧者画面41に表示する。図5の例では、設定メニュー35が閉じている。図7の例では、設定メニュー35が開かれ、2つ以上の設定項目36が表示されているときに、キャリブレーション操作に対応する設定項目がクリックされている。そのため、図9の例では、クリックされた設定項目に応じた操作要素として、キャリブレーション操作を行うためのウィンドウが第2領域33に含まれるフローティング領域37に表示されている。図9の例では、フローティング領域37に表示されているウィンドウの操作が行われている。そのため、図9の例に対応する図10の例では、報知情報42として、キャリブレーション操作が選択されたことを示すラベルが閲覧者画面41において強調表示されている。 The image processing device 20 displays a setting menu 35 on the operator screen 31. In this embodiment, the image processing device 20 displays the setting menu 35 only in the second area 33 and does not display it on the viewer screen 41. As a modification of this embodiment, the image processing device 20 may display the setting menu 35 not only in the first area 32 but also on the viewer screen 41. The setting menu 35 includes two or more setting items 36. When the image processing device 20 receives, using the input device 50, an operation to select any one of the two or more setting items 36 from the setting menu 35, the image processing device 20 displays one or more operation elements corresponding to the selected setting item in the second area 33. When the image processing device 20 receives an operation of the one or more operation elements as a user operation, the image processing device 20 displays information indicating the selected setting item on the viewer screen 41 as the notification information 42. In the example of FIG. 5, the setting menu 35 is closed. In the example of FIG. 7, the setting item corresponding to the calibration operation is clicked while the setting menu 35 is open and the two or more setting items 36 are displayed. Therefore, in the example of FIG. 9, a window for performing the calibration operation is displayed in the floating area 37 included in the second area 33 as the operation element corresponding to the clicked setting item. In the example of FIG. 9, the window displayed in the floating area 37 is being operated. Therefore, in the example of FIG. 10 corresponding to the example of FIG. 9, a label indicating that the calibration operation has been selected is highlighted on the viewer screen 41 as the notification information 42.
 画像処理装置20は、2つ以上の操作要素を第2領域33に表示する。画像処理装置20は、ユーザ操作として、当該2つ以上の操作要素のいずれか1つの操作を受け付けると、報知情報42として、操作されている操作要素を示す情報を閲覧者画面41に表示する。図3の例では、拡大ボタン34c、縮小ボタン34d、及び回転ボタン34eが第2領域33に表示されているときに、回転ボタン34eがクリックされている。そのため、図3の例に対応する図4の例では、報知情報42として、回転ボタン34eがクリックされていることを示すアイコンが閲覧者画面41に表示されている。 The image processing device 20 displays two or more operation elements in the second area 33. When the image processing device 20 receives an operation of any one of the two or more operation elements as a user operation, the image processing device 20 displays information indicating the operated operation element on the viewer screen 41 as notification information 42. In the example of FIG. 3, when the enlarge button 34c, the reduce button 34d, and the rotate button 34e are displayed in the second area 33, the rotate button 34e is clicked. Therefore, in the example of FIG. 4 corresponding to the example of FIG. 3, an icon indicating that the rotation button 34e has been clicked is displayed on the viewer screen 41 as the notification information 42.
 フローティング領域37は、入力機器50を用いて移動可能な、第1領域32に重なることが許容される領域である。画像処理装置20は、操作者画面31においてフローティング領域37が手術支援画像16の少なくとも一部に重なっているときは、操作者画面31において手術支援画像16の当該少なくとも一部を非表示にするとともに、閲覧者画面41において手術支援画像16の当該少なくとも一部を表示し続ける。図9の例では、操作者画面31においてフローティング領域37が手術支援画像16の少なくとも一部に重なっている。そのため、図9の例では、操作者画面31において手術支援画像16の当該少なくとも一部が非表示になっているが、図9の例に対応する図10の例では、閲覧者画面41において手術支援画像16の当該少なくとも一部が表示されている。 The floating area 37 is an area that can be moved using the input device 50 and is allowed to overlap the first area 32. When the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the image processing device 20 hides that at least a portion of the surgical support image 16 on the operator screen 31 while continuing to display it on the viewer screen 41. In the example of FIG. 9, the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31. Therefore, in the example of FIG. 9, that at least a portion of the surgical support image 16 is hidden on the operator screen 31, whereas in the example of FIG. 10 corresponding to the example of FIG. 9, that at least a portion of the surgical support image 16 is displayed on the viewer screen 41.
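 フローティング領域37と手術支援画像16との重なりの判定は、例えば次のように整理できる。The overlap handling described above (the overlapped part of the surgical support image 16 is hidden on the operator screen 31 but remains displayed on the viewer screen 41) can be sketched as follows. This is an illustrative sketch only; the function names and the rectangle representation are hypothetical assumptions, not part of the disclosure.

```python
# Illustrative sketch of the floating-area behavior; names are hypothetical.

def rects_overlap(a, b):
    """True if two (x, y, width, height) rectangles intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def image_visibility(image_rect, floating_rect):
    """Whether each screen shows the surgical support image in full."""
    covered = rects_overlap(image_rect, floating_rect)
    return {
        # the overlapped part is hidden on the operator screen
        "operator_full_image": not covered,
        # the viewer screen keeps showing the image regardless
        "viewer_full_image": True,
    }
```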
 図11を参照して、本実施形態に係る画像処理装置20の構成を説明する。 The configuration of the image processing device 20 according to this embodiment will be described with reference to FIG. 11.
 画像処理装置20は、制御部21と、記憶部22と、通信部23と、入力部24と、出力部25とを備える。 The image processing device 20 includes a control section 21, a storage section 22, a communication section 23, an input section 24, and an output section 25.
 制御部21は、少なくとも1つのプロセッサ、少なくとも1つのプログラマブル回路、少なくとも1つの専用回路、又はこれらの任意の組合せを含む。プロセッサは、CPU若しくはGPUなどの汎用プロセッサ、又は特定の処理に特化した専用プロセッサである。「CPU」は、central processing unitの略語である。「GPU」は、graphics processing unitの略語である。プログラマブル回路は、例えば、FPGAである。「FPGA」は、field-programmable gate arrayの略語である。専用回路は、例えば、ASICである。「ASIC」は、application specific integrated circuitの略語である。制御部21は、画像処理装置20を含む画像処理システム10の各部を制御しながら、画像処理装置20の動作に関わる処理を実行する。 The control unit 21 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof. The processor is a general-purpose processor such as a CPU or GPU, or a dedicated processor specialized for specific processing. "CPU" is an abbreviation for central processing unit. “GPU” is an abbreviation for graphics processing unit. The programmable circuit is, for example, an FPGA. "FPGA" is an abbreviation for field-programmable gate array. The dedicated circuit is, for example, an ASIC. “ASIC” is an abbreviation for application specific integrated circuit. The control unit 21 executes processing related to the operation of the image processing device 20 while controlling each part of the image processing system 10 including the image processing device 20.
 記憶部22は、少なくとも1つの半導体メモリ、少なくとも1つの磁気メモリ、少なくとも1つの光メモリ、又はこれらの任意の組合せを含む。半導体メモリは、例えば、RAM、ROM、又はフラッシュメモリである。「RAM」は、random access memoryの略語である。「ROM」は、read only memoryの略語である。RAMは、例えば、SRAM又はDRAMである。「SRAM」は、static random access memoryの略語である。「DRAM」は、dynamic random access memoryの略語である。ROMは、例えば、EEPROMである。「EEPROM」は、electrically erasable programmable read only memoryの略語である。フラッシュメモリは、例えば、SSDである。「SSD」は、solid-state driveの略語である。磁気メモリは、例えば、HDDである。「HDD」は、hard disk driveの略語である。記憶部22は、例えば、主記憶装置、補助記憶装置、又はキャッシュメモリとして機能する。記憶部22には、画像処理装置20の動作に用いられるデータと、画像処理装置20の動作によって得られたデータとが記憶される。 The storage unit 22 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof. The semiconductor memory is, for example, RAM, ROM, or flash memory. "RAM" is an abbreviation for random access memory. "ROM" is an abbreviation for read only memory. The RAM is, for example, SRAM or DRAM. “SRAM” is an abbreviation for static random access memory. “DRAM” is an abbreviation for dynamic random access memory. The ROM is, for example, an EEPROM. "EEPROM" is an abbreviation for electrically erasable programmable read only memory. The flash memory is, for example, an SSD. "SSD" is an abbreviation for solid-state drive. The magnetic memory is, for example, an HDD. "HDD" is an abbreviation for hard disk drive. The storage unit 22 functions as, for example, a main storage device, an auxiliary storage device, or a cache memory. The storage unit 22 stores data used for the operation of the image processing device 20 and data obtained by the operation of the image processing device 20.
 通信部23は、少なくとも1つの通信用インタフェースを含む。通信用インタフェースは、例えば、IEEE802.11などの無線LAN通信規格に対応したインタフェース、又はEthernet(登録商標)などの有線LAN通信規格に対応したインタフェースである。「IEEE」は、Institute of Electrical and Electronics Engineersの略称である。通信部23は、画像処理装置20の動作に用いられるデータを受信し、また画像処理装置20の動作によって得られるデータを送信する。 The communication unit 23 includes at least one communication interface. The communication interface is, for example, an interface compatible with a wireless LAN communication standard such as IEEE802.11, or an interface compatible with a wired LAN communication standard such as Ethernet (registered trademark). "IEEE" is an abbreviation for Institute of Electrical and Electronics Engineers. The communication unit 23 receives data used for the operation of the image processing device 20 and transmits data obtained by the operation of the image processing device 20.
 入力部24は、少なくとも1つの入力用インタフェースを含む。入力用インタフェースは、例えば、USBインタフェース、HDMI(登録商標)インタフェース、又はBluetooth(登録商標)などの近距離無線通信規格に対応したインタフェースである。「USB」は、Universal Serial Busの略語である。「HDMI(登録商標)」は、High-Definition Multimedia Interfaceの略語である。入力部24は、画像処理装置20の動作に用いられるデータを入力する操作を受け付ける。入力部24は、入力機器50に接続される。 The input unit 24 includes at least one input interface. The input interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark). "USB" is an abbreviation for Universal Serial Bus. "HDMI (registered trademark)" is an abbreviation for High-Definition Multimedia Interface. The input unit 24 receives an operation for inputting data used for the operation of the image processing device 20. Input section 24 is connected to input device 50 .
 出力部25は、少なくとも1つの出力用インタフェースを含む。出力用インタフェースは、例えば、USBインタフェース、HDMI(登録商標)インタフェース、又はBluetooth(登録商標)などの近距離無線通信規格に対応したインタフェースである。出力部25は、画像処理装置20の動作によって得られるデータを出力する。出力部25は、第1ディスプレイ30及び第2ディスプレイ40に接続される。 The output unit 25 includes at least one output interface. The output interface is, for example, a USB interface, an HDMI (registered trademark) interface, or an interface compatible with a short-range wireless communication standard such as Bluetooth (registered trademark). The output unit 25 outputs data obtained by the operation of the image processing device 20. The output unit 25 is connected to the first display 30 and the second display 40.
 画像処理装置20の機能は、本実施形態に係る画像処理プログラムを、制御部21としてのプロセッサで実行することにより実現される。すなわち、画像処理装置20の機能は、ソフトウェアにより実現される。画像処理プログラムは、画像処理装置20の動作をコンピュータに実行させることで、コンピュータを画像処理装置20として機能させる。すなわち、コンピュータは、画像処理プログラムに従って画像処理装置20の動作を実行することにより画像処理装置20として機能する。 The functions of the image processing device 20 are realized by executing the image processing program according to the present embodiment by a processor serving as the control unit 21. That is, the functions of the image processing device 20 are realized by software. The image processing program causes the computer to function as the image processing device 20 by causing the computer to execute the operations of the image processing device 20 . That is, the computer functions as the image processing device 20 by executing the operations of the image processing device 20 according to the image processing program.
 プログラムは、非一時的なコンピュータ読取り可能な媒体に記憶しておくことができる。非一時的なコンピュータ読取り可能な媒体は、例えば、フラッシュメモリ、磁気記録装置、光ディスク、光磁気記録媒体、又はROMである。プログラムの流通は、例えば、プログラムを記憶したSDカード、DVD、又はCD-ROMなどの可搬型媒体を販売、譲渡、又は貸与することによって行う。「SD」は、Secure Digitalの略語である。「DVD」は、digital versatile discの略語である。「CD-ROM」は、compact disc read only memoryの略語である。プログラムをサーバのストレージに格納しておき、サーバから他のコンピュータにプログラムを転送することにより、プログラムを流通させてもよい。プログラムをプログラムプロダクトとして提供してもよい。 The program may be stored on a non-transitory computer-readable medium. The non-transitory computer-readable medium is, for example, a flash memory, a magnetic recording device, an optical disk, a magneto-optical recording medium, or a ROM. Distribution of the program is performed, for example, by selling, transferring, or lending a portable medium such as an SD card, DVD, or CD-ROM that stores the program. "SD" is an abbreviation for Secure Digital. "DVD" is an abbreviation for digital versatile disc. "CD-ROM" is an abbreviation for compact disc read only memory. The program may be distributed by storing the program in the storage of a server and transferring the program from the server to another computer. The program may be provided as a program product.
 コンピュータは、例えば、可搬型媒体に記憶されたプログラム又はサーバから転送されたプログラムを、一旦、主記憶装置に格納する。そして、コンピュータは、主記憶装置に格納されたプログラムをプロセッサで読み取り、読み取ったプログラムに従った処理をプロセッサで実行する。コンピュータは、可搬型媒体から直接プログラムを読み取り、プログラムに従った処理を実行してもよい。コンピュータは、コンピュータにサーバからプログラムが転送される度に、逐次、受け取ったプログラムに従った処理を実行してもよい。サーバからコンピュータへのプログラムの転送は行わず、実行指示及び結果取得のみによって機能を実現する、いわゆるASP型のサービスによって処理を実行してもよい。「ASP」は、application service providerの略語である。プログラムには、電子計算機による処理の用に供する情報であってプログラムに準ずるものが含まれる。例えば、コンピュータに対する直接の指令ではないがコンピュータの処理を規定する性質を有するデータは、「プログラムに準ずるもの」に該当する。 For example, a computer temporarily stores a program stored on a portable medium or a program transferred from a server in its main storage device. Then, the computer uses a processor to read a program stored in the main memory, and causes the processor to execute processing according to the read program. A computer may read a program directly from a portable medium and execute processing according to the program. The computer may sequentially execute processing according to the received program each time the program is transferred to the computer from the server. Processing may be performed using a so-called ASP type service that implements functions only by issuing execution instructions and obtaining results without transferring programs from the server to the computer. “ASP” is an abbreviation for application service provider. The program includes information similar to a program that is used for processing by an electronic computer. For example, data that is not a direct command to a computer but has the property of regulating computer processing falls under "something similar to a program."
 画像処理装置20の一部又は全ての機能が、制御部21としてのプログラマブル回路又は専用回路により実現されてもよい。すなわち、画像処理装置20の一部又は全ての機能が、ハードウェアにより実現されてもよい。 A part or all of the functions of the image processing device 20 may be realized by a programmable circuit or a dedicated circuit as the control unit 21. That is, some or all of the functions of the image processing device 20 may be realized by hardware.
 図12を参照して、本実施形態に係る画像処理装置20の動作を説明する。この動作は、本実施形態に係る画像表示方法に相当する。 The operation of the image processing device 20 according to this embodiment will be described with reference to FIG. 12. This operation corresponds to the image display method according to this embodiment.
 ステップS1において、制御部21は、入力機器50を用いて操作者画面31で行われるユーザ操作を受け付ける。操作者画面31は、第1ディスプレイ30に表示されている。操作者画面31の第1領域32には、生体組織画像17を含む手術支援画像16が表示されている。操作者画面31の第2領域33には、第1領域32における手術支援画像16の表示を制御するための少なくとも1つの操作要素が表示されている。 In step S1, the control unit 21 receives a user operation performed on the operator screen 31 using the input device 50. The operator screen 31 is displayed on the first display 30. In the first area 32 of the operator screen 31, a surgical support image 16 including a biological tissue image 17 is displayed. In the second area 33 of the operator screen 31, at least one operating element for controlling the display of the surgical support image 16 in the first area 32 is displayed.
 具体的には、制御部21は、入力部24を介して、ユーザ操作を受け付ける。図3の例では、制御部21は、ユーザ操作として、マウスを用いて第1ポインタ51を第2領域33に表示されている2つ以上の操作要素のうち、回転ボタン34eまで移動させ、回転ボタン34eをマウスでクリックする操作を受け付ける。図5の例では、制御部21は、ユーザ操作として、マウスを用いて第1ポインタ51を第1領域32に表示されている手術支援画像16の上で移動させ、手術支援画像16の任意の位置をマーキングのためにマウスでクリックする操作を受け付ける。図7の例では、制御部21は、出力部25を介して、2つ以上の設定項目36を含む設定メニュー35を操作者画面31に表示する。制御部21は、入力機器50を用いて設定メニュー35から2つ以上の設定項目36のいずれか1つを選択する操作として、マウスを用いて第1ポインタ51を第2領域33に表示されている設定メニュー35まで移動させて設定メニュー35を開き、2つ以上の設定項目36のうち、キャリブレーション操作に対応する設定項目をマウスでクリックする操作を受け付ける。制御部21は、選択された設定項目に応じた1つ以上の操作要素として、キャリブレーション操作を行うためのGUI部品群を含むウィンドウを第2領域33のフローティング領域37に表示する。「GUI」は、graphical user interfaceの略語である。図9の例では、制御部21は、ユーザ操作として、マウスを用いて第1ポインタ51を第2領域33のフローティング領域37まで移動させ、フローティング領域37に表示されているウィンドウ内のGUI部品群をマウスで操作することで行われるキャリブレーション操作を受け付ける。制御部21は、操作者画面31においてフローティング領域37が手術支援画像16の少なくとも一部に重なっているため、操作者画面31において手術支援画像16の当該少なくとも一部を非表示にする。 Specifically, the control unit 21 receives user operations via the input unit 24. In the example of FIG. 3, the control unit 21 receives, as a user operation, an operation of using the mouse to move the first pointer 51 to the rotation button 34e among the two or more operation elements displayed in the second area 33 and clicking the rotation button 34e with the mouse. In the example of FIG. 5, the control unit 21 receives, as a user operation, an operation of using the mouse to move the first pointer 51 over the surgical support image 16 displayed in the first area 32 and clicking an arbitrary position on the surgical support image 16 with the mouse for marking. In the example of FIG. 7, the control unit 21 displays, via the output unit 25, a setting menu 35 including two or more setting items 36 on the operator screen 31. As an operation of selecting any one of the two or more setting items 36 from the setting menu 35 using the input device 50, the control unit 21 receives an operation of using the mouse to move the first pointer 51 to the setting menu 35 displayed in the second area 33, opening the setting menu 35, and clicking, with the mouse, the setting item corresponding to the calibration operation among the two or more setting items 36. The control unit 21 displays, as one or more operation elements corresponding to the selected setting item, a window including a group of GUI components for performing the calibration operation in the floating area 37 of the second area 33. "GUI" is an abbreviation for graphical user interface. In the example of FIG. 9, the control unit 21 receives, as a user operation, a calibration operation performed by using the mouse to move the first pointer 51 to the floating area 37 of the second area 33 and operating, with the mouse, the GUI components in the window displayed in the floating area 37. Since the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the control unit 21 hides that at least a portion of the surgical support image 16 on the operator screen 31.
 When the user operation is performed in the first area 32, the process of step S2 is executed. When the user operation is performed in the second area 33, the processes of steps S3 and S4 are executed.
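The branching described above can be sketched as follows. This is a minimal illustration only: the coordinates, the rectangle names, and the hit-test logic are assumptions for the sketch, not taken from the publication.

```python
# Hypothetical sketch of dispatching a user operation by screen area:
# an operation in the first area (surgical support image) triggers step S2,
# while an operation in the second area (operation elements) triggers
# steps S3 and S4. All coordinates and names are illustrative.
from dataclasses import dataclass


@dataclass
class Rect:
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        # Half-open interval hit test: left/top inclusive, right/bottom exclusive.
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)


FIRST_AREA = Rect(0, 0, 1200, 900)     # area 32: surgical support image (assumed size)
SECOND_AREA = Rect(1200, 0, 400, 900)  # area 33: operation elements (assumed size)


def dispatch(px: int, py: int) -> list:
    """Return the step(s) to execute for a user operation at (px, py)."""
    if FIRST_AREA.contains(px, py):
        return ["S2"]        # mirror the operation on the viewer screen
    if SECOND_AREA.contains(px, py):
        return ["S3", "S4"]  # stop mirroring and show notification information
    return []                # outside both areas: nothing to do
```

A click at (100, 100) falls in the first area and yields `["S2"]`; a click at (1300, 100) falls in the second area and yields `["S3", "S4"]`.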
 In step S2, the control unit 21 reflects the display of the user operation on the viewer screen 41. The viewer screen 41 is displayed on the second display 40. The surgical support image 16 is displayed on the viewer screen 41 in accordance with the display of the surgical support image 16 in the first area 32.
 Specifically, the control unit 21 reflects the display of the user operation on the viewer screen 41 via the output unit 25. In the example of FIG. 6, which corresponds to the example of FIG. 5, the control unit 21 moves the second pointer 52 over the surgical support image 16 displayed on the viewer screen 41 so that it follows the same movement trajectory as the first pointer 51, and stops the second pointer 52 at the position on the viewer screen 41 corresponding to the position clicked in the first area 32. That is, the control unit 21 reflects the display of the user operation on the viewer screen 41 by displaying the second pointer 52 on the viewer screen 41 so that it moves in accordance with the movement of the first pointer 51.
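The coordinate correspondence described above — stopping the second pointer at the viewer-screen position corresponding to the clicked position in the first area — can be illustrated by normalizing the pointer position within the first area and rescaling it to the viewer screen. The rectangles and sizes below are assumptions for illustration, not values from the publication.

```python
# Hypothetical sketch: map a pointer position from the first area 32 on the
# operator screen to the corresponding position on the viewer screen 41,
# which may have a different size. Rectangles are (x, y, width, height).
def map_pointer(px, py, src, dst):
    """Normalize (px, py) within src to [0, 1] and rescale into dst."""
    sx, sy, sw, sh = src
    dx, dy, dw, dh = dst
    u = (px - sx) / sw
    v = (py - sy) / sh
    return (dx + u * dw, dy + v * dh)


first_area = (0, 0, 1200, 900)   # first area 32 (assumed size)
viewer = (0, 0, 1920, 1080)      # viewer screen 41 (assumed size)

print(map_pointer(600, 450, first_area, viewer))  # → (960.0, 540.0): center maps to center
```

The same mapping can be applied on every mouse-move event so that the second pointer traces the first pointer's trajectory.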
 In step S3, the control unit 21 stops reflecting the display of the user operation on the viewer screen 41.
 Specifically, the control unit 21 stops reflecting the display of the user operation on the viewer screen 41 via the output unit 25. In the example of FIG. 4, which corresponds to the example of FIG. 3, the control unit 21 stops reflecting the display of the user operation on the viewer screen 41 by hiding the second pointer 52 on the viewer screen 41. In the example of FIG. 8, which corresponds to the example of FIG. 7, the control unit 21 stops reflecting the display of the user operation on the viewer screen 41 by displaying the second pointer 52 at a fixed position on the viewer screen 41.
 In step S4, the control unit 21 displays the notification information 42 on the viewer screen 41.
 Specifically, the control unit 21 displays the notification information 42 on the viewer screen 41 via the output unit 25. In the example of FIG. 4, which corresponds to the example of FIG. 3, the control unit 21 displays, as the notification information 42, an icon indicating that the rotation button 34e is being clicked on the viewer screen 41. That is, the control unit 21 displays, as the notification information 42, information indicating the operation element being operated on the viewer screen 41. In the example of FIG. 10, which corresponds to the example of FIG. 9, the control unit 21 highlights, as the notification information 42, a label on the viewer screen 41 indicating that the calibration operation has been selected. For example, the control unit 21 displays a group of labels corresponding to the two or more setting items 36 included in the setting menu 35 at the top of the viewer screen 41, and displays the label corresponding to the setting item selected on the operator screen 31 in a color tone different from that of the other labels. That is, the control unit 21 displays, as the notification information 42, information indicating the selected setting item on the viewer screen 41. Although the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, the control unit 21 continues to display that at least a portion of the surgical support image 16 on the viewer screen 41.
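The label-highlighting behavior described above can be sketched as follows. The setting-item names other than the calibration item, and the color choices, are assumptions for illustration; the publication only specifies that the selected item's label is shown in a different color tone.

```python
# Hypothetical sketch of the notification information 42: labels for all
# setting items are shown on the viewer screen, and the item selected on the
# operator screen is highlighted with a different color tone.
SETTING_ITEMS = ["Calibration", "Brightness", "Measurement"]  # names assumed


def render_labels(selected):
    """Return (label, color) pairs; the selected setting item is highlighted."""
    return [(item, "yellow" if item == selected else "gray")
            for item in SETTING_ITEMS]
```

Selecting the calibration item yields `[("Calibration", "yellow"), ("Brightness", "gray"), ("Measurement", "gray")]`, so viewers can see which operation is in progress without the operation elements themselves being mirrored.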
 As a method of drawing the surgical support image 16 on the operator screen 31 and the viewer screen 41, it is conceivable, for example, to store information obtained by sensing in the storage unit 22 as 2D memory information, set the positions of the light source and the camera based on this 2D memory information, perform rendering for the operator screen 31, draw the two-dimensional image generated as the surgical support image 16 in the first area 32 of the operator screen 31, and also draw it on the viewer screen 41 after adjusting the scale. In the present embodiment, however, the aspect ratios of the first area 32 of the operator screen 31 and the viewer screen 41 differ. It is therefore desirable to render the operator screen 31 and the viewer screen 41 separately. For example, as a more desirable method, the positions of the light source and the camera are set based on the 2D memory information, rendering matched to the aspect ratio of the operator screen 31 is performed and the two-dimensional image generated as the surgical support image 16 is drawn in the first area 32 of the operator screen 31, while separate rendering matched to the aspect ratio of the viewer screen 41 is performed and the other two-dimensional image generated as the surgical support image 16 is drawn on the viewer screen 41. That is, the control unit 21 of the image processing device 20 may separately render the image displayed in the first area 32 and the image displayed on the viewer screen 41 as the surgical support image 16, based on the same information obtained by sensing. Since each rendering is repeated at regular intervals, when the operator 15 changes the position of the light source or the camera, the change is automatically reflected on both the operator screen 31 and the viewer screen 41.
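The separate, aspect-ratio-matched rendering described above can be sketched as follows. The renderer, the resolutions, and the parameter names are illustrative assumptions, not the publication's implementation; the point is that both frames are produced from the same sensed data and the same light/camera settings, but each at its own target size.

```python
# Hypothetical sketch: render the surgical support image twice per refresh
# cycle, once for the operator screen's first area and once for the viewer
# screen, from the same sensed data and light/camera settings.
def render(sensed_data, light, camera, width, height):
    """Stand-in renderer: returns a description of the frame it would draw."""
    return {"size": (width, height), "light": light, "camera": camera,
            "source": id(sensed_data)}


def refresh(sensed_data, light, camera):
    # One render per target, each matched to its own aspect ratio. Repeating
    # this at a fixed interval propagates light/camera changes to both screens.
    operator_frame = render(sensed_data, light, camera, 1200, 900)   # area 32 (assumed size)
    viewer_frame = render(sensed_data, light, camera, 1920, 1080)    # screen 41 (assumed size)
    return operator_frame, viewer_frame
```

The same structure applies to the volume-rendering case for three-dimensional images: one volume render per target per cycle, driven by shared 3D-space information.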
 The case of displaying a three-dimensional image as the surgical support image 16 is similar to the case of displaying a two-dimensional image. That is, as a method of drawing the surgical support image 16 on the operator screen 31 and the viewer screen 41, it is conceivable, for example, to store 3D space information in the storage unit 22, set the positions of the light source and the camera based on this 3D space information, perform volume rendering for the operator screen 31, draw the three-dimensional image generated as the surgical support image 16 in the first area 32 of the operator screen 31, and also draw it on the viewer screen 41 after adjusting the scale. In the present embodiment, however, the aspect ratios of the first area 32 of the operator screen 31 and the viewer screen 41 differ. It is therefore preferable to perform volume rendering twice rather than only once. For example, as a more desirable method, the positions of the light source and the camera are set based on the 3D space information, volume rendering matched to the aspect ratio of the operator screen 31 is performed and the three-dimensional image generated as the surgical support image 16 is drawn in the first area 32 of the operator screen 31, while separate volume rendering matched to the aspect ratio of the viewer screen 41 is performed and the other three-dimensional image generated as the surgical support image 16 is drawn on the viewer screen 41. Since each volume rendering is repeated at regular intervals, when the operator 15 changes the position of the light source or the camera, the change is automatically reflected on both the operator screen 31 and the viewer screen 41.
 As described above, in the present embodiment, information unnecessary for the surgeon 14, such as the operation elements for controlling the display of the surgical support image 16, is displayed only on the operator screen 31 and can be omitted from the viewer screen 41. This improves the visibility of the viewer screen 41.
 In the present embodiment, instead of displaying a copy of the operator screen 31 on the viewer screen 41, the clinically necessary information displayed on the operator screen 31 can be displayed on the viewer screen 41. Therefore, even when the floating area 37 overlaps at least a portion of the surgical support image 16 on the operator screen 31, that portion is hidden only on the operator screen 31, and the entire surgical support image 16 can continue to be displayed on the viewer screen 41.
 In the present embodiment, when the mouse cursor position on the operator screen 31 is within the first area 32, which displays clinically necessary information, that position can be identified and a mouse cursor can be displayed at the corresponding position on the viewer screen 41. The operation status of the operator 15 can thus be conveyed to the surgeon 14.
 In the present embodiment, communication between the surgeon 14 in the radiation room 11 and the operator 15 in the separate room 12 can be facilitated via the notification information 42 while keeping the number of personnel exposed to radiation in the radiation room 11 to the necessary minimum.
 As a modification of the present embodiment, the control unit 21 of the image processing device 20 may hide the second pointer 52 on the viewer screen 41 when it determines that the position of the first pointer 51 has not changed for a certain period of time, even while the first pointer 51 is in the first area 32. For example, when the first pointer 51 does not move in the first area 32 for three seconds, the control unit 21 may hide the second pointer 52 on the viewer screen 41. According to this modification, the second pointer 52 can be prevented from interfering with the surgeon 14 checking the structure of the biological tissue. As a further modification, when the control unit 21 determines that the position of the first pointer 51 has not changed for a certain period of time, it may hide not only the second pointer 52 on the viewer screen 41 but also the first pointer 51 on the operator screen 31.
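The inactivity behavior in this modification can be sketched as follows. The three-second timeout matches the example above; the class structure and names are illustrative assumptions.

```python
# Hypothetical sketch of the modification: if the first pointer has not moved
# for a fixed time, the second pointer on the viewer screen is hidden; any
# movement makes it visible again.
IDLE_TIMEOUT = 3.0  # seconds, per the example in the text


class PointerMirror:
    def __init__(self):
        self.last_pos = None
        self.last_move_time = 0.0
        self.second_pointer_visible = False

    def update(self, pos, now):
        """Record the first pointer's position at time `now` (seconds) and
        return whether the second pointer should be visible."""
        if pos != self.last_pos:
            self.last_pos = pos
            self.last_move_time = now
        self.second_pointer_visible = (now - self.last_move_time) < IDLE_TIMEOUT
        return self.second_pointer_visible
```

In a real event loop, `update` would be called on every mouse event and on a periodic timer, with `now` taken from a monotonic clock.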
 The present disclosure is not limited to the embodiments described above. For example, two or more blocks described in the block diagrams may be combined, or one block may be divided. Instead of executing two or more steps described in the flowcharts in chronological order as described, they may be executed in parallel or in a different order, depending on the processing capability of the device executing each step, or as needed. Other modifications are possible without departing from the spirit of the present disclosure.
 10 image processing system
 11 radiation room
 12 separate room
 13 window
 14 surgeon
 15 operator
 16 surgical support image
 17 biological tissue image
 20 image processing device
 21 control unit
 22 storage unit
 23 communication unit
 24 input unit
 25 output unit
 30 first display
 31 operator screen
 32 first area
 33 second area
 34a 2D button
 34b 2D/3D button
 34c enlarge button
 34d reduce button
 34e rotate button
 35 setting menu
 36 setting item
 37 floating area
 40 second display
 41 viewer screen
 42 notification information
 50 input device
 51 first pointer
 52 second pointer

Claims (15)

  1.  An image processing device that causes a first display to display an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, a structure of a biological tissue in a body of a surgical subject, and a second area that displays at least one operation element for controlling display of the surgical support image in the first area, and that causes a second display different from the first display to display a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area, the image processing device comprising:
     a control unit configured to accept a user operation performed on the operator screen using an input device, reflect a display of the user operation on the viewer screen when the user operation is performed in the first area, and not reflect the display of the user operation on the viewer screen when the user operation is performed in the second area.
  2.  The image processing device according to claim 1, wherein, when the user operation is performed in the second area, the control unit displays, on the viewer screen, notification information indicating that the user operation is performed in the second area.
  3.  The image processing device according to claim 2, wherein the control unit displays a setting menu including two or more setting items on the operator screen, displays, upon accepting an operation of selecting any one of the two or more setting items from the setting menu using the input device, one or more operation elements corresponding to the selected setting item in the second area, and displays, upon accepting an operation of the one or more operation elements as the user operation, information indicating the selected setting item on the viewer screen as the notification information.
  4.  The image processing device according to claim 2, wherein the control unit displays two or more operation elements in the second area as the at least one operation element, and displays, upon accepting an operation of any one of the two or more operation elements as the user operation, information indicating the operation element being operated on the viewer screen as the notification information.
  5.  The image processing device according to any one of claims 1 to 4, wherein the input device is a pointing device, and the control unit displays a first pointer movable by the user operation on the operator screen, reflects the display of the user operation on the viewer screen by displaying, when the first pointer is in the first area, a second pointer on the viewer screen that moves in accordance with movement of the first pointer, and stops reflecting the display of the user operation on the viewer screen by hiding the second pointer on the viewer screen when the first pointer is in the second area.
  6.  The image processing device according to claim 5, wherein the control unit hides the second pointer on the viewer screen upon determining that the position of the first pointer has not changed for a certain period of time, even when the first pointer is in the first area.
  7.  The image processing device according to any one of claims 1 to 4, wherein the input device is a pointing device, and the control unit displays a first pointer movable by the user operation on the operator screen, reflects the display of the user operation on the viewer screen by displaying, when the first pointer is in the first area, a second pointer on the viewer screen that moves in accordance with movement of the first pointer, and stops reflecting the display of the user operation on the viewer screen by displaying the second pointer at a fixed position on the viewer screen when the first pointer is in the second area.
  8.  The image processing device according to claim 7, wherein the control unit hides the second pointer on the viewer screen upon determining that the position of the first pointer has not changed for a certain period of time, even when the first pointer is in the first area.
  9.  The image processing device according to any one of claims 1 to 8, wherein the second area includes a floating area that is movable using the input device and is allowed to overlap the first area, and the control unit, when the floating area overlaps at least a portion of the surgical support image on the operator screen, hides the at least a portion of the surgical support image on the operator screen while continuing to display the at least a portion of the surgical support image on the viewer screen.
  10.  The image processing device according to any one of claims 1 to 9, wherein the control unit separately renders the image displayed in the first area and the image displayed on the viewer screen as the surgical support image, based on the same information obtained by the sensing.
  11.  An image processing system comprising:
     the image processing device according to any one of claims 1 to 10;
     the first display; and
     the second display.
  12.  The image processing system according to claim 11, wherein the second display is installed in a radiation room where surgery is performed, and the first display is installed in a separate room where operations using the input device are performed.
  13.  An image display method for displaying, on a first display, an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, a structure of a biological tissue in a body of a surgical subject, and a second area that displays at least one operation element for controlling display of the surgical support image in the first area, and displaying, on a second display different from the first display, a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area, the method comprising:
     accepting a user operation performed on the operator screen using an input device; and
     reflecting a display of the user operation on the viewer screen when the user operation is performed in the first area, and not reflecting the display of the user operation on the viewer screen when the user operation is performed in the second area.
  14.  The image display method according to claim 13, wherein, when the user operation is performed in the second area, notification information indicating that the user operation is performed in the second area is displayed on the viewer screen.
  15.  An image processing program that causes a computer, which causes a first display to display an operator screen including a first area that displays a surgical support image including a biological tissue image obtained by imaging, based on information obtained by sensing, a structure of a biological tissue in a body of a surgical subject, and a second area that displays at least one operation element for controlling display of the surgical support image in the first area, and causes a second display different from the first display to display a viewer screen that displays the surgical support image in accordance with the display of the surgical support image in the first area, to execute:
     a process of accepting a user operation performed on the operator screen using an input device; and
     a process of reflecting a display of the user operation on the viewer screen when the user operation is performed in the first area, and not reflecting the display of the user operation on the viewer screen when the user operation is performed in the second area.
PCT/JP2023/009725 2022-03-30 2023-03-13 Image processing device, image processing system, image display method, and image processing program WO2023189510A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-057406 2022-03-30
JP2022057406 2022-03-30

Publications (1)

Publication Number Publication Date
WO2023189510A1 true WO2023189510A1 (en) 2023-10-05

Family

ID=88200941

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/009725 WO2023189510A1 (en) 2022-03-30 2023-03-13 Image processing device, image processing system, image display method, and image processing program

Country Status (1)

Country Link
WO (1) WO2023189510A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000270318A (en) * 1999-03-15 2000-09-29 Olympus Optical Co Ltd Method for displaying surgical operation information
JP2007275506A (en) * 2006-04-12 2007-10-25 Hitachi Medical Corp Image diagnostic device
US20150005630A1 (en) * 2013-07-01 2015-01-01 Samsung Electronics Co., Ltd. Method of sharing information in ultrasound imaging
JP2017047054A (en) * 2015-09-04 2017-03-09 東芝メディカルシステムズ株式会社 Image processing apparatus and X-ray diagnostic apparatus
JP2017086360A (en) * 2015-11-09 2017-05-25 株式会社日立製作所 Ultrasonic diagnostic system


Similar Documents

Publication Publication Date Title
US10545582B2 (en) Dynamic customizable human-computer interaction behavior
US9019301B2 (en) Medical image display apparatus in which specification of a medical image enables execution of image processing
US9606584B1 (en) Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using hand gestures
US9220482B2 (en) Method for providing ultrasound images and ultrasound apparatus
US10303848B2 (en) Medical reading report preparing apparatus and medical image diagnostic apparatus
JP2009523500A (en) Control panel for medical imaging system
JP2012161605A (en) Portable imaging system with remote accessibility
US10285665B2 (en) Ultrasound diagnosis apparatus and method and computer-readable storage medium
JP6936842B2 (en) Visualization of reconstructed image data
EP2777501A1 (en) Portable display unit for medical image
US10269453B2 (en) Method and apparatus for providing medical information
JP2005130928A (en) Image display device, image display method, and program therefor
US20150160844A1 (en) Method and apparatus for displaying medical images
JP2009000167A (en) Medical image diagnostic apparatus and medical image display device
WO2023189510A1 (en) Image processing device, image processing system, image display method, and image processing program
CN108231162B (en) Method and apparatus for displaying medical images
US20090244006A1 (en) Information processing apparatus, image display control method thereof, and image display control program thereof
US20160120506A1 (en) Ultrasound imaging apparatus and method of operating same
JP2018015150A (en) Ultrasonic diagnostic device, medical image processor and medical image processing program
JP5583472B2 (en) Medical image diagnostic apparatus and control program
JP6702751B2 (en) Medical image display device, medical image display system and program
KR102321642B1 (en) Input apparatus and medical image apparatus comprising the same
JP2019188031A (en) Computer program, recording medium, display device, and display method
US20190196664A1 (en) Medical image display system and medical image display apparatus
JP2024058846A (en) Medical image processing device, medical image processing method, and medical image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 23779530
Country of ref document: EP
Kind code of ref document: A1