WO2006070441A1 - Operator supporting system - Google Patents

Operator supporting system

Info

Publication number
WO2006070441A1
WO2006070441A1 PCT/JP2004/019545 JP2004019545W
Authority
WO
WIPO (PCT)
Prior art keywords
image
cursor
terminal device
cross
image display
Prior art date
Application number
PCT/JP2004/019545
Other languages
French (fr)
Japanese (ja)
Inventor
Hideyuki Majima
Original Assignee
Celltec Project Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Celltec Project Management Co., Ltd. filed Critical Celltec Project Management Co., Ltd.
Priority to PCT/JP2004/019545 priority Critical patent/WO2006070441A1/en
Publication of WO2006070441A1 publication Critical patent/WO2006070441A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/05Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves 
    • A61B5/055Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves  involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7475User input or interface means, e.g. keyboard, pointing device, joystick
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • A61B6/465Displaying means of special interest adapted to display user selection data, e.g. graphical user interface, icons or menus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46Arrangements for interfacing with the operator or the patient
    • A61B6/467Arrangements for interfacing with the operator or the patient characterised by special input means

Definitions

  • The present invention relates to a surgeon support system suitable for a doctor or dentist performing a surgical operation, and more particularly to a surgeon support system using an interactive image display device that displays an in-vivo image of a patient.
  • Conventionally, a radiograph of the patient taken before surgery is mounted on the front of an illumination panel in the operating room, and the surgeon refers to it as needed.
  • However, the X-ray image that can be observed with such a system is fixed at the body angle at the time of imaging, so it cannot respond when the surgeon wishes to view the site from another angle during surgery.
  • Patent Document 1 Japanese Patent Laid-Open No. 2002-534204
  • However, the interactive medical support system using the image display device described above employs office input devices such as a keyboard and a mouse as its operation unit, which a surgeon holding treatment tools cannot use or handle as they are.
  • It is therefore an object of the present invention to provide a surgeon support system, based on an interactive image display device, that is easy to use.
  • In order to achieve this object, a surgeon support system according to the present invention includes: an operating room terminal device that functions as a man-machine interface and is installed in the vicinity of the operating table in the operating room; an image storage device storing three-dimensional image data obtained by diagnosing a predetermined part of the patient via a diagnostic device such as a CT device, an MRI device, or a PET device; and an image processing device that reads the three-dimensional image data from the image storage device, performs image processing designated in advance or designated each time by an operation command, generates image display data viewed from a predetermined viewpoint, and displays the corresponding image on the operating room terminal device.
  • the diagnostic equipment includes ultrasonic diagnostic equipment and interventional radiology (IVR) equipment.
  • the operating room terminal device has an image display having a predetermined display screen and an input operation accepting device for receiving an input operation.
  • The input operation acceptor includes: a point indication position detector that can detect, in a non-contact manner, the position of a point indication object such as a fingertip present in a plane region close to and parallel to the display screen; and command generating means for generating a corresponding operation command based on the position of the point indication object detected by the point indication position detector and its movement mode.
  • The image processing apparatus has at least: first image processing means that generates image display data corresponding to a stereoscopic image with cursor, obtained by superimposing a predetermined planar cursor, sequentially updated according to cursor movement operations, on the stereoscopic image corresponding to the three-dimensional image data of the predetermined part of the patient, and displays it on the image display of the operating room terminal device; second image processing means that, in response to a first operation command received from the input operation acceptor of the operating room terminal device, generates image display data corresponding to a state in which the stereoscopic image with cursor is rotated by the direction and amount specified by the operation command and displays it on the operating room terminal device; and third image processing means that generates image display data corresponding to a cross-sectional image designated by a predetermined planar cursor after rotation and displays it on the operating room terminal device.
  • a three-dimensional image of a predetermined part suitable for a patient's surgery can be displayed on the surgeon's terminal screen.
  • Since the input operation is performed in a non-contact manner using a fingertip or the like as the point indication object, the surgeon can operate the screen easily and freely without using an input device such as a mouse or a keyboard.
  • Furthermore, the frequency with which the surgeon touches anything other than the patient and the treatment instruments is greatly reduced, so hygiene is greatly improved.
  • the predetermined plane cursor is an orthogonal three-plane cursor, whereby cross-sectional images corresponding to the three orthogonal planes are simultaneously displayed. Accordingly, by moving the orthogonal three-plane cursor, the three orthogonal cross sections inside the stereoscopic image of the predetermined part of the patient can be easily displayed.
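The orthogonal three-plane cursor selects three slices of the volumetric data at its intersection point. As an illustrative sketch only (Python with NumPy, not part of the original disclosure; the (z, y, x) array layout and the function name are assumptions):

```python
import numpy as np

def orthogonal_sections(volume, cursor):
    """Extract the three orthogonal cross-sections through a cursor point.

    volume : 3-D array indexed as (z, y, x)
    cursor : (x, y, z) integer coordinates where the three cursor planes intersect
    Returns the X-Y, Y-Z and Z-X sections as 2-D arrays.
    """
    x, y, z = cursor
    xy = volume[z, :, :]   # X-Y plane: fix z
    yz = volume[:, :, x]   # Y-Z plane: fix x
    zx = volume[:, y, :]   # Z-X plane: fix y
    return xy, yz, zx
```

Moving the cursor simply changes the fixed index of each slice, which is why all three cross-sections can be updated simultaneously as the cursor is dragged.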
  • When an enlargement command is received, image display data corresponding to an enlarged image of the one cross section corresponding to the planar cursor specified by the command is generated and transmitted. The image processing apparatus thus further includes fourth image processing means, and a desired cross-sectional image on the stereoscopic image can be enlarged and displayed with a point indication object such as a fingertip.
  • The display screen of the image display device can display, by area division, the stereoscopic image, the cross-sectional images designated by each plane cursor of the stereoscopic image, and one enlarged cross-sectional image.
  • Preferably, the stereoscopic image with the orthogonal three-plane cursor and the enlarged image of one cross section are displayed at almost the same size, together with the three cross-sectional images, so that all images can be referred to at a glance.
  • Since the stereoscopic image and the enlarged cross-sectional image are displayed at almost the same size, they are easy to compare, so errors in judging the range and size of the corresponding part are unlikely to occur.
  • The image processing device performs image processing designated in advance or designated each time by an operation command on one or more types of three-dimensional image data, generates image display data viewed from a predetermined viewpoint, and transmits the data to the surgeon-side terminal device. This allows a three-dimensional image of the predetermined part of the patient, suitable for the operation, to be composed from the stored 3D images, which can be used as a simulation when performing the operation, so that a more effective operation can be performed.
  • The one or more types of three-dimensional image data are obtained by imaging a predetermined part of the patient via diagnostic apparatuses such as a CT apparatus, an MRI apparatus, or a PET apparatus. By combining imaging data from diagnostic devices that yield different types of image data, a 3D image of the specific part of the patient suited to the surgery can be displayed, so that the surgeon can perform a more accurate operation.
  • Viewed from another aspect, the surgeon support system comprises: a terminal device functioning as a man-machine interface; an image storage device storing three-dimensional image data obtained by diagnosing a predetermined part of the patient via a diagnostic device such as a CT device, an MRI device, or a PET device; and an image processing device that reads the patient's three-dimensional image data from the image storage device, performs image processing designated in advance or by an operation command, generates image display data viewed from a predetermined viewpoint, and displays the corresponding image on the operating room side terminal device.
  • the terminal device includes an image display having a predetermined display screen and an input operation acceptor for accepting an input operation.
  • The input operation acceptor is located where it does not block the display screen of the image display, detects in a non-contact manner the position of a point indication object such as a fingertip in the plane close to and parallel to the display screen, and has command generating means for generating a corresponding operation command based on the position of the point indication object detected by the point indication position detector and its movement mode.
  • The image processing apparatus has at least: first image processing means that generates image display data corresponding to a stereoscopic image with cursor, obtained by superimposing a predetermined planar cursor, sequentially updated according to cursor movement operations, on the stereoscopic image corresponding to the three-dimensional image data of the predetermined part of the patient, and displays it on the image display of the terminal device; second image processing means that, in response to a first operation command received from the input operation acceptor of the terminal device, generates image display data corresponding to a state in which the stereoscopic image with cursor is rotated by the direction and amount specified by the operation command and displays it on the terminal device; and third image processing means that generates image display data corresponding to a cross-sectional image designated by a predetermined planar cursor after rotation and displays it on the terminal device.
  • With this configuration, a stereoscopic image of the predetermined part suitable for the patient's surgery can be displayed on the terminal screen. Also, since the input operation is performed by detecting the position of the point indication object, such as a fingertip, without contact, the surgeon can operate the screen easily and freely without using an input device such as a mouse or a keyboard.
  • An in-vivo simulation of the patient can thus be performed using the stereoscopic image with plane cursors, created from data acquired via a diagnostic apparatus such as a CT apparatus, an MRI apparatus, or a PET apparatus, and the cross-sectional images specified by each plane cursor. Moreover, the input operation acceptor of the surgeon-side terminal device allows the surgeon to operate simply by moving a fingertip or the like along the display screen as the point indication object, without using an input device such as a mouse or keyboard.
  • A system configuration diagram of the surgeon support system according to the present invention is shown in FIG. 1.
  • As shown in the figure, this surgeon support system comprises an image processing device 1, a surgeon-side terminal device 2, a CT device 3, an MRI device 4, a PET device 5, an auxiliary storage device 6, and a data output device 7. These system elements are connected by a system bus or LAN cable (not shown).
  • The image processing apparatus 1 generates image display data by performing image processing on the 3D data obtained by imaging a predetermined part of the patient via the diagnostic apparatuses 3 to 5 described above. The created image display data is sent to the surgeon-side terminal device 2.
  • the surgeon-side terminal device 2 includes an image display having a predetermined display screen and an input operation accepting device for receiving an input operation.
  • the image display is preferably a flat display such as LCD, PDP, or organic EL.
  • the input operation acceptor accepts an input operation such as rotation on the display image, and the operation command generated during the input operation is sent to the image processing apparatus 1 described above.
  • the CT apparatus 3, the MRI apparatus 4, and the PET apparatus 5 are medical diagnostic imaging apparatuses conventionally used.
  • the basic operation of CT, MRI, and PET devices has already been widely known from various documents, and a description thereof will be omitted.
  • FIG. 2 shows in detail an example of an input operation acceptor and an image display of the terminal device according to the present embodiment.
  • As shown in the figure, the terminal device 2 includes an image display D, a point indication position detector 21, command generation means 22, a housing 23, spacers 24, and a retroreflective member 25.
  • As the image display D, a CRT image display or a liquid crystal image display is used.
  • On the image display D, one stereoscopic image created by the image processing apparatus 1, three cross-sectional images, and one enlarged cross-sectional image are displayed.
  • The point indication position detector 21 detects, in a non-contact manner, the position of a point indication object such as a fingertip present in a plane area close to and parallel to the display screen. To improve the detection accuracy, one point indication position detector 21 is attached at each of the upper left and upper right, and both are connected to the command generation means 22.
  • The command generation means 22 calculates the coordinate position and movement distance of the point indication object from the signals detected by the point indication position detectors 21, creates an operation command based on the calculation result, and sends the operation command to the image processing apparatus 1.
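The calculation performed by the command generation means 22, deriving a coordinate position and a movement distance from the sampled detector positions, could look like the following sketch (Python; the data structure and names are illustrative assumptions, not from the patent):

```python
import math
from dataclasses import dataclass

@dataclass
class OperationCommand:
    position: tuple      # latest (x, y) coordinate of the point indication object
    distance: float      # total distance travelled during the gesture

def generate_command(path):
    """Turn a sampled fingertip path into an operation command (illustrative)."""
    # Sum the straight-line distances between successive sampled positions.
    distance = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    return OperationCommand(position=path[-1], distance=distance)
```

The resulting command object would then be serialized and sent to the image processing apparatus 1, which uses the movement distance, for example, to compute a rotation angle.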
  • FIG. 3 shows a cross-sectional view of the input operation acceptor of the terminal device 2 of the present embodiment.
  • A transparent input flat plate 26 made of acrylic resin, polycarbonate resin, or the like is arranged so as to overlap the display screen.
  • the input flat plate 26 is fixed to the housing 23 via the spacer 24.
  • Spacers 24 are arranged on the upper, lower, left and right sides of the housing 23.
  • the retroreflective member 25 is disposed on the inner wall of each side of the spacer 24 so as to be perpendicular to the input flat plate 26.
  • The point indication position detector 21 is fixed to the housing 23 via a spacer 24, and includes a light source 27 such as an infrared LED, a prism 28, an imaging lens 29, imaging means 30, and the like.
  • Light projected from the light source 27 along the optical axis L strikes the retroreflective member 25 and is retroreflected; when a point indication object F such as a fingertip intervenes, the returning retroreflected light passing through the prism 28 and the imaging lens 29 is imaged by the imaging means 30 and converted into an electrical signal. From the electrical signal obtained by the imaging means 30, the command generation means 22 calculates the coordinate position and operation pattern of the point indication object F, and a suitable operation command is created.
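Two corner-mounted detectors that each observe only a direction can locate the fingertip on the input plate by triangulation. A minimal sketch under that assumption (the coordinate frame and angle convention are illustrative, not taken from the patent):

```python
import math

def locate_fingertip(width, angle_left, angle_right):
    """Triangulate the point indication object from the two corner detectors.

    The detectors sit at the top-left (0, 0) and top-right (width, 0) corners
    of the input plate; each reports the angle (radians) between the top edge
    and the direction of the shadow it sees in the retroreflected light.
    """
    ta, tb = math.tan(angle_left), math.tan(angle_right)
    # Intersect the two rays: y = x*tan(a) from the left, y = (width-x)*tan(b)
    # from the right.
    x = width * tb / (ta + tb)
    y = x * ta
    return x, y
```

This is why one detector at each upper corner suffices to resolve a 2-D position without any contact with the plate.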
  • FIG. 4, FIG. 5, and FIG. 8 to FIG. 14 are flowcharts showing the processing related to the main part of the present invention. The operation of the system of this embodiment will be described systematically with reference to these flowcharts and the screen display examples shown in FIG. 6, FIG. 7, and the related figures.
  • FIG. 4 shows the general flowchart of the processing of the surgeon support system constituting the system of the present embodiment. As shown in the figure, this processing includes start processing (step 401), stereoscopic image G1 creation processing (step 402), cross-sectional image G2 creation processing (step 403), enlarged cross-sectional image G3 creation processing (step 404), screen display processing (step 405), command analysis processing (step 407), image storage processing (step 408), and end processing (step 409).
  • When the image processing apparatus 1 has displayed the display data on the screen (step 405), it stands by until an operation command is received from the surgeon-side terminal device 2 (step 406).
  • In the image storage process, when the operation command from the surgeon-side terminal device 2 is "image storage", the stereoscopic image is stored at the designated location.
  • In the end process, if the operation command from the surgeon-side terminal device 2 is "end", the end processing of the surgeon support system is performed and the display screen is closed.
  • Details of the start process (step 401) are shown in FIG. 5. As shown in the figure, when the surgeon support system is activated, a system menu screen is displayed (step 501). An example of this system menu screen is shown in FIG. 6. As shown in the figure, the system menu screen SG1 includes a new creation button 601 and a button 602 for opening a created file. When the system menu screen SG1 is displayed, the system stands by until either the new creation button 601 or the created-file open button 602 is pressed (step 502 NO).
  • When one of these buttons is pressed (step 502 YES), the type of the pressed button is identified. When the pressed button is the new creation button 601 (step 503 new creation), the new creation menu screen is displayed (step 504).
  • As shown in FIG. 7, the new creation menu screen SG2 includes an image creation source CT device button 701, an image creation source MRI device button 702, an image creation source PET device button 703, an image creation source device button 704, an X-Y plane cursor button 705, a Y-Z plane cursor button 706, a Z-X plane cursor button 707, an all-plane cursor button 708, a creation execute button 709, and a created-file open button 710.
  • The surgeon can select the 3D image data for creating a stereoscopic image by pressing one of the image creation source designation buttons 701 to 704. Further, by pressing one of the plane cursor designation buttons 705 to 708, the orthogonal plane cursor to be superimposed on the stereoscopic image can be selected.
  • the process waits until the creation execution button 709 is pressed (step 505 NO).
  • When the creation execute button 709 is pressed (step 505 YES), the type of image creation source device selected on the new creation menu screen SG2 is temporarily stored in the image processing device 1 (step 506). The shooting direction of the selected 3D image data is also temporarily stored in the image processing apparatus 1 (step 507). Further, the coordinates where the plane cursors selected on the new creation menu screen SG2 intersect are set, and the setting contents are temporarily stored in the image processing apparatus 1 (step 508).
  • When the button 602 for opening a created file on the system menu screen SG1 is pressed (step 503 created), an image selection dialog (not shown) is displayed (step 509), and the process waits until a stereoscopic image file is selected (step 510).
  • When a file is selected, its image information, including the 3D image data type, the stereoscopic image direction, and the planar cursor coordinates, is read (step 511) and temporarily stored in the image processing apparatus 1 (step 512).
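The image information restored in steps 511 and 512 (3D image data type, stereoscopic image direction, planar cursor coordinates) could be modelled as a small record; the field and key names below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ImageInfo:
    source: str              # originating diagnostic device, e.g. "CT", "MRI", "PET"
    direction: tuple         # stored viewing direction of the stereoscopic image
    cursor_coords: tuple     # (x, y, z) where the three plane cursors intersect

def load_image_info(record):
    """Rebuild the image information saved with a stereoscopic image file."""
    return ImageInfo(record["source"], tuple(record["direction"]),
                     tuple(record["cursor"]))
```

Persisting exactly these three items is what lets a previously created stereoscopic image reopen with its cursors and orientation intact.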
  • In the stereoscopic image G1 creation process (step 402), the image processing apparatus 1 first determines whether or not this is the initial operation (step 801).
  • If it is the initial operation, the stereoscopic image G1 is created in the predetermined direction from the 3D image data of the source diagnostic device temporarily stored in the start process (step 401) (step 802).
  • Based on the planar cursor information set in the start process (step 401), a planar cursor passing through the predetermined coordinates is additionally created on the stereoscopic image G1 (step 803).
  • the flag F1 is set to “1” (step 804).
  • If the flag F1 is "1" (step 801 F1="1"), the operation command analysed in the command analysis process (step 407) described later is determined (step 805).
  • If the command is "rotate stereoscopic image" (step 805 rotation), the stereoscopic image is recreated according to the angle calculated from the movement distance of the point indication object described above (step 806).
  • If the command is "change stereoscopic image data" (step 805 data change), the stereoscopic image is recreated from the three-dimensional image data of the designated diagnostic device (step 807).
  • If the command is "move" or "jump" (step 805 move/jump), the stereoscopic image is recreated so that the location specified by the point indication object comes to the front (step 808), and a plane cursor passing through the specified coordinates is additionally created (step 809).
  • Details of the cross-sectional image G2 creation process (step 403) are shown in FIG. 9.
  • As shown in the figure, the image processing apparatus 1 creates a cross-sectional image G21 from the coordinates through which the X-Y plane cursor created on the stereoscopic image G1 passes and the 3-dimensional image data selected in the start process (step 901). Similarly, a cross-sectional image G22 is created from the coordinates through which the Y-Z plane passes and the selected 3D image data (step 902), and a cross-sectional image G23 is created from the coordinates through which the Z-X plane passes and the selected 3D image data (step 903).
  • In the enlarged cross-sectional image G3 creation process (step 404), shown in FIG. 10, if the flag F3 is "1" (step 1001 F3="1"), the cross-sectional image selection command analysed in the command analysis process (step 407) is determined (step 1004). If the command is "X-Y plane" (step 1004 G21), the cross-sectional image G21 of the X-Y plane is enlarged (step 1005). If the command is "Y-Z plane" (step 1004 G22), the cross-sectional image G22 of the Y-Z plane is enlarged (step 1006). If the command is "Z-X plane" (step 1004 G23), the cross-sectional image G23 of the Z-X plane is enlarged (step 1007). If the operation command does not specify a cross-sectional image (step 1004 not specified), the previously selected cross-sectional image is enlarged again (step 1008).
  • FIG. 11 shows details of the command analysis processing (step 407).
  • As shown in the figure, the image processing apparatus 1 stands by until an operation command signal is received from the input operation acceptor of the surgeon-side terminal device 2 (step 1101). The type of operation is then determined from the movement of the point indication object (step 1102). If the operation is a touch, touch processing is executed (step 1103); if it is a drag, drag processing is executed (step 1104); and if it is a rotation, image rotation processing is executed (step 1105).
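One plausible way to discriminate touch, drag, and rotation from the sampled fingertip path is to compare the travelled distance with the net displacement; the thresholds and rule below are illustrative assumptions, not the method disclosed in the patent:

```python
import math

def classify_gesture(path, touch_threshold=5.0):
    """Classify a fingertip path as "touch", "drag", or "rotate" (illustrative).

    A nearly stationary path is a touch; a path whose net displacement is close
    to its travelled length is a drag; a path that doubles back on itself
    (e.g. a circular motion) is treated as a rotation.
    """
    travelled = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    if travelled < touch_threshold:
        return "touch"
    net = math.dist(path[0], path[-1])
    return "drag" if net > 0.8 * travelled else "rotate"
```

The classifier's output would select which of the three branches (steps 1103 to 1105) is executed.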
  • In the touch processing, the touch location is determined from the detected touch coordinates (step 1201).
  • If the touch location is a "cross-sectional image" (step 1201 cross-sectional image), it is analysed as a command to switch the enlarged display to the selected cross-sectional image (step 1202).
  • If the touch location is the "stereoscopic image" (step 1201 stereoscopic image), it is analysed as a jump display command that moves the specified location of the stereoscopic image to the front (step 1203).
  • If the touch location is the "stereoscopic image source switching button" (step 1201 switching), it is analysed as a stereoscopic image creation source switching command for selecting the diagnostic device to be used as the creation source (step 1204).
  • If the touch location is the "save button" (step 1201 save), it is analysed as an image save command for saving the current 3D image information with plane cursors and the cross-sectional image information (step 1205).
  • If the touch location is the "end button" (step 1201 end), it is analysed as an operation end command to end the surgeon support system (step 1206).
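The touch analysis of steps 1201 to 1206 is essentially a hit test of the touch coordinate against the regions laid out on screen G. A sketch (the region names, rectangle layout, and command strings are assumptions for illustration):

```python
def analyse_touch(x, y, regions):
    """Map a touch coordinate to its command, mirroring the branches of step 1201.

    regions : list of (name, x0, y0, x1, y1) rectangles laid out on screen G.
    Returns the analysed command name, or None if nothing was hit.
    """
    commands = {
        "cross_section": "enlarge_cross_section",   # step 1202
        "stereo_image": "jump_display",             # step 1203
        "source_button": "switch_source",           # step 1204
        "save_button": "save_image",                # step 1205
        "end_button": "end_system",                 # step 1206
    }
    for name, x0, y0, x1, y1 in regions:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return commands.get(name)
    return None
```

Because the non-contact detector only yields coordinates, every on-screen button is resolved through this kind of region lookup rather than a hardware event.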
  • FIG. 13 shows details of the drag process (step 1104).
  • As shown in the figure, the drag location is determined from the detected drag coordinates and the movement distance of the point indication object (step 1301). If the drag location is a "plane cursor" (step 1301 plane cursor), it is analysed as a plane cursor movement command to move the specified plane cursor in the drag direction (step 1302).
  • If the drag location is the "stereoscopic image" (step 1301 stereoscopic image), it is analysed as a stereoscopic image movement command for moving the stereoscopic image in the drag direction (step 1303).
  • Details of the image rotation processing (step 1105) are shown in FIG. 14.
  • As shown in the figure, rotation of the image in the left-right direction is determined from the detected movement and movement distance of the point indication object (step 1401), and rotation in the front-rear direction is likewise determined (steps 1402 and 1405).
  • In the case of a right-front rotation, it is analysed as a right-front rotation command (step 1403); in the case of a right-rear rotation, as a right-rear rotation command (step 1404); in the case of a left-front rotation, as a left-front rotation command (step 1406); and in the case of a left-rear rotation, as a left-rear rotation command (step 1407).
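The four-way classification of steps 1403 to 1407 can be read as resolving the signs of the horizontal and depth components of the rotation gesture. A sketch under an assumed sign convention (not specified in the patent):

```python
def rotation_command(dx, dy):
    """Resolve a rotation gesture into one of the four commands of steps 1403-1407.

    dx > 0 is taken as movement to the right, dy > 0 as movement toward the
    surgeon ("front"); this sign convention is an assumption for illustration.
    """
    horizontal = "right" if dx >= 0 else "left"
    depth = "front" if dy >= 0 else "rear"
    return f"{horizontal}_{depth}_rotation"
```

The analysed command then drives step 806, where the rotation angle itself is computed from the movement distance.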
  • A specific screen display example according to the present invention is shown in FIG. 15.
  • As shown in the figure, the stereoscopic image with cursor, the cross-sectional images designated by the cursors, the enlarged image of the one designated cross section, and the operation buttons are simultaneously displayed on the screen G.
  • The stereoscopic image G1 with cursor and the enlarged image G3 of one cross section are displayed at almost life size, matching the patient.
  • The operation buttons include a button 1501 for opening a frequently used folder, a save button 1502, an end button 1503, an image information button 1504 for displaying stereoscopic image information, and adjustment buttons such as an automatic density button 1505 for adjusting the color of the stereoscopic image, a contrast high/low button 1506, a light/dark button 1507, and an enlarge/reduce button 1508.
  • FIG. 16 shows how the screen G is displayed on the surgeon-side terminal device 2. As shown in the figure, the screen G described above is displayed on the image display of the surgeon-side terminal device 2.
  • The point indication position detectors 21 attached to the upper part of the surgeon-side terminal device 2 detect the position information of the point indication object, such as a finger, as the surgeon touches, drags, or rotates over the portion of the screen G where each image or button is displayed.
  • An example in which the present surgeon support system is arranged in the operating room is shown in FIG. 17.
  • As shown in the figure, the surgeon-side terminal device 2 is attached to the ceiling of the operating room R via an arm. Since the arm allows the position of the surgeon-side terminal device 2 to be moved, it can be positioned where the surgeon Do can operate it easily. Therefore, it can be operated easily even while performing surgery on the patient P lying on the operating table T.
  • As described above, the present surgeon support system can display, at life size, a stereoscopic image suitable for the surgery from the three-dimensional image data obtained by diagnosing the predetermined part of the patient via a diagnostic apparatus such as a CT apparatus, an MRI apparatus, or a PET apparatus.
  • Since a cross-sectional image corresponding to each of the three orthogonal planes and a life-size enlarged image of one cross section are displayed simultaneously on one screen, together with the stereoscopic image with the orthogonal three-plane cursor, the surgery can be carried out more accurately.
  • Since the surgeon support system does not use an input device such as a keyboard or a mouse, and the touch, drag, and rotation operations are performed with a point indication object such as a finger, both operability and hygiene are improved.
  • FIG. 1 is a block diagram showing a hardware configuration of the system according to the present embodiment.
  • FIG. 2 is an explanatory diagram showing a terminal device according to the present embodiment.
  • FIG. 3 is a cross-sectional view showing an input operation acceptor of the terminal device of the present embodiment.
  • FIG. 4 is a general flowchart showing a software configuration of the system of the present embodiment.
  • FIG. 5 is a flowchart showing details of start processing.
  • FIG. 6 is an explanatory diagram showing a display example of a menu screen of the terminal device.
  • FIG. 7 is an explanatory diagram showing a display example of a new image menu screen of the terminal device.
  • FIG. 8 is a flowchart showing details of a stereoscopic image G1 creation process.
  • FIG. 9 is a flowchart showing details of a cross-sectional image G2 creation process.
  • FIG. 10 is a flowchart showing details of an enlarged cross-sectional image G3 creation process.
  • FIG. 11 is a flowchart showing details of command analysis processing.
  • FIG. 12 is a flowchart showing details of touch processing in command analysis processing.
  • FIG. 13 is a flowchart showing details of a drag process in the command analysis process.
  • FIG. 14 is a flowchart showing details of image rotation processing in command analysis processing.
  • FIG. 15 is an explanatory diagram showing a display example of a simulation screen.
  • FIG. 16 is an explanatory diagram showing a display example in which a simulation screen is displayed on the terminal device of the present embodiment.
  • FIG. 17 is an explanatory diagram showing an example of an operating room into which the present surgeon support system has been introduced.

Explanation of symbols

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Gynecology & Obstetrics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

An operating room terminal comprises an image display provided with a predetermined display screen, and an input operation receiver for receiving input operations, disposed at a place where it does not obstruct the display screen of the image display. The input operation receiver includes a point indication position detector that can detect, in a non-contact way, the position of a point indicating object such as a fingertip present within a surface area close and parallel to the display screen, and command generating means for generating an operation command corresponding to the position of the point indicating object detected by the point indication position detector and its movement mode. A three-dimensional image is rotated through the operation of moving the point indicating object such as a fingertip along the display screen, and a desired cross-sectional image of the patient on the three-dimensional image is displayed on the screen of the image display in the operating room.

Description

Specification
Surgeon Support System
Technical Field
[0001] The present invention relates to a surgeon support system suitable for use when a physician or a dentist performs a surgical operation, and more particularly to a surgeon support system that uses an interactive image display device to display in-vivo images of a patient.
Background Art
[0002] Conventionally, a typical surgeon support system has been an extremely primitive one in which an X-ray photograph of the patient, taken before surgery, is pasted on the front of an illumination panel in the operating room and the surgeon refers to it as needed. Since the X-ray image that can be observed with such a system is fixed by the body angle at the time of imaging, the system cannot respond even if the surgeon wishes to view the inside of the body from a different angle during surgery.
[0003] On the other hand, interactively observing the inside of a patient's body using an image display device, based on three-dimensional image data obtained by diagnosis with a CT apparatus, an MRI apparatus, or a PET apparatus, is practiced at various medical diagnosis sites (see, for example, Patent Document 1).
Patent Document 1: Japanese Patent Laid-Open No. 2002-534204
Disclosure of the Invention
Problems to Be Solved by the Invention
[0004] However, since the interactive medical support systems using the above-described image display devices employ office-oriented operation units such as keyboards and mice, a surgeon wearing gloves or holding a scalpel or other treatment instruments cannot possibly make use of them as they are.
[0005] In addition, conventional image display systems are often designed solely with the convenience of the diagnostician in mind; in particular, the design of their guidance screens leaves considerable room for improvement in terms of operability.
[0006] The present invention has been made in view of the above problems, and an object thereof is to provide a surgeon support system that is an interactive system using an image display device and yet is easy to use even for a surgeon in the operating field.
[0007] Still other objects and effects of the present invention will be readily understood by those skilled in the art from the following description of the specification.
Means for Solving the Problems
[0008] A surgeon support system according to the present invention includes: an operating room terminal device installed near the operating table in the operating room and functioning as a man-machine interface; an image storage device storing three-dimensional image data obtained by diagnosing a predetermined site of a patient via a diagnostic apparatus such as a CT apparatus, an MRI apparatus, or a PET apparatus; and an image processing device that generates image display data viewed from a predetermined viewpoint by applying, to the patient's three-dimensional image data read out from the image storage device, image processing designated in advance or designated each time by an operation command, and causes the operating room terminal device to display the corresponding image. The diagnostic apparatus may also include an ultrasonic diagnostic apparatus and an interventional radiology (IVR) apparatus.
[0009] The operating room terminal device has an image display with a predetermined display screen and an input operation receiver for accepting input operations. The input operation receiver includes a point indication position detector capable of detecting, in a non-contact manner, the position of a point indicating object such as a fingertip present within a surface area close to and parallel to the display screen, and command generating means for generating a corresponding operation command based on the position of the point indicating object detected by the point indication position detector and its movement mode.
[0010] The image processing device has at least: first image processing means for generating image display data corresponding to a cursor-bearing stereoscopic image, obtained by superimposing on the stereoscopic image corresponding to the three-dimensional image data of the predetermined site of the patient a predetermined plane cursor that is updated successively according to cursor movement operations, and displaying it on the image display of the operating room terminal device; second image processing means, responsive to a first operation command arriving from the input operation receiver of the operating room terminal device, for generating image display data corresponding to the cursor-bearing stereoscopic image rotated by the direction and amount designated by the operation command and displaying it on the surgeon-side terminal device; and third image processing means for generating image display data corresponding to the cross-sectional image designated by the predetermined plane cursor after rotation and displaying it on the operating room terminal device.
[0011] According to the present invention, a stereoscopic image of a predetermined site suited to the patient's surgery can be displayed on the surgeon-side terminal screen. Moreover, since input operations are performed without contact, using a fingertip or the like as the point indicating object, the surgeon can operate the screen easily and freely without using an input device such as a mouse or a keyboard. Since the frequency of touching devices other than the patient and the treatment instruments is greatly reduced, hygiene is markedly improved.
[0012] In one embodiment of the present invention, the predetermined plane cursor is an orthogonal three-plane cursor, whereby cross-sectional images corresponding to the three orthogonal planes are displayed simultaneously. By moving the orthogonal three-plane cursor, the three orthogonal cross sections inside the stereoscopic image of the predetermined site of the patient can be displayed easily.
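The patent does not disclose how the three sections are computed; as a minimal sketch (assuming, purely for illustration, that the diagnostic volume is held as a (Z, Y, X) voxel array — a convention not stated in the patent), the three cross-sections selected by the orthogonal three-plane cursor reduce to plain array indexing:

```python
import numpy as np

def orthogonal_sections(volume, cursor):
    """Read out the three orthogonal cross-sections of a 3D volume
    at the voxel where the orthogonal three-plane cursor intersects.

    volume : ndarray of shape (Z, Y, X), e.g. a stack of CT slices
    cursor : (z, y, x) voxel coordinates of the cursor intersection
    """
    z, y, x = cursor
    axial    = volume[z, :, :]   # X-Y plane at depth z
    coronal  = volume[:, y, :]   # Z-X plane at height y
    sagittal = volume[:, :, x]   # Z-Y plane at width x
    return axial, coronal, sagittal

# Synthetic 8x8x8 volume; moving the cursor just changes the indices.
vol = np.arange(8 * 8 * 8).reshape(8, 8, 8)
ax, co, sa = orthogonal_sections(vol, (2, 3, 4))
print(ax.shape, co.shape, sa.shape)  # → (8, 8) (8, 8) (8, 8)
```

Re-slicing on every cursor update is cheap, which is what makes the interactive cursor movement described above feasible.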
[0013] In a preferred embodiment of the present invention, the image processing device is further provided with fourth image processing means which, in response to the second operation command, sends out image display data corresponding to an enlarged image of the one cross section corresponding to the plane cursor designated by the command, so that a desired cross-sectional image on the stereoscopic image can be displayed enlarged with a point indicating object such as a fingertip. This allows a desired cross-sectional image on the patient's stereoscopic image to be displayed easily on the screen of the image display.
[0014] In a preferred embodiment of the present invention, the display screen of the image display can show, by an area division scheme, the stereoscopic image, the cross-sectional images designated by the respective plane cursors of the stereoscopic image, and one enlarged cross-sectional image. The stereoscopic image bearing the orthogonal three-plane cursor and the enlarged image of one cross section are displayed at approximately life size, and the three cross-sectional images are displayed at a smaller size. Since one stereoscopic image, three cross-sectional images, and one enlarged cross-sectional image are displayed simultaneously on one screen, all the images can be consulted at a glance. Furthermore, because the stereoscopic image and the enlarged cross-sectional image are displayed at approximately life size, they are easy to compare, making errors in judging the extent or size of the site concerned less likely.
[0015] In a preferred embodiment of the present invention, the image processing device applies, to one or more kinds of three-dimensional image data, image processing designated in advance or designated each time by an operation command, thereby generating image display data viewed from a predetermined viewpoint and sending the data to the surgeon-side terminal device. By selecting among the three-dimensional image data sets, the surgeon can create a stereoscopic image of the predetermined site of the patient suited to the surgery, use it as a simulation when performing the surgery, and thus operate more effectively.
[0016] In a preferred embodiment of the present invention, the one or more kinds of three-dimensional image data are three kinds of three-dimensional image data obtained by imaging a predetermined site of the patient via diagnostic apparatuses such as a CT apparatus, an MRI apparatus, and a PET apparatus. This makes it possible to combine imaging data from diagnostic apparatuses with different image data types and to display a stereoscopic image of the predetermined site of the patient suited to the surgery, so that the surgeon can operate more accurately.
[0017] Viewed from another angle, a surgeon support system according to the present invention includes: a terminal device functioning as a man-machine interface; an image storage device storing three-dimensional image data obtained by diagnosing a predetermined site of a patient via a diagnostic apparatus such as a CT apparatus, an MRI apparatus, or a PET apparatus; and an image processing device that generates image display data viewed from a predetermined viewpoint by applying, to the patient's three-dimensional image data read out from the image storage device, image processing designated in advance or designated each time by an operation command, and causes the operating-room-side terminal device to display the corresponding image.
[0018] The terminal device has an image display with a predetermined display screen and an input operation receiver for accepting input operations.
[0019] The input operation receiver includes a point indication position detector which is located so as not to obstruct the display screen of the image display and which can detect, in a non-contact manner, the position of a point indicating object such as a fingertip present within a surface area close to and parallel to the display screen, and command generating means for generating a corresponding operation command based on the position of the point indicating object detected by the point indication position detector and its movement mode.
[0020] The image processing device has at least: first image processing means for generating image display data corresponding to a cursor-bearing stereoscopic image, obtained by superimposing on the stereoscopic image corresponding to the three-dimensional image data of the predetermined site of the patient a predetermined plane cursor updated successively according to cursor movement operations, and displaying it on the image display of the operating room terminal device; second image processing means, responsive to a first operation command arriving from the input operation receiver of the terminal device, for generating image display data corresponding to the cursor-bearing stereoscopic image rotated by the direction and amount designated by the operation command and displaying it on the terminal device; and third image processing means for generating image display data corresponding to the cross-sectional image designated by the predetermined plane cursor after rotation and displaying it on the terminal device.
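The second image processing means rotates the cursor-bearing stereoscopic image by the direction and amount carried in the operation command. The patent does not specify how a fingertip drag maps to a rotation; the sketch below shows one plausible scheme, in which the `gain` constant and the drag-to-axis convention are assumptions for illustration only:

```python
import numpy as np

def rotation_from_drag(dx_px, dy_px, gain=0.005):
    """Map a fingertip drag (in pixels) to a 3x3 viewpoint rotation.

    A horizontal drag rotates the stereoscopic image about the
    vertical (Y) axis and a vertical drag about the horizontal (X)
    axis; `gain` converts pixels to radians (assumed tuning value).
    """
    ay, ax = dx_px * gain, dy_px * gain
    rot_y = np.array([[ np.cos(ay), 0.0, np.sin(ay)],
                      [ 0.0,        1.0, 0.0       ],
                      [-np.sin(ay), 0.0, np.cos(ay)]])
    rot_x = np.array([[1.0, 0.0,         0.0        ],
                      [0.0, np.cos(ax), -np.sin(ax)],
                      [0.0, np.sin(ax),  np.cos(ax)]])
    return rot_y @ rot_x

# A 100 px horizontal drag turns the view axis by 0.5 rad about Y.
R = rotation_from_drag(100, 0)
print(np.round(R @ np.array([0.0, 0.0, 1.0]), 3))
```

Composing the per-frame rotation with the accumulated viewpoint matrix, and then re-slicing along the rotated plane cursors, would realize the second and third image processing means together.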
[0021] According to the present invention, a stereoscopic image of a predetermined site suited to the patient's surgery can be displayed on the terminal screen. Moreover, since input operations are performed without contact, by pointing with a fingertip or the like, the surgeon can operate without using an input device such as a mouse or a keyboard and can manipulate the screen easily and freely.
Effects of the Invention
[0022] According to the present invention, the in-vivo simulation of a patient can be realized more accurately with a stereoscopic image bearing plane cursors and the cross-sectional images designated by the respective plane cursors, based on data from a diagnostic apparatus such as a CT apparatus, an MRI apparatus, or a PET apparatus. In addition, with the input operation receiver provided on the surgeon-side terminal device, it is possible to provide a surgeon support system in which the surgeon, without using an input device such as a mouse or a keyboard, rotates the stereoscopic image by moving a fingertip or the like as a point indicating object along the display screen, so that a desired cross-sectional image on the patient's stereoscopic image can be displayed on the screen of the image display in the operating room.
Best Mode for Carrying Out the Invention
[0023] An embodiment of the surgeon support system according to the present invention will be described in detail below with reference to the accompanying drawings.
[0024] FIG. 1 shows the system configuration of the surgeon support system according to the present invention. As shown in the figure, this surgeon support system comprises an image processing device 1, a surgeon-side terminal device 2, a CT apparatus 3, an MRI apparatus 4, a PET apparatus 5, an auxiliary storage device 6, and a data output device 7. These system elements are connected by a system bus and LAN cables (not shown).
[0025] The image processing device 1 applies image processing to the three-dimensional data obtained by imaging a predetermined site of the patient via the diagnostic apparatuses 3-5 described above to create image display data, and the created image display data is sent to the surgeon-side terminal device.  [0026] The surgeon-side terminal device 2 has an image display with a predetermined display screen and an input operation receiver for accepting input operations. A flat display such as an LCD, a PDP, or an organic EL display is preferable as the image display. The image display shows the image data sent from the image processing device 1. The input operation receiver accepts input operations such as rotation on the displayed image, and the operation commands generated during the input operation are sent to the image processing device 1.
[0027] The contents of the operation commands will be described in detail later. Although the embodiment of the present invention is described with reference to a surgeon-side terminal device, it goes without saying that the invention can also be implemented with a terminal device equipped with an input operation receiver and installed outside the operating room.
[0028] The CT apparatus 3, the MRI apparatus 4, and the PET apparatus 5 are conventionally used medical diagnostic imaging apparatuses. Since the basic operation of CT, MRI, and PET apparatuses is already widely known from various literature, its description is omitted.
[0029] Next, FIG. 2 shows in detail an example of the input operation receiver and the image display of the terminal device of the present embodiment. As shown in the figure, the terminal device 2 comprises an image display D, point indication position detectors 21, command generating means 22, a housing 23, spacers 24, and retroreflective members 25.
[0030] A CRT image display or a liquid crystal image display is used as the image display D. The image display D shows one stereoscopic image created by the image processing device 1, three cross-sectional images, and one enlarged cross-sectional image.
[0031] The point indication position detectors 21 detect, in a non-contact manner, the position of a point indicating object such as a fingertip present within a surface area close to and parallel to the display screen. To raise the detection accuracy, one point indication position detector 21 is mounted at each of the upper left and upper right, and both are connected to the command generating means 22.
[0032] The command generating means 22 calculates the coordinate position, movement distance, and the like of the point indicating object from the signals detected by the point indication position detectors 21, creates an operation command from the calculation results, and sends the operation command to the image processing device 1.
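The patent does not state how the command generating means 22 tells a brief touch apart from a drag. A deliberately simple heuristic, with thresholds that are pure assumptions for illustration, might look like this:

```python
import math

def classify_gesture(track, tap_radius=8.0, tap_frames=5):
    """Classify a fingertip track into an operation command.

    track : list of (x, y) positions sampled while the fingertip
            stays inside the detection plane
    A short, nearly stationary contact counts as a touch; anything
    longer or wider-ranging counts as a drag (illustrative rule).
    """
    x0, y0 = track[0]
    span = max(math.hypot(x - x0, y - y0) for x, y in track)
    if len(track) <= tap_frames and span <= tap_radius:
        return "touch"
    return "drag"

print(classify_gesture([(100, 100), (101, 100)]))              # → touch
print(classify_gesture([(100, 100), (160, 100), (220, 100)]))  # → drag
```

A fuller classifier along these lines could add a rotation gesture, for example by testing whether two simultaneous tracks move in opposite directions, but the thresholds and rules would again be implementation choices outside the patent text.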
[0033] Next, FIG. 3 shows a cross-sectional view of the input operation receiver of the terminal device 2 of the present embodiment. As shown in the figure, a transparent input flat plate 26 made of acrylic resin, polycarbonate resin, or the like is superimposed on the image display D. The input flat plate 26 is fixed to the housing 23 via the spacers 24, which are arranged on the upper, lower, left, and right sides of the housing 23. The retroreflective members 25 are arranged on the inner wall of each side of the spacers 24, perpendicular to the input flat plate 26.
[0034] The point indication position detector 21 is fixed to the housing 23 via a spacer 24 and comprises a light source 27 such as an infrared LED, a prism 28, an imaging lens 29, imaging means 30, and so on. In the imaging means 30, the light projected from the light source 27 along the optical axis L strikes the retroreflective member 25 according to the motion of the point indicating object F such as a fingertip, and the retroreflected light is imaged through the prism 28 and the imaging lens 29. The imaged retroreflected light is converted into an electric signal. From the electric signal obtained by the imaging means 30, the command generating means 22 calculates the coordinate position and operation pattern of the point indicating object F and creates a suitable operation command.
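With one detector in each upper corner, the planar position of the point indicating object F can be recovered by intersecting the two bearing rays. The patent only states that two detectors are mounted for accuracy; the coordinate convention below is an assumption for illustration:

```python
import math

def triangulate(width, angle_left, angle_right):
    """Locate a fingertip on the input plane from the bearing angles
    measured by two corner-mounted detectors.

    The detectors sit at (0, 0) and (width, 0) on the top edge, and
    each angle is measured downward from that edge, in radians.
    """
    # Left ray:  y = x * tan(angle_left)
    # Right ray: y = (width - x) * tan(angle_right)
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    x = width * tr / (tl + tr)
    y = x * tl
    return x, y

# Fingertip seen at 45 degrees from both corners of a 400 mm edge.
x, y = triangulate(400.0, math.radians(45), math.radians(45))
print(round(x, 1), round(y, 1))  # → 200.0 200.0
```

In practice each detector reports the pixel column at which the retroreflection is interrupted, which maps to a bearing angle through the lens calibration; the intersection computation itself is the planar triangulation shown here.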
[0035] Next, FIGS. 4, 5, and 8 to 14 are flowcharts showing the parts related to the essential features of the present invention. The operation of the system of the present embodiment will be described systematically below with reference to these flowcharts and to the screen display examples shown in FIGS. 6, 7, and 15 to 17.
[0036] FIG. 4 shows the general flowchart of the processing of the surgeon support system constituting the system of the present embodiment. As shown in the figure, this processing is roughly composed of start processing (step 401), stereoscopic image G1 creation processing (step 402), cross-sectional image G2 creation processing (step 403), enlarged cross-sectional image G3 creation processing (step 404), screen display processing (step 405), command analysis processing (step 407), image saving processing (step 408), and end processing (step 409).
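The steps of the general flowchart can be summarized as one event loop. The `ui` and `processor` objects below are hypothetical stand-ins for the terminal device and the image processing device; their method names are illustrative and not taken from the patent:

```python
def run_support_system(ui, processor):
    """Skeleton of the general flow of FIG. 4 (illustrative only)."""
    processor.start()                        # step 401: start processing
    while True:
        g1 = processor.create_stereo_g1()    # step 402: stereoscopic image G1
        g2 = processor.create_sections_g2()  # step 403: cross-sectional images G2
        g3 = processor.create_enlarged_g3()  # step 404: enlarged cross-section G3
        ui.display(g1, g2, g3)               # step 405: screen display
        cmd = ui.wait_for_command()          # step 406: standby for a command
        processor.analyze(cmd)               # step 407: command analysis
        if cmd == "save image":
            processor.save_image()           # step 408: image saving
        elif cmd == "end":
            processor.shut_down()            # step 409: end processing
            break
```

Each pass through the loop rebuilds the three images so that the display always reflects the most recent rotation or cursor movement before the system returns to the standby state.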
[0037] When the display data is shown on the screen (step 405), the image processing device 1 enters a standby state until an operation command arrives from the surgeon-side terminal device 2 (step 406).
[0038] Further, in the image saving processing (step 408), when the operation command from the surgeon-side terminal device 2 is "save image", the stereoscopic image is saved at the designated location. In the end processing (step 409), when the operation command from the surgeon-side terminal device 2 is "end", the end processing of the surgeon support system is performed and the display screen is closed.  [0039] Details of the start processing (step 401) are shown in FIG. 5. As shown in the figure, when the surgeon support system is started, a system menu screen is displayed (step 501). An example of this system menu screen is shown in FIG. 6. As shown in the figure, the system menu screen SG1 is provided with a "create new" button 601 and an "open existing file" button 602. When the system menu screen SG1 is displayed, the system waits until either the "create new" button 601 or the "open existing file" button 602 is pressed (step 502 NO).
[0040] When either button is pressed (step 502 YES), the type of the pressed button is identified; when the pressed button is the "create new" button 601 (step 503: create new), the new-creation menu screen is displayed (step 504).
[0041] This new-creation menu screen is shown in FIG. 7. As shown in the figure, the new-creation menu screen SG2 is provided with an image-source CT apparatus button 701, an image-source MRI apparatus button 702, an image-source PET apparatus button 703, an image-source all-apparatus button 704, an X-Y plane cursor button 705, a Y-Z plane cursor button 706, a Z-X plane cursor button 707, an all-plane-cursor button 708, a "create" button 709, and an "open existing file" button 710. By pressing one of the image-source designation buttons 701-704, the surgeon can select the three-dimensional image data from which the stereoscopic image is created. Further, by pressing one of the plane-cursor designation buttons 705-708, the orthogonal plane cursors to be superimposed on the stereoscopic image can be selected. When the new-creation menu screen SG2 is displayed, the system waits until the "create" button 709 is pressed (step 505 NO).
[0042] When the "create" button 709 is pressed (step 505 YES), the type of the image-source apparatus selected on the new-creation menu screen SG2 is temporarily stored in the image processing device 1 (step 506), and the imaging direction of the selected three-dimensional image data is also temporarily stored in the image processing device 1 (step 507). Further, the coordinates at which the plane cursors selected on the new-creation menu screen SG2 intersect are set, and the settings are temporarily stored in the image processing device 1 (step 508).
[0043] When the button 602 for opening a created file on the system menu screen SG1 is pressed (step 503: created file), an image selection dialog (not shown) is displayed (step 509), and the system waits until a stereoscopic image file is selected (step 510).
[0044] When a stereoscopic image file is selected (step 510 YES), image information including the three-dimensional image data type, the stereoscopic image orientation, and the plane cursor coordinates is read out (step 511), and the read stereoscopic image information is temporarily stored in the image processing device 1 (step 512).
[0045] Next, details of the stereoscopic image G1 creation process (step 402) are shown in FIG. 8. As shown in the figure, when the stereoscopic image G1 creation process is executed, the image processing device 1 determines whether this is the initial operation (step 801). In the case of the initial operation (step 801: F1 = "0"), the three-dimensional image data of the source diagnostic device temporarily stored in the start process (step 401) is read out, and the stereoscopic image G1 is created in a predetermined orientation (step 802). Once the stereoscopic image G1 has been created, the plane cursor information set in the start process (step 401) is likewise read out, and a plane cursor passing through predetermined coordinates is added to the stereoscopic image G1 (step 803). When the initial-operation processing is complete, the flag F1 is set to "1" (step 804).
[0046] In the second and subsequent operations (step 801: F1 = "1"), the operation command analyzed in the command analysis process (step 407), described later, is determined (step 805). If the command is "rotate stereoscopic image" (step 805: rotation), the stereoscopic image is re-created according to the angle calculated from the movement distance of the point-indicating object described above (step 806). If the command is "change stereoscopic image data" (step 805: change stereoscopic image data), the stereoscopic image is re-created from the three-dimensional image of the designated diagnostic device (step 807). If the command is "move/jump" (step 805: move/jump), the stereoscopic image is re-created so that the location designated by the point-indicating object is brought to the front (step 808). Once the stereoscopic image G1 has been re-created, a plane cursor passing through the designated coordinates is added (step 809).
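The two-pass flow of paragraphs [0045] and [0046] — an initial build guarded by the flag F1, then command-driven re-creation — can be sketched as follows. This is a minimal illustrative sketch: the class and method names, the command strings, and the string-valued state fields are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the stereoscopic image G1 creation flow
# (steps 801-809). Names and command strings are illustrative assumptions.
class StereoImageBuilder:
    def __init__(self):
        self.f1 = 0              # flag F1: 0 = initial operation, 1 = later passes
        self.source = "CT"       # image-source diagnostic device
        self.orientation = None  # stands in for the rendered image state

    def create_g1(self, command=None, payload=None):
        if self.f1 == 0:                      # step 801: initial operation?
            self.orientation = "default"      # step 802: build G1 in a preset orientation
            self.f1 = 1                       # step 804: mark initialization done
            return "initial"                  # (step 803, adding the cursor, omitted)
        if command == "rotate":               # step 806: rebuild at the computed angle
            self.orientation = f"rotated:{payload}"
        elif command == "change_source":      # step 807: rebuild from another device's data
            self.source = payload
        elif command in ("move", "jump"):     # step 808: bring the chosen spot to the front
            self.orientation = f"front:{payload}"
        return command                        # step 809 (re-adding the cursor) omitted
```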
[0047] Next, details of the cross-sectional image G2 creation process (step 403) are shown in FIG. 9. As shown in the figure, when the cross-sectional image G2 creation process starts, the image processing device 1 creates the cross-sectional image G21 from the coordinates through which the X-Y plane created in the stereoscopic image G1 passes and the three-dimensional image data selected in the start process described above (step 901). Similarly, the cross-sectional image G22 is created from the coordinates through which the Y-Z plane created in the stereoscopic image G1 passes and the selected three-dimensional image data (step 902), and the cross-sectional image G23 is created from the coordinates through which the Z-X plane passes and the selected three-dimensional image data (step 903).
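Steps 901-903 amount to slicing one three-dimensional volume along three orthogonal planes at the coordinates the plane cursors pass through. A minimal sketch, assuming the volume is a NumPy array in (z, y, x) axis order — an assumption, since the patent does not specify a data layout:

```python
import numpy as np

def orthogonal_sections(volume, x, y, z):
    """Cut the three orthogonal sections through the point (x, y, z)."""
    g21 = volume[z, :, :]   # X-Y plane section at depth z  (step 901)
    g22 = volume[:, :, x]   # Y-Z plane section at column x (step 902)
    g23 = volume[:, y, :]   # Z-X plane section at row y    (step 903)
    return g21, g22, g23
```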
[0048] Next, details of the enlarged cross-sectional image G3 creation process (step 404) are shown in FIG. 10.
As shown in the figure, when the enlarged cross-sectional image G3 creation process is executed, the image processing device 1 determines whether this is the initial operation (step 1001). In the case of the initial operation (step 1001: F3 = "0"), an enlarged version of the X-Y plane cross-sectional image G21 is created (step 1002). When the initial-operation processing is complete, the flag F3 is set to "1" (step 1003).
[0049] In the second and subsequent operations (step 1001: F3 = "1"), the cross-sectional image selection command analyzed in the command analysis process (step 407), described later, is determined (step 1004). If the command is "X-Y plane" (step 1004: G21), an enlarged version of the X-Y plane cross-sectional image G21 is created (step 1005). If the command is "Y-Z plane" (step 1004: G22), an enlarged version of the Y-Z plane cross-sectional image G22 is created (step 1006). If the command is "Z-X plane" (step 1004: G23), an enlarged version of the Z-X plane cross-sectional image G23 is created (step 1007). If the operation command specifies something other than a cross-sectional image (step 1004: not specified), an enlarged version of the previously selected cross-sectional image is created (step 1008).
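The selection logic of steps 1001-1008 — default to G21 on the first pass, then honor explicit plane commands and otherwise keep the previous choice — can be sketched as below; the function name and the state dictionary are illustrative assumptions.

```python
def select_enlarged(state, command):
    """Pick which cross-sectional image to enlarge (steps 1001-1008)."""
    # step 1001: initial operation -> enlarge the X-Y section G21 by default
    if state.get("f3", 0) == 0:
        state["f3"] = 1                       # step 1003: set flag F3
        state["current"] = "G21"              # step 1002
    elif command in ("G21", "G22", "G23"):    # steps 1004-1007: explicit selection
        state["current"] = command
    # any other command keeps the previously selected section (step 1008)
    return state["current"]
```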
[0050] Next, details of the command analysis process (step 407) are shown in FIG. 11. As shown in the figure, the image processing device 1 waits until an operation command signal is received from the input operation receiver of the surgeon-side terminal device 2 (step 1101 NO). When the point-indicating position detector detects a motion of the point-indicating object (step 1101 YES), the type of operation is determined from the motion of the point-indicating object (step 1102). If the operation type is "touch operation" (step 1102: touch), the touch process is executed (step 1103). If the operation type is "drag operation" (step 1102: drag), the drag process is executed (step 1104). If the operation type is "rotation operation", the image rotation process is executed (step 1105).
[0051] Next, details of the touch process (step 1103) are shown in FIG. 12. As shown in the figure, when the touch process is executed, the touch location is detected from the detected touch coordinates (step 1201). If the touch location is a "cross-sectional image" (step 1201: cross-sectional image), the touch is interpreted as a command to switch the enlarged cross-sectional image display to the selected cross-sectional image (step 1202). If the touch location is the "stereoscopic image" (step 1201: stereoscopic image), it is interpreted as a jump display command that moves the designated location of the stereoscopic image to the front (step 1203). If the touch location is the "stereoscopic image source switching button" (step 1201: switch), it is interpreted as a stereoscopic image source switching command for selecting the diagnostic device from which the stereoscopic image is created (step 1204). If the touch location is the "save button" (step 1201: save), it is interpreted as an image save command for saving the current stereoscopic image information with plane cursors and the cross-sectional image information (step 1205). If the touch location is the "end button" (step 1201: end), it is interpreted as an operation end command for terminating the present surgeon support system (step 1206).
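The touch analysis of steps 1201-1206 is a lookup from the touched screen region to an operation command. A sketch under the assumption that regions and commands are plain strings — the actual identifiers are not given in the patent:

```python
# Illustrative mapping from touch location to the analyzed command.
TOUCH_COMMANDS = {
    "section_image": "switch_enlarged_section",     # step 1202
    "stereo_image": "jump_display",                 # step 1203
    "source_switch_button": "switch_image_source",  # step 1204
    "save_button": "save_images",                   # step 1205
    "end_button": "end_session",                    # step 1206
}

def analyze_touch(location):
    """Return the command for a touch location, or None if unrecognized."""
    return TOUCH_COMMANDS.get(location)
```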
[0052] Next, details of the drag process (step 1104) are shown in FIG. 13. As shown in the figure, when the drag process is executed, the drag location is detected from the detected drag coordinates and the movement distance of the point-indicating object (step 1301). If the drag location is a "plane cursor" (step 1301: plane cursor), the drag is interpreted as a plane cursor movement command that moves the designated plane cursor in the drag direction (step 1302). If the drag location is the "stereoscopic image" (step 1301: stereoscopic image), it is interpreted as a stereoscopic image movement command that moves the stereoscopic image in the drag direction (step 1303).
[0053] Next, details of the image rotation process (step 1105) are shown in FIG. 14. As shown in the figure, when the image rotation process is executed, rotation of the image in the left-right direction is detected from the detected motion and movement distance of the point-indicating object (step 1401). Once left-right rotation has been detected, rotation in the up-down direction is detected (steps 1402 and 1405). As a result of detecting the left-right and up-down motions, a right-forward rotation is interpreted as a right-forward rotation command (step 1403), and a right-backward rotation is interpreted as a right-backward rotation command (step 1404). Similarly, a left-forward rotation is interpreted as a left-forward rotation command (step 1406), and a left-backward rotation is interpreted as a left-backward rotation command (step 1407).
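Steps 1401-1407 combine a left/right decision with a forward/backward decision into one of four rotation commands. A sketch, assuming the detector reports a signed motion vector (dx, dy) — an assumption, since the patent only describes the detected motion qualitatively:

```python
def analyze_rotation(dx, dy):
    """Map a detected motion vector to one of four rotation commands."""
    horizontal = "right" if dx >= 0 else "left"       # step 1401
    vertical = "forward" if dy >= 0 else "backward"   # steps 1402 and 1405
    return f"{horizontal}_{vertical}_rotation"        # steps 1403, 1404, 1406, 1407
```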
[0054] Finally, a specific screen display example according to the present invention is shown in FIG. 15. As shown in the figure, the stereoscopic image G1 with plane cursors, the cross-sectional image G21 designated by the X-Y plane cursor, the cross-sectional image G22 designated by the Y-Z plane cursor, the cross-sectional image G23 designated by the Z-X plane cursor, the enlarged image of the one designated cross section, and the operation buttons are displayed simultaneously on the screen G. The stereoscopic image G1 with cursors and the enlarged image G3 of the one cross section are displayed at approximately life size relative to the patient. The operation buttons include a frequently used open-folder button 1501, a save button 1502, and an end button 1503, as well as an image information button 1504 for displaying stereoscopic image information and adjustment buttons such as an automatic density button 1505 for adjusting the color and similar display properties of the stereoscopic image, a contrast strong/weak button 1506, a bright/dark button 1507, and an enlarge/reduce button 1508. An example in which the screen G is displayed on the surgeon-side terminal device 2 is further shown in FIG. 17. As shown in the figure, the screen G described above is displayed on the screen display of the surgeon-side terminal device 2.
The point-indicating position detector 21 attached to the upper part of the surgeon-side terminal device 2 detects the position information of a point-indicating object, such as a finger, when each image on the screen G, or each portion where a button is displayed, is touched, dragged, or rotated with the point-indicating object.
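Displaying G1 and G3 at approximately life size, as paragraph [0054] describes, requires relating the physical voxel spacing of the scan to the physical pixel pitch of the display. A sketch of that conversion — the function and parameter names are assumptions; the patent does not describe the computation:

```python
def life_size_scale(voxel_spacing_mm, display_dpi):
    """Screen pixels per voxel so one voxel renders at its physical size."""
    pixels_per_mm = display_dpi / 25.4   # 25.4 mm per inch
    return voxel_spacing_mm * pixels_per_mm
```

For example, a 0.5 mm voxel spacing on a 254 dpi display would need 5 screen pixels per voxel for the image to match the patient's actual size.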
[0055] Next, an example in which the present surgeon support system is arranged in an operating room is shown in FIG. 16. The surgeon-side terminal device 2 is attached via an arm to the ceiling of the operating room R. Because its position can be moved by the arm, the surgeon-side terminal device 2 can be positioned where the surgeon Do can operate it easily. It can therefore be operated easily even while surgery is being performed on the patient P lying on the operating table T.
[0056] As described above, the present surgeon support system according to the embodiment can display a life-size stereoscopic image suited to surgery by combining three-dimensional image data obtained by diagnosing a predetermined site of a patient via diagnostic devices such as a CT device, an MRI device, and a PET device. In addition, together with the stereoscopic image with the orthogonal three-plane cursors, the cross-sectional images corresponding to each of the three orthogonal planes and a life-size enlarged image of one cross section are displayed simultaneously on one screen, so that a simulation of the patient's body interior can be realized more accurately.
[0057] Further, because the present surgeon support system performs touch, drag, and rotation operations with a point-indicating object such as a finger, without using an input device such as a keyboard or a mouse, both operability and hygiene are improved.
Industrial Applicability
[0058] According to the present invention, a simulation of the patient's body interior can be realized more accurately by means of a stereoscopic image with plane cursors and the cross-sectional images designated by each plane cursor, based on three-dimensional image data obtained via diagnostic devices such as a CT device, an MRI device, or a PET device. It is also possible to provide a surgeon support system in which, through the input operation receiver of the surgeon-side terminal device, the surgeon can rotate the stereoscopic image by moving a fingertip or the like as a point-indicating object along the display screen, without using an input device such as a mouse or a keyboard, and thereby display a desired cross-sectional image on the patient's stereoscopic image on the screen of the image display in the operating room.
Brief Description of Drawings
[0059] [FIG. 1] A block diagram showing the hardware configuration of the system of the present embodiment.
[FIG. 2] An explanatory diagram showing the terminal device of the present embodiment.
[FIG. 3] A cross-sectional view showing the input operation receiver of the terminal device of the present embodiment.
[FIG. 4] A general flowchart showing the software configuration of the system of the present embodiment.
[FIG. 5] A flowchart showing details of the start process.
[FIG. 6] An explanatory diagram showing a display example of the menu screen of the terminal device.
[FIG. 7] An explanatory diagram showing a display example of the new image menu screen of the terminal device.
[FIG. 8] A flowchart showing details of the stereoscopic image G1 creation process.
[FIG. 9] A flowchart showing details of the cross-sectional image G2 creation process.
[FIG. 10] A flowchart showing details of the enlarged cross-sectional image G3 creation process.
[FIG. 11] A flowchart showing details of the command analysis process.
[FIG. 12] A flowchart showing details of the touch process within the command analysis process.
[FIG. 13] A flowchart showing details of the drag process within the command analysis process.
[FIG. 14] A flowchart showing details of the image rotation process within the command analysis process.
[FIG. 15] An explanatory diagram showing a display example of the simulation screen.
[FIG. 16] An explanatory diagram showing a display example in which the simulation screen is displayed on the terminal device of the present embodiment.
[FIG. 17] An explanatory diagram showing an example of an operating room into which the present surgeon support system has been introduced.
Explanation of Symbols
1 Image processing device
2 Surgeon-side terminal device
3 CT device
4 MRI device
5 PET device
6 Auxiliary storage device
7 Data output device
21 Point-indicating position detector
22 Command generation means
23 Housing
24 Spacer
25 Retroreflective member
26 Input flat plate
27 Infrared light source
28 Prism
29 Imaging lens
30 Imaging means
601 New creation button
602 Button for opening a created file
701 CT device button
702 MRI device button
703 PET device button
704 All-devices button
705 X-Y plane cursor button
706 Y-Z plane cursor button
707 Z-X plane cursor button
708 All-plane-cursors button
709 Creation execution button
710 Button for opening a created file
1501 Open-folder button
1502 Save button
1503 End button
1504 Image information button
1505 Automatic density button
1506 Contrast strong/weak button
1507 Bright/dark button
1508 Enlarge/reduce button
D Image display
F Point-indicating object
G Simulation screen
G1 Stereoscopic image with plane cursors
G21 Cross-sectional image of the X-Y plane
G22 Cross-sectional image of the Y-Z plane
G23 Cross-sectional image of the Z-X plane
SG1 Menu screen
SG2 New creation menu screen
R Operating room
Do Surgeon
P Patient
T Operating table

Claims

A surgeon support system comprising:
an operating room terminal device installed near the operating table in an operating room and functioning as a man-machine interface;
an image storage device storing three-dimensional image data obtained by diagnosing a predetermined site of a patient via a diagnostic device such as a CT device, an MRI device, or a PET device; and
an image processing device that applies, to the patient's three-dimensional image data read out from the image storage device, image processing designated in advance or designated each time by an operation command, thereby generating image display data as viewed from a predetermined viewpoint and causing the operating room terminal device to display the corresponding image,
wherein the operating room terminal device has an image display with a predetermined display screen and an input operation receiver for accepting input operations, and
the input operation receiver includes:
a point-indicating position detector located in a position that does not obstruct the display screen of the image display and capable of detecting, without contact, the position of a point-indicating object such as a fingertip present in a planar region close to and parallel to the display screen; and
command generation means for generating the corresponding operation command based on the position of the point-indicating object detected by the point-indicating position detector and its manner of movement, and
the image processing device has at least:
first image processing means for generating image display data corresponding to a cursor-equipped stereoscopic image, obtained by superimposing, on the stereoscopic image corresponding to the three-dimensional image data of the predetermined site of the patient, a predetermined plane cursor that is successively updated in response to cursor movement operations, and displaying it on the image display of the operating room terminal device;
second image processing means for generating, in response to a first operation command arriving from the input operation receiver of the operating room terminal device, image display data corresponding to a state in which the cursor-equipped stereoscopic image has been rotated by the direction and amount designated by that operation command, and displaying it on the surgeon-side terminal device; and
third image processing means for generating image display data corresponding to the cross-sectional image designated by the predetermined plane cursor after the rotation, and displaying it on the operating room terminal device,
whereby the stereoscopic image can be rotated through an operation of moving a fingertip or the like as a point-indicating object along the display screen, and a desired cross-sectional image on the patient's stereoscopic image can be displayed on the screen of the image display in the operating room.
[2] The surgeon support system according to claim 1, wherein the predetermined plane cursor is an orthogonal three-plane cursor, whereby cross-sectional images corresponding to each of the three orthogonal planes are displayed simultaneously.
[3] The surgeon support system according to claim 2, further comprising fourth image processing means for generating, in response to a second operation command arriving from the input operation receiver of the operating room terminal device, image display data corresponding to an enlarged image of the one cross section corresponding to the plane cursor designated by that command, and displaying it on the operating room terminal device,
whereby a desired cross-sectional image on the patient's stereoscopic image can be displayed enlarged on the screen of the image display through an operation of moving a fingertip or the like as a point-indicating object along the display screen.
[4] The surgeon support system according to claim 2, wherein the stereoscopic image with the orthogonal three-plane cursors and the three cross-sectional images designated by the respective plane cursors are displayed on the display screen of the image display by dividing the screen into regions.
[5] The surgeon support system according to claim 2, wherein the stereoscopic image with the orthogonal three-plane cursors, the three cross-sectional images designated by the respective plane cursors, and an enlarged image of one cross section are displayed on the display screen of the image display by dividing the screen into regions.
[6] The surgeon support system according to claim 5, wherein the stereoscopic image with the orthogonal three-plane cursors and the enlarged image of the one cross section are displayed at approximately life size, and the three cross-sectional images designated by the respective plane cursors are displayed smaller than these.
[7] The surgeon support system according to any one of claims 1 to 6, wherein the image processing device applies, to one or more kinds of three-dimensional image data relating to the same site of the same patient read out from the image storage device, image processing designated in advance or designated each time by an operation command, thereby generating image display data as viewed from a predetermined viewpoint, and sends the data to the operating room terminal device.
[8] The surgeon support system according to claim 7, wherein the one or more kinds of three-dimensional image data are three kinds of three-dimensional image data obtained by imaging a predetermined site of the patient via the diagnostic devices, namely the CT device, the MRI device, and the PET device.
[9] A surgeon support system comprising:
a terminal device functioning as a man-machine interface;
an image storage device storing three-dimensional image data obtained by diagnosing a predetermined site of a patient via a diagnostic device such as a CT device, an MRI device, or a PET device; and
an image processing device that applies, to the patient's three-dimensional image data read out from the image storage device, image processing designated in advance or designated each time by an operation command, thereby generating image display data as viewed from a predetermined viewpoint and causing the terminal device to display the corresponding image,
wherein the terminal device has an image display with a predetermined display screen and an input operation receiver for accepting input operations, and
the input operation receiver includes:
a point-indicating position detector located in a position that does not obstruct the display screen of the image display and capable of detecting, without contact, the position of a point-indicating object such as a fingertip present in a planar region close to and parallel to the display screen; and
command generation means for generating the corresponding operation command based on the position of the point-indicating object detected by the point-indicating position detector and its manner of movement, and
the image processing device has at least:
first image processing means for generating image display data corresponding to a cursor-equipped stereoscopic image, obtained by superimposing, on the stereoscopic image corresponding to the three-dimensional image data of the predetermined site of the patient, a predetermined plane cursor that is successively updated in response to cursor movement operations, and displaying it on the image display of the terminal device;
second image processing means for generating, in response to a first operation command arriving from the input operation receiver of the terminal device, image display data corresponding to a state in which the cursor-equipped stereoscopic image has been rotated by the direction and amount designated by that operation command, and displaying it on the terminal device; and
third image processing means for generating image display data corresponding to the cross-sectional image designated by the predetermined plane cursor after the rotation, and displaying it on the terminal device,
whereby the stereoscopic image can be rotated through an operation of moving a fingertip or the like as a point-indicating object along the display screen, and a desired cross-sectional image on the patient's stereoscopic image can be displayed on the screen of the image display of the terminal device.
PCT/JP2004/019545 2004-12-27 2004-12-27 Operator supporting system WO2006070441A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2004/019545 WO2006070441A1 (en) 2004-12-27 2004-12-27 Operator supporting system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2004/019545 WO2006070441A1 (en) 2004-12-27 2004-12-27 Operator supporting system

Publications (1)

Publication Number Publication Date
WO2006070441A1 true WO2006070441A1 (en) 2006-07-06

Family

ID=36614568

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2004/019545 WO2006070441A1 (en) 2004-12-27 2004-12-27 Operator supporting system

Country Status (1)

Country Link
WO (1) WO2006070441A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10222248A (en) * 1997-02-12 1998-08-21 Canon Inc Information processor
JP2001070293A (en) * 1999-09-06 2001-03-21 Toshiba Corp Radio-diagnostic device
JP2001101450A (en) * 1999-09-28 2001-04-13 Terarikon Inc Three-dimensional image display device


Similar Documents

Publication Publication Date Title
US6359612B1 (en) Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device
US11662830B2 (en) Method and system for interacting with medical information
US6175610B1 (en) Medical technical system controlled by vision-detected operator activity
JP5637775B2 (en) Medical device
US20110310126A1 (en) Method and system for interacting with datasets for display
JP5883147B2 (en) Image display device and medical image pickup device
Mewes et al. A gesture-controlled projection display for CT-guided interventions
TW201035851A (en) Electronic device and method of operating screen
JP2012027796A (en) Information processor and control method of the same
US20140258917A1 (en) Method to operate a device in a sterile environment
JP2016095634A (en) Midair touch panel and surgery simulator display system having the same
US9285961B2 (en) Ultrasonic diagnosis device, graphic environment control device for use therein, and control method therefor
EP3602263B1 (en) 3d scanner system with handheld scanner
JP2010148811A (en) Ultrasonic diagnostic apparatus
EP3435207B1 (en) Information processing device and display method
JP2016220830A (en) Medical image display apparatus and ultrasound diagnostic apparatus
JP2009119000A (en) Auxiliary controller for processing medical image,image processing system, and method for processing medical image
JPH10161801A (en) Input device
JP2010272036A (en) Image processing apparatus
WO2006070441A1 (en) Operator supporting system
WO2023086332A1 (en) An interactive augmented reality system for laparoscopic and video assisted surgeries
US20230169698A1 (en) Microscope system and corresponding system, method and computer program for a microscope system
EP2662025A1 (en) Ultrasonic diagnostic apparatus and control method thereof
JP2004178603A (en) Method and assembly for processing, observing and preloading command information transmitted by image operating apparatus
JP2011141584A (en) Input control equipment

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 04807900

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 4807900

Country of ref document: EP