WO2004052224A1 - Control apparatus for imaging device - Google Patents

Control apparatus for imaging device

Info

Publication number
WO2004052224A1
Authority
WO
WIPO (PCT)
Prior art keywords
display surface
movement
image
imaging device
secondary image
Prior art date
Application number
PCT/GB2003/005362
Other languages
French (fr)
Inventor
James Robert Hewit
Alan Peter Slade
Original Assignee
University Of Dundee
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University Of Dundee filed Critical University Of Dundee
Priority to AU2003295095A priority Critical patent/AU2003295095A1/en
Publication of WO2004052224A1 publication Critical patent/WO2004052224A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00039 Operational features of endoscopes provided with input arrangements for the user
    • A61B1/00042 Operational features of endoscopes provided with input arrangements for the user for mechanical operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/372 Details of monitor hardware
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras

Definitions

  • the present invention relates to control apparatus for controlling the movement of an optical imaging device and to a corresponding method.
  • the present invention relates to control apparatus for controlling the movement of an endoscope.
  • MAS: minimal access surgery
  • an endoscope is inserted into a patient's body through a small access wound to allow an operating surgeon to view and access an operating site.
  • the endoscope is manually controlled by a second surgeon or skilled technician, or is mounted on a robotic arm and images obtained by the endoscope are displayed on a monitor screen.
  • the operating surgeon, viewing the operating site on the monitor screen, conveys instructions regarding the desired positioning of the endoscope within the patient's body either to the second surgeon or to the robotic arm. This is achieved using voice commands directed to the second surgeon, who carries out corresponding movements of the endoscope, or by operating input command devices of the robotic arm, such as a joystick, foot switches or keys/buttons.
  • the detector arrangement is adapted to detect the orientation of a first component relative to a second component during endoscopic surgery.
  • the first component is a transmitter which may be worn on a surgeon's head, and which is adapted to transmit a plurality of unique identifiable signals along mutually diverging beams.
  • the second component is a detector which is adapted to distinguish between the individual signals transmitted by the transmitter.
  • the detector is connected by a control circuit to an endoscopic camera which is, in turn, connected to provide an image on a display screen.
  • the detector and the associated control circuit are able to control movement of the endoscopic camera in response to movement of the surgeon's head.
  • the device requires excessive head movements of the surgeon which are both unnatural and often require the surgeon to take his/her eyes off the display screen.
  • the device is also inflexible in that the number of possible movements of the camera in response to movement of the surgeon's head are limited.
  • An alternative device is the Polhemus system, commercially available from Computer Motion Inc. This includes a device which is mounted on the surgeon's head. A static electromagnetic field is created in a defined surgical area and movements of the surgeon's head and the associated device within the electromagnetic field are detected. An endoscope is moved according to the movements of the surgeon's head to change an image seen on a monitor screen.
  • the system is limited in that it is expensive, requires excessive, unnatural movements of the surgeon's head and cannot be easily transferred between alternative sites. Also, the number of possible movements of the endoscope camera is again limited.
  • control apparatus for controlling the movement of an optical imaging device, the apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
  • control apparatus for controlling the movement of an endoscope, the apparatus comprising: a display surface for displaying an image of an object viewed by the endoscope; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the endoscope.
  • the invention provides control apparatus where small movements of the secondary image, and thus of the image generator, may be detected, thereby facilitating movement of the optical imaging device. This may be achieved without excessive movement of the image generator.
  • the processor may be adapted to instruct a movement of the optical imaging device.
  • the control apparatus may be for controlling movement of a camera, preferably an endoscope/endoscopic camera.
  • the image generator is adapted for movement in response to a movement instruction or command of an operator.
  • the image generator may be adapted for movement corresponding to a movement of the operator.
  • the image generator is adapted to be mounted on the head of an operator for facilitating movement of the image generator in response to movement of the operator's head. Accordingly, an operator viewing the object image on the display screen may be able to alter the view of the object by small movements of the image generator. Where the image generator is head mounted, this may be achieved without excessive head movements.
  • the image generator may include a coupling such as a strap for releasably coupling the image generator to the head of an operator, or to headgear, spectacles or protective glasses worn by an operator.
  • the image generator may comprise an electromagnetic image generator for generating a secondary image at a frequency in the visible spectrum and preferably comprises a laser.
  • the display surface may comprise a monitor associated with the optical imaging device.
  • the apparatus may further comprise a projector for projecting the object image onto a blank display surface such as a projector screen.
  • the detection device comprises a second optical imaging device such as a camera.
  • the detection device may be adapted to view the display surface for detecting the position of the secondary image.
  • the processor or the detection device may be adapted to correlate the position of the secondary image with respect to a pre-programmed boundary corresponding to the dimensions of the display surface.
  • the detection device may generate a combined image of the object image and the secondary image displayed on the display surface.
  • the detection device may be adapted to detect light at the frequency of the secondary image for detecting the position of the secondary image.
  • the detection device may include an optical filter for filtering the viewed image, to facilitate detection of the secondary image.
  • the detection device may comprise a sensor for detecting electromagnetic radiation.
  • the detection device may comprise a screen such as a photo-sensitive screen.
  • the apparatus may further comprise a user interface assembly for allowing input of control commands to the processor.
  • the interface assembly may include an interface screen defining a plurality of discrete screen areas or zones associated with a respective control command.
  • the interface screen may define areas associated with movement commands such as commands to move the optical imaging device up, down, left, right and to move the imaging device to zoom in or out with respect to an object. This may facilitate manipulation of the optical imaging device to obtain a desired image of the object.
  • the pre-programmed boundary of the processor may correspond to the interface screen, specifically, to include boundary areas corresponding to the areas of the screen.
  • the interface assembly may comprise at least one command switch or button for inputting a control command to the processor. The command switch may confirm a control command of the interface screen.
  • the apparatus may include software suitable for processing control commands and for causing a corresponding movement of the optical imaging device.
  • the software may define the boundary and may correlate the position of the secondary image with respect to the boundary.
  • the apparatus may be adapted to control the image generator, to switch the image generator between a safety mode and a use mode.
  • an intensity of the secondary image generated may be at a level below a predetermined safe operating level. This may prevent generation of a secondary image at an intensity likely to cause harm, for example, to the eyes of a third party.
  • the image generator may be switchable between the safety mode and the use mode depending upon the detected position of the secondary image.
  • where the detection device determines the position of the secondary image to be displaced from the display surface, or to be on a determined area of the display surface, such as near the programmed boundary, the image generator may be switched to the safety mode.
  • the apparatus, preferably the processor, may be adapted to control the power supply to the image generator to switch the generator between the safety and use modes.
  • in the safety mode, the power supplied to the generator may be reduced compared to the use mode of the generator.
  • the processor may be adapted to detect movement of the secondary image, for example through suitable software, to a position displaced from the display surface and may be adapted to reduce the power supplied to the image generator to switch the image generator to the safety mode. In this fashion, the image generator may automatically be switched to the safety mode.
  • the apparatus may further comprise a mounting assembly for coupling to the optical imaging device for movement of the imaging device.
  • the mounting assembly may be automated and may include drive apparatus coupled to the processor for movement of the mounting assembly in response to an instruction of the processor.
  • the drive apparatus may comprise a robot such as a robotic arm.
  • medical apparatus comprising an optical imaging device and control apparatus for controlling the movement of the optical imaging device, the control apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
  • control apparatus for controlling movement of an optical imaging device which displays an image of an object viewed by the imaging device on a display surface
  • the control apparatus comprising: a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
  • a method of controlling the movement of an optical imaging device comprising the steps of : mounting the optical imaging device to drive apparatus for moving the imaging device; displaying an image of an object viewed by the optical imaging device on a display surface; generating a secondary image on the display surface using a moveably mounted image generator; detecting a position of the secondary image relative to the display surface; detecting a movement of the secondary image relative to the display surface; and activating the drive apparatus to cause a desired movement of the optical imaging device.
  • Fig. 1 is a schematic illustration of control apparatus for controlling the movement of an optical imaging device in accordance with an embodiment of the present invention, shown in use;
  • Fig. 2 is a view of a display screen forming part of the control apparatus illustrated in Fig. 1, shown in more detail.
  • In Fig. 1 there is shown a schematic illustration of control apparatus for controlling movement of an optical imaging device in accordance with an embodiment of the present invention, the control apparatus being indicated generally by reference numeral 10.
  • the control apparatus is coupled to an optical imaging device in the form of an endoscope 12, for controlling movement of the endoscope.
  • the endoscope 12 is used in minimal access surgery (MAS) to obtain images from within a patient's body which can be viewed by a surgeon to allow, for example, a surgical procedure to be conducted.
  • the control apparatus 10 includes a display surface 13 including a monitor screen 14 of a monitor 15 coupled to the endoscope 12, for displaying an image 16 of an object 18 viewed by the endoscope.
  • the control apparatus 10 also includes a moveably mounted image generator in the form of a laser 20 and a detection device comprising a second optical imaging device, in this embodiment, a camera 22.
  • a processor 24 is coupled to the laser 20, the camera 22 and the endoscope 12.
  • With the endoscope 12 inserted into the patient's body through a small entrance wound, the endoscope 12 views an object 18, such as an organ of the patient.
  • the endoscope 12 is connected to the main monitor 15 to display the image 16 of the object on the monitor screen 14.
  • the laser 20 is strapped to the head 26 of the surgeon 28 and projects a secondary image in the form of an arrow or cursor 30, which may be directed on to the monitor screen 14.
  • the arrow 30 is thus directed around the display surface 13 by movements of the surgeon's head.
  • the camera 22 views an area 32 indicated by the dashed lines 33 which encompasses the display surface 13 and detects a position of the arrow 30 relative to the display surface 13.
  • Data concerning the position of the arrow 30 relative to the display surface 13 is supplied from the camera 22 to the processor 24, which is programmed with suitable software for detecting a movement of the arrow 30 relative to the display surface. This allows the processor to instruct a desired movement of the endoscope 12 in response to movement of the arrow 30, as will be described in more detail below.
  • a safety monitor 34 is coupled to the camera 22 and displays the image of the area 32 viewed by the camera. This allows verification of correct positioning of the camera 22 throughout a procedure using the control apparatus 10, as will be described.
  • the control apparatus 10 allows the surgeon 28 to instruct movements of the endoscope 12 by moving the arrow 30 across the projected object image 16 viewed on the monitor screen 14.
  • the endoscope 12 is mounted on a mounting assembly including a moveable robotic arm 36, such as those commercially available from Computer Motion, Inc, or the Stäubli PUMA robot.
  • the arm 36 is moveable in three planes of motion XY, XZ and YZ, as indicated in Fig. 1. Movement of the robotic arm 36 is controlled by the processor 24 which sends suitable output signals to the robotic arm 36 to instruct a desired movement.
  • the laser 20 is connected to and powered through the processor 24 and projects the arrow 30 onto the display surface 13 at a frequency in the visible spectrum.
  • the laser 20 includes suitable focussing and filtering optics (not shown) for projecting the secondary image in the desired arrow shape 30.
  • the camera 22 which views the image of the area 32 also includes suitable focussing and filtering optics (not shown) , for detecting light at the frequency emitted by the laser 20.
  • suitable software of the camera 22 or processor 24, such as Photosuite, commercially available from Sony, or Matlab, allows detection of the position of the arrow 30 relative to the display surface 13.
  • the safety camera 22 is programmed to detect when the arrow 30 is projected onto the interface screen 38 under software control before use of the control apparatus 10 begins.
  • the processor 24 is also pre-programmed with a boundary which corresponds to the display surface 13, and correlates a detected position of the arrow 30 relative to the boundary and thus relative to the display screen 13. Furthermore, the safety monitor 34 coupled to the camera 22 shows the image viewed by the camera 22 including the object image 16 and the arrow 30, to allow verification of the positioning of the camera 22 relative to the monitor screen 14 throughout the operation, and thus correct orientation of the boundary.
  • An interface assembly includes an interface screen 38 in the form of an overlay on the monitor 15 which will now be described in relation to Fig. 2, which illustrates the monitor 15 in more detail.
  • the interface screen 38 comprises an overlay on the main monitor 15 provided between an edge 40 of the monitor screen 14 and the edge 42 of the monitor casing.
  • the interface screen 38 is sub-divided into a number of sections or zones.
  • the zones correspond to desired control commands and in the embodiment shown, the zones include zones 44-54 corresponding to up, down, left, right, zoom in and zoom out control commands. These zones 44-54 are included in the boundary pre-programmed into the processor 24.
  • By detecting light at the frequency emitted by the laser 20, the camera 22 is able to detect a location of the arrow 30 on the display surface 13 by interaction with the processor 24 and the camera software. Accordingly, location of the arrow 30 in one of the zones 44-54 is detected by the processor 24.
  • the surgeon then issues a confirmation control command by depressing a control switch 56 coupled to the processor 24 to confirm the desired command and thus the desired movement. For example, when viewing the object image 16, if the surgeon desires to move the endoscope 12 upwardly to alter the image seen by the endoscope, the surgeon directs the arrow 30 into the up zone 44 and then depresses the control switch 56.
  • the processor 24 then instructs the robotic arm 36 to move a determined distance upwardly, such that the endoscope 12 views a different portion of the object 18. This may therefore be achieved with little movement and minimum intervention from the surgeon 28.
  • the safety camera 22 also detects when the arrow 30 is moved off the interface screen 38 and thus when the arrow 30 has moved to a position where the image may fall away from the display surface 13.
  • the laser 20 is switchable between a safety mode and a use mode. In the safety mode, the power supplied to the laser 20 is relatively low and thus the intensity of the generated arrow 30 is relatively low, avoiding damage to the eyesight of third parties.
  • the laser 20 is switched to the safety mode when it is detected that the arrow 30 is liable to move off the display screen 14. This is achieved by programming the processor 24 to define a laser intensity switch boundary corresponding to the boundary 58 of the interface screen 32. This forms part of the pre-programmed processor boundary.
  • When this boundary is crossed, the laser 20 is automatically switched to the safety mode.
  • Reduction in the power supplied to the laser 20 may be achieved by reducing the mark-to-space ratio of a driver of the laser.
  • the camera 22, controlled through the processor 24, is synchronised with the laser to detect the image at this lower power and to take an image only when the laser is pulsed on.
  • the interface screen may be provided as an image on the display screen such as an image on the monitor screen.
  • the apparatus may include a detection device comprising one or more electromagnetic sensors and may comprise a photo-sensitive screen.
  • the apparatus may be used in military command systems, computer games, virtual reality systems and the like.
  • the control apparatus may be arranged to cause a movement of the optical imaging device corresponding to a movement of the image generator.
  • the optical imaging device may correspondingly move left or right.
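The safety-mode behaviour set out in the points above, where the power to the laser, and hence the intensity of the arrow, is reduced by lowering the mark-to-space ratio of the laser driver whenever the arrow approaches or leaves the display boundary, can be sketched as follows. This is a minimal illustration: the duty-cycle values and all function names are assumptions, not figures taken from this document.

```python
# Minimal sketch of the laser safety-mode switching described above.
# Duty-cycle values are illustrative assumptions: the document specifies
# only that power in the safety mode is reduced relative to the use mode.

USE_MODE_DUTY = 0.80     # use mode: bright arrow, easily detected by the camera
SAFETY_MODE_DUTY = 0.05  # safety mode: time-averaged intensity kept at a low level

def laser_duty_cycle(arrow_on_screen: bool) -> float:
    """Select the driver's mark-to-space (duty) ratio for the current arrow position."""
    return USE_MODE_DUTY if arrow_on_screen else SAFETY_MODE_DUTY

def average_power(peak_power_mw: float, duty: float) -> float:
    """Time-averaged optical power of a pulsed laser driver at the given duty ratio."""
    return peak_power_mw * duty
```

On this model, a detection camera synchronised with the laser pulses, as described above, would sample frames only during the driver's "on" intervals.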

Abstract

In an embodiment of the invention, there is disclosed control apparatus (10) for controlling the movement of an optical imaging device (12), such as an endoscope. The apparatus (10) comprises: a display surface (13) for displaying an image (16) of an object (18) viewed by the optical imaging device (12); a moveably mounted image generator (20) for generating a secondary image (30) on the display surface (13); a detection device (22) for detecting a position of the secondary image (30) relative to the display surface (13); and a processor (24) for receiving data concerning the position of the secondary image (30) relative to the display surface (13), for detecting a movement of the secondary image (30) relative to the display surface (13) and for facilitating movement of the optical imaging device (12).

Description

CONTROL APPARATUS FOR IMAGING DEVICE
The present invention relates to control apparatus for controlling the movement of an optical imaging device and to a corresponding method. In particular, but not exclusively, the present invention relates to control apparatus for controlling the movement of an endoscope.
In minimal access surgery (MAS), otherwise known as keyhole surgery, an endoscope is inserted into a patient's body through a small access wound to allow an operating surgeon to view and access an operating site. Typically, the endoscope is manually controlled by a second surgeon or skilled technician, or is mounted on a robotic arm, and images obtained by the endoscope are displayed on a monitor screen. The operating surgeon, viewing the operating site on the monitor screen, conveys instructions regarding the desired positioning of the endoscope within the patient's body either to the second surgeon or to the robotic arm. This is achieved using voice commands directed to the second surgeon, who carries out corresponding movements of the endoscope, or by operating input command devices of the robotic arm, such as a joystick, foot switches or keys/buttons.
Where movement of the endoscope is manually controlled, the operating surgeon cannot easily monitor the operating site on the monitor screen, manipulate surgical apparatus and control the endoscope at the same time, hence the need for the second surgeon or technician. This reduces efficiency and increases the cost of the procedure. Where the endoscope is mounted on a robotic arm, operation of the robotic arm diverts the attention of the operating surgeon away from the operating site and requires a high level of skill and knowledge to effectively control the arm. Also, the operating surgeon cannot easily both manipulate surgical apparatus and control the robotic arm. Accordingly, research has been carried out to develop systems for overcoming these disadvantages.

One proposed device is disclosed in United States patent No. 6,239,874 (assigned to Armstrong Healthcare Limited), which discloses an orientation detector arrangement. The detector arrangement is adapted to detect the orientation of a first component relative to a second component during endoscopic surgery. The first component is a transmitter which may be worn on a surgeon's head, and which is adapted to transmit a plurality of unique identifiable signals along mutually diverging beams. The second component is a detector which is adapted to distinguish between the individual signals transmitted by the transmitter. The detector is connected by a control circuit to an endoscopic camera which is, in turn, connected to provide an image on a display screen. The detector and the associated control circuit are able to control movement of the endoscopic camera in response to movement of the surgeon's head.
However, the device requires excessive head movements of the surgeon which are both unnatural and often require the surgeon to take his/her eyes off the display screen. The device is also inflexible in that the number of possible movements of the camera in response to movement of the surgeon's head are limited.
An alternative device is the Polhemus system, commercially available from Computer Motion Inc. This includes a device which is mounted on the surgeon's head. A static electromagnetic field is created in a defined surgical area and movements of the surgeon's head and the associated device within the electromagnetic field are detected. An endoscope is moved according to the movements of the surgeon's head to change an image seen on a monitor screen. However, the system is limited in that it is expensive, requires excessive, unnatural movements of the surgeon's head and cannot be easily transferred between alternative sites. Also, the number of possible movements of the endoscope camera is again limited.
Similar disadvantages are experienced in a number of fields, for example, in situations where an operator is required to interact with a monitor or viewing screen.
These include military command systems, computer games, virtual reality systems and the like.
It is amongst the objects of embodiments of the present invention to obviate or mitigate at least one of the foregoing disadvantages.
According to a first aspect of the present invention, there is provided control apparatus for controlling the movement of an optical imaging device, the apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
According to a second aspect of the present invention, there is provided control apparatus for controlling the movement of an endoscope, the apparatus comprising: a display surface for displaying an image of an object viewed by the endoscope; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the endoscope.
The invention provides control apparatus where small movements of the secondary image, and thus of the image generator, may be detected, thereby facilitating movement of the optical imaging device. This may be achieved without excessive movement of the image generator. The processor may be adapted to instruct a movement of the optical imaging device.
The control apparatus may be for controlling movement of a camera, preferably an endoscope/endoscopic camera. Preferably, the image generator is adapted for movement in response to a movement instruction or command of an operator. Alternatively, the image generator may be adapted for movement corresponding to a movement of the operator. Preferably also, the image generator is adapted to be mounted on the head of an operator for facilitating movement of the image generator in response to movement of the operator's head. Accordingly, an operator viewing the object image on the display screen may be able to alter the view of the object by small movements of the image generator. Where the image generator is head mounted, this may be achieved without excessive head movements. The image generator may include a coupling such as a strap for releasably coupling the image generator to the head of an operator, or to headgear, spectacles or protective glasses worn by an operator.
The image generator may comprise an electromagnetic image generator for generating a secondary image at a frequency in the visible spectrum and preferably comprises a laser. The display surface may comprise a monitor associated with the optical imaging device. Alternatively, the apparatus may further comprise a projector for projecting the object image onto a blank display surface such as a projector screen. Preferably, the detection device comprises a second optical imaging device such as a camera. The detection device may be adapted to view the display surface for detecting the position of the secondary image. The processor or the detection device may be adapted to correlate the position of the secondary image with respect to a pre-programmed boundary corresponding to the dimensions of the display surface. The detection device may generate a combined image of the object image and the secondary image displayed on the display surface. The detection device may be adapted to detect light at the frequency of the secondary image for detecting the position of the secondary image. The detection device may include an optical filter for filtering the viewed image, to facilitate detection of the secondary image.
Alternatively, the detection device may comprise a sensor for detecting electromagnetic radiation. In embodiments of the invention, the detection device may comprise a screen such as a photo-sensitive screen.
The apparatus may further comprise a user interface assembly for allowing input of control commands to the processor. The interface assembly may include an interface screen defining a plurality of discrete screen areas or zones associated with a respective control command. For example, the interface screen may define areas associated with movement commands such as commands to move the optical imaging device up, down, left, right and to move the imaging device to zoom in or out with respect to an object. This may facilitate manipulation of the optical imaging device to obtain a desired image of the object. The preprogrammed boundary of the processor may correspond to the interface screen, specifically, to include boundary areas corresponding to the areas of the screen. The interface assembly may comprise at least one command switch or button for inputting a control command to the processor. The command switch may confirm a control command of the interface screen. Thus, it may be necessary to input a control command to the processor to confirm a desired movement of the optical imaging device before any movement takes place. This prevents inadvertent movement of the optical imaging device. The apparatus, preferably the processor, may include software suitable for processing control commands and for causing a corresponding movement of the optical imaging device. The software may define the boundary and may correlate the position of the secondary image with respect to the boundary.
The apparatus, preferably the processor, may be adapted to control the image generator, to switch the image generator between a safety mode and a use mode. In the safety mode, an intensity of the secondary image generated may be at a level below a predetermined safe operating level. This may prevent generation of a secondary image at an intensity likely to cause harm, for example, to the eyes of a third party. The image generator may be switchable between the safety mode and the use mode depending upon the detected position of the secondary image. Thus, when the detection device determines the position of the secondary image to be displaced from the display surface or on a determined area of the display surface, such as near the programmed boundary, the image generator may be switched to the safety mode. The apparatus, preferably the processor, may be adapted to control the power supply to the image generator to switch the generator between the safety and use modes. Thus in the safety mode, the power supplied to the generator may be reduced compared to the use mode of the generator.
The processor may be adapted to detect movement of the secondary image, for example through suitable software, to a position displaced from the display surface and may be adapted to reduce the power supplied to the image generator to switch the image generator to the safety mode. In this fashion, the image generator may automatically be switched to the safety mode. The apparatus may further comprise a mounting assembly for coupling to the optical imaging device for movement of the imaging device. The mounting assembly may be automated and may include drive apparatus coupled to the processor for movement of the mounting assembly in response to an instruction of the processor. The drive apparatus may comprise a robot such as a robotic arm.
According to a third aspect of the present invention, there is provided medical apparatus comprising an optical imaging device and control apparatus for controlling the movement of the optical imaging device, the control apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
According to a fourth aspect of the present invention, there is provided control apparatus for controlling movement of an optical imaging device which displays an image of an object viewed by the imaging device on a display surface, the control apparatus comprising: a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
Further features of the medical apparatus and control apparatus are described above in relation to the first and second aspects of the invention.
According to a fifth aspect of the present invention, there is provided a method of controlling the movement of an optical imaging device, the method comprising the steps of: mounting the optical imaging device to drive apparatus for moving the imaging device; displaying an image of an object viewed by the optical imaging device on a display surface; generating a secondary image on the display surface using a moveably mounted image generator; detecting a position of the secondary image relative to the display surface; detecting a movement of the secondary image relative to the display surface; and activating the drive apparatus to cause a desired movement of the optical imaging device.
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Fig. 1 is a schematic illustration of control apparatus for controlling the movement of an optical imaging device in accordance with an embodiment of the present invention, shown in use; and
Fig. 2 is a view of a display screen forming part of the control apparatus illustrated in Fig. 1, shown in more detail.
Referring firstly to Fig. 1, there is shown a schematic illustration of control apparatus for controlling movement of an optical imaging device in accordance with an embodiment of the present invention, the control apparatus indicated generally by reference numeral 10. In the embodiment shown, the control apparatus is coupled to an optical imaging device in the form of an endoscope 12, for controlling movement of the endoscope. The endoscope 12 is used in minimal access surgery (MAS) to obtain images from within a patient's body which can be viewed by a surgeon to allow, for example, a surgical procedure to be conducted. The control apparatus 10 includes a display surface 13 including a monitor screen 14 of a monitor 15 coupled to the endoscope 12, for displaying an image 16 of an object 18 viewed by the endoscope. The control apparatus 10 also includes a moveably mounted image generator in the form of a laser 20 and a detection device comprising a second optical imaging device, in this embodiment, a camera 22. A processor 24 is coupled to the laser 20, the camera 22 and the endoscope 12.
With the endoscope 12 inserted into the patient's body through a small entrance wound, the endoscope 12 views an object 18, such as an organ of the patient. The endoscope 12 is connected to the main monitor 15 to display the image 16 of the object on the monitor screen 14. The laser 20 is strapped to the head 26 of the surgeon 28 and projects a secondary image in the form of an arrow or cursor 30, which may be directed on to the monitor screen 14. The arrow 30 is thus directed around the display surface 13 by movements of the surgeon's head.
The camera 22 views an area 32 indicated by the dashed lines 33 which encompasses the display surface 13 and detects a position of the arrow 30 relative to the display surface 13.
Data concerning the position of the arrow 30 relative to the display surface 13 is supplied from the camera 22 to the processor 24, which is programmed with suitable software for detecting a movement of the arrow 30 relative to the display surface. This allows the processor to instruct a desired movement of the endoscope 12 in response to movement of the arrow 30, as will be described in more detail below.
A safety monitor 34 is coupled to the camera 22 and displays the image of the area 32 viewed by the camera. This allows verification of correct positioning of the camera 22 throughout a procedure using the control apparatus 10, as will be described. The control apparatus 10 allows the surgeon 28 to instruct movements of the endoscope 12 by moving the arrow 30 across the projected object image 16 viewed on the monitor screen 14.
In more detail, the endoscope 12 is mounted on a mounting assembly including a moveable robotic arm 36, such as those commercially available from Computer Motion, Inc., or the Stäubli PUMA robot. The arm 36 is moveable in three planes of motion XY, XZ and YZ, as indicated in Fig. 1. Movement of the robotic arm 36 is controlled by the processor 24, which sends suitable output signals to the robotic arm 36 to instruct a desired movement.
The laser 20 is connected to and powered through the processor 24 and projects the arrow 30 onto the display surface 13 at a frequency in the visible spectrum. The laser 20 includes suitable focussing and filtering optics (not shown) for projecting the secondary image in the desired arrow shape 30. The camera 22 which views the image of the area 32 also includes suitable focussing and filtering optics (not shown), for detecting light at the frequency emitted by the laser 20. Through suitable software of the camera 22 or processor 24, such as Photosuite, commercially available from Sony, or Matlab, this allows detection of the position of the arrow 30 relative to the display surface 13. The safety camera 22 is programmed, under software control, to detect when the arrow 30 is projected onto the interface screen 38 before use of the control apparatus 10 begins. The processor 24 is also pre-programmed with a boundary which corresponds to the display surface 13, and correlates a detected position of the arrow 30 relative to the boundary and thus relative to the display surface 13. Furthermore, the safety monitor 34 coupled to the camera 22 shows the image viewed by the camera 22, including the object image 16 and the arrow 30, to allow verification of the positioning of the camera 22 relative to the monitor screen 14 throughout the operation, and thus correct orientation of the boundary.
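The detection step just described — filtering the camera's view for light at the laser frequency and correlating the detected position against the pre-programmed boundary — can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the function names, the brightness-threshold approach and the rectangular boundary representation are all assumptions.

```python
# Minimal sketch of laser-spot detection: keep only pixels above a
# brightness threshold (standing in for optical filtering at the laser
# frequency), then take their centroid as the arrow position.

def find_spot(frame, threshold=200):
    """Return the (row, col) centroid of pixels brighter than
    threshold, or None if no spot is visible in the frame."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, v in enumerate(row)
            if v >= threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

def inside_boundary(pos, boundary):
    """Correlate a detected position with a pre-programmed boundary
    (top, left, bottom, right) corresponding to the display surface."""
    if pos is None:
        return False
    top, left, bottom, right = boundary
    return top <= pos[0] <= bottom and left <= pos[1] <= right

# A toy 4x4 camera frame with one bright "laser" pixel at row 1, col 2.
frame = [[10, 10, 10, 10],
         [10, 10, 250, 10],
         [10, 10, 10, 10],
         [10, 10, 10, 10]]
pos = find_spot(frame)                      # (1.0, 2.0)
on_screen = inside_boundary(pos, (0, 0, 3, 3))
```

In practice the frame would come from the safety camera and the boundary would be calibrated to the monitor's position within the viewed area, but the correlation logic is the same.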
An interface assembly includes an interface screen 38 in the form of an overlay on the monitor 15 which will now be described in relation to Fig. 2, which illustrates the monitor 15 in more detail. The interface screen 38 comprises an overlay on the main monitor 15 provided between an edge 40 of the monitor screen 14 and the edge 42 of the monitor casing. As shown, the interface screen 38 is sub-divided into a number of sections or zones. The zones correspond to desired control commands and in the embodiment shown, the zones include zones 44-54 corresponding to up, down, left, right, zoom in and zoom out control commands. These zones 44-54 are included in the boundary pre-programmed into the processor 24.
By detecting light at the frequency emitted by the laser 20, the camera 22 is able to detect a location of the arrow 30 on the display surface 13 by interaction with the processor 24 and the camera software. Accordingly, location of the arrow 30 in one of the zones 44-54 is detected by the processor 24. The surgeon then issues a confirmation control command by depressing a control switch 56 coupled to the processor 24 to confirm the desired command and thus the desired movement. For example, when viewing the object image 16, if the surgeon desires to move the endoscope 12 upwardly to alter the image seen by the endoscope, the surgeon directs the arrow 30 into the up zone 44 and then depresses the control switch 56. The processor 24 then instructs the robotic arm 36 to move a determined distance upwardly, such that the endoscope 12 views a different portion of the object 18. This may therefore be achieved with little movement and minimum intervention from the surgeon 28.
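The zone-and-confirmation scheme above might be dispatched as in the following sketch. The zone names follow the up, down, left, right, zoom-in and zoom-out zones 44-54 of the interface screen; the command strings and the gating logic are illustrative assumptions.

```python
# Hypothetical mapping from interface-screen zones to movement
# commands for the robotic arm, gated by the confirmation switch 56.

ZONE_COMMANDS = {
    "up": "move_up",
    "down": "move_down",
    "left": "move_left",
    "right": "move_right",
    "zoom_in": "zoom_in",
    "zoom_out": "zoom_out",
}

def issue_command(zone, switch_pressed):
    """Return the command for the zone the arrow currently occupies,
    but only once the operator confirms via the control switch.
    Returns None when unconfirmed, or when the arrow lies outside all
    zones, so no inadvertent movement of the imaging device occurs."""
    if not switch_pressed:
        return None
    return ZONE_COMMANDS.get(zone)

# Surgeon steers the arrow into the "up" zone, then presses the switch:
command = issue_command("up", switch_pressed=True)   # "move_up"
```

The two-step design (zone selection, then an explicit switch press) mirrors the text's requirement that a movement be confirmed before any motion of the endoscope takes place.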
The safety camera 22 also detects when the arrow 30 is moved off the interface screen 38 and thus when the arrow 30 has moved to a position where the image may fall away from the display surface 13. To prevent damage to the eyesight of third parties, the laser 20 is switchable between a safety mode and a use mode. In the safety mode, the power supplied to the laser 20 is relatively low and thus the intensity of the generated arrow 30 is relatively low, avoiding damage to the eyesight of third parties. The laser 20 is switched to the safety mode when it is detected that the arrow 30 is liable to move off the display screen 14. This is achieved by programming the processor 24 to define a laser intensity switch boundary corresponding to the boundary 58 of the interface screen 38. This forms part of the pre-programmed processor boundary. When the arrow 30 crosses the boundary 58, as detected by the camera 22, the laser 20 is automatically switched to the safety mode. Reduction in the power supplied to the laser 20 may be achieved by reducing the mark-to-space ratio of a driver of the laser. In this case, the camera 22, controlled through the processor 24, is synchronised with the laser to detect the image at this lower power and to take an image only when the laser is pulsed on.
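The safety interlock described above can be sketched as a duty-cycle switch: the laser runs at its use-mode mark-to-space ratio only while the detected arrow position stays inside the intensity-switch boundary. The duty-cycle values and the rectangular boundary representation are illustrative assumptions, not taken from the patent.

```python
# Sketch of the laser safety interlock. While the arrow stays inside
# the intensity-switch boundary the driver runs at its use-mode duty
# cycle; when the arrow crosses the boundary (or the spot is lost) the
# mark-to-space ratio is cut, dropping the beam to a safe intensity.

USE_DUTY = 0.80    # fraction of each drive period the laser is on
SAFE_DUTY = 0.05   # safety mode: intensity below the safe operating level

def laser_duty(pos, boundary):
    """Return the driver duty cycle for a detected arrow position
    (row, col); safety-mode duty if the spot is lost or outside the
    boundary (top, left, bottom, right)."""
    if pos is None:
        return SAFE_DUTY
    top, left, bottom, right = boundary
    inside = top <= pos[0] <= bottom and left <= pos[1] <= right
    return USE_DUTY if inside else SAFE_DUTY
```

In the patent's arrangement the camera is additionally synchronised with the pulsed laser, capturing frames only while the laser is on so the dimmer safety-mode spot remains detectable; that synchronisation is omitted from this sketch.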
Various modifications may be made to the foregoing within the scope of the present invention.
For example, the interface screen may be provided as an image on the display screen such as an image on the monitor screen.
The apparatus may include a detection device comprising one or more electromagnetic sensors and may comprise a photo-sensitive screen. The apparatus may be used in military command systems, computer games, virtual reality systems and the like.
The control apparatus may be arranged to cause a movement of the optical imaging device corresponding to a movement of the image generator. Thus, for example, when the surgeon looks left or right, the optical imaging device may correspondingly move left or right.

Claims

1. Control apparatus for controlling the movement of an optical imaging device, the apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
2. Apparatus as claimed in claim 1, wherein the processor is adapted to instruct a movement of the optical imaging device.
3. Apparatus as claimed in either of claims 1 or 2, wherein the control apparatus is for controlling movement of a camera.
4. Apparatus as claimed in any preceding claim, wherein the apparatus is for controlling movement of an endoscope.
5. Apparatus as claimed in any preceding claim, wherein the image generator is adapted for movement in response to a movement instruction of an operator.
6. Apparatus as claimed in any preceding claim, wherein the image generator is adapted for movement corresponding to a movement of the operator.
7. Apparatus as claimed in any preceding claim, wherein the image generator is adapted to be mounted on the head of an operator for facilitating movement of the image generator in response to movement of the operator's head.
8. Apparatus as claimed in any preceding claim, wherein the image generator takes the form of an electromagnetic image generator for generating a secondary image at a frequency in the visible spectrum.
9. Apparatus as claimed in claim 8, wherein the image generator takes the form of a laser.
10. Apparatus as claimed in any preceding claim, wherein the display surface comprises a monitor associated with the optical imaging device.
11. Apparatus as claimed in any one of claims 1 to 10, wherein the apparatus further comprises a projector for projecting the object image onto the display surface.
12. Apparatus as claimed in any preceding claim, wherein the detection device comprises a second optical imaging device.
13. Apparatus as claimed in claim 12, wherein the second optical imaging device takes the form of a camera.
14. Apparatus as claimed in any preceding claim, wherein the detection device is adapted to view the display surface for detecting the position of the secondary image.
15. Apparatus as claimed in any preceding claim, wherein the apparatus is adapted to correlate the position of the secondary image with respect to a pre-programmed boundary corresponding to the dimensions of the display surface.
16. Apparatus as claimed in claim 15, wherein the processor is adapted to correlate the position of the secondary image with respect to the pre-programmed boundary.
17. Apparatus as claimed in any preceding claim, wherein the detection device is adapted to generate a combined image of the object image and the secondary image displayed on the display surface.
18. Apparatus as claimed in any preceding claim, wherein the detection device is adapted to detect light at the frequency of the secondary image for detecting the position of the secondary image.
19. Apparatus as claimed in claim 18, wherein the detection device includes an optical filter for filtering the viewed image, to facilitate detection of the secondary image.
20. Apparatus as claimed in any one of claims 1 to 16, wherein the detection device comprises a sensor for detecting electromagnetic radiation.
21. Apparatus as claimed in claim 20, wherein the detection device comprises a photo-sensitive screen.
22. Apparatus as claimed in any preceding claim, further comprising a user interface assembly for allowing input of control commands to the processor.
23. Apparatus as claimed in claim 22, wherein the interface assembly includes an interface screen defining a plurality of discrete screen areas associated with a respective control command.
24. Apparatus as claimed in claim 23, wherein the interface screen defines areas associated with movement commands for moving the optical imaging device.
25. Apparatus as claimed in claim 24, wherein the movement commands are selected from the group comprising up, down, left, right, zoom in and zoom out.
26. Apparatus as claimed in any one of claims 23 to 25, when dependent on claim 15, wherein the pre-programmed boundary of the processor corresponds to a boundary of the interface screen.
27. Apparatus as claimed in claim 26, wherein the preprogrammed boundary of the processor includes boundary areas corresponding to the screen areas.
28. Apparatus as claimed in any one of claims 22 to 27, wherein the interface assembly comprises at least one command switch for inputting a control command to the processor.
29. Apparatus as claimed in claim 28, wherein the command switch is adapted to confirm a control command of the interface screen.
30. Apparatus as claimed in any one of claims 23 to 29, further comprising software for processing control commands and for causing a corresponding movement of the optical imaging device.
31. Apparatus as claimed in claim 30, when dependent on claim 15, wherein the software defines the boundary and is adapted to correlate the position of the secondary image with respect to the boundary.
32. Apparatus as claimed in any preceding claim, wherein the processor is adapted to control the image generator to switch the image generator between a safety mode and a use mode.
33. Apparatus as claimed in claim 32, wherein in the safety mode, an intensity of the secondary image generated is at a level below a predetermined safe operating level.
34. Apparatus as claimed in either of claims 32 or 33, wherein the image generator is switchable between the safety mode and the use mode depending upon the detected position of the secondary image.
35. Apparatus as claimed in any one of claims 32 to 34, wherein the processor is adapted to control the power supply to the image generator to switch the generator between the safety and use modes .
36. Apparatus as claimed in any one of claims 32 to 35, wherein the processor is adapted to detect movement of the secondary image to a position displaced from the display surface and to reduce the power supplied to the image generator to switch the image generator to the safety mode.
37. Apparatus as claimed in any preceding claim, further comprising a mounting assembly for coupling to the optical imaging device for movement of the imaging device.
38. Apparatus as claimed in claim 37, wherein the mounting assembly is automated and includes drive apparatus coupled to the processor for movement of the mounting assembly in response to an instruction of the processor.
39. Control apparatus for controlling the movement of an endoscope, the apparatus comprising: a display surface for displaying an image of an object viewed by the endoscope; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the endoscope.
40. Medical apparatus comprising an optical imaging device and control apparatus for controlling the movement of the optical imaging device, the control apparatus comprising: a display surface for displaying an image of an object viewed by the optical imaging device; a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
41. Medical apparatus as claimed in claim 40, wherein the control apparatus takes the form of control apparatus as claimed in any one of claims 2 to 38.
42. Control apparatus for controlling movement of an optical imaging device which displays an image of an object viewed by the imaging device on a display surface, the control apparatus comprising: a moveably mounted image generator for generating a secondary image on the display surface; a detection device for detecting a position of the secondary image relative to the display surface; and a processor for receiving data concerning the position of the secondary image relative to the display surface, for detecting a movement of the secondary image relative to the display surface and for facilitating movement of the optical imaging device.
43. A method of controlling the movement of an optical imaging device, the method comprising the steps of: mounting the optical imaging device to drive apparatus for moving the imaging device; displaying an image of an object viewed by the optical imaging device on a display surface; generating a secondary image on the display surface using a moveably mounted image generator; detecting a position of the secondary image relative to the display surface; detecting a movement of the secondary image relative to the display surface; and activating the drive apparatus to cause a desired movement of the optical imaging device.
PCT/GB2003/005362 2002-12-12 2003-12-10 Control apparatus for imaging device WO2004052224A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2003295095A AU2003295095A1 (en) 2002-12-12 2003-12-10 Control apparatus for imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0228937.9A GB0228937D0 (en) 2002-12-12 2002-12-12 Control apparatus for imaging device
GB0228937.9 2002-12-12

Publications (1)

Publication Number Publication Date
WO2004052224A1 true WO2004052224A1 (en) 2004-06-24

Family

ID=9949528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2003/005362 WO2004052224A1 (en) 2002-12-12 2003-12-10 Control apparatus for imaging device

Country Status (3)

Country Link
AU (1) AU2003295095A1 (en)
GB (1) GB0228937D0 (en)
WO (1) WO2004052224A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5911036A (en) * 1995-09-15 1999-06-08 Computer Motion, Inc. Head cursor control interface for an automated endoscope system for optimal positioning
US6239874B1 (en) * 1996-11-18 2001-05-29 Armstrong Healthcare Limited Orientation detector arrangement
US6463361B1 (en) * 1994-09-22 2002-10-08 Computer Motion, Inc. Speech interface for an automated endoscopic system


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008142494A1 (en) * 2007-05-21 2008-11-27 Sony Ericsson Mobile Communications Ab Remote viewfinding
GB2464092A (en) * 2008-09-25 2010-04-07 Prosurgics Ltd Surgical mechanism control system
EP2520244A1 (en) * 2008-09-25 2012-11-07 Prosurgics Limited A surgical mechanism control system
EP2612592A1 (en) * 2010-08-31 2013-07-10 FUJIFILM Corporation Medical treatment information display device and method, and program
EP2612592A4 (en) * 2010-08-31 2014-06-25 Fujifilm Corp Medical treatment information display device and method, and program
US9158382B2 (en) 2010-08-31 2015-10-13 Fujifilm Corporation Medical information display apparatus, method, and program
CN105812779A (en) * 2016-03-22 2016-07-27 华中科技大学 Head-mounted endoscope 3D display system capable of switching scene source
CN105812779B (en) * 2016-03-22 2018-01-05 华中科技大学 A kind of wear-type endoscope 3D display system in changeable scene source

Also Published As

Publication number Publication date
AU2003295095A1 (en) 2004-06-30
GB0228937D0 (en) 2003-01-15

Similar Documents

Publication Publication Date Title
EP3834768B1 (en) Augmented reality headset with varied opacity for navigated robotic surgery
EP3861957B1 (en) Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
CN106102633B (en) For remotely operating the structural adjustment system and method for medical system
JP7216768B2 (en) Utilization and Communication of 2D Digital Imaging in Medical Imaging in 3D Extended Reality Applications
EP1872737B1 (en) Computer assisted orthopaedic surgery system
US20210382559A1 (en) Ui for head mounted display system
JP6939778B2 (en) Control devices, control methods and surgical systems
EP2520244B1 (en) A surgical mechanism control system
CN110603599A (en) Operating room devices, methods, and systems
US11690697B2 (en) Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
EP3265008A1 (en) Surgical tool tracking to control surgical system
US20210169605A1 (en) Augmented reality headset for navigated robotic surgery
CN113625452A (en) Head-mounted extended reality (XR) display device
KR20160033325A (en) Medical laser apparatus manipulated by robot arms
JP7282816B2 (en) Extended Reality Instrument Interaction Zones for Navigated Robotic Surgery
WO2004052224A1 (en) Control apparatus for imaging device
US20210251717A1 (en) Extended reality headset opacity filter for navigated surgery
JP3499946B2 (en) Diagnostic imaging device
JP2011050583A (en) Medical diagnostic apparatus
JP7367041B2 (en) UI for head-mounted display systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP