EP2827793B1 - Image system for surgery and method for image display - Google Patents
- Publication number
- EP2827793B1 (application EP13764568.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- treatment device
- unit
- operating state
- treatment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Links
- 238000001356 surgical procedure Methods 0.000 title claims description 45
- 238000000034 method Methods 0.000 title claims description 16
- 238000003384 imaging method Methods 0.000 claims description 45
- 238000001514 detection method Methods 0.000 claims description 40
- 238000006073 displacement reaction Methods 0.000 claims description 22
- 230000001133 acceleration Effects 0.000 claims description 14
- 238000003780 insertion Methods 0.000 claims description 8
- 230000037431 insertion Effects 0.000 claims description 8
- 230000003287 optical effect Effects 0.000 claims description 6
- 230000015572 biosynthetic process Effects 0.000 description 10
- 230000004048 modification Effects 0.000 description 10
- 238000012986 modification Methods 0.000 description 10
- 238000010586 diagram Methods 0.000 description 8
- 238000000605 extraction Methods 0.000 description 8
- 239000003550 marker Substances 0.000 description 8
- 230000015654 memory Effects 0.000 description 4
- 230000008901 benefit Effects 0.000 description 3
- 239000000284 extract Substances 0.000 description 3
- 230000000694 effects Effects 0.000 description 2
- 210000003128 head Anatomy 0.000 description 2
- 206010028980 Neoplasm Diseases 0.000 description 1
- 208000003464 asthenopia Diseases 0.000 description 1
- 238000001444 catalytic combustion detection Methods 0.000 description 1
- 208000002173 dizziness Diseases 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
- 238000004148 unit process Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/0002—Operational features of endoscopes provided with data storages
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00188—Optical arrangements with focusing or zooming features
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00194—Optical arrangements adapted for three-dimensional imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/00234—Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3417—Details of tips or shafts, e.g. grooves, expandable, bendable; Multiple coaxial sliding cannulas, e.g. for dilating
- A61B17/3421—Cannulas
- A61B17/3423—Access ports, e.g. toroid shape introducers for instruments or hands
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
- H04N13/359—Switching between monoscopic and stereoscopic modes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/00234—Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
- A61B2017/00358—Snares for grasping
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
Definitions
- the present invention relates to an image system for surgery.
- the stereoscopic endoscope device of PTL 1 requires the operator to perform stereoscopic observation with parallax images at all times, even in the state where the treatment device is not introduced into the body, and inconveniently causes eyestrain from the stereoscopic observation when the entire surgical process, including the search for an affected area, continues for a long time.
- the present invention provides the following solutions.
- One aspect of the present invention provides an image system for surgery, the image system comprising: a three-dimensional imaging unit that acquires two images having parallax by capturing an image of an operation target portion; a display unit that displays the two images acquired by the three-dimensional imaging unit individually for the left and right eyes, or displays a same image for both left and right eyes; an operating state detection unit that detects that a treatment device that performs treatment on the operation target portion, or the three-dimensional imaging unit, is in an operating state in which to initiate treatment on the operation target portion; and a display control unit that switches from displaying a same one image on the display unit to displaying the two images on the display unit when the operating state detection unit detects that the three-dimensional imaging unit or the treatment device is in the operating state in which to initiate treatment on the operation target portion.
- the display control unit controls the display unit so as to display the two images acquired by the three-dimensional imaging unit individually for the left and right eyes. This allows the operator to appropriately perform treatment on the operation target portion with the treatment device while acquiring depth information about the operation target portion from a stereoscopic image.
- the operating state detection unit may process any of the images acquired by the three-dimensional imaging unit and calculate displacement, velocity or acceleration of the treatment device within the image, and determine that the treatment device is in the operating state in which to initiate the treatment on the operation target portion when the calculated displacement, velocity or acceleration exceeds a predetermined threshold.
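The image-based criterion above can be sketched in code. The following is a minimal illustration, assuming the tool tip has already been located in each frame; all function names, the frame rate, and the thresholds are illustrative assumptions, not values from the patent.

```python
# Sketch of the operating-state test: track the tool-tip position across
# frames and report the operating state once displacement, velocity, or
# acceleration exceeds a (hypothetical) threshold.

def detect_operating_state(positions, dt=1.0 / 30,
                           disp_thresh=5.0, vel_thresh=50.0, acc_thresh=500.0):
    """Return True if the tracked tool appears to be initiating treatment.

    positions: list of (x, y) tool-tip coordinates in pixels, one per frame.
    """
    if len(positions) < 2:
        return False
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    disp = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5   # pixels per frame
    vel = disp / dt                                    # pixels per second
    if disp > disp_thresh or vel > vel_thresh:
        return True
    if len(positions) >= 3:
        xp, yp = positions[-3]
        prev_disp = ((x0 - xp) ** 2 + (y0 - yp) ** 2) ** 0.5
        acc = abs(disp - prev_disp) / dt ** 2          # pixels per second^2
        if acc > acc_thresh:
            return True
    return False
```

In practice the thresholds would be tuned so that ordinary camera shake does not trigger the stereoscopic display.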
- the operator can observe the operation target portion with a two-dimensional image even when treatment is performed by observation from a remote location; when an affected area to be treated is found and the operation unit provided in the master device is operated, contact with or operation of the operation unit is detected and the display is switched to a stereoscopic image. Therefore, it is possible to perform appropriate treatment with the treatment device using an image of the operation target portion that has depth.
- the three-dimensional imaging unit may include a zooming optical system.
- the operating state detection unit may detect a magnification of the zooming optical system, and determine that the treatment device is in an operating state in which to initiate treatment on the operation target portion when it is detected that the magnification exceeds a predetermined threshold.
- Another aspect not forming part of the present invention provides a method for image display, comprising: detecting whether a treatment device that performs treatment on an operation target portion, or an imaging unit that acquires two images having parallax by capturing an image of the operation target portion, is in an operating state in which to initiate treatment on the operation target portion; when it is detected that the treatment device or the imaging unit is not in the operating state in which to initiate treatment on the operation target portion, displaying only one of the images acquired by the imaging unit for both left and right eyes; and when it is detected that the treatment device or the imaging unit is in the operating state in which to initiate the treatment on the operation target portion, displaying the two images acquired by the imaging unit individually for the left and right eyes.
- a three-dimensional imaging unit 5 is inserted from the outside of the body to the inside of the body via a trocar 4 installed to penetrate the body surface tissue A of the patient, and the distal end of the inserted three-dimensional imaging unit 5 is disposed inside the body so as to be opposed to the operation target portion B.
- one of the images is sent to the operating state detection unit 9.
- the sent image is processed in the feature extraction unit 11.
- features within the image are extracted (step S4), and the extracted features are sent to the determination unit 13.
- the features sent from the feature extraction unit 11 and various features of the treatment device 3 stored in the treatment device information storage unit 12 are compared to determine whether the features extracted from within the image indicate the treatment device 3 (step S5).
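The comparison in step S5 can be illustrated with a small sketch. The feature representation (a numeric vector per candidate) and the matching tolerance are assumptions for illustration only; the patent does not specify how features are encoded or compared.

```python
# Sketch of step S5: features extracted from the endoscopic image are compared
# against stored feature signatures of known treatment devices; a close enough
# match means the treatment device is judged to be present in the image.

def matches_treatment_device(extracted, stored_signatures, tolerance=0.1):
    """Return True if any stored device signature matches the extracted features."""
    for signature in stored_signatures:
        if len(signature) != len(extracted):
            continue
        # Mean absolute difference as a deliberately simple similarity measure.
        diff = sum(abs(a - b) for a, b in zip(extracted, signature)) / len(signature)
        if diff <= tolerance:
            return True
    return False
```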
- the display control unit 10 controls the display unit 7 so as to display two images having parallax and sent from the image formation unit 8 on the two displays of the display unit 7 opposed to both left and right eyes (step S7).
- the operator can observe the operation target portion B with a stereoscopic image.
- the appearance of the treatment device 3 within the image represents the intention of the operator to operate the treatment device 3 and initiate the treatment of the operation target portion B.
- an identification marker 14, such as a bar code, may be provided on the treatment device 3.
- in the operating state detection unit 9, which detects the operating state in which the treatment device 3 initiates the treatment of the operation target portion B, it is detected whether the treatment device 3 exists within the image. Instead of this configuration, however, it may be detected whether the treatment device 3 extracted within the image has moved.
- when the treatment device 3 is detected in step S5, the position of the treatment device 3 within the image is stored (step S10), and where a position of the treatment device 3 is already stored, the amount of movement to the newly stored position is calculated (step S11). It is determined whether the calculated amount of movement exceeds a predetermined threshold (step S12); the system switches to displaying a stereoscopic image where the amount of movement is more than the threshold (step S7), and switches to displaying a two-dimensional image where the amount of movement is equal to or less than the threshold (step S6).
- where the treatment device 3 moves in the image, it is more apparent that the treatment device 3 is in the operating state in which the operator initiates the treatment of the operation target portion B. By detecting such a state and switching to displaying a stereoscopic image, it is possible to make it easier to obtain a sense of depth while performing the treatment, and thus make the treatment easier.
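The flow of steps S10 to S12 can be sketched as a small stateful component. The class name and threshold value are illustrative assumptions.

```python
# Sketch of steps S10-S12: the detected position of the treatment device is
# stored each frame, and the display mode switches to stereoscopic ("3D") only
# when the movement since the previously stored position exceeds a threshold.

class MovementSwitcher:
    def __init__(self, threshold=3.0):
        self.threshold = threshold
        self.stored_position = None

    def update(self, position):
        """Store the new position (step S10) and return '3D' or '2D'."""
        if self.stored_position is None:
            self.stored_position = position
            return "2D"
        dx = position[0] - self.stored_position[0]
        dy = position[1] - self.stored_position[1]
        movement = (dx * dx + dy * dy) ** 0.5          # step S11
        self.stored_position = position                # store for next frame
        return "3D" if movement > self.threshold else "2D"   # step S12
```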
- displaying is switched between a stereoscopic image and a two-dimensional image, depending on the presence or displacement of the treatment device 3.
- the system may detect that the treatment device 3 has entered the operating state in which to initiate the treatment of the operation target portion B.
- the outputs from the contact sensor 15 may be input directly to the display control unit 10, and where the contact on the operation unit 3a is detected by the contact sensor 15, the system may switch to displaying a stereoscopic image.
- a pressure sensor may be adopted instead of the contact sensor.
- the system may switch between displaying a stereoscopic image and displaying a two-dimensional image depending on a value of pressure applied to a pressure sensor.
- the treatment device 3 and an operation unit of the treatment device may be installed on a system of master-slave scheme as shown in Fig. 7 , in addition to the operation unit 3a as shown in Fig. 6 connected to the treatment device 3 for directly operating the treatment device 3.
- the treatment device 3 is installed on the manipulator on the slave side (slave manipulator) 16
- the operation unit is installed on the manipulator (master manipulator) 17 on the master side.
- a contact sensor 15 may be provided to the operation unit.
- the slave manipulator 16 operates to set the treatment device 3 in the operating state in which to initiate the treatment. Therefore, the system may switch the image display by detecting operation of the operation unit of the master manipulator 17 with the contact sensor 15.
- two treatment devices 3 are respectively mounted on two slave manipulators 16.
- the three-dimensional imaging unit 5 is mounted on an observation manipulator 19 on the slave side.
- a master manipulator 17 on which the operator performs operations, a display unit 7 such as a head mount display for providing the operator with an image, and a digitizer 20 that detects the direction of the operator's field of view are disposed, and a contact sensor 15 for detecting operation of the operation unit by the operator is provided on the master manipulator 17.
- the control memory 21 comprises a treatment device control unit 22 that controls a slave manipulator 16 and the treatment device 3 based on a signal from the master manipulator 17, an observation part control unit 23 that controls the observation manipulator 19 based on a signal from the digitizer 20, and an image control unit 6 comprising an image formation unit 8 that generates two images having parallax by processing an image signal acquired by the three-dimensional imaging unit 5, and a display control unit 10 that switches images between the stereoscopic image and the two-dimensional image according to the signal from the contact sensor 15 and outputs the switched image to the display unit 7.
- the contact sensor 15 is exemplified as a sensor for detecting operation to the operation unit of the master manipulator 17 by the operator, a sensor described below may be adopted in place of this.
- a holding finger sensor: a contact sensor 15 is provided for each holding finger, and the system switches to displaying a stereoscopic image where three fingers are in contact with the sensors, and to a two-dimensional image where two fingers are in contact.
- a contact sensor on the arm rest of the master manipulator 17 (not shown in the drawings): the system switches to displaying a stereoscopic image where the operator puts their elbow on the arm rest in order to operate the master manipulator 17, and switches to displaying a two-dimensional image where the elbow is not on the arm rest.
- a sensor for detecting a specific operation pattern on the operation unit: for example, the system detects the operation pattern in which the operation unit is operated two times in succession, and switches between the stereoscopic image and the two-dimensional image each time the pattern is detected. Alternatively, an operation pattern may be detected that is not performed during normal operation of the treatment.
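The "two operations in succession" pattern can be sketched as a toggle keyed on press timestamps. The time window of 0.5 seconds is an illustrative assumption.

```python
# Sketch of the operation-pattern sensor: two presses within a short window
# toggle the display between stereoscopic ("3D") and two-dimensional ("2D").

class PatternToggle:
    def __init__(self, window=0.5):
        self.window = window       # max gap (seconds) for a "double" operation
        self.last_press = None
        self.stereo = False

    def on_press(self, t):
        """Register a press at time t (seconds); toggle mode on a double press."""
        if self.last_press is not None and t - self.last_press <= self.window:
            self.stereo = not self.stereo
            self.last_press = None          # consume the pair
        else:
            self.last_press = t
        return "3D" if self.stereo else "2D"
```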
- the determination unit 13 determines presence or absence of the treatment device 3 within the image.
- the positional relationship in the depth direction between the treatment device 3 and the operation target portion B within the image may be calculated (stereoscopic measurement) based on the two images having parallax, and it may be determined whether the distance between the distal end of the treatment device 3 and the operation target portion B falls below a threshold.
- the system may switch to displaying a stereoscopic image where the two have come close to each other, with the distance falling below the threshold, and may switch to a two-dimensional image where the distance is the threshold or greater.
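A depth-based criterion of this kind can be sketched with the standard pinhole stereo relation Z = f · B / d. The focal length, baseline, and distance threshold below are assumed values for illustration, not parameters from the patent.

```python
# Sketch of the stereoscopic-measurement criterion: disparity between the two
# parallax images gives the depth of the tool tip and of the target, and the
# display switches to stereoscopic mode when their separation is small.

def depth_from_disparity(disparity_px, focal_px=800.0, baseline_mm=4.0):
    """Depth (mm) from stereo disparity via the pinhole model: Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px

def choose_display_mode(tool_disparity, target_disparity, distance_thresh_mm=10.0):
    """'3D' when the tool tip is within the threshold distance of the target."""
    tool_z = depth_from_disparity(tool_disparity)
    target_z = depth_from_disparity(target_disparity)
    return "3D" if abs(tool_z - target_z) < distance_thresh_mm else "2D"
```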
- a sensor 26 for detecting the passage of the treatment device 3 at the trocar 2 for inserting the treatment device 3 into the body may trigger switching of the image displayed on the display unit 7 from a two-dimensional image to a stereoscopic image.
- an acceleration sensor 27 may be provided on the distal end of the treatment device 3, and where acceleration of a predetermined magnitude or more is detected, it is determined that the treatment device 3 is in the operating state in which to initiate the treatment of the operation target portion B, and the display control unit 10 may switch the image to be displayed on the display unit 7 from a two-dimensional image to a stereoscopic image.
- the amount of displacement may be calculated based on the acceleration detected by the acceleration sensor 27, and where the amount of displacement has become a predetermined value or greater, it may be determined that the system is in the operating state, and displaying may be switched from a two-dimensional image to a stereoscopic image.
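Both accelerometer variants above can be sketched together: test the raw acceleration against one threshold, and integrate twice to estimate displacement against another. Sample rate and thresholds are illustrative assumptions; a real system would also need drift correction.

```python
# Sketch of the acceleration-sensor criterion: report '3D' once either the
# instantaneous acceleration or the displacement obtained by double
# integration of the samples crosses its threshold.

def evaluate_acceleration(samples, dt=0.01, acc_thresh=2.0, disp_thresh=0.005):
    """samples: accelerations in m/s^2 at interval dt. Returns '3D' or '2D'."""
    velocity = 0.0
    displacement = 0.0
    for a in samples:
        if abs(a) >= acc_thresh:                 # direct acceleration test
            return "3D"
        velocity += a * dt                       # first integration
        displacement += velocity * dt            # second integration
        if abs(displacement) >= disp_thresh:     # displacement test
            return "3D"
    return "2D"
```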
- the determination unit 13 may determine whether the operator has set the magnification of zooming to be greater than a predetermined threshold by operating the zooming adjustment unit 28. Where the set magnification of zooming exceeds the predetermined threshold, the display control unit 10 may switch from displaying a two-dimensional image to displaying a stereoscopic image, and may switch from displaying a stereoscopic image to a two-dimensional image where the magnification is the threshold or less.
- where the magnification of zooming is set large by the operator, it is possible to regard the situation as one in which the operator is about to perform the treatment while precisely observing the operation target portion B under magnification. Accordingly, by switching the display on the display unit 7 to a stereoscopic image only in this case, it is possible to increase the ease of treatment while minimizing the fatigue that accompanies observing a stereoscopic image over an extended period.
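The zoom-based rule reduces to a single comparison; the threshold of 2.0x below is an assumed value for illustration.

```python
# Sketch of the zoom criterion: magnification above the threshold is taken as
# intent to perform treatment, enabling the stereoscopic display.

def display_mode_for_zoom(magnification, zoom_thresh=2.0):
    """Return '3D' when magnification exceeds the threshold, else '2D'."""
    return "3D" if magnification > zoom_thresh else "2D"
```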
- a two-dimensional image and a stereoscopic image are switched from one another by various methods.
- a dominant eye input unit 29 may be provided through which the operator inputs their dominant eye. By sending the input dominant-eye information to the display control unit 10, when the display control unit 10 switches to displaying a two-dimensional image, it is preferable to display the image on the dominant-eye side of the display unit 7.
- one of the images selected from the two images acquired by the three-dimensional imaging unit is presented to both eyes when the two-dimensional image is displayed.
- any image may be employed as long as the same image is presented to both eyes, for example, one obtained by performing 2D image synthesis on the two images.
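The 2D-synthesis alternative can be sketched as a per-pixel blend of the stereo pair. Plain nested lists stand in for image buffers here; a real system would operate on camera frames, and averaging is only one of many possible synthesis methods.

```python
# Sketch of 2D image synthesis from the stereo pair: average the left and
# right grayscale images into a single image shown to both eyes.

def synthesize_2d(left, right):
    """Average two equally sized grayscale images (rows of pixel values)."""
    return [[(l + r) / 2 for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]
```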
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Software Systems (AREA)
- Robotics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Gynecology & Obstetrics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Endoscopes (AREA)
Description
- The present invention relates to an image system for surgery.
- Conventionally, stereoscopic endoscope devices are known that display parallax images of an object acquired by a plurality of cameras disposed on a distal end of an insertion part of the device, to allow an observer to stereoscopically observe the object while performing treatment with a treatment device introduced via a channel to the front of the distal end of the insertion part (for example, see PTL 1).
- Such a stereoscopic endoscope device allows stereoscopic observation by parallax images in a normal state, and provides a two-dimensional image of the part of the range in which a treatment device positioned at the closest distance appears within the parallax images where it is detected that the treatment device is inserted into the channel.
- {PTL 1} Japanese Patent No. 4170042
- The stereoscopic endoscope device of PTL 1 requires the operator to perform stereoscopic observation with parallax images at all times, even in the state where the treatment device is not introduced into the body, and inconveniently causes eyestrain from the stereoscopic observation when the entire surgical process, including the search for an affected area, continues for a long time.
- JP 2004/065804
- The present invention is made in view of the aforementioned circumstances, and an object of the present invention is to provide an image system for surgery, and a method for image display with the same, that can ease fatigue in surgery.
- To achieve the above-described objective, the present invention provides the following solutions.
- One aspect of the present invention provides an image system for surgery, the image system comprising: a three-dimensional imaging unit that acquires two images having parallax by capturing an image of an operation target portion; a display unit that displays the two images acquired by the three-dimensional imaging unit individually for the left and right eyes, or displays a same image for both left and right eyes; an operating state detection unit that detects that a treatment device that performs treatment on the operation target portion, or the three-dimensional imaging unit, is in an operating state in which to initiate treatment on the operation target portion; and a display control unit that switches from displaying a same one image on the display unit to displaying the two images on the display unit when the operating state detection unit detects that the three-dimensional imaging unit or the treatment device is in the operating state in which to initiate treatment on the operation target portion.
- According to the present aspect, the operation target portion is imaged by the three-dimensional imaging unit and two images having parallax are acquired. In a state in which the operator is not operating the treatment device for performing treatment on the operation target portion or the three-dimensional imaging unit, the display control unit controls the display unit so as to display a same image, obtained from an image acquired by the three-dimensional imaging unit, to both left and right eyes. This allows the operator to observe the operation target portion with a two-dimensional image, with less burden on the eyes.
- On the other hand, where the operator operates the treatment device or the three-dimensional imaging unit, and the operating state detection unit detects that the treatment device or the three-dimensional imaging unit is in an operating state in which to initiate treatment on the operation target portion, the display control unit controls the display unit so as to display the two images acquired by the three-dimensional imaging unit individually for the left and right eyes. This allows the operator to appropriately perform treatment on the operation target portion with the treatment device while acquiring depth information about the operation target portion from a stereoscopic image.
- In other words, according to the present aspect, a stereoscopic image is displayed only in the situation in which treatment is actually performed on an operation target portion, and in other situations a two-dimensional image is displayed. Therefore, the operator need not continuously observe the stereoscopic image for an extended period, and it is possible to ease fatigue in surgery.
- In the above-described aspect, the operating state detection unit may be configured to detect displacement, velocity or acceleration of the treatment device.
- With this configuration, it is possible to easily detect whether the treatment device is in an operating state in which to initiate the treatment on the operation target portion. In other words, when the treatment device is displaced, when the treatment device has moved with a velocity of predetermined magnitude or more, or acceleration of a predetermined value or more has worked on the treatment device, this is due to an operation by the operator to initiate the treatment on the operation target portion. In this case, by switching the observation to that with a stereoscopic image, it is possible to perform appropriate treatment by the treatment device on the operation target portion.
- Further, in the above-described aspect, the operating state detection unit may determine that the treatment device is in the operating state in which to initiate the treatment on the operation target portion when the operating state detection unit processes any of the images acquired by the three-dimensional imaging unit, and the treatment device is detected within the image.
- With this configuration, it is possible to detect the operating state of the treatment device from an image acquired by the three-dimensional imaging unit, without providing a separate sensor or the like for detecting the operating state of the treatment device.
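One simple way such sensorless, image-based detection could be realized — a sketch under assumptions of our own (the marker color, tolerance, and pixel count are invented for illustration; the embodiment described later compares stored shape features instead) — is to look for a distinctively colored marker on the device:

```python
import numpy as np

# Hypothetical sketch: flag the treatment device as present when enough
# pixels of the endoscopic frame lie near an assumed marker color that
# does not occur naturally inside the body.

MARKER_RGB = np.array([0, 90, 255])  # assumed bright-blue marker color

def marker_present(frame_rgb, tolerance=30, min_pixels=50):
    """frame_rgb: HxWx3 uint8 image; returns True if the marker is visible."""
    diff = np.abs(frame_rgb.astype(int) - MARKER_RGB)
    mask = np.all(diff <= tolerance, axis=-1)  # pixels close to the marker color
    return int(mask.sum()) >= min_pixels
```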
- Further, in a configuration in which displacement, velocity or acceleration of the above-described treatment device is detected, the operating state detection unit may process any of the images acquired by the three-dimensional imaging unit and calculate displacement, velocity or acceleration of the treatment device within the image, and determine that the treatment device is in the operating state in which to initiate the treatment on the operation target portion when the calculated displacement, velocity or acceleration exceeds a predetermined threshold.
- With this configuration, stereoscopic observation can be provided when the treatment device has moved within the image, that is, when treatment is about to be performed on the operation target portion.
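A minimal sketch of this in-image motion test (the coordinate convention and the threshold are assumptions): take the device-tip position detected in two successive frames and compare the implied velocity with a switching threshold:

```python
# Sketch: in-image velocity of the treatment device from its detected tip
# positions in two successive frames. The threshold (pixels per second)
# is an illustrative placeholder.

def device_moved(prev_pos, curr_pos, dt, v_thresh=10.0):
    """prev_pos, curr_pos: (x, y) pixel coordinates; dt: frame interval in s."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    displacement = (dx * dx + dy * dy) ** 0.5  # pixels
    return displacement / dt > v_thresh        # True -> switch to stereoscopic
```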
- Further, in the above-described aspect, the operating state detection unit may be provided on an operation unit for operating the treatment device, and may detect contact with the operation unit or operation of the operation unit.
- With this configuration, since contact with or operation of the operation unit, to which the operator's intention is conveyed first, is detected, switching from a two-dimensional image to a stereoscopic image can be performed rapidly.
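For illustration only (the sampling scheme and the hold count are our assumptions, not part of this aspect), such a contact detection might be debounced so that a momentary brush against the operation unit does not toggle the display:

```python
# Hedged sketch: require a few consecutive "contact" samples from the
# grip sensor before switching to the stereoscopic display.

def display_after_contact(samples, hold=3):
    """samples: iterable of booleans from the contact sensor; returns the
    display mode after processing them."""
    consecutive = 0
    for touching in samples:
        consecutive = consecutive + 1 if touching else 0
        if consecutive >= hold:
            return "3D"  # sustained grip: operator intends to treat
    return "2D"          # no sustained contact: stay two-dimensional
```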
- Further, in the above-described configuration with the operation unit, the operation unit may be provided on a master device disposed at a position remote from the treatment device and operating the treatment device by remote operation.
- With this configuration, even when treatment is performed by observation from a remote location, the operator can observe the operation target portion with a two-dimensional image; when an affected area to be treated is found and the operation unit provided on the master device is operated, contact with or operation of the operation unit is detected and the display is switched to a stereoscopic image. Therefore, it is possible to perform appropriate treatment with the treatment device using an image of the operation target portion that has depth.
- In the above aspect, the operating state detection unit may detect an amount of displacement or a direction of displacement of the operation unit, and in a case where the detected amount of displacement exceeds a predetermined threshold or where it is detected that the direction of displacement is a predetermined direction of displacement, determine that the treatment device is in an operating state in which to initiate treatment on the operation target portion.
- With this configuration, it is possible to more reliably determine whether the operator has initiated treatment on the operation target portion by the amount of displacement or the direction of displacement of the operation unit.
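As a hedged sketch of this amount-and-direction test (the threshold, the coordinate convention, and the predetermined direction are invented for illustration):

```python
# Sketch: the operation unit's displacement triggers the stereoscopic
# display either when its magnitude exceeds a threshold or when it points
# roughly along an assumed "toward the patient" axis.

def initiates_treatment(delta, amount_thresh=5.0, toward_axis=(0.0, 0.0, -1.0)):
    """delta: (dx, dy, dz) displacement of the operation unit."""
    amount = sum(d * d for d in delta) ** 0.5
    if amount > amount_thresh:
        return True
    if amount > 0:
        # direction test: normalized projection onto the predetermined axis
        dot = sum(d * a for d, a in zip(delta, toward_axis)) / amount
        return dot > 0.9
    return False
```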
- Further, in the above-described aspect, the operating state detection unit may be provided on a trocar for allowing insertion of the treatment device from outside of a body to inside of the body, and when a distal end of the treatment device is inserted inside of the body via the trocar, determine that the treatment device is in an operating state in which to initiate treatment on the operation target portion.
- With this configuration, by detecting that the treatment device has been inserted inside of the body via the trocar, it is possible to more reliably determine that the treatment device is in a preparatory state in which to initiate the treatment on the operation target portion.
- Further, in the above-described aspect, the operating state detection unit may measure a distance between a distal end position of the treatment device and the operation target portion, and where the measured distance is smaller than a predetermined threshold, determine that the treatment device is in the operating state in which to initiate treatment on the operation target portion.
- With this configuration, when the operator is about to perform treatment on the operation target portion, the treatment device is brought close to the operation target portion. Therefore, by measuring the distance between the operation target portion and the distal end of the treatment device, and determining the case where that distance becomes smaller than a predetermined threshold, it can be more reliably detected that the treatment device is in an operating state in which to initiate treatment on the operation target portion.
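A minimal sketch of this proximity test, assuming a rectified stereo pair and a pinhole camera model (the focal length, baseline, and threshold are illustrative values, and only the depth component is compared for simplicity):

```python
# Depth from stereo disparity (Z = f * B / d), then the proximity test
# described above: treat the device as about to perform treatment when
# its tip and the target portion lie within a small depth gap.

def depth_mm(disparity_px, focal_px=800.0, baseline_mm=4.0):
    return focal_px * baseline_mm / disparity_px

def near_target(tip_disparity_px, target_disparity_px, thresh_mm=15.0):
    gap = abs(depth_mm(tip_disparity_px) - depth_mm(target_disparity_px))
    return gap < thresh_mm  # True -> switch to the stereoscopic display
```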
- In the above-described aspect, the three-dimensional imaging unit may include a zooming optical system, the operating state detection unit may detect a magnification of the zooming optical system, and determine that the treatment device is in an operating state in which to initiate treatment on the operation target portion when it is detected that the magnification exceeds a predetermined threshold.
- With this configuration, since the operator increases the magnification of the zooming optical system to observe the operation target portion in more detail when about to perform treatment on it, it can be more reliably detected that the treatment device is in an operating state in which to initiate treatment on the operation target portion when the magnification of the zooming optical system has become greater than a predetermined threshold.
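Sketched below with a small hysteresis band — the band itself is our addition for illustration, not part of this aspect — so that the display does not flicker when the magnification hovers near the threshold:

```python
# Sketch of the magnification test with hysteresis: switch to 3D above
# the upper threshold, back to 2D below the lower one, and otherwise
# keep the current mode. Threshold values are placeholders.

def display_for_zoom(magnification, current_mode, on_thresh=2.0, off_thresh=1.8):
    if magnification > on_thresh:
        return "3D"
    if magnification < off_thresh:
        return "2D"
    return current_mode  # inside the hysteresis band: keep the current mode
```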
- In the above-described aspect, a storage unit may be provided that stores dominant eye information indicating which of the operator's eyes is dominant, and the display control unit may, where it is not detected that the three-dimensional imaging unit or the treatment device is in the operating state in which to initiate treatment on the operation target portion, display the image on the dominant eye side on the display unit based on the dominant eye information stored in the storage unit.
- With this configuration, it is possible to prevent the range of the operation target portion on which attention is focused from being offset, and the operator can perform the treatment without an uncomfortable feeling when the display is switched from a two-dimensional image to a stereoscopic image.
- Another aspect not forming part of the present invention provides a method for image display, comprising: detecting whether a treatment device that performs treatment on an operation target portion, or an imaging unit that acquires two images having parallax by capturing an image of the operation target portion, is in an operating state in which to initiate treatment on the operation target portion; when it is detected that the treatment device or the imaging unit is not in the operating state in which to initiate treatment on the operation target portion, displaying only one of the images acquired by the imaging unit for both left and right eyes; and when it is detected that the treatment device or the imaging unit is in the operating state in which to initiate the treatment on the operation target portion, displaying the two images acquired by the imaging unit individually for the left and right eyes.
- According to the present invention, an effect of easing fatigue in surgery is achieved.
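Per displayed frame, the claimed method reduces to a selection like the following sketch (the function and argument names are illustrative):

```python
# Sketch of the claimed display method: when the operating state is not
# detected, both eyes receive the same image; when it is detected, each
# eye receives its own image of the parallax pair.

def frames_for_eyes(left_img, right_img, operating_state_detected):
    if operating_state_detected:
        return left_img, right_img  # stereoscopic: individual images per eye
    return left_img, left_img       # two-dimensional: one image to both eyes
```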
- {Fig. 1} Fig. 1 is a schematic diagram of the overall configuration of an image system for surgery according to one embodiment of the present invention.
- {Fig. 2} Fig. 2 is a flowchart explaining a method for image display according to one embodiment of the present invention using the image system for surgery of Fig. 1.
- {Fig. 3} Fig. 3 is a partial enlarged view showing an identification marker provided at a distal end of a treatment device, shown as a modification of the image system for surgery of Fig. 1.
- {Fig. 4} Fig. 4 is a flowchart showing a modification of a method for image display by the image system for surgery of Fig. 1.
- {Fig. 5} Fig. 5 is a partial enlarged view of an operation unit of a treatment device showing a contact sensor, which is a modification of the operating state detection unit of the image system for surgery of Fig. 1.
- {Fig. 6} Fig. 6 is a diagram showing a modification of the image system for surgery having the contact sensor of Fig. 5.
- {Fig. 7} Fig. 7 is a schematic diagram of the overall configuration of an image system for surgery which is a modification of the image system for surgery of Fig. 1 and comprises master and slave devices.
- {Fig. 8} Fig. 8 is a schematic diagram of the overall configuration of an image system for surgery that measures a distance between the treatment device and the operation target portion, which is another modification of the image system for surgery of Fig. 1.
- {Fig. 9} Fig. 9 is a schematic diagram of the overall configuration of an image system for surgery that detects insertion of the treatment device into a trocar, which is another modification of the image system for surgery of Fig. 1.
- {Fig. 10} Fig. 10 is a schematic diagram of the overall configuration of an image system for surgery that detects the acceleration of the treatment device, which is another modification of the image system for surgery of Fig. 1.
- {Fig. 11} Fig. 11 is a schematic diagram of the overall configuration of an image system for surgery that detects the zoom magnification of a three-dimensional imaging unit, which is another modification of the image system for surgery of Fig. 1.
- {Fig. 12} Fig. 12 is a schematic diagram of the overall configuration of an image system for surgery that displays a two-dimensional image on the side of the dominant eye, which is another modification of the image system for surgery of Fig. 1.
- An image system for
surgery 1 according to one embodiment of the present invention will be hereafter described with reference to the drawings. - The image system for
surgery 1 according to the present embodiment, as shown in Fig. 1, is a system used with a treatment device 3 inserted via a trocar 2 from outside of a body to the inside of the body, and includes a three-dimensional imaging unit 5 inserted from outside of the body to the inside of the body via a trocar 4, an image processing unit 6 that outputs and displays an image by processing an image signal acquired by the three-dimensional imaging unit 5, and a display unit 7 that displays an image output by the image processing unit 6. In Fig. 1, reference sign A denotes a body surface tissue, and reference sign B denotes an operation target portion positioned inside of the body.
- The three-dimensional imaging unit 5 is a bar-shaped scope, insertable into the trocar 4, on whose distal end is disposed a camera (not shown in the drawings) that can acquire two images having parallax.
- The image processing unit 6 comprises an image formation unit 8 that generates an image from an image signal acquired by the three-dimensional imaging unit 5, an operating state detection unit 9 that detects an operating state of the treatment device 3 by processing an image formed by the image formation unit 8, and a display control unit 10 that switches the image output to the display unit 7 in accordance with a result of detection by the operating state detection unit 9.
- The operating state detection unit 9 comprises a feature extraction unit 11 that extracts features within the image sent from the image formation unit 8, a treatment device information storage unit 12 that stores features of the treatment device 3, and a determination unit 13 that compares the features extracted by the feature extraction unit 11 with the features of the treatment device 3 stored in the treatment device information storage unit 12.
- The feature extraction unit 11 is configured to, for example, extract shape features of the treatment device 3 within the image by processing the image. The treatment device information storage unit 12 stores various shape features of the treatment device 3. The determination unit 13 compares the extracted shape features with the stored shape features and determines whether they agree with each other, in other words, whether the treatment device 3 exists within the image.
- The display unit 7 is, for example, a head-mounted display device mounted on the head of the operator, and has two displays (not shown in the drawings) disposed so as to be opposed to the left and right eyes, respectively.
- Where the determination unit 13 determines that the treatment device 3 exists within the image, the display control unit 10 outputs the two images having parallax formed by the image formation unit 8 so that they are displayed on the left and right displays of the display unit 7, respectively. The display unit 7 can thus present a stereoscopic image to the operator by displaying the two images having parallax separately for the left and right eyes.
- On the other hand, where the determination unit 13 determines that the treatment device 3 does not exist within the image, the display control unit 10 outputs either one of the two images having parallax formed by the image formation unit 8 to the left and right displays of the display unit 7 so that it is displayed simultaneously or alternately on both. The display unit 7 thus presents a two-dimensional image to the operator by displaying the same image on the left and right.
- The method for image display by the image system for surgery 1 according to the present embodiment, configured as above, will be hereafter described.
- In order to perform surgery by using the image system for
surgery 1 according to the present embodiment, as shown in Fig. 1, the three-dimensional imaging unit 5 is inserted from outside of the body to the inside of the body via the trocar 4, which is installed so as to penetrate the body surface tissue A of the patient, and the distal end of the three-dimensional imaging unit 5 is disposed inside the body so as to be opposed to the operation target portion B.
- As shown in Fig. 2, with the display control unit 10 set to output a two-dimensional image (step S1), capturing of an image of the operation target portion B by the three-dimensional imaging unit 5 is commenced (step S2). When an image signal is acquired by the three-dimensional imaging unit 5, the acquired image signal is sent to the image formation unit 8 of the image processing unit 6, where two images having parallax are formed. Since the display control unit 10 is set to output a two-dimensional image, the left and right displays of the display unit 7 display either one of the two acquired images having parallax simultaneously or alternately (step S3). This allows the operator to observe the operation target portion B with a two-dimensional image.
- When the images having parallax are formed by the image formation unit 8, one of the images is sent to the operating state detection unit 9, where it is processed by the feature extraction unit 11. Thereby, features within the image are extracted (step S4), and the extracted features are sent to the determination unit 13. In the determination unit 13, the features sent from the feature extraction unit 11 are compared with the various features of the treatment device 3 stored in the treatment device information storage unit 12, and it is determined whether the features extracted from the image indicate the treatment device 3 (step S5).
- As a result of this determination, where it is determined that the treatment device 3 exists within the image, the display control unit 10 controls the display unit 7 so as to display the two images having parallax sent from the image formation unit 8 on the two displays of the display unit 7 opposed to the left and right eyes (step S7). As a result, the operator can observe the operation target portion B with a stereoscopic image.
- Appearance of the treatment device 3 within the image represents the operator's intention to operate the treatment device 3 and initiate treatment of the operation target portion B. By switching the image displayed on the display unit 7 to the stereoscopic image at this point, it is possible to provide the operator with a stereoscopic image from which a sense of depth is easily obtained while performing the treatment.
- Conversely, where the treatment device 3 is not within the image, the operator is not about to initiate treatment by operating the treatment device 3, and a two-dimensional image, which provides less sense of depth, is displayed to the operator (step S6). It is then determined whether the observation is completed, and if it is not, the processes from step S2 are repeated (step S8).
- In this way, according to the image system for surgery 1 of the present embodiment, the operator can observe the operation target portion B with a two-dimensional image while the treatment device 3 has not appeared within the image, and with a stereoscopic image once the treatment device 3 has appeared within the image. This provides the benefit of preventing the operator from continuously observing the stereoscopic image over an extended period even when the surgery continues over an extended period, thereby reducing the operator's fatigue. - In the present embodiment, the features in shape of the
treatment device 3 are extracted by the feature extraction unit 11, they are compared in the determination unit 13 with the various shape features of the treatment device 3 stored in the treatment device information storage unit 12, and the presence or absence of the treatment device 3 within the image is thereby determined. Instead of this configuration, however, as shown in Fig. 3, an identification marker 14, such as a bar code or a two-dimensional bar code, may be adhered near the distal end of the treatment device 3; the feature extraction unit 11 may extract the identification marker 14, and the identification marker 14 and identification information may be stored in association with one another in the treatment device information storage unit 12. - With this configuration, it is possible to more easily determine the presence of the
treatment device 3 within the image than in the case where recognition and comparison of the shape features of the treatment device 3 are performed. - Further, by adding associated information to the
identification marker 14, such as a bar code, it is possible to store, in addition to the identification information of thetreatment device 3, another associated information, such as use state of thetreatment device 3, etc. in association with one another. - Further, as the identification marker 14 a color not present inside of the body may be used instead of the bar code.
- Further, in the present embodiment, as an operating
state detection unit 9 that detects the operating state in which thetreatment device 3 initiates the treatment of the operation target portion B, it is detected whether thetreatment device 3 exists within the image. However, instead of this configuration, it may be detected whether thetreatment device 3 extracted within the image has moved. - In this case, as shown in
Fig. 4, when the treatment device 3 is detected in step S5, the position of the treatment device 3 within the image is stored (step S10), and where a position of the treatment device 3 is already stored, the amount of movement to the newly stored position is calculated (step S11). It is determined whether the calculated amount of movement exceeds a predetermined threshold (step S12); the system switches to displaying a stereoscopic image where the amount of movement is more than the threshold (step S7), and switches to displaying a two-dimensional image where the amount of movement is equal to or less than the threshold (step S6). - In the image, where the
treatment device 3 moves, it is more apparent that the treatment device 3 is in the operating state in which the operator initiates the treatment of the operation target portion B. By detecting such a state and switching to displaying a stereoscopic image, it is possible to make the sense of depth easier to obtain while performing the treatment, and thereby to make the treatment easier. - Further, in the present embodiment, by processing the acquired image, displaying is switched between a stereoscopic image and a two-dimensional image, depending on the presence or displacement of the
treatment device 3. Instead of this, however, as shown in Fig. 5, a contact sensor 15 may be provided on the operation unit 3a for operating the treatment device 3, and by detecting with the contact sensor 15 that the operator has grasped the operation unit 3a, the system may detect that the treatment device 3 has turned to the operating state in which to initiate the treatment of the operation target portion B. In this case, as shown in Fig. 6, the output from the contact sensor 15 may be input directly to the display control unit 10, and where contact with the operation unit 3a is detected by the contact sensor 15, the system may switch to displaying a stereoscopic image. - With this configuration, it is possible to eliminate necessity of the image processing or recognition of the shape of the
treatment device 3, and to achieve easier and more direct detection that the treatment device 3 is in the operating state in which to initiate the treatment of the operation target portion B. - A pressure sensor may be adopted instead of the contact sensor. In this case, the system may switch between displaying a stereoscopic image and displaying a two-dimensional image depending on the value of pressure applied to the pressure sensor.
- As an
operation unit 3a of the treatment device 3, besides the operation unit 3a shown in Fig. 6, which is connected to the treatment device 3 for operating it directly, the treatment device 3 and an operation unit of the treatment device (not shown in the drawings) may be installed on a master-slave system, as shown in Fig. 7. In this case, the treatment device 3 is installed on the manipulator on the slave side (slave manipulator) 16, and the operation unit is installed on the manipulator on the master side (master manipulator) 17. In this configuration, a contact sensor 15 may be provided on the operation unit. - Also in this case, where the operator operates the operation unit on the
master manipulator 17, the slave manipulator 16 operates to set the treatment device 3 in the operating state in which to initiate the treatment. Therefore, the system may switch the image display by detecting, with the contact sensor 15, operation of the operation unit of the master manipulator 17. - In the example shown in
Fig. 7, on the slave side, where an operating table 18 on which patient C is placed is disposed, two treatment devices 3 are respectively mounted on two slave manipulators 16. The three-dimensional imaging unit 5 is mounted on an observation manipulator 19 on the slave side. - On the master side, a
master manipulator 17 operated by the operator, a display unit 7, such as a head-mounted display for providing the operator with an image, and a digitizer 20 that detects the direction of the operator's field of view are disposed, and a contact sensor 15 for detecting the operation of the operation unit by the operator is provided on the master manipulator 17. - Between the master side and the slave side is disposed a
control memory 21. The control memory 21 comprises a treatment device control unit 22 that controls a slave manipulator 16 and the treatment device 3 based on a signal from the master manipulator 17, an observation part control unit 23 that controls the observation manipulator 19 based on a signal from the digitizer 20, and an image control unit 6 comprising an image formation unit 8 that generates two images having parallax by processing an image signal acquired by the three-dimensional imaging unit 5, and a display control unit 10 that switches between the stereoscopic image and the two-dimensional image according to the signal from the contact sensor 15 and outputs the selected image to the display unit 7. - Although the
contact sensor 15 is exemplified as a sensor for detecting operation to the operation unit of themaster manipulator 17 by the operator, a sensor described below may be adopted in place of this. - A pressure sensor: detects a pressure by which the operation unit is gripped and the system switches to a stereoscopic image where the pressure is greater than the threshold, and switches to a two-dimensional image where the pressure is less than the threshold.
- A holding finger sensor: a
contact sensor 15 is provided for each holding finger, and the system switches to displaying a stereoscopic image where three fingers are in contact with the sensor, and a two-dimensional image where two fingers are in contact with the sensor. - A contact sensor on the arm rest of the master manipulator 17 (not shown in the drawings): The system switches to displaying a stereoscopic image where the operator put their elbow on the arm rest in order to operate the
master manipulator 17, and switches to displaying a two-dimensional image where their elbow is not put on the arm rest. - A sensor for detecting a specific operation pattern on the operation unit: For example, the system detects the operation pattern in which the operation unit is operated two times in succession, and switches between the stereoscopic image and the two-dimensional image each time the pattern is detected. Or, an operation pattern may be detected that is not performed on a normal operation on the treatment.
- A sensor for detecting a stroke of the master manipulator 17: The system switches to displaying the stereoscopic image where the stroke exceeds the threshold, and switches to displaying the two-dimensional image where the stroke is the threshold or less. The image may also be switched to the stereoscopic image where the stroke direction is a direction that brings the
treatment device 3 closer to the operation target portion B. - Further, a contact sensor (not shown in the drawings) may be provided on the two
treatment devices 3, and the image may be switched each time the contact is detected. - Further, in
Fig. 1, the determination unit 13 determines the presence or absence of the treatment device 3 within the image. Instead of this, however, after extraction of the treatment device 3 within the image, the positional relationship in the depth direction between the treatment device 3 and the operation target portion B within the image may be calculated (stereoscopic measurement) based on the two images having parallax, and it may be determined whether the distance between the distal end of the treatment device 3 and the operation target portion B exceeds the threshold. As a result of this determination, the system may switch to displaying a stereoscopic image where they have come close to each other beyond the threshold, and may switch to a two-dimensional image where the distance is the threshold or greater. - The method of stereoscopic measurement may use, for example, a method that acquires three-dimensional information with publicly known polygons or wire frames, as disclosed in Japanese Unexamined Patent Application Publication No.
2003-6616 treatment device 3. By configuring as above, it is possible to calculate the positional relationship in the depth direction between the marker and the operation target portion B without recognizing the shape of the treatment device 3. - In the above, the position of the operation target portion B is detected based on the two images having parallax acquired by the three-
dimensional imaging unit 5. However, instead of this, the system may calculate the distance between the treatment device 3 and the operation target portion B by utilizing positional information of the operation target portion B measured in advance by an MRI apparatus 24 or the like, as shown in Fig. 8. In the example shown in Fig. 8, a gyro sensor 25 is disposed in each of the treatment device 3 and the three-dimensional imaging unit 5, and the output of the gyro sensors 25 and the output from the MRI apparatus 24 are input to the determination unit 13. The determination unit 13, based on the position of the treatment device 3 detected by the gyro sensor 25 and the position of the operation target portion B input from the MRI apparatus 24, calculates the distance therebetween and compares it with the threshold to switch the image display. - Further, as shown in
Fig. 9, a sensor 26 for detecting the passage of the treatment device 3 may be provided at the trocar 2 for inserting the treatment device 3 inside of the body, and where the passage of the treatment device 3 is detected by the sensor 26, the display control unit 10 may switch the image to be displayed on the display unit 7 from a two-dimensional image to a stereoscopic image. - Further, as shown in
Fig. 10, an acceleration sensor 27 may be provided on the distal end of the treatment device 3, and where an acceleration of a predetermined magnitude or more is detected, it may be determined that the treatment device 3 is in the operating state in which to initiate the treatment of the operation target portion B, and the display control unit 10 may switch the image to be displayed on the display unit 7 from a two-dimensional image to a stereoscopic image. Alternatively, the amount of displacement may be calculated based on the acceleration detected by the acceleration sensor 27, and where the amount of displacement has become a predetermined value or greater, it may be determined that the device is in the operating state and the display may be switched from a two-dimensional image to a stereoscopic image. - In each of the above-described embodiments, it is determined by various methods whether the
treatment device 3 is in the operating state in which to initiate the treatment on the operation target portion B, and the image display is switched accordingly. Instead of this, however, it may be determined whether the three-dimensional imaging unit 5 is in an operating state in which to initiate treatment on the operation target portion B. - For example, as shown in
Fig. 11, the determination unit 13 may determine whether the operator, by operating the zooming adjustment unit 28, has set the zoom magnification to be greater than a predetermined threshold; where the set zoom magnification exceeds the predetermined threshold, the display control unit 10 may switch from displaying a two-dimensional image to displaying a stereoscopic image, and where the magnification is the threshold or less, may switch from displaying a stereoscopic image to displaying a two-dimensional image. - Where the magnification of zooming is set large by the operator, it is possible to regard the situation as one in which the operator is about to perform the treatment while precisely observing the operation target portion B by magnifying it. Accordingly, by switching the image displayed on the
display unit 7 to a stereoscopic image only in this case, it is possible to increase the ease of treatment while minimizing the fatigue that accompanies observing the stereoscopic image over an extended period. - The
display control unit 10, in addition to switching from displaying a two-dimensional image to displaying a stereoscopic image where the determination unit 13 determines that the zoom magnification is larger than the predetermined threshold, may maintain the image displayed on the display unit 7 as a two-dimensional image while the zoom magnification is being changed. With this configuration, it is possible to prevent the operator from feeling dizziness from observing a stereoscopic image during the change of the zoom magnification. - Further, while the display is switched depending on the magnification of zooming by the operation of the zooming
adjustment unit 28 by the operator, it may be possible, instead of this, to determine the state of the operation target portion B, such as, for example, the size of a tumor, etc. by image processing, and switch to displaying a stereoscopic image where the magnification of zooming is changed automatically to an appropriate value. Here, an appropriate magnification of zooming depending on thetreatment device 3 to be used may be selected as the appropriate magnification of zooming. - Further, in the above description, a two-dimensional image and a stereoscopic image are switched from one another by various methods. However, as shown in
Fig. 12 , there may be provided a dominant eye input unit 29 through which the operator inputs his or her dominant eye. By sending the input dominant eye information to the display control unit 10, it is preferable that, when the display control unit 10 switches to displaying a two-dimensional image, the image on the dominant eye side be displayed on the display unit 7. - With this configuration, there is an advantage that the operator can observe the image without discomfort, since no offset occurs on the dominant eye side of the image when the display is switched to a two-dimensional image during observation of a stereoscopic image. The same advantage may be obtained when the display is switched from a two-dimensional image to a stereoscopic image, which is the inverse of the above case. A storage unit that stores the dominant eye information may serve as the dominant
eye input unit 29. - In the above description, when the two-dimensional image is displayed, one image selected from the two images acquired by the three-dimensional imaging unit is presented to both eyes. However, any image may be employed as long as the same image is presented to both eyes, for example, an image obtained by performing 2D image synthesis on the two images.
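The display-switching behavior described above can be summarized in a minimal sketch, not taken from the patent itself: all function and parameter names here are hypothetical, and the thresholds and the "hold 2D while zooming" rule follow the description of Figs. 11 and 12.

```python
# Illustrative sketch (not from the patent): display-mode selection combining
# the zoom-magnification threshold, the "maintain 2D while the magnification
# is being changed" rule, and dominant-eye image selection for 2D display.
# All names are hypothetical.

def select_display(left_img, right_img, magnification, threshold,
                   zoom_changing, dominant_eye="right"):
    """Return ('stereo', (left, right)) or ('2d', single_image)."""
    # While the operator is changing the magnification, keep the 2D image
    # to avoid dizziness from observing a stereoscopic image mid-zoom.
    if zoom_changing or magnification <= threshold:
        # In 2D mode, present the dominant-eye image to both eyes so no
        # offset is perceived on the dominant-eye side when switching.
        mono = right_img if dominant_eye == "right" else left_img
        return ("2d", mono)
    # Magnification above the threshold: the operator is precisely
    # observing the target, so present the parallax pair stereoscopically.
    return ("stereo", (left_img, right_img))
```

The same selector could instead return a 2D-synthesized image in the non-stereo branch, as the description notes that any image may be used so long as both eyes receive the same one.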
-
- B operation target portion
- 1 image system for surgery
- 2 trocar
- 3a operation unit
- 5 three-dimensional imaging unit
- 7 display unit
- 9 operating state detection unit
- 10 display control unit
- 13 determination unit (operating state detection unit)
- 15 contact sensor (operating state detection unit)
- 17 master manipulator (master device)
- 24 MRI apparatus (operating state detection unit)
- 25 gyro sensor (operating state detection unit)
- 26 sensor (operating state detection unit)
- 27 acceleration sensor (operating state detection unit)
- 28 zooming adjustment unit (operating state detection unit)
- 29 dominant eye input unit (storage unit)
Claims (11)
- An image system (1) for surgery, the image system (1) comprising: a three-dimensional imaging unit (5) configured to acquire two images having parallax by capturing an image of an operation target portion; a display unit (7) configured to display the two images acquired by the three-dimensional imaging unit (5) individually for both left and right eyes, or display a same image for both left and right eyes; an operating state detection unit (9) configured to detect that a treatment device (3) that performs treatment on the operation target portion or the three-dimensional imaging unit (5) is in an operating state in which to initiate treatment on the operation target portion; and characterized by a display control unit (10) configured to switch from displaying the same image on the display unit (7) to displaying the two images having parallax on the display unit (7) when the operating state detection unit (9) detects that the three-dimensional imaging unit (5) or the treatment device (3) is in the operating state in which to initiate treatment on the operation target portion.
- The image system (1) for surgery according to claim 1, wherein the operating state detection unit (9) is configured to detect displacement, velocity or acceleration of the treatment device (3).
- The image system (1) for surgery according to claim 1 or 2, wherein the operating state detection unit (9) is configured to determine that the treatment device (3) is in the operating state in which to initiate the treatment on the operation target portion when the operating state detection unit (9) processes any of the images acquired by the three-dimensional imaging unit (5), and the treatment device (3) is detected within the image.
- The image system (1) for surgery according to claim 2, wherein the operating state detection unit (9) is configured to process any of the images acquired by the three-dimensional imaging unit (5) and calculate displacement, velocity or acceleration of the treatment device (3) within the image, and determine that the treatment device (3) is in the operating state in which to initiate the treatment on the operation target portion when the calculated displacement, velocity or acceleration exceeds a predetermined threshold.
- The image system (1) for surgery according to claim 1, wherein the operating state detection unit (9) is provided on an operation unit (3a) for operating the treatment device (3), and is configured to detect contact on the operation unit (3a) or the operation of the operation unit (3a).
- The image system (1) for surgery according to claim 5, wherein the operation unit (3a) is provided on a master device disposed at a position remote from the treatment device (3) and is configured to operate the treatment device (3) by remote operation.
- The image system (1) for surgery according to claim 5 or 6, wherein the operating state detection unit (9) is configured to detect an amount of displacement or a direction of displacement of the operation unit (3a), and in a case where the detected amount of displacement exceeds a predetermined threshold or where the detected direction of displacement is a predetermined direction of displacement, determine that the treatment device (3) is in an operating state in which to initiate treatment on the operation target portion.
- The image system (1) for surgery according to claim 1, wherein the operating state detection unit is provided on a trocar for allowing insertion of the treatment device (3) from outside of a body to inside of the body, and when a distal end of the treatment device (3) is inserted inside of the body via the trocar, is configured to determine that the treatment device (3) is in the operating state in which to initiate treatment on the operation target portion.
- The image system (1) for surgery according to claim 1, wherein the operating state detection unit (9) is configured to measure a distance between a distal end position of the treatment device (3) and the operation target portion, and when the measured distance is smaller than a predetermined threshold, is configured to determine that the treatment device (3) is in the operating state in which to initiate treatment on the operation target portion.
- The image system (1) for surgery according to claim 1, wherein
the three-dimensional imaging unit (5) includes a zooming optical system,
the operating state detection unit (9) is configured to detect a magnification of the zooming optical system, and determine that the treatment device (3) is in an operating state in which to initiate treatment on the operation target portion when it is detected that the magnification exceeds a predetermined threshold. - The image system (1) for surgery according to any one of claims 1 to 10, further comprising
a storage unit (12) configured to store dominant eye information indicating which eye of an operator is the dominant eye,
wherein the display control unit (10) is configured to, when it is not detected that the three-dimensional imaging unit (5) or the treatment device (3) is in the operating state in which to initiate treatment on the operation target portion, display the image on the dominant eye side on the display unit (7) based on dominant eye information stored in the storage unit (12).
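The operating-state tests of claims 4 and 10 — thresholding the computed velocity of the treatment device and the distance between its distal end and the operation target portion — can be sketched as follows. This is an illustrative outline only, not the patented implementation; the function names, the use of tracked 3D tip positions, and the specific thresholds are all assumptions.

```python
# Illustrative sketch (not from the patent): operating-state detection per
# claims 4 and 10. The tip velocity is estimated from two tracked positions
# of the treatment device, and the operating state is flagged when the
# velocity exceeds a threshold or the tip-to-target distance falls below
# one. All names are hypothetical.
import math

def tip_velocity(p_prev, p_curr, dt):
    """Speed of the treatment-device tip between two tracked 3D positions."""
    return math.dist(p_prev, p_curr) / dt

def in_operating_state(p_prev, p_curr, dt, target,
                       v_threshold, d_threshold):
    v = tip_velocity(p_prev, p_curr, dt)   # claim 4: velocity threshold test
    d = math.dist(p_curr, target)          # claim 10: distance threshold test
    return v > v_threshold or d < d_threshold
```

In the claimed system the positions could come from image processing of the acquired images (claim 4) or from sensors such as the gyro or acceleration sensors listed in the reference numerals; displacement or acceleration could be thresholded in the same manner as velocity.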
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012063510A JP5931528B2 (en) | 2012-03-21 | 2012-03-21 | Surgical video system and method of operating surgical video system |
PCT/JP2013/058890 WO2013141404A1 (en) | 2012-03-21 | 2013-03-19 | Image system for surgery and method for image display |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2827793A1 EP2827793A1 (en) | 2015-01-28 |
EP2827793A4 EP2827793A4 (en) | 2015-11-25 |
EP2827793B1 true EP2827793B1 (en) | 2018-01-31 |
Family
ID=49222847
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13764568.5A Not-in-force EP2827793B1 (en) | 2012-03-21 | 2013-03-19 | Image system for surgery and method for image display |
Country Status (5)
Country | Link |
---|---|
US (1) | US20140323801A1 (en) |
EP (1) | EP2827793B1 (en) |
JP (1) | JP5931528B2 (en) |
CN (1) | CN104135962B (en) |
WO (1) | WO2013141404A1 (en) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104814712A (en) * | 2013-11-07 | 2015-08-05 | 南京三维视嘉科技发展有限公司 | Three-dimensional endoscope and three-dimensional imaging method |
JP5802869B1 (en) * | 2014-01-23 | 2015-11-04 | オリンパス株式会社 | Surgical device |
JP6150130B2 (en) | 2014-01-30 | 2017-06-21 | ソニー株式会社 | Endoscope system, endoscope image processing apparatus, image processing method, and program |
JP6218634B2 (en) * | 2014-02-20 | 2017-10-25 | オリンパス株式会社 | ENDOSCOPE SYSTEM AND ENDOSCOPE OPERATING METHOD |
JP6257371B2 (en) | 2014-02-21 | 2018-01-10 | オリンパス株式会社 | Endoscope system and method for operating endoscope system |
US10932657B2 (en) * | 2014-04-02 | 2021-03-02 | Transenterix Europe S.A.R.L. | Endoscope with wide angle lens and adjustable view |
CN105812775A (en) * | 2014-12-29 | 2016-07-27 | 广东省明医医疗慈善基金会 | Three-dimensional display system based on hard lens and method thereof |
WO2016117089A1 (en) * | 2015-01-22 | 2016-07-28 | オリンパス株式会社 | Method for generating three-dimensional light-emitting image, and imaging system |
EP3130276B8 (en) * | 2015-08-12 | 2020-02-26 | TransEnterix Europe Sàrl | Endoscope with wide angle lens and adjustable view |
WO2017130567A1 (en) * | 2016-01-25 | 2017-08-03 | ソニー株式会社 | Medical safety-control apparatus, medical safety-control method, and medical assist system |
WO2017217115A1 (en) * | 2016-06-17 | 2017-12-21 | ソニー株式会社 | Image processing device, image processing method, program, and image processing system |
JP6332524B2 (en) * | 2017-05-23 | 2018-05-30 | ソニー株式会社 | Endoscope system, endoscope image processing apparatus, and image processing method |
JP7215809B2 (en) * | 2017-08-21 | 2023-01-31 | リライン コーポレーション | Arthroscopy apparatus and arthroscopy method |
EP3707582A1 (en) * | 2017-11-07 | 2020-09-16 | Koninklijke Philips N.V. | Augmented reality triggering of devices |
JP6968289B2 (en) * | 2018-08-24 | 2021-11-17 | 富士フイルム株式会社 | Image processing device, image processing method, and image processing program |
CN109806002B (en) * | 2019-01-14 | 2021-02-23 | 微创(上海)医疗机器人有限公司 | Surgical robot |
JP6936826B2 (en) | 2019-03-18 | 2021-09-22 | 株式会社モリタ製作所 | Image processing equipment, display system, image processing method, and image processing program |
CN110101455B (en) * | 2019-04-30 | 2021-01-01 | 微创(上海)医疗机器人有限公司 | Display device and surgical robot |
DE102019114817B4 (en) | 2019-06-03 | 2021-12-02 | Karl Storz Se & Co. Kg | Imaging system and method of observation |
EP4016466A1 (en) * | 2020-12-17 | 2022-06-22 | Inter IKEA Systems B.V. | Method and device for displaying details of a texture of a three-dimensional object |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003066336A (en) * | 2001-08-23 | 2003-03-05 | Olympus Optical Co Ltd | Microscope for surgery |
JP4170042B2 (en) * | 2002-08-09 | 2008-10-22 | フジノン株式会社 | Stereoscopic electronic endoscope device |
JP4383188B2 (en) * | 2003-04-01 | 2009-12-16 | オリンパス株式会社 | Stereoscopic observation system |
JP2004305367A (en) * | 2003-04-04 | 2004-11-04 | Olympus Corp | Stereoscopic observing apparatus |
JP4398200B2 (en) * | 2003-08-26 | 2010-01-13 | オリンパス株式会社 | Stereoscopic observation device |
US8814779B2 (en) * | 2006-12-21 | 2014-08-26 | Intuitive Surgical Operations, Inc. | Stereoscopic endoscope |
JP2009254783A (en) * | 2008-03-25 | 2009-11-05 | Panasonic Electric Works Co Ltd | Endoscope system and endoscopic operation training system |
JP5284731B2 (en) * | 2008-09-02 | 2013-09-11 | オリンパスメディカルシステムズ株式会社 | Stereoscopic image display system |
JP4625515B2 (en) * | 2008-09-24 | 2011-02-02 | 富士フイルム株式会社 | Three-dimensional imaging apparatus, method, and program |
JP2011101229A (en) * | 2009-11-06 | 2011-05-19 | Sony Corp | Display control device, display control method, program, output device, and transmission apparatus |
JP2013521941A (en) * | 2010-03-12 | 2013-06-13 | ヴァイキング・システムズ・インコーポレーテッド | 3D visualization system |
JP5701140B2 (en) * | 2011-04-21 | 2015-04-15 | キヤノン株式会社 | Stereoscopic endoscope device |
-
2012
- 2012-03-21 JP JP2012063510A patent/JP5931528B2/en active Active
-
2013
- 2013-03-19 EP EP13764568.5A patent/EP2827793B1/en not_active Not-in-force
- 2013-03-19 CN CN201380010153.8A patent/CN104135962B/en active Active
- 2013-03-19 WO PCT/JP2013/058890 patent/WO2013141404A1/en active Application Filing
-
2014
- 2014-07-09 US US14/326,821 patent/US20140323801A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
None * |
Also Published As
Publication number | Publication date |
---|---|
CN104135962B (en) | 2017-03-22 |
US20140323801A1 (en) | 2014-10-30 |
EP2827793A4 (en) | 2015-11-25 |
EP2827793A1 (en) | 2015-01-28 |
JP5931528B2 (en) | 2016-06-08 |
CN104135962A (en) | 2014-11-05 |
JP2013192773A (en) | 2013-09-30 |
WO2013141404A1 (en) | 2013-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2827793B1 (en) | Image system for surgery and method for image display | |
WO2017179350A1 (en) | Device, method and program for controlling image display | |
JP6103827B2 (en) | Image processing apparatus and stereoscopic image observation system | |
KR100998182B1 (en) | 3D display system of surgical robot and control method thereof | |
US20180125340A1 (en) | Medical stereoscopic observation device, medical stereoscopic observation method, and program | |
JP5893808B2 (en) | Stereoscopic endoscope image processing device | |
US20150018618A1 (en) | Stereoscopic endscope system | |
JP2012235983A (en) | Medical image display system | |
JP2012223363A (en) | Surgical imaging system and surgical robot | |
US20200113419A1 (en) | Medical system and operation method therefor | |
JP5771754B2 (en) | Stereoscopic endoscope system | |
JP5629023B2 (en) | Medical three-dimensional observation device | |
US10582840B2 (en) | Endoscope apparatus | |
JP2021191316A (en) | Endoscope system | |
JP2012147857A (en) | Image processing apparatus | |
KR101601021B1 (en) | Three dimension endoscope system using giro sensor | |
WO2016194446A1 (en) | Information processing device, information processing method, and in-vivo imaging system | |
US20160113482A1 (en) | Surgical device | |
JP2006223476A (en) | Three-dimensional image observing device for medical use | |
JP6996883B2 (en) | Medical observation device | |
JP2021194268A (en) | Blood vessel observation system and blood vessel observation method | |
KR20150007517A (en) | Control method of surgical action using realistic visual information | |
JP7157098B2 (en) | Angioscope system and method for measuring blood vessel diameter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20141016 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
RA4 | Supplementary search report drawn up and despatched (corrected) |
Effective date: 20151026 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 1/00 20060101ALI20151020BHEP Ipc: A61B 19/00 20060101AFI20151020BHEP |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: OLYMPUS CORPORATION |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: NICHOGI, MASAO Inventor name: YOSHIMURA, KATSUHIKO Inventor name: KIKUCHI, SATORU Inventor name: KOBAYASHI, HIROYOSHI Inventor name: IKEDA, HIROMU Inventor name: MIYAZAKI, YASUHIRO Inventor name: KONNO, OSAMU |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602013032647 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: A61B0019000000 Ipc: A61B0001000000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06T 19/00 20110101ALI20170711BHEP Ipc: H04N 13/02 20060101ALI20170711BHEP Ipc: A61B 1/00 20060101AFI20170711BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20170831 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 966580 Country of ref document: AT Kind code of ref document: T Effective date: 20180215 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013032647 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 966580 Country of ref document: AT Kind code of ref document: T Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180430 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180501 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180430 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180531 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013032647 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20180331 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20180430 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180319 |
|
RIC2 | Information provided on ipc code assigned after grant |
Ipc: A61B 1/00 20060101AFI20170711BHEP Ipc: H04N 13/02 20181130ALI20170711BHEP Ipc: G06T 19/00 20110101ALI20170711BHEP |
|
26N | No opposition filed |
Effective date: 20181102 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180319 |
|
RIC2 | Information provided on ipc code assigned after grant |
Ipc: G06T 19/00 20110101ALI20170711BHEP Ipc: A61B 1/00 20060101AFI20170711BHEP Ipc: H04N 13/02 20060101ALI20170711BHEP |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180331 Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180430 Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180331 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180331 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180331 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20190321 Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180319 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130319 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180131 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180131 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602013032647 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20201001 |