WO2010124672A1 - Video-based mono-camera navigation system - Google Patents
- Publication number
- WO2010124672A1 (PCT/DE2010/000440)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- video
- video camera
- marker
- navigation system
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; determining position of probes within or on the body of the patient
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0808—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the brain
- A61B8/0816—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the brain using echo-encephalography
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
Definitions
- The invention relates to a video-based camera navigation system, comprising a device and a method which allow objects provided with markers to be detected in space and located with only one video camera.
- It is known to determine the positional relationship of objects to one another with optical navigation systems, for example in the fields of warehouse technology, transportation, metalworking and surgery. In surgery, these systems are used in the operating room to determine the position of instruments relative to the patient and thus to improve the chances of success of an operation.
- Software-supported triangulation determines the spatial coordinates of the markers in space.
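In such a conventional stereo setup, each marker point is observed by two calibrated cameras and its 3D coordinates follow from intersecting the two viewing rays. The sketch below illustrates this prior-art triangulation step with OpenCV, using assumed projection matrices and pixel coordinates; it is included only to contrast with the single-camera approach of the present invention.

```python
import numpy as np
import cv2

# 3x4 projection matrices of two calibrated cameras (assumed example values:
# identical intrinsics, second camera shifted 100 mm along the x axis).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

# Pixel coordinates of the same marker point in both images (2xN arrays).
pts1 = np.array([[310.0], [245.0]])
pts2 = np.array([[250.0], [245.0]])

# Linear triangulation returns homogeneous 4xN coordinates.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).ravel()
print("marker position in camera 1 coordinates [mm]:", X)
```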
- the coordinate systems of the camera, the patient (CT sectional images) and the surgical instrument are transferred by registration into a uniform coordinate system. This makes it possible to display the surgical instrument or a so-called pointer in the CT slice images.
- The surgeon is thus able to determine the current position of the instrument on or in the patient in the CT slice images on a monitor. Based on the graphical representation of the instrument in the CT slice images, the surgeon recognizes the distance to the target area and the risk structures to be protected (e.g. nerves or vessels).
- US 2004/0002642 A1 discloses an optical stereo camera navigation system which operates in the visible wavelength range. The localization is realized here by the recognition of black and white contrast markers with two cameras.
- The object of the invention is to develop a video-based camera navigation system that can be implemented with a single video camera, without placing special requirements on that camera. This makes it possible to use a commercially available video camera.
- This object is achieved by an optical measuring system which consists of a device according to claim 1 and a method according to claim 5. The device comprises:
- a second marker which is provided with a different pattern and a different color, and which is connected to a variable object whose position is to be recorded in a measuring area.
- The two markers thus together form a unit for detecting the spatial position and rotation of a first, fixed or movable object relative to a second, fixed or movable object in virtual reality environments, and, like the video camera, they are connected to the electronic data processing system.
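Because both markers are located by the same camera, the quantity of interest, the pose of the movable object relative to the fixed one, can be obtained by chaining the two camera-to-marker transforms. A minimal sketch with placeholder poses (the per-marker pose calculation itself is described further below):

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation R and a translation t into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.ravel(t)
    return T

# Poses of the two markers in the camera frame, e.g. from a per-marker pose
# calculation; the numerical values here are placeholders for illustration.
T_cam_patient    = to_homogeneous(np.eye(3), [0.0, 0.0, 500.0])
T_cam_instrument = to_homogeneous(np.eye(3), [30.0, -20.0, 480.0])

# Pose of the instrument marker expressed in the patient marker's frame.
T_patient_instrument = np.linalg.inv(T_cam_patient) @ T_cam_instrument
print(T_patient_instrument)
```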
- a commercially available video camera can be used, which is connected via a USB or other data interface with the electronic data processing system.
- A characteristic feature is attached to the objects, such as persons, instruments or tools, whose location parameters are to be determined.
- The marker is provided with various markings or geometries, which are recognized fully automatically by image processing of the video camera images and assigned by means of their color-shape coding.
- The size, color and shape of the marker are arbitrary.
- The base color should merely differ from the colored markings applied to it.
- the color tones of the markers can theoretically be chosen arbitrarily. For optimal image processing, however, strong color contrasts are preferable.
- characteristic points of the geometry such as corner points, centers or centers of gravity are used.
- The shape, the color and the spacing or arrangement of the geometries on the respective marker are known to the mono-camera navigation system.
- Specially developed image processing software picks up the images recorded by the video camera and fully automatically detects the colored markings. By subsequently assigning the characteristic centers of gravity or corner points of the markings in the recorded camera image to the geometries attached to the marker, taking into account the camera properties, the position of the object in space is calculated.
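Assigning known 3D marker geometry to detected 2D image points and recovering the pose from the camera parameters is the classic perspective-n-point problem. The excerpt does not name a specific algorithm; the following is a minimal sketch of how such a pose calculation could look with OpenCV's solvePnP, using hypothetical marker coordinates and detections.

```python
import numpy as np
import cv2

# Known 3D coordinates of the characteristic points (corner points / centers
# of gravity) in the marker's own coordinate system [mm]. Hypothetical values.
object_points = np.array([[0.0,  0.0,  0.0],
                          [40.0, 0.0,  0.0],
                          [40.0, 30.0, 0.0],
                          [0.0,  30.0, 0.0]], dtype=np.float64)

# Corresponding 2D points detected in the camera image [px],
# e.g. from the color/shape segmentation step.
image_points = np.array([[320.5, 240.1],
                         [402.3, 238.7],
                         [405.0, 301.2],
                         [318.9, 303.8]], dtype=np.float64)

# Intrinsic camera parameters (focal length, principal point) and lens
# distortion from a prior camera calibration; values assumed here.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)  # negligible distortion assumed for this sketch

# Solve for the transformation (rotation + translation) of the marker
# relative to the camera.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)  # convert rotation vector to a 3x3 matrix
print("marker position relative to the camera [mm]:", tvec.ravel())
```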
- the image processing software does not require special hardware. It can be installed and run on a standard PC.
- The navigation system operates according to the following method steps:
  a) placing a first marker in a fixed position on the first fixed or movable object,
  b) attaching a second marker to the second fixed or movable object,
  c) aligning the video camera on both markers, taking into account their possible range of movement,
  d) input of the position of the fixed target area,
  e) image acquisition via the video camera,
  f) image preprocessing,
  g) object recognition,
  h) assignment of the objects to the known geometries (classification),
  i) color determination of the geometries of the markings,
  j) assignment of the recorded characteristic pixels (for example, corner points) of a geometry to the markings attached to the real object,
  k) calculation of the transformation T from the associated points of the markers and the camera parameters (focal length, image distortion).
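Steps e) to j) amount to a color-based segmentation and feature extraction pipeline. The concrete implementation is left open in the text; the sketch below shows one plausible realization with OpenCV, where the HSV thresholds and size limits are assumed example values rather than figures from the patent.

```python
import numpy as np
import cv2

def detect_marking_points(frame_bgr, hsv_ranges):
    """Steps f)-j): preprocess the image, find colored geometries and return
    their characteristic points (centers of gravity) grouped by color class.

    hsv_ranges maps a color name to (lower, upper) HSV bounds; the bounds
    used in the example call below are assumptions for illustration."""
    blurred = cv2.GaussianBlur(frame_bgr, (5, 5), 0)               # f) preprocessing
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)

    points = {}
    for color, (lower, upper) in hsv_ranges.items():
        mask = cv2.inRange(hsv, np.array(lower), np.array(upper))  # i) color class
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)    # g) object recognition
        for contour in contours:
            if cv2.contourArea(contour) < 50:                      # reject noise blobs
                continue
            m = cv2.moments(contour)                               # j) characteristic point:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]      #    center of gravity
            points.setdefault(color, []).append((cx, cy))
    return points

# Example usage with assumed HSV ranges for red and yellow markings:
ranges = {"red": ((0, 120, 70), (10, 255, 255)),
          "yellow": ((20, 100, 100), (35, 255, 255))}
# frame = cv2.VideoCapture(0).read()[1]
# detected = detect_marking_points(frame, ranges)
```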
- Any camera system can be used.
- An adaptation to the measurement requirements, to state-of-the-art camera technology and to economical systems is possible at any time without major expense.
- The visibility of the markers is better ensured, since the video camera, in contrast to conventional systems, does not have to be placed two to three meters away from the area to be measured.
- In the preferred example of a medical application, the video camera is attached either to the operating table or directly to the patient or the surgical instrument.
- the camera position can be adapted more flexibly to the dimensions of the area to be measured.
- the single video camera can be easily moved closer to the area to be surveyed. If a larger area needs to be monitored, the camera will be set up at a greater distance.
- The resolution of the video camera and the correlated measuring accuracy can thus be optimally adapted to the respective measurement situation. Higher measuring accuracy is achieved in smaller areas, and coarser accuracy in larger areas.
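This trade-off follows from simple pinhole-camera geometry: the scene area covered per pixel grows with the camera-to-scene distance. The short calculation below uses assumed example numbers, not figures from the patent.

```python
# Rough pinhole-camera estimate of the measuring resolution per pixel.
# All numbers are assumed example values, not specifications from the patent.
focal_length_px = 800.0      # focal length expressed in pixels
image_width_px = 640         # horizontal resolution of the video camera

for distance_mm in (300.0, 1000.0, 3000.0):           # camera-to-scene distance
    field_of_view_mm = distance_mm * image_width_px / focal_length_px
    mm_per_pixel = field_of_view_mm / image_width_px   # = distance / focal length
    print(f"{distance_mm:6.0f} mm distance -> {field_of_view_mm:6.0f} mm field of view, "
          f"{mm_per_pixel:.2f} mm per pixel")
```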
- FIG. 3 shows the sequence of the method steps for the image processing and tracking process.
- The mono-camera navigation system illustrated in FIG. 1 is set up in an operating room to assist and monitor the operating surgeon during an operation in such a way that on the screen 1 he can see exactly the position of his surgical instrument 2 relative to the site or target area 3 of the surgical procedure, with its healthy and diseased tissue as well as its risk structures.
- The screen 1, together with the processor 4 and the keyboard and computer mouse (not shown), belongs to the electronic data processing system 8.
- a video camera 5 is provided, which is also connected to the electronic data processing system 8.
- Essential to the invention is the use of two markers 6, 7, which together form a unit for detecting the spatial position and rotation in virtual reality environments of a first, spatially movable object with respect to a second, stationary object.
- One marker 6 is mounted in a fixed position on the patient 10, while the second marker 7 is fixed in a specific position on the surgical instrument 2.
- According to FIG. 2, both markers 6, 7 are provided with markings 9 which have a specific color and a certain shape and are arranged in different fixed positions.
- The letters entered in the outlines indicate, by way of example, certain colors, such as g for yellow (German: gelb), R for red, O for orange, and so on. Due to the different colors and contours as well as the different arrangement of the markings, the respective position of the two markers 6, 7 relative to one another and the position and orientation of the surgical instrument 2 relative to the site or target area 3 of the surgical procedure can be detected with only one video camera 5 and displayed on the screen 1.
- This first embodiment involves the use of the camera navigation system in the operation of a patient, wherein the operative target area retains the same position and orientation on the operating table during the entire surgical procedure. It is of great importance to reach the operative target area while at the same time sparing the risk structures such as nerves or vessels.
- The assistance provided to the surgeon during the operation by the mono-camera navigation system consists in optical navigation. If necessary, the surgeon can see exactly on screen 1 at any time where he is with his surgical instrument 2. When the marker 7 attached to the instrument 2 moves with respect to the marker 6 fixedly attached to the patient 10, this is recognized by the video camera 5 on the basis of the different patterns and colors and displayed on the screen 1, so that the surgeon can recognize the position of his instrument 2 relative to the operative target area 3.
- The coordinate systems of the patient (CT/MRI slice images), video camera 5 and instrument 2 are converted into a single coordinate system by a point-based registration. While the surgeon guides the instrument 2, he can simultaneously see the actual position of the instrument 2 he is guiding on the screen 1 in the CT slice images.
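Point-based registration of this kind is typically a rigid landmark registration: corresponding fiducial points are measured in both coordinate systems and the rigid transformation that best aligns them is computed, for example with the SVD-based Kabsch method. The excerpt does not specify the algorithm; the sketch below works under that assumption and uses hypothetical landmark coordinates.

```python
import numpy as np

def rigid_registration(points_src, points_dst):
    """Compute rotation R and translation t with R @ p_src + t ≈ p_dst for
    corresponding landmark points (Nx3 arrays), using the SVD-based Kabsch
    method. Returns (R, t)."""
    src = np.asarray(points_src, dtype=float)
    dst = np.asarray(points_dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)   # centroids
    H = (src - c_src).T @ (dst - c_dst)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical fiducial points: the same anatomical landmarks measured once
# in the camera coordinate system and once in the CT/MRI image coordinates.
camera_pts = [[10, 0, 5], [60, 5, 2], [30, 40, 8], [15, 35, 40]]
ct_pts     = [[112, 50, 30], [160, 60, 25], [128, 95, 33], [115, 88, 65]]
R, t = rigid_registration(camera_pts, ct_pts)
print("rotation:\n", R, "\ntranslation:", t)
```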
- a 2D ultrasound head scans the patient's brain intraoperatively.
- the described invention is combined with the 2D ultrasound system.
- Both the patient 10 and the 2D ultrasound head are provided with markers 6, 7 of different pattern and color. Point-based registration enables the coordinate systems of the patient (MRI slice images), video camera 5 and 2D ultrasound head to be converted into a single coordinate system.
- The position of the 2D ultrasound head is captured at the time each 2D ultrasound image is taken. It is thus possible to reconstruct a 3D ultrasound image from the 2D images.
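This corresponds to tracked freehand 3D ultrasound reconstruction: each 2D scan plane is placed into a common volume using the pose of the ultrasound head recorded by the navigation system. Below is a minimal voxel-insertion sketch; the image spacing, voxel size and pose are assumed example values.

```python
import numpy as np

def insert_slice(volume, slice_img, pose, px_spacing_mm, vox_size_mm):
    """Write one tracked 2D ultrasound slice into a 3D volume.

    volume       : (X, Y, Z) float array, the reconstructed volume
    slice_img    : (H, W) gray-scale ultrasound image
    pose         : 4x4 homogeneous transform, image plane -> volume frame [mm]
    px_spacing_mm: size of one image pixel in mm
    vox_size_mm  : edge length of one voxel in mm
    """
    h, w = slice_img.shape
    # Pixel grid in the image plane (z = 0 in the probe/image frame).
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pts = np.stack([u.ravel() * px_spacing_mm,
                    v.ravel() * px_spacing_mm,
                    np.zeros(u.size),
                    np.ones(u.size)])
    world = pose @ pts                               # transform into volume frame
    idx = np.round(world[:3] / vox_size_mm).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(volume.shape)[:, None]), axis=0)
    volume[idx[0, valid], idx[1, valid], idx[2, valid]] = slice_img.ravel()[valid]

# Example: empty 100 mm^3 volume at 1 mm voxels, one synthetic slice and pose.
vol = np.zeros((100, 100, 100))
img = np.random.rand(60, 80)                         # placeholder US image
pose = np.eye(4); pose[:3, 3] = [10.0, 10.0, 50.0]   # assumed tracked pose
insert_slice(vol, img, pose, px_spacing_mm=0.5, vox_size_mm=1.0)
```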
- The intraoperative 3D ultrasound image is superimposed on the preoperatively acquired MRI slice images. The surgeon thus receives information about the actual, current position of the brain tumor.
- The mono-camera system can be used as a direct 3D input device for various program and operating system interfaces.
- the position information of the tracker is converted into actions and positions of the computer mouse.
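One way to realize this conversion is to map the tracked marker position onto screen coordinates and inject mouse events. The sketch below uses the pyautogui library and a simple linear mapping; both the library choice and the working range are assumptions, since the excerpt does not specify an implementation.

```python
import pyautogui  # third-party library for synthesizing mouse events

SCREEN_W, SCREEN_H = pyautogui.size()

# Assumed working range of the tracked marker in camera coordinates [mm].
X_RANGE = (-200.0, 200.0)
Y_RANGE = (-150.0, 150.0)

def tracker_to_mouse(x_mm, y_mm):
    """Map a tracked marker position to an absolute screen position and move
    the mouse cursor there. The linear mapping is an illustrative assumption,
    not taken from the patent."""
    nx = (x_mm - X_RANGE[0]) / (X_RANGE[1] - X_RANGE[0])
    ny = (y_mm - Y_RANGE[0]) / (Y_RANGE[1] - Y_RANGE[0])
    nx, ny = min(max(nx, 0.0), 1.0), min(max(ny, 0.0), 1.0)   # clamp to screen
    pyautogui.moveTo(int(nx * (SCREEN_W - 1)), int(ny * (SCREEN_H - 1)))

# Example: marker roughly in the middle of the working range.
tracker_to_mouse(10.0, -20.0)
# A gesture or a second marking could additionally trigger pyautogui.click().
```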
- The marker 6 can be attached to the hand, to the head or to various objects in order, for example, to take over control of the computer mouse. Consequently, in 3D scenes or in 3D games, it is also possible to control the virtual camera with, for example, the rotation or movement of the head.
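Driving a virtual camera from the tracked head pose usually means feeding the measured rotation and translation into the view matrix of the 3D scene. A minimal sketch under that assumption (the rigid "camera follows the head" convention is illustrative, not taken from the patent):

```python
import numpy as np

def view_matrix_from_head_pose(R_head, t_head):
    """Build a 4x4 view matrix from a tracked head pose.

    R_head: 3x3 rotation of the head in the tracking (camera) frame
    t_head: 3-vector head position in the tracking frame [mm]
    Real applications would add scaling, filtering and smoothing."""
    view = np.eye(4)
    view[:3, :3] = R_head.T                              # inverse rotation
    view[:3, 3] = -R_head.T @ np.asarray(t_head, dtype=float)
    return view

# Example: head turned 10 degrees to the left and shifted sideways.
angle = np.deg2rad(10.0)
R = np.array([[np.cos(angle), 0, np.sin(angle)],
              [0, 1, 0],
              [-np.sin(angle), 0, np.cos(angle)]])
print(view_matrix_from_head_pose(R, [50.0, 0.0, 600.0]))
```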
- the direct link between real movement of the patient's head 10 and changing the virtual camera position gives the user a vivid and realistic impression of the 3D scene.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Human Computer Interaction (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112010001790T DE112010001790A5 (en) | 2009-04-27 | 2010-04-20 | VIDEO BASED MONO CAMERA NAVIGATION SYSTEM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009019019.8 | 2009-04-27 | ||
DE102009019019A DE102009019019A1 (en) | 2009-04-27 | 2009-04-27 | Video-based mono camera navigation system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010124672A1 true WO2010124672A1 (en) | 2010-11-04 |
Family
ID=42347343
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2010/000440 WO2010124672A1 (en) | 2009-04-27 | 2010-04-20 | Video-based mono-camera navigation system |
Country Status (2)
Country | Link |
---|---|
DE (2) | DE102009019019A1 (en) |
WO (1) | WO2010124672A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103142313B (en) * | 2013-03-19 | 2015-05-13 | 张巍 | Surgical operation tool position-pose real-time detection system based on monocular vision |
DE102020209177A1 (en) | 2020-07-22 | 2022-01-27 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method of determining the position of an object |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6978167B2 (en) | 2002-07-01 | 2005-12-20 | Claron Technology Inc. | Video pose tracking system and method |
- 2009
- 2009-04-27 DE DE102009019019A patent/DE102009019019A1/en not_active Withdrawn
- 2010
- 2010-04-20 WO PCT/DE2010/000440 patent/WO2010124672A1/en active Application Filing
- 2010-04-20 DE DE112010001790T patent/DE112010001790A5/en not_active Ceased
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040138556A1 (en) * | 1991-01-28 | 2004-07-15 | Cosman Eric R. | Optical object tracking system |
US20050201613A1 (en) * | 1998-10-23 | 2005-09-15 | Hassan Mostafavi | Single-camera tracking of an object |
US20030210812A1 (en) * | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
Non-Patent Citations (2)
Title |
---|
KHAN M; ECKE U; MANN WJ.: "The application of an optical navigation system in endonasal sinus surgery", HNO, 2003 |
STANGE T; SCHULTZ-COULON HANS-JÜRGEN: "Clinical experience with an optical navigation system in clinical routine operation", 77TH ANNUAL MEETING OF THE GERMAN SOCIETY OF OTO-RHINO-LARYNGOLOGY, HEAD AND NECK SURGERY, 2006 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10022065B2 (en) | 2014-11-30 | 2018-07-17 | Elbit Systems Ltd. | Model registration system and method |
US10932689B2 (en) | 2014-11-30 | 2021-03-02 | Elbit Systems Ltd. | Model registration system and method |
Also Published As
Publication number | Publication date |
---|---|
DE102009019019A1 (en) | 2010-10-28 |
DE112010001790A5 (en) | 2012-09-20 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10721104; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 112010001790; Country of ref document: DE; Ref document number: 1120100017905; Country of ref document: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 10721104; Country of ref document: EP; Kind code of ref document: A1
| REG | Reference to national code | Ref country code: DE; Ref legal event code: R225; Ref document number: 112010001790; Country of ref document: DE; Effective date: 20120920