DE102006035292B4 - Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system - Google Patents

Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system

Info

Publication number
DE102006035292B4
DE102006035292B4
Authority
DE
Germany
Prior art keywords
reality
display device
position
system
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
DE102006035292A
Other languages
German (de)
Other versions
DE102006035292A1 (en)
Inventor
Rainer Konietschke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deutsches Zentrum fur Luft- und Raumfahrt eV
Original Assignee
Deutsches Zentrum fur Luft- und Raumfahrt eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deutsches Zentrum fur Luft- und Raumfahrt eV filed Critical Deutsches Zentrum fur Luft- und Raumfahrt eV
Priority to DE102006035292A
Publication of DE102006035292A1
Application granted
Publication of DE102006035292B4
Application status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41805 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by assembly
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges, for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11 Instruments, implements or accessories for stereotaxic surgery with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13 Instruments, implements or accessories for stereotaxic surgery with guides for needles or instruments, guided by light, e.g. laser pointers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/18 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A61B18/20 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, using laser
    • A61B2018/2015 Miscellaneous features
    • A61B2018/2025 Miscellaneous features with a pilot laser
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068 Surgical navigation systems using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31048 Project on workpiece, image of finished workpiece, info or a spot

Abstract

A method for transferring position-assigned information from a virtual spatial reality into an actual spatial reality and for displaying that information in the actual spatial reality, in which method the information recorded in the virtual reality is determined relative to a coordinate system assigned to the virtual reality and, for display in the actual reality, a registration is performed in which a transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system (6), so that the position-assigned information to be transferred into the actual reality is displayed true to this transformation by means of a display device that contains a light source (3) emitting a light beam (5), whose position in the actual reality is determined by means of the tracking system (6), characterized in that, from the determined position of the light source (3) and the information assigned to the virtual reality and transformed into the actual reality, the position to be taken by the deflection device provided in the display device for the deflection of the ...

Description

  • The invention relates to a method for transferring position-assigned information from a virtual spatial reality into an actual spatial reality and for displaying this information in the actual spatial reality, in which the information recorded in the virtual reality is determined relative to a coordinate system assigned to the virtual reality and, for display in the actual reality, a registration is performed in which a transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system, so that the information to be transferred into the actual reality can be displayed true to this transformation.
  • The invention also relates to a system for carrying out the method and to uses of this system.
  • One important and exemplary field of application of the invention lies in minimally invasive surgical procedures, so-called "keyhole" operations, in which the surgeon operates with long, rod-shaped medical instruments through artificially created body openings. Before such a minimally invasive operation is actually performed, preoperative planning is required, because the location on the body surface at which the surgeon places the puncture point into the body of the patient has a great influence on the accessibility of the site inside the body at which the operation is to take place. It is therefore important to use preoperative image material, e.g. computed tomography (CT) data, before the operation to investigate where on the body surface one or more puncture points should be placed.
  • The result of the preoperative phase is, in this case, the coordinates of the puncture points relative to the coordinate system in which the preoperative data were recorded.
  • In order for the coordinates of the puncture points then to be drawn onto the body surface of the patient in the operating room, a tracking system is generally used that makes it possible to measure the location of objects in space relative to the coordinate system of the tracking system. To transfer the planning results of the preoperative phase, i.e. from the virtual reality, into the operating theatre, i.e. into the actual reality, a so-called registration must therefore first be performed, in which the transformation matrix is determined that transforms the coordinate system in which the preoperative image material is present into the coordinate system of the tracking system in the operating room.
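As an illustration (not part of the original disclosure): such a registration can be computed, for example, as a rigid alignment of corresponding fiducial points measured once in the preoperative (virtual) coordinate system and once by the tracking system. A minimal sketch in Python/NumPy using the Kabsch/Umeyama method; the function name and the assumption of already-matched point pairs are hypothetical:

```python
import numpy as np

def rigid_registration(P_virtual, P_tracking):
    """Estimate R, t such that P_tracking ~ R @ p + t for corresponding points.

    P_virtual, P_tracking: (N, 3) arrays of corresponding fiducial positions
    in the preoperative (virtual) and tracking coordinate systems.
    Returns a 4x4 homogeneous transformation matrix (the "transformation matrix"
    of the registration step).
    """
    c_v = P_virtual.mean(axis=0)              # centroids of both point sets
    c_t = P_tracking.mean(axis=0)
    H = (P_virtual - c_v).T @ (P_tracking - c_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # rotation, with reflection guard
    t = c_t - R @ c_v                         # translation
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```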
  • In general, these are applications in which the virtual reality is to be matched with the actual reality. A model of the actual reality is therefore present in the form of the virtual reality, in which position information, or more generally position-assigned information, exists that is to be transferred into the actual reality.
  • For this matching between the virtual world and the real world, the following approaches are known.
  • With the help of a tracked pointer, position and orientation data can be transferred from the virtual world to the real world. For this purpose, the position of the tip of the pointer rod is measured and compared with the position to be found in the virtual reality. The actual position of the pointer can be displayed in the virtual reality, and the user can then bring the displayed pointer location into agreement with the target position, which is also shown in the virtual reality. If the "virtual" pointer is at the desired position in the virtual reality, the real pointer is at the corresponding position. Optionally, acoustic feedback would be conceivable. Determining the desired location is in this case performed manually by the user; no automatic determination takes place.
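For comparison, the manual pointer-based approach just described essentially reduces to checking the distance between the tracked pointer tip and the planned target. A minimal sketch, assuming the registration matrix from the previous sketch and hypothetical variable names:

```python
import numpy as np

def pointer_feedback(T_tracking_from_virtual, tip_in_tracking, target_in_virtual):
    """Distance between a tracked pointer tip and a planned target point.

    T_tracking_from_virtual: 4x4 registration matrix (see sketch above).
    tip_in_tracking: 3-vector measured by the tracking system.
    target_in_virtual: 3-vector planned in the virtual coordinate system.
    The returned distance could drive visual or acoustic feedback for the user.
    """
    T_virtual_from_tracking = np.linalg.inv(T_tracking_from_virtual)
    tip_h = np.append(np.asarray(tip_in_tracking, dtype=float), 1.0)
    tip_virtual = (T_virtual_from_tracking @ tip_h)[:3]   # tip in virtual frame
    return float(np.linalg.norm(tip_virtual - np.asarray(target_in_virtual)))
```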
  • An alternative to this is to drive a tracked robot to the positions to be found. The robot could be controlled so that its tip points to the point that is to be transferred from the virtual reality into the actual reality.
  • From US 5,617,857 A, an imaging system is known with a medical instrument that includes a source for radiating detectable energy and an instrument body with a working part. The imaging system also includes a detector for detecting the energy and a processor for determining the location of the medical instrument on the basis of the detected energy. In this known imaging system, a memory is provided on the medical instrument for storing initial information, e.g. the location of the energy-radiating means with respect to the instrument body. A transmitter is provided for transferring the initial information from the memory to the processor upon connection of the medical instrument to the processor. In this way, the processor can configure itself according to the attached instrument, so that the system can follow the location of the instrument body in three-dimensional space after detection of the emitted energy.
  • From US 5,803,089 A, a system is known for monitoring the position of a medical instrument relative to a patient's body and for displaying at least one of a plurality of pre-recorded images of that body in response to the position of the medical instrument. The system includes a reference unit, a remote unit, a positional-characteristic field generator, a field sensor, a position detector and an output display device. The reference unit is immovable with respect to at least a portion of the patient's body, so that it is substantially immobile relative to a target surgical site. The remote unit is attached to the medical instrument.
  • The field generator is connected either to the reference unit or to the remote unit and creates a positional-characteristic field in an area including the target surgical site. The field sensor is connected to the other unit and responds to the presence of the field by creating a sensor output signal corresponding to the detected field. The position detector is connected to the sensor output signal and generates position data corresponding to the position of the remote unit with respect to the reference unit. The output display device communicates with the position detector in order to display at least one of the previously recorded images that corresponds to the position data.
  • This known system also includes a registration unit connected to the memory and to the position data. The memory stores the plurality of previously recorded images of the body. Each previously recorded image corresponds to a planar area within the body, so that the multiplicity of planar areas represented by the recorded images defines a first coordinate system. The registration unit correlates the position data of a second coordinate system, as defined by the position detector, with the plurality of previously recorded images of the first coordinate system and identifies a desired previously recorded image that is associated with the position of the remote unit in relation to the patient's body.
  • US 6,529,758 B2 (EP 0 682 034) describes an image-guided surgical procedure using registration of preoperative spatial scan data acquired prior to surgery, registered with the patient and used to construct a spatial perspective image of the surgical site. Registration of the spatial scan data may be achieved by generating a light pattern on the patient surface by moving a laser beam across the surgical site. The light pattern is captured by at least two cameras from different fixed positions, digitized and used to form a laser reconstruction of the patient surface. This digital surface can then be aligned with the spatial scan data to form an improved spatial perspective image of the surgical site.
  • In US 2004/0254584 A1, a computerized navigation method for hip replacement surgery is described, in which a pelvic plane is determined from at least three recognizable pelvic anatomical features, a tracking system is used to determine the orientation of an acetabulum implant in order to obtain implant orientation data, and the acetabulum implant is adjusted to a desired orientation with respect to the determined pelvic plane by relating the implant orientation data to that pelvic plane. Preferably, the system includes femoral tracking markers securely attached to a thigh of the patient and trackable by the tracking system to detect changes in leg length and thigh offset.
  • US 2002/0016541 A1 describes a method and a system for projecting an image onto a patient, which image is based on markings applied to images previously made of the patient. For example, the image is projected onto the patient by a laser. The position of the patient and the image are recorded in the same frame of reference. The markers on the previously prepared images are then assigned to the corresponding locations on the patient's body, and the projection of the image onto the patient is carried out as a function of these markings.
  • It should be noted that no method or system exists with which a point detected and defined in a virtual spatial reality, or more generally position-assigned information determined therein, can simply and flexibly be displayed in the space of the actual reality. The use of robots is basically possible for applications in which robots are available anyway, but robots are in many cases disadvantageous with regard to speed, pointing range, space requirement, flexibility, cost and accuracy.
  • The object of the present invention is to create a method and a system with which points captured and specified in a virtual spatial reality, or more generally position-assigned information determined therein, can be displayed correctly and thus found again effortlessly in the space of the actual reality, easily and flexibly, without the use of robots, not only with a fixedly mounted display device but also with a display device containing the light source that may be moved arbitrarily and quickly.
  • This object is achieved by the features of claims 1, 5, 21 to 24 and 26.
  • According to the invention, which relates to a method of the type mentioned at the outset, this object is achieved in that the position-assigned information is displayed in the actual reality by means of a display device that contains a light source emitting a light beam and a controllable light-beam deflection device, in that the position of the light source in the actual reality is determined by means of the tracking system, and in that, from the determined position of the light source and the position-assigned information transformed from the virtual reality into the actual reality, the position to be taken by the deflection device is calculated and set so that, after its deflection, the light beam is directed exactly towards the assigned position and, in the case of a movement of the light source, the deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at the assigned position.
  • In an application of the method in the field of minimally invasive medical operations, the virtual spatial reality consists of a preoperative planning based on preoperative image material, which yields as its result puncture points relative to the coordinate system in which the preoperative information was recorded and which forms the coordinate system of the virtual reality. For display in the actual reality of the operating room, the registration is performed, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality of the preoperative phase into the coordinate system of the tracking system provided in the operating room.
  • The puncture points are displayed in the actual reality of the operating room with the aid of the display device containing the light source and the deflection device, the position of the light source in the actual reality of the operating room being determined by means of the local tracking system. From the determined position of the light source and from the puncture points transformed from the virtual reality of the preoperative phase into the actual reality of the operating room, the position to be taken by the deflection device is calculated and controlled so that, after its deflection, the light beam is directed exactly at the puncture point concerned. In the case of a movement of the light source, the controllable deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at this assigned puncture point.
  • With a fixed attachment of the display device in the space of the actual reality, no tracking system is required any more after a single registration in the space of the actual reality.
  • A system for carrying out the specified method that achieves the object according to the present invention is characterized in that a device is provided for storing the position-assigned information recorded in the virtual reality relative to the coordinate system assigned to the virtual reality, and in that a registration device is provided for display in the actual reality, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system, so that the information to be transferred into the actual reality can be displayed true to this transformation. To display the position-assigned information in the actual reality, a display device is provided that contains a light source emitting a light beam and a controllable light-beam deflection device; in the case of a movable, i.e. not fixedly mounted, display device, a tracking system is provided to determine the position of the light source in the actual reality. Furthermore, an evaluation unit is provided that calculates, from the determined position of the light source and the position-assigned information transformed from the virtual reality into the actual reality, the position to be taken by the controllable deflection device, so that, after its deflection, the light beam is directed exactly towards the assigned position and, in the case of a movement of the light source, the controllable deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at the assigned position.
  • In the method and the system according to the invention, therefore, the position of the source of the light beam, which can be deflected by the controllable deflection device, e.g. two adjustable mirrors, is determined using a tracking system. From this position and the position of the desired point to be irradiated, it is calculated how the deflection device, i.e. in the example the two mirrors, is to be adjusted so that the light beam points in the right direction. If the light source is moved, the deflection device is adjusted automatically accordingly, so that the light beam always points to the same location.
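To make this calculation concrete: given the tracked pose of the light source and a target point already transformed into the tracking coordinate system, azimuth and elevation deflection angles can be derived as below. This is an illustrative sketch only; the axis conventions and the mapping of the two angles onto the actual mirror commands are device specific and not specified in the patent text:

```python
import numpy as np

def deflection_angles(T_tracking_from_source, target_in_tracking):
    """Compute azimuth/elevation deflection (alpha, beta) aiming at a target.

    T_tracking_from_source: 4x4 pose of the light source / deflection unit
    as measured by the tracking system.
    target_in_tracking: 3-vector, e.g. a puncture point already transformed
    into the tracking coordinate system.
    Returns angles in radians; the x-forward, y-left, z-up convention is an
    assumption made for this sketch.
    """
    T_source_from_tracking = np.linalg.inv(T_tracking_from_source)
    p = np.append(np.asarray(target_in_tracking, dtype=float), 1.0)
    x, y, z = (T_source_from_tracking @ p)[:3]    # target in the source frame
    alpha = np.arctan2(y, x)                      # azimuth-plane deflection
    beta = np.arctan2(z, np.hypot(x, y))          # elevation-plane deflection
    return alpha, beta
```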
  • The combination of the system components light source, deflection device, evaluation unit and, where applicable, tracking system leads to a new and advantageous system that can automatically "mark" points in the room or provide position-assigned information in a location-dependent manner.
  • The display device can be realized as a handheld device or permanently mounted. In a realization as a handheld device, it can be designed as a wireless or a wired device. With a fixed installation in the space of the actual reality and prior registration of the display device, a tracking system is no longer needed.
  • In the deflection device, the light beam can be deflected by two or more adjustable mirrors, by a pan-tilt unit, or by magnetic or electric fields.
  • In the light source, the light beam can be generated by a laser, a headlight, an infrared source or light-emitting diodes, or else by several of these light-beam generation systems in parallel.
  • The evaluation unit, which calculates the position of the controllable deflection device, e.g. of two mirrors, on the basis of the position of the light source and the desired light-beam direction and which also controls the deflection device, can be realized as an external device or integrated into the display device designed as a handheld device.
  • Several display devices can also be used, so that three-dimensional points can be marked by intersecting light beams or different colours or patterns can be realized.
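As an illustration of how two intersecting beams could mark a three-dimensional point: the marked point can be taken as the midpoint of the shortest segment between the two beam lines. A minimal geometric sketch with hypothetical names, not taken from the patent text:

```python
import numpy as np

def beam_intersection(o1, d1, o2, d2, eps=1e-9):
    """Midpoint of the shortest segment between two beams.

    o1, o2: beam origins (3-vectors); d1, d2: beam direction vectors.
    Returns the point halfway between the closest points of the two
    (possibly skew) lines, usable as the 3D point marked by crossing beams,
    or None if the beams are (nearly) parallel.
    """
    d1 = np.asarray(d1, dtype=float); d1 /= np.linalg.norm(d1)
    d2 = np.asarray(d2, dtype=float); d2 /= np.linalg.norm(d2)
    w = np.asarray(o1, dtype=float) - np.asarray(o2, dtype=float)
    b = d1 @ d2
    denom = 1.0 - b * b
    if abs(denom) < eps:                      # beams (nearly) parallel
        return None
    s = (b * (d2 @ w) - (d1 @ w)) / denom     # parameter along beam 1
    t = ((d2 @ w) - b * (d1 @ w)) / denom     # parameter along beam 2
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```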
  • Advantageous and appropriate developments of the method and of the system of the present invention are specified in the subclaims, which relate directly or indirectly to claim 1 or claim 5.
  • In addition, advantageous and appropriate uses are specified in further claims.
  • The invention provides a number of advantages.
  • As far as the costs are concerned, the method according to the invention is much cheaper compared with other automatic methods, e.g. the use of robots, because the components of the display device with light source and deflection device are available as standard components and a tracking system is needed in both cases.
  • As far as the speed is concerned, a mirror system with a corresponding adjustment speed in the deflection device makes it possible to illuminate several points "at the same time". No manual adjustment is necessary, which makes the method much faster compared to a probe with a pointer.
  • As far as the space requirement is concerned, it is possible to realize the display device in the form of a compact handheld device. As another option, the display device can be fixed externally, outside the actual manipulation area, so that the important area around the centre of the working area is kept free.
  • As far as accuracy is concerned, current mirror systems used in the deflection device achieve extremely high accuracy, so that the overall accuracy of the system depends primarily on the accuracy of the tracking system. Since this can be chosen freely, the best possible accuracy is attainable.
  • As far as the pointing area is concerned, the area that can be covered with the display device of the system according to the invention results from the tracking range of the tracking system and the range of the light beam. It is larger than that of a normal robot and also larger than that of a tracked pointer.
  • In general, the method and system of the present invention are suitable for a variety of applications in which the "virtual world" is to be matched with the "actual world". This means that a model of the actual world exists in which position data or other position-assigned information is present that is to be transferred into the actual world.
  • Another interesting application, besides that for minimally invasive surgical operations with preoperative planning, arises when fast adjustable mirrors are used as the controllable deflection device, with which not only dots but also lines, letters or symbols can be projected onto a surface. In this case, it would also be possible to label a tracked object in a position-dependent manner. Concrete applications would be, for example, in assembly assistance: numbering could be projected directly onto the corresponding components. If, for example, a fitter is to install an alternator in a motor vehicle and thereby screw in five screws 1 to 5 in succession, the display device projects the corresponding numbers 1 to 5 onto the positions at which the respective screws are to be screwed in.
  • Besides the use with minimally invasive surgical operations with preoperative planning and with assembly operations for the targeted and planned display of mounting locations, a possible use is for guidance in museums or exhibitions, so that, in sync with the description of an object, the display device specifically illuminates the part of this object just being discussed.
  • A further use exists in teaching and training, with the display device serving as a jitter-free laser pointer or coupled with a computer presentation program.
  • The system according to the invention can also be used to advantage for stage and show effects for the realization of an interactive laser show that reacts to the position of persons, e.g. the showmaster, as well as in logistic administration. Such a logistic administration can be, for example, that of a library, an archive or the like, the display device being used to find the logistically managed objects, e.g. books in a library. For this purpose, the positions of the objects are stored in the storage device of the virtual reality. The display device, with its suitably deflected light beam, points to the searched object as soon as its position has been communicated to the system.
  • The system of the present invention can also be used to advantage in the field of advertising, the games industry and event management.
  • The present invention and two embodiments of a display device used in the system for minimally invasive operations are explained below with reference to the drawings, in which:
  • 1 shows a schematic view of an exemplary embodiment of a display device of the system according to the invention designed as a handheld device, and
  • 2 likewise shows a schematic view of an exemplary embodiment of a display device of the system according to the invention designed as a permanently mounted device.
  • In the preoperative planning of minimally invasive surgery, the operation is performed with long, rod-shaped instruments through artificial body openings. The place on the body surface at which the surgeon places the puncture point into the patient has a huge influence on the accessibility of the place inside the body where the operation is to take place. It is therefore of great importance to examine, before the operation and on the basis of preoperative image material, at which point a puncture point is best placed.
  • The result of the preoperative planning is the coordinates of the puncture points relative to the coordinate system in which the preoperative information was recorded. In order then to find and mark the coordinates of these points on the body surface during the actual operation in the operating room, a tracking system is used, which may be, for example, of an optical or magnetic nature.
  • The tracking system allows measuring the location of objects in space relative to the coordinate system of the tracking system. To transfer the preoperative planning results (= virtual reality) into the operating theatre (= actual reality), a registration is first carried out, in which the transformation matrix is determined that transforms the coordinate system in which the preoperative image material is present into the coordinate system of the tracking system.
  • With the construction, shown schematically in 1, of a display device intended for the system according to the invention, it is now possible to find the positions of preoperatively planned puncture points, e.g. the puncture point 1, on the body surface 2 of a patient. In this display device, designed as a mobile handheld device, a laser beam source 3 and a controllable deflection device 4 comprising two adjustable mirrors are provided. The location of the source 3 of the laser beam 5, which can be deflected by the two adjustable mirrors of the deflection device 4, is determined with the help of a tracking system 6 mounted in the operating room.
  • From the location of the source 3 and the position of the puncture point 1, known on the basis of the coordinate transformation performed during the registration, it is calculated in an evaluation unit 7 how the two mirrors of the deflection device 4 must be set so that the laser beam 5 points exactly at the puncture point 1. If the laser beam source 3 is moved, the two mirrors in the controllable deflection device 4 are automatically adjusted accordingly, on the basis of the position measurement by means of the tracking system 6 and the calculation of the mirror positions in the evaluation unit 7, so that the laser beam 5 always points to the same place, namely the planned puncture point 1.
  • The deflection angles of the laser beam 5 after passing through the deflection device 4 are α in the azimuth plane and β in the elevation plane. The evaluation unit 7 thus controls the two mirrors of the controllable deflection device according to its calculation result. In this way, the preoperatively planned puncture points, in the example the planned puncture point 1, can be transferred extremely comfortably and precisely onto the body surface 2 of the patient. The evaluation unit 7 can be implemented as an external device or integrated into the display device designed as a handheld device.
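The automatic readjustment described above can be pictured as a simple servo loop that repeatedly queries the tracking system and re-aims the mirrors. A minimal sketch; `tracker`, `deflector` and their methods are hypothetical interfaces, and `compute_angles` stands for a function such as the `deflection_angles` sketch shown earlier:

```python
import time

def follow_target(tracker, deflector, target_in_tracking, compute_angles, period=0.02):
    """Keep the beam aimed at a planned point while the handheld unit moves.

    tracker.get_pose() -> 4x4 pose of the light source in tracking coordinates;
    deflector.set_angles(alpha, beta) -> commands the two mirrors.
    Both interfaces are assumptions for this sketch, not part of the patent.
    """
    while True:
        T_source = tracker.get_pose()                     # current handheld pose
        alpha, beta = compute_angles(T_source, target_in_tracking)
        deflector.set_angles(alpha, beta)                 # re-aim the mirrors
        time.sleep(period)                                # e.g. a 50 Hz update rate
```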
  • Also with the construction, shown schematically in 2, of a display device provided for the system according to the invention, the positions of preoperatively planned puncture points, e.g. the puncture point 1, can be found again on the body surface 2 of a patient. In this display device, which, in contrast to the previously described embodiment, is not movable but permanently mounted in the operating room, a laser beam source 3 and a controllable deflection device 4 having two adjustable mirrors are likewise provided.
  • In this case, the location of the source 3 of the laser beam 5, which can be deflected by the two adjustable mirrors of the deflection device 4, does not have to be determined continuously with the aid of a tracking system installed in the operating room, so that such a system can be dispensed with here. A tracking system is only needed for a prior registration of the position of the laser beam source 3 within the premises of the operating room.
  • From the known location of the source 3 and the position of the puncture point 1, known on the basis of the coordinate transformation performed during the registration, it is again calculated in an evaluation unit 7 how the two mirrors of the deflection device 4 must be set so that the laser beam 5 points exactly at the puncture point 1.
  • The two mirrors in the controllable deflection device 4 are automatically adjusted accordingly on the basis of the calculation of the mirror positions in the evaluation unit 7, so that the laser beam 5 always points to the same place, namely the planned puncture point 1. The deflection angles of the laser beam 5 after passing through the deflection device 4 are α in the azimuth plane and β in the elevation plane. The evaluation unit 7 thus controls the two mirrors of the controllable deflection device according to its calculation result. Also in this embodiment, therefore, the preoperatively planned puncture points, in the example the planned puncture point 1, are transferred exactly and comfortably onto the body surface 2 of the patient.
  • 1
    entry point
    2
    body surface
    3
    Laser beam source, Light beam source
    4
    Deflector
    5
    Laser beam, beam of light
    6
    tracking system
    7
    evaluation
    α
    deflection in the azimuth plane
    β
    deflection in the elevation plane

Claims (26)

  1. A method for transferring position-assigned information from a virtual spatial reality into an actual spatial reality and for displaying that information in the actual spatial reality, in which method the information recorded in the virtual reality is determined relative to a coordinate system assigned to the virtual reality and, for display in the actual reality, a registration is performed in which a transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system (6), so that the position-assigned information to be transferred into the actual reality is displayed true to this transformation by means of a display device that contains a light source (3) emitting a light beam (5), whose position in the actual reality is determined by means of the tracking system (6), characterized in that, from the determined position of the light source (3) and the position-assigned information transformed from the virtual reality into the actual reality, the position to be taken by the controllable light-beam deflection device (4) provided in the display device for deflecting the light beam (5) emitted by the light source (3) is calculated and set so that the light beam (5), after its deflection (α, β) by the controllable light-beam deflection device (4), is directed exactly towards the assigned position and, in the case of a movement of the light source (3), the controllable light-beam deflection device (4) is automatically readjusted in such a way that the deflected light beam (5) is always directed towards the assigned position.
  2. A method according to claim 1, characterized in that, in an application of the method in the field of minimally invasive medical operations, the virtual spatial reality consists of a preoperative planning performed on preoperative image material, which yields as its result puncture points relative to the coordinate system in which the preoperative information is acquired and which forms the coordinate system of the virtual reality, in that, for display in the actual reality of the operating room, the registration is performed in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality of the preoperative phase into the coordinate system of the tracking system (6) provided in the operating room, in that the puncture points (1) are displayed in the actual reality of the operating room with the help of the display device containing the light source (3) and the deflection device (4), in which method the position of the light source (3) in the actual reality of the operating room is determined by means of the local tracking system (6), and in that, from the determined position of the light source (3) and the puncture points transformed from the virtual reality of the preoperative phase into the actual reality of the operating room, the position to be taken by the deflection device (4) is calculated and controlled so that, after its deflection, the light beam (5) is directed exactly at the puncture point concerned and, in the case of a movement of the light source (3), the deflection device (4) is automatically readjusted in such a way that the deflected light beam (5) is always directed at this assigned puncture point.
  3. A method according to claim 1 or 2, characterized in that, with a fixed attachment of the display device in the space of the actual reality, no tracking system (6) is necessary any more after a single registration in the space of the actual reality.
  4. A method according to one of the preceding claims, characterized in that the positions displayed with the light beam (5) are those of one or more points and/or of lines and/or of letters and/or of symbols that are projected via the deflection device (4) onto a surface in the actual spatial reality.
  5. A system for carrying out the method according to one of the preceding claims, characterized in that a device is provided for storing the position-assigned information recorded in the virtual reality relative to the coordinate system assigned to the virtual reality, in that a registration device is provided for display in the actual reality, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system (6), so that the information to be transferred into the actual reality is displayed true to this transformation, in that, to display the position-assigned information in the actual reality, a display device is provided that contains a light source (3) emitting a light beam (5) and a controllable deflection device (4), in that, in the case of a movable, i.e. not fixedly mounted, display device, a tracking system (6) is provided for determining the position of the light source (3) in the actual reality, and in that an evaluation unit (7) is provided that calculates, from the determined position of the light source (3) and the position-assigned information transformed from the virtual reality into the actual reality, the position to be taken by the controllable deflection device (4), so that, after its deflection (α, β), the light beam (5) is directed exactly towards the assigned position and, in the case of a movement of the light source (3), the controllable deflection device (4) is automatically readjusted in such a way that the deflected light beam (5) is always directed towards the assigned position.
  6. System according to claim 5, characterized in that the light source (3) in the display device is a laser beam source.
  7. System according to claim 5, characterized in that the light source (3) in the display device is a headlight.
  8. System according to claim 5, characterized in that the light source (3) in the display device is formed by light-emitting diodes.
  9. System according to claim 5, characterized in that the light source (3) in the display device is an infrared radiation source.
  10. System according to one of claims 6 to 9, characterized in that the light source (3) in the display device is formed by two or more different radiation sources operating in parallel.
  11. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by two or more adjustable mirrors.
  12. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by a pan-tilt unit.
  13. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by a controllable device for generating magnetic or electric fields that influence the deflection of the light beam (5).
  14. System according to claim 6, characterized in that the display device is designed as a handheld device.
  15. System according to claim 6, characterized in that the display device is designed as a device fixedly mounted in the space of the actual reality, whereby, with prior registration by means of the registration device, no tracking system (6) is required in the space of the actual reality.
  16. System according to claim 6, characterized in that the evaluation unit (7) is an external device.
  17. System according to claim 14, characterized in that the evaluation unit (7) is integrated into the display device designed as a handheld device.
  18. System according to claim 14, characterized in that the display device is designed as a wireless handheld device.
  19. System according to claim 14, characterized in that the display device is designed as a wired handheld device.
  20. System according to one of claims 6 to 19, characterized in that a plurality of display devices are provided in the actual spatial reality, by means of which three-dimensional points can be marked by intersecting light beams (5) or different colours or patterns can be realized.
  21. Use of the system designed according to one of claims 6 to 20 for guidance in museums or exhibitions, so that, in sync with the description of an object, the display device specifically illuminates the part of this object just being discussed.
  22. Use of the system designed according to one of claims 6 to 20 in teaching and training, with the display device serving as a jitter-free laser pointer or coupled with a computer presentation program.
  23. Use of the system designed according to one of claims 6 to 20 for stage and show effects for the realization of an interactive laser show that reacts to the position of persons.
  24. Use of the system designed according to one of claims 6 to 20, with its display device, during assembly operations for the targeted and planned display of assembly points.
  25. Use according to claim 24, characterized in that, with the display device, not only points but also lines, letters or symbols are projected onto an assembly work surface in the actual reality.
  26. Use of the system according to one of claims 6 to 20 in a library or an archive, wherein the logistically managed objects are to be found by means of the display device, wherein the positions of the objects are stored in the storage device of the virtual reality, and wherein the display device, with its suitably deflected light beam (5), points to the searched object as soon as its position has been communicated to the system.
DE102006035292A 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system Active DE102006035292B4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102006035292A DE102006035292B4 (en) 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102006035292A DE102006035292B4 (en) 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system

Publications (2)

Publication Number Publication Date
DE102006035292A1 DE102006035292A1 (en) 2008-01-31
DE102006035292B4 true DE102006035292B4 (en) 2010-08-19

Family

ID=38859468

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102006035292A Active DE102006035292B4 (en) 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system

Country Status (1)

Country Link
DE (1) DE102006035292B4 (en)


Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006031580A1 (en) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Method and device for the three-dimensional detection of a spatial area
DE102008039838B4 (en) 2008-08-27 2011-09-22 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for scanning the three-dimensional surface of an object by means of a light beam scanner
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
DE112011100294B4 (en) 2010-01-20 2019-06-13 Faro Technologies Inc. Portable articulated arm coordinate measuring machine with multibus arm technology
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
JP2013539541A (en) * 2010-09-08 2013-10-24 ファロ テクノロジーズ インコーポレーテッド Laser scanner or laser tracking device having a projector
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102012206350A1 (en) * 2012-04-18 2013-10-24 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for operating a robot
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020016541A1 (en) * 1999-09-15 2002-02-07 Glossop Neil David Method and system to facilitate image guided surgery
US20050093889A1 (en) * 2001-03-27 2005-05-05 Frank Sauer Augmented reality guided instrument positioning with guiding graphics
US20020176603A1 (en) * 2001-05-24 2002-11-28 Acoustic Positioning Research Inc. Automatic pan/tilt pointing device, luminaire follow-spot, and 6DOF 3D position/orientation calculation information
US20030120154A1 (en) * 2001-11-28 2003-06-26 Frank Sauer Method and apparatus for ultrasound guidance of needle biopsies
US20050256391A1 (en) * 2004-05-14 2005-11-17 Canon Kabushiki Kaisha Information processing method and apparatus for finding position and orientation of targeted object

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8630314B2 (en) 2010-01-11 2014-01-14 Faro Technologies, Inc. Method and apparatus for synchronizing measurements taken by multiple metrology devices
US8276286B2 (en) 2010-01-20 2012-10-02 Faro Technologies, Inc. Display for coordinate measuring machine
US8537374B2 (en) 2010-01-20 2013-09-17 Faro Technologies, Inc. Coordinate measuring machine having an illuminated probe end and method of operation
US8601702B2 (en) 2010-01-20 2013-12-10 Faro Technologies, Inc. Display for coordinate measuring machine
US8615893B2 (en) 2010-01-20 2013-12-31 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine having integrated software controls
US8284407B2 (en) 2010-01-20 2012-10-09 Faro Technologies, Inc. Coordinate measuring machine having an illuminated probe end and method of operation
US8677643B2 (en) 2010-01-20 2014-03-25 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8763266B2 (en) 2010-01-20 2014-07-01 Faro Technologies, Inc. Coordinate measurement device
US8832954B2 (en) 2010-01-20 2014-09-16 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8875409B2 (en) 2010-01-20 2014-11-04 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8898919B2 (en) 2010-01-20 2014-12-02 Faro Technologies, Inc. Coordinate measurement machine with distance meter used to establish frame of reference
US9009000B2 (en) 2010-01-20 2015-04-14 Faro Technologies, Inc. Method for evaluating mounting stability of articulated arm coordinate measurement machine using inclinometers
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus

Also Published As

Publication number Publication date
DE102006035292A1 (en) 2008-01-31


Legal Events

Date Code Title Description
OP8 Request for examination as to paragraph 44 patent law
8364 No opposition during term of opposition
R084 Declaration of willingness to licence
R085 Willingness to licence withdrawn