DE102006035292A1 - Position associated information transmitting method, involves indicating information in actual reality by indicator, light beam, light source, and beam deflecting unit, where position of light source is determined by tracking system - Google Patents

Position associated information transmitting method, involves indicating information in actual reality by indicator, light beam, light source, and beam deflecting unit, where position of light source is determined by tracking system

Info

Publication number
DE102006035292A1
Authority
DE
Germany
Prior art keywords
display
reality
light source
light
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
DE102006035292A
Other languages
German (de)
Other versions
DE102006035292B4 (en)
Inventor
Rainer Konietschke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deutsches Zentrum fur Luft- und Raumfahrt eV
Original Assignee
Deutsches Zentrum fur Luft- und Raumfahrt eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deutsches Zentrum fur Luft- und Raumfahrt eV filed Critical Deutsches Zentrum fur Luft- und Raumfahrt eV
Priority to DE102006035292A priority Critical patent/DE102006035292B4/en
Publication of DE102006035292A1 publication Critical patent/DE102006035292A1/en
Application granted granted Critical
Publication of DE102006035292B4 publication Critical patent/DE102006035292B4/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41805Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by assembly
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints guided by light, e.g. laser pointers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/18Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A61B18/20Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser
    • A61B2018/2015Miscellaneous features
    • A61B2018/2025Miscellaneous features with a pilot laser
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/31From computer integrated manufacturing till monitoring
    • G05B2219/31048Project on workpiece, image of finished workpiece, info or a spot

Abstract

The method involves indicating the position-associated information in the actual reality by means of a display device comprising a light source (3), a light beam (5), and a controlled light-beam deflecting unit (4). The position of the light source is determined by a tracking system (6). The position to be taken by the deflecting unit is computed from the position of the light source and from the information transformed from the virtual reality into the actual reality, and the deflecting unit is adjusted accordingly, so that the deflected light beam remains directed toward the assigned position. An independent claim is also included for a system for executing the position-associated information transmitting method.

Description

  • The invention relates to a method for transmitting position-associated information from a virtual spatial reality into an actual spatial reality and for displaying this information in the actual spatial reality, wherein information captured in the virtual reality is determined in relation to a coordinate system assigned to the virtual reality, and wherein, for display in the actual reality, a registration is performed in which a transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system, so that the information to be transferred into the actual reality is displayed in accordance with this transformation.
  • The invention also relates to a system for carrying out the method.
  • One important and exemplary field of application of the invention lies in minimally invasive surgical operations, so-called "keyhole" operations, in which surgery is performed with long, rod-shaped medical instruments through artificially created body openings. Before actually performing such a minimally invasive operation, preoperative planning is required, because the place on the body surface at which the surgeon places the puncture point into the body of the patient has great influence on the accessibility of the site within the body at which the operation is to be performed. It is therefore important to examine before the operation, on the basis of preoperative image material such as computed tomography (CT) data, where on the body surface one or more puncture points should best be placed.
  • The result of the preoperative phase in this case is the coordinates of the puncture points relative to the coordinate system in which the preoperative data were recorded.
  • So that the coordinates of the puncture points can then be drawn on the body surface of the patient in the operating room, a tracking system is generally used that makes it possible to measure the location of objects in space relative to the coordinate system of the tracking system. To transfer the planning results of the preoperative phase, i.e. from the virtual reality, into the operating theater, i.e. into the actual reality, a so-called registration must therefore first be performed, in which the transformation matrix is determined that transforms the coordinate system in which the preoperative image material is present into the coordinate system of the tracking system in the operating room.
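The registration described above can be sketched numerically: given a few landmark points measured both in the preoperative (virtual) coordinate system and with the tracking system, a rigid transformation matrix can be estimated. The SVD-based Kabsch method below is one illustrative approach; the patent does not prescribe a particular algorithm, and the function names are assumptions.

```python
import numpy as np

def register(points_virtual, points_actual):
    """Estimate the rigid transformation (rotation R, translation t) that maps
    corresponding landmark points from the virtual (preoperative) coordinate
    system into the tracking-system coordinate system (Kabsch algorithm)."""
    P = np.asarray(points_virtual, dtype=float)   # N x 3, virtual coordinates
    Q = np.asarray(points_actual, dtype=float)    # N x 3, tracking coordinates
    cp, cq = P.mean(axis=0), Q.mean(axis=0)       # centroids
    H = (P - cp).T @ (Q - cq)                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    T = np.eye(4)                                 # homogeneous 4x4 matrix
    T[:3, :3], T[:3, 3] = R, t
    return T

def transform(T, point):
    """Apply a homogeneous transformation matrix to a 3D point."""
    return (T @ np.append(np.asarray(point, dtype=float), 1.0))[:3]
```

With at least three non-collinear landmarks measured in both coordinate systems, `register` yields the transformation matrix, and planned puncture-point coordinates can then be mapped into tracking coordinates with `transform`.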
  • In general, these are thus applications in which the virtual reality is to be matched with the actual reality. In the form of the virtual reality, a model of the actual reality is therefore present in which position information, or more generally position-assigned information, exists that is to be transferred into the actual reality.
  • For matching the virtual world with the real world, the ways described below are known.
  • With the help of a tracked pointer, position and orientation data can be transferred from the virtual world to the real world. For this purpose, the position of the tip of the pointer rod is measured and compared with the position to be found in the virtual reality. The actual position of the pointer can be displayed in the virtual reality, and the user can now bring the location of the pointer into agreement with the target position, which is likewise displayed in the virtual reality. If the "virtual" pointer in the virtual reality is at the desired position, then the real pointer is also at the corresponding position. Optionally, acoustic feedback would be conceivable. Finding the desired location is in this case performed manually by the user, so no automatic determination takes place.
  • An alternative to this is to drive a tracked robot to the positions to be found. The robot could be controlled so that its tip points to the point that is to be transferred from the virtual reality into the actual reality.
  • From US 5,617,857 A, for example, an imaging system is known with a medical instrument that includes a source for radiating detectable energy and an instrument body with a working part. The imaging system also includes a detector for detecting the energy and a processor for determining the location of the medical instrument based on the detected energy. In this known imaging system, a memory for storing initial information, such as the position of the energy-radiating means with respect to the instrument body, is provided on the medical instrument. A transmitter is provided for transferring the initial information from the memory to the processor upon connection of the medical instrument to the processor. In this way, the processor can then configure itself according to the attached instrument, so that the system can follow the location of the instrument body in three-dimensional space after detection of the radiated energy.
  • From US 5,803,089 A, for example, a system is known for monitoring the position of a medical instrument relative to a patient's body and displaying at least one of a plurality of pre-recorded images of that body in response to the position of the medical instrument. The system includes a reference unit, a remote unit, a positional-characteristic field generator, a field sensor, a position detector and an output display device. The reference unit is stationary with respect to at least a portion of the patient's body so that it is substantially immobile relative to a target surgical site. The remote unit is attached to the medical instrument.
  • The field generator is connected either to the reference unit or to the remote unit and creates a positional-characteristic field in an area including the target surgical site. The field sensor is connected to the other unit and responds to the presence of the field to create a sensor output signal corresponding to the detected field. The position detector is connected to the sensor output signal and generates position data corresponding to the position of the remote unit with respect to the reference unit. The output display device communicates with the position detector to display at least one of the previously recorded images in response to the position data.
  • This known system also includes a registration unit connected to the memory and to the position data. The memory stores the plurality of previously recorded images of the body. Each previously recorded image corresponds to a planar area within the body, so that the multiplicity of planar areas represented by the recorded images define a first coordinate system. The registration unit correlates the position data of a second coordinate system, as defined by the position detector, with the plurality of previously recorded images of the first coordinate system and identifies a desired previously recorded image associated with the position of the remote unit in relation to the patient's body.
  • US Pat. No. 6,529,758 B2 (cf. EP 0 682 034) describes an image-guided surgical procedure using registration of preoperative spatial scan data acquired prior to surgery, registered with the patient and used to construct a spatial perspective image of the surgical site. Registration of the spatial scan data may be achieved by generating a light pattern on the patient surface by moving a laser beam across the surgical site. The light pattern is captured by at least two cameras from different fixed positions, digitized and used to form a laser reconstruction of the patient surface. This digital surface can then be aligned with the spatial scan data to form an improved spatial perspective image of the surgical site.
  • In US 2004/0254584 A1, a computerized navigation method for hip replacement surgery is described in which a pelvic plane is determined from at least three recognizable pelvic anatomical features, a tracking system is used to determine the orientation of an acetabulum implant to obtain implant orientation data, and the acetabulum implant is adjusted to a desired orientation with respect to the determined pelvic plane by relating the mentioned implant orientation data to that pelvic plane. Preferably, the system includes femoral tracking markers securely attached to a thigh of the patient and trackable by the tracking system to detect changes in leg length and thigh offset.
  • It should be noted that no method or system yet exists with which a point captured in a virtual spatial reality and specified there, or more generally position-assigned information, can then be displayed simply and flexibly in the space of the actual reality. The use of robots is in principle possible for applications in which robots are available anyway, but with regard to speed, pointing area, space requirements, flexibility, cost and accuracy, robots are disadvantageous in many cases.
  • The object of the present invention is therefore to create a method and a system with which points captured in a virtual spatial reality and specified there, or more generally position-assigned information contained therein, can be displayed simply and flexibly in the space of the actual reality without the use of robots, and can thus easily be found again.
  • According to the invention, which relates to a method of the type mentioned, this object is achieved in that the position-assigned information is displayed in the actual reality by means of a display device containing a light-beam-emitting light source and a controllable light-beam deflection device, that the position of the light source in the actual reality is determined by means of the tracking system, that the position to be taken by the deflection device is calculated from the detected position of the light source and from the position-assigned information transformed from the virtual reality into the actual reality and is adjusted so that after its deflection the beam of light is directed exactly in the direction of the assigned position, and that in the case of a movement of the light source the deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at the assigned position.
  • In an application of the method in the field of minimally invasive medical operations, the virtual spatial reality consists in a preoperative planning based on preoperative image material, whose result is puncture points relative to the coordinate system in which the preoperative information was recorded and which forms the coordinate system of the virtual reality. For display in the actual reality of the operating room, the registration is performed, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality of the preoperative phase into the coordinate system of the tracking system provided in the operating room.
  • The puncture points are displayed in the actual reality of the operating room with the aid of the display device containing the light source and the deflection device, with the position of the light source in the actual reality of the operating room being determined by means of the local tracking system. From the determined position of the light source and from the puncture points transformed from the virtual reality of the preoperative phase into the actual reality of the operating room, the position to be taken by the deflection device is calculated and controlled so that after its deflection the beam of light is directed exactly at the puncture point concerned. In the case of a movement of the light source, the controllable deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at the assigned puncture point.
  • With fixed mounting of the display device in the space of the actual reality, after a single registration no tracking system is required anymore in the space of the actual reality.
  • A system for carrying out the specified method that achieves the stated object according to the present invention is characterized in that a device is provided for storing position-related information captured in the virtual reality relative to the coordinate system assigned to the virtual reality, and that for display in the actual reality a registration device is provided, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into a coordinate system of a tracking system, so that the information to be transferred into the actual reality is displayed in accordance with this transformation. To display the position-related information in the actual reality, a display device is provided which contains a light-beam-emitting light source and a controllable light-beam deflection device; in the case of a mobile, i.e. not fixedly mounted, display device, a tracking system is provided to determine the position of the light source in the actual reality. Furthermore, an evaluation unit is provided which calculates, from the determined position of the light source and from the position-related information transformed from the virtual reality into the actual reality, the position to be taken by the controllable deflection device, so that after its deflection the light beam is directed exactly in the direction of the assigned position, and in the case of a movement of the light source the controllable deflection device is automatically readjusted in such a way that the deflected light beam always remains directed at the assigned position.
  • In the method and system of the present invention, accordingly, the position of the source of the light beam, which can be deflected by the controllable deflection device, e.g. two adjustable mirrors, is determined with the help of a tracking system. From this location and the location of the desired point to be irradiated, it is calculated how the deflection device, i.e. in the example the two mirrors, must be set so that the light beam points in the right direction. If the light source is moved, the deflection device is automatically adjusted accordingly, so that the light beam always points to the same place.
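The calculation described above can be sketched in a simplified geometric model: the azimuth angle α and elevation angle β of the outgoing beam are obtained directly from the vector between the tracked source position and the target point. The function name and the simplification (angles measured in the tracking coordinate system, mirror-offset geometry neglected) are illustrative assumptions, not taken from the patent.

```python
import math

def deflection_angles(source, target):
    """Compute azimuth (alpha) and elevation (beta) angles, in radians, that
    aim a beam from the light-source position at the target position.
    Simplified model: the offset between source and mirrors is neglected."""
    dx = target[0] - source[0]
    dy = target[1] - source[1]
    dz = target[2] - source[2]
    alpha = math.atan2(dy, dx)                  # azimuth in the x-y plane
    beta = math.atan2(dz, math.hypot(dx, dy))   # elevation above that plane
    return alpha, beta
```

A real deflection device would additionally map these beam angles onto the tilt angles of the two individual mirrors, taking their mounting geometry into account.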
  • The combination of the system components light source, deflection device, evaluation unit and, if applicable, tracking system leads to a new and advantageous system that can automatically "mark" points in the room or provide position-assigned information depending on location.
  • The display device can be realized as a handheld device or permanently mounted. In a realization as a handheld device, it can be operated wirelessly or wired. With fixed installation in the space of the actual reality and prior registration of the display device, no tracking system is needed anymore.
  • In the deflection device, the light beam can be deflected by two or more adjustable mirrors, a pan-tilt unit, magnetic fields or electric fields.
  • In the light source, the light beam can be generated by a laser, a headlight, an infrared source or light-emitting diodes, or else by various of these light-beam generation systems in parallel.
  • The evaluation unit, which calculates the position of the controllable deflection device, e.g. of two mirrors, on the basis of the position of the light source and the desired light-beam direction and which also controls the deflection device, can be realized as an external device or integrated into the display device designed as a handheld device.
  • Several display devices can also be used, so that intersecting beams of light can mark three-dimensional points, or different colors or patterns can be realized.
  • Advantageous and expedient developments of the method and system of the present invention are specified in the subclaims, which relate directly or indirectly to claim 1 or claim 5.
  • In addition, advantageous and expedient uses are specified in the claims.
  • The invention provides a number of advantages.
  • As far as costs are concerned, the method according to the invention is much cheaper compared with other automatic methods, e.g. the use of robots, because the components of the display device with light source and deflection device are available as standard components and a tracking system is needed in both cases.
  • As far as speed is concerned, a mirror system with a corresponding adjustment speed in the deflection device makes it possible to illuminate several points virtually "at the same time". No manual adjustment is necessary, which makes the procedure much faster compared to probing with a pointer.
  • As far as space requirements are concerned, it is possible to realize the display device in the form of a compact handheld device. As another option, the display device can be fixedly mounted outside the actual manipulation area, so that the important area around the center of the working area is kept free.
  • As far as accuracy is concerned, current mirror systems used in the deflection device have extremely high accuracy, so that the overall accuracy of the system depends primarily on the accuracy of the tracking system. Since this can be chosen freely, the best possible accuracy is attainable.
  • As far as the pointing area is concerned, the area that can be covered with the display device of the system according to the invention results from the tracking range of the tracking system and the range of the light beam. It is larger than that of a normal robot and also larger than that of a tracked pointer.
  • In general, the method and system of the present invention are suitable for a variety of applications in which the "virtual world" is to be matched with the "actual world". This means that a model of the actual world exists in which position data or other position-assigned information is present that is to be transferred into the actual world.
  • Another interesting application, besides that for minimally invasive surgical operations with preoperative planning, arises when quickly adjustable mirrors are used as the controllable deflection device, with which not only points but also lines, letters or symbols can be projected onto a surface. In this case it would be possible to label a tracked object depending on location. Concrete applications can be found e.g. in assembly assistance: numbering could be projected directly onto the corresponding components. If, for example, a fitter installs an alternator in a motor vehicle and at the same time has to tighten five screws 1 to 5 one after the other, the display device then projects the corresponding numbers 1 to 5 onto the positions where the respective screws are to be screwed in.
  • Besides use with minimally invasive surgical operations with preoperative planning and with assembly operations for the targeted and planned display of mounting locations, a possible use exists for guides in museums or exhibitions, so that, in sync with the description of an object, the display device specifically illuminates the part of that object just being discussed.
  • A further use exists in teaching and training, with the display device serving as a jitter-free laser pointer or coupled with a computer presentation program.
  • The system according to the invention can also be used advantageously for stage and show effects, for the realization of an interactive laser show that reacts to the position of persons, e.g. the showmaster, as well as in logistics administration. Such a logistics administration can be e.g. that of a library, an archive or the like, with the display device being used to find the managed logistical objects, e.g. books in a library. For this purpose, the positions of the objects are stored in the virtual-reality storage device. The display device, with its suitably deflected light beam, points to the searched object as soon as its position has been communicated to the system.
  • The system of the present invention can also be used to advantage in the field of advertising, the games industry and event management.
  • The present invention and two embodiments of a display device used in the system for minimally invasive operations are explained below with reference to drawings, in which:
  • 1 a schematic view of an exemplary embodiment of a display device of the system according to the invention designed as a handheld device, and
  • 2 likewise a schematic view of an exemplary embodiment of a display device of the system according to the invention designed as a permanently mounted device.
  • In the preoperative planning of minimally invasive surgery, operations are performed with long, rod-shaped instruments through artificial body openings. The place on the body surface at which the surgeon places the puncture point into the patient has huge influence on the accessibility of the place inside the body where the operation is to be performed. It is therefore of great importance to examine before the operation, on the basis of preoperative image material, at which point a puncture point is best placed.
  • The result of the preoperative planning is the coordinates of the puncture points relative to the coordinate system in which the preoperative information was recorded. In order then, during the actual operation in the operating room, to find the coordinates of the points on the body surface and draw them in, a tracking system is used, which may be e.g. of optical or magnetic nature.
  • The tracking system allows measuring the location of objects in space relative to the coordinate system of the tracking system. To transfer the preoperative planning results (= virtual reality) into the operating theater (= actual reality), a registration is first performed, in which the transformation matrix is determined that transforms the coordinate system in which the preoperative image material is present into the coordinate system of the tracking system.
  • With the construction, shown schematically in 1 , of a display device intended for the system according to the invention, it is now possible to find the positions of preoperatively planned puncture points, e.g. the puncture point 1 , on the body surface 2 of a patient. In this display device, designed as a mobile handheld device, a laser beam source 3 and a controllable deflection device 4 comprising two adjustable mirrors are provided. The position of the source 3 of the laser beam 5 , which can be deflected by the two adjustable mirrors of the deflection device 4 , is determined with the help of a tracking system 6 mounted in the operating room.
  • From the location of the source 3 and the position of the puncture point 1 , known on the basis of the coordinate transformation during the registration, it is calculated in an evaluation unit 7 how the two mirrors of the deflection device 4 must be set so that the laser beam 5 points exactly at the puncture point 1 . If the laser beam source 3 is moved, the two mirrors in the controllable deflection device 4 are automatically adjusted accordingly, on the basis of the position measurement by means of the tracking system 6 and the calculation of the mirror positions in the evaluation unit 7 , so that the laser beam 5 always points to the same place, namely the planned puncture point 1 .
  • The deflection angles of the laser beam 5 after passing through the deflection device 4 are α in the azimuth plane and β in the elevation plane. The evaluation unit 7 thus controls the two mirrors of the controllable deflection device according to its calculation result. In this way, the preoperatively planned puncture points, in the example the planned puncture point 1 , can be transmitted extremely comfortably and precisely onto the body surface 2 of the patient. The evaluation unit 7 can be implemented as an external device or integrated into the display device designed as a handheld device.
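The automatic readjustment upon movement of the handheld device can be sketched as a simple control loop. Here `read_source_position` and `set_mirrors` are hypothetical callbacks standing in for the tracking system 6 and the mirror controller of the deflection device 4 ; the flat-angle model is a simplification, since a real evaluation unit would account for the mirror mounting geometry.

```python
import math

def aiming_loop(read_source_position, set_mirrors, target, iterations):
    """Closed-loop readjustment sketch: each cycle reads the current
    light-source position from the tracking system and re-commands the two
    mirrors so the deflected beam stays on the planned puncture point."""
    history = []
    for _ in range(iterations):
        sx, sy, sz = read_source_position()          # pose from tracking system
        dx, dy, dz = target[0] - sx, target[1] - sy, target[2] - sz
        alpha = math.atan2(dy, dx)                   # azimuth-plane deflection
        beta = math.atan2(dz, math.hypot(dx, dy))    # elevation-plane deflection
        set_mirrors(alpha, beta)                     # command deflection device
        history.append((alpha, beta))
    return history
```

In the permanently mounted variant, `read_source_position` would simply return the position stored during the one-time registration, and the loop degenerates to a single aiming computation per target.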
  • Also with the construction, shown schematically in 2 , of a display device intended for the system according to the invention, the positions of preoperatively planned puncture points, e.g. the puncture point 1 , on the body surface 2 of a patient can again be found. In this display device, which in contrast to the previously described embodiment is not movable but permanently mounted in the operating room, a laser beam source 3 and a controllable deflection device 4 having adjustable mirrors are likewise provided.
  • The location of the source 3 of the laser beam 5 passing through the two adjustable mirrors of the deflector 4 In this case, it is not always necessary to determine with the aid of a tracking system installed in the operating room so that it can be saved in this case. It is only for a prior registration of the position of the laser beam source 3 in the premises of the operating room.
  • From the known location of the source 3 and the position of the puncture point known on the basis of the coordinate transformation during the registration 1 is also here in an evaluation unit 7 calculated as the two mirrors of the deflector 4 must be made so that the laser beam 5 exactly on the puncture point 1 shows.
  • The two mirrors in the controllable deflection device 4 are calculated on the basis of the calculation of the mirror positions in the evaluation unit 7 automatically adjusted accordingly, so that the laser beam 5 always in the same place, namely the planned puncture point 1 shows. The deflection angle of the laser beam 5 after passing through the deflector 4 are α in the azimuth plane and β in the elevation plane. The evaluation unit 7 thus controls the two mirrors of the controllable deflection according to their calculation result. Also in this embodiment, therefore, the preoperatively planned puncture points, in the example of the planned puncture point 1 , exactly and comfortably on the body surface 2 transmitted to the patient.
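The registration step underlying both embodiments — transforming a preoperatively planned point from the coordinate system of the virtual reality into the coordinate system of the operating room — can be sketched with a homogeneous transformation matrix. This is an illustrative sketch under assumed conventions (a 4×4 matrix, points as column vectors); how the matrix itself is obtained during registration is not shown here.

```python
import numpy as np

def register_point(T_virtual_to_actual, p_virtual):
    """Map a preoperatively planned point from the virtual-reality frame
    into the operating-room (tracking system) frame using the 4x4
    homogeneous transformation matrix determined during registration.

    Illustrative sketch only; the registration procedure that produces
    the matrix is outside the scope of this example."""
    p = np.append(np.asarray(p_virtual, dtype=float), 1.0)  # homogeneous coords
    return (T_virtual_to_actual @ p)[:3]

# Example: a registration that is a pure translation of 10 mm along x.
T = np.eye(4)
T[0, 3] = 10.0
p_actual = register_point(T, (1.0, 2.0, 3.0))  # -> array([11., 2., 3.])
```

The transformed point would then serve as the `target` for the mirror-angle calculation in the evaluation unit 7, in both the handheld and the fixed-mount embodiment.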
  • 1
    puncture point
    2
    body surface
    3
    laser beam source (light beam source)
    4
    deflection device
    5
    laser beam (light beam)
    6
    tracking system
    7
    evaluation unit
    α
    deflection angle in the azimuth plane
    β
    deflection angle in the elevation plane

Claims (31)

  1. A method of transferring position-related information from a virtual spatial reality into an actual spatial reality and of displaying that information in the actual spatial reality, wherein the information captured in the virtual reality is determined relative to a coordinate system assigned to the virtual reality, and wherein, for the display in the actual reality, a registration is performed in which a transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into the coordinate system of a tracking system, so that the information to be transferred into the actual reality is displayed true to position, characterized in that the position-related information is displayed in the actual reality by means of a display device which contains a light source (3) emitting a light beam (5) and a controllable light beam deflection device (4), in that the position of the light source in the actual reality is determined by means of the tracking system (6), in that the setting to be adopted by the controllable deflection device is calculated and adjusted from the determined position of the light source and from the position-related information transformed from the virtual reality into the actual reality, so that after its deflection (α, β) the light beam is directed exactly at the assigned position, and in that, if the light source is moved, the controllable deflection device is automatically readjusted in such a way that the deflected light beam is always directed at the assigned position.
  2. A method according to claim 1, characterized in that, when the method is applied in the field of minimally invasive medical operations, the virtual spatial reality consists of preoperative planning carried out on the basis of preoperative images, the result of which is puncture points defined relative to the coordinate system in which the preoperative information is available and which forms the coordinate system of the virtual reality, in that a registration is performed in the actual reality of the operating room in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality of the preoperative phase into the coordinate system of the tracking system (6) provided in the operating room, in that the puncture points (1) are found in the actual reality of the operating room with the aid of the light source (3) and the deflection device (4), wherein the position of the light source in the actual reality of the operating room is determined by means of the local tracking system, in that the setting of the deflection device is calculated from the detected position of the light source and from the puncture points transformed from the virtual reality of the preoperative phase into the actual reality of the operating room, so that after its deflection the light beam (5) is directed exactly at the relevant puncture point, and in that, if the light source is moved, the deflection device is automatically readjusted in such a way that the deflected light beam is always directed at this assigned puncture point.
  3. A method according to claim 1 or 2, characterized in that, with the display device fixedly mounted in the space of the actual reality, no tracking system is needed any longer after a single registration in the space of the actual reality.
  4. A method according to one of the preceding claims, characterized in that the information displayed with the light beam (5) consists of one or more points and/or lines and/or letters and/or symbols that are projected via the deflection device onto a surface in the actual spatial reality.
  5. A system for carrying out the method according to one of the preceding claims, characterized in that a device is provided for storing the position-related information captured in the virtual reality relative to the coordinate system assigned to the virtual reality, in that a registration device is provided for the display in the actual reality, in which the transformation matrix is determined that transforms the coordinate system assigned to the virtual reality into the coordinate system of a tracking system (6), so that the information to be transferred into the actual reality is displayed true to position, in that, for displaying the position-related information in the actual reality, a display device is provided which contains a light source (3) emitting a light beam (5) and a controllable light beam deflection device (4), in that, in the case of a movable, i.e. not fixedly mounted, display device, a tracking system is provided in the actual reality for determining the position of the light source (3), and in that an evaluation unit (7) is provided which calculates, from the determined position of the light source and from the position-related information transformed from the virtual reality into the actual reality, the setting to be adopted by the controllable deflection device, so that after its deflection (α, β) the light beam is directed exactly at the assigned position and, if the light source is moved, the controllable deflection device is automatically readjusted in such a way that the deflected light beam is always directed at the assigned position.
  6. System according to claim 5, characterized in that the light source (3) in the display device is a laser beam source.
  7. System according to claim 5, characterized in that the light source in the display device is a headlight.
  8. System according to claim 5, characterized in that the light source in the display device is formed by LEDs.
  9. System according to claim 5, characterized in that the light source in the display device is an infrared radiation source.
  10. System according to one of claims 6 to 9, characterized in that the light source in the display device is formed by two or more different types of radiation sources operating in parallel.
  11. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by two or more adjustable mirrors.
  12. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by a pan-tilt unit.
  13. System according to claim 5, characterized in that the controllable deflection device (4) of the light beam (5) in the display device is formed by a controllable device for generating magnetic or electric fields that influence the deflection of the light beam.
  14. System according to claim 6, characterized in that the display device is designed as a handheld device.
  15. System according to claim 6, characterized in that the display device is designed as a device fixedly mounted in the space of the actual reality, in which case, after prior registration by means of the registration device, no tracking system is required in the space of the actual reality.
  16. System according to claim 6, characterized in that the evaluation unit (7) is an external device.
  17. System according to claims 6 and 14, characterized in that the evaluation unit (7) is integrated into the display device designed as a handheld unit.
  18. System according to claim 14, characterized in that the display device is designed as a wireless handheld unit.
  19. System according to claim 14, characterized in that the display device is designed as a wired handheld unit.
  20. System according to one of claims 6 to 19, characterized in that several display devices are provided in the actual spatial reality, by whose intersecting light beams three-dimensional points can be marked or different colors or patterns can be realized.
  21. Use of a system designed according to one of claims 6 to 20 in minimally invasive surgical operations with preoperative planning.
  22. Use of a system designed according to one of claims 6 to 20 for guided tours in museums or exhibitions, so that, in sync with the description of an object, the display device specifically illuminates the part of that object currently being discussed.
  23. Use of a system designed according to one of claims 6 to 20 in teaching and training, with the display device serving as a jitter-free laser pointer or coupled with a computer presentation program.
  24. Use of a system designed according to one of claims 6 to 20 for stage and show effects for realizing an interactive laser show that responds to the position of persons, e.g. the presenter.
  25. Use of a system designed according to one of claims 6 to 20, with the display device being used during assembly operations for the targeted and planned display of assembly points.
  26. Use according to claim 25, characterized in that with the display device not only points but also lines, letters or symbols are projected onto an assembly work surface in the actual reality.
  27. Use of a system designed according to one of claims 6 to 20 in logistical administration.
  28. Use according to claim 27, characterized in that the logistical administration is that of a library, an archive or the like, wherein the objects to be managed logistically, e.g. books in a library, are found by means of the display device, in that the positions of the objects are stored in the storage device of the virtual reality and the display device points with its suitably deflected light beam at the object sought as soon as its position has been communicated to the system.
  29. Use of a system designed according to one of claims 6 to 20 in the field of advertising.
  30. Use of a system designed according to one of claims 6 to 20 in the field of the games industry.
  31. Use of a system designed according to one of claims 6 to 20 in the field of event management.
DE102006035292A 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system Active DE102006035292B4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102006035292A DE102006035292B4 (en) 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system


Publications (2)

Publication Number Publication Date
DE102006035292A1 true DE102006035292A1 (en) 2008-01-31
DE102006035292B4 DE102006035292B4 (en) 2010-08-19

Family

ID=38859468

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102006035292A Active DE102006035292B4 (en) 2006-07-26 2006-07-26 Method and system for transferring position-related information from a virtual to an actual reality and for displaying this information in the actual reality and use of such a system

Country Status (1)

Country Link
DE (1) DE102006035292B4 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008039838A1 (en) 2008-08-27 2010-03-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Measuring object's three dimensional surface scanning method, involves executing scanning movement of laser light beam by beam deflecting unit, and utilizing laser light beam for measuring and displaying data of scanning points
WO2012033892A1 (en) * 2010-09-08 2012-03-15 Faro Technologies, Inc. A laser scanner or laser tracker having a projector
US8533967B2 (en) 2010-01-20 2013-09-17 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
WO2013156468A1 (en) * 2012-04-18 2013-10-24 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for operating a robot
US8942940B2 (en) 2010-01-20 2015-01-27 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine and integrated electronic data processing system
US9074883B2 (en) 2009-03-25 2015-07-07 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
USRE45854E1 (en) 2006-07-03 2016-01-19 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
US9329271B2 (en) 2010-05-10 2016-05-03 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9372265B2 (en) 2012-10-05 2016-06-21 Faro Technologies, Inc. Intermediate two-dimensional scanning with a three-dimensional scanner to speed registration
US9417056B2 (en) 2012-01-25 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9417316B2 (en) 2009-11-20 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US10175037B2 (en) 2015-12-27 2019-01-08 Faro Technologies, Inc. 3-D measuring device with battery pack
US10281259B2 (en) 2010-01-20 2019-05-07 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8630314B2 (en) 2010-01-11 2014-01-14 Faro Technologies, Inc. Method and apparatus for synchronizing measurements taken by multiple metrology devices
US8898919B2 (en) 2010-01-20 2014-12-02 Faro Technologies, Inc. Coordinate measurement machine with distance meter used to establish frame of reference
US8615893B2 (en) 2010-01-20 2013-12-31 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine having integrated software controls
US8284407B2 (en) 2010-01-20 2012-10-09 Faro Technologies, Inc. Coordinate measuring machine having an illuminated probe end and method of operation
US8832954B2 (en) 2010-01-20 2014-09-16 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8875409B2 (en) 2010-01-20 2014-11-04 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8677643B2 (en) 2010-01-20 2014-03-25 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020016541A1 (en) * 1999-09-15 2002-02-07 Glossop Neil David Method and system to facilitate image guided surgery
US20020176603A1 (en) * 2001-05-24 2002-11-28 Acoustic Positioning Research Inc. Automatic pan/tilt pointing device, luminaire follow-spot, and 6DOF 3D position/orientation calculation information
US20030120154A1 (en) * 2001-11-28 2003-06-26 Frank Sauer Method and apparatus for ultrasound guidance of needle biopsies
US20050093889A1 (en) * 2001-03-27 2005-05-05 Frank Sauer Augmented reality guided instrument positioning with guiding graphics
US20050256391A1 (en) * 2004-05-14 2005-11-17 Canon Kabushiki Kaisha Information processing method and apparatus for finding position and orientation of targeted object


Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE45854E1 (en) 2006-07-03 2016-01-19 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
DE102008039838A1 (en) 2008-08-27 2010-03-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Measuring object's three dimensional surface scanning method, involves executing scanning movement of laser light beam by beam deflecting unit, and utilizing laser light beam for measuring and displaying data of scanning points
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
US9074883B2 (en) 2009-03-25 2015-07-07 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9417316B2 (en) 2009-11-20 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US8942940B2 (en) 2010-01-20 2015-01-27 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine and integrated electronic data processing system
US10060722B2 (en) 2010-01-20 2018-08-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US8683709B2 (en) 2010-01-20 2014-04-01 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with multi-bus arm technology
US8638446B2 (en) 2010-01-20 2014-01-28 Faro Technologies, Inc. Laser scanner or laser tracker having a projector
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US8533967B2 (en) 2010-01-20 2013-09-17 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US10281259B2 (en) 2010-01-20 2019-05-07 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9329271B2 (en) 2010-05-10 2016-05-03 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9684078B2 (en) 2010-05-10 2017-06-20 Faro Technologies, Inc. Method for optically scanning and measuring an environment
WO2012033892A1 (en) * 2010-09-08 2012-03-15 Faro Technologies, Inc. A laser scanner or laser tracker having a projector
GB2501390B (en) * 2010-09-08 2014-08-06 Faro Tech Inc A laser scanner or laser tracker having a projector
GB2501390A (en) * 2010-09-08 2013-10-23 Faro Tech Inc A laser scanner or laser tracker having a projector
CN103003713A (en) * 2010-09-08 2013-03-27 法罗技术股份有限公司 A laser scanner or laser tracker having a projector
CN103003713B (en) * 2010-09-08 2015-04-01 法罗技术股份有限公司 A laser scanner or laser tracker having a projector
US9417056B2 (en) 2012-01-25 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
WO2013156468A1 (en) * 2012-04-18 2013-10-24 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for operating a robot
US9372265B2 (en) 2012-10-05 2016-06-21 Faro Technologies, Inc. Intermediate two-dimensional scanning with a three-dimensional scanner to speed registration
US9739886B2 (en) 2012-10-05 2017-08-22 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US9746559B2 (en) 2012-10-05 2017-08-29 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
US9618620B2 (en) 2012-10-05 2017-04-11 Faro Technologies, Inc. Using depth-camera images to speed registration of three-dimensional scans
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US10203413B2 (en) 2012-10-05 2019-02-12 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US10739458B2 (en) 2012-10-05 2020-08-11 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
US10175037B2 (en) 2015-12-27 2019-01-08 Faro Technologies, Inc. 3-D measuring device with battery pack

Also Published As

Publication number Publication date
DE102006035292B4 (en) 2010-08-19

Similar Documents

Publication Publication Date Title
US20170333138A1 (en) System and method for verifying calibration of a surgical device
US9901409B2 (en) System and methods for intraoperative guidance feedback
US10639204B2 (en) Surgical component navigation systems and methods
US10499996B2 (en) Methods and systems for computer-aided surgery using intra-operative video acquired by a free moving camera
CN104519822B (en) Soft tissue cutting apparatus and application method
JP2019534717A (en) System for sensory enhancement in medical procedures
EP2637593B1 (en) Visualization of anatomical data by augmented reality
Zheng et al. Computer-assisted orthopedic surgery: current state and future perspective
US6925339B2 (en) Implant registration device for surgical navigation system
DE10202091B4 (en) Device for determining a coordinate transformation
EP3032456B1 (en) System and method for optical position measurement and guidance of a rigid or semi-flexible tool to a target
EP0553246B1 (en) Surgical probe locating system for head use
US6402762B2 (en) System for translation of electromagnetic and optical localization systems
US5769861A (en) Method and devices for localizing an instrument
EP1913333B1 (en) System and method for detecting drifts in calibrated tracking systems
DE10215808B4 (en) Registration procedure for navigational procedures
US5957844A (en) Apparatus and method for visualizing ultrasonic images
DE60028582T2 (en) Method and device for facilitating image-controlled surgery
US6069932A (en) Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
EP1127545B1 (en) Procedure for locating objects in radiotherapy
US7217276B2 (en) Instrument guidance method and system for image guided surgery
DE19747427C2 (en) Device for bone segment navigation
ES2216789T3 (en) System for orientation assisted by navigation of elements on a body.
US6081336A (en) Microscope calibrator
DE60312210T2 (en) Method for determining the mechanism points

Legal Events

Date Code Title Description
OP8 Request for examination as to paragraph 44 patent law
8364 No opposition during term of opposition
R084 Declaration of willingness to licence
R085 Willingness to licence withdrawn