WO2014057352A1 - Measuring sensor and method for measuring a surface - Google Patents
Measuring sensor and method for measuring a surface
- Publication number
- WO2014057352A1 (PCT/IB2013/002621)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- probe
- measuring
- finger
- probe body
- measured
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B21/00—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
- G01B21/02—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
- G01B21/04—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
- G01B21/047—Accessories, e.g. for positioning, for tool-setting, for measuring probes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4824—Touch or pain perception evaluation
- A61B5/4827—Touch or pain perception evaluation assessing touch sensitivity, e.g. for evaluation of pain threshold
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
- A61B5/6826—Finger
Definitions
- the invention relates to a probe for measuring a surface with a probe body and a fastening device for fastening the probe body to a finger and a method for measuring a surface.
- Mobile 3D measuring technology plays a major role in industrial metrology. For example, the shape of arbitrary objects such as bent tubes or car body molds is measured with probes. During the measurement, the measuring person guides the probe body of the probe, for example, along a trajectory on the surface of the material to be measured. The surface of the material to be measured can also be touched point by point.
- Such measuring systems are also used to detect the position and spatial orientation of objects, such as pairs of flanges, absolutely or relative to each other. In surgery, too, surfaces are scanned with rigid styli whose position and orientation are measured with surgical navigation systems.
- the invention is based on the object of facilitating the palpation of the relevant points of an object surface.
- Haptic perception is understood to mean the active feeling of size, contours, surface texture, weight, etc. of an object through integration of the senses of the skin and sensitivity to depth.
- the terminal tactile pad of a finger, i.e. the fingertip
- the totality of haptic perceptions allows the brain to localize and evaluate, for example, mechanical stimuli. Changing stimuli, such as those produced by scanning a surface, are perceived particularly well haptically.
- the probe body is a ball or a tip.
- the tip may also protrude from a spherical body or it may be formed as a cone or pyramid. Depending on the application, different probe body shapes are advantageous.
- the probe body has a curved guide.
- a curved guide may be a short curved piece of wire or a curved surface.
- the bend may describe a concave arc or the curved guide may also be roof-shaped or formed as an angled surface.
- the probe body is particularly small.
- the probe body must also be rigidly connectable to the fastening device, and the probe body should make it possible to be pressed into the fingertip, preferably in such a way that the area of the fingertip surrounding the probe body can touch the surface that also contacts the probe body. It has been found to be advantageous if the probe body is between 1 and 10 mm and preferably between 2 and 5 mm long. This length is the diameter of a spherical probe body and, for other probe body shapes, the maximum extension between two surface points of the body.
- a simple embodiment provides that the fastening device has a finger ring.
- a finger ring can be easily slipped over the finger and thus allows attachment of the probe body and the entire probe on the finger.
- the ring can be pushed over the fingertip between the first and second limb of a finger or surround the finger in the area of the fingertip.
- the fastening device becomes individually adaptable if it has a plastically deformable sleeve.
- a plastically deformable sleeve can also be shaped like a ring.
- the plastic deformability makes it possible to adapt the sleeve to a special finger shape and to attach it.
- the material of the sleeve is to be chosen such that, on the one hand, it is plastically deformable for adaptation and, on the other hand, during use of the fastening device the sleeve is so rigid that the probe body is firmly connected to the finger.
- the sleeve may also be U-shaped in cross-section, or O-shaped with an open slot, so that the sleeve can easily be bent, preferably elastically, to pull it over the finger.
- the fastening device comprises a wire which holds the probe body.
- a thin, stiff wire or several thin, stiff pieces of wire make it possible, on the one hand, to hold the probe body securely and, on the other hand, not to hinder the scanning of the surface with the finger.
- the wire should have a cross-section such as a tube cross-section or a T-shape to allow high stability. Highly stable plastics can also be used.
- a simple embodiment provides that the probe body is rigidly connected to a measuring arm. Via the measuring arm, the position of the probe can be determined without a camera system being necessary.
- the probe has a camera system that is aligned with the marker area. This increases the mobility of the probe.
- the object underlying the invention is achieved by a method for measuring a surface, in which a probe body is moved with a finger on a surface in such a way that the finger touches the surface while the probe body touches the surface.
- the position of the probe is determined and stored. Individual points on the surface of an object can be measured. In particular for the measurement of roughness or shapes, it is proposed that the position of the probe body be continuously measured. For measuring, it is provided that the position of the probe is measured in space. That is, there is a spatial coordinate system in relation to which the position of the probe is determined in the measurement.
- the position can be measured relative to the position of the surface. While several measurement points define the position of the surface, individual points serve to determine the deviation from a surface shape. It is advantageous if the target surface of an object to be measured is known and the method is used to determine a deviation from this target surface. In this way, bumps in a piece of pipe can be determined, or elevations on a bone that differ from the usual bone shape. During the measurement, a surface shape can first be determined by measuring a multiplicity of points and interpolating between their locations in order to determine a basic shape. A more accurate measurement can subsequently determine a deviation from this basic shape, such as surface irregularities, roughness, or macroscopic elevations on a previously determined target shape.
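The two-stage evaluation described above, fitting a basic shape through many measured points and then reporting deviations from it, can be sketched as follows. This is a minimal illustration using a least-squares plane as the basic shape; the function names and the choice of a plane are assumptions for the example, not part of the disclosure.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a cloud of measured 3D points.

    Returns (centroid, unit_normal). The plane stands in for the
    'basic shape' interpolated from a multiplicity of points.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector belonging to the
    # smallest singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def deviations(points, centroid, normal):
    """Signed distance of each measured point from the fitted plane.

    Large values flag bumps, elevations, or other departures from
    the basic shape.
    """
    return (np.asarray(points, dtype=float) - centroid) @ normal
```

For example, fitting the plane to points probed on a nominally flat patch and then evaluating `deviations` for later measurements separates the basic shape from local irregularities.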
- a measuring system be activated after the finger has touched the surface. This is preferably realized by a switch or voice control. Depending on the haptic perception, the measuring system can thus be switched on or off.
- a measuring arm is fixed with one end in space; by determining the movement of the measuring arm, the position of the other end of the arm can be inferred.
- the probe can also have a probe body and optical recognition features for the position and orientation determination.
- FIG. 1 schematically shows the device with optical camera system, the probe with an area for haptic perception and the material to be measured
- FIG. 2 schematically shows a possible embodiment of the probe with an area for the haptic perception
- Figure 3 schematically shows another possible embodiment of the probe with a marker region
- FIG. 4 schematically shows two possible embodiments of the scanning range of the device
- Figure 5 schematically shows two possible designs for guides in the sensing range of the device
- FIG. 6 shows schematically the cross-section of the sensing area of the device with a ball as a rigid probe body
- FIG. 7 schematically shows the longitudinal section of the scanning region of the device with a ball as a rigid probe body
- FIG. 8 schematically shows the cross-section of the sensing area of the device with a hemisphere and a cone as probe body
- FIG. 9 schematically shows the device with optical camera system, the probe with a further form of the rigid probe body, the marker region and the probe body,
- FIG. 10 shows schematically the device with optical camera system, the probe with rigid probe body, marker region and probe body and another marker with pattern rigidly connected to the object,
- FIG. 11 schematically shows the device with optical camera system, the probe with a rigid stylus body, marker region and probe body, and another marker with pattern rigidly connected to the table,
- Figure 12 schematically shows the device with an optical camera system rigidly connected to the object,
- FIG. 13 schematically shows the device with an optical camera system rigidly connected to the probe, the probe with a rigid stylus body, marker region and probe body, and a marker with pattern rigidly connected to the object, and
- Figure 14 schematically a suitable device for the measurement of hard to reach places of a workpiece.
- Figure 1 shows schematically the arrangement of all components of a device based on optical methods.
- the measured material 200 such as an engine block or a bent pipe.
- the measuring person 300 holds a probe 501 with a rigid shield 500, on which optically visible markings are mounted.
- the spatial positions and spatial orientations of these markings are measured by at least one camera 400 with image sensor and the viewing area 410 using methods known in the art.
- the cameras can be fixed to a wall, ceiling, rack or tripod with a fixture 440 by magnet, screw, bayonet or click-lock.
- the orientation of the cameras to each other is calibrated by means of a reference body 600 with optical markings. Appropriate calibration methods are known in the art.
- the rigid shield 500 is connected by means of the rigid connection 532 with the ring 531, to which the rigid ball 530 is attached.
- the probe 501 is held by means of the ring 531 on the finger 311 of the hand 310 of the measuring person.
- the measuring person guides the ball 530 of the probe 501 along the surface of the object 200 such that the tactile pad 312 of the fingertip 311 contacts both the surface of the object 200 and the ball 530.
- the marks 510 on the shield can be passive. They can also be actively lit, such as with LED or OLED.
- the spatial position and spatial orientation of the markings 510 on the shield 500, measured by the cameras, are determined in the camera coordinate system. Since the shield 500 is connected to the rigid ball 530 via the rigid connection 532 and the ring 531, the position and orientation of the ball 530 are also known for each measurement.
- the sphere can be made of quartz glass, of steel, or of any other suitable material that allows the surface of the material to be measured to be sensed clearly.
- the ball has a diameter of 2 to 5 millimeters. This makes it possible for the measuring person to simultaneously perceive both the ball 530 and the surface of the object 200 haptically.
- the geometry of the material to be measured 200 can thus be detected by the measuring person guiding her finger with the probe gently over the material to be measured at a comfortable speed, for example 50 to 150 mm/s.
- the contact line or the contact point is detected by the camera system at any time via the ball contact with the material to be measured.
- other fingers such as the thumb 313 are used to scan the material 200 to be measured and guide the ball 530.
- the sphere geometry is taken into account when determining the surface geometry of the material to be measured.
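Taking the sphere geometry into account means offsetting the tracked ball center by the ball radius toward the surface, since the camera system measures the center, not the contact point. A minimal sketch of this compensation; the availability of a local surface normal is an assumption of the example (it is not specified how the normal is obtained in the text):

```python
import numpy as np

def contact_point(ball_center, surface_normal, radius=0.0025):
    """Compensate the ball radius when computing the surface point.

    The contact point lies one ball radius away from the tracked
    center, along the surface normal pointing from the surface
    toward the ball. radius defaults to 2.5 mm (a 5 mm ball), in m.
    """
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)  # normalize in case the input is not unit length
    return np.asarray(ball_center, dtype=float) - radius * n
```

In practice the normal can be estimated from neighboring measured points, e.g. from a locally fitted plane.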
- the scanning of the material to be measured makes it possible that visual contact between the measuring person 300 and the object 200 is not required during the measurement. Scanning is more intuitive thanks to tactile perception. For example, during the scan, the measuring person may view a monitor with the analyzed measurement data. The measuring person can thus judge the quality of the measurement data herself and decide whether certain object locations need to be measured again.
- the scanning preferably produces not only single spot measurements but entire surface regions. It is the task of the analysis algorithm to calculate the surface of the object 200 from the measured points of contact. If the ball 530 always remains in contact with the object 200, a continuous measurement can also be made, which detects lines in space lying on the surface of the object.
- FIG. 3 schematically shows a further device with the probe 1, which consists of the stylus body 2, the marker region 3 and the probe body 13.
- the stylus body 2 is attached by means of the fastening device 5 to a finger 7 of the hand 6 of the measuring person.
- the attachment between the stylus body 2 of the fastening device 5 and the finger 7 need not be rigid and should allow only a slight movement of the probe body 13.
- the touch area 4 is located in the probe body 13.
- the probe body 13 may be, for example, a ball or a tip.
- the rigid marker region 3 and the rigid probe body 13 are rigidly connected to the rigid stylus body 2.
- the probe 1 is a substantially rigid object.
- the probe 1 is attached to the hand 6 in such a way that, firstly, the stylus body 2 is guided along a finger 7, secondly, the tactile pad 10 of the fingertip of this finger lies at the touch area 4 and, thirdly, the marker region 3 lies at the back of the hand, wrist or forearm of the measuring person.
- the finger 7 may be, for example, the index finger, the middle finger or another finger.
- the measuring person can wear gloves during the measurement. This must be taken into account in the design of the touch area. For example, the surgeon must wear gloves in the operating room. Or the measuring person in the production line must protect himself with gloves against sharp edges of the object to be measured or against oily surfaces.
- the fastening device 5 may comprise an elastic tube.
- the attachment may also have an easily deformable, plastically deformable sleeve which, for attaching the probe 1 to the finger, can easily be compressed and then connects firmly to the finger.
- other options for attachment are easily handled clamps or straps.
- the attachment should be easy to remove after use.
- the attachment of the probe to the finger is temporary.
- the attachment should be comfortable for the measuring person. The attachment ensures that the probe 1 does not slip off the finger 7.
- the optically contrasting pattern 11 is applied to the surface of the marker region 3.
- This can be realized, for example, by means of laser engraving.
- the surface of the marker region to which the pattern is applied is preferably flat.
- the pattern has no gloss on the surface of the marker region, i.e. the surface is matte.
- the marker region 3 can consist of one or more such flat surfaces on which the high-contrast patterns 11 are applied.
- Each pattern 11 on a flat area of the marker area contains areas at least for the pattern identification and for the determination of the position and orientation of the pattern in the coordinate system of the camera system.
- Other pattern areas may include redundancy to account for effects of, for example, a soiled pattern in determining identification, location, and orientation.
- Each pattern is assigned its own coordinate system.
- Two or more patterns 11 may be arranged on the marker area 3 such that at least one pattern is measurable by the camera system 12.
- the camera system measures the optically contrasting patterns 11 and uses them to determine the position and orientation of the pattern coordinate system with respect to the coordinate system of the camera system.
- the position and orientation of the probe in the coordinate system of the pattern on the marker region is known either through the manufacturing process or through calibration. This calibration is state of the art. Thus, the position and orientation of the probe in the coordinate system of the camera system is known from the measured position and orientation of the pattern. In the arrangement in Figure 3, the camera system and the object must not be moved during the measurement.
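The transformation chain described here, in which the camera measures the pattern pose and the probe tip is known in pattern coordinates from calibration, can be sketched with homogeneous 4x4 transforms. The function and variable names are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix
    and a translation vector (pose of one frame in another)."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(rotation, dtype=float)
    T[:3, 3] = np.asarray(translation, dtype=float)
    return T

def tip_in_camera(T_cam_pattern, tip_in_pattern):
    """Map the probe tip, known in pattern coordinates through
    calibration, into the camera coordinate system using the
    pattern pose measured by the camera system."""
    p = np.append(np.asarray(tip_in_pattern, dtype=float), 1.0)
    return (T_cam_pattern @ p)[:3]
```

Each camera measurement thus yields one tip position in camera coordinates; the pattern pose itself would come from a marker-tracking step such as a planar pose estimation.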
- in surgery, the objects 8 are, for example, the surfaces of acetabular cups, in particular the edge of the acetabulum, other joints such as the knee, ankle or shoulder joint, or generally bones or hard structures.
- the objects 8 may be any molded parts or bearings in gears.
- the touch area 4 is designed such that the tactile pad 10 of the fingertip, when located in the touch area 4 of the probe 1, simultaneously comes into contact with the object surface 9 and with the probe body 13. For example, by means of haptic perception, the measuring person ensures that the probe body 13 is located on the object surface 9 at the desired position and/or that the probe body 13 touches the object surface 9.
- the measuring person can move the probe body 13 to the desired surface position. It may also be helpful to use the other fingers that are not connected to the probe for haptic perception.
- the measuring person can activate the camera system, parts of the measuring system or the entire measuring system by means of activation modules such as switches or voice control for the measurement.
- the shape and size of the probe body 13 depend, for example, on the haptic perception of the measuring person, possibly wearing gloves, or on the object surface 9. Balls that are too small can impair haptic perception with respect to the object surface. Balls that are too large cover too much of the touch area and can likewise impair haptic perception.
- FIGS. 5a and 5b schematically show guide aids for the probe.
- the touch area around the rigid probe body 13 can be configured with a rigid guide aid 20 (FIG. 5a). This substantially facilitates the guidance of the probe body 13, for example along the edge of the object surface 9.
- the guide aid 20 may be attached to the stylus body 2.
- the touch area around the rigid probe body 13 can be configured with a guide aid 21 made of flexible and/or plastically deformable material (FIG. 5b). This substantially facilitates the guidance of the probe body 13 along the object surface 9.
- the shape of this guide aid substantially supports the displacement of the probe body along the object surface.
- the guide aid, for example in the form of a saddle, can be helpful.
- the tactile pad 10 is shown with the fingernail 50 in Figures 5a and 5b.
- FIG. 6 shows schematically the cross-section of the sensing area 4 of the device with a ball 13 as a probe body.
- the tactile pad 10 of the fingertip touches the ball 13 and, at the points 24, the object surface 9 of the object 8.
- FIG. 7 shows schematically the longitudinal section of the scanning region 4 of the device with a ball 13 on the rigid stylus body 2.
- the tactile pad 10 of the fingertip touches the ball 13 and, at the points 25, the object surface 9 of the object 8.
- FIG. 8 shows schematically the cross-section of the sensing area of the device with a hemisphere 26 and a cone 27 as a probe body.
- the tactile pad 10 of the fingertip feels the hemisphere 26 and the object surface 9, while the tip of the cone 27 slides over the object surface 9.
- the attachment and probe must be sterile.
- the probe consists of materials suitable for multiple sterilization in an autoclave without compromising the probe geometry or the optical properties of the probe.
- the probe can be completely passive. It can be designed as a lightweight measuring instrument. For example, its weight is less than 50g.
- Measuring probes can be manufactured with different patterns, marker areas, stylus body lengths, stylus configurations or different materials.
- the probe can be made of a piece of metal or plastic.
- the patterns are applied to the marker region, for example by means of laser engraving.
- the surface to be scanned need not be visible to the measuring person. Thanks to the haptic perception in the touch area, the positioning of the probe on the object is made much easier. This ensures, for example, that the probe reliably touches the object surface. By moving the touch area along the object surface, the measuring person can reliably reach the desired positioning of the probe body.
- the camera system can be part of a tracking system or a navigation system. It can be used stationary or mobile, be light and portable, have its own power supply and / or communicate wirelessly.
- the camera system can carry out the measurements continuously. As soon as probes are measured in the measuring range of the camera system, their position and orientation are transmitted to the user software. This calculates the positions and orientations of the probe in the camera coordinate system. Thus, the contact points of the probe body on the object surface are calculated with methods known in the art.
- the camera system measures the optically contrasting patterns of the probe and thus the position and orientation of the probe. Thanks to the haptic perception, the measuring person assesses the position of the probe with respect to the object surface. In general, only measurement data relating to the object surface should be acquired with the camera system. Guided by the haptic perception, the measuring person can activate the camera system as needed for measurement. The activation can be carried out, for example, by a foot switch or by a switch or push button integrated in the camera system. The activation of the camera system triggers a single measurement, which records the position and orientation of the pattern and thus the probing result. Activation of the camera system may, if necessary, trigger a series of individual measurements. In this case, the measuring person can guide the measuring probe along the object surface in a targeted manner by means of haptic perception and thus acquire measurement data.
- FIG. 9 schematically shows another form of the rigid stylus body 2.
- the stylus body 2 is arranged laterally along the finger 7.
- other arrangements of the stylus body are possible, such as along both sides of the finger, above the finger over the fingernail, or on the underside of the finger along the tactile pad.
- the stylus bodies can be made of components of simple geometry or anatomically shaped to provide a high wearing comfort.
- FIG. 10 schematically shows the device with optical camera system 12, the probe 1 with a rigid stylus body 2, marker region 3 with pattern 11 and probe body 13, and a further marker 3a rigidly connected to the object 8 and having the pattern 11a.
- the identifications of the patterns 11 and 11a are different.
- the measured coordinates of the probe body 13 can be calculated in the coordinate system of the second pattern 11a.
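Expressing the measured tip coordinates in the coordinate system of the object-fixed marker removes the influence of camera motion: with both marker poses measured in the camera frame, the tip is mapped through the inverse of the object-marker pose. A sketch under the assumption that poses are given as 4x4 homogeneous matrices (names are illustrative):

```python
import numpy as np

def point_in_marker_frame(T_cam_marker, point_in_cam):
    """Transform a point measured in camera coordinates into the
    coordinate system of an object-fixed marker, so that camera and
    object may move freely relative to each other during measurement."""
    p = np.append(np.asarray(point_in_cam, dtype=float), 1.0)
    # Inverting the marker pose maps camera coordinates to marker coordinates.
    return (np.linalg.inv(T_cam_marker) @ p)[:3]
```

All stored surface points then share the marker's coordinate system, regardless of how the handheld camera moved between measurements.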
- the camera system can be moved freely with respect to the two markers.
- the camera system 12 can be held by the measuring person with the other hand in order to carry out measurements if necessary. Both camera system 12 and object 8 are allowed to move relative to each other.
- FIG. 11 schematically shows the device with optical camera system, the probe 1 with a rigid stylus body 2, marker region 3 with pattern 11 and probe body 13, and a further marker 3a rigidly connected to the table 30.
- the object 8 must not be moved during the measurement with respect to the table marker 3a.
- the camera system 12 can be moved freely with respect to the two markers 3 and 3a.
- FIG. 12 schematically shows a device in which the rigid object 8 is rigidly connected to an optical camera system 40.
- the probe 1 has a rigid stylus body 2, a marker region 3 with pattern 11 and a probe body 13.
- the camera system is preferably lightweight and portable. It has its own power supply and communicates wirelessly.
- FIG. 13 schematically shows the device with an optical camera system 50 rigidly connected to the probe 1, the probe 1 with a rigid stylus body 2 and a probe body 13, and a marker 3a rigidly connected to the rigid object 8 with a pattern 11a.
- the camera system and the probe are, if necessary, rigidly connected to each other by means of the adapter 51.
- the lightweight, portable camera system has its own power supply and communicates wirelessly. For the surface measurement, the probe is preferably guided with one hand along the surface, while the camera system is held with the other hand as needed and/or activated for measurement by means of switches or buttons.
- FIG. 14 schematically illustrates a possible device for measurements at hard-to-reach locations of a workpiece 1200.
- the probe ball 530 is connected to the shield 500 by means of the connecting piece 1201.
- the pattern 510 on the shield can be aligned with the camera system as needed.
- the center of the ball 530 lies on the axis of rotation 1010. The correct position of the probe ball on the workpiece surface is sensed with the finger 311.
- the use of this haptic measuring device is possible in various areas: in quality control, in assembly, in mold and plant construction, in sheet metal working, in welding technology, and many more. Especially in processes where the highest ergonomics and a high throughput of measurements are required, the measurement process can be integrated into the respective work step.
- a particularly preferred application uses two probes simultaneously, which are measured individually by the measuring system. Both hands are equipped with a measuring probe. This allows objects to be measured very efficiently in an intuitive way. The optimal working radius is about one meter. The operator can grasp the object ergonomically and move it properly with both hands, in order to deposit it at a specific location or to inspect the object after processing. At the same time, the object is located and measured.
- the object can be examined for roughness by means of suitable probe balls, with the operator rubbing the finger gently over the material to be measured. Based on the change in position of the ball and the depth variation perpendicular to the measurement plane, the roughness is determined.
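The roughness evaluation from the ball's depth variation perpendicular to the measurement plane corresponds to standard amplitude parameters of surface metrology such as Ra and Rq. The formulas below are the standard definitions, not taken from the text, and the sampling of the height profile along the scanned line is assumed:

```python
import numpy as np

def roughness(heights):
    """Ra (arithmetic mean) and Rq (root mean square) roughness of a
    height profile sampled perpendicular to the measurement plane,
    after removing the mean line."""
    z = np.asarray(heights, dtype=float)
    z = z - z.mean()  # deviations from the mean line
    ra = float(np.mean(np.abs(z)))
    rq = float(np.sqrt(np.mean(z ** 2)))
    return ra, rq
```

A real implementation would additionally filter out the form and waviness of the basic shape before computing the amplitude parameters.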
- Haptic scanning according to the invention can be combined with Augmented Reality.
- a target shape is known.
- material is removed from an object by chiseling.
- the measuring system immediately records the current shape by scanning. This can be communicated to the surgeon via an optical system without, for example, constantly having to use another measuring instrument such as a surface scanner.
- a measuring arm known to the experts can also be used.
- the probe is attached directly to the measuring arm by means of a suitable adapter.
- the shield 500 shown schematically in FIG. 2 with the optical markings 510 or the marker 3 with the pattern 11 in FIG. 3 can thus be dispensed with.
- the measurements of the probe take place in the coordinate system of the measuring arm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112013004917.1T DE112013004917A5 (en) | 2012-10-08 | 2013-10-08 | Probe and method for measuring a surface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012019620 | 2012-10-08 | ||
DE102012019620.2 | 2012-10-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014057352A1 true WO2014057352A1 (en) | 2014-04-17 |
Family
ID=49949987
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2013/002621 WO2014057352A1 (en) | 2012-10-08 | 2013-10-08 | Measuring sensor and method for measuring a surface |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE112013004917A5 (en) |
WO (1) | WO2014057352A1 (en) |
2013
- 2013-10-08 DE DE112013004917.1T patent/DE112013004917A5/en not_active Withdrawn
- 2013-10-08 WO PCT/IB2013/002621 patent/WO2014057352A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040046734A1 (en) * | 2002-09-25 | 2004-03-11 | Hart Timothy O. | Thumb-retained stylus |
US20050199250A1 (en) | 2004-03-11 | 2005-09-15 | Green John M.Ii | System for determining a position of a point on an object |
DE102005010335A1 (en) | 2004-03-11 | 2005-09-22 | Howmedica Osteonics Corp. | System for determining the position of a point on an object |
US7742802B2 (en) | 2004-03-11 | 2010-06-22 | Howmedica Osteonics Corp. | System for determining a position of a point on an object |
US20090183297A1 (en) * | 2007-12-09 | 2009-07-23 | Lonnie Drosihn | Hand Covering With Tactility Features |
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11865008B2 (en) | 2010-12-17 | 2024-01-09 | Intellijoint Surgical Inc. | Method and system for determining a relative position of a tool |
US11826113B2 (en) | 2013-03-15 | 2023-11-28 | Intellijoint Surgical Inc. | Systems and methods to compute a subluxation between two bones |
US11839436B2 (en) | 2013-03-15 | 2023-12-12 | Intellijoint Surgical Inc. | Methods and kit for a navigated procedure |
US11153549B2 (en) | 2014-12-30 | 2021-10-19 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery |
US11050990B2 (en) | 2014-12-30 | 2021-06-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with cameras and 3D scanners |
US11272151B2 (en) | 2014-12-30 | 2022-03-08 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with display of structures at risk for lesion or damage by penetrating instruments or devices |
US11750788B1 (en) | 2014-12-30 | 2023-09-05 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic display of images and tracked instruments |
US11652971B2 (en) | 2014-12-30 | 2023-05-16 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
US10511822B2 (en) | 2014-12-30 | 2019-12-17 | Onpoint Medical, Inc. | Augmented reality visualization and guidance for spinal procedures |
US10594998B1 (en) | 2014-12-30 | 2020-03-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and surface representations |
US10602114B2 (en) | 2014-12-30 | 2020-03-24 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures using stereoscopic optical see-through head mounted displays and inertial measurement units |
US11483532B2 (en) | 2014-12-30 | 2022-10-25 | Onpoint Medical, Inc. | Augmented reality guidance system for spinal surgery using inertial measurement units |
US10742949B2 (en) | 2014-12-30 | 2020-08-11 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and tracking of instruments and devices |
US11350072B1 (en) | 2014-12-30 | 2022-05-31 | Onpoint Medical, Inc. | Augmented reality guidance for bone removal and osteotomies in spinal surgery including deformity correction |
US10326975B2 (en) | 2014-12-30 | 2019-06-18 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
US10841556B2 (en) | 2014-12-30 | 2020-11-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with display of virtual surgical guides |
US10194131B2 (en) | 2014-12-30 | 2019-01-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
US10951872B2 (en) | 2014-12-30 | 2021-03-16 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with real time visualization of tracked instruments |
US10292768B2 (en) | 2016-03-12 | 2019-05-21 | Philipp K. Lang | Augmented reality guidance for articular procedures |
US10603113B2 (en) | 2016-03-12 | 2020-03-31 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
US11311341B2 (en) | 2016-03-12 | 2022-04-26 | Philipp K. Lang | Augmented reality guided fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
US10278777B1 (en) | 2016-03-12 | 2019-05-07 | Philipp K. Lang | Augmented reality visualization for guiding bone cuts including robotics |
US10799296B2 (en) | 2016-03-12 | 2020-10-13 | Philipp K. Lang | Augmented reality system configured for coordinate correction or re-registration responsive to spinal movement for spinal procedures, including intraoperative imaging, CT scan or robotics |
US10849693B2 (en) | 2016-03-12 | 2020-12-01 | Philipp K. Lang | Systems for augmented reality guidance for bone resections including robotics |
US11013560B2 (en) | 2016-03-12 | 2021-05-25 | Philipp K. Lang | Systems for augmented reality guidance for pinning, drilling, reaming, milling, bone cuts or bone resections including robotics |
US10743939B1 (en) | 2016-03-12 | 2020-08-18 | Philipp K. Lang | Systems for augmented reality visualization for bone cuts and bone resections including robotics |
US11452568B2 (en) | 2016-03-12 | 2022-09-27 | Philipp K. Lang | Augmented reality display for fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
US10368947B2 (en) | 2016-03-12 | 2019-08-06 | Philipp K. Lang | Augmented reality guidance systems for superimposing virtual implant components onto the physical joint of a patient |
US11172990B2 (en) | 2016-03-12 | 2021-11-16 | Philipp K. Lang | Systems for augmented reality guidance for aligning physical tools and instruments for arthroplasty component placement, including robotics |
US11602395B2 (en) | 2016-03-12 | 2023-03-14 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
US10405927B1 (en) | 2016-03-12 | 2019-09-10 | Philipp K. Lang | Augmented reality visualization for guiding physical surgical tools and instruments including robotics |
US11850003B2 (en) | 2016-03-12 | 2023-12-26 | Philipp K Lang | Augmented reality system for monitoring size and laterality of physical implants during surgery and for billing and invoicing |
US11751944B2 (en) | 2017-01-16 | 2023-09-12 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
DE102017118717B4 (en) * | 2017-08-16 | 2019-10-31 | Carl Zeiss Industrielle Messtechnik Gmbh | Device for determining a 3D position of an object in a measuring volume |
DE102017118717A1 (en) | 2017-08-16 | 2019-02-21 | Carl Zeiss Industrielle Messtechnik Gmbh | Device for determining a 3D position of an object in a measuring volume |
US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
US11727581B2 (en) | 2018-01-29 | 2023-08-15 | Philipp K. Lang | Augmented reality guidance for dental procedures |
US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
US11957420B2 (en) | 2023-11-15 | 2024-04-16 | Philipp K. Lang | Augmented reality display for spinal rod placement related applications |
Also Published As
Publication number | Publication date |
---|---|
DE112013004917A5 (en) | 2015-06-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014057352A1 (en) | Measuring sensor and method for measuring a surface | |
Helsen et al. | Temporal and spatial coupling of point of gaze and hand movements in aiming | |
WO2009130169A1 (en) | Measuring method for an articulated-arm coordinate measuring machine | |
DE102005052786B3 (en) | Chassis for a mobile X-ray diagnostic facility | |
CN108602147A (en) | The method for manufacturing medical instrument | |
DE102005026654A1 (en) | Device for contactless measurement of body geometry, spatial position, orientation measures marker position relative to pattern(s) on body with optical navigation system, measures body position/orientation in space using marker position | |
EP0763708A3 (en) | Coordinate measuring machine | |
DE102008034237A1 (en) | Positioning system for use in magnetic stimulation system for transcranial magnetic stimulation, has robot controlled by control device such that robot moves coil arrangement to determined position of indicator | |
DE102005010335A1 (en) | System for determining the position of a point on an object | |
DE102016118616B4 (en) | Measuring device for an optical measuring system | |
Haggard et al. | On the hand transport component of prehensile movements | |
EP3314203B1 (en) | Adapter element for assembling a rotational apparatus in the measurement space of a coordinate measuring machine | |
DE19816272A1 (en) | Method and arrangement for measuring structures of an object | |
DE102012202990A1 (en) | Apparatus and method for measuring and detecting body part and limb mobilities | |
Gehrmann et al. | Variability of precision pinch movements caused by carpal tunnel syndrome | |
DE202014101900U1 (en) | Manipulator for spatial orientation of a miniature roughness meter | |
DE102017003641A1 (en) | Method for measuring coordinates or properties of a workpiece surface | |
Touvet et al. | Grasp: combined contribution of object properties and task constraints on hand and finger posture | |
DE102006050687B4 (en) | Method for measuring and / or correcting the shape of the workpiece after forming by bending | |
DE102010054973B4 (en) | Method and measuring system for measuring an object to be measured | |
DE102019103973B4 (en) | robot | |
DE102005029002B4 (en) | Method and device for the contact measurement of a force | |
DE102019113799B4 (en) | Measuring system and method for measuring a measuring object | |
DE102006004514A1 (en) | Anatomical measuring points and curvature determining device for analyzing posture and upper surface shape of human body, has two scanning arms, where one of scanning arms is equipped with three-dimensional position sensor | |
DE10203002B4 (en) | Device for calibrating a robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13819068 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120130049171 Country of ref document: DE Ref document number: 112013004917 Country of ref document: DE |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: R225 Ref document number: 112013004917 Country of ref document: DE Effective date: 20150618 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13819068 Country of ref document: EP Kind code of ref document: A1 |