EP1749248A1 - Mobile tracking unit - Google Patents
Mobile tracking unit
- Publication number
- EP1749248A1 (application EP05750870A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- tracking unit
- augmented reality
- reality device
- articulated arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/409—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
Definitions
- the invention relates to a mobile tracking unit, which has a camera with a receiver for electromagnetic radiation.
- the camera is designed to detect electromagnetic radiation emitted or reflected by an object at a detection angle.
- the camera can generate a camera image signal representing an image of the object.
- the tracking unit also has a camera position sensor which is designed to generate a camera position signal corresponding to a detection direction of the camera and to output it on the output side.
- the tracking unit also has at least one image reproduction unit with an input for an image signal.
- the tracking unit also has an augmented reality device which is connected on the input side to the camera position sensor and on the output side to the image display unit.
- the augmented reality device is designed to generate an auxiliary information image signal representing auxiliary information as a function of the camera position signal from the camera image signal and to send this to the image display unit.
- the augmented reality system comprises a mobile device for the context-dependent insertion of assembly instructions.
- the context-dependent display of assembly instructions with process-optimized specification of necessary work steps results in support for work sequences that is appropriate to the situation.
- the augmented reality system of WO 00/52539 proposes data glasses which, in addition to a display device arranged in the area of the spectacle lenses, have an image capture device in the form of a camera.
- Head-mounted displays have the disadvantage that they restrict the user with regard to freedom of movement and viewing angle when a device, in particular a motor vehicle, is to be repaired.
- Another problem is the calibration of an augmented reality system with a head-mounted display.
- the invention is therefore based on the object of specifying an augmented reality system which is simple to operate and which can also be easily calibrated. This object is achieved by the tracking unit according to the invention with an augmented reality system of the type mentioned at the beginning.
- the mobile tracking unit has an articulated arm.
- the articulated arm attaches to an articulated arm base with a base joint.
- the articulated arm base is connected to the tracking unit and the articulated arm has a free articulated arm end, the camera being arranged in the region of the free end.
- the articulated arm has at least one central joint between the articulated arm base and the free end.
- the position sensor is operatively connected to the base joint of the articulated arm base and to the center joint and is designed to detect a base joint position and a center joint position and to generate the camera position signal as a function of the base joint position and the center joint position.
- the camera position can be detected precisely, so that the augmented reality device can generate an auxiliary information image signal corresponding to the detection direction.
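The passage above implies a forward-kinematics computation: the camera position follows from the base joint position and the center joint position. The following is a minimal sketch, assuming a two-link planar arm on a rotating base; the function name, link lengths and angle conventions are illustrative assumptions, not taken from the patent.

```python
import math

def camera_position(base_yaw, base_pitch, mid_pitch, l1=0.5, l2=0.4):
    """Planar two-link forward kinematics on a rotating base (illustrative).

    base_yaw   -- rotation of the articulated-arm base about the vertical axis
    base_pitch -- angle of the base link against the horizontal
    mid_pitch  -- angle of the central link relative to the base link
    l1, l2     -- link lengths in metres (assumed values)
    Returns the (x, y, z) position of the free end in the unit's frame.
    """
    # Reach and height in the vertical plane of the arm.
    r = l1 * math.cos(base_pitch) + l2 * math.cos(base_pitch + mid_pitch)
    z = l1 * math.sin(base_pitch) + l2 * math.sin(base_pitch + mid_pitch)
    # Rotate that plane about the vertical base axis.
    return (r * math.cos(base_yaw), r * math.sin(base_yaw), z)
```

With both pitch angles at zero the arm is stretched out horizontally, so the free end lies at distance `l1 + l2` from the base.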
- an augmented reality device can also have a camera position sensor with a laser interferometer.
- the camera position sensor with a laser interferometer is designed to determine a distance to an object from the interference of a transmitted and back-reflected laser beam.
- a camera position sensor with a laser interferometer preferably has a calibration device which is designed to measure predetermined calibration locations and, after measuring the calibration locations, to determine a camera position in a coordinate system.
- a camera position sensor can have an ultrasound interferometer for determining the position.
- the articulated arm has a first central joint and a second central joint, which are each designed as a swivel joint.
- An articulated arm with two central joints is advantageously flexibly adjustable.
- the articulated arm can have a camera joint in the region of the free end, via which the camera is connected to the articulated arm. As a result, the camera can be moved in different detection directions at a predetermined articulated arm position.
- the augmented reality device is preferably designed for calibrating an articulated arm position.
- the articulated arm has a probe tip which forms the free end.
- the probe tip advantageously facilitates the calibration of the augmented reality device.
- the probe tip can be brought to predetermined locations on the vehicle body for calibrating the augmented reality device.
- Such calibration locations can be, for example, the dome bearings or the hood lock.
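Bringing the probe tip to known calibration locations such as the dome bearings or the hood lock allows the arm's coordinate system to be registered against the vehicle's. A minimal sketch of one possible registration step, assuming the two frames differ only by a translation (no rotation); the function name and the point format are illustrative:

```python
def calibrate_offset(probed, reference):
    """Estimate the translation between the arm's coordinate system and the
    vehicle's, assuming the two frames are not rotated against each other.

    probed    -- (x, y, z) points measured with the probe tip, in arm coordinates
    reference -- the same calibration locations (e.g. dome bearings, hood lock)
                 in vehicle coordinates
    Returns the offset to subtract from probed points to obtain vehicle
    coordinates.
    """
    n = len(probed)
    # Offset between the centroids of the two point sets.
    return tuple(
        sum(p[i] for p in probed) / n - sum(r[i] for r in reference) / n
        for i in range(3)
    )
```

A full calibration would also estimate rotation (e.g. by a least-squares rigid fit over three or more non-collinear locations); the translation-only case is shown for brevity.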
- the augmented reality device can be designed for contactless calibration of the articulated arm position.
- a contactless calibration can be carried out, for example, with a laser interferometer or an ultrasound interferometer, which can be part of the augmented reality device and thus of the tracking unit.
- the augmented reality device of the mobile tracking unit has a 3-D database with a large number of 3-D data records, which in their entirety represent a graphic model in three dimensions of at least parts of a vehicle with its individual components.
- the augmented reality device has a VRML (Virtual Reality Modeling Language) module connected to the 3-D database.
- the VRML module is designed to generate the auxiliary information image signal at least partially from at least one 3-D data record as a function of the camera position signal.
- the VRML module can thus generate a three-dimensional view for displaying in a two-dimensional plane, in particular an image display unit, of at least one or more vehicle components for a camera position.
- the VRML module can read out a selection of the parts to be loosened or replaced in a corresponding sequence from the 3D database and a view of these parts can be calculated for a predetermined camera position.
- the augmented reality device can then generate the auxiliary information image signal from the camera image signal and from an output signal of the VRML module.
- the auxiliary information image signal can contain, for example, information about a torque of a screw connection to be tightened.
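Generating a three-dimensional view for display in a two-dimensional plane, as the VRML module does for a given camera position, amounts to projecting model vertices into the image. A minimal pinhole-projection sketch, assuming vertices are already expressed in camera coordinates; the focal length and principal point are illustrative values:

```python
def project_vertex(x, y, z, focal=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3-D model vertex (camera coordinates, z > 0)
    onto the two-dimensional plane of the image display unit.

    focal  -- focal length in pixels (assumed)
    cx, cy -- principal point, i.e. the image centre (assumed)
    """
    if z <= 0:
        raise ValueError("vertex behind the camera")
    return (cx + focal * x / z, cy + focal * y / z)
```

A vertex on the optical axis maps to the image centre; vertices twice as far away move half as far from the centre, which is what lets the view track the detection direction reported by the camera position sensor.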
- the augmented reality device has a voice input unit and is designed to be controllable by the voice input unit.
- a voice input unit gives a user both hands free for work to be performed when working with the mobile tracking unit.
- the augmented reality device has a voice output unit which is designed to audibly output a voice auxiliary instruction signal representing a spoken language.
- the augmented reality device is preferably designed to generate a video auxiliary instruction signal representing graphic information or text information and to generate the auxiliary information image signal from the camera image signal and the auxiliary instruction signal such that the auxiliary information image signal represents an image in which the graphic information or text information is superimposed on the image of the object.
- the video auxiliary instruction signal can represent, for example, auxiliary instructions for predetermined repair steps.
- the video auxiliary instruction signal can contain, for example, a calculated view of vehicle components from a predetermined detection direction of the camera.
- the auxiliary video instruction signal can contain parts of the output signal of the VRML module or represent an output signal of the VRML module itself.
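Superimposing graphic or text information on the image of the object, as described above, can be sketched as a per-pixel blend of the camera image and the auxiliary image. The function name and the alpha value are illustrative assumptions:

```python
def superimpose(camera_px, overlay_px, alpha=0.6):
    """Blend one overlay pixel onto one camera-image pixel.

    camera_px, overlay_px -- (r, g, b) tuples; overlay_px is None wherever the
    auxiliary image carries no graphic or text information, so the camera
    image shows through unchanged there.
    """
    if overlay_px is None:
        return camera_px
    # Weighted average of overlay and camera values per colour channel.
    return tuple(
        round(alpha * o + (1 - alpha) * c)
        for o, c in zip(overlay_px, camera_px)
    )
```

Applying this over the whole frame yields an auxiliary information image signal in which the instructions appear semi-transparently over the live camera view.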
- the augmented reality device has a mobile camera carrier for attachment to an object, the mobile camera carrier having at least one mobile camera with a receiver for electromagnetic radiation.
- the mobile camera is designed to detect electromagnetic radiation emitted or reflected by the object in a detection direction and to generate a mobile camera image signal.
- the mobile camera image signal represents an image of at least a part of the object.
- the augmented reality device has an input for the mobile camera image signal, which can be operatively connected to the mobile camera.
- a mobile camera can also be designed as an endoscope mobile camera, which comprises an elongated shaft with a deflectable movable end.
- the endoscope mobile camera is designed to detect electromagnetic radiation emitted or reflected by the object in a detection direction in the region of the deflectable movable end and to generate a mobile camera image signal.
- the augmented reality device can have a Bluetooth interface, which is connected on the output side to the input for the mobile camera image signal.
- the mobile camera can be connected on the output side to a Bluetooth transmission unit.
- An image reproduction unit of a mobile tracking unit can be connected wirelessly to the augmented reality device.
- the image display unit can be connected wirelessly to the augmented reality device via an infrared interface or via a Bluetooth interface.
- the mobile tracking unit has wheels which are arranged in the region of a floor of a mobile tracking unit such that the mobile tracking unit is rollable on a floor.
- the wheels of the mobile tracking unit can be locked.
- the mobile tracking unit is preferably designed in the area of the floor in such a way that the wheels can be sunk in the floor.
- the mobile tracking unit can have a lowering device.
- the tracking unit can have a lowering lever which is pivotably arranged, is operatively connected to the lowering device and is operatively connected to the wheels in such a way that the wheels can be lowered into the floor of the mobile tracking unit by actuating the lowering lever by hand.
- the mobile tracking unit is preferably designed such that the wheels can be retracted completely into the interior of the tracking unit, flush with the plane formed by the tracking unit floor.
- the tracking unit has feet and the tracking unit is designed to sink the wheels into the interior of the tracking unit such that the feet can come into active contact with a floor.
- An image reproduction unit of the mobile tracking unit can also be designed as a see-through head-mounted display (STHMD).
- the augmented reality system can display a calibration mark in the see-through head-mounted display, which makes it easier for a user to align an auxiliary instruction with a viewed object.
- FIG. 1 shows a schematic exemplary embodiment of a mobile tracking unit with an augmented reality device and an articulated arm which has a camera in the region of the end;
- Figure 2 is a schematic representation of a mobile tracking unit with an articulated arm and a mobile camera carrier for attachment to an object and
- Figure 3 is a schematic representation of an augmented reality device for a mobile tracking unit.
- FIG. 1 shows a schematic illustration of a mobile tracking unit 10.
- the tracking unit 10 has an articulated arm 20 which is arranged on an upper side of the tracking unit 10.
- the articulated arm 20 has an articulated arm base 28 on which the articulated arm 20 is attached.
- the articulated arm base 28 has a base swivel joint 32, by means of which the articulated arm base 28 is rotatable about a base rotational axis 68.
- the articulated arm base 28 is connected to the tracking unit 10 via the base swivel joint 32.
- the base swivel joint 32 is designed in such a way that the articulated arm base 28 can rotate around the base rotation axis 68. This is indicated by the directions of rotation 53 and 54.
- the base axis of rotation 68 is arranged perpendicular to a surface of the tracking unit 10.
- the articulated arm 20 has a longitudinally extending rigid base member 22.
- the rigid base member 22 has two ends, one end being connected to the articulated arm base 28 via a base pivot joint 30 such that the rigid base member 22 is pivotally connected to the articulated arm base 28 about a pivot axis running perpendicular to the base axis of rotation 68.
- the rigid base member 22 is connected in the region of the end facing away from the articulated arm base 28 to a rigid central member 24 via a central swivel joint 34.
- the central swivel joint 34 is designed and arranged such that the rigid central member 24 can be swiveled about a swivel axis relative to the rigid base member 22, which runs parallel to the swivel axis of the basic swivel joint 30.
- the rigid central link 24, like the rigid base link 22, is designed to extend longitudinally and has two ends.
- the rigid middle link 24 is connected to a rigid end link 26 in the region of the end facing away from the central joint 34 via a central swivel joint 36.
- the rigid end member 26 is elongated and has two ends.
- the rigid end member 26 has a camera joint 39 in the region of the end facing away from the central swivel joint 36.
- the mobile tracking unit 10 has a camera 40 which is connected to the rigid end member 26 via the camera joint 39.
- the camera joint 39 is connected to the rigid end member 26 in such a way that the camera 40 is rotatable all the way around an axis of rotation 46.
- the axis of rotation 46 extends coaxially with a longitudinal axis of the rigid end member 26.
- the camera joint 39 is also designed such that the camera 40 is connected to the rigid end member 26 so as to be pivotable about a pivot axis 44, the pivot axis 44 running perpendicular to the axis of rotation 46.
- the camera joint 39 has a swivel joint 38, which is arranged to connect the camera 40 to the camera joint 39.
- the camera joint 39 is designed as a U-profile with a U-profile base and two U-profile side walls.
- the axis of rotation 46 runs through the U-profile base and the pivot axis 44 runs through the U-profile side walls.
- Swivel directions 46 and 66 of the rigid end member 26, swivel directions 60 and 62 of the rigid central member 24 and swivel directions 58 and 56 of the rigid base member 22 are also shown.
- the camera joint 39, the central pivot joint 36, the central pivot joint 34 and the base pivot joint 30 each have a joint sensor.
- the joint sensor is designed to detect the angular position of a joint.
- the swivel angle range of the central swivel joint 34 and of the base swivel joint 30 can be limited here, so that the articulated arm 20 cannot sag.
- the swivel angle range of the swivel joint 38 about the swivel axis 44 can be limited.
- the mobile tracking unit 10 has four wheels 14 which are arranged in the region of a floor of the mobile tracking unit 10.
- the mobile tracking unit 10 is designed such that the wheels 14 can be lowered into the interior of the mobile tracking unit 10.
- the mobile tracking unit 10 has a lowering lever 16 which can be pivoted about a pivot axis 18.
- the pivot axis 18 runs parallel to the bottom of the mobile tracking unit 10.
- the wheels 14 are operatively connected to the lowering lever 16 such that the wheels 14 can be moved upwards in a lowering direction 68 by a pivoting movement 17 of the lowering lever 16.
- the wheels 14 can be moved in an extension direction 70 by a pivoting movement of the lowering lever 16 opposite to the pivoting direction 17.
- the mobile tracking unit 10 can have corresponding recesses for receiving the wheels 14 in order to lower the wheels 14 in the bottom of the tracking unit.
- the mobile tracking unit 10 is designed such that the wheels 14 can be lowered in such a way that the bottom of the mobile tracking unit 10 comes into operative contact with a floor (not shown) when the wheels 14 are in the lowered position. As a result, the wheels 14 are relieved and the mobile tracking unit 10 is fixed in place and no longer rollable.
- the mobile tracking unit 10 has an image display unit 52.
- the image display unit 52 can be designed as a TFT display.
- the mobile tracking unit 10 has a computer unit 48 with an augmented reality device.
- the mobile tracking unit 10 also has an input unit 50.
- the input unit 50 can be designed as a keyboard.
- the image display unit 52 and the input unit 50 are connected to the computer unit 48.
- the articulated arm 20 of the mobile tracking unit 10 also has a lamp 42.
- the lamp 42 is arranged on the rigid end member 26, for example in the region of the free end.
- the lamp 42 can be pivotally connected to the rigid end member 26.
- the lamp 42 can have a halogen lamp or at least one luminescent diode.
- the articulated arm 20 of the mobile tracking unit 10 has a probe tip 41 in the area of the free end, the end of which forms the free end of the articulated arm 20.
- the probe tip 41 is provided for calibrating the augmented reality device.
- FIG. 2 shows an embodiment of a mobile tracking unit 110.
- the mobile tracking unit 110 has an articulated arm 120 which is attached to an articulated arm base 128.
- the articulated arm base 128 is connected to the mobile tracking unit 110 and is arranged on an upper side of the mobile tracking unit 110 and connected to the latter via a base swivel 132.
- the articulated arm base 128 has a swivel joint 130, via which a rigid base member 122 is connected to the articulated arm base 128.
- the rigid base member 122 has a central pivot joint 134 in the region of the end facing away from the base pivot joint 130.
- the rigid base member 122 is connected to a rigid central member 124 via the central pivot joint 134.
- the rigid central link 124 is connected to a rigid end link 126 via a central swivel joint 136 in the region of the end facing away from the central swivel joint 134.
- the rigid end member 126 has an image display unit 160 in the region of a free end.
- the rigid end member 126 also has a lamp 142 in the region of the free end, which can be pivotally connected to the rigid end member 126, for example.
- in the region of the free end of the rigid end member 126, a probe tip 141 is arranged, the end of which forms the free end of the articulated arm 120.
- the mobile tracking unit 110 also has four wheels 114, which are operatively connected to a lowering lever 116 in such a way that the wheels 114 can each be lowered into a cavity of the tracking unit 110 provided for lowering the respective wheel by pivoting the lowering lever 116.
- the mobile tracking unit 110 also has a computer unit 148 with an augmented reality device, an input unit 150 and an image display unit 152.
- the image display unit 152 and the input unit 150 are connected to the computer unit 148.
- the image display unit 152 can be a TFT screen, for example.
- the input unit 150 can be a keyboard, for example.
- a vehicle 102 to be repaired is also shown.
- the vehicle 102 to be repaired has a bonnet 104.
- the bonnet 104 is shown in the open state.
- the mobile tracking unit 110 comprises a mobile camera carrier 146.
- the mobile camera carrier 146 is rod-shaped and has, along a longitudinal axis, a first mobile camera 140 in the area of a first outer third section and a second mobile camera 144 in the area of a second outer third section, which is opposite the first outer third section.
- the first mobile camera 140 and the second mobile camera 144 are operatively connected to the computer unit 148 and thus to the augmented reality device.
- the computer unit 148 has a Bluetooth interface for this purpose.
- the first mobile camera 140 and the second mobile camera 144 can be connected to the LAN interface unit at least temporarily.
- the first mobile camera 140 and the second mobile camera 144 can each be configured as a CCD camera.
- the mobile tracking unit 110 also comprises a mobile image display unit 162 which is connected to the mobile camera carrier 146.
- the mobile image display unit 162 can be connected to the computer unit 148.
- the computer unit 148 includes a Bluetooth interface or an infrared interface, which are designed to connect the computer unit 148 to the mobile image display unit 162.
- the first mobile camera 140 and the second mobile camera 144 can each generate a camera image signal, which corresponds in each case to an image of an object, for example an engine to be repaired.
- the detection area, that is to say the detection angle and the detection direction, of the first camera 140 and the second camera 144 must first be calibrated.
- at least one marker can be attached at a predetermined location in the engine compartment of the vehicle 102.
- a marker can be, for example, an infrared marker or a paper marker that is printed with a specific pattern.
- the infrared signal or the specific pattern of the paper marker can be recognized in an image captured by the camera in the camera image signal.
- the augmented reality device can have a marker recognition unit.
- the marker recognition unit is designed to generate a camera position signal as a function of a marker position in an image of an object and to output it on the output side.
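Deriving a camera position signal from a marker of known physical size follows from the pinhole model by similar triangles. A minimal sketch, assuming a known marker width and an assumed focal length and image centre; all names and values are illustrative:

```python
def marker_distance(marker_width_m, pixel_width, focal_px=800.0):
    """Distance from camera to a marker of known physical width, estimated
    from the marker's apparent width in the camera image (pinhole model)."""
    return focal_px * marker_width_m / pixel_width

def marker_offset(px, py, distance, focal_px=800.0, cx=320.0, cy=240.0):
    """Lateral and vertical offset of the marker relative to the camera's
    optical axis, given its pixel position and its estimated distance."""
    return ((px - cx) * distance / focal_px,
            (py - cy) * distance / focal_px)
```

A full marker-based pose estimate would use several marker corners to recover orientation as well; distance and offset are shown as the simplest case.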
- FIG. 3 shows a schematic illustration of an embodiment of an augmented reality device 301 for a mobile tracking unit.
- the augmented reality device 301 has a central processing unit 310.
- the central processing unit 310 can be a microprocessor, for example.
- the augmented reality device 301 has a camera 312.
- the camera 312 has a receiver for electromagnetic radiation and is designed to detect electromagnetic radiation emitted or reflected by an object at a detection angle and to generate a camera image signal representing an image of the object.
- the augmented reality device 301 for a tracking unit also has a camera position sensor 328.
- the camera position sensor 328 is designed to generate a camera position signal corresponding to a detection direction of the camera and to output it on the output side.
- the augmented reality device 301 also has joint sensors 334, 332, 330 and 336.
- the joint sensors 334, 332, 330, 336 are each operatively connected to a rotary or swivel joint of an articulated arm and are designed to detect an angular position of the corresponding joint, to generate a joint angle signal corresponding to the angular position of the joint and to output it on the output side.
- a joint sensor can have a rotary potentiometer which has an ohmic resistance.
- the ohmic resistance of the rotary potentiometer is dependent on the angular position of an at least partially revolving sliding contact.
- the sliding contact is operatively connected to the rotary or swivel joint in such a way that an angle-of-rotation position of the rotary or swivel joint corresponds to an angle-of-rotation position of the sliding contact of the rotary potentiometer.
- the rotary potentiometer can be arranged with the rotary or swivel joint on a common shaft or, for example, be operatively connected to the rotary or swivel joint via a toothed belt.
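Because the ohmic resistance of the potentiometer varies linearly with the angular position of the sliding contact, the digitised voltage maps directly to a joint angle. A minimal sketch; the ADC resolution and the electrical sweep angle are assumed values typical of a single-turn potentiometer:

```python
def joint_angle(adc_value, adc_max=1023, angle_range_deg=270.0):
    """Convert a rotary-potentiometer ADC reading into a joint angle.

    adc_value       -- digitised wiper voltage (0 .. adc_max)
    adc_max         -- full-scale ADC reading (10-bit assumed)
    angle_range_deg -- electrical sweep of the potentiometer (assumed 270°)
    """
    # Linear mapping: resistance, hence voltage, is proportional to angle.
    return adc_value / adc_max * angle_range_deg
```

The camera position sensor would apply this conversion to each joint sensor's reading before running the forward kinematics.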
- An optical angle sensor is also conceivable, in which a number of photodiodes and a number of luminescent diodes corresponding to the number of photodiodes are present.
- the photodiodes and the luminescent diodes are each arranged in such a way that exactly one photodiode lies in the beam path of exactly one corresponding luminescent diode.
- the optical angle sensor has a movable slotted disc, which is arranged and designed to interrupt the beam path between at least one luminescent diode and the corresponding photodiode as a function of an angular position of the rotary or swivel joint.
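For the optical angle sensor, each slot of the disc produces one light and one dark phase at the photodiode, so counting transitions gives the angle travelled. A minimal sketch, assuming a disc with a known slot count; direction sensing (which would need a second, offset photodiode channel) is omitted:

```python
def encoder_angle(transitions, slots=360):
    """Angle travelled by the slotted disc, from the number of light/dark
    transitions seen by one photodiode.

    transitions -- count of beam-path interruptions and restorations
    slots       -- number of slots around the disc (assumed); each slot
                   yields two transitions per revolution
    """
    return transitions * 360.0 / (2 * slots)
```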
- the augmented reality device 301 also has an image display unit 314 with a touch-sensitive surface 316.
- the touch-sensitive surface 316 is designed to generate a touch signal representing the touch location as a function of a touch at a touch location of the touch-sensitive surface 316 and to output it on the output side.
- the augmented reality device 301 also has a 3-D database 320 with a large number of 3-D data records.
- the 3-D data sets as a whole represent a graphic model in three dimensions of at least parts of an object, in particular a vehicle, with its individual components.
- the augmented reality device 301 also has a 3-D calculation unit 318, which is connected to the 3-D database 320 via a connecting line 364.
- the 3-D calculation unit 318 is connected to the central processing unit 310 via a connecting line 362 and is designed to generate, as a function of a camera position signal, a 3-D auxiliary instruction signal from at least one 3-D data set signal representing a 3-D data set.
- the 3-D auxiliary instruction signal represents an image in two dimensions of at least one three-dimensional object in a detection direction corresponding to the camera position signal.
- the augmented reality device 301 also has an auxiliary instruction memory 322, which is connected to the central processing unit 310 via a bidirectional data bus 366.
- the auxiliary instruction memory 322 has a number of data records.
- a data record can be a video data record which represents graphic information or text information.
- a data record can be an audio data record that represents spoken language.
- the central processing unit 310 is designed to read out an audio data record or a video data record from the auxiliary instruction memory 322 via the bidirectional data bus 366 and to generate an audio auxiliary instruction signal corresponding to the audio data record or a video auxiliary instruction signal corresponding to the video data record.
- the augmented reality device 301 also has a voice input unit 323.
- the voice input unit 323 is connected on the input side via a connecting line 376 to a microphone 324 and on the output side via a connecting line 374 to the central processing unit 310.
- the microphone 324 is designed to receive airborne sound and to generate a microphone signal corresponding to the airborne sound and to output it on the output side.
- the voice input unit 323 is designed to digitize the microphone signal and to generate an ASCII signal from the digitized microphone signal in accordance with a predetermined assignment rule and to output it on the output side.
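The predetermined assignment rule that turns a digitised, recognised utterance into an ASCII signal can be sketched as a lookup table. The command words and output strings below are illustrative assumptions; the patent does not specify the vocabulary:

```python
def to_ascii_signal(word, command_table=None):
    """Map a recognised spoken word to an ASCII command string according to
    a predetermined assignment rule (table contents are illustrative)."""
    if command_table is None:
        # Hypothetical vocabulary for stepping through repair instructions.
        command_table = {
            "next": "CMD_NEXT",
            "back": "CMD_BACK",
            "repeat": "CMD_REPEAT",
        }
    return command_table.get(word.lower(), "CMD_UNKNOWN")
```

The central processing unit would then treat the resulting ASCII signal like any other user interaction signal, leaving both of the user's hands free.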
- the augmented reality device 301 also has an audio output unit 326.
- the audio output unit 326 has a loudspeaker for converting an electrical signal into airborne sound and is connected on the input side to the central processing unit 310 via a connecting line 372.
- the image display unit 314 is connected on the input side to the central processing unit 310 via a connecting line 368.
- the touch-sensitive surface 316 is connected to the central processing unit 310 via a connecting line 370.
- the camera position sensor 328 is connected to the central processing unit via a connecting line 352.
- the camera 312 is connected to the central processing unit 310 via a connecting line 350.
- the camera position sensor 328 is connected on the input side via a connecting line 360 to the joint sensor 334, on the input side via a connecting line 358 to the joint sensor 332, on the input side via a connecting line 356 to the joint sensor 330 and on the input side via a connecting line 354 to the joint sensor 336.
- the camera 312 can generate a camera signal corresponding to an object to be detected and send it on the output side to the central processing unit 310 via the connecting line 350.
- the camera position sensor 328 can receive an angle signal via the connecting lines 360, 358, 356 and 354, which corresponds to a joint position of a corresponding rotary or swivel joint of an articulated arm.
- the camera position sensor 328 can generate a camera position signal which represents a camera position in a predetermined coordinate system and a detection direction and send this to the central processing unit 310 on the output side via the connecting line 352.
- the central processing unit 310 can send the camera position signal to the 3-D calculation unit 318 via the bidirectional connecting line 362.
- depending on a user interaction signal received via the connecting line 362, the 3-D calculation unit 318 can read out at least one 3-D data record from the 3-D database 320 and generate a 3-D data record signal corresponding to the 3-D data record.
- the 3-D calculation unit is further designed to generate a 3-D auxiliary instruction signal as a function of the camera position signal and to send this to the central processing unit 310 on the output side via the bidirectional connecting line 362.
- the central processing unit 310 can generate the user interaction signal depending on a touch signal received via the connection line 370 and send the user interaction signal via the connection line 362 to the 3-D calculation unit.
- the 3-D data records can, for example, represent individual objects of a vehicle model.
- a user's hand 305 may touch a location of touch-sensitive surface 316 that corresponds to a predetermined repair program.
- the central processing unit 310 can then generate a user interaction signal which corresponds to the 3-D data records to be read out from the 3-D database 320.
- the central processing unit 310 can generate the user interaction signal depending on the ASCII signal of the voice input unit 323 received via the connecting line 374.
- the central processing unit 310 can read out at least one audio data record from the auxiliary instruction memory 322 via the bidirectional data bus 366, generate an audio auxiliary instruction signal corresponding to the audio data record, and send this to the audio output unit 326 via the connecting line 372.
- the central processing unit 310 can also read out a video data record from the auxiliary instruction memory 322 via the bidirectional data bus 366 and generate a video auxiliary instruction signal corresponding to the video data record.
- the central processing unit 310 is designed to generate an auxiliary information image signal from the camera image signal received on the input side via the connecting line 350, from the 3-D auxiliary instruction signal received via the bidirectional connecting line 362 and from the video auxiliary instruction signal, such that the auxiliary information image signal represents an image in which graphic information or text information is superimposed on the image of an object.
- the central processing unit 310 can send the auxiliary information image signal via the connecting line 368 to the image display unit 314 on the output side.
- the camera 312 can be connected to the central processing unit 310 via a radio data interface.
- the radio data interface is designed to transmit a camera image signal wirelessly.
- the image display unit 314 can accordingly be connected to the central processing unit 310 via a radio data interface.
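The camera position sensor 328 described above derives a camera position and a detection direction from the angle signals of the joint sensors of the articulated arm. The following is a minimal forward-kinematics sketch of such a computation, assuming a planar four-joint arm; the link lengths are illustrative, since the patent does not specify the arm geometry:

```python
import math

# Hypothetical link lengths (metres) of a four-joint articulated arm;
# the arm geometry is an assumption, not taken from the patent.
LINK_LENGTHS = [0.4, 0.3, 0.2, 0.1]

def camera_pose(joint_angles_deg):
    """Forward kinematics in the arm's plane: accumulate each joint
    angle along the chain to obtain the camera position (x, y) and
    the detection direction as a unit vector."""
    x = y = 0.0
    heading = 0.0  # accumulated rotation in radians
    for angle_deg, length in zip(joint_angles_deg, LINK_LENGTHS):
        heading += math.radians(angle_deg)
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    direction = (math.cos(heading), math.sin(heading))
    return (x, y), direction
```

With all joints at zero the arm is fully stretched, so the camera sits at the summed link length along the x-axis and the detection direction points along x.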
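The 3-D auxiliary instruction signal is generated as a function of the camera position signal. One way to sketch this dependence is a simple pinhole projection of a 3-D anchor point of an instruction into image coordinates; the focal length and the axis-aligned camera looking along +z are simplifying assumptions for illustration:

```python
FOCAL_LENGTH = 800.0  # pixels, assumed value

def project_point(point, camera_position):
    """Project a 3-D point (x, y, z) into 2-D image coordinates for a
    camera at camera_position looking along +z. Returns None when the
    point lies at or behind the camera plane."""
    dx = point[0] - camera_position[0]
    dy = point[1] - camera_position[1]
    dz = point[2] - camera_position[2]
    if dz <= 0:
        return None  # not visible in the detection direction
    return (FOCAL_LENGTH * dx / dz, FOCAL_LENGTH * dy / dz)
```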
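The superimposition performed by the central processing unit 310, in which graphic or text information is laid over the camera image, can be sketched as a per-pixel overlay. Images are modelled here as nested lists of pixel values, with None marking transparent overlay pixels; this is an illustrative model, not the patent's signal format:

```python
def superimpose(camera_image, overlay):
    """Return an auxiliary-information image: wherever the overlay
    carries graphic or text information (non-None), it replaces the
    corresponding camera pixel; elsewhere the camera pixel remains."""
    return [
        [ov if ov is not None else cam
         for cam, ov in zip(cam_row, ov_row)]
        for cam_row, ov_row in zip(camera_image, overlay)
    ]
```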
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102004027010 | 2004-05-28 | ||
DE102005011616.7A DE102005011616B4 (de) | 2004-05-28 | 2005-03-07 | Mobile Trackingeinheit |
PCT/EP2005/005781 WO2005116785A1 (de) | 2004-05-28 | 2005-05-25 | Mobile trackingeinheit |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1749248A1 true EP1749248A1 (de) | 2007-02-07 |
Family
ID=34970199
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP05750870A Withdrawn EP1749248A1 (de) | 2004-05-28 | 2005-05-25 | Mobile trackingeinheit |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1749248A1 (de) |
DE (1) | DE102005011616B4 (de) |
WO (1) | WO2005116785A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102010016113A1 (de) | 2010-03-24 | 2011-09-29 | Krauss-Maffei Wegmann Gmbh & Co. Kg | Verfahren zur Ausbildung eines Besatzungsmitglieds eines insbesondere militärischen Fahrzeugs |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2642833B1 (fr) * | 1989-02-06 | 1991-05-17 | Vision 3D | Procede d'etalonnage d'un systeme d'acquisition tridimensionnelle de forme et systeme d'acquisition pour la mise en oeuvre dudit procede |
US5745387A (en) * | 1995-09-28 | 1998-04-28 | General Electric Company | Augmented reality maintenance system employing manipulator arm with archive and comparison device |
EP1157316B1 (de) * | 1999-03-02 | 2003-09-03 | Siemens Aktiengesellschaft | System und verfahren zur situationsgerechten unterstützung der interaktion mit hilfe von augmented-reality-technologien |
US6330356B1 (en) * | 1999-09-29 | 2001-12-11 | Rockwell Science Center Llc | Dynamic visual registration of a 3-D object with a graphical model |
DE10159610B4 (de) | 2001-12-05 | 2004-02-26 | Siemens Ag | System und Verfahren zur Erstellung einer Dokumentation von Arbeitsvorgängen, insbesondere im Umfeld Produktion, Montage, Service oder Wartung |
AU2002366994A1 (en) * | 2002-01-15 | 2003-07-30 | Information Decision Technolog | Method and system to display both visible and invisible hazards and hazard information |
US20030179308A1 (en) * | 2002-03-19 | 2003-09-25 | Lucia Zamorano | Augmented tracking using video, computed data and/or sensing technologies |
DE10345743A1 (de) * | 2003-10-01 | 2005-05-04 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zum Bestimmen von Position und Orientierung einer Bildempfangseinrichtung |
- 2005
- 2005-03-07 DE DE102005011616.7A patent/DE102005011616B4/de not_active Expired - Fee Related
- 2005-05-25 WO PCT/EP2005/005781 patent/WO2005116785A1/de active Application Filing
- 2005-05-25 EP EP05750870A patent/EP1749248A1/de not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See references of WO2005116785A1 * |
Also Published As
Publication number | Publication date |
---|---|
DE102005011616B4 (de) | 2014-12-04 |
DE102005011616A1 (de) | 2005-12-29 |
WO2005116785A1 (de) | 2005-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2176833B1 (de) | Verfahren und system zur ermittlung der position und orientierung einer kamera relativ zu einem realen objekt | |
DE19513930B4 (de) | Endoskopievorrichtung | |
DE3211029C2 (de) | Abtasthalterung für eine Ultraschall-Diagnosesonde | |
EP2309925B1 (de) | Röntgenbild-aufnahmesystem und röntgenbild-aufnahmeverfahren zur aufnahme von bilddaten mit röntgengeräten für eine volumenrekonstruktion | |
DE112013003584T5 (de) | Tragbares Gelenkarm-Koordinatenmessgerät mit optischem Kommunikationsbus | |
DE102015015503A1 (de) | Robotersystem, das eine mit erweiterter Realität kompatible Anzeige aufweist | |
DE102013110580A1 (de) | Verfahren zum optischen Abtasten und Vermessen einer Szene | |
EP1761759A1 (de) | Verfahren zur steuerung einer rohrrevisionsanlage und zur auswertung der revisionsdaten | |
EP1420264A1 (de) | Verfahren und Vorrichtung zur Kalibrierung eines Messsystems | |
EP3286532B1 (de) | Verfahren zum erfassen von vibrationen einer vorrichtung und vibrationserfassungssystem | |
DE102007039788A1 (de) | Detektor | |
DE2948573A1 (de) | Verfahren und anordnung zur beruehrungslosen achsvermessung an kraftfahrzeugen | |
DE102009012590A1 (de) | Vorrichtung zum Ermitteln der Stellung eines Roboterarms mit Kamera zur Durchführung von Aufnahmen | |
EP1047014A2 (de) | Verfahren und Eingabeeinrichtung zum Steuern der Lage eines in einer virtuellen Realität graphisch darzustellenden Objekts | |
WO2015082683A2 (de) | Vorrichtung und verfahren zur messung von werkstücken | |
DE112012005524T5 (de) | Verfahren und Vorrichtung zur Verwendung von Gesten zur Steuerung eines Lasertrackers | |
DE102006046689A1 (de) | Medizintechnisches Behandlungssystem | |
DE112013002892T5 (de) | Koordinatenmessgeräte mit entfernbaren Zusatzteilen | |
EP0481278A1 (de) | Verfahren und Messeinrichtung zur Positionsbestimmung von Raumpunkten | |
DE102011119012A1 (de) | Vereinfachtes Koordinatenmesssystem für Schulungszwecke | |
EP0884574B1 (de) | Reifenprüfverfahren und -vorrichtung | |
EP1639337B1 (de) | Kamerawagen | |
DE102005011616B4 (de) | Mobile Trackingeinheit | |
DE102019201134B4 (de) | Verfahren, Computerprogramm mit Instruktionen und System zum Einmessen einer Augmented-Reality-Brille und Augmented-Reality-Brille zur Verwendung in einem Kraftfahrzeug | |
DE102022202563B3 (de) | Planen einer Bahn eines Roboters |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20061017 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR |
| RIN1 | Information on inventor provided before grant (corrected) | Inventor name: SCHEFFLER, WERNER; Inventor name: TEGTMEIER, ANDRE; Inventor name: RITTER, KLAUS-CHRISTOPH; Inventor name: POEGE, CARSTEN |
| DAX | Request for extension of the European patent (deleted) | |
| 17Q | First examination report despatched | Effective date: 20071126 |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
| 18W | Application withdrawn | Effective date: 20090311 |