DE102011122206A1 - Method for representation of virtual image component i.e. augmented reality image, on transparent display of augmented reality system, involves determining position of component, and representing virtual image component by display - Google Patents


Info

Publication number
DE102011122206A1
Authority
DE
Germany
Prior art keywords
component
virtual image
transparent display
motor vehicle
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102011122206A
Other languages
German (de)
Inventor
Johannes Tümler
Ralf Rabätje
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Priority to DE102011122206A priority Critical patent/DE102011122206A1/en
Publication of DE102011122206A1 publication Critical patent/DE102011122206A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person

Abstract

The invention relates to a method for displaying a virtual image component (VIRT) on a transparent display (11), wherein a transformation module (20) is generated as a function of the viewing direction of a user of the transparent display (11), wherein the position (PVIRT') of the virtual image component (VIRT) is determined by means of the transformation module (20) as a function of the position (PVIRT) of a real object (55) to be superimposed by the virtual image component (VIRT), and wherein the virtual image component (VIRT) is displayed by means of the transparent display (11) at the determined position.

Description

  • The invention relates to a method for displaying a virtual image component on a transparent display.
  • The EP 0 949 513 A2 discloses an augmented reality system having a tracker system for determining the relative position between a sensor for determining a pattern of reference points on an object surface and the object surface, the augmented reality system including a processor connected to the sensor.
  • The WO 01/96829 A1 discloses a method for automobile production in which an examiner is assisted by an Augmented Reality (AR) system when inspecting a motor vehicle or a motor vehicle component for quality defects; the AR system records and/or validates test-relevant data and outputs it to the examiner. The inspection is performed during production of the motor vehicle or the motor vehicle component, in at least one predetermined manufacturing state, in order to detect quality defects. Based on at least one of the identified quality defects, a worker carrying out individual production steps upstream of the inspection, for the manufacture of further motor vehicles or motor vehicle components, is supplied with a caution via the AR system, and a post-processor carrying out rework downstream of the inspection on a defective motor vehicle or a defective motor vehicle component is supplied via the AR system with error information based on the respectively identified quality defects.
  • According to the WO 01/96829 A1, the term augmented reality is defined as the superposition of a real environment with a computer-generated environment, i.e. a virtual world. Augmented reality is thus a form of human-technology interaction that fades information into the user's field of vision and thus expands his perception.
  • The DE 10 2005 045 973 A1 discloses a device having an image capture unit for detecting a real environment and generating images of an environmental region detected within a time interval and having processing means for determining movement information by evaluating the images.
  • The EP 1 507 235 A1 discloses a method of creating a combined image or video containing virtual objects. It is provided that a marker is equipped with position detection means, in particular a GPS.
  • The EP 1 708 139 B1 discloses a calibration method for detecting a correction value for correcting a position and orientation difference between a physical object and a virtual object superimposed on a captured image of the physical object.
  • The DE 10 2009 049 073 A1 discloses a method for displaying virtual information in a view of a real environment, wherein a virtual object is provided having a global position and orientation with respect to a geographic global coordinate system.
  • The DE 10 2004 061 841 A1 discloses a system for markerless tracking of objects, wherein a mixing system for the simultaneous display of a video image and a three-dimensional data model is provided.
  • The DE 11 2004 000 902 D5 discloses an augmented reality system with a real reference generator for displaying a real reference on a calibration screen. In addition, an optical see-through display with a fixed position relative to the real reference generator is provided.
  • It is an object of the invention to improve the efficiency of augmented reality systems with a see-through display. For this purpose, it is particularly desirable to make augmented reality systems with a transparent display ready for use more quickly. In particular, the calibration of such a system should be accelerated.
  • The above object is achieved by a method for displaying a virtual image component on a transparent display (see-through display), wherein a transformation module is generated as a function of the viewing direction of a user (viewer) of the transparent display, wherein the position of the virtual image component is determined by means of the transformation module as a function of the position of a real object to be superimposed by the virtual image component (an object in the vicinity of the transparent display which the user/viewer sees when looking through it), and wherein the virtual image component is displayed by means of the transparent display at the position determined by means of the transformation module.
  • A virtual image component within the meaning of the invention may comprise alphanumeric characters or consist of such. A virtual image component within the meaning of the invention may comprise alphanumeric characters as well as a color background for these characters. A virtual image component within the meaning of the invention can consist of alphanumeric characters in conjunction with a color background for these characters. A virtual image component within the meaning of the invention may be a drawing or include such a drawing. A virtual image component within the meaning of the invention may be a sketch or comprise such a sketch. A virtual image component within the meaning of the invention may be or include an icon. A virtual image component within the meaning of the invention may be or include an animation.
  • A viewing direction in the sense of the invention is in particular the visual beam of an eye directed at a real object or at a virtual object shown in the see-through display.
  • In an advantageous embodiment of the invention, it is provided that, to generate the transformation module or a transformation matrix of the transformation module, a user directs his view (in particular of his own choosing [with a corresponding input or message] or on request) at a particular (selected) real point visible when looking through the transparent display. In a further advantageous embodiment of the invention, a marking is produced by means of the transparent display which follows the viewing direction of the observer (so that the marking and the selected real point lie on one viewing beam in the user's sight). In a further advantageous embodiment of the invention, the coordinates of the marking are assigned to the coordinates of the selected real point if the viewing direction of the observer remains substantially unchanged or constant for a predetermined time. In a further advantageous embodiment of the invention, the method described in this paragraph is repeated at least once, advantageously at least five times (i.e. it is carried out at least twice, advantageously at least six times), with the viewer or user directing his view at points of different positions (different selected real points). A marking according to the invention is in particular a crosshair.
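The dwell criterion described above, assigning the marking coordinates once the viewing direction stays essentially constant for a predetermined time, can be sketched as follows. The sampling interface, the angular tolerance, and the function name are illustrative assumptions, not details given in the patent.

```python
import numpy as np

def gaze_settled(directions, times, angle_tol_deg=1.0, dwell_s=1.5):
    """Return True once the most recent gaze-direction samples stay within
    angle_tol_deg of each other for at least dwell_s seconds."""
    if len(directions) < 2:
        return False
    t_end = times[-1]
    ref = directions[-1] / np.linalg.norm(directions[-1])
    # Walk backwards in time while the direction stays inside the tolerance cone.
    for d, t in zip(reversed(directions), reversed(times)):
        d = d / np.linalg.norm(d)
        ang = np.degrees(np.arccos(np.clip(np.dot(ref, d), -1.0, 1.0)))
        if ang > angle_tol_deg:
            return False          # gaze moved before the dwell time was reached
        if t_end - t >= dwell_s:
            return True           # gaze held steady long enough
    return False
```

A calibration loop would poll this predicate and, when it fires, record the current marking position together with the coordinates of the selected real point.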
  • In a further advantageous embodiment of the invention, the transformation module is generated as a function of a data record in which at least six different real positions, which are visible when viewed through the transparent display, are associated with positions of a marking. In a further advantageous embodiment of the invention, the transformation module comprises a transformation matrix. In a further advantageous embodiment of the invention, the transformation matrix is generated as a function of a data record in which at least six different real positions, which are visible when viewed through the transparent display, are assigned to positions of a marking.
  • In a further advantageous embodiment of the invention, the viewing direction of the user (observer) of the transparent display is determined by means of a camera aimed at at least one eye of the user (observer) of the transparent display. A suitable camera is disclosed on the website www.istar-project.org. In a further advantageous embodiment of the invention, the transparent display and the camera are mechanically connected. In a further advantageous embodiment of the invention, the transparent display and the camera are mechanically fixed to each other. In a further advantageous embodiment of the invention, the transparent display and the camera are arranged together on a headset. In a further advantageous embodiment of the invention, the transparent display is arranged on the headset such that it is associated with only one eye of the user (viewer). In this case, it is provided in particular that a second eye is not assigned to the transparent display, i.e. the second eye's view is unobstructed and does not pass through the transparent display.
  • In a further advantageous embodiment of the invention, a sensor (e.g. a camera) is associated with the headset, arranged in particular fixed on the headset. Alternatively or additionally, a sensor may be provided which records both the real object that is the target of the virtual image and the viewer. By means of these sensors, tracking or an update of the transformation module takes place in particular when the orientation of the real object relative to the head of the user or viewer changes.
  • It is a further object of the invention to reduce the cost of the production of technical components, in particular the cost of the production of motor vehicles or the cost of their repair.
  • The aforementioned object is achieved, in particular in connection with the aforementioned features, by a method for producing a technical component, in particular a motor vehicle or a motor vehicle component, wherein a component to be mounted is marked or identified by means of the virtual image component according to the method described above, and wherein the component to be mounted is then mounted in the technical component, in the motor vehicle or in the motor vehicle component.
  • The aforementioned object is also achieved, in particular in connection with the aforementioned features, by a method for producing a technical component, in particular a motor vehicle or a motor vehicle component, wherein the mounting location of a component to be mounted is marked or identified by means of the virtual image component according to the method described above, and wherein the component to be mounted is then mounted at the mounting location in the technical component, in the motor vehicle or in the motor vehicle component.
  • The above object is also achieved, in particular in connection with the aforementioned features, by a method for producing a technical component, in particular a motor vehicle or a motor vehicle component, wherein an assembly process for mounting a component to be mounted is determined or verified by means of the virtual image component according to the method described above, and wherein the component to be mounted is then mounted in the technical component, in the motor vehicle or in the motor vehicle component according to the assembly process.
  • A motor vehicle in the sense of the invention is in particular a land vehicle that can be used individually in road traffic. Motor vehicles according to the invention are in particular not limited to land vehicles with internal combustion engines.
  • Further advantages and details emerge from the following description of exemplary embodiments, in which:
  • 1 shows an embodiment of a system for displaying an augmented reality image,
  • 2 shows an embodiment of a virtual image which is displayed by means of a transparent display,
  • 3 shows an exemplary method sequence for the production, correction or repair of a motor vehicle,
  • 4 shows an exemplary method for implementing the calibration step, and
  • 5 shows a sketch illustrating the method according to 4.
  • 1 shows a system for displaying an augmented reality image VIRT. The system comprises a headset 10 worn by a user 1, with a transparent display 11 (see-through display) for displaying the augmented reality image VIRT. The headset 10 also includes a camera 12 for detecting the viewing direction BTRG of the eye 51. The transparent display 11 is controlled by means of a transformation module 20, which is connected to the transparent display 11 and outputs the virtual image VIRT and its position PVIRT'. The virtual image components VIRT can, for example, be retrieved from a database 22 and/or a network and transmitted from there to the transformation module 20. In this case, each virtual image component VIRT is assigned, via its desired position PVIRT, to a real object.
  • 2 shows an embodiment of a virtual image VIRT which is displayed by means of the transparent display 11. In the exemplary embodiment, the real object to which the virtual image VIRT is assigned is a motor vehicle 25.
  • The system shown in 1 is particularly suitable for the manufacture and repair of motor vehicles or their components. 3 shows an exemplary method sequence for the manufacture, correction or repair of a motor vehicle. First, a calibration step 31 takes place, in which the transformation module 20 or a corresponding transformation matrix of the transformation module 20 is generated. The calibration step 31 is followed by a step 32, in which the virtual image VIRT is displayed by means of the transparent display 11 so as to superimpose a real object at the desired position. On the basis of the information thus displayed, a repair, assembly or inspection takes place in a step 33. Step 33 is followed by a query 34 as to whether the calibration step is to be repeated. If the calibration step is to be repeated, the query 34 is followed by the calibration step 31; otherwise, the query 34 is followed again by step 32, in which another virtual image is displayed by means of the transparent display 11.
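The sequence of 3 (calibration step 31, display step 32, work step 33, repeat query 34) can be sketched as a simple control loop. All function names below are placeholders for the system's actual calibration, rendering, and operator-input routines, not interfaces defined in the patent.

```python
def run_session(calibrate, show_overlay, do_work, ask_recalibrate, tasks):
    # Step 31: generate the transformation module / transformation matrix.
    T = calibrate()
    for task in tasks:
        # Step 32: display the virtual image component over the real object.
        show_overlay(T, task)
        # Step 33: repair, assembly, or inspection based on the overlay.
        do_work(task)
        # Query 34: repeat the calibration step if requested, else continue.
        if ask_recalibrate():
            T = calibrate()
    return T
```

The loop keeps the last transformation matrix alive across tasks and only regenerates it when the operator asks for recalibration, which matches the branch back from query 34 to step 31.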
  • 4 shows an exemplary method for implementing the calibration step 31 in 3. The method begins with a step 41, in which a position on a real object is selected. The selection may be made by the viewer 1 or user of the transparent display 11. However, the selected point may also be communicated to the user.
  • Step 41 is followed by a step 42, in which the user or viewer 1 of the transparent display 11 directs his gaze at the selected point of the real scene. The viewing direction BTRG or the sight beam 54, as shown by way of example in 5, is detected in a following step 43.
  • A step 44 follows, in which a marking, denoted in 5 by reference number 53, is moved so that, for the eye 51, it covers the selected point denoted in 5 by reference number 55. In 5, the marking displayed by means of the transparent display 11 is shown as a crosshair in the virtual image plane 52 of the transparent display 11.
  • Step 44 is followed by a query 45, in which it is checked whether the sight beam has remained substantially unchanged for a predetermined time. If the sight beam 54 has not changed for the predetermined time, the query 45 is followed by a step 46, in which the coordinates of the selected point 55 and the coordinates of the marking 53 are entered as two new rows into the matrix labeled B in the article Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (see homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf); see equation (9) in that article. The coordinates x M,i , y M,i and z M,i correspond to the coordinates of the selected point 55, and the coordinates x i and y i to the coordinates of the marking 53.
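Step 46 adds two rows to the matrix B for each captured correspondence. A sketch of that bookkeeping, under the assumption that B uses the standard SPAAM/DLT row layout for a 3x4 projection (the function names are illustrative, not from the patent):

```python
import numpy as np

def spaam_rows(world_pt, marker_pt):
    """Two rows of the SPAAM matrix B for one correspondence between a
    selected real point (x_M, y_M, z_M) and the marking position (x_i, y_i)."""
    x, y, z = world_pt
    u, v = marker_pt
    return np.array([
        [x, y, z, 1.0, 0.0, 0.0, 0.0, 0.0, -u * x, -u * y, -u * z, -u],
        [0.0, 0.0, 0.0, 0.0, x, y, z, 1.0, -v * x, -v * y, -v * z, -v],
    ])

def build_B(correspondences):
    """Stack two rows per (world point, marker point) pair.

    Six correspondences, as required by query 47, give a 12x12 matrix."""
    return np.vstack([spaam_rows(w, m) for w, m in correspondences])
```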
  • Step 46 is followed by a query 47 as to whether the procedure has been carried out for six different real points 55. If the procedure has been carried out for fewer than six selected real points, the query 47 is followed again by step 41 in relation to a new selected real point 55. Otherwise, the query 47 is followed by a step 48, in which, by means of the matrix B, the transformation matrix
    T camera = [ t11 t12 t13 t14 ; t21 t22 t23 t24 ; t31 t32 t33 t34 ]
    is generated, which corresponds to the matrix T camera or the matrix G in equation (7) of the article Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (see homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf).
  • From the matrix B, the transformation matrix T camera is determined according to the relationships of equations (7), (8) and (9) in the article Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (see homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf).
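Determining T camera from B amounts to the homogeneous least-squares problem B p = 0, which is conventionally solved with the singular value decomposition. The sketch below assumes B has the 12-column DLT layout, so the right-singular vector for the smallest singular value reshapes to the 3x4 matrix G:

```python
import numpy as np

def solve_T_camera(B):
    """Homogeneous least-squares solution of B p = 0: take the right-singular
    vector belonging to the smallest singular value and reshape it to 3x4."""
    _, _, Vt = np.linalg.svd(B)
    return Vt[-1].reshape(3, 4)
```

The result is defined only up to a scale factor (and overall sign), which is harmless here because the later perspective division cancels any common scale.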
  • For positionally accurate insertion of the virtual image VIRT in step 32 (cf. 3), a vector PVIRT'' is determined by means of the transformation module 20 as a function of the transformation matrix T camera: PVIRT'' = T camera · PVIRT, with
  • PVIRT = (x, y, z, 1)^T, the homogeneous coordinates of the position of the real object.
  • From the vector PVIRT'' = (u, v, w)^T, the position PVIRT' of the virtual image VIRT in the transparent display results by perspective division:
  • PVIRT' = (u/w, v/w)^T.
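The two steps above, the homogeneous multiplication PVIRT'' = T camera · PVIRT followed by the division by the third component, can be sketched as follows (the (u, v, w) component names are taken from the reconstruction above, not from the original figures):

```python
import numpy as np

def display_position(T_camera, p_real):
    """Map a 3D real-object position to 2D display coordinates:
    PVIRT'' = T_camera @ (x, y, z, 1), then perspective division by w."""
    x, y, z = p_real
    u, v, w = T_camera @ np.array([x, y, z, 1.0])
    return np.array([u / w, v / w])
```

In step 32, this is what places the virtual image component so that it superimposes the real object at the desired position.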
  • In the system shown in 1, a camera 13 mounted on the headset may be provided for recording a real image RB1 in the viewing direction of the user 1 of the transparent display. Alternatively or additionally, a camera 21 may be provided for recording a real image RB2 which depicts the user 1 in relation to the object under consideration, such as the motor vehicle 25. By means of the real images RB1 and RB2, a matrix F according to equation (3) in the article Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (see homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf) can be determined, which adapts the coordinates of the position PVIRT' of the virtual image VIRT displayed by means of the transparent display 11 to a movement of the user or viewer 1 relative to the real object, such as the motor vehicle 25.
  • The elements and angles in 5 are drawn for simplicity and clarity and are not necessarily to scale. For example, the sizes of some elements are exaggerated relative to other elements to aid understanding of the embodiments of the present invention.
  • CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of the documents cited by the applicant has been generated automatically and is included solely for the reader's information. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • EP 0949513 A2 [0002]
    • WO 01/96829 A1 [0003, 0004]
    • DE 102005045973 A1 [0005]
    • EP 1507235 A1 [0006]
    • EP 1708139 B1 [0007]
    • DE 102009049073 A1 [0008]
    • DE 102004061841 A1 [0009]
    • DE 112004000902 D5 [0010]
  • Cited non-patent literature
    • Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf) [0035]
    • Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf) [0036]
    • Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf) [0037]
    • Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf) [0040]

Claims (9)

  1. Method for displaying a virtual image component (VIRT) on a transparent display (11), wherein a transformation module (20) is generated as a function of the viewing direction of a user of the transparent display (11), wherein the position (PVIRT') of the virtual image component (VIRT) is determined by means of the transformation module (20) as a function of the position (PVIRT) of a real object (55) to be superimposed by the virtual image component (VIRT), and wherein the virtual image component (VIRT) is displayed by means of the transparent display (11).
  2. Method according to claim 1, characterized in that the viewing direction of the user of the transparent display (11) is determined by means of a camera (12) aimed at at least one eye (51) of the user of the transparent display (11).
  3. Method according to claim 2, characterized in that the transparent display (11) and the camera (12) are mechanically connected.
  4. Method according to claim 2 or 3, characterized in that the transparent display (11) and the camera (12) are mechanically fixed to each other.
  5. Method according to claim 2, 3 or 4, characterized in that the transparent display (11) and the camera (12) are arranged together on a headset (10).
  6. Method according to claim 5, characterized in that the transparent display (11) is arranged on the headset (10) so that it is assigned to only one eye (51) of the user.
  7. Method for producing a technical component, in particular a motor vehicle or a motor vehicle component, characterized in that a component to be mounted is marked or identified by means of the virtual image component (VIRT) according to a method according to one of the preceding claims, and that the component to be mounted is mounted in the technical component, in the motor vehicle or in the motor vehicle component.
  8. Method for producing a technical component, in particular a motor vehicle or a motor vehicle component, in particular method according to claim 7, characterized in that the mounting location of a component to be mounted is marked or identified by means of the virtual image component (VIRT) according to a method according to one of claims 1 to 6, and that the component to be mounted is mounted at the mounting location in the technical component, in the motor vehicle or in the motor vehicle component.
  9. Method for producing a technical component, in particular a motor vehicle or a motor vehicle component, in particular method according to claim 7 or 8, characterized in that an assembly sequence for mounting a component to be mounted is determined or verified by means of the virtual image component (VIRT) according to a method according to one of claims 1 to 6, and that the component to be mounted is mounted in the technical component, in the motor vehicle or in the motor vehicle component according to the assembly sequence.
DE102011122206A 2011-12-23 2011-12-23 Method for representation of virtual image component i.e. augmented reality image, on transparent display of augmented reality system, involves determining position of component, and representing virtual image component by display Pending DE102011122206A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102011122206A DE102011122206A1 (en) 2011-12-23 2011-12-23 Method for representation of virtual image component i.e. augmented reality image, on transparent display of augmented reality system, involves determining position of component, and representing virtual image component by display


Publications (1)

Publication Number Publication Date
DE102011122206A1 (en) 2013-06-27

Family

ID=48575578

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102011122206A Pending DE102011122206A1 (en) 2011-12-23 2011-12-23 Method for representation of virtual image component i.e. augmented reality image, on transparent display of augmented reality system, involves determining position of component, and representing virtual image component by display

Country Status (1)

Country Link
DE (1) DE102011122206A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014213113A1 (en) 2014-07-07 2016-01-07 Volkswagen Aktiengesellschaft Three-dimensional augmented reality process, especially in the automotive sector
DE102014019441A1 (en) 2014-12-22 2016-06-23 Audi Ag Method for mounting at least one component
DE102017107224A1 (en) * 2017-04-04 2018-10-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for displaying a component documentation

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0949513A2 (en) 1998-04-08 1999-10-13 Trisen Systems Inc. Virtual reality technology
US6064749A (en) * 1996-08-02 2000-05-16 Hirota; Gentaro Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
WO2001035178A1 (en) * 1999-11-09 2001-05-17 Siemens Aktiengesellschaft System and method for object-oriented marking and organising of information about selected technological components
WO2001096829A1 (en) 2000-06-13 2001-12-20 Volkswagen Aktiengesellschaft Method for testing a product for quality defects and system for carrying out this method
US20020075286A1 (en) * 2000-11-17 2002-06-20 Hiroki Yonezawa Image generating system and method and storage medium
US20020105484A1 (en) * 2000-09-25 2002-08-08 Nassir Navab System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality
DE10215885A1 (en) * 2002-03-20 2003-10-02 Volkswagen Ag Automatic process control
EP1507235A1 (en) 2003-08-15 2005-02-16 Werner G. Lonsing Method and apparatus for producing composite images which contain virtual objects
DE102004061841A1 (en) 2003-12-22 2005-07-14 Augmented Solutions Gmbh Markerless tracking system for augmented reality applications enables search space for features in camera image to be restricted by user by manipulating three-dimensional data model
DE112004000902T5 (en) 2003-06-12 2006-03-09 Siemens Corp. Research, Inc. Calibration of actual and virtual views
US20060250322A1 (en) * 2005-05-09 2006-11-09 Optics 1, Inc. Dynamic vergence and focus control for head-mounted displays
DE102005045973A1 (en) 2005-09-26 2007-04-05 Siemens Ag Device for camera-based tracking has processing arrangement for determining movement information for evaluating images, whereby information characterizes object position change within time interval
DE102009008039A1 (en) * 2008-12-23 2010-07-01 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Augmented reality image producing method for manufacturing technical component of land vehicle, involves selecting color and/or texture of virtual image component, and comparing component with part of real image to augmented reality image
EP1708139B1 (en) 2005-04-01 2011-03-16 Canon Kabushiki Kaisha Calibration method and apparatus
DE102009049073A1 (en) 2009-10-12 2011-04-21 Metaio Gmbh Method for presenting virtual information in a view of a real environment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Article: Mihran Tuceryan, Nassir Navab: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR (see homepages.cwi.nl/~robertl/2IV55/papers/tuceryan.pdf)
Online encyclopedia "Wikipedia", entry for "Head mounted display" dated 20.12.2011 [retrieved on 20.1.2012]. *
TUCERYAN, M. [et al.]: Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR, Proceedings of the IEEE and ACM Symposium on Augmented Reality, 2000, pp. 149-158. *

Similar Documents

Publication Publication Date Title
US10229539B2 (en) Component assembly work support system and component assembly method
CN103502876B (en) For the method and apparatus correcting the projection arrangement of vehicle
US9448758B2 (en) Projecting airplane location specific maintenance history using optical reference points
US20150294505A1 (en) Head mounted display presentation adjustment
JP6138566B2 (en) Component mounting work support system and component mounting method
ES2558255T3 (en) Automated annotation of a view
US20140375816A1 (en) Vehicle Display Device with Movement Compensation
US8587659B1 (en) Method and apparatus for dynamic image registration
CN103792674A (en) Device and method for measuring and correcting distortion of virtual reality displayer
Foxlin et al. Flighttracker: A novel optical/inertial tracker for cockpit enhanced vision
US20140285523A1 (en) Method for Integrating Virtual Object into Vehicle Displays
EP3179334A1 (en) Device and method for testing function or use of a head worn see through augmented reality device
CN105676452A (en) Augmented reality hud display method and device for vehicle
DE102011122206A1 (en) Method for representation of virtual image component i.e. augmented reality image, on transparent display of augmented reality system, involves determining position of component, and representing virtual image component by display
CN106104667B (en) The windshield and its control method of selection controllable areas with light transmission
FR3004817A1 (en) Hybrid visualization system displaying superposition information outside
CN102235869A (en) Method and information system for marking automobile destination
KR20120066472A (en) Apparatus and method for displaying augmented reality contents using a front object
JP2014106642A (en) Ar system using optical see-through type hmd
Foxlin et al. Improved registration for vehicular AR using auto-harmonization
Atac et al. Scorpion hybrid optical-based inertial tracker (HObIT)
DE102009040848A1 (en) Augmented-reality-images generating method for determining relative position between sensor and surface of object e.g. component of land vehicle, involves providing optical time variable marker with display for displaying information
CN108171673A (en) Image processing method, device, vehicle-mounted head-up-display system and vehicle
DE102017201502A1 (en) data glasses
Reisman et al. Design of augmented reality tools for air traffic control towers

Legal Events

Date Code Title Description
R163 Identified publications notified
R012 Request for examination validly filed