DE102014009608A1 - Operation of AR glasses in the motor vehicle - Google Patents

Operation of AR glasses in the motor vehicle

Info

Publication number
DE102014009608A1
Authority
DE
Germany
Prior art keywords
relative position
ar glasses
object
device
motor vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102014009608.4A
Other languages
German (de)
Inventor
Marcus Kühne
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG
Priority to DE102014009608.4A
Publication of DE102014009608A1
Application status: Pending

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/165 - Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 - Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00664 - Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera
    • G06K9/00671 - Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, for providing information about objects in the scene to a user, e.g. as in augmented reality applications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 - Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00791 - Recognising scenes perceived from the perspective of a land vehicle, e.g. recognising lanes, obstacles or traffic signs on road scenes
    • G06K9/00805 - Detecting potential obstacles
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/0138 - Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00 - Other optical systems; Other optical apparatus
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/014 - Head-up displays characterised by optical features comprising information/image processing systems
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 - Specific applications
    • G09G2380/10 - Automotive applications

Abstract

The invention relates to a motor vehicle (10) having a recognition device (26) for detecting a relative position (XYZ1) of a vehicle-external object (14) with respect to a sensor reference point (32), a control device (36) for generating graphics data (38) which represent a graphical object (22, 24) from the perspective of a predetermined target relative position to the object (14), and a display device (18) for fading the graphics data (38) into a field of view (20) of a user (16). The invention addresses the problem of providing high-performance image processing for AR glasses. For this purpose, a localization device (32) is designed to determine a relative position (XYZ2) of the AR glasses (18) with respect to the sensor reference point (32), and a transformation device (34) is designed to determine, from the relative position (XYZ1) of the object (14) and the relative position (XYZ2) of the AR glasses (18), a relative position (XYZ3) of the AR glasses (18) with respect to the object (14) and to pass it to the control device (36) as the target relative position. In this way, a recognition device (26) of the motor vehicle (10) designed for object recognition can be used.

Description

  • The invention relates to a method for fading a virtual graphical object, e.g. a warning symbol, into the field of view of a person sitting in a motor vehicle. To fade in the object, digital graphics data representing the object are displayed by means of smart glasses worn by the person. This technology is also known as augmented reality (AR).
  • In the future, such augmented reality glasses, or AR glasses for short, will become ever more widespread. AR glasses are capable of superimposing spatially anchored, three-dimensional information onto the user's field of view, such as a directional arrow lying on the road or a warning symbol positioned at a source of danger. The road and the source of danger are real objects located outside the motor vehicle, onto which the directional arrow or warning symbol is superimposed as a virtual digital graphical object in the user's field of view. The necessary detection of the external environment, i.e. in particular of an object such as the road or the source of danger, is usually performed by a camera of the AR glasses, which, owing to its arrangement on the glasses, has the same perspective as the user.
  • A more reliable detection and classification of objects in the vehicle environment can, however, be achieved with the sensors installed in the vehicle, as provided in the prior art, for example, for environment monitoring in driver assistance. This environment monitoring can, for example, be camera-based and/or radar-based. The sensor data can be evaluated by a high-performance central computer of the motor vehicle. However, the objects detected by the vehicle-based environment monitoring are detected from the perspective of its sensors. Warning symbols for detected objects can therefore usually only be displayed in correct perspective on a head-up display, for example, since such a display is fixedly arranged with respect to the sensor reference point of the environment monitoring. AR glasses, by contrast, move constantly with the changing head posture of their user, so that the perspective on the vehicle environment changes continually.
  • US 2012/0050138 A1 discloses a method for keeping image content displayed by AR glasses steady with respect to the vehicle environment when the vehicle rocks or vibrates. For this purpose, the head movement of the user and the pitching movement of the vehicle are detected by means of motion sensors and taken into account in the generation of the image content.
  • US 7,180,476 B1 discloses a method for alternately displaying camera data from different video cameras in data glasses. The camera data are selected in dependence on the current head pose of the user wearing the data glasses.
  • US 2008/0079753 A1 discloses a method for operating AR glasses in which graphical objects are superimposed into the field of view of the user and the object size is adjusted in dependence on the eccentricity of the object with respect to the user's line of sight. This compensates for visual blur at the edge of the field of view.
  • The invention addresses the problem of providing high-performance image processing for AR glasses.
  • The problem is solved by the subject matter of the independent claims. Advantageous developments of the invention emerge from the features of the dependent claims.
  • The method according to the invention serves to display a graphical object in a motor vehicle, for example a warning symbol or an arrow. For this purpose, graphics data representing the graphical object are faded into a field of view of the user by a display device, by means of AR glasses located in the motor vehicle.
  • The graphical object is thus superimposed on, or displayed over, the user's visual impression of his surroundings.
  • The graphical object has a spatial reference to a real object in the vehicle environment. This object is detected by a recognition device, i.e. a relative position of the vehicle-external object with respect to a sensor reference point is determined. According to the invention, the recognition device is provided by the motor vehicle, outside the AR glasses. In other words, a high-performance recognition device is available, for example an environment monitoring device of the motor vehicle, which can be camera-based and/or laser-based and/or based on the Car-2-Car or Car-2-X communication standard. This also means, however, that the sensor reference point of the recognition device is a vehicle-fixed reference point.
  • For the graphical object to be displayed in the correct position with respect to an object in the vehicle environment, the graphics data of the graphical object must be generated by a control device in such a way that the graphical object is represented from the perspective of a predefined target relative position to the object.
  • This target relative position should correspond to the eye position of the user. In order to render the graphical object from the perspective of the user of the AR glasses, a relative position of the AR glasses with respect to the sensor reference point is determined by a localization device according to the invention. Furthermore, according to the invention, a transformation device is provided which is designed to determine, from the relative position of the object on the one hand and the relative position of the AR glasses on the other hand (both relative to the sensor reference point), the relative position of the AR glasses with respect to the object, and to pass this relative position to the control device as the target relative position for generating the graphics data of the graphical object. In other words, by providing the localization device and the transformation device, the correct representation, i.e. the position of the graphical object in the field of view of the user, is calculated from the object, vehicle and glasses relationship.
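Purely by way of illustration, the determination described above can be sketched in a few lines. The patent prescribes no implementation; the function names, and the assumption that both measured relative positions are vectors in a common vehicle coordinate system, are hypothetical:

```python
import numpy as np

def glasses_to_object(xyz1_object, xyz2_glasses):
    """Relative position XYZ3 of the object as seen from the AR glasses.

    xyz1_object:  object position relative to the sensor reference point
    xyz2_glasses: glasses position relative to the same reference point
    Both are assumed to be expressed in the vehicle coordinate system.
    """
    xyz1 = np.asarray(xyz1_object, dtype=float)
    xyz2 = np.asarray(xyz2_glasses, dtype=float)
    return xyz1 - xyz2  # vector from the glasses to the object

# Object 20 m ahead of the reference point, glasses slightly behind it:
xyz3 = glasses_to_object([20.0, 0.0, 0.0], [-1.5, 0.3, 0.8])
```

If the two coordinate systems are not aligned, a rotation would have to be applied first; the general case is sketched further below with homogeneous transforms.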
  • The invention has the advantage that an immediate insertion of the graphical object into the field of view of the user is always possible by means of the AR glasses, because the AR glasses reliably follow the head movements of the user. At the same time, the processing power of a recognition device provided by the motor vehicle is available, that is, the recognition device does not have to be accommodated in the AR glasses. A much greater circuit complexity can thus be invested in the design of the recognition device, while relatively light and compact AR glasses can still be used to display the graphics data.
  • It is particularly preferably provided that, when the graphics data are generated, a perspective distortion of the graphical object is additionally set as a function of the target relative position. As a result, the virtual, artificial graphical object fits particularly realistically into the natural environment perceived by the user through the AR glasses. In particular, depth perception of the graphical object is thereby also made possible, so that from the user's point of view the graphical object does not appear to float in front of the object, but is actually perceived at the same distance as the object in the field of view.
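The perspective scaling underlying this depth impression can be sketched with a simple pinhole model. This is an illustrative assumption only; the patent does not specify how the distortion is computed, and the focal length value is a hypothetical placeholder:

```python
import numpy as np

def project_symbol(corners_3d, focal_px=900.0):
    """Project 3D symbol corners (glasses frame: x right, y down,
    z forward) onto a virtual image plane with a pinhole model."""
    pts = np.asarray(corners_3d, dtype=float)
    z = pts[:, 2:3]                     # depth of each corner
    return focal_px * pts[:, :2] / z    # farther corners shrink on screen

# The same corner at 5 m and at 20 m distance:
near = project_symbol([[0.5, 0.0, 5.0]])
far = project_symbol([[0.5, 0.0, 20.0]])
```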
  • A further advantage results if an on-board sensor system of the motor vehicle is provided as the localization device for the AR glasses. Such a system has the advantage that it is itself fixedly arranged with respect to the sensor reference point, so that reliable localization of the AR glasses with respect to the sensor reference point is ensured.
  • According to one embodiment of the invention, the localization of the AR glasses is realized with a sensor system comprising at least one of the following sensors: a video camera, an infrared camera, a ToF camera (ToF: time of flight). A ToF camera can be realized, for example, by means of a PMD sensor (PMD: photonic mixing device). An infrared camera has the advantage that the user's head and the AR glasses can be reliably detected by image-processing algorithms independently of the lighting situation in the passenger compartment, such as may be caused by external light sources, for example solar radiation. A ToF camera has the advantage that depth information, and thus a 3D position, is immediately available.
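How a ToF camera yields an immediate 3D position can be sketched as a pinhole back-projection of a pixel plus its measured depth. This is illustrative only; the intrinsic parameters are hypothetical placeholder values, not properties of any particular sensor:

```python
import numpy as np

def backproject(u, v, depth_m, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Turn a ToF pixel (u, v) and its measured depth into a 3D point
    in the camera frame (hypothetical camera intrinsics)."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return np.array([x, y, depth_m])
```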
  • To make the localization of the AR glasses in the passenger compartment even more reliable, it is preferably provided that at least one marker is provided on the AR glasses which is configured to be recognizable by the localization device. The marker comprises at least one of the following marking means: a fluorescent color; an infrared light-emitting diode radiating into the surroundings of the AR glasses; a radiation source configured to emit a coded detection signal; a graphical pattern that correlates with pattern data of an image recognizer. A marker has the advantage that the sensor signal of the marker can be searched for in a targeted manner in order to localize the AR glasses. The fluorescent color may, for example, fluoresce in the infrared; it can then be detected with an infrared camera. An infrared light-emitting diode radiates very brightly and can be reliably located in the image of an infrared camera, for example by segmentation. A radiation source for a coded detection signal may be, for example, an RFID circuit. A graphical pattern that can be recognized by an image recognizer on the basis of correlation may be, for example, a bar code, a QR code or a cross.
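Locating an infrared light-emitting diode by segmentation, as mentioned above, can be sketched as a simple threshold-and-centroid step. This is illustrative only; the threshold value and image dimensions are hypothetical:

```python
import numpy as np

def locate_ir_marker(ir_image, threshold=200):
    """Return the centroid (row, col) of the bright pixels in an IR
    frame, a crude stand-in for segmenting an infrared LED marker."""
    mask = np.asarray(ir_image) >= threshold
    if not mask.any():
        return None  # no marker visible in this frame
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# A tiny synthetic frame with a bright 2x2 patch standing in for the LED:
frame = np.zeros((8, 8))
frame[2:4, 5:7] = 255
```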
  • Preferably, in addition to the relative position of the AR glasses with respect to the sensor reference point, a spatial orientation of the AR glasses is detected by the localization device, and when the graphical object is displayed in the field of view of the user, the presentation location of the graphical object is set as a function of this orientation. Pitching movements and/or lateral rotations of the head are then taken into account, so that the graphical object does not migrate with these head movements but remains anchored on the object in the field of view of the user.
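Taking a lateral head rotation into account can be sketched as rotating the glasses-to-object vector by the measured yaw. This is illustrative only; a vehicle frame with x forward, y left, z up is assumed:

```python
import numpy as np

def to_glasses_frame(xyz3, yaw_rad):
    """Rotate the glasses-to-object vector by the negative head yaw so
    the symbol stays anchored on the object while the head turns."""
    c, s = np.cos(-yaw_rad), np.sin(-yaw_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return R @ np.asarray(xyz3, dtype=float)
```

With the head turned 90 degrees to the left, an object straight ahead of the vehicle ends up on the right side of the glasses frame.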
  • The detection of the orientation of the AR glasses can also be used advantageously to parameterize driver assistance systems. For example, if an object lies on the road and is detected by a per se known environment monitoring device of the motor vehicle, the environment monitoring device generates a collision warning signal, so that, for example, a warning exclamation mark is displayed as the graphical object. If the user is not looking at the road ahead but, for example, obliquely to the side towards the roadside, it can be assumed that the user does not immediately perceive the object on the road, and thus also not the warning symbol. Accordingly, depending on the warning signal and on the orientation of the AR glasses, from which the user's sideways gaze can be recognized, a driver assistance system can be armed, which can then automatically initiate an evasive maneuver or a braking maneuver, for example, when a collision with the object can no longer be prevented otherwise. In other words, the parameterization signal for the driver assistance system sets, for example, a switching threshold for triggering a maneuver or, for example, the distance of the brake pads from a brake disc, in order to reduce the reaction time of the driver assistance system. One embodiment of the invention therefore provides for defining the graphical object (e.g. exclamation mark or arrow) as a function of a warning signal of an environment monitoring device of the motor vehicle, for also parameterizing a driver assistance system, and for additionally making this parameterization dependent on the orientation of the AR glasses.
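Such a parameterization depending on the warning signal and the gaze direction can be sketched as follows. This is illustrative only; all thresholds and the time-to-collision criterion are hypothetical stand-ins for whatever criterion an actual assistance system uses:

```python
def assistance_threshold(warning_active, gaze_yaw_deg,
                         base_ttc_s=0.8, armed_ttc_s=1.5):
    """Pick the time-to-collision threshold at which the assistance
    system intervenes. If a warning is active but the driver looks
    away (large gaze yaw), intervene earlier (hypothetical values)."""
    looking_away = abs(gaze_yaw_deg) > 25.0
    if warning_active and looking_away:
        return armed_ttc_s  # armed: react earlier
    return base_ttc_s       # default: trust the driver's attention
```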
  • In connection with the generation of the graphics data, one embodiment of the invention provides that the control device generates the graphics data by rendering 3D vector data or by shearing rasterized digital image data. 3D vector data can be adapted particularly easily for the desired perspective representation with respect to a target relative position and then converted by rendering into a graphic that can be displayed by the AR glasses. If a finished rasterized symbol, i.e. a pixel graphic of the graphical object, is already available, a perspective distortion can be achieved by means of a shear.
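A shear of a rasterized symbol can be sketched as a per-row column shift. This is illustrative only; a real implementation would interpolate subpixel positions rather than shift whole pixels:

```python
import numpy as np

def shear_x(image, k):
    """Shear a raster symbol horizontally: the column shift grows
    linearly with the row index (shear factor k)."""
    h, w = image.shape
    out = np.zeros_like(image)
    for r in range(h):
        shift = int(k * r)          # truncate to whole pixels
        for c in range(w):
            nc = c + shift
            if 0 <= nc < w:
                out[r, nc] = image[r, c]
    return out

symbol = np.zeros((4, 6), dtype=int)
symbol[:, 2] = 1                    # a vertical bar as the symbol
sheared = shear_x(symbol, 0.5)      # bar now leans to the right
```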
  • In order to determine the relative position of the AR glasses with respect to the object, one embodiment of the invention provides that the transformation device maps a coordinate system of the motor vehicle and a coordinate system of the AR glasses onto one another by means of a coordinate transformation. This has the advantage that the position of the AR glasses with respect to the vehicle-external object is directly available without further elaborate calculations.
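Mapping the two coordinate systems onto one another can be sketched with homogeneous 4x4 transforms. This is illustrative only; the pose values are hypothetical, and a pure translation is used so the result is easy to check:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the glasses in the vehicle frame, as measured by the
# localization device (illustrative values, no head rotation):
T_vehicle_glasses = make_pose(np.eye(3), [-1.5, 0.3, 0.8])

# An object point detected in vehicle coordinates:
p_vehicle = np.array([20.0, 0.0, 0.0, 1.0])

# Express it in glasses coordinates by inverting the pose:
p_glasses = np.linalg.inv(T_vehicle_glasses) @ p_vehicle
```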
  • As already stated, the invention also includes a motor vehicle. The motor vehicle according to the invention has a recognition device for detecting a relative position of a vehicle-external object with respect to a sensor reference point, i.e., for example, an environment monitoring device as known per se from the prior art, for example in connection with collision early warning. Furthermore, the motor vehicle has a control device for generating graphics data which represent a graphical object from the perspective of a predetermined target relative position to the object. Such a control device can be provided, for example, by a central processing device or an infotainment system of the motor vehicle, for example by providing a corresponding program module for generating the graphics data and executing it on the corresponding processor devices. Furthermore, a display device for driving AR glasses is provided in the motor vehicle according to the invention, which is designed to fade graphics data into a field of view of the user by means of the AR glasses. The display device can transmit the graphics data to the AR glasses via a radio link, for example based on WLAN (wireless local area network) or Bluetooth.
  • The motor vehicle according to the invention further comprises the localization device, which is designed to determine a relative position of the AR glasses with respect to the sensor reference point, i.e. to locate the AR glasses in the passenger compartment. Furthermore, a transformation device is provided which is designed to determine, from the relative position of the object on the one hand and the relative position of the AR glasses on the other hand, both of which are related to the sensor reference point, the relative position of the AR glasses with respect to the object, and to pass this new relative position to the control device, which takes it as the target relative position when generating the graphics data.
  • The motor vehicle according to the invention has the advantage that, on the one hand, the computing power of the processor devices of the motor vehicle can be used. On the other hand, the graphical object can be faded into the field of view of the user by means of the AR glasses, i.e. even when the user moves his head. The advantages of the vehicle-side processing power and of the flexible display capability of AR glasses are thus combined.
  • An embodiment of the invention is described below. For this purpose, the single figure (FIG.) shows a schematic representation of an embodiment of the motor vehicle according to the invention.
  • The exemplary embodiment explained below is a preferred embodiment of the invention. In the exemplary embodiment, however, the described components each represent individual features of the invention which are to be considered independently of one another, which also develop the invention independently of one another, and which are therefore also to be regarded as part of the invention individually or in combinations other than the one shown. Furthermore, the described embodiment can also be supplemented by further features of the invention already described.
  • The figure shows a motor vehicle 10, which may in particular be a passenger car. The motor vehicle 10 may, for example, be driving on a road 12. For the explanation of the embodiment, it is assumed that an object 14 is located on the road 12. A driver 16 of the motor vehicle 10 can wear AR glasses 18 while driving, through which he can view his surroundings, or which are designed to present a stereoscopic view of a camera image of the surroundings via two monitors. A field of view 20 of the driver 16 is also shown in the figure.
  • Through the AR glasses 18, the driver sees not only the road 12 and the object 14; in addition, two virtual graphical objects 22, 24 are faded into his field of view 20, which may, for example, be icons or other symbols, or also text information or image information (not shown). In the example shown, the graphical object 22 is a warning symbol that is displayed in the field of view 20 of the user in the correct position on the object 14. The graphical object 24 is a driving instruction in the form of an arrow which indicates to the driver an evasive maneuver for avoiding the object 14. The graphical objects 22, 24 remain positioned correctly in the field of view 20 on the object 14 (in the case of object 22) or on the road 12 (in the case of object 24), even if the driver 16 moves or turns his head.
  • The object 14, and thus the danger emanating from it, has been detected by a recognition device 26 of the motor vehicle 10. The recognition device 26 may, for example, be an environment monitoring system of a driver assistance system of the motor vehicle, as is known from driver assistance systems, e.g. in connection with collision early detection.
  • The recognition device 26 may comprise, for example, at least one sensor 28, for example a camera or a radar, and an evaluation device 30. The evaluation device 30 may, for example, be a program module or a control unit which receives the sensor data of the sensor 28 and determines from them, for example, a relative position XYZ1 of the object 14 with respect to a sensor reference point 32. The sensor reference point 32 is a vehicle-fixed point. In the following, only the relative position XYZ1 of the object 14 is considered for the display of the graphical object 22. The relative position of the road 12 for the display of the graphical object 24, not described further below, can be determined in the same way.
  • By moving his head, the driver 16 also moves the AR glasses 18, i.e. the AR glasses 18 move with respect to the sensor reference point 32. The location of the graphical objects 22, 24 in the field of view 20 of the driver 16 therefore cannot be determined directly from the relative position XYZ1. The motor vehicle 10 therefore contains a localization device 32 which can determine a relative position XYZ2 of the AR glasses 18 with respect to the sensor reference point 32. The localization device 32 may comprise, for example, an infrared camera. A transformation device 34 can receive the relative position XYZ1 from the evaluation device 30 and the relative position XYZ2 from the localization device 32 as position data, for example each as a vector, and calculate from them, for example by means of a coordinate transformation and/or an addition of the position data or of the transformed position data, a relative position XYZ3 of the AR glasses with respect to the object 14. This relative position XYZ3 can be transmitted to a control device 36 of the AR glasses 18. The control device 36 and the transformation device 34 may, for example, be program modules which are executed by a processor device of the motor vehicle or a processor device of the AR glasses 18.
  • The control device 36 can generate, as a function of the relative position XYZ3, graphics data 38 which represent the two graphical objects 22, 24. The graphics data 38 can be output to the AR glasses 18, or to a projection device of the AR glasses 18, and the AR glasses 18 or their projection device are thereby controlled so that the graphics data, and thus the graphical objects 22, 24, are displayed in the field of view 20.
  • If the user 16 now moves his head, the relative position XYZ2 with respect to the sensor reference point 32 changes. This change is detected by the localization device 32 and converted by the transformation device 34 into a new relative position XYZ3, so that the display position of the objects 22, 24 in the field of view 20 also changes, i.e. is adapted to the new head position of the user 16.
  • Overall, therefore, the detection of the external environment takes place via the infrastructure installed in the motor vehicle 10 and can nevertheless be passed on to the AR glasses. A technical prerequisite for a perspectively correct representation of points of interest outside the vehicle, i.e. the object 14 and further objects, is that the exact spatial position of the AR glasses 18 within the motor vehicle is known. To this end, the invention provides that the position of the AR glasses 18 is detected by a sensor system built into the vehicle 10, for example an infrared-camera-based sensor system in the form of the localization device 32, and that the correct representation of the marker points is calculated from the relationship between point of interest, vehicle and glasses. An additional piece of information is the orientation of the glasses: the vehicle systems thus know in which direction the driver is currently looking. This can be used particularly advantageously in the parameterization of driver assistance systems, as already described.
  • Overall, the example shows how the invention can ensure the correct reproduction of external information in a motor vehicle by determining the position of AR glasses.
  • PATENT CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of documents cited by the applicant was generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • US 2012/0050138 A1 [0004]
    • US 7180476 B1 [0005]
    • US 2008/0079753 A1 [0006]

Claims (10)

  1. Method for displaying a graphical object (22, 24) in a motor vehicle (10), wherein - a relative position (XYZ1) of a vehicle-external object (14) with respect to a sensor reference point (32) is detected by a recognition device (26), - graphics data (38) representing the graphical object (22, 24) from the perspective of a predetermined target relative position to the object (14) are generated by a control device (36), and - the graphics data (38) are faded into a field of view (20) of a user (16) by a display device (18) by means of AR glasses (18) located in the motor vehicle (10), characterized in that - the recognition device (26) is provided by the motor vehicle (10) outside the AR glasses (18), - a relative position (XYZ2) of the AR glasses (18) with respect to the sensor reference point (32) is determined by a localization device (32), and - a relative position (XYZ3) of the AR glasses (18) with respect to the object (14) is determined by a transformation device (34) from the relative position (XYZ1) of the object (14) and the relative position (XYZ2) of the AR glasses (18) and is passed to the control device (36) as the target relative position for the generation of the graphics data (38).
  2. The method of claim 1, wherein, when the graphics data (38) are generated, a perspective distortion of the graphical object (22, 24) is additionally set as a function of the target relative position.
  3. Method according to one of the preceding claims, wherein an on-board sensor system, which is fixedly arranged with respect to the sensor reference point (32), is provided as the localization device (32).
  4. The method of claim 3, wherein the sensor system comprises at least one of the following sensors: an infrared camera, a ToF camera.
  5. Method according to one of the preceding claims, wherein the AR glasses (18) have at least one marker configured to be recognizable by the localization device (32), the marker comprising at least one of the following marking means: a fluorescent color, an infrared light-emitting diode radiating into the surroundings of the AR glasses, a radiation source configured to emit a coded detection signal, a graphical pattern correlating with pattern data of an image recognizer.
  6. Method according to one of the preceding claims, wherein a spatial orientation of the AR glasses (18) is detected by the localization device (32) and a presentation location of the graphical object (22, 24) in the field of view (20) is set as a function of the orientation.
  7. Method according to claim 6, wherein the graphical object (22, 24) is defined as a function of a warning signal of an environment monitoring device of the motor vehicle (10), and a parameterization signal is transmitted to at least one driver assistance system of the motor vehicle as a function of the warning signal and of the orientation of the AR glasses (18).
  8. Method according to one of the preceding claims, wherein the control device (36) generates the graphics data (38) by rendering 3D vector data or by shearing rasterized digital image data.
  9. Method according to one of the preceding claims, wherein the transformation device (34) determines the relative position (XYZ3) of the AR glasses (18) with respect to the object (14) by mapping a coordinate system of the motor vehicle (10) and a coordinate system of the AR glasses (18) onto one another.
  10. Motor vehicle (10) comprising: - a recognition device (26) for detecting a relative position (XYZ1) of a vehicle-external object (14) with respect to a sensor reference point (32), - a control device (36) for generating graphics data (38) which represent a graphical object (22, 24) from the perspective of a predetermined desired relative position to the object (14), and - a display device (18) for driving AR glasses (18) and thereby fading the graphics data (38) into a field of vision (20) of a user (16), characterized by - a localization device (32) adapted to determine a relative position (XYZ2) of the AR glasses (18) with respect to the sensor reference point (32), and - a transformation device (34) adapted to determine, from the relative position (XYZ1) of the object (14) and the relative position (XYZ2) of the AR glasses (18), both relating to the sensor reference point (32), a relative position (XYZ3) of the AR glasses (18) with respect to the object (14) and to transmit it to the control device (36) as the desired relative position.
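Claims 9 and 10 describe chaining two relative positions measured against a common sensor reference point to obtain the pose of the object in the coordinate system of the AR glasses. A minimal sketch of this transformation using homogeneous 4x4 matrices follows; it is an illustration only, not code from the patent, and all function names and example coordinates are hypothetical.

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# XYZ1: pose of the vehicle-external object relative to the sensor reference point
T_sensor_object = pose_matrix(np.eye(3), [12.0, -1.5, 0.0])

# XYZ2: pose of the AR glasses relative to the same sensor reference point
T_sensor_glasses = pose_matrix(np.eye(3), [0.4, 0.3, 1.1])

# XYZ3: pose of the object in the glasses' coordinate system,
# obtained by inverting the glasses pose and chaining it with the object pose
T_glasses_object = np.linalg.inv(T_sensor_glasses) @ T_sensor_object

# Position of the object as seen from the AR glasses
print(T_glasses_object[:3, 3])
```

With identity rotations, the result is simply the difference of the two translations; in general the inversion also accounts for the head orientation determined by the localization device, which is what allows the graphical object to be rendered with the correct perspective for the wearer.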
DE102014009608.4A 2014-06-27 2014-06-27 Operation of AR glasses in the motor vehicle Pending DE102014009608A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102014009608.4A DE102014009608A1 (en) 2014-06-27 2014-06-27 Operation of AR glasses in the motor vehicle


Publications (1)

Publication Number Publication Date
DE102014009608A1 true DE102014009608A1 (en) 2015-12-31

Family

ID=54839475

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102014009608.4A Pending DE102014009608A1 (en) 2014-06-27 2014-06-27 Operation of AR glasses in the motor vehicle

Country Status (1)

Country Link
DE (1) DE102014009608A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019072481A1 (en) * 2017-10-09 2019-04-18 Audi Ag Method for operating a display device in a motor vehicle

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10013478A1 (en) * 2000-03-18 2001-09-20 Volkswagen Ag Assistance system for drivers displaying navigation and vehicle data in field of vision, supplemented by audible information, includes data spectacles with receiver
US20040051680A1 (en) * 2002-09-25 2004-03-18 Azuma Ronald T. Optical see-through augmented reality modified-scale display
US7180476B1 (en) 1999-06-30 2007-02-20 The Boeing Company Exterior aircraft vision system using a helmet-mounted display
DE102006006001B3 (en) * 2006-02-08 2007-10-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and arrangement for inserting location-related information into a visual representation or view of a scene
US20080079753A1 (en) 2003-12-01 2008-04-03 Volvo Technology Corporation Method and system for presenting information
EP1990674A1 (en) * 2007-05-09 2008-11-12 Harman Becker Automotive Systems GmbH Head-mounted display system
DE102009049073A1 (en) * 2009-10-12 2011-04-21 Metaio Gmbh Method for presenting virtual information in a view of a real environment
US20120050138A1 (en) 2009-03-30 2012-03-01 Aisin Aw Co., Ltd. Information display apparatus
DE102012216057A1 (en) * 2012-09-11 2014-05-28 Bayerische Motoren Werke Aktiengesellschaft Arrange displays in a head-mounted display



Similar Documents

Publication Publication Date Title
DE102012216623B4 (en) Method for dynamic information display on a head-up display
EP1510849B1 (en) A virtual display device for use in a vehicle
EP1916154B1 (en) Method for displaying information
JP4770931B2 (en) Display device
KR101906133B1 (en) Visual driver information and warning system for a driver of a motor vehicle
US20040178894A1 (en) Head-up display system and method for carrying out the location-correct display of an object situated outside a vehicle with regard to the position of the driver
WO2010029707A2 (en) Image projection system and image projection method
JP2008205914A (en) Image processor
US10081370B2 (en) System for a vehicle
JP4475308B2 (en) Display device
US9760782B2 (en) Method for representing objects surrounding a vehicle on the display of a display device
JP5096836B2 (en) Method and system for imaging the surroundings of a vehicle
JP2008094377A (en) Vehicular display device
US20120235805A1 (en) Information display apparatus and information display method
DE102013110332B4 (en) Visual guidance system
US20150062168A1 (en) System and method for providing augmented reality based directions based on verbal and gestural cues
US9690104B2 (en) Augmented reality HUD display method and device for vehicle
CN104584102A (en) Method for supplementing object information assigned to an object and method for selecting objects in surroundings of a vehicle
US20120133738A1 (en) Data Processing System and Method for Providing at Least One Driver Assistance Function
KR20170024904A (en) Drive assistance appratus and method for controlling the same
KR20120127830A (en) User interface method for terminal of vehicle and apparatus tererof
US20160307374A1 (en) Method and system for providing information associated with a view of a real environment superimposed with a virtual object
US10279741B2 (en) Display control apparatus, method, recording medium, and vehicle
US20140098008A1 (en) Method and apparatus for vehicle enabled visual augmentation
CN106205175A (en) display device and vehicle for vehicle

Legal Events

Date Code Title Description
R012 Request for examination validly filed
R079 Amendment of ipc main class

Free format text: PREVIOUS MAIN CLASS: G09G0005000000

Ipc: G08G0001160000

R016 Response to examination communication