WO2012152475A1 - Procédés et dispositif servant à étalonner un système de projection d'un véhicule - Google Patents
- Publication number
- WO2012152475A1 (PCT/EP2012/054353)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection device
- vehicle
- image
- information
- real object
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/011—Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0181—Adaptation to the pilot/driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B2027/0192—Supplementary details
- G02B2027/0198—System for aligning or maintaining alignment of an image in a predetermined direction
Definitions
- the present invention relates to a method and a device for calibrating a projection device of a vehicle and to a corresponding computer program product.
- DE 10 2004 035 896 A1 is concerned with a head-up display with which a visual field of a driver of a vehicle can be scanned.
- the present invention provides a method for calibrating a projection device of a vehicle, furthermore a device which uses this method and finally a corresponding computer program product according to the main claims.
- Advantageous embodiments emerge from the respective subclaims and the following description.
- a virtual image can be faded into a field of view of an occupant of the vehicle.
- the calibration of the projection device is necessary.
- positions of the virtual image and the real object which are visible to the occupant can be compared with one another. If the positions are in the predetermined relationship to each other, no further calibration may be required. If, on the other hand, the positions deviate from the predetermined reference, the calibration information can be determined based on the deviation.
- the present invention provides a method for calibrating a projection device of a vehicle, wherein the projection device is designed to project a virtual image associated with a real object into a beam path between the real object and an assumed head position of an occupant of the vehicle, and wherein the method comprises the following steps:
- detecting light information arriving along the beam path at the assumed head position, the light information representing, on the one hand, light originating from the real object located outside the vehicle and, on the other hand, light originating from the projection device in order to project the virtual image associated with the real object into the beam path;
- determining calibration information for the projection device based on a position of the virtual image determined from the light information and a position of the real object determined from the light information.
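The determining step can be sketched as a comparison of the two positions extracted from the same captured light information; the pixel representation and the function name are illustrative assumptions, not taken from the patent:

```python
def calibration_offset(virtual_pos, real_pos):
    """Deviation (dx, dy) between the virtual image and the real object,
    both measured in pixels of the same captured image.
    Illustrative sketch; names and units are assumptions."""
    dx = real_pos[0] - virtual_pos[0]
    dy = real_pos[1] - virtual_pos[1]
    return dx, dy

# If the virtual marker lies at (310, 205) and the real object at (300, 200),
# the projection would have to be shifted by (-10, -5) to overlap.
offset = calibration_offset((310, 205), (300, 200))
```

The resulting offset is the raw deviation from which the calibration information would be derived.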
- the virtual image can be displayed in the field of view of an occupant, such as a driver, of the vehicle.
- the image may be an image of an object, a mark, a character, or a word.
- the image may be related to the real object located outside the vehicle.
- the real object may be, for example, a lane, a lane marking, a traffic sign, another vehicle or another person.
- the virtual image can be displayed so that, from the perspective of the occupant, it is superimposed on the real object or arranged next to the real object.
- the calibration may be required to display the virtual image at a desired position relative to the real object.
- the projection device may be a head-up display. In particular, it may be a so-called contact analog or augmented head-up display.
- the area within which the image is visible is referred to as the eyebox.
- the HUD is constructed so that the eyebox includes the points or areas inside the vehicle that typically house the head of the occupant, and in particular the eyes of the occupant.
- the virtual image can be faded in so that it is clearly visible from the head position.
- the beam path can be represented by a connecting line between the real object and the head position.
- the detection of the light information can be carried out by means of an image capture device, for example a camera.
- the detected light information can be mapped into a digital image, for example.
- the image may be evaluated with a suitable image evaluation, for example, to recognize the virtual image and the real object in the image and to relate the positions of the virtual image and the object within the image to each other.
- the positions determined from the light information can be compared with nominal positions. It is also possible to compare an arrangement of the positions determined from the light information with one another with a desired arrangement. Corresponding desired specifications may be predetermined or provided by the projection device.
- a target specification may be that the positions of the virtual image and the real object overlap from the perspective of the occupant, that is, from the head position.
- based on a determined deviation, compensation information which compensates for the deviation can be determined.
- the compensation information may be provided to the projection device.
- the compensation information can cause the projection of the virtual image by the projection device to be changed in such a way that the target specification is met. If a plurality of virtual images are projected into the beam path by the projection device, then the step of determining the calibration information for each virtual image can be repeated.
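Whether the target specification is met — virtual image and real object overlapping as seen from the head position — can be sketched as a tolerance check; the tolerance value and function name are assumptions for illustration:

```python
def meets_target(virtual_pos, real_pos, tol=2.0):
    """True if the virtual image and the real object overlap within a
    pixel tolerance (tolerance of 2 px is an assumed example value)."""
    dx = virtual_pos[0] - real_pos[0]
    dy = virtual_pos[1] - real_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol
```

If the check fails, compensation information is determined and the projection is changed; for several virtual images, the check is repeated per image.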
- the projection device can be aware of a real position of the real object.
- the real position can be specified for the calibration.
- the real position can also be determined, for example, by means of a sensor for detecting the surroundings and provided to the projection device.
- the method may include a step of placing an imager at the assumed head position.
- in the step of detecting, the incoming light information can be detected by the image sensor.
- the image sensor may be part of an image capture device, such as a camera.
- a digital image can be generated from the light having the light information.
- the compensation information can be determined.
- the imager may be arranged at the head position by a person in order to carry out the calibration method. This procedure is advantageous if the vehicle has no suitable image sensor.
- the light information arriving at the assumed head position may be redirected to an image capturing device.
- the image capture device may be located outside of the assumed head position.
- the mirror can be arranged in the appropriate position by a person. This approach is advantageous if it makes more sense not to place the image capture device directly at the head position.
- the redirecting can be performed by arranging a deflection element, such as a mirror, at the head position and aligning it appropriately. This procedure is also suitable if the vehicle already has a suitable image capture device.
- the vehicle may include an interior camera for monitoring an interior of the vehicle.
- This indoor camera can be used as an image capture device.
- a mirror is arranged at the head position so that the beam path is deflected to the image capture device.
- the step of redirecting may be performed repeatedly for different assumed head positions.
- the step of detecting may be repeatedly performed to acquire image information for each of the different assumed head positions.
- the assumed head position which corresponds to an actual position of the eyebox can be determined.
- the actual position of the eye-box can be detected by means of a suitable image evaluation, for example by comparing images taken at different head positions with each other.
- the calibration information may be determined based on the light information acquired at the actual position of the eye-box. This procedure is useful if the actual position of the eye-box is not known in advance or exact positioning of the deflecting element is complicated. In this case, the deflection element can be moved until the actual position of the eye box is found.
- information for predistortion of image data to be output by the projection device can be determined.
- the predistortion can make sense so that the virtual image appears undistorted to the occupant, even if he moves his head slightly and thus changes his viewing direction somewhat.
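A minimal sketch of such a predistortion, assuming a simple radial model with a single coefficient k1 (the patent does not specify the distortion model; this is an illustrative assumption):

```python
def predistort(x, y, k1):
    """Apply a simple radial pre-distortion to normalized image
    coordinates (x, y); k1 is an assumed distortion coefficient.
    The projection optics then cancel this distortion for the viewer."""
    r2 = x * x + y * y          # squared distance from the image center
    scale = 1.0 + k1 * r2       # radial scaling term
    return x * scale, y * scale
```

Image data would be resampled through this mapping before being output by the projection device.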
- the method may comprise a step of arranging a diaphragm in a projection beam path of the projection device.
- the assumed head position may be determined as a position at which the virtual image is visible. Whether the virtual image is visible can be determined by evaluating the incoming light information. For example, the image sensor or the deflection element can be moved back and forth between possible head positions until the detected light information comprises the virtual image. The movement of the deflecting element can be performed by a person or a holding device for the deflecting element. The resulting head position is used as the assumed head position.
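The back-and-forth search for a head position at which the virtual image becomes visible can be sketched as a scan over candidate positions; the detector callback stands in for the evaluation of the incoming light information:

```python
def find_head_position(candidate_positions, image_contains_virtual):
    """Move the image sensor (or deflection element) through candidate
    head positions and return the first one at which the captured light
    information contains the virtual image.
    `image_contains_virtual` is an assumed detector callback."""
    for pos in candidate_positions:
        if image_contains_virtual(pos):
            return pos
    return None  # virtual image not found at any candidate position
```

The returned position is then used as the assumed head position.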
- the method may also include a step of projecting the virtual image associated with the real object.
- the virtual image can be projected into the beam path by means of the projection device.
- the present invention further provides a device which is designed to carry out or implement the steps of the method according to the invention in corresponding units. This embodiment of the invention in the form of a device also allows the object underlying the invention to be achieved quickly and efficiently.
- a device can be understood as meaning one or more electrical devices which process sensor signals and output control signals in dependence thereon.
- the device may have an interface, which may be formed in hardware and / or software.
- the interfaces can be part of a so-called system ASIC, for example, which contains a wide variety of functions of the device.
- the interfaces may be separate integrated circuits or consist at least partially of discrete components.
- the interfaces may be software modules that are present, for example, on a microcontroller in addition to other software modules.
- Also of advantage is a computer program product with program code which can be stored on a machine-readable carrier such as a semiconductor memory, a hard disk memory or an optical memory and used to carry out the method according to one of the embodiments described above, when the program is executed on a computer.
- FIG. 1 shows a vehicle with a projection device according to an embodiment of the present invention.
- FIG. 2 shows an illustration of a virtual camera
- FIG. 3 shows a vehicle with a projection device according to a further exemplary embodiment of the present invention
- FIG. 1 shows a vehicle 100 with a projection device 102 according to an embodiment of the present invention.
- the vehicle 100 has a pane 104.
- a head portion 106 of an occupant of the vehicle 100 is shown.
- an object 108, here a sign, is arranged outside the vehicle 100 in the field of view of the occupant through the pane 104.
- An optical path 110 travels from the object 108 to the head region 106 and in particular to a position of the eyes of the occupant. Via the beam path 110, light emitted or reflected by the object 108 strikes the head region 106 of the occupant.
- the projection device 102 is arranged inside the vehicle 100.
- the projection device 102 is designed to project a virtual image into the field of view of the occupant.
- the projection device 102 is designed to emit light which, via a further beam path 112, strikes a reflection element, for example the pane 104, and is coupled by the reflection element into the beam path 110 in the direction of the head region 106.
- the occupant of the vehicle has the impression that the virtual image emitted by the projection device 102 is located outside the vehicle 100 in the region of the beam path 110.
- the projection device 102 may be configured to project the virtual image with respect to the real object 108 into the field of view of the occupant.
- the virtual image may be displayed congruent with the real object 108.
- for this purpose, it is necessary for the projection device 102 to have at least information about a position of the real object 108 with respect to the vehicle 100.
- the information about the position may include a direction and a distance between a reference point of the vehicle 100 and the position of the real object 108.
- the projection device 102 can have an interface to receive the information about the position. Via the interface, the projection device 102 can also be provided with information about a type of the real object 108. In this way, the projection device 102 can adapt an image content of the virtual image to the real object 108.
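The position information passed over this interface can be sketched as a small data structure; the field names and example values are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Illustrative payload for the projection-device interface:
    direction and distance from a vehicle reference point, plus an
    object type for adapting the image content of the virtual image."""
    direction_deg: float   # bearing of the real object relative to the vehicle axis
    distance_m: float      # distance from the reference point to the real object
    object_type: str       # e.g. "traffic_sign", "lane_marking", "vehicle"

info = ObjectInfo(direction_deg=-3.5, distance_m=42.0, object_type="traffic_sign")
```

The object type allows the projection device to choose a suitable symbol for the virtual image.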
- the information about the position of the real object 108 may be provided to the projection device 102 by an environment detection device 114.
- the surroundings detection device 114 is arranged on the vehicle 100 and is designed to detect the surroundings of the vehicle 100 at least in a partial area of the field of view of the occupant.
- the surroundings detection device 114 is designed to detect the real object 108 and to determine the position of the real object 108 relative to the vehicle 100.
- the environment detection device 114 may also be designed to classify a type of the real object 108 by means of an object recognition.
- the environment detection device 114 can represent an ACC system (Adaptive Cruise Control) or a part of such a system.
- the information about the position of the real object 108 may also be fixed in the projection device 102 or input via an input interface, for example by an operator.
- the projection device 102 may be a head-up display.
- the head-up display is a system that displays information directly in the driver's field of vision.
- special optics are used which, by reflection in the windshield 104, give the driver the impression that an image is floating in front of the vehicle 100.
- thus the driver no longer needs to look down to see important indications such as speed or directional guidance.
- the image of a HUD 102 is visible only to the driver, and only if his eyes are in the so-called Eyebox (EB). This is a 3-dimensional space typically 25 cm wide and 10 cm high from the driver's point of view.
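The stated eyebox extent (about 25 cm wide and 10 cm high) can be turned into a minimal membership check; the coordinate convention (lateral and vertical offsets measured from the eyebox center) and the function name are assumptions:

```python
def in_eyebox(eye_y_cm, eye_z_cm, width_cm=25.0, height_cm=10.0):
    """True if an eye position (lateral y, vertical z, relative to the
    eyebox center) lies inside the eyebox cross-section; the 25 x 10 cm
    extent follows the dimensions given in the text."""
    return abs(eye_y_cm) <= width_cm / 2 and abs(eye_z_cm) <= height_cm / 2
```

Only when this check holds for the driver's eyes is the HUD image visible.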
- the head-up display 102 may selectively overlay information on the real image at reference points in space.
- such a display is commonly called a contact-analog or augmented head-up display.
- with the help of this technique, the driver can, for example, be given navigation instructions in such a way that the road he is to follow is marked in color. Lanes or preceding vehicles detected by the ACC system 114 can also be displayed to the driver directly in the field of view.
- various approaches are known. In autostereoscopic or binocular systems, different images are created for the two eyes and made visible to each eye via appropriate optics.
- the conversion of the 3D scene to 2D is performed by a so-called renderer, which calculates the 2D image of the scene from the driver's perspective and the 3D visualization data.
- a driver's head position estimation may be necessary.
- the position of the head region 106 in the interior of the vehicle 100, for example with respect to a reference point of the vehicle, can be determined.
- the environment sensor system 114, in the form of the ACC, monitors the area in front of the vehicle 100 with a radar sensor.
- the surroundings sensor 114 is designed to calculate the direction, distance and relative speed of preceding vehicles or other objects from the reflected signals. If the surroundings sensor system 114 detects a slower vehicle driving ahead in its own lane, it adjusts the speed so that the vehicle 100 follows at the desired distance. Depending on the driving situation, ACC reduces the engine torque or brakes the vehicle 100. Even when cornering, ACC can recognize which vehicle is crucial for speed control. As soon as there is no vehicle in the measuring range, ACC automatically accelerates the vehicle to the preset speed. Modern methods use an external video camera to corroborate the radar data and increase reliability. In this case, objects 108 are detected in the video images by means of image recognition algorithms and fused with the radar data.
- for a perspectively correct projection, the augmented reality head-up display (AR-HUD) 102 must be calibrated.
- the projection of an augmented reality head-up display after installation in the vehicle 100 should be such that afterwards the insertion of symbols in the driver's field of view takes place at the correct position.
- the calculation of the perspective of the image represented by the HUD 102 from a desired 3-dimensional scene presented to the driver is performed by the so-called renderer.
- the image shown is two-dimensional in a monoscopic HUD 102.
- perspective rendering technologies exist, e.g. OpenGL, which perform the image calculation directly.
- the objects to be displayed are described in a 3D description language and passed to the renderer, which calculates the corresponding 2D image.
- the renderer is configured so that the virtual image appears at the desired location from the perspective of the driver.
- within the renderer, the concept of a virtual camera is used, whereby the conversion from 3D to 2D is performed by the renderer based on the camera parameters.
- These camera parameters are the calibration parameters to be determined in the method.
- FIG. 2 shows an illustration of such a virtual camera 220.
- the virtual camera 220 must be located at the position of the driver's head and directed towards the HUD image.
- the "zoom" must be selected to fully capture the HUD image area, so the parameters of the virtual camera 220 needed to configure the renderer are as follows: horizontal field of view 222 (field of view)
- the position and the rotation of the camera 220 in the vehicle coordinate system are also required, as well as the so-called “viewport”, which indicates how large the image is in pixels on the screen.
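How a renderer configured with these virtual-camera parameters maps a 3D point to viewport pixels can be sketched with a simplified pinhole model; square pixels are assumed and the camera position and rotation are taken as already applied (the point is given in camera coordinates):

```python
import math

def project(point_cam, fov_h_deg, viewport_w, viewport_h):
    """Project a point in camera coordinates (x right, y up, z forward)
    to viewport pixels, as a renderer configured with the virtual-camera
    parameters would. Simplified pinhole sketch, not an OpenGL pipeline."""
    x, y, z = point_cam
    # Focal length in pixels from the horizontal field of view.
    f = (viewport_w / 2) / math.tan(math.radians(fov_h_deg) / 2)
    u = viewport_w / 2 + f * x / z   # pixel column
    v = viewport_h / 2 - f * y / z   # pixel row (y axis flipped)
    return u, v
```

A point straight ahead of the camera lands in the viewport center; lateral offsets shift it proportionally to the focal length.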
- FIG. 3 shows a section of a vehicle with a device for calibrating a projection device 102 according to an embodiment of the present invention.
- the projection device 102 is designed as an AR-HUD.
- the projection device 102 has a HUD opening, from which, as described with reference to FIG. 1, light 112 is projected onto the windshield 104 of the vehicle. From the windshield 104, the light 112 is reflected along the beam path to the head position of the occupant, at which a camera 320 is arranged according to this embodiment.
- the eyebox indicates the spatial area in which the image of the display is visible to the viewer. Thus, the head position and the camera 320 are in the eyebox.
- the projection device 102 is designed to project the light 112 onto the windshield 104 in such a way that a virtual image 330 is produced which, from the perspective of the occupant, hovers behind the windshield 104, i.e. outside the vehicle. Also shown in FIG. 3 are three real objects 108 spaced apart from each other at different positions outside the vehicle. The objects 108 are real markers 108 whose positions are known. In the virtual image 330, three markers are arranged. According to this exemplary embodiment, the virtual image 330 is to be projected by the projection device 102 in such a way that each of the markers of the virtual image 330 overlaps with an associated one of the real markers 108.
- the projection device 102 has a memory 332 in which system parameters of the projection device 102, here AR-HUD parameters, are stored.
- the projection device 102 is designed to use the system parameters to determine drive values for generating the virtual image 330 in such a way that the virtual image 330 comprises virtual markers which are displayed at a predetermined position relative to the real markers 108. If there is a deviation from the desired overlap, the deviation can be detected and eliminated by a calibration process.
- the camera 320 is arranged to capture both the virtual image 330 and the real objects 108.
- An image 336 generated by the camera 320 is transmitted from the camera 320 to an image evaluator 340.
- the image evaluation device 340 is designed to perform an evaluation of the image 336 and to calculate system parameters for the projection device 102 based on the evaluation. For this purpose, for example, the virtual markers of the virtual image 330 and the real markers 108 in the image 336 are recognized and the respective positions are compared with each other.
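The comparison of recognized virtual and real marker positions in image 336 can be sketched as a mean pixel deviation; index-based matching is an assumption here, a real evaluator would match markers robustly:

```python
def mean_marker_deviation(virtual_markers, real_markers):
    """Mean pixel deviation between matched virtual and real markers
    found in the camera image (illustrative sketch; markers are
    assumed to be matched by index)."""
    n = len(virtual_markers)
    dx = sum(r[0] - v[0] for v, r in zip(virtual_markers, real_markers)) / n
    dy = sum(r[1] - v[1] for v, r in zip(virtual_markers, real_markers)) / n
    return dx, dy
```

A nonzero mean deviation indicates that the system parameters must be corrected.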
- the image evaluator 340 may be configured to determine new system parameters based on the deviation so as to correct the deviation.
- known methods for calibrating a projection device can be used.
- the current system parameters of the projection device 102 used to generate the virtual image 330 can be known to the image evaluation device 340 and used to determine the new system parameters.
- the system parameters determined by the image evaluation device 340 for the projection device 102 can be entered into the memory 332 of the projection device 102, for example by flashing. Subsequently, these new system parameters may be used for re-projecting the virtual image 330.
- FIG. 3 shows a construction for an automatic calibration of an augmented reality HUD according to an exemplary embodiment of the present invention.
- the camera 320 and the projection device 102 are arranged inside the vehicle.
- the image evaluation device 340 can be arranged outside the vehicle and connected to the camera 320 via suitable interfaces.
- objects 108 with a known position and virtual objects 330 generated by the HUD 102 are detected simultaneously via the image 336, which is captured from the eyebox by means of the camera 320.
- as seen from the eyebox of the HUD 102, the position of real markers 108 in front of the vehicle can be compared with the position of inserted markers 330. With a sufficiently large number of markers, the deviations can be used to calculate the correct parameters for the AR-HUD, so that the projection is perspectively correct.
- given a sufficiently large number of objects 108, the sought HUD parameters can then be calculated from the respective object positions of the virtual and real objects 330, 108 as well as from the currently set HUD parameters that are still to be adjusted.
- the HUD parameters can be calculated in a single step.
- known algorithms for marker-based calculation of transformation parameters can be used to determine the HUD parameters.
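One standard marker-based approach, given that the patent does not fix a specific algorithm, is a least-squares fit of a 2D transform to the marker correspondences; here an affine model as an illustrative assumption:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2x3 affine transform mapping virtual marker
    positions `src` onto real marker positions `dst`. One standard
    marker-based technique; the patent does not prescribe it."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    A = np.hstack([src, np.ones((n, 1))])       # n x 3 design matrix
    # Solve A @ X ~= dst in the least-squares sense; X is 3 x 2.
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return X.T                                   # 2 x 3 affine matrix
```

For markers that are purely translated, the fit recovers the translation in the last column of the matrix.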
- FIG. 4 shows a section of a vehicle with a device for calibrating a projection device 102 according to a further exemplary embodiment of the present invention.
- the structure shown in FIG. 4 corresponds to the structure shown in FIG. 3, with the differences that instead of a camera placed at the head position an interior camera 420 is used, that a deflection element 422, here a mirror or deflection mirror, is arranged at the head position, and that the image 336 of the interior camera 420 is evaluated for the parameter calculation by a unit 440 which is part of the projection device 102.
- the mirror 422 is arranged in the eyebox in such a way that the beam path 110 is deflected to the interior camera 420.
- the interior camera 420 may be permanently installed in the vehicle and used in normal operation, for example, to monitor the interior of the vehicle.
- the image 336 generated by the interior camera 420 is provided to the projection device 102 via a suitable interface.
- the image 336 may be transmitted from the interior camera 420 to the projection device 102 via a system bus, such as a CAN bus, to which both the interior camera 420 and the projection device 102 are connected.
- the unit 440 is designed in accordance with the image evaluation device 340 described with reference to FIG. 3 in order to determine new system parameters for the projection device 102 and to store them in the memory of the projection device 102.
- the projection device, the mirror 422 and the interior camera 420 are all disposed inside the vehicle.
- an automatic calibration of the HUD 102 can be performed by taking advantage of the interior camera 420 already installed in the vehicle, for example for fatigue monitoring, and an additional mirror 422 which is temporarily placed in the eyebox.
- the calculation of the AR-HUD parameters takes place completely in the vehicle system, i.e. no electronic connection of external devices, such as a camera, is required.
- the mirror can be provided with features, such as markers, which allow a simplified measurement of the markers in the camera image. This allows a simplified determination of the spatial position of the mirror relative to the camera and thus the determination of the position of the mirror in the vehicle with known methods of optical measurement technology. This improves the accuracy of determining the system parameters for projection.
- an alternative would be to place an external camera in the eyebox of the HUD 102 in the vehicle, then evaluate the images of this external camera outside the vehicle to calculate the AR-HUD parameters, and then use these for the configuration of the AR-HUD.
- a correspondingly complex process in which a separate camera is required, images are read out and processed elsewhere, and the results are transmitted to the vehicle electronics, can be completely bypassed by the method shown in FIG. 4.
- a method is provided for automatically calibrating an augmented reality HUD based on an indoor camera, which can be performed inexpensively and quickly.
- the method makes it possible to calibrate the projection device 102 with nothing more than a mirror 422 and, depending on the calibration method, external markers 108 if necessary.
- as external markers 108, objects arranged in front of the vehicle especially for the purpose of calibration can be used.
- Calibration algorithms installed, for example, in the drive electronics within the AR-HUD 102 then receive the image 336 of the interior camera 420, evaluate it, calculate the AR-HUD parameters and reconfigure the virtual image 330 generated by the projection device 102 anew.
- the calibration algorithms used here correspond to those which are used in an automatic calibration by camera in the eyebox, as described with reference to FIG. 3.
- software modules may still be required to assist the user in positioning the mirror 422.
- the interior camera is designed to record a video or image sequence while the mirror is moved in all directions within the eyebox. In this way, the border area of the eyebox can be determined, and also its center. It is sufficient for the evaluation if the mirror 422 has been located in the middle of the eyebox only once within the sequence. This location in the sequence can be easily recognized by software, and the calculation of the parameters for the projection device 102 can then be performed simply on the basis of the corresponding image 336 at that location in the sequence.
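Selecting the one frame of the sequence at which the mirror sat in the middle of the eyebox can be sketched as picking the frame with the best score; the per-frame scoring callback (e.g. how completely the virtual test image is visible) is an assumed placeholder for the software recognition mentioned above:

```python
def center_frame_index(frames, visibility_score):
    """Return the index of the frame at which the mirror was closest to
    the eyebox center, judged by an assumed per-frame visibility score."""
    return max(range(len(frames)), key=lambda i: visibility_score(frames[i]))
```

The image 336 at that index is then used for the parameter calculation.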
- Another way to find the center of the eyebox is to use appropriate diaphragms that are placed over the opening of the HUD 102. A single displayed test marker would then be visible only from the center of the eyebox. This makes it possible for the user of the method to search for the center of the eyebox. For example, the image 336 received from the camera 420 may be displayed on a display, or the system may acknowledge the correct placement with a sound.
- the projection device can be the projection device described with reference to FIGS. 1, 3 or 4.
- a camera can be arranged directly in the eyebox.
- a mirror may be placed in the eyebox to redirect the light received in the eyebox to a camera located at another position.
- the camera or mirror is oriented so that the detected light corresponds to the light that an occupant of the vehicle would receive through his eyes located in the area of the eyebox.
- the light comprises both information about an object arranged outside the vehicle and information about a virtual image which is generated by the projection device with respect to the object arranged outside the vehicle.
- the camera located in the eyebox or the mirror located in the eyebox may be removed.
- a calibration information for the projection device is determined based on the image generated by the camera. For this purpose, an image analysis can be carried out and data from the image with respect to the virtual image and the real object can be compared with predetermined desired data.
- the calibration information may include system parameters of the projection device that can be provided to the projection device.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Instrument Panels (AREA)
- Controls And Circuits For Display Device (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
The invention relates to methods for calibrating a projection system (102) of a vehicle, the projection system serving to project a virtual image (330) associated with a real object (108) into a beam path (110) between the real object and an assumed head position (106) of an occupant of the vehicle. In a detection step, light information incident along the beam path up to the assumed head position is detected, the light information representing, on the one hand, light originating from a real object located outside the vehicle and, on the other hand, light originating from the projection system in order to project the virtual image associated with the real object into the beam path. In a determination step, calibration information for the projection system is determined based on a position of the virtual image determined from the light information and a position of the real object determined from the light information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280022877.XA CN103502876B (zh) | 2011-05-12 | 2012-03-13 | Method and device for calibrating a projection device of a vehicle
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102011075703.1 | 2011-05-12 | ||
DE201110075703 DE102011075703A1 (de) | 2011-05-12 | 2011-05-12 | Method and device for calibrating a projection device of a vehicle
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012152475A1 true WO2012152475A1 (fr) | 2012-11-15 |
Family
ID=45888174
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2012/054353 WO2012152475A1 (fr) | 2012-03-13 | Methods and device for calibrating a projection system of a vehicle
Country Status (3)
Country | Link |
---|---|
CN (1) | CN103502876B (fr) |
DE (1) | DE102011075703A1 (fr) |
WO (1) | WO2012152475A1 (fr) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016204274A1 (de) | 2016-03-15 | 2017-09-21 | Volkswagen Aktiengesellschaft | System and method for detecting an input gesture of a user
US9783112B2 (en) | 2015-10-27 | 2017-10-10 | Cnh Industrial America Llc | Rear windshield implement status heads-up display |
US11487132B2 (en) * | 2018-11-12 | 2022-11-01 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
US20230016649A1 (en) * | 2018-11-12 | 2023-01-19 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105874883B (zh) * | 2013-09-10 | 2019-06-18 | 飞利浦灯具控股公司 | Method and apparatus for automated commissioning of coded light sources
JP6337269B2 (ja) * | 2014-04-09 | 2018-06-06 | パナソニックIpマネジメント株式会社 | Vehicle evaluation device
US9990550B2 (en) * | 2014-09-19 | 2018-06-05 | Bendix Commercial Vehicle Systems Llc | Wide baseline object detection stereo system |
CN105786306A (zh) * | 2014-12-25 | 2016-07-20 | 比亚迪股份有限公司 | Vehicle-mounted head-up display system and method for adjusting the height of its projected image
CN105301777B (zh) * | 2015-12-05 | 2018-06-26 | 中国航空工业集团公司洛阳电光设备研究所 | Head-up display adjustment method and device dedicated to implementing the method
DE102016210088B3 (de) * | 2016-06-08 | 2017-07-06 | Volkswagen Aktiengesellschaft | Method and device for displaying the surroundings of a motor vehicle
TWI609199B (zh) * | 2016-06-30 | 2017-12-21 | 葉天守 | Reflective virtual image display device
CN107966816B (zh) * | 2017-11-22 | 2023-11-03 | 苏州萝卜电子科技有限公司 | Assembly and adjustment method, and split-type head-up display assembled and adjusted thereby
KR102436730B1 (ko) * | 2017-12-06 | 2022-08-26 | 삼성전자주식회사 | Method and apparatus for estimating parameters of a virtual screen
CN108152957A (zh) * | 2017-12-25 | 2018-06-12 | 宁波均胜科技有限公司 | Vehicle-mounted head-up display system and error calibration method based on the system
CN108225734B (zh) * | 2018-01-05 | 2021-07-02 | 宁波均胜科技有限公司 | Error calibration system based on a HUD system and error calibration method thereof
CN110365952B (zh) * | 2018-04-11 | 2022-05-31 | 京东方科技集团股份有限公司 | Viewing-angle test method and test system for a projection display device
CN109559522B (zh) * | 2019-01-21 | 2021-09-28 | 熵基科技股份有限公司 | Debugging method, telescopic column, camera and storage medium
CN111089708A (zh) * | 2019-12-09 | 2020-05-01 | 中国航空工业集团公司洛阳电光设备研究所 | System and method for measuring the display-center error of a head-up display
CN112067013A (zh) * | 2020-09-01 | 2020-12-11 | 卜云 | Vehicle-mounted recognition system based on AR-HUD
CN112344963B (zh) * | 2020-11-05 | 2021-09-10 | 的卢技术有限公司 | Test method and system based on an augmented-reality head-up display device
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004035896A1 (de) | 2004-07-23 | 2006-03-16 | Robert Bosch Gmbh | Device for capturing information about a measurement object in a vehicle
DE102005037797A1 (de) * | 2005-08-03 | 2007-02-08 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | System and method for automatic calibration of a projection, and a use of the system
WO2007083215A2 (fr) * | 2006-01-17 | 2007-07-26 | Ferrari S.P.A. | Method for controlling a head-up display system of a road vehicle
DE102007001266A1 (de) * | 2007-01-08 | 2008-07-10 | Metaio Gmbh | Optical arrangement, in particular for a head-up display
DE102007045301A1 (de) * | 2007-09-21 | 2009-04-02 | Carl Zeiss Ag | Arrangement and method for characterizing reflective imaging projection systems
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8350724B2 (en) * | 2009-04-02 | 2013-01-08 | GM Global Technology Operations LLC | Rear parking assist on full rear-window head-up display |
- 2011-05-12 DE DE201110075703 patent/DE102011075703A1/de not_active Ceased
- 2012-03-13 CN CN201280022877.XA patent/CN103502876B/zh not_active Expired - Fee Related
- 2012-03-13 WO PCT/EP2012/054353 patent/WO2012152475A1/fr active Application Filing
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9783112B2 (en) | 2015-10-27 | 2017-10-10 | Cnh Industrial America Llc | Rear windshield implement status heads-up display |
DE102016204274A1 (de) | 2016-03-15 | 2017-09-21 | Volkswagen Aktiengesellschaft | System and method for detecting an input gesture of a user
US11487132B2 (en) * | 2018-11-12 | 2022-11-01 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
US20230016649A1 (en) * | 2018-11-12 | 2023-01-19 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
US20230016207A1 (en) * | 2018-11-12 | 2023-01-19 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
US11662602B2 (en) * | 2018-11-12 | 2023-05-30 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
US11668948B2 (en) * | 2018-11-12 | 2023-06-06 | Yutou Technology (Hangzhou) Co., Ltd. | Active alignment for assembling optical devices |
Also Published As
Publication number | Publication date |
---|---|
CN103502876A (zh) | 2014-01-08 |
CN103502876B (zh) | 2016-11-09 |
DE102011075703A1 (de) | 2012-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012152475A1 (fr) | Methods and device for calibrating a projection system of a vehicle | |
EP2805183B1 (fr) | Method and device for visualizing the surroundings of a vehicle | |
DE102017107396A1 (de) | Test method and test apparatus for driver assistance systems | |
DE112017003916T5 (de) | Head-up display device, display control method, and control program | |
DE102018215185B3 (de) | Method, device and computer-readable storage medium with instructions for adjusting a head-up display in a motor vehicle, and adjustment device for use in such a method or with such a device | |
DE102007001266A1 (de) | Optical arrangement, in particular for a head-up display | |
DE102010038825A1 (de) | Image display control device | |
DE102020124756B4 (de) | Method and device for calibrating a virtual image | |
DE102009019399B4 (de) | Method for automatically determining at least one target variable describing the change in the position of a motor vehicle | |
DE102015008887A1 (de) | Method and device for calibrating a head-up display in a vehicle | |
DE102011075702A1 (de) | Method and device for aligning a projection of a projection device of a vehicle | |
DE102014226185A1 (de) | Method for determining a person's viewing direction | |
DE102010003850A1 (de) | Method and device for tracking the position of an object marking | |
EP2888623B1 (fr) | Operating a head-up display of a vehicle and image determining system for the head-up display | |
DE102013219556A1 (de) | Method and device for controlling an image-generating unit of a head-up display | |
DE102013206435A1 (de) | Visualization system and device for generating and displaying a virtual image of a vehicle environment | |
DE102016211227A1 (de) | Method and vehicle control system for generating images of an environment model, and corresponding vehicle | |
DE112016000689T5 (de) | Camera parameter adjustment device | |
DE102017215163A1 (de) | System consisting of a motor vehicle and augmented-reality glasses, and method for determining the pose of augmented-reality glasses in the interior of a vehicle | |
DE102016208398B4 (de) | Method for calibrating a vehicle camera | |
DE102006044615A1 (de) | Method for calibrating image capture devices in vehicles | |
DE102006037600B4 (de) | Method for resolution-dependent representation of the surroundings of a motor vehicle | |
DE102013016249A1 (de) | Method and device for displaying navigation instructions | |
DE102012213132B4 (de) | Method and device for fusing camera images of at least two vehicles | |
DE102006051539A1 (de) | Method and device for aerial-image-supported environment detection in motor vehicles | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12710673 Country of ref document: EP Kind code of ref document: A1 |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12710673 Country of ref document: EP Kind code of ref document: A1 |