WO2024032971A1 - Device and method for measuring three-dimensional virtual images and objects of a head-up display - Google Patents

Device and method for measuring three-dimensional virtual images and objects of a head-up display

Info

Publication number
WO2024032971A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
camera
test object
field
pattern
Prior art date
Application number
PCT/EP2023/068086
Other languages
German (de)
English (en)
Inventor
Christoph Boesel
Martin BAUMGARTL
Original Assignee
Bayerische Motoren Werke Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke Aktiengesellschaft
Publication of WO2024032971A1


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0127Head-up displays characterised by optical features comprising devices increasing the depth of field
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera

Definitions

  • The invention relates to a method for measuring virtual 3D images of a field-of-view display device, which serves to display 3D images in a user's field of vision via reflection on a partially transparent reflection pane arranged in that field of vision.
  • The field-of-view display device can be designed in particular for use in a motor vehicle or another land, air or water vehicle, where, for example, a front, rear or side window of the vehicle, or a combiner pane specifically provided for this purpose, located in the field of vision of a user (an occupant of the vehicle), serves as the reflection pane.
  • The invention is also directed to a corresponding control unit, an associated field-of-view display device and a vehicle equipped therewith.
  • Head-up displays (HUD) are an increasingly widespread example of such field-of-view display devices.
  • This allows, for example, speed information and other useful navigation and vehicle operating instructions, or even entertainment content, to be superimposed in the form of a virtual image on the real image of the surroundings in front of or in the vehicle observed by the driver or another occupant.
  • A HUD in a classic design has a projection unit housed in the instrument panel.
  • This includes an imaging unit, for example a display, for generating a light beam with the desired display content and, if necessary, suitable projection optics in order to further shape the light beam and direct it onto the aforementioned partially transparent reflection pane.
  • By reflection on the reflection pane, the bundle of light rays reaches a region of the vehicle interior intended for the user's eyes (also called an eyebox), from which the user can perceive the display content as a virtual image floating behind the reflection pane.
  • The evaluation of the virtual image and the improvement of the HUD performance require an objective capture of the displayed virtual image content using measuring devices.
  • Conventionally, photos of the virtual image are taken with a mono camera at fixed positions relative to the windshield, and some important performance parameters, such as a double-image distance or any distortions, are evaluated. However, any depth information of the virtual image is lost in the process.
  • As a result, HUD design parameters that are particularly important for 3D images, such as the virtual image distance or image curvature, as well as performance parameters such as the local point disparity, cannot be captured in this way.
  • The relative position of the virtual image to the environment, which is of utmost importance for augmentation (augmented reality) through contact-analog virtual representations, i.e. virtual representations oriented towards real surrounding objects, cannot be objectively recorded and corrected either.
  • Head-up display development is increasingly focusing on the generation of three-dimensional virtual image structures and image augmentation through contact-analog virtual representations. Therefore, an objective evaluation of such image content using suitable measuring devices is becoming increasingly important in order to be able to ensure targeted improvement and reliable calibration of corresponding HUD systems.
  • Stereoscopic measurement using a specially designed stereo camera is an established method for measuring and recording the spatial structure of physical three-dimensional objects, as described, for example, in US 1,871,281 or US 6,430,373.
  • For this purpose, a photo of the object to be examined is created using at least two calibrated cameras whose fixed positioning in space relative to one another is known.
  • By identifying identical features in the left and right images, a pixel disparity of these features can be determined, from which their spatial position can be computed by triangulation.
  • The camera calibration of the stereo camera is usually carried out by photographing various targets, i.e. real objects whose spatial positioning and extent are known, with the cameras that are fixedly positioned relative to one another. From this, extrinsic camera parameters, such as the rotation and translation of the coordinate systems of both cameras relative to one another, and, if not already known, also intrinsic camera parameters, such as the focal length and optical center, can be determined using known mathematical methods.
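  • Purely as an illustration of such a target-based calibration, the following sketch uses OpenCV in Python; the library choice, the checkerboard target geometry and the input variable image_pairs are assumptions of this example, not part of the publication.

```python
# Hypothetical sketch of a target-based stereo camera calibration (assumed
# OpenCV API usage; the checkerboard target and image_pairs are illustrative).
import cv2
import numpy as np

PATTERN = (9, 6)   # inner corners of the assumed checkerboard target
SQUARE = 0.025     # assumed square size of the target in metres

# Ideal 3D corner coordinates of the target in its own coordinate system.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts = [], [], []
for img_l, img_r in image_pairs:  # grayscale target photos from both cameras
    ok_l, c_l = cv2.findChessboardCorners(img_l, PATTERN)
    ok_r, c_r = cv2.findChessboardCorners(img_r, PATTERN)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(c_l)
        right_pts.append(c_r)

size = image_pairs[0][0].shape[::-1]  # (width, height) of the camera images

# Intrinsic parameters (focal length, optical center, distortion) per camera.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)

# Extrinsic parameters: rotation R and translation T between the two cameras.
_, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```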
  • A well-known difficulty when measuring a real 3D object with a stereo camera is quickly and precisely determining corresponding features in both images. In general, this can be done, for example, by using a suitable similarity measure and a brute-force search algorithm that compares the left and right camera images.
  • A frequently used step that is intended to further simplify the search is to rectify both stereo images.
  • Here, the images are transformed using the known extrinsic parameters (such as a rotation matrix and a translation vector of the two camera coordinate systems of the stereo camera relative to one another) as if both cameras were perfectly aligned parallel to one another. This means that corresponding features in the right and left camera images have the same vertical pixel coordinate, and corresponding features therefore lie on a horizontal line.
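  • Continuing the calibration sketch above, rectification could look as follows (again a hedged OpenCV example, with K1, d1, K2, d2, R, T and size taken from the previous sketch).

```python
# Hypothetical rectification sketch: transforms both images to a virtual
# parallel camera alignment so corresponding features share an image row.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)

map1x, map1y = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)

rect_l = cv2.remap(img_l, map1x, map1y, cv2.INTER_LINEAR)
rect_r = cv2.remap(img_r, map2x, map2y, cv2.INTER_LINEAR)
# After remapping, the feature search reduces to a single image row: a feature
# at row v in rect_l has its counterpart at the same row v in rect_r (for a
# real object; for a HUD a vertical offset may remain, as discussed below).
```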
  • A further known approach is to project a visual pattern onto the object to be measured (see, for example, DE 102006049695 A1). The projected pattern is used to quickly and robustly locate corresponding features in the left and right camera images, for example on large monochrome areas of the 3D object, using a suitable similarity measure.
  • In order to be able to measure transparent or highly reflective 3D objects that have, for example, smooth or painted surfaces, DE 102015211954 A1 also proposes, as a variation of this projector idea, to replace the optical cameras of the stereo camera with thermal imaging cameras and to equip the projection unit with an infrared heater. In particular, it is proposed to imprint an irregular, for example arbitrary (quasi-)statistical thermal pattern (e.g. a speckle pattern) on the object surface to be examined. However, these approaches are not applicable to virtual 3D objects.
  • Furthermore, the generation of a virtual ideal image in accordance with the performance specifications of the HUD is not easily possible, for example due to manufacturing tolerances of the windshield, and requires a precise assessment of the influence of any construction deviations on the virtual image. It is therefore an object of the present invention to provide a technical concept (method and device) for measuring three-dimensional virtual images and objects of a head-up display or other field-of-view display device which makes it possible to overcome the problems and difficulties described. In particular, this is intended to enable a quick, precise and robust evaluation of the performance of a field-of-view display device designed for 3D representation, and thus also its targeted improvement.
  • The field-of-view display device should be particularly suitable for use in a vehicle.
  • This object is achieved by a method for measuring virtual 3D images of a field-of-view display device and by a corresponding control unit, field-of-view display device and a vehicle equipped therewith according to the independent claims. Further refinements are specified in the dependent claims. All further features and effects mentioned in the claims and the following description for the method also apply with regard to the control unit, the field-of-view display device and the vehicle, and vice versa.
  • According to a first aspect, a method is provided for measuring virtual 3D images of a field-of-view display device, for example a head-up display (HUD), which can be designed in particular for use in a vehicle.
  • The field-of-view display device is designed to display three-dimensional images and objects in the field of vision of a user, such as a driver or another occupant of the vehicle, via reflection on a partially transparent reflection pane arranged in the user's field of vision, for example a windshield or another vehicle window or a combiner pane designed specifically for this purpose.
  • The vehicle can be a motor vehicle, but also any other land, air or water vehicle.
  • The representation of a respective 3D object by the field-of-view display device can in particular also be contact-analog, i.e. oriented towards real surrounding objects outside the vehicle.
  • The method includes the following steps: First, an image-generating unit of the field-of-view display device or its control unit is provided with image generation data of a virtual 3D test object.
  • Here, a virtual object surface of the 3D test object to be displayed in three dimensions is overlaid on the software side with a predetermined visual surface pattern, so that it exhibits variations corresponding to the surface pattern, for example in brightness and/or color.
  • Both the 3D test object and the surface pattern can vary in position, shape and extent over time in a predetermined manner.
  • The surface pattern makes the recognition of the individual surface points or areas of the virtual 3D test object significantly more precise, simpler and/or faster during its later stereoscopic capture and evaluation.
  • In particular, the entire object surface to be displayed of the virtual 3D test object can be so completely covered with the surface pattern when its image generation data is provided that a robust, rapid and at the same time complete recognition of individual surface points is made possible.
  • The virtual 3D test object used for the measurement can have any three-dimensional shape and extent and can consist of any number of unconnected 3D partial objects, each of which can be arranged anywhere in space.
  • In particular, the non-contiguous virtual 3D partial objects themselves can be arranged in such a way that they represent or result in a predetermined, for example regular, irregular, statistical or quasi-statistical, three-dimensional pattern (hereinafter referred to as 3D pattern) in space.
  • A 3D pattern generated in this way can be used in the present method as an alternative or in addition to the above-mentioned overlay of the three-dimensional virtual object surface with the predetermined surface pattern in order to achieve the effects described below when measuring the 3D test object.
  • The overlay of the object surface with a surface pattern can, for example, be more favorable than generating a virtual 3D test object in the form of a predetermined (virtual) 3D pattern if the virtual 3D test object is to have a predetermined three-dimensional object or surface shape that, for example, occurs particularly often during normal operation of the field-of-view display device and is therefore preferably also used in the evaluation of its performance described herein.
  • The virtual 3D test object is now generated by the field-of-view display device according to the image generation data provided and captured from at least two different perspectives by an optical stereo camera system.
  • For this purpose, the stereo camera system has at least one movable camera and/or at least two cameras at a definable distance from one another in the beam path of the field-of-view display device downstream of the reflection pane.
  • The respective camera can be arranged, for example, at a position in space intended for the corresponding eye of the user. However, this is not mandatory.
  • In the camera images captured from the two different perspectives, corresponding image points are identified based on the surface patterns and/or 3D patterns contained therein, i.e. image points that each originate from one and the same object surface point.
  • From this, at least one actual display parameter of the generated virtual 3D test object is determined, which can be used to evaluate the 3D display performance of the field-of-view display device.
  • The at least one actual display parameter can include, for example, a projection distance (also called projection depth or image distance) and/or a spatial position and/or a spatial orientation of the generated virtual 3D test object or its surface points.
  • Likewise, the at least one actual display parameter of the generated virtual 3D test object can include a local vertical point disparity, which indicates a vertical offset of the pairwise corresponding object surface points in the camera images from the two different perspectives.
  • The respective actual display parameter can be determined, for example, with respect to a coordinate system of the stereo camera system, the field-of-view display device or the vehicle in which it is mounted, or with respect to an eyebox intended for its user.
  • An eyebox is understood here to be a two- or three-dimensional spatial area intended for the user's eyes or for the respective user's eye, from which the generated virtual 3D images are visible to the user in the intended quality.
  • The respective actual display parameter can be compared with a predetermined, associated target display parameter of the virtual 3D test object.
  • On this basis, the 3D display performance of the field-of-view display device can be evaluated and, for example, specifically improved if a predetermined deviation tolerance is exceeded.
  • This metrological capture can in particular also be designed to be eye-position-dependent, i.e. correspond to selectable/adjustable or variable eye positions and/or eye distances of a user.
  • The field-of-view display device, in particular in the form of a head-up display, can be designed for 3D image generation in basically any technically feasible manner. This can therefore not only involve the mere creation of a depth effect through the use of several two-dimensional virtual image planes and image surfaces that are inclined or curved in space, but in particular also “real” three-dimensional virtual 3D objects, which can be generated, for example, using holographic techniques by a specially designed image-generating unit of the field-of-view display device.
  • With a 3D HUD in a motor vehicle, for example, objects such as an arrow can be displayed at a great distance and at the same time, for example, a sign close to the driver.
  • For the measurement, these objects are displayed overlaid with a checkerboard pattern, for example, so that they can be measured with a pair of stereo cameras (depending on the position of the pair of stereo cameras).
  • The deviations from the target can then be determined.
  • A central special feature of virtually represented 3D objects is that their position and shape (both when using two-dimensional image planes and three-dimensional curved image surfaces as well as “real” three-dimensional virtual objects) can change with the eye or camera position.
  • In addition, the individual eye distance also plays a role for the user, so that users with different eye distances can perceive a virtual 3D object differently.
  • However, the correct position and orientation of a virtual object in space can be of crucial importance, especially in the case of contact-analog representation (i.e. oriented towards real surrounding objects).
  • The surface pattern can, for example, have at least one of the following pattern types or pattern properties, which can also be combined with one another and/or used alternately, i.e. next to one another, in the object surface: a regular or periodic two-dimensional pattern (the periodicity can be one- or two-dimensional, i.e. the two-dimensional pattern can change periodically in the two-dimensional surface, for example only in one direction while remaining constant in a direction orthogonal thereto, or it can change in two independent directions with a respective periodicity); an irregular or aperiodic two-dimensional pattern (e.g. statistical or quasi-statistical); a checkerboard pattern; a flat distribution of circles with one or more predetermined diameters; a speckle pattern; an area-wide gray value pattern, the gray values of which preferably vary in a predetermined manner at each area point; a black-and-white pattern; a color pattern that has one or more different colors.
  • Compared with simple patterns, area-covering and/or irregular surface patterns, such as aperiodic sine patterns or speckle-like gray value distributions, may offer more options for quick and/or time-resolved feature mapping between the left and right camera images of the generated virtual 3D object.
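  • Purely as an illustration (an assumption of this description, not a prescribed implementation), two of the pattern types listed above, a checkerboard and a speckle-like gray value distribution, could be generated as textures roughly as follows.

```python
# Illustrative sketch: generating a checkerboard texture and a quasi-statistical
# speckle-like gray value texture that the image-generating software could map
# onto the virtual object surface (sizes and parameters are arbitrary choices).
import numpy as np
from scipy.ndimage import gaussian_filter

H, W = 512, 512

# Checkerboard pattern: alternating black/white squares of 32 x 32 pixels.
yy, xx = np.mgrid[0:H, 0:W]
checker = (((yy // 32) + (xx // 32)) % 2).astype(np.float32)

# Speckle-like pattern: low-pass filtered random noise, so that the gray value
# varies in a locally distinctive way at every point of the surface.
rng = np.random.default_rng(seed=0)
speckle = gaussian_filter(rng.random((H, W)).astype(np.float32), sigma=2.0)
speckle = (speckle - speckle.min()) / (speckle.max() - speckle.min() + 1e-12)
```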
  • In some embodiments, the respective camera of the stereo camera system, when capturing the generated virtual 3D test object, is arranged at a spatial position intended for the user's respective eye.
  • In particular, an entire eyebox volume of the field-of-view display device intended for the respective eye of the user can be scanned by the stereo camera system, in that the respective camera of the stereo camera system successively captures the generated virtual 3D test object from several different positions within the eyebox volume, in order to evaluate the 3D display performance of the field-of-view display device as fully as possible for the entire eyebox.
  • Furthermore, the definable relative distance of the at least two cameras can be varied to adapt to different eye distances of the user. In particular, it can also be varied when capturing the generated virtual 3D test object in order to evaluate the 3D display performance of the field-of-view display device for different user eye distances.
  • In one embodiment, the method presented here further comprises a calibration of at least one of the cameras of the stereo camera system, which can be carried out in particular before the actual measurement of the 3D test object.
  • For this purpose, this camera can, for example, be positioned at one or more discrete calibration positions, and a calibration can be carried out by determining a spatial position and/or orientation and/or, if necessary, at least one internal imaging property of this camera with respect to a coordinate system of the stereo camera system or the field-of-view display device.
  • From this, a parametric recalibration function is then determined which, depending on at least one (in particular continuously) measurable camera position parameter, indicates the spatial position and/or orientation of this camera at any number of camera positions lying in between and/or beyond.
  • Spatial positions and/or orientations of this camera obtained using this parametric recalibration function can then be used as its calibration in the actual measurement of the virtual 3D test object, i.e. to determine its at least one actual display parameter, such as its projection distance by means of triangulation.
  • In this way, a time-consuming and cost-intensive recalibration of each individual camera of the stereo camera system upon every camera movement can be avoided.
  • In addition, such a parameterization of the camera calibration opens up the possibility of quickly and precisely taking into account different eye positions and/or eye distances of the user when measuring the virtual 3D test object.
  • For example, said camera can be displaceable along at least one translational rail with linear position feedback, so that said measurable camera position parameter is a linear camera position or a linear camera distance to the second camera of the stereo camera system on this rail according to the currently received position feedback.
  • The position feedback can, for example, be measured or obtained directly during the capture of the generated virtual 3D test object or immediately before or after, so that in particular a recalibration in real time is made possible by such a parametric recalibration function.
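  • A minimal sketch of such a parametric recalibration function, assuming the simple linear rail model mentioned above (all numbers are placeholders, not measured values), could look like this.

```python
# Hypothetical sketch: the camera is fully calibrated at a few discrete rail
# positions; a linear model per translation component is then fitted so that
# the extrinsics can be predicted for any position reported by the feedback.
import numpy as np

xi = np.array([0.00, 0.02, 0.04, 0.06])        # calibrated rail positions [m]
t = np.array([[0.000, 0.0, 0.0],               # translation vectors from the
              [0.020, 0.0, 0.0],               # discrete calibrations
              [0.041, 0.0, 0.0],               # (placeholder values)
              [0.060, 0.0, 0.0]])

# One linear polynomial per component: t_i(xi) = a_i * xi + b_i.
coeffs = [np.polyfit(xi, t[:, i], deg=1) for i in range(3)]

def recalib_t(xi_now: float) -> np.ndarray:
    """Parametric recalibration function: translation at any rail position."""
    return np.array([np.polyval(c, xi_now) for c in coeffs])

# Recalibration "in real time": evaluate the model at the currently reported
# position feedback instead of repeating a full calibration.
t_now = recalib_t(0.033)
```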
  • According to a further aspect, a control unit is provided which is designed and set up to automatically carry out the method presented here. For this purpose, for example, a corresponding computer program can be installed in the control unit and executed during operation of the field-of-view display device.
  • According to a further aspect, a field-of-view display device is provided, which can be designed in particular for use in a vehicle.
  • The field-of-view display device comprises a projection unit which is designed to generate and output a light beam, with a 3D image content transported therein, onto a partially transparent reflection pane to be arranged or arranged in the field of vision of a user.
  • The design and mutual arrangement of the projection unit and the reflection pane are chosen such that the user is presented with a desired virtual 3D image behind the reflection pane, provided his eyes are in a designated spatial area (eyebox) opposite the reflection pane.
  • The projection unit can comprise an image-generating unit, for example in the form of a suitable display or projector, and possibly further optical elements in the beam path of the light beam emanating from the display/projector for further beam shaping and deflection.
  • The field-of-view display device further comprises an optical stereo camera system which comprises at least one movable camera and/or at least two cameras at a definable distance from one another, which can be positioned in the beam path of the light beam after its reflection on the reflection pane and are designed to capture the displayed virtual 3D image from at least two different perspectives.
  • Furthermore, the field-of-view display device includes the above control unit, which is designed and set up to control the projection unit and the stereo camera system when automatically carrying out the method presented herein.
  • The reflection pane mentioned can also be manufactured and/or sold as part of the field-of-view display device or, alternatively, separately.
  • The reflection pane can in particular be formed as a section of a windshield of a vehicle or another vehicle window.
  • Alternatively, it can be designed as a combiner pane provided specifically for the purpose mentioned here.
  • According to a further aspect, a vehicle, in particular a motor vehicle or any other land, air or water vehicle, is provided.
  • The spatial orientation terms used here, such as “above”, “below”, “in front”, “side”, “horizontal”, “vertical”, etc., can in particular refer to the usual vehicle-fixed Cartesian coordinate system with mutually perpendicular longitudinal, transverse and height axes of the vehicle.
  • The vehicle includes a windshield and an instrument panel located underneath and is equipped with the above field-of-view display device.
  • Its image-generating unit or, if appropriate, its entire projection unit can be arranged in particular inside the instrument panel or in/on its top side, for example installed directly on or below the top side of the instrument panel, such that the light beam is thrown by the projection unit onto the windshield, or onto a combiner pane positioned inside the vehicle in front of it in the field of vision of the driver or another occupant, which serves as the above-mentioned partially transparent reflection pane.
  • However, the field-of-view display device can also be installed at any other suitable location in the vehicle.
  • Figure 1 shows a schematic top view of a field-of-view display device in a motor vehicle, which is designed to display 3D images in an occupant's field of vision via reflection on a vehicle window arranged in his field of vision and to carry out a method according to an exemplary embodiment of the invention.
  • Figure 2 shows a flowchart of an exemplary embodiment of a method presented herein for measuring virtual 3D images of a field-of-view display device.
  • Figure 3 shows an exemplary step of providing image generation data of a virtual 3D test object in the method of Figure 2.
  • FIG. 1 shows, in a highly simplified schematic top view, an exemplary embodiment of a vehicle 1 with a field of view display device 2 according to the aspects of the invention specified above and in the claims.
  • In this example, the vehicle is a motor vehicle, which is indicated only by its windshield 3.
  • A projection unit 5 of the field-of-view display device 2 is arranged opposite the windshield 3, for example below it in an instrument panel 4 of the vehicle 1 (not shown).
  • In this example, the field-of-view display device 2 is designed as a 3D head-up display (3D HUD).
  • The projection unit 5 is designed to generate a light beam L with a desired 3D image content.
  • The light beam L emanating from the projection unit 5 is thrown onto the windshield 3, which in this example serves as the partially transparent reflection pane of the field-of-view display device 2, so that after reflection on the windshield 3 it reaches an eyebox E of a user 6, who in this example is a driver of the vehicle 1.
  • The eyebox E is a two- or three-dimensionally defined spatial area in the vehicle 1 at a predetermined position relative to the windshield 3, which is intended for the eyes of the user 6 so that he is able to see a virtual 3D image O generated by the field-of-view display device 2 with both eyes.
  • As illustrated in FIG. 1, the user 6 sees with his left eye a virtual image OL and with his right eye a virtual image OR, the relative position and shape of which can vary with the eye distance and the eye position of the user 6 relative to the light beam L, so that the resulting perception and spatial position of the generated 3D object can vary depending on the eye distance and the eye position of the user 6.
  • To measure such virtual 3D images, an optical stereo camera system 7 is provided in the vehicle 1 or in the field-of-view display device 2, which can include at least one movable camera and/or at least two cameras at a definable distance from one another (not shown individually), which can be positioned in the beam path of the light beam L after its reflection on the windshield 3 in order to capture the displayed virtual 3D image O from two different perspectives for evaluating the 3D HUD performance (naturally without the user 6 being present).
  • For this purpose, the stereo camera system 7 can be positioned directly in the eyebox E of the user 6 or, as indicated in FIG. 1, immediately in front of it.
  • The stereo camera system 7 and its position are indicated purely by way of example and schematically in FIG. 1.
  • A correspondingly configured control unit 8 is also provided, which can communicate in an appropriate manner with the projection unit 5 and the stereo camera system 7 in terms of information and control technology.
  • The control unit 8 can be arranged, for example, in the projection unit 5 or outside it in the vehicle 1, for example in the instrument panel 4.
  • FIG. 2 shows a flowchart of an exemplary embodiment of a method according to the above first aspect of the invention for measuring virtual 3D images O of a field-of-view display device 2, as shown, for example, in FIG. 1.
  • This illustration should not be construed as limiting, but rather serves solely as an exemplary illustration of the possible steps.
  • In a step S1, the projection unit 5 of the field-of-view display device 2 or its control unit 8 is provided with image generation data of a virtual 3D test object O.
  • Here, a virtual object surface 9 of the 3D test object to be displayed in three dimensions is overlaid on the software side with a predetermined visual surface pattern, so that it exhibits variations corresponding to the surface pattern, for example in brightness and/or color.
  • Fig. 3 illustrates this step using the example of a checkerboard surface pattern.
  • In this example, the entire object surface 9 of the virtual 3D test object O to be displayed has been overlaid with the surface pattern when its image generation data is provided.
  • The virtual 3D object to be generated is shown there both before the software-side overlay with the surface pattern and in its representation form used for the measurement/calibration.
  • In the latter, its cuboid object surface 9 is overlaid on the software side with a known structure, such as a checkerboard pattern, the corners 10 of which are particularly easy to localize by gradient formation after capture by the stereo camera system 7.
  • However, more general (quasi-)statistical patterns can also be used as surface patterns.
  • The overlay with a suitable surface pattern in step S1 serves purely for measuring and, if necessary, calibrating the field-of-view display device 2 (see below).
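  • The gradient-based corner localization mentioned above could be sketched as follows (a hedged OpenCV example; the file name and corner grid are assumptions of this illustration, not part of the publication).

```python
# Hypothetical sketch of locating the checkerboard corners 10 in one camera
# photo of the generated test object, with gradient-based sub-pixel refinement.
import cv2

gray = cv2.imread("left_view.png", cv2.IMREAD_GRAYSCALE)  # assumed file name

found, corners = cv2.findChessboardCorners(gray, (7, 7))  # assumed corner grid
if found:
    # cornerSubPix refines each corner position using local intensity
    # gradients, which is what makes the corners easy to localize precisely.
    corners = cv2.cornerSubPix(
        gray, corners, winSize=(11, 11), zeroZone=(-1, -1),
        criteria=(cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
```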
  • In a step S2, the virtual 3D test object O is generated by the field-of-view display device 2 according to the provided image generation data and captured from at least two different perspectives by an optical stereo camera system 7.
  • For this purpose, the stereo camera system 7 has at least one movable camera and/or at least two cameras at a definable distance from one another in the beam path of the field-of-view display device 2 downstream of the reflection pane.
  • From the captured camera images, at least one actual display parameter of the generated virtual 3D test object O is determined, which can be used to evaluate the 3D display performance of the field-of-view display device 2.
  • This method and this device enable, in particular, a fast, robust and precise measurement of virtual 3D test objects O generated, for example, by a HUD, which among other things also enables a quick calibration of the HUD.
  • Important measured variables or output parameters in the 3D measurement of the generated virtual test objects O for the HUD evaluation with a stereo camera system 7 (also called a 3D camera) are, for example, the projection distance and the vertical local point disparity.
  • The projection distance can, for example, deviate from the specification due to construction tolerances of the field-of-view display device 2.
  • The vertical local point disparity corresponds to a vertical offset of the features between the left and right eyes and is caused by the driver's left and right eyes seeing a different virtual image in the case of HUD projection. The latter does not occur when measuring real objects, i.e. there the vertical local point disparity is naturally equal to zero.
  • For the evaluation, the camera images recorded in step S2 from the two different perspectives can first be rectified in a manner known per se, that is, converted to a parallel alignment of both cameras.
  • For corresponding features, the respective pixel coordinates (uL, vL) and (uR, vR) in the (rectified) left and right camera images can then be determined.
  • For a real object, vL = vR would apply in the rectified camera images.
  • In the case of a HUD, however, the right and left eyes perceive different virtual images OL and OR (see Fig. 1), and in the rectified camera images in general vL ≠ vR applies; the difference corresponds to the vertical local point disparity.
  • The projection distance results, for example, from triangulation or, via the intercept theorem, from the horizontal pixel deviation uL - uR of the pixel coordinates of the corresponding features in the (rectified) left and right camera images together with the camera distance (baseline).
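  • In rectified coordinates, this relationship reduces to a short computation; the following sketch (with assumed symbols and placeholder numbers, not taken from the publication) illustrates both output parameters.

```python
# Illustrative sketch: projection distance from the horizontal pixel disparity
# via the intercept theorem, and vertical local point disparity, for one pair
# of corresponding rectified pixel coordinates (uL, vL) and (uR, vR).
def projection_distance(uL: float, uR: float, f: float, B: float) -> float:
    d = uL - uR            # horizontal pixel disparity
    return f * B / d       # f: focal length [px], B: camera distance [m]

def vertical_point_disparity(vL: float, vR: float) -> float:
    return vL - vR         # zero for real objects, generally nonzero for a HUD

# Placeholder numbers: f = 1200 px, B = 0.065 m (an eye-distance-like camera
# spacing) and a horizontal disparity of 7.8 px yield a distance of 10 m.
Z = projection_distance(uL=640.0, uR=632.2, f=1200.0, B=0.065)
```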
  • The present method proposes to overlay the 3D objects typically displayed in the HUD for the driver (such as an arrow on the road, a marking of signs or pedestrians, a map for navigation, etc.), before or during their generation by the projection unit 5 (such as an augmented-reality HUD display), solely for the purpose of measurement and/or calibration, on the software side with regular (such as the checkerboard pattern of Fig. 3), irregular, statistical or quasi-statistical surface patterns.
  • With a checkerboard pattern, for example, the vertical local point disparity can be determined with high precision at the corner points 10 of the checkerboard pattern.
  • A calibration of the HUD can also take place, for example, by measuring the virtual, patterned 3D test object in reference to a known real object in the driver's field of vision.
  • Another problem with virtual objects on a HUD is the dependence of the position of the virtual object on the eye distance and the eye position of the user 6. This plays a role both in the stereo camera calibration and in the measurement of the 3D test object O with the stereo camera system 7.
  • The method described here enables a solution to this problem, which is described below using the example of a stereo camera system with cameras that can be positioned flexibly relative to one another:
  • The integration of the eye distance into the measurement proposed here requires, in a first step, the determination of a rotation matrix and a translation vector as in a conventional stereo camera calibration.
  • In addition, the distortion of the two cameras can be determined (if they have optical distortion) and corrected algorithmically.
  • Likewise, the intrinsic camera parameters are determined, if these are not known, as well as the extrinsic parameters (rotation matrix and translation vector of the two camera coordinate systems relative to each other or to the vehicle).
  • The two cameras can, for example, be attached to a translational rail or another movement device with position feedback.
  • The stereo camera calibration carried out at discrete positions according to the embodiment of the invention described above is then extended by mathematical means (such as mathematical modeling or polynomial interpolation or extrapolation, etc.) to positions lying in between and/or beyond.
  • In this way, a parametric recalibration function can be specified, based on an actual calibration at some known position points of the cameras, which depends on one or more (for example continuously) measurable camera position parameters.
  • For example, the parametric recalibration function can specify a precise relative 3D camera position depending on a measurable distance of the camera on a rail (which can be described by a simple linear model).
  • In this way, the eyebox volume of the HUD can be scanned, for example, using the stereo camera system for variable distances of the left and right cameras.
  • The eye position can be parameterized by the respective center of the stereo camera system 7.
  • At each such measuring position, a 3D point model is created, for example, by taking a stereo photograph of the virtual 3D test object (provided with the surface pattern).
  • From these, 3D point cloud models can be interpolated or extrapolated for unknown eye distances and eye positions.
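  • One conceivable form of such an interpolation (an assumption of this illustration: the clouds contain the same matched surface points in the same order) is a simple linear blend between two measured grid positions.

```python
# Hypothetical sketch: linear interpolation of measured 3D point cloud models
# between two eyebox grid positions (here parameterized by camera distance).
import numpy as np

def interpolate_cloud(cloud_a, cloud_b, pos_a, pos_b, pos_query):
    """Linearly blend two (N, 3) point clouds measured at pos_a and pos_b."""
    w = (pos_query - pos_a) / (pos_b - pos_a)
    return (1.0 - w) * cloud_a + w * cloud_b

# Placeholder data: clouds measured at 55 mm and 70 mm camera distance; the
# cloud for an intermediate eye distance of 63 mm is then estimated.
cloud_55 = np.random.default_rng(1).normal(size=(100, 3))
cloud_70 = cloud_55 + np.array([0.002, 0.0, 0.05])
cloud_63 = interpolate_cloud(cloud_55, cloud_70, 55.0, 70.0, 63.0)
```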
  • The necessary number of measuring points of the grid in the eyebox volume, or the complexity of the fit function, generally depends on the complexity of the HUD optics or the reflection pane (such as the windshield 3).
  • The eyebox volume can be “sampled”, for example, by a single eye position with a predetermined constant eye distance.
  • The measuring device can therefore include, for example, one or more variably positionable cameras (stereo camera system 7), whose relative positions can be either fixed (only in the case of at least two cameras) or variable.
  • The calibration of the measuring device is ideally carried out using the parametric description presented above, with the help of camera-position-dependent rotation matrices R(ξ) and translation vectors t(ξ), or projection matrices P(ξ) composed thereof, where ξ is the measurable camera position parameter mentioned above.
  • The pixel disparities of the identical features can then be interpreted as a function of the camera positions determined in this way.
  • This allows the 3D structure of the surface 9 of the virtual 3D test object O to be calculated as a function of position using a parametric representation of this functional relationship.
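  • A position-dependent triangulation along these lines could be sketched as follows (a hedged OpenCV example; the intrinsics and recalibrated extrinsics are placeholders standing in for the calibration results described above).

```python
# Hypothetical sketch: compose position-dependent projection matrices
# P(xi) = K [R(xi) | t(xi)] and triangulate matched features into 3D points.
import cv2
import numpy as np

def projection_matrix(K, R_xi, t_xi):
    return K @ np.hstack([R_xi, t_xi.reshape(3, 1)])

# Placeholder intrinsics and recalibrated extrinsics (in practice these would
# come from the calibration and the parametric recalibration function above).
K = np.array([[1200.0, 0.0, 640.0], [0.0, 1200.0, 360.0], [0.0, 0.0, 1.0]])
R_xi = np.eye(3)
t_xi = np.array([-0.065, 0.0, 0.0])   # e.g. a 65 mm camera distance

P_left = projection_matrix(K, np.eye(3), np.zeros(3))   # reference camera
P_right = projection_matrix(K, R_xi, t_xi)

# Matched pixel coordinates from both (rectified) views, shape (2, N).
pts_l = np.array([[640.0], [360.0]])
pts_r = np.array([[632.2], [360.0]])

pts4d = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
surface_points = (pts4d[:3] / pts4d[3]).T               # (N, 3) Euclidean
```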
  • In addition, the virtual 3D test object O can be interpolated for different eye positions and eye distances.
  • In particular, the problem of the exact positioning of the two cameras during the stereo calibration, which is very difficult to achieve (e.g. due to construction tolerances of the cameras) but would be needed to describe/evaluate the 3D HUD performance for a desired eye distance, can thus be avoided using rigorous mathematical means.
  • FIGS. 4 and 5 show further examples of the software-side overlay of the virtual 3D test object O to be measured with predefined surface patterns in step S1 of FIG. 2, before it is generated and photographed from different camera perspectives in steps S2 and S3. If the virtual object is overlaid with filled circles 11, for example as in FIG. 4, the feature matching can be carried out between the photos from the individual camera perspectives (which can correspond to the virtual images OL and OR of FIG. 1 perceivable by the user 6).
  • Another possible implementation according to FIG. 5 is to overlay the virtual 3D test object to be measured (here a pyramid with a rectangular base) with a speckle-like surface pattern on the software side.
  • The feature matching between the photos from the individual camera perspectives can be done here, for example, by rectifying the two photos and assigning the gray values around the corresponding maxima of the speckles 12 in the photos from the individual camera perspectives. Otherwise, the same as in Fig. 4 can apply here.
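  • Such a gray-value assignment along rectified image rows could, purely as an illustration, be implemented with a normalized cross-correlation (all names and parameters below are assumptions of this sketch).

```python
# Hypothetical sketch: for a gray-value patch around a speckle maximum at
# (u, v) in the rectified left photo, find the best match along the same
# pixel row of the rectified right photo via normalized cross-correlation.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def match_along_row(left, right, u, v, half=7, search=120):
    """Return the best-matching column in 'right' at row v and its NCC score."""
    patch = left[v - half:v + half + 1, u - half:u + half + 1]
    best_u, best_score = u, -1.0
    for uc in range(max(half, u - search), u + 1):  # non-negative disparity
        cand = right[v - half:v + half + 1, uc - half:uc + half + 1]
        score = ncc(patch, cand)
        if score > best_score:
            best_u, best_score = uc, score
    return best_u, best_score
```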
  • Another possible implementation is to overlay the virtual 3D test object O to be measured with a checkerboard pattern on the software side.
  • The feature matching between the photos from the individual camera perspectives is done here, for example, by determining the corner points 10 of the checkerboard patterns and assigning the corresponding coordinates in the photos from the individual camera perspectives.
  • To simplify this assignment, the individual squares of the checkerboard can be coded, for example, by colors or QR-like patterns. Otherwise, the same as in Fig. 4 or 5 can apply here.
  • In a further possible implementation, the virtual 3D test object O consists of N virtual sub-objects distributed arbitrarily in 3D space (e.g. a regular or grid-like three-dimensional arrangement of three-dimensional spheres), each of which is overlaid with a surface pattern whose feature(s) are used for the stereoscopic feature matching.
  • Another possible implementation would be to overlay the virtual 3D object to be measured with an aperiodic surface pattern, for example a sine pattern (not shown), on the software side.
  • The feature matching between the photos from the individual camera perspectives can be implemented here by rectifying the photos and assigning the gray values around the corresponding maxima of the sine pattern in the photos from the individual camera perspectives.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for measuring virtual 3D images of a field-of-view display device, which is designed to superimpose 3D images into a user's field of vision by reflection on a partially transparent reflection pane. The method comprises the following steps: providing image generation data of a virtual 3D test object, the three-dimensional object surface of which is overlaid with a predetermined visual surface pattern and/or the 3D test object itself or individual non-contiguous partial objects thereof representing a predetermined 3D pattern; generating the virtual 3D test object according to the provided image generation data; capturing the generated virtual 3D test object from two different perspectives using an optical stereo camera system and identifying corresponding object surface points in the two camera images based on the surface patterns and/or 3D patterns captured therein; and determining therefrom at least one actual display parameter of the generated virtual 3D test object.
PCT/EP2023/068086 2022-08-09 2023-06-30 Device and method for measuring three-dimensional virtual images and objects of a head-up display WO2024032971A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022120088.4A DE102022120088A1 (de) 2022-08-09 2022-08-09 Device and method for measuring three-dimensional virtual images and objects of a head-up display
DE102022120088.4 2022-08-09

Publications (1)

Publication Number Publication Date
WO2024032971A1 (fr)

Family

ID=87074755

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/068086 WO2024032971A1 (fr) 2022-08-09 2023-06-30 Device and method for measuring three-dimensional virtual images and objects of a head-up display

Country Status (2)

Country Link
DE (1) DE102022120088A1 (fr)
WO (1) WO2024032971A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1871281A (en) 1926-07-26 1932-08-09 Savage Lawrence Francis Stereoscopic photography
US6430373B1 (en) 2000-05-24 2002-08-06 Minoru Inaba Stereo camera
DE102006049695A1 (de) 2006-10-16 2008-04-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for contactless detection of a three-dimensional contour
JP2012058076A (ja) * 2010-09-09 2012-03-22 3D Media Co Ltd Three-dimensional measuring device and three-dimensional measuring method
DE102014013221A1 (de) * 2014-09-05 2015-04-02 Daimler Ag Device and method for calibrating an image display unit of a vehicle
US9503703B1 (en) * 2012-10-05 2016-11-22 Amazon Technologies, Inc. Approaches for rectifying stereo cameras
DE102015211954A1 (de) 2015-06-26 2016-12-29 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and device for contactless measurement of an object surface
JP2019219929A (ja) * 2018-06-20 2019-12-26 株式会社フォーディーアイズ Continuous calibration system and method therefor



Also Published As

Publication number Publication date
DE102022120088A1 (de) 2024-02-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23736723

Country of ref document: EP

Kind code of ref document: A1