CN116634259A - HUD image position adjusting method and HUD image position adjusting system - Google Patents


Info

Publication number
CN116634259A
CN116634259A (application number CN202310527309.5A)
Authority
CN
China
Prior art keywords
relative position
hud
image
target
target driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310527309.5A
Other languages
Chinese (zh)
Inventor
许勤军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Cheliantianxia Information Technology Co ltd
Original Assignee
Wuxi Cheliantianxia Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Cheliantianxia Information Technology Co ltd filed Critical Wuxi Cheliantianxia Information Technology Co ltd
Priority to CN202310527309.5A
Publication of CN116634259A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Abstract

The application provides a HUD image position adjusting method and system. The method comprises the following steps: acquiring a face image of a target driver captured by an image acquisition device; recognizing the face image and determining a first relative position between the target driver's eyes and the image acquisition device; determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield; determining a fourth relative position between the HUD projection image and the target driver's eye box area based on the third relative position; and adjusting the position of the HUD projection image based on the fourth relative position so that the target driver can see the complete projection image. Through this technical scheme, the HUD image position can be adjusted automatically with reduced manual operation, making the adjustment simpler and more convenient.

Description

HUD image position adjusting method and HUD image position adjusting system
Technical Field
The application relates to the technical field of HUD imaging, in particular to a HUD image position adjusting method and system.
Background
With the development of automotive intelligence, the cost of head-up display devices (HUDs) has fallen, and more and more automobiles are equipped with them. A HUD can display prompt information such as vehicle speed, engine speed, navigation and fault warnings within the driver's forward head-up field of view, combined with the real scene around the vehicle. This reduces how often the driver looks down at the instrument cluster, improving driving safety while also improving the driver's awareness of the surrounding environment.
To let a driver see the complete HUD image, the projection position of the HUD image must be adjusted. Once adjusted, each driver's suitable HUD height setting can be stored (face information mapped to the HUD height setting), and on the next use the stored data is matched automatically via face recognition so that the HUD height is adjusted automatically. In the prior art, however, the HUD height is adjusted manually, and every user/driver must perform the manual adjustment at least once. During adjustment, the driver must operate a control (typically by entering the corresponding setting item on the central control screen), then return to the driving position to confirm whether the virtual image is in place; this usually takes several back-and-forth iterations, making the process relatively cumbersome.
Disclosure of Invention
Accordingly, the present application is directed to a HUD image position adjusting method and system, which can automatically adjust the HUD image position and reduce manual operations.
The embodiment of the application provides a HUD image position adjusting method, which comprises the following steps:
acquiring a face image of a target driver captured by an image acquisition device;
recognizing the face image and determining a first relative position between the target driver's eyes and the image acquisition device;
determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield;
determining a fourth relative position between the HUD projection image and the target driver's eye box area based on the third relative position;
and adjusting the position of the HUD projection image based on the fourth relative position so that the target driver can see the complete projection image.
Optionally, the adjusting the position of the HUD projection image based on the fourth relative position includes:
determining a target refraction angle of the curved mirror according to the fourth relative position;
and adjusting the curved mirror according to the target refraction angle so as to adjust the position of the HUD projection image.
Optionally, the adjusting the curved mirror according to the target refraction angle includes:
and determining a target rotation angle of the curved mirror based on the target refraction angle, and adjusting the curved mirror according to the target rotation angle.
Optionally, the determining of the third relative position between the target driver's eyes and the windshield includes: determining a third relative position between the target driver's eyes and a HUD light reflecting area in the windshield.
Optionally, the third relative position includes: the relative distance and angle of the target driver's eyes to the HUD light reflecting area in the windshield.
Optionally, the fourth relative position includes a horizontal position and a vertical distance.
The embodiment of the application also provides an adjusting system for the HUD projection image position, which comprises the following components:
the acquisition module is used for acquiring the face image of the target driver acquired by the image acquisition device;
the first determining module is used for identifying the facial image and determining a first relative position of eyes of the target driver and the image acquisition device;
a second determination module for determining a third relative position between the target driver's eye and the windshield based on the first relative position and a predetermined second relative position between the image capture device and the windshield;
a third determination module for determining a fourth relative position of the HUD projected image and the target driver's eye-box area based on a third relative position between the target driver's eyes and the windshield;
and the adjusting module is used for adjusting the position of the HUD projection image based on the fourth relative position so as to enable the target driver to acquire all projection images.
Optionally, when the adjusting module is configured to adjust the position of the HUD projection image based on the fourth relative position, the adjusting module is configured to:
determining a target refraction angle of the curved mirror according to the fourth relative position;
and adjusting the curved mirror according to the target refraction angle so as to adjust the position of the HUD projection image.
Optionally, when the adjusting module is used for adjusting the curved mirror according to the target refraction angle, the adjusting module is used for:
and determining a target rotation angle of the curved mirror based on the target refraction angle, and adjusting the curved mirror according to the target rotation angle.
Optionally, the determining a third relative position between the target driver's eye and the windshield includes: a third relative position of the target driver's eyes and a HUD light reflecting area in the windshield is determined.
Optionally, the third relative position includes: the relative distance and angle of the target driver's eyes to the HUD light reflecting area in the windshield.
Optionally, the fourth relative position includes a horizontal position and a vertical distance.
The embodiment of the application also provides an electronic device, comprising a processor, a memory and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the adjustment method described above.
The embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the adjustment method as described above.
The embodiment of the application provides a method and a system for adjusting the position of a HUD image. The method comprises the following steps: acquiring a face image of a target driver captured by an image acquisition device; recognizing the face image and determining a first relative position between the target driver's eyes and the image acquisition device; determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield; determining a fourth relative position between the HUD projection image and the target driver's eye box area based on the third relative position; and adjusting the position of the HUD projection image based on the fourth relative position so that the target driver can see the complete projection image.
Through this technical scheme, the HUD image position can therefore be adjusted automatically with reduced manual operation, making the adjustment simpler and more convenient.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed by the embodiments are briefly introduced below. The following drawings show only some embodiments of the present application and should therefore not be regarded as limiting its scope; a person skilled in the art may derive other related drawings from them without inventive effort.
Fig. 1 is a flowchart of a HUD image position adjustment method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an internal structure of an automobile provided with a HUD according to the present application;
fig. 3 is a schematic structural diagram of an adjusting system for HUD projection image position according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions are described completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments, as generally described and illustrated in the figures, may be arranged and designed in a wide variety of configurations. The following detailed description is therefore not intended to limit the claimed scope of the application but merely represents selected embodiments. Every other embodiment obtained by a person skilled in the art from these embodiments without inventive effort falls within the scope of protection of the present application.
As described in the Background above, prior-art HUD height adjustment is manual and cumbersome: each driver must adjust the HUD at least once, moving between the central control screen and the driving position to confirm whether the virtual image is in place. Based on this, the embodiment of the present application provides a HUD image position adjusting method that adjusts the HUD image position automatically and reduces manual operation.
Referring to fig. 1, fig. 1 is a flowchart of a HUD image position adjusting method according to an embodiment of the present application. As shown in fig. 1, an adjusting method provided by an embodiment of the present application includes:
s101, acquiring a face image of a target driver acquired by an image acquisition device.
Here, the image acquisition device may be a camera installed in the target vehicle, for example a DMS camera, an OMS camera, or any other in-cabin camera capable of capturing the driver's face.
The camera may be an RGB monocular camera (color, black-and-white, infrared, etc.) or an RGBD depth camera (of binocular, TOF, structured-light/grating or other types); it may also be an in-cabin radar capable of outputting 3D information.
The face image of the target driver captured by the image acquisition device may be acquired at specified time intervals in response to an acquisition instruction. It should be noted that the specified time interval can be set in advance as required, for example 1 hour, 3 hours or 6 hours. The acquisition instruction can also be triggered when the driver acts on a virtual or hardware switch of the automobile through a specified operation, such as a click, slide, toggle, press or voice operation.
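The trigger logic above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and parameters are hypothetical.

```python
def should_capture(last_capture_ts, now_ts, interval_s, switch_triggered=False):
    """Decide whether to request a new face image.

    A capture is requested when the driver actuates the (virtual or
    hardware) adjustment switch, or when the specified time interval
    has elapsed since the last capture. Timestamps and interval are
    in seconds.
    """
    if switch_triggered:
        return True
    return (now_ts - last_capture_ts) >= interval_s
```

For example, with a 1-hour interval, `should_capture(0, 3600, 3600)` returns `True` while `should_capture(0, 10, 3600)` returns `False` unless the switch was triggered.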
For example, referring to fig. 2, fig. 2 is a schematic diagram of the interior of an automobile equipped with a HUD according to the present application. As shown in fig. 2, after the image acquisition device collects the target driver's face image, the controller may transmit the collected face image to the cabin head unit (HU). Fig. 2 also shows a possible application scenario provided by the present application, in which the HUD system is applied to a vehicle. The HUD system projects instrument information (vehicle speed, temperature, fuel level, etc.) and navigation information into the driver's field of view through the vehicle's windshield. The virtual image corresponding to the navigation information can be superimposed on the real environment outside the vehicle, giving the driver an augmented-reality visual effect; the HUD system can thus be used for AR navigation, adaptive cruise, lane departure warning and the like. Since the virtual image corresponding to the navigation information must be combined with the real scene, the vehicle needs accurate positioning and detection functions, and the HUD system generally has to cooperate with the vehicle's advanced driving assistance system (ADAS). It should be noted that the above scenario is only an example; the HUD system provided by the present application may be applied to other scenarios, for example to aircraft (such as fighter planes), where target tracking and aiming based on the HUD system can help improve combat success rate and flexibility.
The following explains some terms in fig. 2 in the present application. It should be noted that these explanations are for the convenience of those skilled in the art, and do not limit the scope of the present application.
Eye box: an eye box generally refers to the area within which the driver's eyes can see the entire displayed image, see fig. 1 above. To accommodate drivers of different heights, a typical eye box is 130 mm x 50 mm (horizontal x vertical), i.e. the driver's eyes can move roughly 130 mm laterally and 50 mm vertically while still seeing the image. If the driver's eyes are within the eye box, a complete and clear image can be seen; if they move outside it, the image may appear distorted or show color errors, or may not be visible at all.
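The eye box containment condition can be written as a simple bounds check. This sketch assumes a planar eye box with positions measured in millimetres in its own plane; the function name and coordinate convention are illustrative, not from the patent.

```python
def eye_in_eye_box(eye_xy_mm, box_centre_xy_mm, box_w_mm=130.0, box_h_mm=50.0):
    """Return True if the eye position lies inside the eye box.

    eye_xy_mm        : (x, y) eye position in the eye-box plane, mm
    box_centre_xy_mm : (x, y) centre of the eye box, mm
    box_w_mm, box_h_mm : horizontal x vertical extents (default
                         130 mm x 50 mm, as in the text)
    """
    ex, ey = eye_xy_mm
    cx, cy = box_centre_xy_mm
    return (abs(ex - cx) <= box_w_mm / 2) and (abs(ey - cy) <= box_h_mm / 2)
```

With the default size, an eye 70 mm to the side of the box centre is outside the box (half-width is 65 mm), matching the "distorted or invisible image" case described above.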
PGU: in one possible implementation, the PGU includes a display screen that is separable into a plurality of regions, such as a first region for emitting a first light and a second region for emitting a second light.
Primary mirror: alternatively, the primary mirror may be a first planar mirror; alternatively, the primary mirror is a first curved mirror.
In one possible implementation, a primary mirror nearest the PGU is configured to receive the first light from the PGU and reflect the first light toward a secondary mirror. Therefore, the first reflecting mirror can be used for folding the light path of the first light ray once.
A secondary mirror: alternatively, the secondary mirror may be a curved mirror.
In one possible implementation, the secondary mirror nearest to the primary mirror is configured to receive the first light, whose optical path has been folded, from the primary mirror, and to reflect it toward the windshield.
In one possible implementation, the secondary mirror further comprises a rotation axis for adjusting the position of the virtual image on the windscreen.
Before acquiring the face image of the target driver, the target driver is detected to determine whether an adjustment triggering condition is met. For example, the triggering condition may be detecting that the target driver has fastened the seat belt, or detecting that the target driver has started the automobile; other triggering conditions are possible and are not limited here.
S102, recognizing the facial image, and determining a first relative position of eyes of the target driver and the image acquisition device.
Here, a position determination algorithm may be predetermined. The algorithm detects the target driver's eyes in the face image, and the first relative position between the eyes and the image acquisition device is determined based on the detected eye information.
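One way to obtain such a first relative position, assuming the eye centre has already been located in the image by a face/landmark detector and a depth estimate is available (e.g. from an RGBD camera, as mentioned in S101), is to back-project the pixel through a pinhole camera model. This is a hedged sketch; the patent does not specify the algorithm, and the intrinsics here are illustrative.

```python
def first_relative_position(eye_px, depth_m, fx, fy, cx, cy):
    """Back-project the detected eye centre (pixel coordinates) into
    camera coordinates, giving the eye position relative to the image
    acquisition device in metres.

    eye_px         : (u, v) pixel coordinates of the eye centre
    depth_m        : eye-to-camera distance along the optical axis
    fx, fy, cx, cy : pinhole camera intrinsics (focal lengths and
                     principal point, in pixels)
    """
    u, v = eye_px
    x = (u - cx) * depth_m / fx  # lateral offset
    y = (v - cy) * depth_m / fy  # vertical offset
    z = depth_m                  # distance along the optical axis
    return (x, y, z)
```

An eye imaged exactly at the principal point sits on the optical axis, so its lateral and vertical offsets are zero.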
S103, determining a third relative position between the eyes of the target driver and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield.
Here, the second relative position between the image acquisition device and the windshield is determined from the design position information of the in-vehicle devices. Specifically, the second relative position is the relative position between the image acquisition device and the HUD light reflecting area in the windshield.
Wherein the third relative position includes: the relative distance and angle of the target driver's eyes to the HUD light reflecting area in the windshield.
Thus, determining the third relative position between the target driver's eyes and the windshield includes: determining a third relative position between the target driver's eyes and the HUD light reflecting area in the windshield.
The HUD light reflecting area is the position in the windshield where the HUD is required to project the virtual image.
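Step S103 amounts to composing two known offsets. A minimal sketch, assuming the first and second relative positions are expressed as 3D vectors in the camera's coordinate frame (an assumption, since the patent does not fix a representation):

```python
import math

def third_relative_position(first_rel, second_rel):
    """Compose the eye-to-camera and camera-to-reflection-area offsets
    into the eye-to-reflection-area offset, then express it as the
    relative distance and angle used in the text.

    first_rel  : eye position relative to the camera, (x, y, z) metres
    second_rel : HUD light reflecting area position relative to the
                 camera, (x, y, z) metres, from the vehicle design data
    """
    dx = second_rel[0] - first_rel[0]
    dy = second_rel[1] - first_rel[1]
    dz = second_rel[2] - first_rel[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Elevation angle of the reflection area as seen from the eye
    angle_deg = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return distance, angle_deg
```

For instance, a reflection area 1 m straight ahead of the eyes gives a distance of 1.0 m and an angle of 0 degrees.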
S104, determining a fourth relative position of the HUD projection image and the eye box area of the target driver based on a third relative position between the eyes of the target driver and the windshield.
Here, the fourth relative position includes: horizontal position and vertical distance.
The eye box area is the range within which the target driver's eyes can move; it is a rectangular area centered on the eye position. Within the eye box, the eyes can see the entire image projected by the HUD.
It should be noted that there is a fixed area/range on the front windshield used for reflecting HUD light, namely the HUD light reflecting area. When the vehicle is designed, a standard eye position (which can be understood as a standard height) is set for the driver's seat; when the HUD is designed, the standard eye box position is designed according to this standard eye position, and an adjustable range for the eye box is designed to accommodate different driver heights. So-called virtual image adjustment means moving the eye box to the actual eye position; the driver experiences this as the position (generally the height) of the virtual image being adjusted. In practice, the angle of a curved mirror is adjusted, which changes the angle at which the HUD light is incident on the windshield; as the incident angle changes, the height of the reflected light changes, and the height of the virtual image changes accordingly. The size of the eye box is generally 130 x 50 mm (horizontal x vertical), with the eyes in the middle area of the eye box, and the adjustable range of the eye box is generally +/-50 mm. Thus, the fourth relative position may be determined based on the third relative position.
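The fourth relative position can thus be sketched as the offset of the actual eye position from the standard (design) eye position, with the vertical component clamped to the eye box's adjustable range. This is an illustrative reading of the step, not the patent's exact computation.

```python
def fourth_relative_position(actual_eye_mm, standard_eye_mm, adjust_range_mm=50.0):
    """Offset of the actual eye position from the standard design eye
    position, i.e. how far the eye box must be moved.

    actual_eye_mm, standard_eye_mm : (horizontal, vertical) in mm
    adjust_range_mm : adjustable range of the eye box (+/-50 mm in
                      the text); the vertical offset is clamped to it
    Returns (horizontal_offset_mm, vertical_offset_mm).
    """
    dh = actual_eye_mm[0] - standard_eye_mm[0]
    dv = actual_eye_mm[1] - standard_eye_mm[1]
    dv = max(-adjust_range_mm, min(adjust_range_mm, dv))
    return dh, dv
```

An eye 80 mm above the standard position, for example, would yield a clamped vertical offset of 50 mm, the limit of the adjustable range.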
With continued reference to fig. 2, the HUD projection image required by the driver is the image formed when the light emitted by the PGU (image generating unit) is reflected by the primary mirror, the secondary mirror and the front windshield of the automobile, and finally reaches the eye position (eye box area) corresponding to the driver's height, so that the driver can see the complete virtual image.
And S105, adjusting the position of the HUD projection image based on the fourth relative position so as to enable the target driver to acquire all projection images.
In one embodiment of the present application, adjusting the position of the HUD projection image based on the fourth relative position includes: determining a target refraction angle of the curved mirror according to the fourth relative position; and adjusting the curved mirror according to the target refraction angle so as to adjust the position of the HUD projection image.
In another embodiment of the present application, the adjusting the curved mirror according to the target refraction angle includes: and determining a target rotation angle of the curved mirror based on the target refraction angle, and adjusting the curved mirror according to the target rotation angle.
It should be noted that, when the HUD optical path is designed, the curved-mirror angle corresponding to each virtual image position is known in advance; when the curved mirror is adjusted to the target rotation angle, it can be driven by a stepping motor.
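Since the design-time data relates virtual image positions to curved-mirror angles, the adjustment can be sketched as a table lookup with linear interpolation plus a conversion to stepper-motor steps. The calibration table, step resolution and function names below are hypothetical, not taken from the patent.

```python
def mirror_angle_for_offset(offset_mm, calibration):
    """Linearly interpolate the curved-mirror angle for a desired
    vertical virtual-image offset, using a calibration table of
    (offset_mm, angle_deg) pairs recorded at HUD design time.
    """
    calibration = sorted(calibration)
    lo_off, lo_ang = calibration[0]
    for hi_off, hi_ang in calibration[1:]:
        if offset_mm <= hi_off:
            t = (offset_mm - lo_off) / (hi_off - lo_off)
            return lo_ang + t * (hi_ang - lo_ang)
        lo_off, lo_ang = hi_off, hi_ang
    return calibration[-1][1]  # beyond table: saturate at the limit

def steps_for_angle(current_deg, target_deg, deg_per_step=0.05):
    """Convert an angle change into stepping-motor steps
    (sign indicates direction)."""
    return round((target_deg - current_deg) / deg_per_step)
```

For a table mapping -50..+50 mm to -2..+2 degrees, a 25 mm offset interpolates to 1.0 degree, i.e. 20 steps at 0.05 degrees per step.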
With continued reference to fig. 2, the secondary mirror in fig. 2 is a curved mirror, specifically a free-form mirror. The HUD optical path adjustment principle in this scheme, described with reference to fig. 2, is as follows: the PGU inside the HUD emits light, which is reflected first by the primary mirror and then by the secondary (curved) mirror, whose angle is motor-controlled and adjustable. The light then strikes the front windshield, is reflected into the driver's eyes, and converges so that the driver sees a virtual image.
When the HUD is designed, it is known that light incident on the front windshield at different angles produces virtual images visible at different positions within the eye box range. The incidence angle is obtained by changing the angle of the curved mirror (the light incident on the curved mirror is fixed; as the mirror angle changes, the direction of the reflected light changes synchronously). That is, different virtual image positions are obtained by adjusting the curved mirror to the corresponding angle, which is achieved by controlling the motor inside the HUD.
In addition, after the virtual image position required by the target driver has been adjusted, the adjustment parameters are acquired, and the driver's face image is stored in correspondence with the adjusted HUD parameters and their values. When the target driver uses the vehicle next time, the target vehicle can acquire the driver's identity information, determine the HUD parameters and values matching that identity, and automatically adjust its HUD accordingly. Consequently, even when multiple people use the target vehicle, the drivers do not need to re-adjust the HUD parameters frequently, which improves the driver experience.
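The face-to-parameters storage described above can be sketched as a small keyed store. The class and field names are illustrative assumptions; the patent only specifies that face information is stored in correspondence with the adjusted HUD parameters and values.

```python
class HudProfileStore:
    """Store each driver's adjusted HUD parameters keyed by a face
    identity (e.g. an ID produced by face recognition), so the
    adjustment can be replayed automatically on the next use."""

    def __init__(self):
        self._profiles = {}

    def save(self, face_id, params):
        # params: e.g. {"mirror_angle_deg": 1.2}; copied defensively
        self._profiles[face_id] = dict(params)

    def lookup(self, face_id):
        # Returns None when the driver has no stored adjustment yet,
        # in which case the automatic adjustment flow (S101-S105) runs.
        return self._profiles.get(face_id)
```

On a subsequent drive, a successful lookup lets the vehicle apply the stored parameters directly instead of re-running the full adjustment.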
The embodiment of the application provides a method and a system for adjusting the position of a HUD image. The method comprises the following steps: acquiring a face image of a target driver captured by an image acquisition device; recognizing the face image and determining a first relative position between the target driver's eyes and the image acquisition device; determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield; determining a fourth relative position between the HUD projection image and the target driver's eye box area based on the third relative position; and adjusting the position of the HUD projection image based on the fourth relative position so that the target driver can see the complete projection image.
Through this technical scheme, the HUD image position can therefore be adjusted automatically with reduced manual operation, making the adjustment simpler and more convenient.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a HUD projection image position adjusting system according to an embodiment of the present application. As shown in fig. 3, the conditioning system 300 includes:
an acquisition module 310 for acquiring a face image of the target driver acquired by the image acquisition device;
a first determining module 320, configured to identify the facial image, and determine a first relative position of the eyes of the target driver and the image capturing device;
a second determination module 330 for determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image capture device and the windshield;
a third determination module 340 for determining a fourth relative position of the HUD projected image and the target driver's eye-box area based on a third relative position between the target driver's eyes and the windshield;
and the adjusting module 350 is configured to adjust the position of the HUD projection image based on the fourth relative position, so that the target driver obtains all projection images.
Optionally, when configured to adjust the position of the HUD projected image based on the fourth relative position, the adjusting module 350 is configured to:
determine a target refraction angle of the curved mirror according to the fourth relative position;
and adjust the curved mirror according to the target refraction angle, thereby adjusting the position of the HUD projected image.
Optionally, when configured to adjust the curved mirror according to the target refraction angle, the adjusting module 350 is configured to:
determine a target rotation angle of the curved mirror based on the target refraction angle, and adjust the curved mirror according to the target rotation angle.
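The relationship between the target refraction angle and the mirror's target rotation angle is not spelled out in the publication. A minimal sketch, assuming a flat-mirror approximation in which rotating the mirror by an angle θ deviates the reflected ray by 2θ; the function names and the image-shift estimate below are hypothetical, not the patented computation.

```python
import math


def target_rotation_angle(current_refraction_deg: float,
                          target_refraction_deg: float) -> float:
    # Rotating a mirror by theta deviates the reflected ray by 2*theta,
    # so the mirror only needs to turn through half the desired change
    # in the reflection angle.
    return (target_refraction_deg - current_refraction_deg) / 2.0


def image_shift_mm(mirror_rotation_deg: float, optical_throw_mm: float) -> float:
    # Rough shift of the projected image for a given mirror rotation:
    # shift ~= throw * tan(2 * theta), with the optical throw being the
    # assumed path length from mirror to windshield reflecting area.
    return optical_throw_mm * math.tan(math.radians(2.0 * mirror_rotation_deg))
```

For example, moving the reflection angle from 10° to 14° requires only a 2° mirror rotation under this approximation.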
Optionally, determining the third relative position between the target driver's eyes and the windshield includes: determining a third relative position between the target driver's eyes and a HUD light-reflecting area of the windshield.
Optionally, the third relative position includes: the relative distance and angle between the target driver's eyes and the HUD light-reflecting area of the windshield.
Optionally, the fourth relative position includes a horizontal position and a vertical distance.
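Given a fourth relative position expressed as a horizontal and a vertical offset, a simple containment check decides whether a correction is needed. This is an illustrative sketch only; the eye-box dimensions used as defaults below are assumptions, not values from the publication.

```python
def correction_needed(horizontal_mm: float, vertical_mm: float,
                      eyebox_w_mm: float = 130.0,
                      eyebox_h_mm: float = 50.0) -> tuple:
    """Return the (dx, dy) image correction that recentres the projected
    image on the eye box, or (0, 0) per axis when the offset already
    keeps the image within the eye box. Eye-box size is illustrative."""
    dx = 0.0 if abs(horizontal_mm) <= eyebox_w_mm / 2 else -horizontal_mm
    dy = 0.0 if abs(vertical_mm) <= eyebox_h_mm / 2 else -vertical_mm
    return dx, dy
```

A small offset inside the eye box yields no correction, while an offset beyond the half-width is cancelled outright.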
Referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the application. As shown in fig. 4, the electronic device 400 includes a processor 410, a memory 420, and a bus 430.
The memory 420 stores machine-readable instructions executable by the processor 410. When the electronic device 400 is running, the processor 410 communicates with the memory 420 through the bus 430, and when the machine-readable instructions are executed by the processor 410, the steps of the adjustment method in the method embodiment shown in fig. 1 are performed. For a specific implementation, refer to the method embodiment; details are not repeated here.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of the adjustment method in the method embodiment shown in fig. 1. For a specific implementation, refer to the method embodiment; details are not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the systems, apparatuses and units described above may refer to the corresponding procedures in the foregoing method embodiments and are not repeated here.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices and methods may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is merely a logical functional division, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If implemented in the form of software functional units and sold or used as a stand-alone product, the functions may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
Finally, it should be noted that the above embodiments are only specific implementations of the present application, intended to illustrate rather than limit its technical solutions; the protection scope of the application is not limited thereto. Although the application has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be covered by the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (10)

1. A method of adjusting a HUD image position, the method comprising:
acquiring a face image of a target driver acquired by an image acquisition device;
identifying the facial image and determining a first relative position between the eyes of the target driver and the image acquisition device;
determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield;
determining a fourth relative position between the HUD projected image and the target driver's eye-box area based on the third relative position between the target driver's eyes and the windshield;
and adjusting the position of the HUD projection image based on the fourth relative position, so that the target driver can see the entire projected image.
2. The adjustment method according to claim 1, characterized in that the adjusting the position of the HUD projection image based on the fourth relative position comprises:
determining a target refraction angle of the curved mirror according to the fourth relative position;
and adjusting the curved mirror according to the target refraction angle so as to adjust the position of the HUD projection image.
3. The adjustment method according to claim 2, wherein said adjusting said curved mirror according to said target refraction angle comprises:
and determining a target rotation angle of the curved mirror based on the target refraction angle, and adjusting the curved mirror according to the target rotation angle.
4. The method of adjusting of claim 1, wherein the determining a third relative position between the target driver's eye and the windshield comprises: a third relative position of the target driver's eyes and a HUD light reflecting area in the windshield is determined.
5. A method of adjusting as claimed in claim 3, wherein the third relative position comprises: the relative distance and angle of the target driver's eyes to the HUD light reflecting area in the windshield.
6. A method of adjusting as claimed in claim 3, wherein the fourth relative position comprises a horizontal position and a vertical distance.
7. An adjustment system for HUD projected image position, said adjustment system comprising:
the acquisition module is used for acquiring the face image of the target driver acquired by the image acquisition device;
the first determining module is used for identifying the facial image and determining a first relative position of eyes of the target driver and the image acquisition device;
a second determination module for determining a third relative position between the target driver's eyes and the windshield based on the first relative position and a predetermined second relative position between the image acquisition device and the windshield;
a third determination module for determining a fourth relative position of the HUD projected image and the target driver's eye-box area based on a third relative position between the target driver's eyes and the windshield;
and an adjusting module for adjusting the position of the HUD projection image based on the fourth relative position, so that the target driver can see the entire projected image.
8. The adjustment system of claim 7, wherein the adjustment module, when configured to adjust the position of the HUD projected image based on the fourth relative position, is configured to:
determining a target refraction angle of the curved mirror according to the fourth relative position;
and adjusting the curved mirror according to the target refraction angle so as to adjust the position of the HUD projection image.
9. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating via said bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the adjustment method according to any one of claims 1 to 6.
10. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when executed by a processor, performs the steps of the adjustment method according to any of claims 1 to 6.
CN202310527309.5A 2023-05-09 2023-05-09 HUD image position adjusting method and HUD image position adjusting system Pending CN116634259A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310527309.5A CN116634259A (en) 2023-05-09 2023-05-09 HUD image position adjusting method and HUD image position adjusting system

Publications (1)

Publication Number Publication Date
CN116634259A true CN116634259A (en) 2023-08-22

Family

ID=87601818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310527309.5A Pending CN116634259A (en) 2023-05-09 2023-05-09 HUD image position adjusting method and HUD image position adjusting system

Country Status (1)

Country Link
CN (1) CN116634259A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090303158A1 (en) * 2008-06-09 2009-12-10 Nobuyuki Takahashi Head-up display system
WO2016101917A1 (en) * 2014-12-25 2016-06-30 Byd Company Limited Vehicle, head-up displaying system and method for adjusting height of projection image thereof
WO2022188096A1 (en) * 2021-03-11 2022-09-15 华为技术有限公司 Hud system, vehicle, and virtual image position adjustment method
CN115471395A (en) * 2022-03-28 2022-12-13 北京罗克维尔斯科技有限公司 Image adjusting method and device, electronic equipment and storage medium
CN115690251A (en) * 2022-11-15 2023-02-03 北京梧桐车联科技有限责任公司 Method and device for displaying image on front windshield of vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination