CN114821723B - Projection image plane adjusting method, device, equipment and storage medium - Google Patents

Projection image plane adjusting method, device, equipment and storage medium

Info

Publication number
CN114821723B
Authority
CN
China
Prior art keywords
change value
image plane
projection image
value
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210451546.3A
Other languages
Chinese (zh)
Other versions
CN114821723A (en)
Inventor
卫昱华
张波
韩雨青
吕涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202210451546.3A priority Critical patent/CN114821723B/en
Publication of CN114821723A publication Critical patent/CN114821723A/en
Application granted granted Critical
Publication of CN114821723B publication Critical patent/CN114821723B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The application is applicable to the technical field of vehicle driving, and provides a projection image plane adjusting method, device, equipment and storage medium. The projection image plane adjusting method comprises the following steps: acquiring a three-axis angle change value of the projection equipment and multiple frames of face images of a driver; determining, according to the multiple frames of face images, a first height change value of the projection image plane that matches the position of the human eyes in the face images; determining a second height change value of the projection image plane according to the three-axis angle change value; and adjusting the projection image plane according to the first height change value and the second height change value. The method effectively solves the problem that the position of the image plane projected by in-vehicle projection equipment such as a HUD changes because of vehicle vibration.

Description

Projection image plane adjusting method, device, equipment and storage medium
Technical Field
The application belongs to the technical field of vehicle driving, and particularly relates to a projection image plane adjusting method, device, equipment and storage medium.
Background
In order to keep the driver highly attentive during driving and avoid head-down and head-turning operations as much as possible, a Head-Up Display (HUD) is generally mounted on a vehicle, and driving information such as vehicle speed and navigation is projected by the HUD onto the windshield directly in front of the driver. However, when the vehicle drives over a bumpy road section or a speed bump, the position of the display picture projected on the windshield (also called the HUD projection image plane) changes with the vibration of the vehicle, i.e., the HUD shakes. As a result, the relative position between the driver's eyes and the HUD projection image plane changes, which reduces the comfort of observing the image plane and affects safe driving.
Disclosure of Invention
The embodiments of the present application provide a projection image plane adjusting method, device, equipment and storage medium, which can solve the problem that the position of the image plane projected by in-vehicle projection equipment such as a HUD changes because of vehicle vibration.
In a first aspect, an embodiment of the present application provides a method for adjusting a projection image plane, where the method includes:
acquiring a three-axis angle change value of projection equipment and a multi-frame face image of a driver;
determining a first height change value of a projection image plane matched with the positions of human eyes in the face images according to the plurality of frames of face images;
determining a second height change value of the projection image plane according to the three-axis angle change value;
and adjusting the projection image plane according to the first height change value and the second height change value.
As the vehicle vibrates, the relative position between the image plane projected by in-vehicle projection equipment such as the HUD and the eyes of the driver changes. The actual height change value of the image plane projected by the HUD is determined from the obtained three-axis angle change value of the HUD, and the actual height change value of the driver's eyes is determined from the obtained face images. Based on the matching relationship between the actual height change value of the eyes and the ideal height change value of the image plane, the actual height change value of the image plane and the ideal height change value of the image plane can be used to adjust the HUD so that the image plane projected by the HUD matches the driver's eyes. The driver can therefore see a relatively stable projected image plane when the vehicle vibrates, which effectively solves the problem of the change of the HUD projection image plane position caused by vehicle vibration and ensures the driver's safe driving.
In a possible implementation manner of the first aspect, determining a second height variation value of the projection image plane according to the three-axis angle variation values includes:
and if the three-axis angle change value is greater than a first preset threshold value and the first height change value is greater than a second preset threshold value, determining a second height change value of the projection image plane according to the three-axis angle change value.
In a possible implementation manner of the first aspect, determining, according to a plurality of frames of face images, a first height variation value of a projection image plane that matches a position of an eye in the face images includes:
respectively obtaining the diameters of human irises in the multi-frame human face images;
determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between human eyes and equipment for collecting multiple frames of human face images;
and determining a first height change value according to the first distance and a second preset parameter, wherein the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment.
In a possible implementation manner of the first aspect, the three-axis angle change value includes a pitch angle change value, and determining the second height change value of the projection image plane according to the three-axis angle change value includes: determining the second height change value of the projection image plane according to the pitch angle change value and a preset second distance, where the preset second distance is the horizontal distance between the projection equipment and the projection image plane.
In a possible implementation manner of the first aspect, adjusting the projection image plane according to the first height variation value and the second height variation value includes:
determining a compensation value according to the first height change value and the second height change value;
and controlling the projection equipment by using the motor according to the compensation value to adjust the height of the projection image plane.
In a second aspect, an embodiment of the present application provides a projection image plane adjusting apparatus, including:
the acquisition unit is used for acquiring a three-axis angle change value of the projection equipment and a multi-frame face image of a driver;
the first determining unit is used for determining a first height change value of a projection image plane matched with the positions of human eyes in the face images according to the plurality of frames of face images;
the second determining unit is used for determining a second height change value of the projection image surface according to the three-axis angle change value;
and the adjusting unit is used for adjusting the projection image plane according to the first height change value and the second height change value.
In a possible implementation manner of the second aspect, the second determining unit includes:
the determination module is used for determining a second height change value of the projection image plane according to the three-axis angle change value when the judgment module judges that the three-axis angle change value is larger than a first preset threshold value and the first height change value is larger than a second preset threshold value.
In one possible implementation manner of the second aspect, the first determining unit includes:
the acquisition module is used for respectively acquiring the diameters of the irises of the human eyes in the multi-frame human face images;
the first determining module is used for determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between human eyes and equipment for collecting multiple frames of human face images;
and the second determining module is used for determining a first height change value according to the first distance and a second preset parameter, wherein the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment.
In a possible implementation manner of the second aspect, the three-axis angle change value includes a pitch angle change value, and the second determining unit is configured to determine a second height change value of the projection image plane according to the pitch angle change value and a preset second distance; the preset second distance is a horizontal distance between the projection equipment and the projection image plane.
In a possible implementation manner of the second aspect, the adjusting unit is configured to determine a compensation value according to the first height variation value and the second height variation value; and controlling the projection equipment by using the motor according to the compensation value to adjust the height of the projection image surface.
In a third aspect, the present application provides a terminal device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to the first aspect or any alternative manner of the first aspect when executing the computer program.
In a fourth aspect, a computer readable storage medium stores a computer program which, when executed by a processor, performs the method of the first aspect or any alternative of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method according to the first aspect or any alternative manner of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required to be used in the embodiments or the prior art description will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings may be obtained according to these drawings without inventive labor.
Fig. 1 is a schematic diagram of a method for adjusting a projection image plane according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an image plane and a human eye according to an embodiment of the present application;
fig. 3 is a schematic diagram of an image plane in a case where the HUD provided in an embodiment of the present application is not shaken;
fig. 4 is a schematic diagram of an image plane in a case where the HUD provided in an embodiment of the present application is jittered;
FIG. 5 is a flow chart illustrating a method for determining a shaking event of a HUD according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of a projection image plane adjusting device provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "first," "second," "third," and the like are used solely to distinguish one element from another and are not to be construed as indicating or implying relative importance.
In order to keep the driver highly attentive during driving and avoid head-down and head-turning operations as much as possible, a Head-Up Display (HUD) is generally mounted on a vehicle, and driving information such as vehicle speed and navigation is projected by the HUD onto the windshield directly in front of the driver. However, when the vehicle drives over a bumpy road section or a speed bump, the position of the display picture projected on the windshield (also called the HUD projection image plane) changes with the vibration of the vehicle, i.e., the HUD shakes. As a result, the relative position between the driver's eyes and the HUD projection image plane changes, which reduces the comfort of observing the image plane and affects safe driving.
The application provides a projection image plane adjusting method. Based on the matching relationship between the actual height change value of the human eyes and the ideal height change value of the image plane, the HUD is dynamically adjusted by using the eye position change value of the driver and the height change value of the HUD projection image plane, so that the display height of the adjusted HUD projection image plane corresponds to the position of the driver's eyes. The driver can thus see a relatively stable display picture while driving, which effectively solves the problem of the HUD projection image plane position change caused by vehicle vibration and ensures the driver's safe driving.
Fig. 1 is a schematic flowchart of a method for adjusting a projection image plane according to an embodiment of the present disclosure, and the method may be applied to a driving computer and other devices by way of example and not limitation.
S101, obtaining a three-axis angle change value of the projection equipment and a multi-frame face image of a driver.
In practical application, the projection device can be a HUD, an AR-HUD or other devices with projection function, and can be installed above or inside a center console of a vehicle cabin; the three-axis angle values may include roll angle, pitch angle, and yaw angle.
The various embodiments of the present application are described using a HUD as an example. The three-axis angle change value is used to characterize the attitude change of the HUD and includes the angle values of rotation around the X-axis, the Y-axis and the Z-axis respectively, where the angle of rotation around the X-axis is called the roll angle of the HUD, the angle of rotation around the Y-axis is called the pitch angle of the HUD, and the angle of rotation around the Z-axis is called the yaw angle of the HUD.
As an example, the three-axis angle values of the HUD can be obtained by an Electronic Control Unit (ECU) in the vehicle reading data of an Inertial Measurement Unit (IMU) in the HUD, wherein the IMU is a sensor for measuring acceleration and rotational motion.
The image of the face of the driver can be acquired by an image or video acquisition device arranged in the vehicle. For example, a face image of the driver may be captured with a camera provided on a windshield corresponding to the driving position; it is also possible to acquire a video including information on the face of the driver using a video recording device in the vehicle, and then obtain an image of the face of the driver from a video frame or image acquired from the video.
S102, determining a first height change value of a projection image plane matched with the positions of human eyes in the face images according to the plurality of frames of face images.
The first height change value is a height change value of a projection image surface corresponding to an actual height change value of human eyes, which is obtained according to the human face image.
It should be noted that, as shown in fig. 2, assuming that the height of the human eye is h and the horizontal distance between the human eye and the projection image plane is d2, the angle between the ground and the line connecting the human eye with the extension of the projection image plane projected by the HUD, when no shake event occurs in the HUD, can be expressed by the following formula (1):
[Formula (1) is provided as an image in the original publication.]
In formula (1), β is the pitch angle of the HUD, and fov represents the angular range that can be covered by the HUD lens.
Based on the triangle similarity theorem, the first height value h3 of the projection image plane can be determined according to the following formula (2):
[Formula (2) is provided as an image in the original publication.]
According to formula (2), the height values of the projection image plane at time t1 and time t2 can be calculated respectively, from which the height change value ΔH_ideal of the projection image plane is obtained.
Assuming that the first height value of the projection image plane at time t1 is h_31 and the corresponding height value of the human eye is h′, and that the first height value of the projection image plane at time t2 is h_32 and the corresponding height value of the human eye is h″, the first height change value ΔH_ideal of the projection image plane can be calculated according to the following formula (3):
ΔH_ideal = h_31 - h_32 = h″ - h′ (3)
As can be seen from equation (3), when the HUD is shaken, the first height variation value of the projection image plane is equal to the height variation value of the eye position in the face image.
Ideally, the height value of the projection image plane projected by the HUD corresponds to the height value of the eyes of the driver, so that the driver can see a relatively stable projection image plane during driving. When the vehicle vibrates, the projection image plane projected by the HUD changes along with the vibration of the vehicle. Therefore, the first height change value of the projection image plane projected by the HUD can be determined based on the matching relationship between the actual height change value of the human eyes and the ideal height change value of the projection image plane (i.e., the first height change value ΔH_ideal).
In one embodiment, determining a first height variation value of human eyes in a face image according to a plurality of frames of face images comprises: respectively obtaining the diameters of human irises in the multi-frame human face images; from the diameter, a first height variation value is determined.
In the embodiment of the application, the diameters of the irises of the human eyes in the multi-frame human face images are respectively obtained; determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between human eyes and equipment for acquiring multiple frames of human face images; and determining first position change information of a projection image plane matched with the position of the eyes in the face image according to the first distance and a second preset parameter, wherein the first position change information comprises a first height change value, the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment.
If a camera is used for collecting multiple frames of face images, the first preset parameter is the focal length of the camera, and the second preset parameter is an internal parameter matrix of the camera and an external parameter matrix of the camera.
It should be understood that, based on the acquired multiple frames of face images, the human eyes in the face images are identified by using an image recognition method, and the actual position information of the eyes is then calculated from the recognition result. The image recognition method may be a deep-learning-based recognition method, which is not limited by the present application.
Calculating the position information of the eyes from the recognition result of the human eyes may include the following steps. In the first step, the diameter P of the iris in the face image, the focal length f of the image or video acquisition device, and the real diameter L of the iris of the human eye are acquired, and the real distance d from the human eye to the acquisition device is determined based on the similar triangle theorem, with the calculation formula as follows:
d = (f × L) / P (4)
In formula (4), the real diameter L of the iris of the human eye is empirically about 11.75 ± 0.5 mm, and in practical application, the real distance d from the human eye to the acquisition device can be calculated by using this value of L.
In practical application, the outer contour of the iris can be detected from the face image, pixel coordinate values (u, v) corresponding to the outer contour are obtained, and the diameter P of the iris in the face image is determined according to the obtained pixel coordinate values (u, v) of the iris in the face image.
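As an illustration only, formula (4) can be applied as in the following Python sketch; the function and variable names are hypothetical, and the focal length is assumed to be expressed in pixels (the patent does not specify units).

    # Illustrative sketch of formula (4): d = f * L / P (similar-triangles / pinhole model).
    # Names and the pixel-unit focal length are assumptions, not taken from the patent.

    IRIS_DIAMETER_MM = 11.75  # empirical real iris diameter L, about 11.75 +/- 0.5 mm

    def estimate_eye_distance_mm(iris_diameter_px: float, focal_length_px: float,
                                 iris_diameter_mm: float = IRIS_DIAMETER_MM) -> float:
        """Estimate the real distance d from the human eye to the acquisition device."""
        if iris_diameter_px <= 0:
            raise ValueError("iris diameter measured in the image must be positive")
        return focal_length_px * iris_diameter_mm / iris_diameter_px

    # Example: a focal length of 1200 px and a 45 px iris diameter give d of roughly 313 mm.
    d = estimate_eye_distance_mm(iris_diameter_px=45.0, focal_length_px=1200.0)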
In the second step, the internal reference matrix C, the rotation matrix R and the translation matrix T of the image or video acquisition device are acquired from the calibration parameters of the device at installation, and the real position information (x, y, z) of the human eyes in the three-dimensional coordinate system is calculated through the following formula (5):
s·[u, v, 1]^T = C·(R·[x, y, z]^T + T) (5), where s is a scale factor corresponding to the depth of the eye in the camera coordinate system.
In the formula (5), the internal parameter matrix C is an internal parameter of the image or video capturing device, and the internal parameter includes a focal length of the image or video capturing device, and the like; the rotation matrix R and the translation matrix T are external parameters of the image or video capture device.
The real position information (x, y, z) of the human eye is calculated according to formula (5), where z represents the height value of the human eye.
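A minimal sketch of how formula (5) can be inverted to recover the eye position is shown below; it assumes the eye-to-camera distance from formula (4) is used as the depth scale and that NumPy handles the matrix algebra. The helper name and this choice of depth are assumptions for illustration.

    import numpy as np

    def eye_position_from_pixel(u: float, v: float, depth: float,
                                C: np.ndarray, R: np.ndarray, T: np.ndarray) -> np.ndarray:
        """Recover the real eye position (x, y, z) from pixel coordinates (u, v).

        Inverts s * [u, v, 1]^T = C (R [x, y, z]^T + T), taking `depth` as the scale s
        (e.g. the eye-to-camera distance d obtained from formula (4)).
        C is the 3x3 internal parameter matrix; R (3x3) and T (length-3 vector) are the
        external parameters of the acquisition device.
        """
        pixel_h = np.array([u, v, 1.0])
        cam_point = depth * np.linalg.inv(C) @ pixel_h    # point in camera coordinates
        world_point = np.linalg.inv(R) @ (cam_point - T)  # back to the vehicle frame
        return world_point                                # world_point[2] is the eye height z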
Based on the formula (4) and the formula (5), a height change value of human eyes, namely a first height change value, is further obtained according to the multiple frames of face images which are respectively obtained, and then an ideal height change value of a projection image surface projected by the HUD is obtained. For example, the human eye height value z1 corresponding to the time t1 and the human eye height value z2 corresponding to the time t2 are calculated by using the above formula (4) and formula (5), and then the corresponding human eye height variation value is z2-z1, that is, the first height variation value is z2-z1, and the ideal height variation value of the projection image plane projected by the corresponding HUD is also z2-z1.
In an optional manner, the eye height variation value may be obtained by comparing a real-time acquired eye height value with a preset eye height value, and the preset eye height value may be a pre-calculated eye height value of a vehicle driver during an initial configuration process of the HUD in the vehicle. For example, when the projection image plane of the HUD is normally located at the preset position, in the process of performing initialization configuration on the HUD in the vehicle, the calculated human eye height value of the vehicle driver is z3, and the corresponding human eye height value z1 is calculated by using the above formula (4) and formula (5) after the vehicle passes through the deceleration strip, and then the corresponding human eye height change value is z3-z1.
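A small hedged sketch of the two alternatives just described (differencing the eye heights computed at two times, or comparing against the eye height stored during the initial HUD configuration); the function name is illustrative.

    def first_height_change(eye_height_now: float, eye_height_ref: float) -> float:
        """First height change value, i.e. the ideal height change of the projection image plane.

        `eye_height_ref` is either the eye height computed at an earlier time t1, or the
        preset eye height recorded during the initial configuration of the HUD in the vehicle.
        """
        return eye_height_now - eye_height_ref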
S103, determining a second height change value of the projection image plane according to the three-axis angle change values.
The second height change value is the change in the height of the projection image plane caused by the vibration of the vehicle.
In a possible implementation, the three-axis angle change value includes a pitch angle change value, and the determining the second height change value of the projected image plane according to the three-axis angle change value includes: and determining a second height change value of the projection image plane according to the pitch angle change value and a preset second distance.
The pitch angle value is the angle of rotation of the HUD around the Y-axis and can be obtained by the ECU in the vehicle reading the IMU data; the preset second distance refers to the horizontal distance between the HUD and the projection image plane, and it can be acquired in ways including, but not limited to, directly reading the HUD configuration parameters. The Y-axis is parallel to the ground and perpendicular to the X-axis, and the X-axis direction is the same as the driving direction of the vehicle.
As shown in fig. 3, the projection image plane 1 is a projection image plane projected by the HUD in a case where no shake event occurs in the HUD, and h1 represents a height value of the projection image plane 1; as shown in fig. 4, the projection image plane 2 is a projection image plane projected by the HUD when the HUD is in a shake event, h2 represents a height value of the projection image plane 2, and the projection image plane height variation value Δ h can be calculated by the following formula (6):
Δh=d1×tanα (6)
In the above formula (6), d1 represents the horizontal distance between the HUD and the projection image plane; α represents the pitch angle value of the HUD, and the real height change value of the projection image plane is h2 - h1, i.e., Δh.
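Formula (6) translates directly into code; the sketch below assumes the pitch angle change is supplied in degrees and the distance d1 in metres, neither of which the patent specifies.

    import math

    def second_height_change(pitch_change_deg: float, hud_to_plane_distance_m: float) -> float:
        """Second height change value of the projection image plane, formula (6): dh = d1 * tan(alpha)."""
        return hud_to_plane_distance_m * math.tan(math.radians(pitch_change_deg))

    # Example: a 0.5 degree pitch change with d1 = 2.5 m shifts the image plane by about 0.022 m.
    dh = second_height_change(0.5, 2.5)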
And S104, adjusting the projection image plane according to the first height change value and the second height change value.
In the embodiment of the present application, the specific process of adjusting the HUD includes: according to the height change value Δh of the projection image plane and the ideal height change value ΔH_ideal of the projection image plane, the compensation value B_HUD is calculated by the following formula (7):
B_HUD = ΔH_ideal - Δh (7)
In practical application, the HUD can be controlled to rotate by a corresponding angle value by using a motor in a vehicle or an external motor to realize adjustment of the projection image plane.
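The compensation step of formula (7) and the subsequent motor adjustment can be sketched as follows; the `MotorController` interface is hypothetical, since the patent only states that a motor rotates the HUD by a corresponding angle.

    def compensation_value(ideal_height_change: float, actual_height_change: float) -> float:
        """Formula (7): B_HUD = dH_ideal - dh."""
        return ideal_height_change - actual_height_change

    class MotorController:
        """Hypothetical stand-in for the in-vehicle or external motor driving the HUD."""

        def rotate_hud(self, compensation: float) -> None:
            # Convert the height compensation into a rotation command for the HUD module.
            print(f"rotate HUD to compensate {compensation:.4f} m of image-plane offset")

    b_hud = compensation_value(ideal_height_change=0.030, actual_height_change=0.022)
    MotorController().rotate_hud(b_hud)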
In a possible implementation, fig. 5 shows a flowchart for determining a HUD shaking event. Referring to fig. 5, when the three-axis angle change value of the HUD in the vehicle is greater than a first preset threshold, it is determined whether the height change value of the human eyes is greater than a second preset threshold; if so, it is determined that a HUD shaking event has occurred, and the projection image plane adjusting method provided in the embodiment of the present application is then executed.
It should be understood that the three-axis angle change value of the HUD in the vehicle may be the difference between multiple three-axis angle values acquired within a preset time. The three-axis angle values acquired within the preset time can be obtained by reading the data of the IMU sensor in the HUD multiple times; for example, if the three-axis angle value is w1 at time t1 and w2 at time t2, with t1 less than t2, the three-axis angle change value is w2 - w1.
The three-axis angle change value of the HUD in the vehicle may also be a difference between a three-axis angle value of the HUD acquired when the vehicle vibrates and a preset three-axis angle value. The preset triaxial angle value may be pre-stored in a database, or may be obtained through multiple measurements. For example, if the three-axis angle value obtained by reading the data of the IMU sensor in the HUD is w1 in real time, and the three-axis angle value stored in the database is w3 when there is no jitter in the HUD, the three-axis angle change value is w1-w3.
The first preset threshold and the second preset threshold may be determined according to actual application requirements, which is not limited in the present application.
Optionally, if the three-axis angle variation value of the HUD in the vehicle is greater than a first preset threshold, and the height variation value of human eyes is less than or equal to a second preset threshold, or the three-axis angle variation value of the HUD is less than or equal to the first preset threshold, it is determined that the HUD does not have a shaking event.
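The decision flow of fig. 5 can be summarized in the following hedged sketch; the threshold values are placeholders rather than values taken from the patent, and the three-axis angle change is reduced to a single magnitude for brevity.

    def hud_shake_detected(angle_change: float, eye_height_change: float,
                           angle_threshold: float = 0.3, height_threshold: float = 0.005) -> bool:
        """Return True if a HUD shaking event is determined to have occurred (fig. 5).

        The shake event is confirmed only when the three-axis angle change exceeds the
        first preset threshold AND the eye height change exceeds the second preset threshold.
        """
        if angle_change <= angle_threshold:
            return False  # HUD has not rotated significantly: no shake event
        if eye_height_change <= height_threshold:
            return False  # possibly an IMU random walk error rather than a real shake
        return True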
A three-axis angle change value of the HUD greater than the first preset threshold may also be caused by a random walk error of the IMU sensor; in that case, the IMU input data can be zeroed to correct the IMU measurement data and mitigate the random walk error of the IMU sensor.
Based on the matching relationship between the actual height change value of the human eyes and the ideal height change value of the projection image plane, the embodiment of the present application uses the actual height change value of the projection image plane and the ideal height change value of the projection image plane to adjust the HUD so that the image plane projected by the HUD matches the driver's eyes. The driver can therefore see a relatively stable HUD projection image plane when the vehicle vibrates, which effectively solves the problem of the HUD projection image plane position change caused by vehicle vibration and ensures the driver's safe driving.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 6 shows a block diagram of a projection image plane adjusting apparatus provided in the embodiment of the present application, corresponding to the projection image plane adjusting method described in the above embodiment, and only the relevant parts to the embodiment of the present application are shown for convenience of description.
Referring to fig. 6, the projection image plane adjusting apparatus 200 is applied to a HUD, and the projection image plane adjusting apparatus 200 includes:
an acquiring unit 201, configured to acquire a three-axis angle change value of the HUD and a multi-frame face image of the driver;
a first determining unit 202, configured to determine, according to multiple frames of face images, a first height variation value of a projection image plane matched with the eye position in the face images;
the second determining unit 203 is configured to determine a second height variation value of the projection image plane according to the three-axis angle variation value;
and the adjusting unit 204 is used for adjusting the projection image plane according to the first height change value and the second height change value.
In one possible implementation manner, the second determining unit 203 includes:
the determination module is used for determining a second height change value of the projection image plane according to the three-axis angle change value when the judgment module judges that the three-axis angle change value is larger than a first preset threshold value and the first height change value is larger than a second preset threshold value.
In one possible implementation, the first determining unit 202 includes:
the acquisition module is used for respectively acquiring the diameters of the irises of the human eyes in the multi-frame human face images;
the first determining module is used for determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between human eyes and equipment for collecting multiple frames of human face images;
and the second determining module is used for determining a first height change value according to the first distance and a second preset parameter, wherein the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment.
In a possible implementation manner, the three-axis angle change value includes a pitch angle change value, and the second determining unit 203 is configured to determine a second height change value of the projection image plane according to the pitch angle change value and a preset second distance; the preset second distance is a horizontal distance between the HUD and the projection image plane.
In a possible implementation manner, the adjusting unit 204 is configured to determine a compensation value according to the first height variation value and the second height variation value; and controlling the HUD by using a motor according to the compensation value to adjust the height of the projection image surface.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
Based on the same inventive concept, the embodiment of the present application further provides a terminal device, and the terminal device 300 is shown in fig. 7.
As shown in fig. 7, the terminal device 300 of this embodiment includes: a processor 301, a memory 302, and a computer program 303 stored in the memory 302 and operable on the processor 301. The computer program 303 may be executed by the processor 301 to generate instructions, and the processor 301 may implement the steps in the embodiments of the projection image plane adjusting method according to the instructions. Alternatively, the processor 301 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 303.
Illustratively, the computer program 303 may be partitioned into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 303 in the terminal device 300.
Those skilled in the art will appreciate that fig. 7 is merely an example of the terminal device 300 and does not constitute a limitation of the terminal device 300 and may include more or less components than those shown, or combine certain components, or different components, for example, the terminal device 300 may further include input-output devices, network access devices, buses, etc.
The Processor 301 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 302 may be an internal storage unit of the terminal device 300, such as a hard disk or a memory of the terminal device 300. The memory 302 may also be an external storage device of the terminal device 300, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal device 300. Further, the memory 302 may also include both an internal storage unit of the terminal apparatus 300 and an external storage device. The memory 302 is used to store computer programs and other programs and data required by the terminal device 300. The memory 302 may also be used to temporarily store data that has been output or is to be output.
The terminal device provided in this embodiment may execute the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Embodiments of the present application further provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method of the above-mentioned method embodiments.
The embodiment of the present application further provides a computer program product, which when running on a terminal device, enables the terminal device to implement the method of the above method embodiment when executed.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable storage medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal device, recording medium, computer Memory, read-Only Memory (ROM), random Access Memory (RAM), electrical carrier signal, telecommunications signal and software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc.
Reference throughout this application to "one embodiment" or "some embodiments," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
In addition, in the present application, unless otherwise explicitly specified or limited, the terms "connected" and the like are to be construed broadly, e.g., as meaning either mechanically or electrically connected; they may denote a direct connection or an indirect connection through an intermediate medium, or communication between the interiors of two elements or interaction between the two elements. The specific meanings of these terms in the present application can be understood by those skilled in the art according to the specific circumstances.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (7)

1. A method for adjusting a projection image plane is characterized by comprising the following steps:
acquiring a three-axis angle change value of the projection equipment and a multi-frame face image of a driver;
respectively acquiring the diameters of the human irises in the multiple frames of human face images;
determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between the human eyes and equipment for collecting the multiple frames of human face images;
determining a first height change value according to the first distance and a second preset parameter, wherein the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment;
determining a second height change value of the projection image plane according to a pitch angle change value and a preset second distance included in the three-axis angle change value; the preset second distance is a horizontal distance between the projection equipment and the projection image surface;
and determining a compensation value according to the first height change value and the second height change value, and adjusting the projection image plane according to the compensation value.
2. The method of claim 1, wherein determining a second elevation change value of the projected image plane from the three-axis angle change values comprises:
and if the three-axis angle change value is greater than a first preset threshold value and the first height change value is greater than a second preset threshold value, determining a second height change value of the projection image plane according to the three-axis angle change value.
3. The method of claim 1, wherein said adjusting the projected image plane according to the compensation value comprises:
and controlling the projection equipment by using a motor according to the compensation value to adjust the height of the projection image surface.
4. A projection image plane adjusting apparatus, characterized in that the apparatus comprises:
the acquisition unit is used for acquiring a three-axis angle change value of the projection equipment and a multi-frame face image of a driver;
the first determining unit is used for respectively acquiring the diameters of the irises of the human eyes in the plurality of frames of human face images; determining a first distance according to the diameter and a first preset parameter, wherein the first distance is an actual distance between the human eyes and equipment for collecting the multiple frames of human face images; determining a first height change value according to the first distance and a second preset parameter, wherein the first preset parameter is the focal length of the equipment, and the second preset parameter is an internal parameter matrix and an external parameter matrix of the equipment;
the second determining unit is used for determining a second height change value of the projection image plane according to a pitch angle change value and a preset second distance included in the three-axis angle change value; the preset second distance is a horizontal distance between the projection equipment and the projection image surface;
and the adjusting unit is used for determining a compensation value according to the first height change value and the second height change value and adjusting the projection image plane according to the compensation value.
5. The apparatus according to claim 4, wherein the second determining unit comprises:
the determination module is used for determining a second height change value of the projection image plane according to the three-axis angle change value when the judgment module judges that the three-axis angle change value is larger than a first preset threshold value and the first height change value is larger than a second preset threshold value.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 3 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 3.
CN202210451546.3A 2022-04-27 2022-04-27 Projection image plane adjusting method, device, equipment and storage medium Active CN114821723B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210451546.3A CN114821723B (en) 2022-04-27 2022-04-27 Projection image plane adjusting method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210451546.3A CN114821723B (en) 2022-04-27 2022-04-27 Projection image plane adjusting method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114821723A CN114821723A (en) 2022-07-29
CN114821723B true CN114821723B (en) 2023-04-18

Family

ID=82509514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210451546.3A Active CN114821723B (en) 2022-04-27 2022-04-27 Projection image plane adjusting method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114821723B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114258319A (en) * 2021-05-18 2022-03-29 华为技术有限公司 Projection method and device, vehicle and AR-HUD

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019224922A1 (en) * 2018-05-22 2019-11-28 三菱電機株式会社 Head-up display control device, head-up display system, and head-up display control method
CN112946889A (en) * 2019-12-11 2021-06-11 未来(北京)黑科技有限公司 Head-up display imaging system, vehicle and moving mechanism control method and device
CN112114427A (en) * 2020-09-08 2020-12-22 中国第一汽车股份有限公司 HUD projection height adjusting method, device and equipment and vehicle
CN113240592A (en) * 2021-04-14 2021-08-10 重庆利龙科技产业(集团)有限公司 Distortion correction method for calculating virtual image plane based on AR-HUD dynamic eye position

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114258319A (en) * 2021-05-18 2022-03-29 华为技术有限公司 Projection method and device, vehicle and AR-HUD

Also Published As

Publication number Publication date
CN114821723A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
CN110057352B (en) Camera attitude angle determination method and device
EP2988169B1 (en) Imaging device and image shake correction method
JP6924251B2 (en) Methods and devices for calibrating extrinsic parameters of image sensors
CN109941277A (en) The method, apparatus and vehicle of display automobile pillar A blind image
CN111800589B (en) Image processing method, device and system and robot
CN113112413B (en) Image generation method, image generation device and vehicle-mounted head-up display system
CN109345591A (en) A kind of vehicle itself attitude detecting method and device
US10997737B2 (en) Method and system for aligning image data from a vehicle camera
DE102018201509A1 (en) Method and device for operating a display system with data glasses
CN111279354A (en) Image processing method, apparatus and computer-readable storage medium
CN112706755B (en) Vehicle-mounted camera adjusting method and device
CN109922267A (en) Image stabilization processing method, computer installation and computer readable storage medium based on gyro data
JP6669182B2 (en) Occupant monitoring device
US20190080181A1 (en) Method for detecting a rolling shutter effect in images of an environmental region of a motor vehicle, computing device, driver assistance system as well as motor vehicle
US11315276B2 (en) System and method for dynamic stereoscopic calibration
CN112904996B (en) Picture compensation method and device for vehicle-mounted head-up display equipment, storage medium and terminal
CN114821723B (en) Projection image plane adjusting method, device, equipment and storage medium
CN110497852A (en) Vehicle-mounted vidicon and camera chain
CN116442707B (en) System and method for estimating vertical and pitching motion information of vehicle body based on binocular vision
JP2018136739A (en) Calibration device
CN111854788B (en) AR Navigation Compensation System Based on Inertial Measurement Unit
JP2002005626A (en) Position detector
CN114754779B (en) Positioning and mapping method and device and electronic equipment
CN113379832B (en) Camera pose adjusting method and device, electronic equipment and storage medium
JP6950597B2 (en) On-board unit, driving evaluation system, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant