CN113741035B - HUD display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113741035B
Authority
CN
China
Prior art keywords
hud
projection position
information
target
driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111039980.2A
Other languages
Chinese (zh)
Other versions
CN113741035A (en)
Inventor
刘玉红
王雪丰
薛亚冲
武玉龙
王晨如
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202111039980.2A priority Critical patent/CN113741035B/en
Publication of CN113741035A publication Critical patent/CN113741035A/en
Application granted granted Critical
Publication of CN113741035B publication Critical patent/CN113741035B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/0169Supporting or connecting means other than the external walls

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the present application provide a HUD display method and apparatus, an electronic device, and a computer-readable storage medium. The method comprises: acquiring current pose information of a driver; determining a target projection position of the HUD according to the current pose information; and adjusting the projection position of the HUD to the target projection position. In the embodiments of the present application, the projection position of the HUD is adjusted automatically as the driver's current pose information changes, making the adjustment more intelligent and convenient and requiring no manual adjustment by the user. At the same time, because the projection position of the HUD is adjusted automatically according to the driver's current pose, the driver's attention is not diverted, and driving safety is ensured.

Description

HUD display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the technical field of intelligent automobile driving, and in particular to a HUD display method, a HUD display apparatus, an electronic device and a storage medium.
Background
A head-up display (HUD) is a virtual display system now used in vehicles. Through a dedicated optical system, it projects important driving information, such as speed and navigation, onto the windshield in front of the driver, so that the driver can see this important driving information without lowering or turning the head as far as possible, which improves driving safety.
At present, three ways of adjusting the HUD exist on the market. The first is manual adjustment, in which the HUD is moved to the target projection position by manually adjusting the projection device corresponding to the HUD. The second is semi-automatic adjustment, in which the HUD is moved to the target projection position via an on-vehicle button. The third is voice adjustment, in which the position of the HUD is changed through voice control so as to move the HUD to the target projection position.
All three adjustment modes can adjust the HUD, but none of them is flexible. The driver's sitting posture changes constantly, and the optimal viewing angle changes whenever the sitting posture changes. If the position of the HUD is to be adjusted every time the sitting posture changes, the driver has to make manual judgments and adjustments many times, which is extremely inflexible; moreover, the adjusted HUD position is not necessarily the most suitable one, and in some cases the driver may be distracted, leading to safety accidents.
Disclosure of Invention
In view of the defects of the prior art, the present application provides a HUD display method, a HUD display apparatus, an electronic device and a storage medium, which are used to solve the technical problem in the prior art that adjustment of the projection position of the HUD is inflexible.
In a first aspect, an embodiment of the present application provides a method for displaying a HUD, including:
Acquiring current pose information of a driver;
determining a target projection position of the HUD according to the current pose information;
the projection position of the HUD is adjusted to the target projection position.
In one possible implementation, the method further includes the steps of:
acquiring face information of a driver;
determining a target projection position of the HUD according to the current pose information comprises the following steps:
if the face information exists in the preset database, personalized information corresponding to the face information is obtained from the preset database; the personalized information comprises at least one pose information of the face information and a target projection position of the HUD corresponding to the pose information;
if the current pose information exists in the personalized information, acquiring the target projection position of the HUD corresponding to the current pose information from the personalized information.
In one possible implementation, determining the target projection position of the HUD according to the current pose information further includes:
if the face information is determined not to exist in the preset database, determining the horizontal sight line of the driver according to the current pose information;
determining a target sight line range according to the horizontal sight line and a preset angle range;
the position of the target line-of-sight range on the projection background is taken as the target projection position of the HUD.
In one possible implementation, after the personalized information corresponding to the face information is obtained from the preset database, the method further includes:
if the personalized information does not contain the current pose information, determining the horizontal sight of the driver according to the current pose information;
determining a target sight line range according to the horizontal sight line and a preset angle range;
the position of the target line-of-sight range on the projection background is taken as the target projection position of the HUD.
In one possible implementation, after the position of the target sight line range on the projection background is taken as the target projection position of the HUD, the method further includes:
personalized information corresponding to the face information is newly added in a preset database, and the personalized information comprises current pose information and a target projection position of the HUD corresponding to the current pose information.
In one possible implementation, after the face information of the driver is acquired, the method further includes:
and if it is determined according to the face information that the driver is in a fatigue driving state, displaying warning information through the HUD.
In one possible implementation, the method further includes:
and if the face information of the driver is not detected, performing screen-off processing on the HUD.
In a second aspect, an embodiment of the present application provides a display device of a HUD, including:
The acquisition module is used for acquiring the current pose information of the driver;
the target projection position determining module is used for determining the target projection position of the HUD according to the current pose information;
and the adjusting module is used for adjusting the projection position of the HUD to the target projection position.
In one possible implementation, the obtaining module further includes:
the face information acquisition sub-module is used for acquiring face information of a driver;
a target projection position determination module comprising:
the personalized information acquisition sub-module is used for acquiring personalized information corresponding to the face information from a preset database if the face information exists in the preset database; the personalized information comprises at least one pose information of the face information and a target projection position of the HUD corresponding to the pose information;
the target projection position determining first sub-module is used for acquiring the target projection position of the HUD corresponding to the current pose information from the personalized information if the current pose information exists in the personalized information.
In one possible implementation, the target projection position determining module further includes:
the horizontal sight line determining sub-module is used for determining the horizontal sight line of the driver according to the current pose information if the face information is determined not to exist in the preset database;
The target sight line range determining sub-module is used for determining a target sight line range according to the horizontal sight line and a preset angle range;
the target projection position determination second sub-module is used for taking the position of the target sight range on the projection background as the target projection position of the HUD.
In one possible implementation, the target projection position determining module further includes:
the horizontal sight line determining sub-module is used for determining the horizontal sight line of the driver according to the current pose information if the current pose information does not exist in the personalized information;
the target sight range determining submodule is used for determining a target sight range according to the horizontal sight and a preset angle range;
the target projection position determination second sub-module is used for taking the position of the target sight range on the projection background as the target projection position of the HUD.
In one possible implementation, the apparatus further includes:
the new adding module is used for adding personalized information corresponding to the face information in a preset database, wherein the personalized information comprises current pose information and a target projection position of the HUD corresponding to the current pose information.
In one possible implementation, the apparatus further includes:
and the warning module is used for displaying warning information through the HUD if the driver is determined to be in the fatigue driving state according to the face information.
In one possible implementation, the apparatus further includes:
and the screen-off module is used for performing screen-off processing on the HUD if the face information of the driver is not detected.
In a third aspect, an embodiment of the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method provided in the first aspect when executing the program.
In a fourth aspect, embodiments of the present application provide a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method as provided by the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising computer instructions stored in a computer readable storage medium, which when read from the computer readable storage medium by a processor of a computer device, cause the computer device to perform the steps of the method as provided in the first aspect.
The technical solution provided by the embodiments of the present application has the following beneficial technical effects: the current pose information of the driver is acquired; the target projection position of the HUD is determined according to the current pose information; and the projection position of the HUD is adjusted to the target projection position. Because the projection position of the HUD is adjusted automatically as the driver's current pose information changes, the adjustment is more intelligent and convenient and requires no manual adjustment by the user. At the same time, because the projection position of the HUD is adjusted automatically according to the driver's current pose, the driver's attention is not diverted, and driving safety is ensured.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flow chart of a HUD display method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of determining a target projection position of a HUD according to current pose information according to an embodiment of the present application;
fig. 3 is a schematic diagram of a relationship among a camera, pose information of a driver, a projection device and a windshield according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a process for determining a target projection position according to current pose information according to an embodiment of the present application;
FIG. 5 is a flow chart of HUD operation according to an embodiment of the present application;
fig. 6 is a schematic frame diagram of a structure of a display device of a HUD according to an embodiment of the present application;
fig. 7 is a schematic frame diagram of a structure of an electronic device according to an embodiment of the present application.
Detailed Description
The present application is described in detail below, and examples of its embodiments are illustrated in the accompanying drawings, in which the same or similar reference numerals denote the same or similar components, or components having the same or similar functions, throughout. In addition, detailed descriptions of well-known technologies are omitted where they are unnecessary for the features of the present application being illustrated. The embodiments described below with reference to the drawings are illustrative only and are not to be construed as limiting the application.
It will be understood by those skilled in the art that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
First, several terms related to the present application are described and explained:
pose is a relative concept that refers to displacement and rotation transformations between two coordinate systems.
A pose can be a camera pose or an object pose; the two are similar, differing in which coordinate system is transformed into the camera coordinate system. The camera pose refers to the translation and rotation of the camera coordinate system, at the moment the current image is captured, relative to a world coordinate system; the world coordinate system can be defined at any position, or can coincide with the current camera coordinate system. The object pose refers to the translation and rotation of the camera coordinate system, at the moment the current image is captured, relative to the world system in which the original object is located; the original object can be placed at any position in the world system, and the object's center of gravity and orientation are generally aligned with the world system.
The inventor of the present application has found through research that a head-up display (HUD) is a virtual display system now used in vehicles. Through a dedicated optical system, it projects important driving information, such as speed and navigation, onto the windshield in front of the driver, so that the driver can see this important driving information without lowering or turning the head as far as possible, which improves driving safety.
At present, three ways of adjusting the HUD exist on the market. The first is manual adjustment, in which the HUD is moved to the target projection position by manually adjusting the projection device corresponding to the HUD. The second is semi-automatic adjustment, in which the HUD is moved to the target projection position via an on-vehicle button. The third is voice adjustment, in which the position of the HUD is changed through voice control so as to move the HUD to the target projection position.
All three adjustment modes can adjust the HUD, but none of them is flexible. The driver's sitting posture changes constantly, and the optimal viewing angle changes whenever the sitting posture changes. If the position of the HUD is to be adjusted every time the sitting posture changes, the driver has to make manual judgments and adjustments many times, which is extremely inflexible; moreover, the adjusted HUD position is not necessarily the most suitable one, and in some cases the driver may be distracted, leading to safety accidents.
The application provides a HUD display method, a HUD display device, electronic equipment and a storage medium, and aims to solve the technical problems in the prior art.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments.
The embodiment of the application provides a HUD display method, a flow diagram of which is shown in figure 1, comprising the following steps:
step S101, current pose information of the driver is acquired.
The current pose information of the driver can be the pose of the driver's head at the current moment, the pose of the driver's eyes at the current moment, or the pose of the center between the driver's two eyes at the current moment.
In the embodiments of the present application, the current pose information of the driver is reflected by the pose of the camera when it captures the driver's head (or the eyes, or the center between the two eyes). The current pose information can be represented by the absolute pose of the camera in a scene, such as 6DOF (Degrees of Freedom) pose data comprising 3 position coordinates and 3 rotation angles, or by the relative pose of the camera in a scene with respect to a reference pose (such as the origin of the world coordinate system of a target scene), such as a rotation matrix and a translation vector.
In the embodiments of the present application, the current pose information of the driver is acquired in real time through the camera, thereby tracking the driver's pose information. The camera can capture images containing the driver in real time, and the current pose information of the driver is derived from these images through an existing algorithm; the embodiments of the present application do not limit how the current pose information of the driver is specifically determined.
An infrared camera is preferably used as the camera, so that the current pose information of the driver can also be acquired at night.
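The 6DOF pose representation described above (3 position coordinates plus 3 rotation angles) can be sketched as follows. This is a minimal illustration, not part of the patent: the class name, field names and sample values are all assumptions.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Pose6DOF:
    """A 6DOF pose: 3 position coordinates and 3 rotation angles."""
    x: float
    y: float
    z: float
    alpha: float  # rotation about the x-axis, degrees
    beta: float   # rotation about the y-axis, degrees
    gamma: float  # rotation about the z-axis, degrees

    def as_tuple(self):
        return (self.x, self.y, self.z, self.alpha, self.beta, self.gamma)


# Example: one driver head pose sampled from the camera feed (values invented)
current_pose = Pose6DOF(0.0, 1.2, 0.5, 0.0, 2.5, 0.0)
```

A relative-pose representation (rotation matrix plus translation vector) would carry the same information in a different form; the tuple view above is simply convenient for use as a lookup key later.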
Step S102, determining the target projection position of the HUD according to the current pose information.
The target projection position of the HUD in the embodiments of the present application refers to a position at which the driver can see the various information on the HUD without lowering or turning the head, i.e. the position most favorable for the driver's viewing.
After the current pose information of the driver is determined, the embodiment of the application determines the target projection position of the HUD according to the current pose information, as detailed below.
It should be emphasized that different drivers have different pose information, and a change in the sitting position of the driver may also result in a change in the current pose information of the driver, so that the target projection position of the HUD is continuously changed.
Step S103, adjusting the projection position of the HUD to the target projection position.
The projection position of the HUD in the embodiments of the present application may fall on the windshield, that is, the HUD is projected onto the windshield. The projection position of the HUD has an initial projection position, i.e. a default projection position, which may be set, for example, on the windshield 2 m directly in front of the driver's seat, i.e. the initial projection position is (2 m, 0°). A target projection position records the amount of change relative to the initial projection position; for example, a target projection position of (Y'1, η1) indicates that the retraction distance of the target projection position with respect to the initial projection position is Y'1 and its rotation angle is η1.
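The (retraction distance, rotation angle) encoding relative to the initial position can be sketched as a small value type. This is a hedged illustration: the type name, field names, helper and numbers are assumptions, not the patent's representation.

```python
from collections import namedtuple

# A target projection position stored as a delta from the initial (default)
# projection position: a retraction distance Y' and a rotation angle eta.
TargetPosition = namedtuple("TargetPosition", ["retraction_m", "rotation_deg"])

# Default position: windshield 2 m directly ahead, no tilt -> zero delta
INITIAL = TargetPosition(retraction_m=0.0, rotation_deg=0.0)


def resolve(delta, base=INITIAL):
    """Compose a stored delta with the base position (hypothetical helper)."""
    return TargetPosition(base.retraction_m + delta.retraction_m,
                          base.rotation_deg + delta.rotation_deg)


# e.g. a stored entry (Y'1, eta1) = (0.12 m, 4.5 degrees)
target = resolve(TargetPosition(0.12, 4.5))
```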
After the target projection position of the HUD is determined, the projection angle of the projection device corresponding to the HUD is converted according to the target projection position and the position of the projection device corresponding to the HUD, and the projection position of the HUD is adjusted to the target projection position by automatically adjusting the projection angle of the projection device corresponding to the HUD.
The above-mentioned conversion of the projection angle of the projection device corresponding to the HUD, based on the target projection position and the position of the projection device, can be performed with an existing conversion method, which the embodiments of the present application will not describe in detail here.
In the embodiments of the present application, the current pose information of the driver is acquired; the target projection position of the HUD is determined according to the current pose information; and the projection position of the HUD is adjusted to the target projection position. Because the projection position of the HUD is adjusted automatically as the driver's current pose information changes, the adjustment is more intelligent and convenient and requires no manual adjustment by the user. At the same time, because the projection position of the HUD is adjusted automatically according to the driver's current pose, the driver's attention is not diverted, and driving safety is ensured.
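The three steps S101-S103 can be summarized in a minimal control-flow sketch. The callables stand in for the camera, the position-determination logic and the projection device; every name and value here is hypothetical.

```python
def update_hud(get_pose, determine_target, adjust):
    """One pass of the three-step method of Fig. 1."""
    pose = get_pose()                  # step S101: current pose information
    target = determine_target(pose)   # step S102: target projection position
    adjust(target)                     # step S103: move the HUD to the target
    return target


# Minimal stand-ins to illustrate the control flow
moves = []
result = update_hud(
    get_pose=lambda: (0.0, 1.2, 0.5, 0.0, 2.5, 0.0),   # fake camera sample
    determine_target=lambda pose: (0.12, 4.5),         # fake (Y', eta) result
    adjust=moves.append,                               # fake projection device
)
```

In a real system this loop would run continuously so that the projection position tracks the driver's pose as it changes.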
The embodiment of the application provides a possible implementation manner, and the method for acquiring the current pose information of the driver comprises the following steps:
acquiring face information of a driver;
the face information of the embodiment of the application can be a plurality of face images acquired in advance, such as a depth image, a color image or an infrared image, and a plurality of face images of the driver can be acquired for the same driver, so that the face information of the driver can be comprehensively analyzed.
Determining a target projection position of the HUD according to the current pose information comprises the following steps:
if the face information exists in the preset database, personalized information corresponding to the face information is obtained from the preset database, wherein the personalized information comprises at least one pose information of the face information and a target projection position of the HUD corresponding to the pose information.
The face information in the preset database can be stored in the form of images, and for the same face information, several types of face images, such as depth images, color images or infrared images, can be acquired. Before the current pose information of the driver is obtained, the face information of the driver is also collected through the camera, and an existing face recognition technology is used to judge whether the face information exists in the preset database. If the face information exists in the preset database, the personalized information corresponding to the face information is obtained from the preset database; the personalized information comprises pose information of the face information and the target projection position of the HUD corresponding to each piece of pose information.
The personalized information stores at least one piece of pose information of the corresponding face information, together with the target projection position corresponding to each piece of pose information. The pose information in the preset database can be represented by 6DOF pose data, and a target projection position can be represented by a retraction distance and a rotation angle relative to the initial projection position.
Table 1 exemplarily shows the face information stored in a preset database and the personalized information for each piece of face information:
TABLE 1

Face information                              Pose information                    Target projection position
Driver 1 (image 1-1, image 1-2, image 1-3)    (x1, y1, z1, α1, β1, γ1)            (Y'1, η1)
                                              (x1', y1', z1', α1', β1', γ1')      (Y'1', η1')
Driver 2 (image 2-1, image 2-2, image 2-3)    (x2, y2, z2, α2, β2, γ2)            (Y'2, η2)
                                              (x2', y2', z2', α2', β2', γ2')      (Y'2', η2')
Driver 3 (image 3-1, image 3-2, image 3-3)    (x3, y3, z3, α3, β3, γ3)            (Y'3, η3)
                                              (x3', y3', z3', α3', β3', γ3')      (Y'3', η3')
As shown in Table 1, xi, xi' and xi″ each represent an x-axis coordinate; yi, yi' and yi″ each represent a y-axis coordinate; zi, zi' and zi″ each represent a z-axis coordinate; αi, αi' and αi″ each represent a rotation angle about the x-axis; βi, βi' and βi″ each represent a rotation angle about the y-axis; γi, γi' and γi″ each represent a rotation angle about the z-axis; Y'i and Y'i' each represent a retraction distance of the target projection position, and ηi and ηi' each represent a rotation angle of the target projection position, where i is a positive integer.
As shown in Table 1, the preset database holds the face information of 3 drivers and the personalized information corresponding to each piece of face information; the personalized information includes pose information and the target projection position corresponding to each piece of pose information. For example, the face information of driver 1 includes image 1-1, image 1-2 and image 1-3; the target projection position corresponding to the pose information (x1, y1, z1, α1, β1, γ1) of driver 1 is (Y'1, η1), and the target projection position corresponding to the pose information (x1', y1', z1', α1', β1', γ1') of driver 1 is (Y'1', η1').
If the current pose information exists in the personalized information, acquiring a target projection position corresponding to the current pose information from the personalized information.
After the personalized information corresponding to the face information is determined, whether the personalized information contains the current pose information is judged; if it does, the target projection position corresponding to the current pose information is obtained directly from the personalized information.
According to the embodiment of the application, after the face information is determined to exist in the preset database, the personalized information of the face information is directly called, and if the personalized information corresponding to the face information also exists the current pose information, the target projection position corresponding to the current pose information can be directly called, so that the target projection position corresponding to the current pose information can be quickly matched.
Continuing the above example: suppose that after the face information of the driver is acquired, it is found to exist in the preset database and to match the face information of driver 2. The personalized information corresponding to the face information can then be obtained directly from the preset database. If the current pose information is (x2', y2', z2', α2', β2', γ2'), the target projection position (Y'2', η2') corresponding to the current pose information can be determined directly from the personalized information.
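The lookup chain described above (face information → personalized information → current pose → target projection position) can be sketched with an in-memory mapping. The dictionary layout, keys and sample values are illustrative assumptions, not the patent's storage format.

```python
# Hypothetical in-memory stand-in for the preset database of Table 1:
# face id -> { 6DOF pose tuple -> target projection position (Y', eta) }
preset_db = {
    "driver_2": {
        (0.1, 1.2, 0.5, 0.0, 3.0, 0.0): (0.12, 4.5),
    },
}


def lookup_target(face_id, current_pose, db):
    """Return the stored target projection position, or None when either the
    face or the pose is not in the database (the geometric fall-back computation
    described next then runs)."""
    personalized = db.get(face_id)
    if personalized is None:
        return None                        # face not enrolled in the database
    return personalized.get(current_pose)  # None if this pose is not recorded


hit = lookup_target("driver_2", (0.1, 1.2, 0.5, 0.0, 3.0, 0.0), preset_db)
miss = lookup_target("driver_9", (0.1, 1.2, 0.5, 0.0, 3.0, 0.0), preset_db)
```

Returning `None` in both miss cases mirrors the method's structure: a hit is used directly, while a miss triggers the horizontal-sight-line calculation.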
The embodiment of the application provides a possible implementation manner, as shown in fig. 2, for determining a target projection position of the HUD according to current pose information, and further includes:
step S201, if the face information is determined not to exist in a preset database, determining the horizontal sight line of the driver according to the current pose information;
If the face information exists in the preset database, the target projection position corresponding to the current pose information can be determined directly from the personalized information corresponding to the face information; if the face information does not exist in the preset database, the target projection position corresponding to the current pose information must be determined by calculation, as described below.
According to the embodiment of the application, after the current pose information is determined, the horizontal sight line of the driver, which is parallel to the ground, can be determined according to the current pose information.
Step S202, determining a target sight line range according to the horizontal sight line and a preset angle range.
The preset angle range in the embodiment of the application refers to an angle range measured from the horizontal sight line, which is taken as the 0° sight line; for example, the preset angle range may be set to 4°–5°.
The target sight line range in the embodiment of the application refers to the region where the driver's sight line may fall; it can be understood that when the driver looks forward, the driver's field of view covers a range of values rather than a single point.
Specifically, for example, with the horizontal sight line taken as 0° and a preset angle range of 4°–5°, the target sight line range is the range from 4° to 5° obliquely downward from the horizontal sight line.
In step S203, the position of the target line-of-sight range on the projection background is taken as the target projection position of the HUD.
The projection background in the embodiment of the application refers to the surface onto which the HUD is projected; the HUD is displayed on the projection background. It can be any surface capable of receiving a projection; the windshield is preferred, and the embodiment of the application does not exclude other projection backgrounds.
After the target sight line range is determined, the position of the target sight line range on the projection background is taken as the target projection position. The target projection position is the position at which the driver can see the information on the HUD without lowering or turning the head, and is therefore the position most favorable for the driver to view.
Specifically, continuing the above example, the target sight line range is the range 4°–5° obliquely downward from the horizontal sight line; the position where this range falls on the windshield is taken as the target projection position, and the center of that position may be taken as the target projection center, so that the center of the HUD projection coincides with the center of the target projection position.
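The geometry of the downward sight-line band can be sketched with basic trigonometry. As a simplifying assumption, the windshield is modeled as a vertical plane a fixed horizontal distance ahead of the eyes; the function name and parameters are illustrative, not from the patent.

```python
import math

def projection_band(eye_height, dist, theta_lo=4.0, theta_hi=5.0):
    """Heights (same unit as the inputs) where sight lines tilted theta_lo to
    theta_hi degrees below horizontal intersect a vertical plane `dist` ahead
    of the eyes.  Returns (top, bottom, center) of the band."""
    top = eye_height - dist * math.tan(math.radians(theta_lo))
    bottom = eye_height - dist * math.tan(math.radians(theta_hi))
    return top, bottom, (top + bottom) / 2.0
```

The returned center would correspond to the target projection center, so the HUD projection center can be placed at the center of the band.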
As shown in fig. 3, which exemplarily shows the relationship among the camera, the driver's pose information, the projection device and the windshield, the driver's pose information is the pose information of the driver's eyes. The camera collects the driver's current pose information, the driver's horizontal sight line is determined from the current pose information, the target sight line range is determined from the horizontal sight line and the preset angle range, and the position of the target sight line range on the windshield is the target projection position. After the target projection position is determined, the projection angle of the projection device is converted by an existing conversion method, so that the projection device projects the HUD to the target projection position at that projection angle.
The embodiment of the application provides a possible implementation manner, wherein personalized information corresponding to face information is obtained from a preset database, and then the method further comprises the following steps:
if the personalized information does not contain the current pose information, determining the horizontal sight line of the driver according to the current pose information;
determining a target sight line range according to the horizontal sight line and a preset angle range;
the position of the target sight line range on the projection background is taken as the target projection position.
In the embodiment of the present application, when the personalized information corresponding to the face information does not contain the current pose information, the target projection position corresponding to the current pose information is determined in the same way as in steps S201, S202 and S203, which is not repeated here.
The embodiment of the application provides a possible implementation manner, wherein the position of the target sight line range on the projection background is used as a target projection position, and then the method further comprises the following steps:
personalized information corresponding to the face information is newly added in a preset database, and the personalized information comprises current pose information and a target projection position of the HUD corresponding to the current pose information.
In the embodiment of the application, if the face information does not exist in the preset database, or the personalized information corresponding to the face information does not contain the target projection position corresponding to the current pose information, the driver's horizontal sight line can be determined from the current pose information, the target sight line range is then determined from the horizontal sight line and the preset angle range, and the position of the target sight line range on the projection background is taken as the target projection position. After the target projection position corresponding to the current pose information is determined, personalized information corresponding to the face information is newly added to the preset database, including the current pose information and the corresponding target projection position. In this way, the next time the face information is recognized, its personalized information can be called directly, and if that personalized information contains the current pose information, the corresponding target projection position can be called directly without recalculating it.
As shown in fig. 4, a schematic diagram of a process of determining a target projection position according to current pose information is shown, and the whole process is as follows: acquiring face information and current pose information of a driver; judging whether the face information is positioned in a preset database or not; if the face information is located in a preset database, directly calling personalized information corresponding to the face information; judging whether the personalized information contains a target projection position corresponding to the current pose information or not; and if the target projection position corresponding to the current pose information exists in the personalized information, calling the target projection position.
If the face information is not located in the preset database or the target projection position corresponding to the current pose information does not exist in the personalized information corresponding to the face information, determining the horizontal sight line of the driver according to the current pose information; determining a target sight line range according to the horizontal sight line of the driver and a preset angle range; taking the position of the target sight line range on the projection background as a target projection position; and newly adding personalized information corresponding to the face information in a preset database, wherein the personalized information comprises current pose information and a target projection position corresponding to the current pose information.
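The whole decision flow above is a cache-then-compute pattern: try the stored personalized information first, and on a miss compute the position geometrically and store it for next time. This is a minimal sketch under that reading; the function and parameter names are assumptions.

```python
def target_projection(db, face_id, pose, compute_fn):
    """Fast path: return the cached target projection position.
    Slow path: compute it (steps S201-S203) and add it to the database."""
    personalized = db.setdefault(face_id, {})   # new face -> empty personalized info
    if pose in personalized:
        return personalized[pose]               # hit: stored target projection position
    position = compute_fn(pose)                 # miss: geometric computation
    personalized[pose] = position               # newly add personalized information
    return position
```

A second call with the same face and pose returns the stored position without invoking the geometric computation again.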
The embodiment of the application provides a possible implementation manner; after the face information of the driver is acquired, the method further includes:
and if the driver is in the fatigue driving state according to the face information, displaying warning information through the HUD.
According to the embodiment of the application, after the driver's face information is acquired, the driver's driving state is determined from the face information, where the driving state includes a normal driving state and a fatigue driving state. The driving state can be determined by detecting and evaluating data related to driving: for example, if abnormal opening and closing of the driver's eyes is detected and the eye-closure time exceeds a preset duration, the driver is determined to be in the fatigue driving state; if the driver is detected yawning, the yawns are counted, and if the number of yawns within a preset time period exceeds a preset number, the driver is determined to be in the fatigue driving state.
After the driver is determined to be in the fatigue driving state, warning information can be displayed on the HUD. The warning can take any form, for example suddenly increasing the brightness of the HUD display and showing text such as "fatigue driving state, please stop driving" to remind the user to stop driving; a voice alert can be used at the same time to help wake the user.
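The two fatigue criteria just described reduce to threshold checks. The sketch below assumes hypothetical threshold values and parameter names; the patent only states that the preset duration and preset yawn count exist, not their magnitudes.

```python
def is_fatigued(eye_closed_seconds, yawn_count,
                max_closed_seconds=2.0, max_yawns=3):
    """Fatigue driving state if the eye-closure time exceeds the preset
    duration, or the yawn count in the window exceeds the preset number.
    Threshold defaults are illustrative assumptions."""
    return (eye_closed_seconds > max_closed_seconds
            or yawn_count > max_yawns)
```

Either condition alone is sufficient to trigger the HUD warning.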
The embodiment of the application provides a possible implementation manner: if the face information of the driver is not detected, the HUD is subjected to screen-off processing.
When no face information can be detected, the HUD is switched off to save power.
As shown in fig. 5, which exemplarily shows an operation flow chart of the HUD, the HUD first judges whether the camera detects face information; if not, the HUD is turned off. If face information is detected, the driver's current pose information is obtained from it and the driving state is judged. If the driver is in the fatigue driving state, warning information is issued through the HUD; if not, the target projection position of the HUD is determined from the driver's current pose information and the projection position of the HUD is adjusted to the target projection position.
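One pass of the fig. 5 flow can be sketched as a single dispatch function. The `HudStub` class and all method names are hypothetical stand-ins for the projection device, used only to make the control flow concrete.

```python
class HudStub:
    """Minimal stand-in for the projection device; records each action."""
    def __init__(self):
        self.log = []
    def screen_off(self):
        self.log.append("off")
    def warn(self, msg):
        self.log.append(("warn", msg))
    def target_position(self, pose):
        return ("pos", pose)           # placeholder for steps S201-S203 / DB lookup
    def move_to(self, position):
        self.log.append(("move", position))

def hud_step(face, pose, fatigued, hud):
    """One camera frame: no face -> screen off; fatigue -> warning;
    otherwise reposition the HUD for the current pose."""
    if face is None:
        hud.screen_off()
    elif fatigued:
        hud.warn("fatigue driving state, please stop driving")
    else:
        hud.move_to(hud.target_position(pose))
```

Running `hud_step` once per captured frame reproduces the branching of the flow chart.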
Based on the same inventive concept, a schematic structural frame of a display device of a HUD according to an embodiment of the present application is shown in fig. 6, and the display device includes:
an obtaining module 610, configured to obtain current pose information of a driver;
a target projection position determining module 620, configured to determine a target projection position of the HUD according to the current pose information;
The adjustment module 630 is configured to adjust the projection position of the HUD to the target projection position.
The embodiment of the application provides a possible implementation manner, and the acquisition module further comprises:
the face information acquisition sub-module is used for acquiring face information of a driver;
a target projection position determination module comprising:
the personalized information acquisition sub-module is used for acquiring personalized information corresponding to the face information from a preset database if the face information exists in the preset database; the personalized information comprises at least one pose information of the face information and a target projection position of the HUD corresponding to the pose information;
the target projection position determining first sub-module is used for acquiring the target projection position of the HUD corresponding to the current pose information from the personalized information if the current pose information exists in the personalized information.
The embodiment of the application provides a possible implementation manner, and the target projection position determining module further comprises:
the horizontal sight line determining sub-module is used for determining the horizontal sight line of the driver according to the current pose information if the face information is determined not to exist in the preset database;
the target sight range determining submodule determines a target sight range according to the horizontal sight and a preset angle range;
The target projection position determination second sub-module is used for taking the position of the target sight range on the projection background as the target projection position of the HUD.
The embodiment of the application provides a possible implementation manner, and the target projection position determining module further comprises:
the horizontal sight line determining sub-module is used for determining the horizontal sight line of the driver according to the current pose information if the current pose information does not exist in the personalized information;
the target sight range determining submodule is used for determining a target sight range according to the horizontal sight and a preset angle range;
the target projection position determination second sub-module is used for taking the position of the target sight range on the projection background as the target projection position of the HUD.
The embodiment of the application provides a possible implementation manner, and the device further comprises:
the new adding module is used for adding personalized information corresponding to the face information in a preset database, wherein the personalized information comprises current pose information and a target projection position of the HUD corresponding to the current pose information.
The embodiment of the application provides a possible implementation manner, and the device further comprises:
and the warning module is used for displaying warning information through the HUD if the driver is determined to be in the fatigue driving state according to the face information.
The embodiment of the application provides a possible implementation manner, and the device further comprises:
and the screen extinguishing module is used for carrying out screen extinguishing processing on the HUD if the face information of the driver is not detected.
The display device of the HUD in this embodiment may execute any of the display methods of the HUD provided in the embodiments of the present application, and the implementation principle is similar, and will not be described herein.
Based on the same inventive concept, an embodiment of the present application provides an electronic device, including: memory and a processor.
The memory is communicatively coupled to the processor.
At least one computer program is stored in the memory and, when executed by the processor, implements any of the HUD display methods provided by the embodiments of the present application, including their alternative implementations.
Those skilled in the art will appreciate that the electronic devices provided by the embodiments of the present application may be specially designed and constructed for the required purposes, or may comprise known devices in general purpose computers. These devices have computer programs stored therein that are selectively activated or reconfigured. Such a computer program may be stored in a device (e.g., computer) readable medium or in any type of medium suitable for storing electronic instructions and coupled to a bus, respectively.
Compared with the prior art, the following can be realized: the current pose information of the driver is obtained; the target projection position of the HUD is determined according to the current pose information; and the projection position of the HUD is adjusted to the target projection position. Because the projection position of the HUD is adjusted automatically as the driver's current pose information changes, the adjustment is more intelligent and convenient and requires no manual adjustment by the user; at the same time, since the projection position follows the driver's current pose automatically, the driver's attention is not distracted and driving safety is guaranteed.
The present application provides, in an alternative embodiment, an electronic device, as shown in fig. 7, an electronic device 2000 shown in fig. 7 including: a processor 2001 and a memory 2003. Wherein the processor 2001 is communicatively coupled to the memory 2003, such as via a bus 2002.
The processor 2001 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules and circuits described in connection with this disclosure. The processor 2001 may also be a combination of computing elements, for example one or more microprocessors, or a combination of a DSP and a microprocessor, etc.
Bus 2002 may include a path to transfer information between the components. Bus 2002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 2002 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 7, but this does not mean there is only one bus or one type of bus.
The memory 2003 may be, but is not limited to, ROM (Read-Only Memory) or other static storage that can store static information and instructions, RAM (Random Access Memory) or other dynamic storage that can store information and instructions, EEPROM (Electrically Erasable Programmable Read-Only Memory), CD-ROM (Compact Disc Read-Only Memory) or other optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Optionally, the electronic device 2000 may also include a transceiver 2004. The transceiver 2004 may be used for both reception and transmission of signals. The transceiver 2004 may allow the electronic device 2000 to communicate wirelessly or by wire with other devices to exchange data. It should be noted that, in practical application, the transceiver 2004 is not limited to one.
Optionally, the electronic device 2000 may also include an input unit 2005. The input unit 2005 may be used to receive input digital, character, image, and/or sound information, or to generate key signal inputs related to user settings and function controls of the electronic device 2000. The input unit 2005 may include, but is not limited to, one or more of a touch screen, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, a camera, a microphone, etc.
Optionally, the electronic device 2000 may also include an output unit 2006. An output unit 2006 may be used to output or present information processed by the processor 2001. The output unit 2006 may include, but is not limited to, one or more of a display device, a speaker, a vibration device, and the like.
While fig. 7 shows an electronic device 2000 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
Optionally, a memory 2003 is used for storing application code for executing aspects of the present application and is controlled for execution by the processor 2001. The processor 2001 is configured to execute application program codes stored in the memory 2003 to implement any of the HUD display methods provided in the embodiments of the present application.
Based on the same inventive concept, the embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, which when executed by an electronic device/processor implements any of the HUD display methods provided by the embodiments of the present application.
Compared with the prior art, the embodiment of the application provides a computer readable storage medium that realizes: obtaining the current pose information of the driver; determining the target projection position of the HUD according to the current pose information; and adjusting the projection position of the HUD to the target projection position. Because the projection position of the HUD is adjusted automatically as the driver's current pose information changes, the adjustment is more intelligent and convenient and requires no manual adjustment by the user; at the same time, since the projection position follows the driver's current pose automatically, the driver's attention is not distracted and driving safety is guaranteed.
Embodiments of the present application provide various alternative implementations of a computer readable storage medium suitable for use in any of the HUD display methods described above. And will not be described in detail herein.
By applying the embodiment of the application, at least the following beneficial effects can be realized: the current pose information of the driver is obtained; the target projection position of the HUD is determined according to the current pose information; and the projection position of the HUD is adjusted to the target projection position. Because the projection position of the HUD is adjusted automatically as the driver's current pose information changes, the adjustment is more intelligent and convenient and requires no manual adjustment by the user; at the same time, since the projection position follows the driver's current pose automatically, the driver's attention is not distracted and driving safety is guaranteed.
Those of skill in the art will appreciate that the various operations, methods, steps, measures and schemes discussed in the present application, including those disclosed in the prior art, may be alternated, altered, rearranged, decomposed, combined, or deleted.
In the description of the present application, it should be understood that the terms "center," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the present application and simplify the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art.
In the description of the present specification, a particular feature, structure, material, or characteristic may be combined in any suitable manner in one or more embodiments or examples.
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present application, and such modifications and adaptations are intended to fall within the scope of the present application.

Claims (10)

1. A display method for head-up display HUD is characterized by comprising the following steps:
acquiring current pose information of a driver;
determining a target projection position of the HUD according to the current pose information;
adjusting the projection position of the HUD to the target projection position;
the determining the target projection position of the HUD according to the current pose information comprises the following steps:
determining the horizontal sight line of the driver according to the current pose information; the horizontal sight line is a sight line parallel to the ground;
and determining a target sight line range according to the horizontal sight line and a preset angle range, and determining a target projection position of the HUD according to the target sight line range, wherein the target sight line range represents the sight line range of the driver.
2. The HUD display method according to claim 1, wherein the obtaining current pose information of the driver further comprises:
acquiring face information of the driver;
the determining the target projection position of the HUD according to the current pose information comprises the following steps:
if the face information exists in a preset database, personalized information corresponding to the face information is obtained from the preset database; the personalized information comprises at least one pose information of the face information and a target projection position of the HUD corresponding to the pose information;
And if the current pose information exists in the personalized information, acquiring a target projection position of the HUD corresponding to the current pose information from the personalized information.
3. The HUD display method according to claim 2, wherein determining the target projection position of the HUD according to the current pose information further comprises:
if the face information is determined not to exist in the preset database, determining the horizontal sight line of the driver according to the current pose information;
determining a target sight line range according to the horizontal sight line and a preset angle range;
and taking the position of the target sight line range on the projection background as the target projection position of the HUD.
4. The HUD display method according to claim 1, wherein the determining the target projection position of the HUD according to the target line-of-sight range includes:
and taking the position of the target sight line range on the projection background as the target projection position of the HUD.
5. The HUD display method according to claim 3 or 4, wherein after the position of the target sight line range on the projection background is taken as the target projection position of the HUD, the method further comprises:
and newly adding personalized information corresponding to the face information in the preset database, wherein the personalized information comprises the current pose information and a target projection position of the HUD corresponding to the current pose information.
6. The HUD display method according to claim 1, wherein the acquiring the face information of the driver further comprises:
and if the driver is in the fatigue driving state according to the face information, displaying warning information through the HUD.
7. The HUD display method according to claim 1, further comprising:
and if the face information of the driver is not detected, performing screen-off processing on the HUD.
8. A display device for head-up display HUD, comprising:
the acquisition module is used for acquiring the current pose information of the driver;
the target projection position determining module is used for determining the target projection position of the HUD according to the current pose information;
the adjusting module is used for adjusting the projection position of the HUD to the target projection position;
the target projection position determining module is specifically used for determining the horizontal sight line of the driver according to the current pose information; the horizontal sight line is a sight line parallel to the ground;
and determining a target sight line range according to the horizontal sight line and a preset angle range, and determining a target projection position of the HUD according to the target sight line range, wherein the target sight line range represents the sight line range of the driver.
9. A terminal device, comprising:
a processor;
a memory communicatively coupled to the processor;
at least one program stored in the memory and configured to be executed by the processor, the at least one program being configured to implement the HUD display method according to any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by an electronic device, implements the HUD display method according to any one of claims 1 to 7.
CN202111039980.2A 2021-09-06 2021-09-06 HUD display method and device, electronic equipment and storage medium Active CN113741035B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111039980.2A CN113741035B (en) 2021-09-06 2021-09-06 HUD display method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113741035A (en) 2021-12-03
CN113741035B (en) 2023-10-17

Family

ID=78736208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111039980.2A Active CN113741035B (en) 2021-09-06 2021-09-06 HUD display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113741035B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109747656A (en) * 2017-11-02 2019-05-14 罗曦明 Artificial intelligence vehicle assistant drive method, apparatus, equipment and storage medium
CN111016785A (en) * 2019-11-26 2020-04-17 惠州市德赛西威智能交通技术研究院有限公司 Head-up display system adjusting method based on human eye position
CN112428936A (en) * 2020-11-27 2021-03-02 奇瑞汽车股份有限公司 Method and device for automatically adjusting parameters of head-up display



Similar Documents

Publication Publication Date Title
US11449294B2 (en) Display system in a vehicle
EP2914002B1 (en) Virtual see-through instrument cluster with live video
KR101554798B1 (en) Providing a corrected view based on the position of a user with respect to a mobile platform
US20160163108A1 (en) Augmented reality hud display method and device for vehicle
US20190227694A1 (en) Device for providing augmented reality service, and method of operating the same
US8799810B1 (en) Stability region for a user interface
US20100287500A1 (en) Method and system for displaying conformal symbology on a see-through display
US20120139816A1 (en) In-vehicle display management system
US9823735B2 (en) Method for selecting an information source from a plurality of information sources for display on a display of smart glasses
WO2022241638A1 (en) Projection method and apparatus, and vehicle and ar-hud
US20160221502A1 (en) Cognitive displays
CN102968180A (en) User interface control based on head direction
WO2021196716A1 (en) Information display control method and apparatus, and vehicle
US20220187903A1 (en) Image processing and display method, augmented reality device, image processing device, and display system
US20210081047A1 (en) Head-Mounted Display With Haptic Output
US20180253144A1 (en) Assisted item selection for see through glasses
CN115525152A (en) Image processing method, system, device, electronic equipment and storage medium
CN110435539B (en) Image display method of head-up display and head-up display system
CN112242009A (en) Display effect fusion method, system, storage medium and main control unit
TWI799000B (en) Method, processing device, and display system for information display
CN113741035B (en) HUD display method and device, electronic equipment and storage medium
CN111435269A (en) Display adjusting method, system, medium and terminal of vehicle head-up display device
CN106095375B (en) Display control method and device
CN114743433B (en) Multi-channel alarm presenting method and device for simulating threats in flight training environment
US20220348080A1 (en) Control of a display of an augmented reality head-up display apparatus for a motor vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant