CN115097628A - Driving information display method, device and system


Publication number
CN115097628A
Authority
CN
China
Prior art keywords
information
coordinate system
glasses
driving
driving information
Legal status
Granted
Application number
CN202210723101.6A
Other languages
Chinese (zh)
Other versions
CN115097628B (en)
Inventor
成一诺
张晨昭
Current Assignee
Beijing Jingwei Hirain Tech Co Ltd
Original Assignee
Beijing Jingwei Hirain Tech Co Ltd
Application filed by Beijing Jingwei Hirain Tech Co Ltd
Priority to CN202210723101.6A
Publication of CN115097628A
Application granted
Publication of CN115097628B
Status: Active

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0183 Adaptation to parameters characterising the motion of the vehicle

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a driving information display method, device and system. The driving information display method comprises: acquiring driving data related to target driving information, wherein the target driving information is the driving information to be displayed; generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface; and sending the display interface to AR glasses for display. Because the method sends the display interface containing the driving information to the AR glasses for display instead of displaying the driving information on the front windshield via an optical-mechanical system, no bulky optical-mechanical unit needs to be arranged under the front windshield, which relieves the crowded space in front of the cockpit. In addition, the AR glasses adopt a hybrid waveguide technology and can provide a viewing angle of more than 50 degrees, so the display interface can be richer and clearer and the user experience is better.

Description

Driving information display method, device and system
Technical Field
The invention relates to the technical field of information display, in particular to a driving information display method, device and system.
Background
A head-up display (HUD), also called a heads-up display, is a system that projects important driving information such as vehicle speed and navigation onto the front windshield of an automobile, so that the driver can see this important driving information without lowering or turning his or her head.
An Augmented Reality Head-Up Display (AR-HUD) system uses AR technology to superimpose virtual image information onto real road conditions in real time, so that the driver can acquire driving information more directly. For example, when an AR-HUD is applied to navigation, the virtual navigation instruction information can be superimposed directly on the real road. AR technology enriches the application range of the HUD and can improve driving safety more effectively.
The current AR-HUD system is mainly fixed below the front windshield of the automobile in a front-mounted manner, and the picture is reflected by the front windshield into the driver's eyes. However, front-mounted AR-HUD systems that display on the automobile front windshield have several problems: the displayed picture is small; the image definition and brightness are low, and ghost images easily occur; and the optical-mechanical structure occupies considerable space in the vehicle.
Disclosure of Invention
In view of the above, the present invention provides a driving information display method, device and system to solve the problems of the existing front-mounted AR-HUD system that displays on the automobile front windshield. The technical scheme is as follows:
a driving information display method comprises the following steps:
acquiring driving data related to target driving information, wherein the target driving information is the driving information to be displayed;
generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface;
and sending the display interface to augmented reality (AR) glasses for display.
Optionally, the acquiring driving data related to the target driving information includes:
acquiring a driving environment data message from an Advanced Driving Assistance System (ADAS) arranged on a vehicle, and/or acquiring a vehicle body data message and/or a navigation data message from the vehicle;
screening out messages related to the target driving information from the obtained messages;
and analyzing the driving data related to the target driving information from the screened messages.
Optionally, the target driving information includes first type information and/or second type information, the display of the first type information is related to the environment, and the display of the second type information is unrelated to the environment;
the generating of the display interface containing the target driving information based on the driving data related to the target driving information and the preset information for generating the interface includes:
determining a pixel point corresponding to the first type of information on an optical lens of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses;
and/or determining pixel points on the optical lenses of the AR glasses corresponding to position information preset for the second type of information, wherein the position information preset for the second type of information is the position information of the second type of information in a virtual world coordinate system;
and generating a display interface containing the target driving information according to the determined pixel points, the driving data related to the target driving information and the information for generating the interface.
Optionally, the first type of information includes navigation instruction information, and the data related to the first type of information includes location information of the navigation instruction information in an environment coordinate system;
the determining, based on the data related to the first type of information and the current pose of the AR glasses, a corresponding pixel point of the first type of information on an optical lens of the AR glasses includes:
and determining corresponding pixel points of the navigation indication information on optical lenses of the AR glasses based on the current pose of the AR glasses, the position information of the navigation indication information in an environment coordinate system, a scale factor between a virtual world coordinate system and the environment coordinate system and a transformation matrix determined by the parameters of the AR glasses.
Optionally, the determining, based on the current pose of the AR glasses, the position information of the navigation instruction information in the environment coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameter of the AR glasses, a pixel point corresponding to the navigation instruction information on the optical lens of the AR glasses includes:
determining the position information of the navigation indication information in the virtual world coordinate system based on the current pose of the AR glasses, the position information of the navigation indication information in the environment coordinate system and a scale factor between the virtual world coordinate system and the environment coordinate system;
and converting the position information of the navigation indication information in the virtual world coordinate system into a pixel coordinate system based on a transformation matrix determined by the parameters of the AR glasses to obtain a pixel point of the navigation indication information corresponding to the optical lens of the AR glasses.
Optionally, the first type of information includes target object prompt information, and the data related to the first type of information includes a pose matrix of the target object with respect to the ADAS coordinate system;
the determining, based on the data related to the first type of information and the current pose of the AR glasses, a corresponding pixel point of the first type of information on an optical lens of the AR glasses includes:
and determining corresponding pixel points of the target object prompt information on the optical lenses of the AR glasses based on the current pose of the AR glasses, a pose matrix of the target object relative to an ADAS coordinate system, a transformation matrix of the ADAS camera coordinate system relative to a DMS coordinate system of a driver detection system, a transformation matrix of a head coordinate system of the driver relative to the DMS coordinate system, a transformation matrix of the head coordinate system of the driver relative to a coordinate system of the AR glasses, a scale factor between a virtual world coordinate system and an environment coordinate system and a transformation matrix determined by parameters of the AR glasses.
Optionally, the determining, based on the current pose of the AR glasses, the pose matrix of the target object with respect to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system with respect to the DMS coordinate system of the driver detection system, the transformation matrix of the head coordinate system of the driver with respect to the DMS coordinate system, the transformation matrix of the head coordinate system of the driver with respect to the coordinate system of the AR glasses, the scaling factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses, a corresponding pixel point of the target object prompt information on the optical lenses of the AR glasses includes:
determining the current pose of the target object relative to an AR glasses coordinate system as position information of target object prompt information under an environment coordinate based on the current pose of the AR glasses, a pose matrix of the target object relative to an ADAS coordinate system, a transformation matrix of the ADAS camera coordinate system relative to a driver detection system (DMS) coordinate system, a transformation matrix of a driver head coordinate system relative to the DMS coordinate system and a transformation matrix of the driver head coordinate system relative to an AR glasses coordinate system;
converting the position information of the prompt information of the target object in the environment coordinate to the virtual world coordinate based on a proportional factor between the virtual world coordinate system and the environment coordinate system to obtain the position information of the prompt information of the target object in the virtual world coordinate;
and converting the position information of the target object prompt information under the virtual world coordinate into a pixel coordinate system based on a transformation matrix determined by the parameters of the AR glasses to obtain a pixel point of the target object prompt information corresponding to the optical lens of the AR glasses.
Optionally, the driving information display method further includes:
when an interface adjusting instruction sent by the AR glasses is received, adjusting the display interface according to the interface adjusting instruction;
and sending the adjusted display interface to the AR glasses for display.
A driving information display device comprising: the device comprises a data acquisition module, a display interface generation module and a display interface sending module;
the data acquisition module is used for acquiring driving data related to target driving information, wherein the target driving information is driving information to be displayed;
the display interface generating module is used for generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface;
and the display interface sending module is used for sending the display interface to the AR glasses for display.
A driving information display system comprising: data processing equipment and Augmented Reality (AR) glasses;
the data processing equipment is used for acquiring driving data related to target driving information, wherein the target driving information is driving information to be displayed, generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface, and sending the display interface to the AR glasses;
and the AR glasses are used for displaying the display interface generated by the data processing equipment.
According to the driving information display method, device and system provided by the invention, driving data related to the target driving information is acquired first, a display interface containing the target driving information is then generated based on the driving data related to the target driving information and preset information for generating the interface, and the generated display interface is finally sent to the AR glasses for display. Because the driving information display method provided by the invention sends the display interface containing the driving information to the AR glasses for display instead of displaying the driving information on the front windshield via an optical-mechanical system, no bulky optical-mechanical unit needs to be arranged under the front windshield, which relieves the crowded space in front of the cockpit. Moreover, the AR glasses adopt a hybrid waveguide technology and can provide a viewing angle of more than 50 degrees, so the display interface can be richer and clearer and the user experience is better.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flow chart of a driving information display method according to an embodiment of the present invention;
fig. 2 is a schematic diagram illustrating a display interface including driving information displayed on an optical lens of AR glasses according to an embodiment of the present invention;
fig. 3 is a schematic flow chart illustrating a process of generating a display interface including target driving information based on driving data related to the target driving information and preset information for generating an interface according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a coordinate system according to an embodiment of the present invention;
fig. 5 is a schematic flow chart illustrating a process of determining a pixel point corresponding to navigation instruction information on an optical lens of AR glasses based on data related to first-type information and a current pose of the AR glasses according to an embodiment of the present invention;
fig. 6 is a schematic flow chart illustrating a process of determining a pixel point corresponding to target object indication information on an optical lens of AR glasses based on data related to first-type information and a current pose of the AR glasses according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a driving information display device according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a driving information display system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
In order to solve the problems of the existing front-mounted AR-HUD system based on automobile front-windshield display, the applicant, through continuous research, has provided a driving information display method, device and system that solve the problems of the existing scheme well.
First embodiment
Referring to fig. 1, a schematic flow chart of a driving information display method according to an embodiment of the present invention is shown, which may include:
step S101: and acquiring the driving data related to the target driving information.
The target driving information is driving information to be displayed.
For example, the driving information to be displayed may include, but is not limited to, one or more of speed information, mileage information, target object prompt information (such as an obstacle warning icon), navigation indication information (such as a navigation indication icon), road information, and the like. It should be noted that what information needs to be displayed can be set according to specific application requirements.
Specifically, the process of acquiring the driving data related to the target driving information may include:
step a1, obtaining an environmental data message from an Advanced Driving Assistance System (ADAS) arranged on the vehicle, and/or obtaining a vehicle body data message and/or a navigation data message from the vehicle.
Specifically, the environment data message can be acquired from the ADAS through the CAN bus, and the vehicle body data message and/or the navigation data message can be acquired from the vehicle through the CAN bus. The data carried by the environment data message may include, but is not limited to, one or more of lane line data, pedestrian position data, vehicle data, obstacle data, and the like, and the vehicle body data message may include, but is not limited to, one or more of vehicle speed, mileage, alarm information, and the like.
It should be noted that which types of messages to obtain can be determined according to the target driving information (i.e., the driving information to be displayed). For example, if the target driving information includes the vehicle speed, the mileage and a navigation indication icon, the vehicle body data message and the navigation data message can be obtained; if the target driving information includes a navigation indication icon and an obstacle warning icon, the driving environment data message and the navigation data message can be obtained. Of course, the driving environment data message, the vehicle body data message and the navigation data message can also all be obtained, and the messages related to the target driving information can then be screened from them.
Step a2, screening out messages related to the target driving information from the obtained messages.
Specifically, the message related to the target driving information can be screened from the acquired messages based on the pre-established correspondence between the message identifier and the data carried by the message.
The correspondence between the message identifier and the data carried by the message is shown in the following table:

Table 1 Correspondence between message identifiers and carried data

Message identifier    Carried data
0x260                 Current vehicle speed
0x264                 Wheel steering angle

In Table 1, "0x260" corresponds to "current vehicle speed", which means that the data carried by a message identified as "0x260" is the current vehicle speed, and "0x264" corresponds to "wheel steering angle", which means that the data carried by a message identified as "0x264" is the wheel steering angle. Assuming that the driving information to be displayed includes the current vehicle speed, the message with the identifier "0x260" can be screened out from the acquired messages.
Step a3, analyzing the driving data related to the target driving information from the screened messages.
Illustratively, if the target driving information (i.e., the driving information to be displayed) includes the current vehicle speed and a navigation indication icon, the screened messages are a message carrying the current vehicle speed and a message carrying navigation data, and the purpose of step a3 is to analyze the current vehicle speed from the former and the navigation data from the latter.
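As a minimal illustrative sketch of steps a1 to a3 (not part of the patent), the screening and parsing could look as follows in Python; apart from the identifiers 0x260 and 0x264 taken from Table 1, the byte layout and scaling of the messages are assumptions:

```python
from dataclasses import dataclass

# Correspondence between message identifier and carried data (cf. Table 1).
# The byte layout and 0.01 scaling below are illustrative assumptions, not
# the actual encoding used by the vehicle.
ID_TO_SIGNAL = {
    0x260: "current_vehicle_speed",
    0x264: "wheel_steering_angle",
}

@dataclass
class CanMessage:
    can_id: int
    data: bytes

def screen_messages(messages, target_signals):
    """Step a2: keep only messages whose identifier maps to a signal that
    belongs to the target driving information."""
    return [m for m in messages if ID_TO_SIGNAL.get(m.can_id) in target_signals]

def parse_message(msg):
    """Step a3: parse the driving data from a screened message (assumed to be
    a 16-bit big-endian raw value scaled by 0.01)."""
    raw = int.from_bytes(msg.data[:2], "big")
    return ID_TO_SIGNAL[msg.can_id], raw * 0.01

# The target driving information includes the current vehicle speed.
frames = [CanMessage(0x260, b"\x2e\xe0"), CanMessage(0x3FF, b"\x00\x00")]
for m in screen_messages(frames, {"current_vehicle_speed"}):
    print(parse_message(m))   # ('current_vehicle_speed', 120.0)
```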
Step S102: and generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface.
The preset information for generating the interface may include interface layout information and interface resource information. The interface layout information is used to determine the display layout of the target driving information in the display interface, and the interface resource information is used to determine the display form of the target driving information in the display interface, for example, in which font the mileage information is displayed, with which icon the obstacle warning is displayed, and the like. The target driving information contained in the generated display interface is determined according to the driving data related to the target driving information.
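For illustration only, the preset information for generating the interface could be organized as in the following sketch; all field names and values here are assumptions rather than content of the patent:

```python
# Hypothetical organization of the preset interface-generation information:
# the layout part fixes where each item of driving information appears, the
# resource part fixes the form in which it is rendered.
PRESET_INTERFACE_INFO = {
    "layout": {
        # Second-type information has a fixed position in the virtual world
        # coordinate system (e.g. vehicle speed at the lower left).
        "vehicle_speed": {"position_virtual_world": (-0.8, -0.6, 2.0)},
        "mileage": {"position_virtual_world": (-0.8, -0.7, 2.0)},
    },
    "resources": {
        "vehicle_speed": {"font": "sans-bold", "size_pt": 24},
        "mileage": {"font": "sans", "size_pt": 18},
        "obstacle_warning": {"icon": "warning_triangle.png"},
    },
}
```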
Step S103: and sending the display interface to the AR glasses for display.
Referring to fig. 2, a schematic diagram of a display interface including driving information displayed on an optical lens of AR glasses is shown.
Optionally, the driving information display method provided in this embodiment may further include: and when an interface adjusting instruction sent by the AR glasses is received, adjusting the display interface according to the interface adjusting instruction, and sending the adjusted display interface to the AR glasses for display.
In terms of the manner of triggering, the interface adjusting instruction may be, but is not limited to, any one of a voice instruction, a gesture instruction and a touch instruction; in terms of the function realized, the interface adjusting instruction may be, but is not limited to, any one of an interface mode switching instruction, a window call-out instruction, an interface size adjustment instruction, an interface brightness adjustment instruction, an interface position adjustment instruction, and the like.
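The following sketch illustrates one possible way the display interface could be adjusted according to such an instruction; the instruction kinds and payload fields are assumptions made for illustration:

```python
# Hedged sketch: applying an interface adjusting instruction received from
# the AR glasses. The instruction kinds and payload fields are assumptions.
def adjust_interface(interface, instruction):
    kind = instruction["kind"]
    if kind == "switch_mode":
        interface["mode"] = instruction["mode"]
    elif kind == "adjust_brightness":
        interface["brightness"] = instruction["value"]
    elif kind == "adjust_size":
        interface["scale"] = instruction["value"]
    elif kind == "adjust_position":
        dx, dy = instruction["offset"]
        x, y = interface["origin"]
        interface["origin"] = (x + dx, y + dy)
    return interface   # the adjusted interface is re-sent to the AR glasses

print(adjust_interface({"mode": "day", "brightness": 0.8, "scale": 1.0,
                        "origin": (0, 0)},
                       {"kind": "adjust_brightness", "value": 0.5}))
```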
The driving information display method provided by the embodiment of the invention first acquires data related to the target driving information (i.e., the driving information to be displayed), then generates a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface, and finally sends the generated display interface to the AR glasses for display. Because the method sends the display interface containing the driving information to the AR glasses for display instead of displaying the driving information on the front windshield via an optical-mechanical system, no bulky optical-mechanical unit needs to be arranged under the front windshield, which relieves the crowded space in front of the cockpit.
Second embodiment
This embodiment introduces the specific implementation process of step S102: generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface.
It should be noted that the target driving information (i.e., the driving information to be displayed) may include a first type of information (e.g., a navigation indication icon, an obstacle warning icon, etc.) and/or a second type of information (e.g., vehicle speed, mileage, etc.). The display of the first type of information is related to the environment: for example, a navigation indication icon is displayed attached to the ground, and an obstacle warning icon is displayed on the obstacle. The display of the second type of information is unrelated to the environment, and its display position on the AR glasses is fixed: for example, the vehicle speed is displayed at the lower left corner of the optical lenses of the AR glasses.
In this embodiment, a specific implementation process of step S102 is described by taking as an example that the target driving information includes the first type information and the second type information.
Referring to fig. 3, a schematic flow chart of generating a display interface including target driving information based on driving data related to the target driving information and preset information for generating the interface is shown, and the method may include:
step S301 a: and determining pixel points corresponding to the first type of information on the optical lenses of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses.
Optionally, the first type of information may include, but is not limited to: navigation indication information (such as navigation indication icons) and/or target object indication information (such as obstacle alert icons).
Since the display of the first type of information is related to the environment, it is affected by the pose of the AR glasses. For example, a navigation indication icon should be displayed attached to the ground; when the pose of the AR glasses changes for some reason, the icon may no longer be displayed attached to the ground. Similarly, an obstacle warning icon should be displayed on the obstacle; when the pose of the AR glasses changes, the icon may no longer be displayed on the obstacle but beside it. Therefore, in order that the first type of information finally displayed on the AR glasses matches the environment even as the pose of the glasses changes (the navigation indication icon is still displayed attached to the ground, and the obstacle warning icon is still displayed on the obstacle), the pose of the AR glasses is taken into account when determining the pixel points corresponding to the first type of information on the optical lenses of the AR glasses.
Step S301 b: and determining pixel points corresponding to the preset position information aiming at the second type of information on the optical lenses of the AR glasses.
The position information preset for the second type of information is the position information of the second type of information in the virtual world coordinate system. It should be noted that the position information of the second type of information in the virtual world coordinate system may be included in the interface layout information. The virtual world coordinate system and the other coordinate systems involved in the present invention are shown in fig. 4.
Specifically, the implementation process of step S301b may include: and determining a pixel point corresponding to the preset position information aiming at the second type of information on the optical lens of the AR glasses based on the transformation matrix determined by the parameters of the AR glasses. Among them, the transformation matrix determined by the parameters of the AR glasses may include a window transformation matrix V and a perspective projection matrix P.
It should be noted that, based on the window transformation matrix V and the perspective projection matrix P, the corresponding pixel point of the position information preset for the second type of information on the optical lens of the AR glasses is determined, that is, the position information in the virtual world coordinate system is converted into the pixel coordinate system based on the window transformation matrix V and the perspective projection matrix P.
Specifically, the pixel point p_1 on the optical lens of the AR glasses corresponding to the position point M_1 of the second type of information in the virtual world coordinate system can be determined by the following formula:

p_1 = V · P · M_1    (1)

Here, multiplying M_1 by P converts the position point M_1 in the virtual world coordinate system into the image coordinate system, and multiplying P · M_1 by V converts the point in the image coordinate system into the pixel coordinate system, yielding the pixel point p_1 on the optical lens of the AR glasses corresponding to the position point M_1.
In addition, since the display of the second type of information is independent of the environment and its display position on the AR glasses is fixed (for example, the vehicle speed is displayed at the lower left of the optical lenses of the AR glasses), the display of the second type of information is not affected by the pose of the AR glasses, and the problem of matching the display of the second type of information with the environment does not need to be considered.
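The following numeric sketch illustrates formula (1). It assumes an OpenGL-style projection convention with an explicit perspective divide (a detail the formula leaves implicit) and uses the parameter values from the worked example in the third embodiment (50-degree FOV, near and far planes of 0.1 m and 500 m, 720p resolution):

```python
import numpy as np

def perspective_matrix(fov_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix P (the convention is an
    assumption; the patent does not fix one)."""
    f = 1.0 / np.tan(np.radians(fov_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def window_matrix(width, height):
    """Window (viewport) transformation V from normalized device coordinates
    to pixel coordinates, with the pixel origin at the top left."""
    return np.array([
        [width / 2.0, 0.0, 0.0, width / 2.0],
        [0.0, -height / 2.0, 0.0, height / 2.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

def to_pixel(V, P, M):
    """Formula (1): p = V . P . M, with the perspective divide made explicit."""
    clip = P @ M            # virtual world -> image (clip) coordinates
    ndc = clip / clip[3]    # perspective divide
    pix = V @ ndc           # image -> pixel coordinates
    return float(pix[0]), float(pix[1])

# Position point M_1 of second-type information in the virtual world
# coordinate system (homogeneous coordinates; the point is illustrative).
P = perspective_matrix(50.0, 1280 / 720, 0.1, 500.0)
V = window_matrix(1280, 720)
M1 = np.array([-0.8, -0.6, -2.0, 1.0])   # in front of the virtual camera
print(to_pixel(V, P, M1))                # pixel point p_1 on the optical lens
```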
Step S302: and generating a display interface containing the target driving information according to the determined pixel points, the driving data related to the target driving information and the information for generating the interface.
According to the pixel points determined above, the driving data related to the target driving information and the information for generating the interface, a display interface which contains the target driving information and matches the environment can be generated.
Third embodiment
The above embodiments mentioned that the target driving information (i.e., the driving information to be displayed) may include the first type of information and/or the second type of information, where the first type of information may include navigation instruction information and/or target object prompt information. In this embodiment, taking the case where the first type of information includes both navigation instruction information and target object prompt information as an example, the specific implementation process of step S301a is introduced: determining the pixel points corresponding to the first type of information on the optical lenses of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses.
Firstly, a process of determining corresponding pixel points of navigation indication information on optical lenses of AR glasses based on data related to first-class information and the current pose of the AR glasses is introduced.
Based on the data related to the first type of information and the current pose of the AR glasses, the process of determining a pixel point of the navigation instruction information corresponding to the optical lens of the AR glasses may include: and determining a pixel point of the navigation indication information corresponding to the optical lens of the AR glasses based on the current pose of the AR glasses, the navigation data in the data related to the first type of information, the scale factor between the virtual world coordinate system and the environment coordinate system and the transformation matrix determined by the parameters of the AR glasses.
Specifically, please refer to fig. 5, which shows a schematic flow chart of determining a pixel point corresponding to the navigation instruction information on the optical lens of the AR glasses based on the current pose of the AR glasses, the navigation data in the data related to the first type of information, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses, and may include:
step S501: and determining the position information of the navigation indication information in the virtual world coordinate system based on the current pose of the AR glasses, the navigation data in the data related to the first type of information and the scale factor between the virtual world coordinate system and the environment coordinate system.
The navigation data in the data related to the first type of information contains the position information of the navigation instruction information in the environment coordinate system. Determining the position information of the navigation instruction information in the virtual world coordinate system based on the current pose of the AR glasses, the navigation data in the data related to the first type of information and the scale factor between the virtual world coordinate system and the environment coordinate system means converting the position information of the navigation instruction information from the environment coordinate system into the virtual world coordinate system based on the scale factor between the two coordinate systems, in combination with the current pose of the AR glasses.
Specifically, the position point M_2 of the navigation instruction information in the environment coordinate system can be converted into the virtual world coordinate system according to the following formula, giving the position point Q of the navigation instruction information in the virtual world coordinate system:

Q = K · T_A · M_2    (2)

where K represents the scale factor between the virtual world coordinate system and the environment coordinate system, T_A represents the current pose of the AR glasses, and Q represents the position point in the virtual world coordinate system corresponding to the position point M_2 in the environment coordinate system.
Step S502: and based on a transformation matrix determined by the parameters of the AR glasses, converting the position information of the navigation indication information in the virtual world coordinate system into a pixel coordinate system to obtain a pixel point corresponding to the navigation indication information on an optical lens of the AR glasses.
The transformation matrix determined by the parameters of the AR glasses may include a window transformation matrix V and a perspective projection matrix P, and the position information of the navigation instruction information in the virtual world coordinate system may be converted into the pixel coordinate system based on the window transformation matrix V and the perspective projection matrix P.
Specifically, the position point Q of the navigation instruction information in the virtual world coordinate system can be converted into the pixel coordinate system according to the following formula, giving the pixel point p_2 on the optical lens of the AR glasses corresponding to the position point Q:

p_2 = V · P · Q = V · P · K · T_A · M_2    (3)

Here, multiplying Q by P converts the position point Q in the virtual world coordinate system into a point in the image coordinate system, and multiplying P · Q by V converts the point in the image coordinate system into a pixel point in the pixel coordinate system.
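A short sketch of formulas (2) and (3), with illustrative values for the pose T_A, the position point M_2 and the scale factor K; the matrices V and P are as in the sketch for formula (1) above:

```python
import numpy as np

def pose(R, t):
    """Homogeneous 4x4 pose matrix [R t; 0 1]."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

K = 1.0                                            # virtual world <-> environment scale
T_A = pose(np.eye(3), np.array([0.0, 1.2, 0.0]))   # current pose of the AR glasses
M2 = np.array([2.0, 0.0, -15.0, 1.0])              # navigation icon, environment frame

Q = T_A @ M2       # formula (2): Q = K . T_A . M_2,
Q[:3] *= K         # with K applied to the Cartesian part of the point
print(Q)           # formula (3) then gives p_2 = V . P . Q
```

Since K is a scalar, the sketch applies it to the Cartesian components of the homogeneous point so that the homogeneous coordinate stays 1.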
Next, a process of determining a pixel point corresponding to the target object indication information on the optical lens of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses is described.
Based on the data related to the first type of information and the current pose of the AR glasses, the process of determining a pixel point of the target object indication information corresponding to the optical lens of the AR glasses may include: and determining pixel points corresponding to the target object prompt information on the optical lenses of the AR glasses based on the current pose of the AR glasses, a pose matrix of the target object relative to an ADAS coordinate system, a transformation matrix of the ADAS camera coordinate system relative to a DMS coordinate system of the driver detection system, a transformation matrix of the head coordinate system of the driver relative to the DMS coordinate system, a transformation matrix of the head coordinate system of the driver relative to the coordinate system of the AR glasses, a scaling factor between the virtual world coordinate system and the environment coordinate system and a transformation matrix determined by the parameters of the AR glasses.
Specifically, please refer to fig. 6, which shows a schematic flow chart of determining the corresponding pixel point of the target object prompt information on the optical lens of the AR glasses based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses; the process may include:
step S601: and determining the current pose of the target object relative to the coordinate system of the AR glasses as the position information of the prompt information of the target object under the environment coordinate based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the coordinate system of the driver detection system DMS, the transformation matrix of the head coordinate system of the driver relative to the coordinate system of the DMS and the transformation matrix of the head coordinate system of the driver relative to the coordinate system of the AR glasses.
It should be noted that the pose matrix of the target object with respect to the ADAS coordinate system is included in the data related to the first type of information, and the pose matrix of the target object with respect to the ADAS coordinate system is determined by the ADAS according to the image of the target object acquired by the ADAS.
Specifically, the current pose of the target object relative to the AR glasses coordinate system (i.e., the position information M_3 of the target object prompt information in the environment coordinates) can be determined according to the following formula:

M_3 = T_A · T_H^A · (T_H^DMS)^(-1) · T_ADAS^DMS · T_O^ADAS    (4)

where T_O^ADAS represents the pose matrix of the target object relative to the ADAS coordinate system, T_ADAS^DMS represents the transformation matrix of the ADAS camera coordinate system relative to the driver detection system DMS coordinate system, T_H^DMS represents the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, so that (T_H^DMS)^(-1) represents the transformation matrix of the DMS coordinate system relative to the driver head coordinate system, T_H^A represents the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system, and T_A represents the current pose of the AR glasses.

It should be noted that T_ADAS^DMS · T_O^ADAS represents the pose of the target object relative to the DMS coordinate system, (T_H^DMS)^(-1) · T_ADAS^DMS · T_O^ADAS represents the pose of the target object relative to the driver head coordinate system, T_H^A · (T_H^DMS)^(-1) · T_ADAS^DMS · T_O^ADAS represents the initial pose of the target object relative to the AR glasses, and the full product M_3 represents the current pose of the target object relative to the AR glasses coordinate system.
Step S602: and converting the position information of the target object prompt information under the environment coordinate into the position information of the target object prompt information under the virtual world coordinate based on the scale factor between the virtual world coordinate system and the environment coordinate system to obtain the position information of the target object prompt information under the virtual world coordinate.
Specifically, the position point M_3 of the target object prompt information in the environment coordinate system can be converted into the virtual world coordinate system according to the following formula:

R = K · T_A · M_3    (5)

where R represents the position point in the virtual world coordinate system corresponding to the position point M_3 in the environment coordinate system, K represents the scale factor between the virtual world coordinate system and the environment coordinate system, and T_A represents the current pose of the AR glasses.
Step S603: and based on a transformation matrix determined by the parameters of the AR glasses, converting the position information of the target object prompt information under the virtual world coordinate into a pixel coordinate system to obtain a pixel point corresponding to the target object prompt information on an optical lens of the AR glasses.
Specifically, the position point R in the virtual world coordinate system can be converted into the pixel coordinate system according to the following formula:

p_3 = V · P · R    (6)

where p_3 represents the pixel point in the pixel coordinate system corresponding to the position point R in the virtual world coordinate system, i.e., the pixel point corresponding to the target object prompt information on the optical lens of the AR glasses.
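The transformation chain of formulas (4) to (6) can be sketched as follows; all poses here are illustrative pure translations rather than calibration values from the patent:

```python
import numpy as np

# Hedged sketch of the chain in formulas (4)-(6). All poses are illustrative
# pure translations, not calibration values from the patent.

def trans(tx, ty, tz):
    """Pure-translation homogeneous pose."""
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

T_O_ADAS = trans(0.5, 0.0, 20.0)     # target object w.r.t. the ADAS coordinate system
T_ADAS_DMS = trans(0.0, 0.3, -1.5)   # ADAS camera coordinate system w.r.t. DMS
T_H_DMS = trans(0.0, 0.1, -0.6)      # driver head coordinate system w.r.t. DMS
T_H_A = trans(0.0, -0.08, 0.1)       # driver head coordinate system w.r.t. AR glasses
T_A = np.eye(4)                      # current pose of the AR glasses

# Formula (4): ADAS -> DMS -> driver head -> AR glasses, then the current pose.
M3 = T_A @ T_H_A @ np.linalg.inv(T_H_DMS) @ T_ADAS_DMS @ T_O_ADAS

# Formula (5): take the position of M3 and convert it to the virtual world frame.
K = 1.0
R_pt = T_A @ M3[:, 3]                # position column of the pose M3
R_pt[:3] *= K                        # R = K . T_A . M_3

print(R_pt)                          # formula (6): p_3 = V . P . R, as for formula (1)
```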
Next, taking the case where the target object prompt information is an obstacle warning icon as an example, the determination of the pixel point corresponding to the target object prompt information on the optical lens of the AR glasses is described.
By parsing the message acquired from the ADAS, the pose matrix T_O^ADAS of the obstacle in front of the vehicle relative to the ADAS coordinate system can be obtained; this pose matrix is the position information of the obstacle warning icon in the ADAS coordinate system. The transformation matrix T_ADAS^DMS of the ADAS camera coordinate system relative to the DMS coordinate system, the transformation matrix T_H^DMS of the driver head coordinate system relative to the DMS coordinate system, and the transformation matrix T_H^A of the driver head coordinate system relative to the AR glasses coordinate system are likewise known (the specific numeric matrices of the original filing are omitted here).

First, the position information M_3 of the obstacle warning icon in the environment coordinates is determined according to formula (4).
then, by
Figure BDA0003712363200000145
And converting the position information of the obstacle warning icon in the environment coordinate system into the virtual world coordinate system to obtain the position information of the obstacle warning icon in the virtual world coordinate system.
Finally, the perspective projection matrix P is calculated from the 50-degree FOV of the AR glasses and the near and far planes of 0.1 m and 500 m respectively, the window transformation matrix V is calculated from the 720p resolution, and the pixel point p_3 on the optical lens of the AR glasses corresponding to the obstacle warning icon displayed on the display interface is calculated according to formula (6).
Next, the manner of obtaining the above-mentioned transformation matrices T_ADAS^DMS, T_H^DMS and T_H^A is introduced.
The transformation matrix T_ADAS^DMS of the ADAS camera coordinate system relative to the driver detection system DMS coordinate system can be determined from the pose of the DMS on the vehicle and the pose of the ADAS camera on the vehicle. Specifically, it can be determined according to the following formula:

T_ADAS^DMS = (M_DMS)^(-1) · M_Camera    (7)

where M_DMS represents the pose of the DMS on the vehicle and M_Camera represents the pose of the ADAS camera on the vehicle; the poses of the DMS camera and the ADAS camera are both expressed in the environment coordinate system. It should be noted that the mounting positions and angles of the DMS camera and the ADAS camera on the vehicle are fixed, and therefore M_DMS and M_Camera are fixed values.
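A minimal sketch of formula (7), with illustrative mounting poses standing in for the fixed values M_DMS and M_Camera:

```python
import numpy as np

# Hedged sketch of formula (7): the transformation of the ADAS camera
# coordinate system relative to the DMS coordinate system, from the two
# fixed mounting poses (both in the environment coordinate system). The
# numeric poses are illustrative assumptions.

def trans(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

M_DMS = trans(0.4, 1.2, 1.8)      # pose of the DMS camera on the vehicle
M_Camera = trans(0.0, 1.5, 0.2)   # pose of the ADAS camera on the vehicle

T_ADAS_DMS = np.linalg.inv(M_DMS) @ M_Camera   # formula (7)
print(T_ADAS_DMS)
```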
The transformation matrix T_H^DMS of the driver head coordinate system relative to the DMS coordinate system can be determined from the driver head image acquired by the DMS camera and the parameters of the DMS camera.
The transformation matrix T_H^A of the driver head coordinate system relative to the AR glasses coordinate system can be determined from the positional relation between the origin of the driver head coordinate system and the origin of the AR glasses coordinate system. As shown in fig. 4, the origin O_H of the driver head coordinate system is the midpoint of the line connecting the driver's two eyes, and the Z_H axis of the driver head coordinate system is the direction the face points toward; for the AR glasses coordinate system, the origin O_A is at the optical center of its camera (a camera is arranged at the center of the AR glasses), and the Z_A axis can be considered to coincide with the Z_H axis. The AR glasses coordinate system can therefore be regarded as the driver head coordinate system with its origin translated from O_H to O_A. In view of this, the transformation matrix T_H^A from the driver head coordinate system to the AR glasses coordinate system can be expressed as:

T_H^A = [ I  t_HA ]
        [ 0    1  ]    (8)

where t_HA represents the translation from O_H to O_A, which can be obtained by measurement, and I represents the identity matrix. It should be noted that t_HA can be measured manually from the approximate position of the glasses and can be regarded as a constant value; the error in the driver-head-to-glasses distance caused by individual differences is negligible.
The design of the transformation matrix T_H^A takes into account the distance between the driver's eyes and the AR glasses coordinate system; with this factor considered, the finally generated display interface fits the real environment better and the driver experience is better.
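Formula (8) can be sketched as follows; the offset t_HA below is an illustrative measurement, not a value from the patent:

```python
import numpy as np

# Hedged sketch of formula (8): T_H^A is a pure translation by t_HA because
# the rotation between the driver-head and AR-glasses coordinate systems is
# taken as the identity.
t_HA = np.array([0.0, 0.06, 0.11])   # translation from O_H to O_A, in metres

T_H_A = np.eye(4)                    # [[I, t_HA], [0, 1]]
T_H_A[:3, 3] = t_HA
print(T_H_A)
```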
Fourth embodiment
The following describes the driving information display device provided by the embodiment of the present invention, and the driving information display device described below and the driving information display method described above may be referred to in correspondence with each other.
Referring to fig. 7, a schematic structural diagram of a driving information display device according to an embodiment of the present invention is shown, which may include: a data acquisition module 701, a display interface generation module 702 and a display interface sending module 703.
The data obtaining module 701 is configured to obtain driving data related to target driving information, where the target driving information is driving information to be displayed.
A display interface generating module 702, configured to generate a display interface including the target driving information based on driving data related to the target driving information and preset information used for generating the interface.
And the display interface sending module 703 is configured to send the display interface to the augmented reality (AR) glasses for display.
Optionally, the data obtaining module 701 includes: the device comprises a message acquisition module, a message screening module and a message analysis module.
The message acquisition module is used for acquiring driving environment data messages from an Advanced Driving Assistance System (ADAS) arranged on a vehicle and/or acquiring vehicle body data messages and/or navigation data messages from the vehicle.
And the message screening module is used for screening out the message related to the target driving information from the acquired messages.
And the message analysis module is used for analyzing the driving data related to the target driving information from the screened messages.
Optionally, the target driving information includes first type information and/or second type information, the display of the first type information is related to the environment, and the display of the second type information is unrelated to the environment.
The display interface generation module 702 may include: the pixel point determining submodule and the display interface generating submodule.
The pixel point determining submodule is used for determining a pixel point corresponding to the first type of information on an optical lens of the AR glasses based on data related to the first type of information and the current pose of the AR glasses; and/or determining corresponding pixel points of the preset position information aiming at the second type of information on the optical lens of the AR glasses, wherein the preset position information aiming at the second type of information is the position information of the second type of information under a virtual world coordinate system;
and the display interface generation submodule is used for generating a display interface containing the target driving information according to the determined pixel points, the driving data related to the target driving information and the information for generating the interface.
Optionally, the first type of information includes navigation instruction information, and the data related to the first type of information includes location information of the navigation instruction information in an environment coordinate system, and when the pixel point determining submodule determines, based on the data related to the first type of information and the current pose of the AR glasses, a corresponding pixel point of the first type of information on an optical lens of the AR glasses, the pixel point determining submodule is specifically configured to:
and determining a corresponding pixel point of the navigation indication information on an optical lens of the AR glasses based on the current pose of the AR glasses, the position information of the navigation indication information in an environment coordinate system, a scale factor between a virtual world coordinate system and the environment coordinate system and a transformation matrix determined by the parameters of the AR glasses.
Optionally, when the pixel point determining submodule determines the corresponding pixel point of the navigation instruction information on the optical lens of the AR glasses based on the current pose of the AR glasses, the position information of the navigation instruction information in the environment coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameter of the AR glasses, the pixel point determining submodule is specifically configured to:
determining the position information of the navigation indication information in the virtual world coordinate system based on the current pose of the AR glasses, the position information of the navigation indication information in the environment coordinate system and a scale factor between the virtual world coordinate system and the environment coordinate system;
and converting the position information of the navigation indication information in the virtual world coordinate system into a pixel coordinate system based on a transformation matrix determined by the parameters of the AR glasses to obtain a pixel point of the navigation indication information corresponding to the optical lens of the AR glasses.
Optionally, the first type of information includes target object prompt information, the data related to the first type of information includes a pose matrix of the target object with respect to an ADAS coordinate system, and when the pixel point determining submodule determines a corresponding pixel point of the first type of information on an optical lens of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses, the pixel point determining submodule is specifically configured to:
and determining corresponding pixel points of target object prompt information on optical lenses of the AR glasses based on the current pose of the AR glasses, a pose matrix of the target object relative to an ADAS coordinate system, a transformation matrix of the ADAS camera coordinate system relative to a DMS coordinate system of a driver detection system, a transformation matrix of a head coordinate system of the driver relative to the DMS coordinate system, a transformation matrix of the head coordinate system of the driver relative to a coordinate system of the AR glasses, a scale factor between a virtual world coordinate system and an environment coordinate system and a transformation matrix determined by parameters of the AR glasses.
Optionally, when determining the pixel point corresponding to the target object prompt information on the optical lens of the AR glasses based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses, the pixel point determining submodule is specifically configured to:
determine the current pose of the target object relative to the AR glasses coordinate system, as the position information of the target object prompt information in the environment coordinate system, based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, and the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system;
convert the position information of the target object prompt information in the environment coordinate system into the virtual world coordinate system based on the scale factor between the virtual world coordinate system and the environment coordinate system, to obtain the position information of the target object prompt information in the virtual world coordinate system; and
convert the position information of the target object prompt information in the virtual world coordinate system into the pixel coordinate system based on the transformation matrix determined by the parameters of the AR glasses, to obtain the pixel point corresponding to the target object prompt information on the optical lens of the AR glasses.
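As a worked illustration of this transform chain, the Python sketch below composes the matrices step by step. Every T is a 4x4 homogeneous transform; the subscript convention (T_a_b reads "b expressed in frame a") and the assumption that the glasses' current pose is already folded into T_ar_head are ours, and s and K play the same illustrative roles as in the previous sketch.

import numpy as np

def target_prompt_pixel(T_adas_obj, T_dms_adas, T_dms_head, T_ar_head, s, K):
    """Chain the pose and transformation matrices to get the target
    object's pixel point on the AR-glasses optical lens."""
    # Target pose in the DMS frame, via the ADAS-camera-to-DMS transform.
    T_dms_obj = T_dms_adas @ T_adas_obj
    # Re-express in the driver-head frame, then in the AR-glasses frame;
    # this serves as the prompt's position information in environment terms.
    T_ar_obj = T_ar_head @ np.linalg.inv(T_dms_head) @ T_dms_obj
    # Scale into the virtual world, then project into the pixel
    # coordinate system with the glasses-parameter matrix K.
    p_virtual = s * T_ar_obj[:3, 3]
    u, v, w = K @ p_virtual
    return np.array([u / w, v / w])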
Optionally, the driving information display method may further include:
when an interface adjusting instruction sent by the AR glasses is received, adjusting the display interface according to the interface adjusting instruction; and sending the adjusted display interface to the AR glasses for display.
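A hedged sketch of this adjustment path follows; the instruction format (a dict with "type" and "value" fields) and the deliberately simple interface model are assumptions for illustration, not structures defined by the patent.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DisplayInterface:
    brightness: float            # illustrative adjustable attribute
    offset_px: tuple = (0, 0)    # (x, y) shift of the rendered interface

def adjust_interface(interface, instruction):
    """Apply an interface-adjusting instruction received from the AR
    glasses; the adjusted interface is then sent back for display."""
    if instruction["type"] == "brightness":
        return replace(interface, brightness=float(instruction["value"]))
    if instruction["type"] == "move":
        return replace(interface, offset_px=tuple(instruction["value"]))
    return interface             # unrecognized instructions are ignored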
The driving information display device provided by the embodiment of the invention first acquires the driving data related to the target driving information (that is, the driving information to be displayed), then generates a display interface containing the target driving information based on that driving data and the preset information for generating the interface, and finally sends the generated display interface to the AR glasses for display. Because the display interface is shown on the AR glasses rather than projected onto the front windshield by an optical-mechanical system, no bulky optical-mechanical system needs to be arranged under the front windshield, which relieves the crowded space in front of the cockpit; moreover, the AR glasses adopt a mixed waveguide technology and can provide a viewing angle of more than 50 degrees, so the display interface can be richer and clearer, and the user experience is better. In addition, the current pose of the AR glasses is taken into account when determining the pixel points corresponding to the first type of information on the optical lenses of the AR glasses, so the finally generated display interface matches the environment.
Fifth embodiment
An embodiment of the present invention further provides a data processing device. Referring to fig. 8, which shows a schematic structural diagram of the data processing device, the data processing device may include: at least one processor 801, at least one communication interface 802, at least one memory 803, and at least one communication bus 804;
in the embodiment of the present invention, there is at least one of each of the processor 801, the communication interface 802, the memory 803, and the communication bus 804, and the processor 801, the communication interface 802, and the memory 803 communicate with one another through the communication bus 804;
the processor 801 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), one or more integrated circuits configured to implement the embodiments of the present invention, or the like;
the memory 803 may include a high-speed RAM, and may further include a non-volatile memory such as at least one disk storage;
wherein the memory stores a program, the processor may invoke the program stored in the memory, and the program is configured to:
acquire driving data related to target driving information, wherein the target driving information is driving information to be displayed;
generate a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface; and
send the display interface to augmented reality (AR) glasses for display.
Optionally, the detailed functions and extended functions of the program may be as described above.
Sixth embodiment
An embodiment of the present invention further provides a readable storage medium, which may store a program adapted to be executed by a processor, the program being configured to:
acquire driving data related to target driving information, wherein the target driving information is driving information to be displayed;
generate a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface; and
send the display interface to augmented reality (AR) glasses for display.
Optionally, the detailed functions and extended functions of the program may be as described above.
Seventh embodiment
An embodiment of the present invention provides a driving information display system. Referring to fig. 9, which shows a schematic structural diagram of the driving information display system, the system may include: a data processing device 901 and AR glasses 902.
The data processing device 901 is configured to: acquire driving data related to target driving information, where the target driving information is driving information to be displayed; generate a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface; and send the display interface to the AR glasses 902.
As shown in fig. 9, the data processing device 901 can communicate through the CAN bus with the ADAS and the DMS provided on the vehicle, and can also communicate with the vehicle itself through the CAN bus. When acquiring the data related to the driving information to be displayed, the data processing device 901 may acquire driving environment data messages from the ADAS through the CAN bus, and/or acquire vehicle body data messages and/or navigation data messages from the vehicle through the CAN bus; it may then screen out the messages related to the driving information to be displayed from the acquired messages, and parse the driving data related to the driving information to be displayed from the screened messages, as sketched below.
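The CAN arbitration IDs, the payload layout, and the speed scaling in the following sketch are invented for illustration, since the patent does not fix any message format.

RELEVANT_IDS = {                       # assumed arbitration IDs
    0x101: "driving_environment",      # ADAS messages
    0x220: "vehicle_body",
    0x330: "navigation",
}

def screen_and_parse(messages):
    """Screen raw (msg_id, payload) CAN messages for those related to the
    driving information to be displayed, then parse out the driving data."""
    driving_data = {}
    for msg_id, payload in messages:
        kind = RELEVANT_IDS.get(msg_id)
        if kind is None:
            continue                   # unrelated message, discard
        if kind == "vehicle_body":
            # Example decoding only: speed in two big-endian bytes at
            # 0.01 km/h per bit -- an assumed layout.
            driving_data["speed_kmh"] = int.from_bytes(payload[:2], "big") * 0.01
        else:
            driving_data[kind] = payload
    return driving_data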
Optionally, the target driving information may include a first type of information and/or a second type of information, the first type of information being displayed in relation to the environment and the second type of information being displayed independently of the environment.
When generating the display interface containing the target driving information based on the driving data related to the target driving information and the preset information for generating the interface, the data processing device 901 is specifically configured to:
determine the pixel points corresponding to the first type of information on the optical lenses of the AR glasses 902 based on the data related to the first type of information and the current pose of the AR glasses 902; and/or determine the pixel points corresponding to the preset position information for the second type of information on the optical lenses of the AR glasses 902, where the preset position information for the second type of information is the position information of the second type of information in the virtual world coordinate system; and generate the display interface containing the target driving information according to the determined pixel points, the driving data related to the target driving information, and the information for generating the interface.
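The second-type path is simpler than the first: the position is already preset in the virtual world coordinate system, so only the transformation determined by the glasses' parameters applies. A sketch, reusing the assumed pinhole-style matrix K from the earlier examples and a plain dict as a stand-in for the interface template:

import numpy as np

def fixed_element_pixel(p_virtual_preset, K):
    """Project a second-type element, whose position is preset in the
    virtual world coordinate system, straight to pixel coordinates."""
    u, v, w = K @ np.asarray(p_virtual_preset, dtype=float)
    return np.array([u / w, v / w])

def compose_interface(pixel_points, driving_data, template):
    """Combine the determined pixel points and the parsed driving data
    with the preset interface-generation information to produce the
    display interface (all structures here are illustrative)."""
    return {**template, "anchors": pixel_points, "data": driving_data}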
Optionally, the first type of information may include navigation indication information, and the data related to the first type of information may include the position information of the navigation indication information in the environment coordinate system. When determining, based on the data related to the first type of information and the current pose of the AR glasses 902, the pixel point corresponding to the first type of information on the optical lens of the AR glasses 902, the data processing device 901 is specifically configured to:
determine the pixel point corresponding to the navigation indication information on the optical lens of the AR glasses 902 based on the current pose of the AR glasses 902, the position information of the navigation indication information in the environment coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses 902.
Optionally, the first type of information may include target object prompt information, and the data related to the first type of information may include a pose matrix of the target object relative to the ADAS coordinate system. When determining, based on the data related to the first type of information and the current pose of the AR glasses 902, the pixel point corresponding to the first type of information on the optical lens of the AR glasses 902, the data processing device 901 is specifically configured to:
determine the pixel point corresponding to the target object prompt information on the optical lenses of the AR glasses 902 based on the current pose of the AR glasses 902, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the driver detection system (DMS) coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the AR glasses 902 coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses 902.
It should be noted that the pose matrix of the target object relative to the ADAS coordinate system is determined by the ADAS from the image of the target object acquired by its camera, and is then obtained by the data processing device 901 by parsing the messages acquired from the ADAS. The transformation matrix of the driver head coordinate system relative to the DMS coordinate system is determined by the data processing device 901 from the driver head image acquired from the DMS. The transformation matrix of the driver head coordinate system relative to the AR glasses 902 coordinate system is determined by the data processing device 901 from the origin of the driver head coordinate system and the origin of the AR glasses 902 coordinate system, where the origin of the driver head coordinate system is acquired from the DMS.
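One plausible reading of this origin-based construction is a translation-only transform, sketched below; treating the head and glasses axes as aligned (identity rotation) is our assumption, not something the patent states.

import numpy as np

def head_relative_to_glasses(origin_head, origin_glasses):
    """Build the 4x4 transform of the driver-head coordinate system
    relative to the AR-glasses coordinate system from the two origins,
    both assumed to be expressed in one common frame with aligned axes."""
    T = np.eye(4)
    T[:3, 3] = np.asarray(origin_head) - np.asarray(origin_glasses)
    return T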
When determining the pixel points corresponding to the preset position information for the second type of information on the optical lenses of the AR glasses 902, the data processing device 901 is specifically configured to determine those pixel points based on the transformation matrix determined by the parameters of the AR glasses 902.
It should be noted that, for a more detailed implementation of the process in which the data processing device 901 generates the display interface containing the target driving information based on the driving data related to the target driving information and the preset information for generating the interface, reference may be made to the foregoing embodiments; details are not repeated here.
After the data processing device 901 generates the display interface, it may transmit the generated display interface to the AR glasses 902 in real time (optionally, in the form of an HDMI signal). The AR glasses 902 display the received interface, and the driver then sees the display interface at a position D meters ahead (for example, D = 8) through the AR glasses 902, where D is the picture depth and is determined by the AR glasses 902.
After the AR glasses 902 display the interface, the driver can interact with the AR glasses 902 through voice, gesture, or touch instructions. After acquiring an instruction from the driver, the AR glasses 902 may transmit it to the data processing device through a wireless link such as Bluetooth; the data processing device adjusts the display interface according to the instruction and transmits the adjusted display interface to the AR glasses 902, which then display it.
Finally, it should also be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A driving information display method is characterized by comprising the following steps:
acquiring driving data related to target driving information, wherein the target driving information is driving information to be displayed;
generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface;
and sending the display interface to augmented reality (AR) glasses for display.
2. The driving information display method according to claim 1, wherein the acquiring driving data related to the target driving information includes:
acquiring a driving environment data message from an Advanced Driving Assistance System (ADAS) arranged on a vehicle, and/or acquiring a vehicle body data message and/or a navigation data message from the vehicle;
screening out messages related to the target driving information from the obtained messages;
and parsing the driving data related to the target driving information from the screened messages.
3. The driving information display method according to claim 2, wherein the target driving information comprises a first type of information and/or a second type of information, the first type of information is displayed in relation to the environment, and the second type of information is displayed independently of the environment;
the generating of the display interface containing the target driving information based on the driving data related to the target driving information and the preset information for generating the interface includes:
determining a pixel point corresponding to the first type of information on an optical lens of the AR glasses based on the data related to the first type of information and the current pose of the AR glasses;
and/or determining corresponding pixel points of preset position information for the second type of information on the optical lens of the AR glasses, wherein the preset position information for the second type of information is the position information of the second type of information in a virtual world coordinate system;
and generating a display interface containing the target driving information according to the determined pixel points, the driving data related to the target driving information and the information for generating the interface.
4. The driving information display method according to claim 3, wherein the first type of information includes navigation indication information, and the data related to the first type of information includes position information of the navigation indication information in an environment coordinate system;
the determining, based on the data related to the first type of information and the current pose of the AR glasses, a corresponding pixel point of the first type of information on an optical lens of the AR glasses includes:
and determining a corresponding pixel point of the navigation indication information on an optical lens of the AR glasses based on the current pose of the AR glasses, the position information of the navigation indication information in the environment coordinate system, a scale factor between a virtual world coordinate system and the environment coordinate system, and a transformation matrix determined by the parameters of the AR glasses.
5. The driving information display method according to claim 4, wherein the determining of the corresponding pixel point of the navigation indication information on the optical lens of the AR glasses based on the current pose of the AR glasses, the position information of the navigation indication information in the environment coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses comprises:
determining the position information of the navigation indication information in the virtual world coordinate system based on the current pose of the AR glasses, the position information of the navigation indication information in the environment coordinate system and a scale factor between the virtual world coordinate system and the environment coordinate system;
and converting the position information of the navigation indication information in the virtual world coordinate system into a pixel coordinate system based on the transformation matrix determined by the parameters of the AR glasses, to obtain the pixel point corresponding to the navigation indication information on the optical lens of the AR glasses.
6. The driving information display method according to claim 3, wherein the first type of information includes target object prompt information, and the data related to the first type of information includes a pose matrix of the target object with respect to an ADAS coordinate system;
the determining, based on the data related to the first type of information and the current pose of the AR glasses, a corresponding pixel point of the first type of information on an optical lens of the AR glasses includes:
and determining corresponding pixel points of the target object prompt information on the optical lenses of the AR glasses based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, a transformation matrix of the ADAS camera coordinate system relative to a driver detection system (DMS) coordinate system, a transformation matrix of the driver head coordinate system relative to the DMS coordinate system, a transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system, a scale factor between a virtual world coordinate system and an environment coordinate system, and a transformation matrix determined by the parameters of the AR glasses.
7. The driving information display method according to claim 6, wherein the determining of the corresponding pixel points of the target object prompt information on the optical lenses of the AR glasses based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the driver detection system (DMS) coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system, the scale factor between the virtual world coordinate system and the environment coordinate system, and the transformation matrix determined by the parameters of the AR glasses comprises:
determining the current pose of the target object relative to the AR glasses coordinate system, as position information of the target object prompt information in the environment coordinate system, based on the current pose of the AR glasses, the pose matrix of the target object relative to the ADAS coordinate system, the transformation matrix of the ADAS camera coordinate system relative to the driver detection system (DMS) coordinate system, the transformation matrix of the driver head coordinate system relative to the DMS coordinate system, and the transformation matrix of the driver head coordinate system relative to the AR glasses coordinate system;
converting the position information of the target object prompt information in the environment coordinate system into the virtual world coordinate system based on the scale factor between the virtual world coordinate system and the environment coordinate system, to obtain the position information of the target object prompt information in the virtual world coordinate system;
and converting the position information of the target object prompt information in the virtual world coordinate system into a pixel coordinate system based on the transformation matrix determined by the parameters of the AR glasses, to obtain the pixel point of the target object prompt information on the optical lens of the AR glasses.
8. The driving information display method according to any one of claims 1 to 7, further comprising:
when an interface adjusting instruction sent by the AR glasses is received, adjusting the display interface according to the interface adjusting instruction;
and sending the adjusted display interface to the AR glasses for display.
9. A driving information display device, comprising: a data acquisition module, a display interface generation module, and a display interface sending module;
the data acquisition module is used for acquiring driving data related to target driving information, wherein the target driving information is driving information to be displayed;
the display interface generating module is used for generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface;
and the display interface sending module is used for sending the display interface to the AR glasses for display.
10. A driving information display system, comprising: a data processing device and augmented reality (AR) glasses;
the data processing device is used for acquiring driving data related to target driving information, wherein the target driving information is driving information to be displayed, generating a display interface containing the target driving information based on the driving data related to the target driving information and preset information for generating the interface, and sending the display interface to the AR glasses;
and the AR glasses are used for displaying the display interface generated by the data processing equipment.
CN202210723101.6A 2022-06-24 2022-06-24 Driving information display method, device and system Active CN115097628B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210723101.6A CN115097628B (en) 2022-06-24 2022-06-24 Driving information display method, device and system

Publications (2)

Publication Number Publication Date
CN115097628A true CN115097628A (en) 2022-09-23
CN115097628B CN115097628B (en) 2024-05-07

Family

ID=83292714

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210723101.6A Active CN115097628B (en) 2022-06-24 2022-06-24 Driving information display method, device and system

Country Status (1)

Country Link
CN (1) CN115097628B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN207164368U (en) * 2017-08-31 2018-03-30 北京新能源汽车股份有限公司 Vehicle-mounted augmented reality system
CN109649275A (en) * 2018-11-29 2019-04-19 福瑞泰克智能系统有限公司 A kind of driving assistance system and method based on augmented reality
CN110203140A (en) * 2019-06-28 2019-09-06 威马智慧出行科技(上海)有限公司 Automobile augmented reality display methods, electronic equipment, system and automobile
CN113147596A (en) * 2021-04-13 2021-07-23 一汽奔腾轿车有限公司 AR technology-based head-mounted vehicle-mounted information display system and method
WO2021197189A1 (en) * 2020-03-31 2021-10-07 深圳光峰科技股份有限公司 Augmented reality-based information display method, system and apparatus, and projection device
CN114489332A (en) * 2022-01-07 2022-05-13 北京经纬恒润科技股份有限公司 Display method and system of AR-HUD output information

Also Published As

Publication number Publication date
CN115097628B (en) 2024-05-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant