CN111121815B - Path display method, system and computer storage medium based on AR-HUD navigation - Google Patents


Info

Publication number: CN111121815B
Authority: CN (China)
Prior art keywords: navigation, path, points, calculating, data
Legal status: Active
Application number: CN201911378417.0A
Other languages: Chinese (zh)
Other versions: CN111121815A
Inventors: 曾繁华, 伍跃洪, 孙欣然, 李万超, 旷璨
Current Assignee: Chongqing Lilong Zhongbao Intelligent Technology Co ltd
Original Assignee: Chongqing Lilong Zhongbao Intelligent Technology Co ltd
Application filed by Chongqing Lilong Zhongbao Intelligent Technology Co ltd
Priority to CN201911378417.0A
Publication of CN111121815A; application granted and published as CN111121815B

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 — Navigation specially adapted for navigation in a road network
    • G01C21/34 — Route searching; Route guidance
    • G01C21/3407 — Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 — Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents
    • G01C21/36 — Input/output arrangements for on-board computers
    • G01C21/3605 — Destination input or retrieval
    • G01C21/3626 — Details of the output of route guidance instructions
    • G01C21/3635 — Guidance using 3D or perspective road maps
    • G01C21/3667 — Display of a road map
    • G01C21/3691 — Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G01C21/3694 — Output thereof on a road map
    • Y02T90/00 — Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation


Abstract

The invention discloses a path display method, system, and computer storage medium based on AR-HUD navigation, belonging to the technical field of vehicle navigation, comprising the following steps: S1, obtaining navigation data of a navigation path, the navigation data comprising a distance offset and the corresponding gradient and rotation degree; S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points in vehicle-body coordinates; and S3, drawing a three-dimensional graphic of a navigation arrow according to the three-dimensional positions, and projecting the three-dimensional graphic onto a head-up display. The invention solves the prior-art problems that a navigation track cannot be generated in real time from the gradient and rotation degree, so that the navigation arrow does not conform to the actual road conditions.

Description

Path display method, system and computer storage medium based on AR-HUD navigation
Technical Field
The invention relates to the technical field of vehicle navigation, in particular to a path display method, a system and a computer storage medium based on AR-HUD navigation.
Background
The continuous development of automobile technology has brought people into ever closer contact with automobiles, and the demands placed on automobile functions keep increasing. In existing car navigation technology, navigation is mostly performed through applications installed in the in-vehicle infotainment system. One method collects scene images shot by the vehicle in real time and marks arrows on the collected images to indicate the road the vehicle should take, but this method requires the driver to watch navigation information on a monitor screen, which easily distracts the driver's attention and makes driving unsafe.
In the prior art, arrow patterns projected onto the front windshield by AR-HUD technology are beginning to be used to indicate the navigation direction of the drive. However, an arrow generated in this way cannot produce a navigation track in real time from the gradient and rotation degree, so the navigation arrow does not conform to the actual road conditions. Paths consisting of multiple sections with different gradients and rotation degrees cannot be handled, and computing the whole path is complex and time-consuming.
Disclosure of Invention
The invention aims to solve the prior-art problems that a navigation track cannot be generated in real time from the gradient and rotation degree, and that the navigation arrow does not conform to the actual road conditions.
In order to achieve the above object, the present invention provides the following technical solutions:
In one aspect, the present invention provides a path display method based on AR-HUD navigation, which specifically includes the following steps: S1, obtaining navigation data of a navigation path, the navigation data comprising a distance offset and the corresponding gradient and rotation degree; S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points in vehicle-body coordinates; and S3, drawing a three-dimensional graphic of a navigation arrow according to the three-dimensional positions, and projecting the three-dimensional graphic onto a head-up display.
Further, the step of S2 specifically includes: s21, generating key points of a navigation path according to the navigation data, and calculating key point indexes to be drawn; s22, counting skipped points in two adjacent key points to generate non-key point data; s23, calculating the three-dimensional coordinates of the key points according to the non-key point data.
Preferably, the step S21 specifically includes: s211, calculating the number of path points according to the navigation path and the key points; s212, calculating the distance between the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows; s213, generating path reference points and key point indexes through the number of the path points and the distance between the navigation arrows.
Preferably, the step S23 specifically includes: s231, generating three-dimensional coordinates of the previous key point according to the navigation data of the previous key point; s232, calculating the three-dimensional coordinates of the key points according to the three-dimensional coordinates of the last key point and the non-key point data.
Further, the method further comprises S4: constructing a view matrix under the coordinates of the vehicle body, monitoring the human eye movement track, and updating the view matrix according to the human eye movement track.
Further, the step of S4 specifically includes: s41, constructing a view matrix under the coordinates of the vehicle body through a vehicle-mounted camera; s42, collecting the pupil movement distance of the user and generating a human eye movement track; s43, updating the view matrix according to the human eye movement track.
In another aspect, the invention provides a path display system based on AR-HUD navigation, which comprises the following modules: a data acquisition module for acquiring the distance offset of the navigation path and the navigation data of the corresponding gradient and rotation degree; a position calculation module for generating key points of the navigation path according to the navigation data and calculating the three-dimensional positions of the key points in vehicle-body coordinates; and a graphics generation module for drawing a three-dimensional graphic of the navigation arrow according to the three-dimensional positions and projecting it onto the head-up display.
Preferably, the position calculation module comprises the following units: the index calculation unit is used for calculating the number of the key points to be drawn and the key point index to be drawn; the data statistics unit is used for counting skipped points in two adjacent key points and generating non-key point data; and the coordinate generation unit is used for calculating the three-dimensional coordinates of the key points according to the non-key point data.
Further, the system also comprises an eye movement updating module, wherein the eye movement updating module is used for constructing a view matrix under the coordinates of the vehicle body, monitoring the movement track of human eyes and updating the view matrix according to the movement track of the human eyes; the eye movement updating module specifically comprises the following units: the matrix construction unit is used for constructing a view matrix under the coordinates of the vehicle body through the vehicle-mounted camera; the eye movement detection unit is used for collecting the pupil movement distance of the user and generating a human eye movement track; and the view updating unit is used for updating the view matrix according to the human eye movement track.
Meanwhile, the invention also provides a computer storage medium, wherein the computer storage medium stores a computer program, and the computer program realizes the steps of the path display method based on AR-HUD navigation when being executed by a processor.
Compared with the prior art, the invention has the beneficial effects that:
according to the AR-HUD navigation-based path display method and system, navigation data are acquired and calculated through the method, the technical problem that navigation tracks cannot be generated in real time according to gradient and rotation degree in the prior art, navigation arrows are not attached to actual road conditions is solved, a plurality of navigation arrow icons are dynamically displayed on the navigation paths attached to the actual road conditions, display is more visual, and when the dynamic paths of complex roads are drawn by using the plurality of navigation icons, only gesture information of the number of the used navigation icons is calculated, and the whole path coordinates are not calculated; meanwhile, the method for calculating the navigation data can solve the technical problems that paths with different gradients and degrees of rotation cannot be calculated in the prior art, and the calculation of the whole path is time-consuming, so that the complicated navigation path can be simulated with less resource consumption, and the running speed is improved; the invention also acquires the human eye movement track of the driver and updates the view in real time, so that the technical problem that the image cannot be changed along with the change of the position of the visual angle of the human eye in the prior art can be solved, the navigation path can be calculated according to the real visual angle of the driver, and the navigation accuracy is improved.
Drawings
FIG. 1 is a flow chart of a route display method based on AR-HUD navigation according to the present invention;
FIG. 2 is a schematic diagram of another flow chart of a path display method based on AR-HUD navigation according to the present invention;
FIG. 3 is a schematic diagram of a path display system based on AR-HUD navigation according to the present invention;
FIG. 4 is a schematic diagram of another structure of a path display system based on AR-HUD navigation according to the present invention;
FIG. 5 is an effect diagram of a path display method and system based on AR-HUD navigation according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and embodiments. The scope of the subject matter of the invention should not be construed as limited to the following embodiments; all techniques realized on the basis of the present invention fall within its scope.
The invention relates to a path display method, system, and computer storage medium based on AR-HUD navigation, which combine AR-HUD projection technology: the AR-HUD projection module accurately projects a virtual image into the real environment, helping the driver drive better. The specific embodiments are as follows:
first embodiment
Fig. 1 is a flowchart illustrating a path display method based on AR-HUD navigation according to an exemplary embodiment.
Referring to fig. 1, a path display method based on AR-HUD navigation in this embodiment includes the following steps:
step S1, obtaining navigation data of a navigation path, and obtaining the navigation data of a distance offset, a corresponding gradient and a rotation degree.
Navigation data of the road are acquired in this step so that they can be processed in the next step. The navigation data are obtained from existing navigation applications, such as Google Maps, Amap (Gaode Maps), Baidu Maps, and other conventional navigation applications. The acquired navigation data comprise the distance offset and the corresponding gradient and rotation degree; acquiring them prepares for solving the prior-art problems that a navigation track cannot be generated in real time from the gradient and rotation degree, and that the navigation arrow does not match the actual road conditions.
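As a hedged illustration of the data described above (the field names and values are assumptions for this sketch, not the patent's actual data format), the navigation data can be modeled as a sequence of records, each holding a distance offset plus the gradient and rotation at that offset:

```python
from dataclasses import dataclass

@dataclass
class NavSample:
    """One navigation-data record; field names are hypothetical."""
    distance_offset: float  # metres along the route from the previous sample
    gradient: float         # road slope in degrees at this offset
    rotation: float         # heading change in degrees at this offset

# A short straight segment followed by an uphill right turn.
route = [
    NavSample(0.0, 0.0, 0.0),
    NavSample(2.0, 0.0, 0.0),
    NavSample(2.0, 3.0, 15.0),  # gradient and rotation begin to change
]
```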
And S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points under the coordinates of the vehicle body.
In this step the key points of the navigation path are selected. A path to be navigated carries navigation information such as going straight, turning right, ascending a slope, and entering a roundabout; the points of the navigation path that give direction to the navigation, i.e. the points at which the vehicle changes its direction of travel, are determined as the key points. When multiple navigation icons are used to draw a complex dynamic path, only the information for the number of icons actually used needs to be calculated rather than the coordinates of the whole path, so a complex navigation path can be simulated with little resource consumption. The specific operation steps are as follows:
step S21, calculating the number of the path points, the path reference points and the key point indexes required to be drawn according to the key points. In order to obtain the index of the key point to be drawn, navigation arrows for navigating the key point are needed before each key point, for example, road signs such as turning around, turning right and straight on a guideboard exist when the road signs are on multiple lanes, and the drawing of the arrows is above the path point. The specific operation steps are as follows:
step S211, calculating the number of path points according to the navigation path and the key points.
For a route to be driven, the points at which the driving direction of the vehicle changes are the key points. However, the navigation icons must be drawn before each key point to remind the driver to change direction in time; the points at which navigation marks are drawn are the path points. There are multiple path points between adjacent key points, and the minimum spacing between path points is typically 2 meters, so the number of path points is obtained by dividing the distance between adjacent key points by this minimum spacing. Applying this over the navigation path and its key points yields the number of path points of the entire route to be navigated.
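The division described above can be sketched as follows; the 2 m minimum spacing comes from the text, while the function name and the choice to round the count down to an integer are assumptions of this sketch:

```python
def num_path_points(distance_between_keypoints_m: float,
                    min_spacing_m: float = 2.0) -> int:
    """Number of path points between two adjacent key points, obtained by
    dividing the segment length by the minimum path-point spacing."""
    return int(distance_between_keypoints_m // min_spacing_m)

# A 100 m stretch between key points yields 50 path points.
print(num_path_points(100.0))  # -> 50
```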
Step S212, calculating the distance between the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows.
The spacing between the navigation arrows is calculated according to the number of navigation arrows presented on the UI interface. Using the number of path points calculated in step S211, let n be the distance between two adjacent key points expressed in path points and a the number of navigation arrows; the arrow spacing is then s = n/a.
In step S213, the path reference points and the key point index are generated by the number of path points and the distance between navigation arrows.
The arrow spacing is obtained from step S212, and the path reference points are obtained from the arrow spacing and the path points. The first path reference point is determined as n1; one arrow spacing later comes the second reference point, n2 = n1 + s; the third reference point n3 and the fourth n4 are generated in turn, and so on. The generated path reference points are gathered together to form the key-point index.
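A minimal sketch of this index generation, under two assumptions not stated in the text: the first reference point n1 starts at index 0, and the spacing s = n/a is rounded down to an integer:

```python
def reference_points(n_path_points: int, n_arrows: int) -> list:
    """Indexes n1, n2, ... of the path reference points.
    s = n / a is the arrow spacing from step S212; each reference
    point is one arrow spacing after the previous one."""
    s = n_path_points // n_arrows  # integer arrow spacing (assumption)
    first = 0                      # n1: assumed to start at index 0
    return [first + i * s for i in range(n_arrows)]

print(reference_points(50, 4))  # -> [0, 12, 24, 36]
```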
Step S22, count the path points skipped between two adjacent path reference points to generate non-key-point data.
Step S21 yields the path reference points, and the path points between two adjacent reference points are skipped; that is, the skipped path points between reference points n1 and n2, between n2 and n3, and between n3 and n4 are the points counted in this step. Only the values at the reference points n1, n2, n3, and n4 are needed to draw the navigation arrows. The counting is performed in a loop: for example, in for (i = 0; i < n; i++) over the total number of path points n, a counter is incremented by 1 whenever i is not equal to the index of any navigation arrow on the total path (n1, n2, n3, n4).
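The counting loop described above can be sketched as follows (the function and variable names are illustrative):

```python
def count_skipped(n_path_points: int, ref_points: list) -> int:
    """Count path points that are skipped, i.e. not one of the
    reference points n1..n4, mirroring the loop described above."""
    skipped = 0
    for i in range(n_path_points):   # for (i = 0; i < n; i++)
        if i not in ref_points:
            skipped += 1             # count += 1 for every skipped point
    return skipped

print(count_skipped(50, [0, 12, 24, 36]))  # -> 46
```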
Step S23, calculating three-dimensional coordinates of the path reference point according to the non-key point data.
The parameters from which the three-dimensional coordinate of the next path reference point is calculated are relative values with respect to the current path point; since the calculation cost must be kept down, matrix operations are performed only for the path reference points when computing the three-dimensional coordinates. The method specifically comprises the following steps:
in step S231, three-dimensional data is generated according to the navigation data of the previous path reference point and the three-dimensional coordinates of the previous path reference point.
To calculate the three-dimensional coordinate of path reference point n2, the three-dimensional coordinate of path reference point n1 must be obtained, together with the rotation and displacement of n2 relative to n1 and the other navigation parameters.
Step S232, calculating the three-dimensional position of the path reference point according to the three-dimensional data and the non-key point data.
The coordinate of path reference point n2 is the sum, over the points skipped between n1 and n2, of their parameters, which are the relative parameters of point n2 with respect to point n1, i.e. the non-key-point data generated in the above steps. Since the data are designed so that the parameter difference between consecutive path points is equal, the counted number of skipped points can be used to calculate the three-dimensional coordinate of n2. On the next call, the index of each navigation arrow on the total path advances: n1 becomes n1 + offset, n2 = n1 + s + offset, n3 = n2 + s + offset, and n4 = n3 + s + offset, where s is the arrow spacing. The offset is derived from the non-key-point data; the larger the offset, the faster the navigation arrows move.
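Both calculations can be sketched as follows, under assumptions of this sketch only: every path point carries an equal three-dimensional delta, the per-point delta values are invented for illustration, and arrow indices wrap around the total path:

```python
def next_ref_coord(prev_coord, skipped_count, per_point_delta):
    """3-D coordinate of the next reference point: the previous reference
    point plus one equal-sized delta per path point stepped over.
    per_point_delta is a hypothetical (dx, dy, dz) design constant."""
    steps = skipped_count + 1  # +1: the reference point itself is also stepped
    return tuple(p + steps * d for p, d in zip(prev_coord, per_point_delta))

def advance_indices(refs, offset, n_total):
    """Per-frame animation: every arrow index moves forward by `offset`
    (larger offset -> faster-moving arrows), wrapping on the total path."""
    return [(r + offset) % n_total for r in refs]

n1 = (0.0, 0.0, 0.0)
n2 = next_ref_coord(n1, skipped_count=11, per_point_delta=(2.0, 0.0, 0.5))
print(n2)                                       # -> (24.0, 0.0, 6.0)
print(advance_indices([0, 12, 24, 36], 3, 50))  # -> [3, 15, 27, 39]
```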
And step S3, drawing a three-dimensional graph of the navigation arrow according to the three-dimensional position, and projecting the three-dimensional graph to the head-up display.
The three-dimensional graphic generated by the above steps is finally projected onto the head-up display, i.e. the three-dimensional graphic drawn by the method is ultimately projected in the vehicle.
The path display method based on AR-HUD navigation provided by the invention solves the prior-art problems that a navigation track cannot be generated in real time from the gradient and rotation degree, and that the navigation arrow does not conform to the actual road conditions; multiple navigation-arrow icons are displayed dynamically on a navigation path that conforms to the actual road, making the display more intuitive. Meanwhile, the navigation-data calculation method addresses the prior-art problems that paths with different gradients and rotation degrees cannot be calculated and that computing the whole path is time-consuming; a complex navigation path can be simulated with little resource consumption, improving the running speed.
Second embodiment
Fig. 2 is another flow chart illustrating a path display method based on AR-HUD navigation according to an exemplary embodiment. Referring to fig. 2, a path display method based on AR-HUD navigation in this embodiment includes the following steps:
step S1, obtaining the distance offset of the navigation path and the navigation data of corresponding gradient and rotation degree.
And S2, generating key points of the navigation path according to the navigation data, and calculating the three-dimensional positions of the key points under the coordinates of the vehicle body.
And step S3, drawing a three-dimensional graph of the navigation arrow according to the three-dimensional position, and projecting the three-dimensional graph to the head-up display.
And S4, constructing a view matrix under the coordinates of the vehicle body, monitoring the human eye movement track, and updating the view matrix according to the human eye movement track.
Since steps S1-S3 have been described in detail in the above embodiment, they are not repeated here. In step S4, a view matrix in vehicle-body coordinates is constructed so that the driver sees the navigation arrow pointing correctly regardless of the direction of travel. While the driver drives, the driver's eyeball movement is collected in real time, the eye movement track is monitored, and the view matrix is updated according to that track, so that three-dimensional graphics such as the navigation arrow are updated in real time as the eyes move.
Preferably, step S4 further comprises the steps of:
in step S41, a view matrix under the coordinates of the vehicle body is constructed by the vehicle-mounted camera. The method is used for constructing the view matrix, so that the subsequent real-time updating is facilitated.
Step S42, collecting the pupil movement distance of the user and generating the human eye movement track. The method is used for collecting human eye movement, so that human eye movement tracks can be conveniently generated.
Step S43, update the view matrix according to the eye movement track. The view matrix is updated in real time according to the eye movement track acquired in the previous step, ensuring that the three-dimensional graphic moves correspondingly as the eyes move. The view matrix is a 4x4 matrix calculated with the GLM (OpenGL Mathematics) library from the real-time position of the pupil in the vehicle coordinate system and the pre-calibrated intrinsic and extrinsic camera parameters; it is updated in real time as the pupil moves along its track.
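As an illustrative sketch only: the patent names the GLM library, whose C++ glm::lookAt builds such a 4x4 view matrix; the pure-Python version below mirrors that standard construction. The eye position (tracked pupil in vehicle coordinates), look-at target, and up vector are hypothetical example values, not the patent's calibration data:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def look_at(eye, center, up=(0.0, 0.0, 1.0)):
    """4x4 view matrix in the style of glm::lookAt: `eye` is the tracked
    pupil position in vehicle coordinates, `center` the point looked at."""
    f = normalize(tuple(c - e for c, e in zip(center, eye)))  # forward
    s = normalize(cross(f, up))                               # right
    u = cross(s, f)                                           # true up
    # Row-major 4x4: rotation rows, then the translation column.
    return [
        [s[0],  s[1],  s[2],  -dot(s, eye)],
        [u[0],  u[1],  u[2],  -dot(u, eye)],
        [-f[0], -f[1], -f[2],  dot(f, eye)],
        [0.0,   0.0,   0.0,    1.0],
    ]

# On each eye-tracking update, rebuild the matrix from the new pupil position.
view = look_at(eye=(0.0, -0.3, 1.2), center=(0.0, 10.0, 1.2))
```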
Besides the technical effects shown in the previous embodiment, the path display method based on AR-HUD navigation provided by the invention collects the driver's eye movement track and updates the view in real time. This solves the prior-art problem that the image cannot change with the position of the eye's viewing angle; the navigation path can be calculated from the driver's real viewing angle, improving navigation accuracy.
Third embodiment
In addition to the AR-HUD navigation-based path display method provided by the invention, the invention also provides an AR-HUD navigation-based path display system. As shown in fig. 3, the system comprises the following modules:
the data acquisition module 10 is used for acquiring the distance offset of the navigation path and the navigation data of the corresponding gradient and rotation degree.
The position calculation module 20 is configured to generate key points of the navigation path according to the navigation data, and calculate three-dimensional positions of the key points under the coordinates of the vehicle body.
The graphics generation module 30 is configured to draw a three-dimensional graphics of the navigation arrow according to the three-dimensional position, and project the three-dimensional graphics to the head-up display.
The position calculation module 20 includes the following units:
an index calculation unit 21 for calculating the number of path points, path reference points, and the index of the key points to be drawn from the key points.
The data statistics unit 22 is configured to count skipped path points in two adjacent path reference points, and generate non-key point data.
The coordinate generating unit 23 is used for calculating the three-dimensional coordinates of the path reference point according to the non-key point data.
The units configured in the system execute the corresponding instructions of the above path display method based on AR-HUD navigation, and are not described in detail here.
In the path display system based on AR-HUD navigation provided by this embodiment, navigation data are acquired and processed by the data acquisition module 10 and the position calculation module 20, solving the prior-art problems that a navigation track cannot be generated in real time from the gradient and rotation degree, and that the navigation arrow does not conform to the actual road conditions; multiple navigation-arrow icons are displayed dynamically on a navigation path that conforms to the actual road, making the display more intuitive. Meanwhile, the system addresses the prior-art problems that paths with different gradients and rotation degrees cannot be calculated and that computing the whole path is time-consuming, so a complex navigation path can be simulated with little resource consumption and the running speed is improved.
Fourth embodiment
In addition to the above path display system based on AR-HUD navigation, the invention provides a further path display system based on AR-HUD navigation. As shown in fig. 4, the system comprises the following modules:
the data acquisition module 10 is configured to acquire navigation data of the navigation path, including distance offsets and the corresponding gradient and rotation degree.
The position calculation module 20 is configured to generate key points of the navigation path according to the navigation data, and calculate three-dimensional positions of the key points under the coordinates of the vehicle body.
The graphics generation module 30 is configured to draw the three-dimensional graphics of the navigation arrows according to the three-dimensional positions and project them to the head-up display.
The eye movement updating module 40 is configured to construct a view matrix under the coordinates of the vehicle body, monitor the movement track of the human eyes, and update the view matrix according to the movement track of the human eyes.
Wherein the eye movement update module further comprises the following units:
a matrix construction unit 41 for constructing a view matrix under the coordinates of the vehicle body by the vehicle-mounted camera.
The eye movement detection unit 42 is configured to collect a pupil movement distance of the user and generate a human eye movement track.
The view updating unit 43 is configured to update the view matrix according to the motion trail of the human eye.
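The work of units 41–43 can be sketched as follows: a look-at view matrix is built once in body coordinates, then rebuilt (or translated) each frame as the tracked pupil moves. The function names, the millimetre-to-body-frame calibration, and the right-handed convention are assumptions made for illustration and do not appear in the patent.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed view matrix from the tracked eye position."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)            # forward axis
    s = np.cross(f, up)
    s /= np.linalg.norm(s)            # right axis
    u = np.cross(s, f)                # true up axis
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye       # translate world so the eye is the origin
    return m

def update_view(view, pupil_delta_mm, mm_per_unit=1000.0):
    """Shift the view by the measured pupil movement, using a hypothetical
    calibration from millimetres to body-frame units."""
    dx, dy = (d / mm_per_unit for d in pupil_delta_mm)
    offset = np.eye(4)
    offset[0, 3], offset[1, 3] = -dx, -dy   # move the world opposite to the eye
    return offset @ view
```

Calling `update_view` every frame with the latest pupil displacement is one simple way the view matrix could track the eye movement trajectory in real time.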
Each unit configured in the system executes the corresponding instructions of the AR-HUD-navigation-based path display method described above; a detailed description is omitted here.
In addition to the technical effects of the previous embodiment, the path display system provided by this embodiment acquires the driver's eye movement trajectory through the eye movement updating module 40 and updates the view in real time, as shown in fig. 5. This solves the prior-art problem that the displayed image cannot change with the position of the human eye's viewing angle: the navigation path is calculated according to the driver's actual viewing angle, improving navigation accuracy.
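Once the view matrix is current, projecting the 3D key points onto the head-up display follows the standard view/projection pipeline. A minimal sketch, in which the matrix conventions, the NDC-to-pixel mapping, and all names are illustrative assumptions rather than details from the patent:

```python
import numpy as np

def project_to_hud(points_body, view, proj, width, height):
    """Project 3D key points (vehicle-body frame) to HUD pixel coordinates
    using 4x4 view and projection matrices."""
    pts = np.asarray(points_body, dtype=float)
    ones = np.ones((len(pts), 1))
    # transform to homogeneous clip space: rows are points
    clip = np.hstack([pts, ones]) @ (proj @ view).T
    ndc = clip[:, :3] / clip[:, 3:4]          # perspective divide
    sx = (ndc[:, 0] * 0.5 + 0.5) * width      # NDC x in [-1,1] -> pixels
    sy = (1.0 - (ndc[:, 1] * 0.5 + 0.5)) * height  # flip y for screen space
    return np.stack([sx, sy], axis=1)
```

With the view matrix updated per frame from the eye tracker, re-projecting the same key points is what keeps the arrows registered against the road from the driver's actual viewpoint.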
Fifth embodiment
Meanwhile, the invention also provides a computer storage medium. The computer storage medium of the embodiments of the present invention stores a computer program which, when executed by a processor, implements the steps of any of the AR-HUD-navigation-based path display methods described above. The medium may be any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
In summary, the foregoing is merely a detailed description of preferred embodiments of the invention and is not intended to limit its scope. In practical applications, a person skilled in the art can make adjustments to the technical solution. Any modifications, equivalent substitutions, improvements, and the like made within the principles of the present invention are intended to be included within its scope.

Claims (8)

1. A path display method based on AR-HUD navigation, characterized by comprising the following steps:
s1, obtaining navigation data of a navigation path, wherein the navigation data comprises a distance offset, a corresponding gradient and a rotation degree;
s2, generating key points of a navigation path according to the navigation data, and calculating three-dimensional positions of the key points under vehicle body coordinates;
the step of S2 specifically comprises the following steps:
s21, generating key points of a navigation path according to the navigation data, and calculating key point indexes to be drawn;
the step of S21 specifically includes:
s211, calculating the number of path points according to the navigation path and the key points;
s212, calculating the distance between the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows;
s213, generating path reference points and key point indexes through the number of the path points and the distance between the navigation arrows;
s22, counting skipped points in two adjacent key points to generate non-key point data;
s23, calculating three-dimensional coordinates of the key points according to the non-key point data;
and S3, drawing a three-dimensional graph of a navigation arrow according to the three-dimensional position, and projecting the three-dimensional graph to a head-up display.
2. The AR-HUD navigation-based path display method according to claim 1, wherein the step of S23 specifically includes:
s231, generating three-dimensional coordinates of the previous key point according to the navigation data of the previous key point;
s232, calculating the three-dimensional coordinates of the key points according to the three-dimensional coordinates of the last key point and the non-key point data.
3. The AR-HUD navigation-based path display method according to claim 1, further comprising S4: constructing a view matrix under the coordinates of the vehicle body, monitoring the human eye movement track, and updating the view matrix according to the human eye movement track.
4. The AR-HUD navigation-based path display method according to claim 3, wherein the step of S4 specifically includes:
s41, constructing a view matrix under the coordinates of the vehicle body through a vehicle-mounted camera;
s42, collecting the pupil movement distance of the user and generating a human eye movement track;
s43, updating the view matrix according to the human eye movement track.
5. A path display system based on AR-HUD navigation, the system comprising the following modules:
the data acquisition module is used for acquiring the distance offset of the navigation path and the navigation data of the corresponding gradient and rotation degree;
the position calculation module is used for generating key points of a navigation path according to the navigation data and calculating the three-dimensional positions of the key points under the coordinates of the vehicle body;
the method specifically comprises the following steps:
s21, generating key points of a navigation path according to the navigation data, and calculating key point indexes to be drawn;
the step of S21 specifically includes:
s211, calculating the number of path points according to the navigation path and the key points;
s212, calculating the distance between the navigation arrows according to the distance between the adjacent path points and the number of the navigation arrows;
s213, generating path reference points and key point indexes through the number of the path points and the distance between the navigation arrows;
s22, counting skipped points in two adjacent key points to generate non-key point data;
s23, calculating three-dimensional coordinates of the key points according to the non-key point data;
and the figure generating module is used for drawing a three-dimensional figure of the navigation arrow according to the three-dimensional position and projecting the three-dimensional figure to the head-up display.
6. The AR-HUD navigation based path display system of claim 5, wherein the location calculation module comprises the following elements:
the index calculation unit is used for calculating the number of the key points to be drawn and the key point index to be drawn;
the data statistics unit is used for counting skipped points in two adjacent key points and generating non-key point data;
and the coordinate generation unit is used for calculating the three-dimensional coordinates of the key points according to the non-key point data.
7. The AR-HUD navigation based path display system of claim 5, further comprising an eye movement update module for constructing a view matrix in the vehicle body coordinates, monitoring a human eye movement trajectory, and updating the view matrix according to the human eye movement trajectory; the eye movement updating module specifically comprises the following units:
the matrix construction unit is used for constructing a view matrix under the coordinates of the vehicle body through the vehicle-mounted camera;
the eye movement detection unit is used for collecting the pupil movement distance of the user and generating a human eye movement track;
and the view updating unit is used for updating the view matrix according to the human eye movement track.
8. A computer storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 4.
CN201911378417.0A 2019-12-27 2019-12-27 Path display method, system and computer storage medium based on AR-HUD navigation Active CN111121815B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911378417.0A CN111121815B (en) 2019-12-27 2019-12-27 Path display method, system and computer storage medium based on AR-HUD navigation


Publications (2)

Publication Number Publication Date
CN111121815A CN111121815A (en) 2020-05-08
CN111121815B true CN111121815B (en) 2023-07-07

Family

ID=70504110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911378417.0A Active CN111121815B (en) 2019-12-27 2019-12-27 Path display method, system and computer storage medium based on AR-HUD navigation

Country Status (1)

Country Link
CN (1) CN111121815B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738487B (en) 2020-12-24 2022-10-11 阿波罗智联(北京)科技有限公司 Image projection method, device, equipment and storage medium
CN113326758A (en) * 2021-05-25 2021-08-31 青岛慧拓智能机器有限公司 Head-up display technology for remotely controlling driving monitoring video
CN114518117A (en) * 2022-02-24 2022-05-20 北京百度网讯科技有限公司 Navigation method, navigation device, electronic equipment and medium
CN115406462A (en) * 2022-08-31 2022-11-29 重庆长安汽车股份有限公司 Navigation and live-action fusion method and device, electronic equipment and storage medium
CN115683152A (en) * 2022-10-27 2023-02-03 长城汽车股份有限公司 Vehicle navigation guiding method and device based on coordinate transformation and electronic equipment
CN116105747B (en) * 2023-04-07 2023-07-04 江苏泽景汽车电子股份有限公司 Dynamic display method for navigation path, storage medium and electronic equipment

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101368827A (en) * 2007-08-16 2009-02-18 北京灵图软件技术有限公司 Communication navigation method, apparatus and communication navigation system
CN105333883A (en) * 2014-08-07 2016-02-17 深圳点石创新科技有限公司 Navigation path and trajectory displaying method and apparatus for head-up display (HUD)
CN106448206A (en) * 2016-11-08 2017-02-22 厦门盈趣科技股份有限公司 Pavement aided navigation system based on Internet of vehicles
DE102016203080A1 (en) * 2016-02-26 2017-08-31 Robert Bosch Gmbh Method for operating a head-up display, head-up display device
CN107228681A (en) * 2017-06-26 2017-10-03 上海驾馥电子科技有限公司 A kind of navigation system for strengthening navigation feature by camera
CN108180921A (en) * 2017-12-22 2018-06-19 联创汽车电子有限公司 Utilize the AR-HUD navigation system and its air navigation aid of GPS data
CN108896066A (en) * 2018-03-23 2018-11-27 江苏泽景汽车电子股份有限公司 A kind of augmented reality head up display and its navigation implementation method
CN108981740A (en) * 2018-06-11 2018-12-11 同济大学 Blind under the conditions of a kind of low visibility drives navigation system and its method
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109462750A (en) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 A kind of head-up-display system, information display method, device and medium
CN109525039A (en) * 2018-11-29 2019-03-26 国网新疆电力有限公司昌吉供电公司 A kind of power distribution network operation monitoring method and system
WO2019057452A1 (en) * 2017-09-21 2019-03-28 Volkswagen Aktiengesellschaft Method, device and computer-readable storage medium with instructions for controlling a display of an augmented-reality head-up display device for a motor vehicle
WO2019097755A1 (en) * 2017-11-17 2019-05-23 アイシン・エィ・ダブリュ株式会社 Display device and computer program
CN109883439A (en) * 2019-03-22 2019-06-14 百度在线网络技术(北京)有限公司 A kind of automobile navigation method, device, electronic equipment and storage medium
CN109974734A (en) * 2019-04-02 2019-07-05 百度在线网络技术(北京)有限公司 A kind of event report method, device, terminal and storage medium for AR navigation
CN109990797A (en) * 2017-12-29 2019-07-09 周秦娜 A kind of control method of the augmented reality navigation display for HUD
DE102019000901A1 (en) * 2019-02-07 2019-07-25 Daimler Ag Method for displaying navigation instructions in a head-up display of a Krafftfahrzeugs and computer program product
CN110136519A (en) * 2019-04-17 2019-08-16 百度在线网络技术(北京)有限公司 Simulation system and method based on ARHUD navigation
CN110516880A (en) * 2019-08-29 2019-11-29 广州小鹏汽车科技有限公司 Path processing method and system and vehicle

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4333704B2 (en) * 2006-06-30 2009-09-16 アイシン・エィ・ダブリュ株式会社 Navigation device
KR20100070973A (en) * 2008-12-18 2010-06-28 박호철 Head-up display navigation apparatus, system and service implementation method thereof


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Changrak Yoon. Development of augmented in-vehicle navigation system for Head-Up Display. 2014 International Conference on Information and Communication Technology Convergence (ICTC). 2014, 601-602. *
田婧怡 (Tian Jingyi). Research and application of a navigation method based on augmented reality technology. China Master's Theses Full-text Database, Information Science and Technology. 2018, I138-1826. *
鲁云飞 (Lu Yunfei). Research on a head-up display system based on three-dimensional gaze tracking. China Master's Theses Full-text Database, Engineering Science and Technology II. 2018, C035-111. *


Similar Documents

Publication Publication Date Title
CN111121815B (en) Path display method, system and computer storage medium based on AR-HUD navigation
CN112204343B (en) Visualization of high definition map data
US11656091B2 (en) Content visualizing method and apparatus
JP7258078B2 (en) Real scene navigation icon display method, apparatus, equipment and medium
US11727272B2 (en) LIDAR-based detection of traffic signs for navigation of autonomous vehicles
CN110832348B (en) Point cloud data enrichment for high definition maps of autonomous vehicles
JP6644742B2 (en) Algorithms and infrastructure for robust and efficient vehicle positioning
US10347046B2 (en) Augmented reality transportation notification system
Olaverri-Monreal et al. Connection of the SUMO microscopic traffic simulator and the unity 3D game engine to evaluate V2X communication-based systems
US10176634B2 (en) Lane boundary detection data generation in virtual environment
US20190325264A1 (en) Machine learning a feature detector using synthetic training data
US20170109458A1 (en) Testbed for lane boundary detection in virtual driving environment
US20210001891A1 (en) Training data generation for dynamic objects using high definition map data
US10096158B2 (en) Method and system for virtual sensor data generation with depth ground truth annotation
WO2020264222A1 (en) Image-based keypoint generation
WO2013020075A2 (en) Prominence-based generation and rendering of map features
US11518413B2 (en) Navigation of autonomous vehicles using turn aware machine learning based models for prediction of behavior of a traffic entity
CN114429528A (en) Image processing method, image processing apparatus, image processing device, computer program, and storage medium
CN115406462A (en) Navigation and live-action fusion method and device, electronic equipment and storage medium
CN110321854B (en) Method and apparatus for detecting target object
CN108595095B (en) Method and device for simulating movement locus of target body based on gesture control
JP7375149B2 (en) Positioning method, positioning device, visual map generation method and device
CN111260722A (en) Vehicle positioning method, apparatus and storage medium
CN115357500A (en) Test method, device, equipment and medium for automatic driving system
Zhang et al. Visualization of UGV motion state fused geospatial information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Zeng Fanhua

Inventor after: Wu Yuehong

Inventor after: Sun Xinran

Inventor after: Li Wanchao

Inventor after: Kuang can

Inventor before: Sun Xinran

Inventor before: Li Wanchao

Inventor before: Kuang can

TA01 Transfer of patent application right

Effective date of registration: 20230606

Address after: 401147 Building 5-1 #, No. 24, Changhui Road, Yuzui Town, Liangjiang New Area, Chongqing

Applicant after: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Address before: 404100 No.4 diance village, Jiangbei District, Chongqing

Applicant before: Chongqing Lilong technology industry (Group) Co.,Ltd.

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A path display method, system, and computer storage medium based on AR-HUD navigation

Granted publication date: 20230707

Pledgee: Societe Generale Limited by Share Ltd. Chongqing branch

Pledgor: Chongqing Lilong Zhongbao Intelligent Technology Co.,Ltd.

Registration number: Y2024500000002
