CN111707283A - Navigation method, device, system and equipment based on augmented reality technology - Google Patents


Info

Publication number
CN111707283A
Authority
CN
China
Prior art keywords
information
vehicle
navigation
acquiring
current position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010392648.3A
Other languages
Chinese (zh)
Inventor
汪春年
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Ningbo Geely Automobile Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd and Ningbo Geely Automobile Research and Development Co Ltd
Priority to CN202010392648.3A
Publication of CN111707283A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3602 Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G01C21/3605 Destination input or retrieval
    • G01C21/3614 Destination input or retrieval through interaction with a road map, e.g. selecting a POI icon on a road map
    • G01C21/3623 Destination input or retrieval using a camera or code reader, e.g. for optical or magnetic codes
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G01C21/3667 Display of a road map
    • G01C21/3691 Retrieval, searching and output of information related to real-time traffic, weather, or environmental conditions
    • G01C21/3694 Output thereof on a road map
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The invention provides a navigation method based on augmented reality technology, comprising the following steps: acquiring navigation demand information and planning a navigation path; acquiring the current position information and surrounding environment information of a vehicle; acquiring guidance information for the route ahead of the vehicle according to the navigation path and the current position information; acquiring first navigation information from the current position information, the surrounding environment information, and the guidance information; acquiring vehicle active safety information from the current position information and the surrounding environment information; and acquiring and outputting second navigation information from the first navigation information and the vehicle active safety information. The invention also provides a corresponding device, system, and equipment. The navigation method makes surrounding traffic information visible and combines navigation information with vehicle active safety technology, providing drivers and passengers with more accurate and safer driving guidance, avoiding potential traffic hazards, and improving driving safety.

Description

Navigation method, device, system and equipment based on augmented reality technology
Technical Field
The invention relates to the technical field of navigation, in particular to a navigation method, a navigation device, a navigation system and navigation equipment based on an augmented reality technology.
Background
Currently, users often rely on navigation technology to guide driving: information such as the current position of the vehicle and the road conditions ahead is obtained through navigation. Augmented reality (AR) technology captures road-condition information ahead of the vehicle through a camera mounted at the front of the vehicle and projects an image of the scene ahead through a display device, so that the real environment and a virtual image are superimposed on the same picture in real time, achieving seamless integration of the real environment and virtual information.
AR navigation is a navigation method that combines AR technology with map information, providing more intuitive, vivid, accurate, and safer navigation services.
In the prior art, AR navigation technology improves navigation accuracy by recognizing street views and associating GPS coordinates and attitude with the lane lines ahead and the surrounding roads. During driving and navigation, it focuses on recognizing the road ahead and indicating the navigation direction, but lacks integration with the vehicle's own information and the information around the vehicle while driving (especially the surrounding road conditions when the vehicle changes lanes or turns). When a driver changes lanes or turns at an intersection along a navigation-guided route and fails to notice in time road conditions around the vehicle that carry safety risks, such as a vehicle behind overtaking and accelerating, or a vehicle in an adjacent lane cutting in, continuing to follow the navigation guidance can cause traffic accidents such as scraping or even collision, posing a safety hazard. Therefore, how to combine AR navigation technology with on-board sensors and vehicle active safety technology remains a technical problem that urgently needs to be solved.
To address the above defects in the prior art, the present application provides a navigation method, device, system, and equipment based on augmented reality technology, which can greatly improve navigation accuracy and driving safety.
Disclosure of Invention
In view of the foregoing problems in the prior art, an object of the present invention is to provide a navigation method, apparatus, system and device based on augmented reality technology.
In order to solve the above problems, the present invention provides a navigation method based on augmented reality technology, the method comprising:
acquiring navigation demand information, and planning a navigation path according to the navigation demand information;
acquiring current position information of a vehicle and surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises a name of a road where the vehicle is located;
acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guiding information of the route in front of the vehicle;
acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and outputting the second navigation information.
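The claimed sequence of steps can be sketched as a minimal data-flow pipeline. This is an illustrative sketch only: every function name, field name, and stub return value below is an assumption for demonstration, not the patent's implementation.

```python
def plan_route(demand: dict) -> list:
    """Step 1: plan a navigation path from the demand info (origin/destination)."""
    return [demand["origin"], demand["destination"]]  # stub: direct path

def guidance_ahead(route: list, position: dict) -> dict:
    """Step 3: guidance for the road ahead, derived from path and position (stub)."""
    return {"next_turn": "left", "distance_m": 200, "road_name": "unknown"}

def first_navigation_info(position: dict, surroundings: dict, guidance: dict) -> dict:
    """Step 4: fuse position, surroundings, and guidance into one record."""
    return {"position": position, "surroundings": surroundings, "guidance": guidance}

def active_safety_info(position: dict, surroundings: dict) -> dict:
    """Step 5: lane-departure, vehicle-distance, and blind-zone alerts (stub values)."""
    return {"lane_departure": False, "unsafe_distance": False, "blind_spot_alert": False}

def second_navigation_info(first_info: dict, safety: dict) -> dict:
    """Step 6: overlay the active-safety alerts onto the navigation info."""
    return {**first_info, "safety": safety}
```

Step 2 (acquiring position and surroundings) would come from GPS, cameras, and radar in a real system; here it is represented by the `position` and `surroundings` dictionaries passed in.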
Further, the vehicle active safety information comprises lane departure information, vehicle distance information and blind area detection alarm information.
Further, the vehicle active safety information comprises lane departure information, vehicle distance information, blind area detection alarm information and warning information;
the warning information is determined by the following method:
acquiring vehicle interior image information, and acquiring behavior information of drivers and passengers in the vehicle according to the vehicle interior image information;
and judging the behavior information of the drivers and passengers in the automobile, and generating corresponding warning information when the behavior of the drivers and passengers in the automobile is judged to be dangerous.
Specifically, the vehicle surrounding environment information includes vehicle surrounding lane line information, vehicle surrounding traffic identification information, vehicle surrounding other vehicle information, vehicle surrounding pedestrian information, and vehicle surrounding obstacle information.
Specifically, the obtaining of the vehicle active safety information according to the vehicle current position information and the vehicle surrounding environment information includes:
acquiring lane departure information according to the current position information of the vehicle and the surrounding lane line information;
acquiring the vehicle distance information according to the current position information of the vehicle, the information of other vehicles around the vehicle, the information of pedestrians around the vehicle and the information of obstacles around the vehicle;
and detecting the blind area of the vehicle according to the surrounding environment information of the vehicle to acquire blind area alarm information.
Specifically, the vehicle front route guidance information includes distance information of the vehicle from the front intersection, steering information of the vehicle at the front intersection, and a front road name.
Further, the vehicle front route guidance information comprises distance information of the vehicle from the front intersection, steering information of the vehicle at the front intersection, a front road name and an estimated passing time of the vehicle from the current position to the front intersection;
the expected transit time is determined by:
acquiring vehicle running information, wherein the vehicle running information comprises vehicle speed information, acceleration information and steering wheel angle information;
and acquiring the predicted passing time according to the navigation path, the current position information of the vehicle and the running information of the vehicle.
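The two steps above amount to a kinematics estimate: given the remaining distance along the route and the vehicle's current speed and acceleration, solve for the travel time. A minimal sketch under a constant-acceleration assumption (the function name and fallback behavior are illustrative, not from the patent):

```python
import math

def expected_transit_time(distance_m: float, speed_mps: float,
                          accel_mps2: float = 0.0) -> float:
    """Time (s) to cover distance_m from the current speed and acceleration,
    using constant-acceleration kinematics:
        d = v*t + 0.5*a*t^2  ->  t = (-v + sqrt(v^2 + 2*a*d)) / a
    Falls back to d/v when acceleration is negligible."""
    if abs(accel_mps2) < 1e-9:
        if speed_mps <= 0:
            raise ValueError("vehicle is stationary and not accelerating")
        return distance_m / speed_mps
    disc = speed_mps ** 2 + 2.0 * accel_mps2 * distance_m
    if disc < 0:  # vehicle decelerates to a stop before reaching the point
        return math.inf
    return (-speed_mps + math.sqrt(disc)) / accel_mps2
```

A production estimator would also use the route geometry and traffic data; this shows only the kinematic core.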
In another aspect, the present invention provides a navigation device based on augmented reality technology, including:
the first acquisition module is used for acquiring navigation demand information;
the second acquisition module is used for acquiring the navigation path according to the navigation demand information;
the third acquisition module is used for acquiring the current position information of the vehicle and the surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises the name of the road where the vehicle is located;
the fourth acquisition module is used for acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
the first navigation information acquisition module is used for acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guide information of the route in front of the vehicle;
the fifth acquisition module is used for acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the second navigation information acquisition module is used for acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and the output module is used for outputting the second navigation information.
The invention also discloses a navigation system based on the augmented reality technology, which comprises a camera, a radar and the navigation device based on the augmented reality technology, wherein the camera and the radar are both connected with the navigation device;
the camera is used for acquiring the surrounding environment information of the vehicle and the image information in the vehicle;
the radar is used for acquiring vehicle distance information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the navigation device can receive the vehicle distance information acquired by the radar and the vehicle surrounding environment information and the in-vehicle image information acquired by the camera.
The present invention also protects an electronic device comprising:
one or more processors;
a memory; and
one or more programs, stored in the memory and executed by the one or more processors, the programs comprising instructions for performing an augmented reality technology-based navigation method of the above-described aspects.
Due to the technical scheme, the invention has the following beneficial effects:
according to the navigation method based on the augmented reality technology, the visualization of the surrounding environment information in the driving process can be realized through facility equipment installed on a vehicle; the combination of navigation information and vehicle active safety technology is realized, the reminding of blind area alarm, safe distance alarm, lane change alarm and the like of drivers and passengers is provided, more accurate and safer driving guide is provided, potential traffic safety hidden dangers are avoided in advance, and the driving safety is improved; by means of the AR navigation technology, a more visual and visual AR navigation method is provided.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings used in the description of the embodiment or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic flowchart of a navigation method based on augmented reality technology according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a navigation method based on augmented reality technology according to another embodiment of the present invention;
fig. 3 is a schematic structural diagram of a navigation device based on augmented reality technology according to an embodiment of the present invention.
In the figure: 10-a first acquisition module, 20-a second acquisition module, 30-a third acquisition module, 40-a fourth acquisition module, 50-a first navigation information acquisition module, 60-a fifth acquisition module, 70-a second navigation information acquisition module, and 80-an output module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
Example 1
With reference to fig. 1, the present embodiment provides a navigation method based on augmented reality technology, including:
s110: acquiring navigation demand information, and planning a navigation path according to the navigation demand information; the navigation demand information comprises navigation request information, departure place information and destination information, and after the navigation request information is confirmed to be correct, a navigation path is planned according to the departure place information and the destination information.
S120: acquiring the current position information of the vehicle and the surrounding environment information of the vehicle, wherein the current position information includes the name of the road where the vehicle is located. In this embodiment, the current position information can be acquired through an on-board GPS, and the surrounding environment information can be obtained through radar and a surround-view camera installed on the roof of the vehicle.
After obtaining the raw data of the vehicle surrounding environment, the method performs multi-task detection and recognition on the camera imagery through a deep learning algorithm, capturing and distinguishing lane line information, information on objects around the vehicle, and traffic identification information such as traffic lights and speed-limit signs; the object information is further subdivided into other-vehicle information, pedestrian information, obstacle information, and building information.
The distance between the vehicle and surrounding objects is calculated from the Doppler effect and the time difference of the received reflected radar waves.
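The distance computation described here is standard radar time-of-flight, and relative speed follows from the Doppler shift. A minimal sketch (function names and the non-relativistic approximation are assumptions for illustration):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, propagation speed of the radar wave

def radar_distance(round_trip_time_s: float) -> float:
    """Range from the time difference between emitted and reflected wave.
    The wave travels to the target and back, hence the factor 1/2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def doppler_relative_speed(f_emitted_hz: float, f_received_hz: float) -> float:
    """Relative radial speed from the Doppler shift of the reflected wave,
    non-relativistic approximation (v << c): shift = 2*v*f0/c.
    Positive result means the target is approaching."""
    return SPEED_OF_LIGHT * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)
```

For example, a 1 microsecond round trip corresponds to a target roughly 150 m away.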
S130: acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle; the vehicle front route guidance information includes distance information of the vehicle from a front intersection, turning information of the vehicle at the front intersection, and a front road name.
The turning information of the vehicle at the intersection ahead includes turning left, going straight, turning right, entering a roundabout, and making a U-turn.
S140: acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guiding information of the route in front of the vehicle;
therefore, the first navigation information includes not only the own vehicle surrounding environment information but also the front road guidance information giving driving guidance to the driver.
S150: acquiring active safety information of the vehicle through a fusion algorithm according to the current position information of the vehicle and the surrounding environment information of the vehicle; the vehicle active safety information comprises lane departure information, vehicle distance information and blind area detection alarm information.
The lane departure information can be obtained through the current position information of the vehicle and the lane line information around the vehicle;
the vehicle distance information is the distance between the vehicle and surrounding objects, obtained by radar; it includes first distance information between the vehicle and other surrounding vehicles, second distance information between the vehicle and surrounding pedestrians, and third distance information between the vehicle and surrounding obstacles. The first distance information is obtained from the current position information and the other-vehicle information; the second distance information from the current position information and the pedestrian information; and the third distance information from the current position information and the obstacle information;
and detecting the blind area of the vehicle according to the surrounding environment information of the vehicle to obtain blind area alarm information.
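The three active-safety signals above can be sketched as simple threshold checks. All thresholds and the blind-zone geometry below are illustrative assumptions, not values specified in the patent:

```python
def lane_departure(lateral_offset_m: float, lane_half_width_m: float = 1.75) -> bool:
    """Departure when the vehicle centre drifts past the lane boundary
    (1.75 m assumes a standard 3.5 m lane)."""
    return abs(lateral_offset_m) > lane_half_width_m

def unsafe_following_distance(distance_m: float, own_speed_mps: float,
                              reaction_time_s: float = 1.5) -> bool:
    """Simple time-gap rule: unsafe if the gap is shorter than the distance
    covered during an assumed driver reaction time."""
    return distance_m < own_speed_mps * reaction_time_s

def blind_spot_alert(objects) -> bool:
    """Alert if any detected object (x = longitudinal offset in m, negative
    behind the mirror; y = lateral offset in m) lies inside an assumed
    blind-zone box: 0-5 m behind, up to 3 m to either side."""
    return any(-5.0 <= x <= 0.0 and 0.0 < abs(y) <= 3.0 for x, y in objects)
```

A real system would fuse multiple sensor frames and hysteresis into these decisions; the sketch shows only the geometric core of each check.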
S160: acquiring second navigation information according to the first navigation information and the vehicle active safety information;
in this embodiment, the second navigation information includes the current position of the vehicle; the surrounding environment information obtained by the cameras and radar; the guidance information for the road ahead, namely the distance to the intersection ahead, the steering to take at that intersection, and the name of the road ahead; and active safety information such as lane departure information, vehicle distance information, and blind-zone detection alarms. When the second navigation information is output, the driver not only knows the vehicle's position and surroundings and can drive according to the guidance, but also obtains the vehicle's safety information in real time. This prevents the driver from being unaware of surrounding dangers while following the guidance, for example, a vehicle in the driver's blind zone overtaking, or a vehicle in an adjacent lane accelerating, changing lanes, or cutting in. If the driver lacks such danger information and continues driving according to the navigation route, traffic accidents such as scraping or even collision can occur, posing a safety hazard.
S170: and outputting the second navigation information.
In this embodiment, the second navigation information includes both image information and voice information. The image information is presented visually to the driver on a display instrument or other device installed in the vehicle; it is produced from the first navigation information and the vehicle active safety information by a fusion algorithm, image rendering, and layer superposition, and includes the current position of the vehicle, the surrounding environment information, the guidance for the road ahead, and the vehicle active safety information.
the image information is presented as follows:
on the image in front of the driver, the current position of the vehicle and the surrounding environment are displayed and superimposed with the route guidance ahead, including left/right turns, U-turns, going straight, and lane changes; traffic light information; prompts such as speed-limit cameras ahead; lane departure warnings; vehicle/pedestrian detection; and collision reminders based on the vehicle distance information. Specifically:
1) when lane departure is detected, the departing lane line is highlighted on the AR navigation picture to alert the driver and passengers;
2) when the vehicle is detected to be within an unsafe distance of a vehicle or pedestrian ahead, or a vehicle or pedestrian within an unsafe distance is present in the driver's blind zone, the vehicle, pedestrian, or other object is circled on the AR navigation picture;
3) when the driver turns on the turn signal but the vehicle distance information indicates that an immediate lane change into the adjacent lane is not permitted, the target lane is highlighted on the AR navigation picture and the driver is prompted to wait.
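The three prompt cases above can be expressed as a small decision function mapping detected conditions to AR overlay actions (all action names are illustrative assumptions):

```python
def overlay_actions(lane_departed: bool, unsafe_object_detected: bool,
                    turn_signal_on: bool, adjacent_lane_clear: bool) -> list:
    """Map the three alert cases described in the text to overlay actions
    to be rendered on the AR navigation picture."""
    actions = []
    if lane_departed:
        actions.append("highlight_departed_lane_line")   # case 1
    if unsafe_object_detected:
        actions.append("circle_object_on_ar_view")       # case 2
    if turn_signal_on and not adjacent_lane_clear:
        actions.append("highlight_target_lane_and_prompt_wait")  # case 3
    return actions
```

In a full system each action would drive the rendering and layer-superposition pipeline described above; this sketch captures only the condition-to-action mapping.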
The voice information can be output by voice broadcast; in addition, other output modes such as warning lamps and flashing lamps can be provided.
The navigation method based on augmented reality technology provided in this embodiment can visualize the surrounding environment information during driving through equipment installed on the vehicle; it combines navigation information with vehicle active safety technology, providing blind-zone, safe-distance, and lane-change warnings to drivers and passengers, delivering more accurate and safer driving guidance and improving driving safety; and by means of AR technology, it provides a more intuitive and visual navigation method.
Example 2
As shown in fig. 2, the present embodiment provides a navigation method based on augmented reality technology, including:
s210: acquiring navigation demand information, and planning a navigation path according to the navigation demand information; the navigation demand information comprises navigation request information, departure place information and destination information, and after the navigation request information is confirmed to be correct, a navigation path is planned according to the departure place information and the destination information.
S220: acquiring the current position information of the vehicle and the surrounding environment information of the vehicle, wherein the current position information includes the vehicle's current position and the name of the road it is on. In this embodiment, the current position information can be acquired through an on-board GPS. The vehicle surrounding environment information includes lane line information, information on objects around the vehicle, and traffic identification information such as traffic lights and speed-limit signs; the object information can be further divided into other-vehicle information (i.e., vehicles other than the own vehicle), pedestrian information, obstacle information, and building information. The surrounding environment information is obtained by radar together with a surround-view camera and a front camera installed on the roof and/or around the vehicle.
S230: acquiring route guidance information ahead of the vehicle according to the navigation path and the current position information of the vehicle; the vehicle front route guidance information comprises distance information between the vehicle and the intersection ahead, steering information of the vehicle at the intersection ahead, the name of the road ahead, and the expected transit time for the vehicle to travel from the current position to the intersection ahead.
Wherein the steering information of the vehicle at the intersection ahead comprises: turning left, going straight, turning right, entering a roundabout, or making a U-turn at the intersection ahead.
The expected transit time is determined as follows:
Acquiring vehicle running information, wherein the vehicle running information comprises vehicle speed information, acceleration information and steering wheel angle information; the vehicle running information may also comprise other information such as battery level information.
And acquiring the expected transit time according to the navigation path, the current position information of the vehicle and the vehicle running information.
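One plausible way to combine the remaining distance with the speed and acceleration from the vehicle running information is constant-acceleration kinematics; this is a sketch under that assumption, not the patent's actual estimator.

```python
def expected_transit_time_s(distance_m, speed_mps, accel_mps2=0.0):
    """Estimate time to reach the intersection ahead.

    Uses constant-acceleration kinematics: d = v*t + (a/2)*t^2.
    Returns None when the vehicle is stopped and not accelerating,
    or would decelerate to a stop before the intersection.
    """
    if abs(accel_mps2) < 1e-9:
        return distance_m / speed_mps if speed_mps > 0 else None
    # Solve (a/2)*t^2 + v*t - d = 0 for the positive root.
    disc = speed_mps ** 2 + 2.0 * accel_mps2 * distance_m
    if disc < 0:
        return None  # comes to a stop before covering distance_m
    return (-speed_mps + disc ** 0.5) / accel_mps2
```

In practice the estimate would also account for the planned path's speed limits and traffic, but the kinematic term shows how speed and acceleration enter the calculation.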
S240: acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle, the driving information of the vehicle and the guiding information of the route in front of the vehicle;
S250: acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle; the vehicle active safety information comprises lane departure information, vehicle distance information, blind zone detection alarm information and warning information.
The lane departure information is calculated by a big-data algorithm from the current position information of the vehicle and the lane line information around the vehicle;
the vehicle distance information comprises first distance information between the vehicle and other surrounding vehicles, second distance information between the vehicle and surrounding pedestrians, and third distance information between the vehicle and surrounding obstacles;
The blind zone alarm information is obtained by detecting the blind zones of the vehicle according to the portion of the vehicle surrounding environment information, acquired by the cameras and radar, that lies in the driver's field-of-vision blind zones;
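The three kinds of active safety information just described (lane departure, vehicle distance, blind zone detection) can be sketched as simple threshold checks over the fused sensor readings; the thresholds and label strings below are illustrative placeholders, not calibrated values from the patent.

```python
def active_safety_alerts(lane_offset_m, gaps_m, blind_zone_objects,
                         lane_limit_m=0.5, safe_gap_m=20.0):
    """Derive active safety alerts from fused sensor readings.

    lane_offset_m:      lateral offset from the lane centre (metres)
    gaps_m:             mapping of object kind -> closest distance (metres)
    blind_zone_objects: object kinds detected in the driver's blind zones
    """
    alerts = []
    if abs(lane_offset_m) > lane_limit_m:      # lane departure information
        alerts.append("lane-departure")
    for kind, gap in gaps_m.items():           # vehicle distance information
        if gap < safe_gap_m:
            alerts.append(f"unsafe-distance:{kind}")
    if blind_zone_objects:                     # blind zone detection alarm
        alerts.append("blind-zone:" + ",".join(blind_zone_objects))
    return alerts
```

Real systems would use speed-dependent safe-distance thresholds and tracked object trajectories; the sketch only shows how the three alert types are derived from the same environment data.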
the warning information is determined by:
Acquiring vehicle interior image information, and acquiring behavior information of the drivers and passengers in the vehicle according to the vehicle interior image information; the vehicle interior image information can be obtained through a camera arranged in the cabin. After the raw vehicle interior image material is obtained, the images are captured and recognized by a deep learning algorithm to obtain the behavior information of the drivers and passengers.
And judging the behavior information of the drivers and passengers in the vehicle, and generating corresponding warning information when the behavior is judged to be dangerous. For example, when dangerous behaviors such as a driver or passenger not wearing a seat belt or extending a hand out of the window are detected, corresponding warning information is generated and finally output and displayed to the driver.
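The judgment step above, mapping recognized in-cabin behaviors to warning messages, can be sketched as a lookup over the labels produced by the image classifier; the label names and message texts are hypothetical.

```python
DANGEROUS_BEHAVIOURS = {
    "seatbelt-unfastened": "Please fasten your seat belt.",
    "hand-out-of-window": "Keep hands inside the vehicle.",
}

def warnings_for(behaviours):
    """Map in-cabin behavior labels to warning messages.

    `behaviours` is the list of labels the deep learning classifier
    produced from the cabin camera feed; labels not in the dangerous
    set generate no warning.
    """
    return [DANGEROUS_BEHAVIOURS[b] for b in behaviours
            if b in DANGEROUS_BEHAVIOURS]
```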
S260: acquiring second navigation information according to the first navigation information and the vehicle active safety information;
s270: and outputting the second navigation information.
The second navigation information comprises image information and voice information. The voice information can be output to the driver by voice broadcast; for example, when the behavior of a driver or passenger is dangerous, voice output improves the timeliness with which the information is received, thereby greatly improving driving safety;
The image information can be projected in front of the driver through equipment such as a head-up display, so that the driver can obtain the various kinds of information required for driving without lowering his or her head;
The image information is presented as follows: through AR display technology, the driver can learn the current position of the vehicle, the name of the road where the vehicle is located, the lane line information around the vehicle, objects around the vehicle, traffic signs, vehicle running information such as the speed and fuel or battery level of the vehicle, the distance between the vehicle and the intersection ahead, the steering information for the intersection ahead, the name of the road ahead and the expected transit time to the intersection ahead, as well as lane departure information, vehicle distance information, blind zone detection alarm information and warning information about the behavior of the drivers and passengers. For example:
1) when overspeed or a low battery level is detected, the vehicle speed information and a battery icon are highlighted on the AR navigation picture to remind the driver to decelerate or charge in time;
2) when it is detected that a driver or passenger is not wearing a seat belt, a seat-belt icon is highlighted on the navigation image to remind them.
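The two highlighting cases above, plus the seat-belt reminder, amount to choosing a set of icons for the AR overlay from the current vehicle state; a minimal sketch, with illustrative thresholds:

```python
def ar_highlights(speed_kph, limit_kph, battery_pct, seatbelt_on,
                  low_battery_pct=15):
    """Choose which icons the AR overlay should highlight.

    Mirrors the cases above: overspeed, low battery level, and a
    driver or passenger without a seat belt. Threshold values are
    placeholders, not calibrated figures from the patent.
    """
    icons = []
    if speed_kph > limit_kph:
        icons.append("speed")
    if battery_pct < low_battery_pct:
        icons.append("battery")
    if not seatbelt_on:
        icons.append("seatbelt")
    return icons
```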
In the navigation method based on augmented reality technology provided in this embodiment of the specification, the second navigation information further includes active safety information; therefore, while driving along with the front route guidance information, the driver also has the surrounding environment monitored in real time, avoiding safety risks such as lane departure, an insufficient following distance, and objects in blind zones. For the parts similar to Embodiment 1, the two embodiments can refer to each other, and details are not repeated here.
Example 3
As shown in fig. 3, an embodiment of the present specification provides a navigation device based on augmented reality technology, including:
a first obtaining module 10, configured to obtain navigation requirement information;
the second obtaining module 20 is configured to obtain a navigation path according to the navigation requirement information;
the third obtaining module 30 is configured to obtain current position information of a vehicle and surrounding environment information of the vehicle, where the current position information of the vehicle includes a name of a road where the vehicle is currently located;
a fourth obtaining module 40, configured to obtain, according to the navigation path and the current position information of the vehicle, route guidance information in front of the vehicle;
a first navigation information obtaining module 50, configured to obtain first navigation information according to the vehicle surrounding environment information and the vehicle front route guidance information;
a fifth obtaining module 60, configured to obtain vehicle active safety information according to the vehicle current position information and the vehicle surrounding environment information;
the second navigation information obtaining module 70 is configured to obtain second navigation information according to the first navigation information and the vehicle active safety information;
and an output module 80, configured to output the second navigation information.
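The data flow between modules 10 through 80 of the device can be sketched as a pipeline in which each module is an injected callable; the class and parameter names below are illustrative, and the placeholder stages stand in for the real planning, sensing and rendering logic.

```python
class ARNavigator:
    """Minimal pipeline mirroring modules 10-80 of the device embodiment.

    Each stage is a callable injected at construction, making the data
    flow between the modules explicit.
    """
    def __init__(self, plan, locate, sense, guide, fuse_nav, safety, render):
        self.plan, self.locate, self.sense = plan, locate, sense
        self.guide, self.fuse_nav = guide, fuse_nav
        self.safety, self.render = safety, render

    def step(self, request):
        path = self.plan(request)                             # modules 10 + 20
        position, surroundings = self.locate(), self.sense()  # module 30
        guidance = self.guide(path, position)                 # module 40
        first_nav = self.fuse_nav(surroundings, guidance)     # module 50
        active_safety = self.safety(position, surroundings)   # module 60
        second_nav = {"nav": first_nav,                       # module 70
                      "safety": active_safety}
        return self.render(second_nav)                        # module 80
```

Wiring the modules this way keeps the first navigation information (module 50) and the active safety information (module 60) independent until they are merged into the second navigation information (module 70).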
Example 4
The embodiment of the specification provides a navigation system based on augmented reality technology, which comprises a camera, a radar and the navigation device provided in the above technical solution, wherein the camera and the radar are both connected with the navigation device;
the camera comprises a front camera, a look-around camera and an in-vehicle camera, and is used for acquiring vehicle surrounding environment information and in-vehicle image information;
the radar comprises a laser radar and an ultrasonic radar, and is used for acquiring vehicle distance information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the navigation device can receive the vehicle distance information acquired by the radar and the vehicle surrounding environment information and the vehicle interior image information acquired by the camera.
Example 5
An embodiment of the present specification provides an electronic device, including:
one or more processors;
a memory; and
one or more programs, stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the augmented reality technology-based navigation method as provided in the above-described solution.
While the invention has been described with reference to specific embodiments, it will be appreciated by those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the invention can be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Also, in some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results.

Claims (10)

1. A navigation method based on augmented reality technology is characterized by comprising the following steps:
acquiring navigation demand information, and planning a navigation path according to the navigation demand information;
acquiring current position information of a vehicle and surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises a name of a road where the vehicle is located;
acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guiding information of the route in front of the vehicle;
acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and outputting the second navigation information.
2. The augmented reality technology-based navigation method of claim 1, wherein the vehicle active safety information comprises lane departure information, vehicle distance information and blind zone detection alarm information.
3. The augmented reality technology-based navigation method according to claim 1, wherein the vehicle active safety information includes lane departure information, vehicle distance information, blind zone detection alarm information, and warning information;
the warning information is determined by the following method:
acquiring vehicle interior image information, and acquiring behavior information of drivers and passengers in the vehicle according to the vehicle interior image information;
and judging the behavior information of the drivers and passengers in the automobile, and generating corresponding warning information when the behavior of the drivers and passengers in the automobile is judged to be dangerous.
4. The augmented reality technology-based navigation method according to claim 2 or 3, wherein the vehicle surrounding environment information includes:
lane line information around the vehicle, traffic identification information around the vehicle, other vehicle information around the vehicle, pedestrian information around the vehicle, and obstacle information around the vehicle.
5. The augmented reality technology-based navigation method according to claim 4, wherein the obtaining of vehicle active safety information according to the vehicle current position information and the vehicle surrounding environment information includes:
acquiring lane departure information according to the current position information of the vehicle and the surrounding lane line information;
acquiring vehicle distance information according to the current position information of the vehicle, information of other vehicles around the vehicle, information of pedestrians around the vehicle and information of obstacles around the vehicle;
and detecting the blind area of the vehicle according to the surrounding environment information of the vehicle to obtain blind area alarm information.
6. The augmented reality technology-based navigation method of claim 1, wherein the vehicle front route guidance information comprises distance information of the vehicle from a front intersection, turning information of the vehicle at the front intersection and a front road name.
7. The augmented reality technology-based navigation method according to claim 1, wherein the vehicle front route guidance information includes distance information of the vehicle from the front intersection, steering information of the vehicle at the front intersection, a front road name, and an expected transit time for the vehicle to travel from a current position to the front intersection;
the expected transit time is determined by:
acquiring vehicle running information, wherein the vehicle running information comprises vehicle speed information, acceleration information and steering wheel angle information;
and acquiring the predicted passing time according to the navigation path, the current position information of the vehicle and the running information of the vehicle.
8. A navigation device based on augmented reality technology, comprising:
the first acquisition module is used for acquiring navigation demand information;
the second acquisition module is used for acquiring the navigation path according to the navigation demand information;
the third acquisition module is used for acquiring the current position information of the vehicle and the surrounding environment information of the vehicle, wherein the current position information of the vehicle comprises the name of the road where the vehicle is located;
the fourth acquisition module is used for acquiring the guiding information of the front route of the vehicle according to the navigation path and the current position information of the vehicle;
the first navigation information acquisition module is used for acquiring first navigation information according to the current position information of the vehicle, the surrounding environment information of the vehicle and the guide information of the route in front of the vehicle;
the fifth acquisition module is used for acquiring vehicle active safety information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the second navigation information acquisition module is used for acquiring second navigation information according to the first navigation information and the vehicle active safety information;
and the output module is used for outputting the second navigation information.
9. An augmented reality technology-based navigation system, comprising a camera, a radar and the augmented reality technology-based navigation device according to claim 8, wherein the camera and the radar are connected with the navigation device;
the camera is used for acquiring the surrounding environment information of the vehicle and the image information in the vehicle;
the radar is used for acquiring vehicle distance information according to the current position information of the vehicle and the surrounding environment information of the vehicle;
the navigation device can receive the vehicle distance information acquired by the radar, the vehicle surrounding environment information acquired by the camera and the in-vehicle image information.
10. An electronic device, comprising:
one or more processors;
a memory; and
one or more programs stored in the memory and executed by the one or more processors, the programs comprising instructions for performing an augmented reality technology-based navigation method of any one of claims 1 to 7.
CN202010392648.3A 2020-05-11 2020-05-11 Navigation method, device, system and equipment based on augmented reality technology Pending CN111707283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010392648.3A CN111707283A (en) 2020-05-11 2020-05-11 Navigation method, device, system and equipment based on augmented reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010392648.3A CN111707283A (en) 2020-05-11 2020-05-11 Navigation method, device, system and equipment based on augmented reality technology

Publications (1)

Publication Number Publication Date
CN111707283A true CN111707283A (en) 2020-09-25

Family

ID=72536939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010392648.3A Pending CN111707283A (en) 2020-05-11 2020-05-11 Navigation method, device, system and equipment based on augmented reality technology

Country Status (1)

Country Link
CN (1) CN111707283A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104786933A (en) * 2015-03-04 2015-07-22 江苏大学 Panoramic image driving auxiliary device and panoramic image driving auxiliary method
CN106297342A (en) * 2016-10-19 2017-01-04 胡爱彬 A kind of in advance, the alarm set of real-time prompting traffic light information and method
CN109584596A (en) * 2018-12-20 2019-04-05 奇瑞汽车股份有限公司 Vehicle drive reminding method and device
CN110084184A (en) * 2019-04-25 2019-08-02 浙江吉利控股集团有限公司 A kind of safety belt based on image processing techniques is not detection system and method
CN110375764A (en) * 2019-07-16 2019-10-25 中国第一汽车股份有限公司 Lane change reminding method, system, vehicle and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112466157A (en) * 2020-11-25 2021-03-09 中通服咨询设计研究院有限公司 Traffic accident early warning method and device
CN112466157B (en) * 2020-11-25 2021-10-19 中通服咨询设计研究院有限公司 Traffic accident early warning method and device
CN112747765A (en) * 2021-01-08 2021-05-04 重庆长安汽车股份有限公司 Path pushing method and system based on navigation and sensor fusion and storage medium
CN113183758A (en) * 2021-04-28 2021-07-30 昭通亮风台信息科技有限公司 Auxiliary driving method and system based on augmented reality
CN113326758A (en) * 2021-05-25 2021-08-31 青岛慧拓智能机器有限公司 Head-up display technology for remotely controlling driving monitoring video
CN113776551A (en) * 2021-09-27 2021-12-10 北京乐驾科技有限公司 Navigation method and device based on augmented reality glasses, glasses and equipment
CN114299753A (en) * 2021-11-30 2022-04-08 东风柳州汽车有限公司 Blind area reminding method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
RU2719101C2 (en) Method and vehicle to assist drivers when changing lanes on a carriageway
CN111707283A (en) Navigation method, device, system and equipment based on augmented reality technology
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
US9417080B2 (en) Movement trajectory generator
US11634150B2 (en) Display device
US9827907B2 (en) Drive assist device
US7642931B2 (en) Driving support image-display apparatus and program therefor
WO2018066711A1 (en) Travel assistance device and computer program
JP6398957B2 (en) Vehicle control device
WO2016186039A1 (en) Automobile periphery information display system
WO2020125178A1 (en) Vehicle driving prompting method and apparatus
JP2009298360A (en) Driving assistance system of vehicle
CN111373461A (en) Method for displaying the course of a safety area in front of a vehicle or an object using a display unit, device for carrying out the method, motor vehicle and computer program
US20120109521A1 (en) System and method of integrating lane position monitoring with locational information systems
EP3627110B1 (en) Method for planning trajectory of vehicle
US20190286125A1 (en) Transportation equipment and traveling control method therefor
KR102611337B1 (en) Vehicle AR display device and method of operation thereof
US10854172B2 (en) Display system, display control method, and storage medium
CN113401056B (en) Display control device, display control method, and computer-readable storage medium
JP2017191472A (en) Image display method and image display device
CN116935695A (en) Collision warning system for a motor vehicle with an augmented reality head-up display
JP2018144690A (en) Display device for vehicle
CN113401071A (en) Display control device, display control method, and computer-readable storage medium
CN117445810A (en) Vehicle auxiliary driving method, device, medium and vehicle
CN117037523A (en) Navigation method, device, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230417

Address after: 315899 room 1031, building 1, Liaohe Road business building, Xinqi street, Ningbo, Zhejiang

Applicant after: Zhejiang jikrypton Intelligent Technology Co.,Ltd.

Applicant after: ZHEJIANG GEELY HOLDING GROUP Co.,Ltd.

Address before: 315336 818 Binhai two road, Hangzhou Bay New District, Ningbo, Zhejiang

Applicant before: NINGBO GEELY AUTOMOBILE RESEARCH AND DEVELOPMENT Co.,Ltd.

Applicant before: ZHEJIANG GEELY HOLDING GROUP Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200925