CN110702135A - Navigation method and device for vehicle, automobile and storage medium

Navigation method and device for vehicle, automobile and storage medium

Info

Publication number
CN110702135A
Authority
CN
China
Prior art keywords
vehicle
intersection
navigation route
steering action
acquiring
Prior art date
Legal status
Pending
Application number
CN201910975098.5A
Other languages
Chinese (zh)
Inventor
钟仲芳
王佩生
Current Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Motors Technology Co Ltd
Priority to CN201910975098.5A
Publication of CN110702135A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents

Abstract

Embodiments of the application provide a navigation method and device for a vehicle, an automobile, and a storage medium. The method comprises: after the vehicle starts travelling along a set navigation route, acquiring a second steering action of the vehicle as it passes through an intersection on the navigation route; comparing the second steering action with a first steering action pre-cached in the vehicle-mounted system; and performing deviation correction processing when the comparison result indicates that the vehicle has deviated from the navigation route. A deviation from the navigation route is thus recognized promptly, and whether the vehicle has deviated can be judged from the cached first steering action and the real-time second steering action without relying on a positioning system, improving both recognition efficiency and recognition accuracy.

Description

Navigation method and device for vehicle, automobile and storage medium
Technical Field
The present application relates to the field of information processing technologies, and in particular to a navigation method and device for a vehicle, an automobile, and a storage medium.
Background
With the development of science and technology, navigation products have brought great convenience to users when travelling: in an unfamiliar environment, a user can enter a starting point and a destination in a navigation application to plan a route.
In the prior art, judging whether a moving vehicle has deviated from a navigation route requires a positioning system to obtain the vehicle's real-time position, which is then compared against the navigation route.
However, when the positioning system fails, for example when the positioning error is too large or the system stops working, the navigation application cannot acquire the vehicle's real-time position, cannot determine in time whether the vehicle has deviated from the navigation route in order to correct it, and recognition efficiency is therefore low.
Disclosure of Invention
In view of the above, a navigation method and device for a vehicle, an automobile, and a storage medium are proposed that overcome, or at least partially solve, the above problems, comprising:
a method of navigating a vehicle, the method comprising:
after the vehicle runs according to the set navigation route, acquiring a second steering action of the vehicle passing through an intersection on the navigation route;
comparing the second steering action with a first steering action pre-cached in a vehicle-mounted system;
and when the navigation route is determined to deviate according to the comparison result, deviation correction processing is carried out.
Optionally, the step of obtaining a second steering action of the vehicle passing through an intersection on the navigation route after the vehicle travels according to the set navigation route includes:
acquiring a road scene image after a vehicle runs according to a set navigation route;
identifying the road scene image;
and when the road scene image is determined to comprise the intersection scene characteristics according to the recognition result, acquiring a second steering action of the vehicle passing through the intersection on the navigation route.
Optionally, the step of acquiring the road scene image after the vehicle travels according to the set navigation route includes:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route, and acquiring a route distance between the starting point position and the intersection;
calculating a driving distance of the vehicle from the starting point position to a first real-time position;
and when the difference value between the route distance and the driving distance is smaller than a difference threshold value, acquiring a road scene image.
Optionally, the step of acquiring the road scene image after the vehicle travels according to the set navigation route includes:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route and a second real-time position of the vehicle;
and when the distance between the second real-time position and the intersection is smaller than a distance threshold value, acquiring a road scene image.
Optionally, the step of acquiring the road scene image after the vehicle travels according to the set navigation route includes:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route and a preset shooting area taking the intersection as a center;
and acquiring a third real-time position of the vehicle, and acquiring a road scene image when the third real-time position is in the preset shooting area.
Optionally, the step of identifying the road scene image includes:
establishing an intersection scene feature library, wherein the intersection scene feature library comprises candidate feature vectors of one or more intersection scene images;
acquiring a real-time feature vector of the road scene image;
and identifying the road scene image according to the real-time characteristic vector and the candidate characteristic vector, and outputting an identification result.
Optionally, the step of establishing an intersection scene feature library includes:
determining all intersection scenes in the navigation route, and acquiring intersection image features of the intersection scenes;
and generating a candidate feature vector corresponding to the intersection image feature.
Optionally, the step of obtaining a second steering action of the vehicle passing through the intersection on the navigation route when it is determined that the road scene image includes the intersection scene feature according to the recognition result includes:
when the road scene image comprises the intersection scene characteristics, acquiring the acceleration of the vehicle passing through the intersection on the navigation route;
calculating a running gradient by using the acceleration;
determining a second steering action of the vehicle based on the travel grade.
Optionally, the step of determining a second steering action of the vehicle according to the travel gradient comprises:
when the running gradient is within a preset gradient range, acquiring a steering angle of the vehicle;
and identifying a second steering action of the vehicle by using the running gradient and the steering angle.
Optionally, the step of determining a second steering action of the vehicle according to the running gradient further comprises:
when the running gradient is out of a preset gradient range, acquiring a steering angle of the vehicle;
acquiring a road attribute corresponding to the intersection in the navigation route;
identifying a second steering action of the vehicle using the travel grade, the steering angle, and the road attribute.
Optionally, the step of comparing the second steering action with the first steering action cached in advance in the vehicle-mounted system includes:
comparing the second steering action with a first steering action pre-cached in a vehicle-mounted system;
deleting the first steering action when the second steering action is the same as the first steering action;
and when the second steering action is different from the first steering action, generating a deviation route result.
Optionally, the step of performing deviation correction processing includes:
waiting for a route deviation response from the positioning system;
deleting the first steering action when the route deviation response is received within a preset time;
and re-executing the step of comparing the second steering action with the first steering action cached in the vehicle-mounted system in advance.
Optionally, the step of performing deviation correction processing further includes:
when the route deviation response is not received within the preset time, the current position of the vehicle is obtained;
and when the distance between the current position and the intersection is greater than a distance threshold value, replanning the navigation route by using the current position.
Optionally, after the step of replanning the navigation route with the current position when the distance between the current position and the intersection is greater than the distance threshold, the method further includes:
and re-executing the step of acquiring the road scene image after the vehicle runs according to the set navigation route.
A navigation device of a vehicle, the device comprising:
the second steering action acquisition module is used for acquiring a second steering action of the vehicle passing through an intersection on the navigation route after the vehicle runs according to the set navigation route;
the comparison module is used for comparing the second steering action with a first steering action cached in advance in a vehicle-mounted system;
and the deviation correcting module is used for performing deviation correcting processing when the deviation of the navigation route is determined according to the comparison result.
A vehicle comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the navigation method of the vehicle as above.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the navigation method of the vehicle as above.
The embodiment of the application has the following advantages:
in the embodiments of the application, navigation route information of the vehicle is acquired, the navigation route information comprising intersection driving information; real-time steering information of the vehicle driving towards an intersection is determined; and navigation route deviation information is generated when the real-time steering information differs from the intersection driving information. A deviation of the vehicle from the navigation route is thus recognized in time, whether the vehicle has deviated can be judged from the vehicle's real-time steering action without relying on a positioning system, and recognition efficiency and recognition accuracy are improved.
Drawings
In order to illustrate the technical solutions of the present application more clearly, the drawings needed in the description are briefly introduced below. The drawings described below show only some embodiments of the present application; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flow chart illustrating steps of a method for navigating a vehicle according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating steps of another method for navigating a vehicle according to an embodiment of the present application;
FIG. 3 is an image of an intersection scene provided by an embodiment of the present application;
FIG. 4 is an image of an intersection scene feature library provided in an embodiment of the present application;
FIG. 5 is a flow chart illustrating steps of another method for navigating a vehicle according to an embodiment of the present application;
FIG. 6 is a flow chart illustrating steps of another method for navigating a vehicle according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a navigation device of a vehicle according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, a flowchart illustrates the steps of a navigation method for a vehicle according to an embodiment of the present application. The method may be applied to an on-board system and, specifically, may include the following steps:
step 101, after a vehicle runs according to a set navigation route, acquiring a second steering action of the vehicle passing through an intersection on the navigation route;
as an example, the second steering action (realTurnType) may include the direction in which the vehicle turns at the intersection and the road it turns onto.
Before the vehicle starts driving, a user can tap the navigation application on the large screen of the vehicle-mounted system and enter the vehicle's current position and destination by voice or text; alternatively, the vehicle-mounted system can obtain the current position of the vehicle directly through GPS (Global Positioning System) positioning or BDS (BeiDou Navigation Satellite System) positioning. After obtaining the current position and destination of the vehicle, the vehicle-mounted system can send a navigation route planning request to the remote server through its built-in map engine.
After the route planning succeeds, the vehicle-mounted system can receive the navigation route scheme from the remote server and set it as the navigation route for the current trip. The road sections in the navigation route are connected by a number of intersections, and while the vehicle drives along the navigation route the vehicle-mounted system can acquire a second steering action each time the vehicle passes through one of these intersections.
Step 102, comparing the second steering action with a first steering action cached in advance in a vehicle-mounted system;
as an example, the first steering action may be the steering action that the navigation route indicates the vehicle should perform at an intersection, e.g. the vehicle should turn left at intersection A to leave the highway.
In a specific implementation, the remote server can determine the distance the vehicle should travel, and the intersection and direction to turn, for each road segment when planning the navigation route. Therefore, after the vehicle-mounted system acquires the navigation route, the navigation route including the first steering action can be stored into a cache region (turnListCache) of the navigation application, and when the information related to the navigation route is called next time, the information can be directly read from the cache region.
When the vehicle passes through the intersection, the vehicle-mounted system can search for a first steering action in the navigation route, compare whether a second steering action of the vehicle is consistent with the first steering action, and judge whether the vehicle deviates from the navigation route.
And 103, when the navigation route is determined to deviate according to the comparison result, deviation correction processing is carried out.
After the comparison, if the second steering action is consistent with the first steering action, it may be determined that the vehicle has not deviated from the navigation route; if the second steering action is inconsistent with the first steering action, it may be determined that the vehicle has deviated from the navigation route.
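For illustration only, the comparison logic of steps 101 to 103 can be sketched as follows. This is a minimal sketch rather than the application's implementation; the Python names (TurnAction, turn_list_cache, on_intersection_passed, correct_deviation) are hypothetical, with turn_list_cache standing in for the cache region mentioned above.

```python
from dataclasses import dataclass
from collections import deque

@dataclass(frozen=True)
class TurnAction:
    intersection_id: int  # intersection on the navigation route the action belongs to
    direction: str        # e.g. "left", "right", "straight"
    target_road: str      # road the vehicle should be on after the turn

# First steering actions, cached in route order when the navigation route is set
turn_list_cache = deque()  # stands in for turnListCache

def correct_deviation(real_turn: TurnAction) -> None:
    """Placeholder for the deviation correction processing described in step 103."""
    ...

def on_intersection_passed(real_turn: TurnAction) -> None:
    """Compare the real-time (second) steering action with the cached (first) one."""
    expected = turn_list_cache[0]  # first steering action for the upcoming intersection
    if (real_turn.direction, real_turn.target_road) == (expected.direction, expected.target_road):
        turn_list_cache.popleft()      # actions match: the vehicle is still on the route
    else:
        correct_deviation(real_turn)   # actions differ: the vehicle has deviated
```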
In the embodiment of the application, after the vehicle travels along the set navigation route, the second steering action of the vehicle passing through an intersection on the navigation route is obtained and compared with the first steering action pre-cached in the vehicle-mounted system, and deviation correction processing is carried out when the comparison result indicates that the vehicle has deviated from the navigation route. A deviation from the navigation route is thus recognized in time; whether the vehicle has deviated can be judged from the cached first steering action and the real-time second steering action without relying on a positioning system, improving recognition efficiency and recognition accuracy.
Referring to fig. 2, a flowchart illustrates the steps of another navigation method for a vehicle according to an embodiment of the present application. The method may be applied to an on-board system and, specifically, may include the following steps:
step 201, acquiring a road scene image after a vehicle runs according to a set navigation route;
as an example, the road scene image may be an image of the road environment around the vehicle captured by a camera; the road scene may include an intersection scene, a viaduct scene, a tunnel scene, and the like. Fig. 3 shows a road scene image captured by a roof-mounted camera.
Before the vehicle starts driving, a user can tap the navigation application on the large screen of the vehicle-mounted system and enter a destination to generate a navigation route planning request; in response to the user operation, the vehicle-mounted system can obtain a navigation route from the remote server and display it on the large screen.
In practical applications, a camera can be installed on the roof of the vehicle or in the cab. While the user drives along the set navigation route, the vehicle-mounted system can acquire road scene images through the camera during driving in order to identify the surrounding environment.
In an embodiment of the present application, step 201 may include the following sub-steps:
substep 11, after the vehicle runs according to the set navigation route, determining the starting point position of the navigation route;
after the vehicle begins driving along the navigation route, the vehicle-mounted system can take the position of the vehicle at the moment navigation is started as the starting point of the navigation route, for example the position of the vehicle when the user clicks the "start navigation" button on the large screen of the vehicle-mounted system and begins driving under the guidance of the navigation application; alternatively, the position of the intersection at which the vehicle is located may be taken as the starting point position.
Substep 12, determining an intersection corresponding to the starting point position in the navigation route, and obtaining a route distance between the starting point position and the intersection;
as an example, the navigation route may be formed by connecting a plurality of road segments by a plurality of intersections.
After the starting point is determined, the vehicle-mounted system may further determine the intersection corresponding to the starting point from the navigation route and obtain the route distance (turnDistance) between the starting point and that intersection, where the corresponding intersection may be the intersection closest to the starting point in the navigation route.
For example, suppose the navigation route sequentially includes four intersections, intersection 1 to intersection 4. After the vehicle starts driving, the vehicle-mounted system can determine that intersection 1, the intersection closest to the starting point position, is the corresponding intersection and acquire the route distance between the starting point position and intersection 1; when the vehicle reaches intersection 1, intersection 2, the intersection closest to intersection 1, can be determined as the corresponding intersection, and the route distance between intersection 1 and intersection 2 is acquired.
Substep 13, calculating a driving distance of the vehicle from the starting position to a first real-time position;
as an example, the first real-time location may be a location where the vehicle is currently located.
When the vehicle starts from the starting point position, the vehicle-mounted system can start timing, acquire the running speed and the running time of the vehicle from the central gateway and calculate the running distance of the vehicle when the vehicle runs from the starting point position to the first real-time position.
And a substep 14 of acquiring a road scene image when the difference between the route distance and the travel distance is less than a difference threshold value.
After the route distance and the driving distance are obtained, the vehicle-mounted system can subtract the driving distance from the route distance to obtain the difference between the two, and judge whether the difference is smaller than a preset difference threshold, for example 10 metres; when it is, the vehicle is close to the intersection and the road scene image is acquired.
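As a sketch of sub-steps 11 to 14, the driving distance can be accumulated from speed samples and compared with the cached route distance to the next intersection. The function below is illustrative only; the 10-metre threshold mirrors the example above and the parameter names are assumptions.

```python
def should_capture_road_image(route_distance_m: float,
                              speed_samples_mps: list[float],
                              sample_interval_s: float,
                              diff_threshold_m: float = 10.0) -> bool:
    """True once the vehicle has nearly covered the route distance to the intersection."""
    # Driving distance is approximated as the sum of speed * sampling interval since the start position
    travelled_m = sum(v * sample_interval_s for v in speed_samples_mps)
    return (route_distance_m - travelled_m) < diff_threshold_m
```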
Step 202, identifying the road scene image;
after the road scene image is obtained, the vehicle-mounted system can preprocess the road scene image, obtain image characteristics from the road scene image, identify the image characteristics and judge whether the front of the vehicle is an intersection scene or not.
In an embodiment of the present application, step 202 may include the following sub-steps:
substep 21, establishing an intersection scene feature library, wherein the intersection scene feature library comprises candidate feature vectors of one or more intersection scene images;
as an example, the candidate feature vector may be a Speeded-Up Robust Features (SURF) vector; the SURF algorithm is an image feature detection and description algorithm that can be used in computer vision tasks such as object recognition and 3D reconstruction.
In specific implementation, before the vehicle-mounted system performs identification, an intersection scene feature library may be established, where the intersection scene feature library may include a plurality of intersection scene images and corresponding candidate feature vectors.
In an embodiment of the present application, the intersection scene feature library may be established through the following sub-steps: determining all intersection scenes in the navigation route, and acquiring intersection image features of the intersection scenes; and generating candidate feature vectors corresponding to the intersection image features.
As an example, the intersection image feature may be a SURF feature.
When the intersection scene feature library is established, the vehicle-mounted system can acquire an intersection scene image of each intersection in the navigation route through the map engine. For example, when the navigation route includes intersection 1, intersection 2 and intersection 3, the vehicle-mounted system can acquire the intersection scene images of the three intersections and extract one or more intersection image features from each. After the intersection image features are obtained, the vehicle-mounted system can generate candidate feature vectors from them; the intersection images and the candidate feature vectors can correspond one to one, i.e. each intersection image can have one candidate feature vector.
Fig. 4 shows an image of the intersection scene feature library, which may include a number of intersection scene images (the "intersection scene pictures" in fig. 4) and intersection image features (the "SURF features" in fig. 4); in addition, an intersection number and intersection attributes, such as the intersection name and intersection coordinates, may be set for each intersection scene image in the library.
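A minimal sketch of how such a feature library could be built offline with OpenCV is given below. SURF is provided by the opencv-contrib package (non-free module), and the patent does not say how per-image descriptors are aggregated into a single candidate feature vector, so the mean-descriptor step is only an assumption.

```python
import cv2
import numpy as np

def build_intersection_feature_library(intersection_images: dict[int, str]) -> dict[int, np.ndarray]:
    """Map intersection number -> one candidate feature vector per intersection scene image."""
    # Requires opencv-contrib-python built with the non-free modules enabled
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    library: dict[int, np.ndarray] = {}
    for intersection_id, image_path in intersection_images.items():
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        _keypoints, descriptors = surf.detectAndCompute(gray, None)
        if descriptors is not None:
            # Collapse the per-keypoint descriptors into a single vector (assumed aggregation)
            library[intersection_id] = descriptors.mean(axis=0)
    return library
```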
Substep 22, obtaining a real-time feature vector of the road scene image;
after the intersection scene feature library is established, the vehicle-mounted system can extract real-time feature vectors of the image from road scene images acquired in real time, and calculate the similarity between the real-time feature vectors and a plurality of candidate feature vectors so as to determine the specific scene of the current road.
And a substep 23, recognizing the road scene image according to the real-time characteristic vector and the candidate characteristic vector, and outputting a recognition result.
After the real-time feature vector is obtained, the vehicle-mounted system can compare the real-time feature vector with a plurality of candidate feature vectors, identify a road scene image, judge whether a road scene where the vehicle is currently located belongs to a road scene recorded in an intersection scene feature library, and generate an identification result.
In a specific implementation, the image recognition process may be implemented by calculating the similarity between the real-time feature vector and the candidate feature vectors. The similarity varies between 0 and 1, and a similarity threshold, for example 0.5, can be preset in the vehicle-mounted system. When the similarity exceeds the threshold, the road scene image is considered successfully recognized: the vehicle-mounted system can determine that the road scene image includes intersection scene features and that the road scene is an intersection scene. When several candidate feature vectors have a similarity above the threshold, the results can be sorted in descending order, the intersection scene corresponding to the candidate feature vector with the highest similarity is taken as the intersection where the vehicle is currently located, its intersection number is obtained, and a recognition result containing that number is output.
When the similarity between the real-time feature vector and a candidate feature vector is smaller than the threshold, the road scene is determined not to be the intersection scene corresponding to that candidate feature vector, and the vehicle-mounted system can continue to calculate the similarity with the next candidate feature vector until the road scene is successfully recognized.
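The matching described above can be sketched as follows. The patent does not name the similarity measure, so cosine similarity is an assumption; the 0.5 threshold comes from the example, and the function returns the intersection number with the highest similarity or None when no candidate passes the threshold.

```python
from typing import Optional
import numpy as np

def recognize_intersection(realtime_vec: np.ndarray,
                           library: dict[int, np.ndarray],
                           threshold: float = 0.5) -> Optional[int]:
    """Return the best-matching intersection number, or None if the road scene is not an intersection."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = {iid: cosine(realtime_vec, vec) for iid, vec in library.items()}
    best_id, best_score = max(scores.items(), key=lambda kv: kv[1])  # highest similarity first
    return best_id if best_score > threshold else None
```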
Step 203, when the road scene image is determined to include the intersection scene characteristics according to the recognition result, acquiring a second steering action of the vehicle passing through the intersection on the navigation route;
when the recognition result is obtained and the road scene image is determined to contain the intersection scene characteristics, and the front of the vehicle is the intersection, the vehicle-mounted system can further obtain a second steering action of the vehicle when the vehicle passes through the intersection.
In an embodiment of the present application, step 203 may include the following sub-steps:
substep 31, when the road scene image comprises the intersection scene characteristics, acquiring the acceleration of the vehicle passing through the intersection on the navigation route;
when the road scene ahead is an intersection scene, the vehicle-mounted system can obtain, through the gravity sensor, the acceleration (gravityRate) of the vehicle as it passes through the intersection.
A substep 32 of calculating a running gradient using said acceleration;
after acquiring the vehicle acceleration, the vehicle-mounted system can substitute it into a gradient formula to obtain the running gradient of the road on which the vehicle is currently travelling.
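The translation does not spell the gradient formula out. One common way to estimate road grade from a gravity/acceleration sensor is to take the pitch angle implied by the longitudinal component of gravity; the sketch below assumes that model and is not the application's exact formula.

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def running_gradient_deg(longitudinal_accel_mps2: float) -> float:
    """Estimate the road grade in degrees from the gravity sensor's longitudinal component.

    Assumption: the sensor value is the projection of gravity onto the vehicle's longitudinal
    axis, so the grade theta satisfies sin(theta) = a / g.
    """
    ratio = max(-1.0, min(1.0, longitudinal_accel_mps2 / G))
    return math.degrees(math.asin(ratio))
```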
A substep 33 of determining a second steering action of the vehicle, depending on the running gradient.
After determining the grade of travel, the in-vehicle system may determine a second steering action of the vehicle at the intersection based on the grade of travel in combination with the turn time, i.e., the time taken for the vehicle to pass the intersection.
In an embodiment of the present application, the sub-step 33 may include the following sub-steps: when the running gradient is within a preset gradient range, acquiring a steering angle of the vehicle; and identifying a second steering action of the vehicle by using the running gradient and the steering angle.
In recognizing the second steering action, the vehicle-mounted system may first judge whether the running gradient lies within a preset gradient range, for example -1° to 1°. When it does, the vehicle can be considered to be travelling on a level road; the vehicle-mounted system may then acquire the angular velocity (angularRate) of the vehicle through the angular velocity sensor and, after determining the steering time, calculate the steering angle of the vehicle.
In practical applications, the angular velocity and acceleration of the vehicle do not change significantly while it is not steering, for example while travelling along a straight road, but change markedly during steering; the vehicle-mounted system can therefore derive the steering time of the vehicle from the changes in angular velocity and acceleration.
For example, at the moment the vehicle-mounted system outputs the recognition result indicating an intersection scene ahead, the vehicle has not yet entered the intersection to turn, so the angular velocity and acceleration show no significant change, i.e. their rate of change is below the rate-of-change threshold, yet the distance between the vehicle and the intersection is already within the preset range and the vehicle is about to enter the intersection and turn. The vehicle-mounted system can record the acceleration and angular velocity when it receives the recognition result and take their state at that moment as the initial state. Once the vehicle starts to turn, the direction or magnitude of its angular velocity and acceleration changes continuously and the rate of change exceeds the threshold; the vehicle-mounted system can then determine that the vehicle has entered a turning state and record the moment of the change from the initial state to the turning state as the initial change timing. When the turn is finished and the vehicle drives onto a straight road again, the rate of change of the angular velocity and acceleration keeps decreasing until it falls below the threshold and the vehicle returns to the initial state; the moment at which the turning state changes back to the initial state can then be recorded as the end change timing.
The in-vehicle system may calculate a difference between the end change timing and the initial change timing, and determine the difference as a steering time of the vehicle.
Alternatively, when the angular velocity and acceleration during travel do not change significantly, for example when the road bends by less than the threshold angle or the vehicle takes a curve at constant speed, a preset time interval may be used as the steering time.
After the steering time is determined, the vehicle-mounted system can compute the product of the angular velocity and the steering time to obtain the steering angle. Because the gradient lies within the preset range, the vehicle is known to be travelling on a level road; the possible steering forms on a level road are limited, involve no transitions between scenes such as elevated roads, highways or tunnels, and amount to connections between ordinary roads, so the vehicle-mounted system can identify the second steering action of the vehicle from the steering angle alone.
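A sketch of the flat-road case described above: the start and end of the turn are detected from the change in angular velocity between samples, the elapsed time is taken as the steering time, and the steering angle is approximated as the mean angular rate during the turn multiplied by that time. The threshold value and all names are illustrative assumptions.

```python
def steering_angle_flat_road(timestamps_s: list[float],
                             angular_rates_dps: list[float],
                             rate_change_threshold: float = 2.0) -> float:
    """Estimate the steering angle (degrees) of the vehicle at an intersection on a level road."""
    start_t = end_t = None
    for i in range(1, len(angular_rates_dps)):
        change = abs(angular_rates_dps[i] - angular_rates_dps[i - 1])
        if start_t is None and change > rate_change_threshold:
            start_t = timestamps_s[i]   # initial change timing: the turn begins
        elif start_t is not None and change < rate_change_threshold:
            end_t = timestamps_s[i]     # end change timing: back to the initial state
            break
    if start_t is None or end_t is None:
        return 0.0                      # no clear turn detected
    steering_time = end_t - start_t
    turning = [r for t, r in zip(timestamps_s, angular_rates_dps) if start_t <= t <= end_t]
    return (sum(turning) / len(turning)) * steering_time
```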
In another embodiment of the present application, the substep 33 may comprise the substeps of: when the running gradient is out of a preset gradient range, acquiring a steering angle of the vehicle; acquiring a road attribute corresponding to the intersection in the navigation route; identifying a second steering action of the vehicle using the travel grade, the steering angle, and the road attribute.
As an example, a road attribute (roadClass) may determine the type of road in different directions at an intersection, such as a highway segment, an overpass segment, a bridge crossing segment, and so on.
In practice, an intersection can connect different kinds of road: from the same intersection a vehicle may, for example, drive onto an overpass, leave the current highway section, or continue along the highway section. The same steering angle and steering gradient can therefore still correspond to several situations; the steering angle and gradient may show that the vehicle is going straight ahead downhill, but that could mean going straight downhill under an elevated road, straight downhill onto a bridge, or straight downhill onto a highway.
Based on this, when the running gradient lies outside the preset gradient range, for example greater than 1° or less than -1°, the vehicle-mounted system may, after acquiring the steering angle, further read from the cache of the navigation application the road attributes of the intersection in the navigation route, determine which specific road segment the intersection connects to at each steering angle, and identify the second steering action of the vehicle by combining the running gradient, the steering angle and the road attributes.
For example, the vehicle-mounted system may obtain the intersection number of the current intersection and look up the corresponding road attributes in the established intersection scene feature library or in the navigation route. Suppose the current intersection is intersection 1 and intersection 1 connects three roads whose attributes are, respectively, going straight on the level road towards road segment A, turning right uphill onto an elevated road, and turning left uphill onto a bridge; when the running gradient is greater than 1° and the steering angle is 45°, the second steering action of the vehicle can be determined to be the left turn uphill onto the bridge.
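The disambiguation in this branch can be pictured as a table lookup keyed on a coarse manoeuvre. The sketch below assumes the road attributes are encoded as a mapping from keys such as "left_up" or "straight_flat" to the connected road segment, which is an illustrative encoding rather than the application's data format; the angle sign convention (left turns positive) is also assumed.

```python
def identify_second_steering_action(grade_deg: float,
                                    steering_angle_deg: float,
                                    road_attributes: dict[str, str]) -> str:
    """Combine running gradient, steering angle and cached road attributes into a steering action."""
    if steering_angle_deg > 30:        # assumed convention: positive angles are left turns
        turn = "left"
    elif steering_angle_deg < -30:
        turn = "right"
    else:
        turn = "straight"
    slope = "up" if grade_deg > 1 else "down" if grade_deg < -1 else "flat"
    road = road_attributes.get(f"{turn}_{slope}", "unknown road")
    return f"{turn}, {slope}, onto {road}"
```

With the example above, a grade greater than 1° and a 45° steering angle would yield the key "left_up" and hence the bridge segment.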
Step 204, comparing the second steering action with a first steering action cached in a vehicle-mounted system in advance;
as an example, the first turning maneuver may be a turning maneuver that the vehicle indicated in the navigation route should perform at an intersection, e.g. the vehicle should make a left turn uphill at intersection 1 towards a bridge.
In a specific implementation, the remote server can determine the distance the vehicle should travel, and the intersection and direction to turn, for each road segment when planning the navigation route. Therefore, after the vehicle-mounted system acquires the navigation route, the navigation route including the first turning action can be stored into the cache region of the navigation application, and when the related information of the navigation route is called next time, the navigation route can be directly read from the cache region.
When the vehicle passes through the intersection, the vehicle-mounted system can search for a first steering action in the navigation route, compare whether a second steering action of the vehicle is consistent with the first steering action, and judge whether the vehicle deviates from the navigation route.
Step 205, when the second steering action is the same as the first steering action, deleting the first steering action;
when the second steering action is consistent with the first steering action, it can be determined that the vehicle has not deviated from the navigation route. Since the vehicle has now passed the intersection and the first steering action for that intersection has already been compared, the vehicle-mounted system can delete the first steering action from the cache to save storage space.
Step 206, when the second steering action is different from the first steering action, generating a deviation route result;
when the second steering action differs from the first steering action, for example the second steering action is "turn left downhill onto the lower level of the elevated road" while the first steering action is "go straight on the level road towards road segment A", the vehicle-mounted system can generate a route deviation result.
And step 207, when the navigation route is determined to deviate according to the comparison result, deviation correction processing is carried out.
When the comparison result is a deviation route result, the vehicle is determined to have deviated from the navigation route, and the vehicle-mounted system can further perform deviation correction processing according to the deviation result and the response condition of the positioning system.
In an embodiment of the present application, the following sub-steps may be adopted for the deviation correction process:
a substep 41 of waiting for a deviation response of the course of the positioning system;
as an example, the route deviation response may be a response made by the positioning system to determine that the vehicle has deviated from the navigation route.
When working normally, the positioning system can acquire real-time position information of the vehicle and send a vehicle-deviation prompt to the map engine whenever the vehicle leaves the navigation route. On that basis, the vehicle-mounted system can wait for a route deviation response from the positioning system when it has itself determined that the vehicle has deviated from the navigation route.
Substep 42, when the route deviation response is received within a preset time, deleting the first steering action;
When the vehicle-mounted system receives the route deviation response within the preset time, it can be determined that the positioning system is still working normally and will send the navigation route deviation information to the map engine; the vehicle-mounted system can then ignore its own route deviation result and delete the first steering action in the cache.
Substep 43, re-executing step 204.
After deleting the first steering action, the vehicle-mounted system can continue to acquire a second steering action of the vehicle at the next intersection, and re-execute step 204 to compare whether that second steering action matches the first steering action for the next intersection in the navigation route.
In another embodiment of the present application, the deviation correction process may be performed using the following sub-steps:
a substep 51 of obtaining the current position of the vehicle when the route deviation response is not received within a preset time;
if the vehicle-mounted system does not receive the route deviation response within the preset time, it can be determined that the positioning system has failed and can no longer judge whether the vehicle has deviated from the navigation route; in that case the vehicle-mounted system can calculate the current position of the vehicle from its travel time and travel speed.
And a substep 52 of replanning the navigation route with the current position when the distance between the current position and the intersection is greater than a distance threshold.
When the distance between the current position of the vehicle and the intersection is greater than the distance threshold, for example greater than 200 metres, it can be determined that the vehicle has deviated considerably from the navigation route and can no longer continue along the original route. The vehicle-mounted system can then send the current position of the vehicle to the map engine, and the map engine can replan the navigation route according to the current position and the driving destination.
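Sub-steps 41 to 43 and 51 to 52 together form the deviation correction flow. The sketch below shows the two branches, the positioning system answering within the preset time versus a timeout followed by dead reckoning and replanning; the helper callables and the two numeric values are stand-ins, not values taken from the application.

```python
PRESET_WAIT_S = 5.0           # assumed preset time to wait for the positioning system
DISTANCE_THRESHOLD_M = 200.0  # replanning threshold, from the 200-metre example above

def handle_route_deviation(turn_list_cache,
                           wait_for_deviation_response,
                           dead_reckon_position,
                           distance_to_intersection,
                           replan_route) -> None:
    """Branch on whether the positioning system confirms the deviation within the preset time."""
    if wait_for_deviation_response(timeout_s=PRESET_WAIT_S):
        # Positioning system still works and will drive the normal re-routing path:
        # ignore the locally generated deviation result and drop the cached first steering action.
        turn_list_cache.popleft()
        return
    # No response: assume the positioning system has failed and fall back to dead reckoning.
    current_position = dead_reckon_position()            # from travel time and travel speed
    if distance_to_intersection(current_position) > DISTANCE_THRESHOLD_M:
        replan_route(current_position)                    # request a new navigation route
```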
In another embodiment of the present application, the sub-step 52 may be further followed by the following sub-steps: step 201 is re-executed.
After acquiring the new navigation route, the vehicle can travel again according to the re-planned navigation route, and the vehicle-mounted system can re-execute step 201 to acquire a second steering action at the intersection in the new navigation route during the vehicle traveling process.
In the embodiment of the application, after the vehicle travels along the set navigation route, a road scene image is acquired and recognized. When the road scene image includes intersection scene features, a second steering action of the vehicle passing through the intersection on the navigation route can be acquired and compared with the first steering action pre-cached in the vehicle-mounted system; the first steering action is deleted when the two are the same, and a route deviation result is generated when they differ. When the comparison result shows that the vehicle has deviated from the navigation route, deviation correction processing is carried out. Whether the vehicle has deviated from the navigation route is thus recognized in time: whether the current driving route matches the navigation route can be judged without relying on a positioning system, and the route is replanned when it does not, so the situation in which a deviation cannot be discovered in time because the positioning system has failed is avoided, recognition efficiency is improved, and user experience is improved.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the embodiments. Further, those skilled in the art will also appreciate that the embodiments described in the specification are presently preferred and that no particular act is required of the embodiments of the application.
Referring to fig. 5, a flowchart illustrates the steps of another navigation method for a vehicle according to an embodiment of the present application. The method may be applied to an on-board system and, specifically, may include the following steps:
step 501, after a vehicle runs according to a set navigation route, determining a starting point position of the navigation route;
step 502, determining an intersection corresponding to the starting point position in the navigation route and a second real-time position of the vehicle;
as an example, the second real-time location may be a location where the vehicle is currently located.
In a specific implementation, the navigation route may be formed by connecting a plurality of intersections with a plurality of road segments, and after the start point position is determined, the vehicle-mounted system may further determine, from the navigation route, an intersection which is the shortest distance from the start point position, that is, an intersection closest to the start point position.
Further, when the vehicle departs from the starting point position, the vehicle-mounted system may start timing, acquire the travel speed and travel time from the central gateway, calculate in real time the distance the vehicle has travelled since leaving the starting point position, and thereby determine the second real-time position of the vehicle. For example, if the vehicle drives towards the next intersection in the navigation route at 10 m/s and has been under way for 20 seconds, the vehicle-mounted system can determine that it has travelled 200 metres from the starting point position and use this to determine the second real-time position.
Step 503, when the distance between the second real-time position and the intersection is smaller than a distance threshold, acquiring a road scene image;
as an example, the road scene image may be an image of the road environment around the vehicle captured by a camera, and the vehicle-mounted system may acquire the road scene image by the camera installed on the roof of the vehicle or in the cab of the vehicle.
After the second real-time position is determined, the vehicle-mounted system can judge whether the distance between the second real-time position and the intersection is smaller than a preset distance threshold. If it is, the vehicle is approaching the intersection and the vehicle-mounted system can start the camera to acquire a road scene image; if the distance is still larger than the threshold, the vehicle-mounted system keeps calculating the second real-time position of the vehicle in real time and continues to monitor whether the distance has fallen below the threshold.
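A compact sketch of the monitoring loop in steps 502 and 503; the distance threshold and the helper callables are assumptions.

```python
def monitor_and_capture(distance_to_intersection_m,
                        wait_for_next_sample,
                        capture_road_image,
                        distance_threshold_m: float = 50.0) -> None:
    """Keep re-estimating the second real-time position until the vehicle nears the intersection."""
    while distance_to_intersection_m() >= distance_threshold_m:
        wait_for_next_sample()   # next speed/time sample updates the dead-reckoned position
    capture_road_image()         # vehicle is approaching the intersection: start the camera
```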
Step 504, identifying the road scene image;
step 505, when the road scene image is determined to include the intersection scene characteristics according to the recognition result, acquiring a second steering action of the vehicle passing through the intersection on the navigation route;
step 506, comparing the second steering action with a first steering action cached in advance in the vehicle-mounted system;
and 507, performing deviation correction processing when the deviation of the navigation route is determined according to the comparison result.
In the embodiment of the application, after the vehicle travels along the set navigation route, the starting point position of the navigation route is determined, together with the intersection corresponding to the starting point position in the navigation route and the second real-time position of the vehicle. When the distance between the second real-time position and the intersection is smaller than a distance threshold, a road scene image is acquired and recognized. When the recognition result shows that the road scene image includes intersection scene features, a second steering action of the vehicle passing through the intersection on the navigation route can be acquired and compared with the first steering action pre-cached in the vehicle-mounted system, and deviation correction processing is carried out when the comparison result shows that the vehicle has deviated from the navigation route. Whether the vehicle has deviated from the navigation route is thus recognized in time; whether the current driving route matches the navigation route can be judged without relying on a positioning system, and the route is replanned when it does not, so the situation in which a deviation cannot be discovered in time because the positioning system has failed is avoided, recognition efficiency is improved, and user experience is improved.
Referring to fig. 6, a flowchart illustrates the steps of another navigation method for a vehicle according to an embodiment of the present application. The method may be applied to an on-board system and, specifically, may include the following steps:
601, after a vehicle runs according to a set navigation route, determining a starting point position of the navigation route;
step 602, determining an intersection corresponding to the starting point position in the navigation route, and a preset shooting area taking the intersection as a center;
as an example, the preset shooting area may be an area where the in-vehicle system turns on a camera to shoot.
Because the navigation route can be formed by connecting a plurality of road sections by a plurality of intersections, after the starting point position is determined, the vehicle-mounted system can further determine the intersection with the shortest distance to the starting point position from the navigation route as the intersection corresponding to the starting point position.
In practical applications, a shooting area can be preset for each intersection; for example, the preset shooting area can be generated with the intersection as the centre of a circle and a preset distance as the radius. When the vehicle-mounted system acquires the navigation route, the map engine can send it a list of correspondences between preset shooting areas and intersections, and after determining the intersection the vehicle-mounted system can look up the preset shooting area in that list. Alternatively, a preset radius may be stored in the vehicle-mounted system, and after the intersection is determined the vehicle-mounted system can obtain the preset radius and compute the preset shooting area centred on the intersection. Furthermore, the shape of the preset shooting area is not limited to a circle; it may also be an annulus, a rectangle or a polygon.
Step 603, acquiring a third real-time position of the vehicle, and acquiring a road scene image when the third real-time position is in the preset shooting area;
as an example, the third real-time location may be a location where the vehicle is currently located; the road scene image may be an image of the road environment around the vehicle captured by the camera.
During driving, the vehicle-mounted system may obtain the third real-time position of the vehicle in real time. Specifically, it may send a position acquisition request to the map engine and determine the third real-time position through the positioning system; or, after the vehicle has left the starting point position, it may calculate from the travel speed and travel time the distance the vehicle has covered since departure and determine the third real-time position from that.
After the third real-time position is obtained, the vehicle-mounted system can judge whether it falls within the preset shooting area. When it does, the vehicle is determined to be close to the intersection, and the camera can be started to acquire a road scene image; when it does not, the vehicle-mounted system can update the third real-time position of the vehicle at a preset time interval and keep monitoring whether the vehicle has entered the preset shooting area.
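The circular preset shooting area of steps 602 and 603 reduces to a point-in-circle test. The sketch below treats the coordinates as planar for simplicity, which is an assumption; real code would use a geodesic distance.

```python
import math

def in_preset_shooting_area(vehicle_xy: tuple,
                            intersection_xy: tuple,
                            preset_radius_m: float) -> bool:
    """True when the third real-time position lies inside the circle centred on the intersection."""
    dx = vehicle_xy[0] - intersection_xy[0]
    dy = vehicle_xy[1] - intersection_xy[1]
    return math.hypot(dx, dy) <= preset_radius_m
```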
Step 604, identifying the road scene image;
step 605, when the road scene image is determined to include the intersection scene characteristics according to the recognition result, acquiring a second steering action of the vehicle passing through the intersection on the navigation route;
step 606, comparing the second steering action with a first steering action pre-cached in the vehicle-mounted system;
and step 607, when the navigation route is determined to deviate according to the comparison result, deviation correction processing is carried out.
In the embodiment of the application, after the vehicle travels along the set navigation route, the starting point position of the navigation route is determined, together with the intersection corresponding to the starting point position in the navigation route and a preset shooting area centred on that intersection. A third real-time position of the vehicle can then be obtained, and when the third real-time position lies within the preset shooting area a road scene image is acquired and recognized; when the recognition result shows that the road scene image includes intersection scene features, a second steering action of the vehicle passing through the intersection on the navigation route can be acquired. The second steering action is compared with the first steering action pre-cached in the vehicle-mounted system, and deviation correction processing is carried out when the comparison result shows that the vehicle has deviated from the navigation route. Whether the vehicle has deviated from the navigation route is thus recognized in time; whether the current driving route matches the navigation route can be judged without relying on a positioning system, and the route is replanned when it does not, so the situation in which a deviation cannot be discovered in time because the positioning system has failed is avoided, recognition efficiency is improved, and user experience is improved.
Referring to fig. 7, a schematic structural diagram of a navigation device of a vehicle according to an embodiment of the present application is shown. The device may be applied to an on-board system and, specifically, may include the following modules:
a second steering action obtaining module 701, configured to obtain a second steering action of the vehicle passing through an intersection on the navigation route after the vehicle travels according to the set navigation route;
a comparison module 702, configured to compare the second steering action with a first steering action cached in advance in the vehicle-mounted system;
and the deviation correcting module 703 is configured to perform deviation correction processing, by invoking a deviation processing module, when it is determined according to the comparison result that the vehicle has deviated from the navigation route.
In an embodiment of the present application, the second steering action obtaining module 701 may include:
the road scene image acquisition submodule is used for acquiring a road scene image after the vehicle runs according to the set navigation route;
the identification submodule is used for identifying the road scene image;
and the turning action determining submodule is used for acquiring a second turning action of the vehicle passing through the intersection on the navigation route when the road scene image is determined to comprise the intersection scene characteristics according to the recognition result.
In an embodiment of the present application, the road scene image obtaining sub-module may include:
the first starting point position determining unit is used for determining the starting point position of the navigation route after the vehicle runs according to the set navigation route;
the route distance acquisition unit is used for determining the intersection corresponding to the starting point position in the navigation route and acquiring the route distance between the starting point position and the intersection;
a travel distance calculation unit for calculating a travel distance of the vehicle traveling from the start position to a first real-time position;
and the first judging unit is used for acquiring a road scene image when the difference between the route distance and the driving distance is smaller than a difference threshold.
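As a minimal sketch of the check performed by the first judging unit, assuming distances in metres and an illustrative (not prescribed) 50-metre threshold:

```python
def should_capture_by_remaining_distance(route_distance_m: float,
                                         driving_distance_m: float,
                                         diff_threshold_m: float = 50.0) -> bool:
    """Trigger road-scene image capture when the remaining distance to the
    intersection (route distance minus distance already driven) drops below
    the difference threshold."""
    return (route_distance_m - driving_distance_m) < diff_threshold_m
```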
In another embodiment of the present application, the road scene image acquisition sub-module may include:
a second starting point position determination unit for determining a starting point position of the navigation route after the vehicle travels according to the set navigation route;
the second real-time position determining unit is used for determining an intersection corresponding to the starting point position in the navigation route and a second real-time position of the vehicle;
and the second judging unit is used for acquiring a road scene image when the distance between the second real-time position and the intersection is smaller than a distance threshold.
In another embodiment of the present application, the road scene image acquisition sub-module may include:
a third starting point position determination unit for determining a starting point position of the navigation route after the vehicle travels according to the set navigation route;
the preset shooting area determining unit is used for determining an intersection corresponding to the starting point position in the navigation route and a preset shooting area taking the intersection as a center;
and the third judging unit is used for acquiring a third real-time position of the vehicle and acquiring a road scene image when the third real-time position is in the preset shooting area.
In an embodiment of the present application, the identification sub-module may include:
the system comprises a feature library establishing unit, a feature library generating unit and a feature library generating unit, wherein the feature library establishing unit is used for establishing an intersection scene feature library, and the intersection scene feature library comprises candidate feature vectors of one or more intersection scene images;
the real-time characteristic vector acquisition unit is used for acquiring a real-time characteristic vector of the road scene image;
and the identification unit is used for identifying the road scene image according to the real-time characteristic vector and the candidate characteristic vector and outputting an identification result.
In an embodiment of the present application, the feature library establishing unit may include:
the intersection scene determining subunit is used for determining all the intersection scenes in the navigation route and acquiring intersection image characteristics of the intersection scenes;
and the candidate feature vector generating subunit is used for generating candidate feature vectors corresponding to the intersection image features.
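The application does not prescribe a particular recognition algorithm; one common way to realize the feature-library matching performed by the identification submodule is cosine similarity between feature vectors, sketched below with NumPy. The 0.8 similarity threshold and the function names are assumptions, and how the feature vectors are extracted from the images (for example by a convolutional network or hand-crafted descriptors) is left open:

```python
import numpy as np


def build_feature_library(candidate_features: list[np.ndarray]) -> np.ndarray:
    """Stack the candidate feature vectors of the intersection scene images on
    the navigation route into one matrix, one normalized row per image."""
    return np.vstack([f / np.linalg.norm(f) for f in candidate_features])


def contains_intersection_scene(library: np.ndarray,
                                realtime_feature: np.ndarray,
                                similarity_threshold: float = 0.8) -> bool:
    """Compare the real-time feature vector of the road scene image against every
    candidate feature vector; report a hit (the image is judged to include
    intersection scene characteristics) if any cosine similarity is high enough."""
    realtime = realtime_feature / np.linalg.norm(realtime_feature)
    similarities = library @ realtime
    return bool(np.max(similarities) >= similarity_threshold)
```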
In an embodiment of the present application, the steering action determination sub-module may include:
the angular velocity acquisition unit is used for acquiring the acceleration of the vehicle passing through the intersection on the navigation route when the road scene image comprises the intersection scene characteristics;
a running gradient calculation unit for calculating a running gradient by using the acceleration;
a second steering action determination unit for determining a second steering action of the vehicle in accordance with the running gradient.
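The application does not define how the running gradient is derived from the acceleration; purely for illustration, one plausible reading treats it as the average ratio of lateral to longitudinal acceleration sampled while the vehicle passes the intersection, so that a large magnitude suggests a turn and a value near zero suggests the vehicle went straight:

```python
def running_gradient(lateral_accels: list[float],
                     longitudinal_accels: list[float]) -> float:
    """Illustrative reading of the running gradient: the mean ratio of lateral to
    longitudinal acceleration over the samples collected through the intersection."""
    if not lateral_accels:
        return 0.0  # no samples collected yet
    eps = 1e-6  # guard against division by zero when the vehicle coasts
    ratios = [lat / (abs(lon) + eps)
              for lat, lon in zip(lateral_accels, longitudinal_accels)]
    return sum(ratios) / len(ratios)
```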
In an embodiment of the present application, the second steering action determining unit may include:
a first acquisition subunit configured to acquire a steering angle of the vehicle when the running gradient is within a preset gradient range;
a first identification subunit, configured to identify a second steering action of the vehicle using the travel gradient and the steering angle.
In another embodiment of the present application, the second steering action determining unit may further include:
a second acquisition subunit configured to acquire a steering angle of the vehicle when the running gradient is outside a preset gradient range;
the road attribute acquiring subunit is used for acquiring the road attribute corresponding to the intersection in the navigation route;
a second identification subunit, configured to identify a second steering action of the vehicle using the travel gradient, the steering angle, and the road attribute.
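Pulling the two branches together, the second steering action could be classified roughly as follows; the gradient range, the angle threshold, the sign convention, and the way a curved-road attribute is used to suppress a false turn are all assumptions made here for illustration, not values fixed by the application:

```python
from typing import Optional, Tuple


def classify_steering_action(gradient: float,
                             steering_angle_deg: float,
                             road_attribute: Optional[str] = None,
                             gradient_range: Tuple[float, float] = (-0.5, 0.5),
                             angle_threshold_deg: float = 15.0) -> str:
    """Return 'left', 'right' or 'straight' as the second steering action.

    Inside the preset gradient range only the gradient and steering angle are used;
    outside it, the road attribute of the intersection is also consulted so that,
    for example, following a naturally curving road is not mistaken for a turn.
    A positive steering angle is taken here as a left turn (assumed convention)."""
    low, high = gradient_range
    if low <= gradient <= high:
        if abs(steering_angle_deg) < angle_threshold_deg:
            return "straight"
        return "left" if steering_angle_deg > 0 else "right"
    # Outside the preset range: let a curved-road attribute widen the tolerance.
    if road_attribute == "curved_road" and abs(steering_angle_deg) < 2 * angle_threshold_deg:
        return "straight"
    return "left" if steering_angle_deg > 0 else "right"
```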
In an embodiment of the present application, the apparatus may include:
a deleting module, configured to delete the first steering action when the second steering action is the same as the first steering action;
and the deviation result generating module is used for generating a deviation route result when the second steering action is different from the first steering action.
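As a minimal sketch of the behaviour of the deleting module and the deviation result generating module, with the pre-cached first steering actions represented as a plain dictionary keyed by an intersection identifier (the data structure and names are assumptions, not part of the claimed device):

```python
def compare_and_update(cache: dict[str, str],
                       intersection_id: str,
                       second_action: str) -> bool:
    """Compare the real-time second steering action with the pre-cached first one.

    Returns True when a deviation-route result should be generated. When the two
    actions match, the cached first steering action for this intersection is
    deleted so that the next intersection's entry becomes the one to compare."""
    first_action = cache.get(intersection_id)
    if first_action is None:
        return False  # nothing cached for this intersection
    if second_action == first_action:
        del cache[intersection_id]
        return False
    return True
```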
In an embodiment of the present application, the deviation processing module includes:
the waiting submodule is used for waiting for a route deviation response of the positioning system;
and the deleting submodule is used for deleting the first steering action and calling the comparing module 702 when the route deviation response is received within the preset time.
In an embodiment of the present application, the deviation processing module may further include:
the position obtaining submodule is used for obtaining the current position of the vehicle when the route deviation response is not received within the preset time;
and the planning submodule is used for re-planning the navigation route by using the current position when the distance between the current position and the intersection is greater than a distance threshold.
In an embodiment of the application, the planning sub-module may call the road scene image obtaining sub-module after the navigation route is re-planned.
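An illustrative sketch of the deviation processing carried out by the waiting, deleting, position obtaining and planning submodules is given below; the ten-second timeout, the 80-metre threshold and every callable passed in are placeholders rather than interfaces defined by this application:

```python
from typing import Callable, Tuple


def handle_deviation(wait_for_deviation_response: Callable[[float], bool],
                     get_current_position: Callable[[], Tuple[float, float]],
                     distance_to_intersection_m: Callable[[Tuple[float, float]], float],
                     replan_route: Callable[[Tuple[float, float]], None],
                     delete_cached_first_action: Callable[[], None],
                     timeout_s: float = 10.0,
                     distance_threshold_m: float = 80.0) -> None:
    """Wait for the positioning system to confirm the route deviation within a
    preset time. If the confirmation arrives, drop the stale cached first steering
    action and let the caller re-run the comparison; if it does not arrive (for
    example because the positioning system is faulty), fall back to re-planning
    from the current position once the vehicle is far enough past the intersection."""
    if wait_for_deviation_response(timeout_s):
        delete_cached_first_action()
        return
    current = get_current_position()
    if distance_to_intersection_m(current) > distance_threshold_m:
        replan_route(current)
```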
In the embodiment of the application, navigation route information of the vehicle is acquired, wherein the navigation route information comprises intersection driving information; real-time steering information of the vehicle driving to an intersection is determined, and navigation route deviation information is generated when the real-time steering information is different from the intersection driving information. In this way, a deviation of the vehicle from the navigation route is recognized in time, and whether the vehicle deviates from the navigation route can be judged according to the real-time steering action of the vehicle without relying on a positioning system, which improves the recognition efficiency and the recognition accuracy.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present application also provides an electronic device, which may include a processor, a memory, and a computer program stored on the memory and capable of running on the processor, wherein the computer program, when executed by the processor, implements the steps of the navigation method of the vehicle as above.
An embodiment of the present application also provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the navigation method of the vehicle as above.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The navigation method and device of a vehicle, the automobile, and the storage medium provided by the present application are described in detail above. The principle and implementation of the present application are explained herein by using specific examples, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (17)

1. A method of navigating a vehicle, the method comprising:
after the vehicle runs according to the set navigation route, acquiring a second steering action of the vehicle passing through an intersection on the navigation route;
comparing the second steering action with a first steering action pre-cached in a vehicle-mounted system;
and when the navigation route is determined to deviate according to the comparison result, deviation correction processing is carried out.
2. The method of claim 1, wherein the step of obtaining a second steering action of the vehicle passing through the intersection on the navigation route after the vehicle travels according to the set navigation route comprises:
acquiring a road scene image after a vehicle runs according to a set navigation route;
identifying the road scene image;
and when the road scene image is determined to comprise the intersection scene characteristics according to the recognition result, acquiring a second steering action of the vehicle passing through the intersection on the navigation route.
3. The method of claim 2, wherein the step of obtaining the road scene image after the vehicle is driven according to the set navigation route comprises:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route, and acquiring a route distance between the starting point position and the intersection;
calculating a driving distance of the vehicle from the starting point position to a first real-time position;
and when the difference value between the route distance and the driving distance is smaller than a difference threshold value, acquiring a road scene image.
4. The method of claim 2, wherein the step of obtaining the road scene image after the vehicle is driven according to the set navigation route comprises:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route and a second real-time position of the vehicle;
and when the distance between the second real-time position and the intersection is smaller than a distance threshold value, acquiring a road scene image.
5. The method of claim 2, wherein the step of obtaining the road scene image after the vehicle is driven according to the set navigation route comprises:
determining the starting point position of a navigation route after a vehicle runs according to the set navigation route;
determining an intersection corresponding to the starting point position in the navigation route and a preset shooting area taking the intersection as a center;
and acquiring a third real-time position of the vehicle, and acquiring a road scene image when the third real-time position is in the preset shooting area.
6. The method of claim 2, wherein the step of identifying the road scene image comprises:
establishing an intersection scene feature library, wherein the intersection scene feature library comprises candidate feature vectors of one or more intersection scene images;
acquiring a real-time feature vector of the road scene image;
and identifying the road scene image according to the real-time characteristic vector and the candidate characteristic vector, and outputting an identification result.
7. The method of claim 6, wherein the step of creating an intersection scene feature library comprises:
determining all crossing scenes in the navigation route, and acquiring crossing image characteristics of the crossing scenes;
and generating a candidate feature vector corresponding to the intersection image feature.
8. The method of claim 2, wherein the step of obtaining a second steering action of the vehicle passing through the intersection on the navigation route when it is determined that the road scene image includes the intersection scene feature according to the recognition result comprises:
when the road scene image comprises the intersection scene characteristics, acquiring the acceleration of the vehicle passing through the intersection on the navigation route;
calculating a running gradient by using the acceleration;
determining a second steering action of the vehicle based on the travel grade.
9. The method of claim 8, wherein the step of determining a second steering action of the vehicle based on the grade of travel comprises:
when the running gradient is within a preset gradient range, acquiring a steering angle of the vehicle;
and identifying a second steering action of the vehicle by using the running gradient and the steering angle.
10. The method of claim 8, wherein the step of determining a second steering action of the vehicle based on the travel grade further comprises:
when the running gradient is out of a preset gradient range, acquiring a steering angle of the vehicle;
acquiring a road attribute corresponding to the intersection in the navigation route;
identifying a second steering action of the vehicle using the travel grade, the steering angle, and the road attribute.
11. The method of claim 1, wherein the step of comparing the second steering action with the first steering action pre-cached in the vehicle-mounted system further comprises:
deleting the first steering action when the second steering action is the same as the first steering action;
and when the second steering action is different from the first steering action, generating a deviation route result.
12. The method of claim 11, wherein the step of performing a deviation correction process comprises:
waiting for a deviation in course response by the positioning system;
deleting the first steering action when the route deviation response is received within a preset time;
and re-executing the step of comparing the second steering action with the first steering action cached in the vehicle-mounted system in advance.
13. The method of claim 11, wherein the step of performing a deviation correction process further comprises:
when the route deviation response is not received within the preset time, the current position of the vehicle is obtained;
and when the distance between the current position and the intersection is greater than a distance threshold value, replanning the navigation route by using the current position.
14. The method of claim 13, wherein after the step of replanning the navigation route with the current location when the distance from the current location to the intersection is greater than the distance threshold, further comprising:
and re-executing the step of acquiring the road scene image after the vehicle runs according to the set navigation route.
15. A navigation device of a vehicle, characterized in that the device comprises:
the second steering action acquisition module is used for acquiring a second steering action of the vehicle passing through an intersection on the navigation route after the vehicle runs according to the set navigation route;
the comparison module is used for comparing the second steering action with a first steering action cached in advance in a vehicle-mounted system;
and the deviation correcting module is used for performing deviation correcting processing when the deviation of the navigation route is determined according to the comparison result.
16. A vehicle, characterized by comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, implementing the steps of the navigation method of the vehicle according to any one of claims 1 to 14.
17. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the navigation method of the vehicle according to any one of claims 1 to 14.
CN201910975098.5A 2019-10-14 2019-10-14 Navigation method and device for vehicle, automobile and storage medium Pending CN110702135A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910975098.5A CN110702135A (en) 2019-10-14 2019-10-14 Navigation method and device for vehicle, automobile and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910975098.5A CN110702135A (en) 2019-10-14 2019-10-14 Navigation method and device for vehicle, automobile and storage medium

Publications (1)

Publication Number Publication Date
CN110702135A true CN110702135A (en) 2020-01-17

Family

ID=69199755

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910975098.5A Pending CN110702135A (en) 2019-10-14 2019-10-14 Navigation method and device for vehicle, automobile and storage medium

Country Status (1)

Country Link
CN (1) CN110702135A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111623794A (en) * 2020-05-15 2020-09-04 广州小鹏车联网科技有限公司 Display control method for vehicle navigation, vehicle and readable storage medium
CN113237487A (en) * 2021-04-09 2021-08-10 烟台杰瑞石油服务集团股份有限公司 Vision-aided navigation method and device
CN114740514A (en) * 2022-06-09 2022-07-12 武汉地铁集团有限公司 Method, system, electronic device and storage medium for positioning patrolman in subway station
CN117705141A (en) * 2024-02-06 2024-03-15 腾讯科技(深圳)有限公司 Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5043902A (en) * 1987-12-28 1991-08-27 Aisin Aw Co., Ltd. Vehicular navigation apparatus
CN101246018A (en) * 2008-03-14 2008-08-20 凯立德欣技术(深圳)有限公司 Road indication method, device and navigator supporting image
CN103837152A (en) * 2014-01-03 2014-06-04 观致汽车有限公司 Intelligent turning prompting system and method for driving of vehicle
CN106156723A (en) * 2016-05-23 2016-11-23 北京联合大学 A kind of crossing fine positioning method of view-based access control model
CN106643757A (en) * 2015-10-30 2017-05-10 中国电信股份有限公司 Method and device for judging whether vehicle goes on or off ramp
CN106908069A (en) * 2015-12-23 2017-06-30 大陆汽车投资(上海)有限公司 Navigation auxiliary based on prediction
US20190130736A1 (en) * 2017-10-31 2019-05-02 Waymo Llc Detecting and responding to traffic redirection for autonomous vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination