CN110726417A - Vehicle yaw identification method, device, terminal and storage medium - Google Patents

Vehicle yaw identification method, device, terminal and storage medium

Info

Publication number
CN110726417A
Authority
CN
China
Prior art keywords
road
observation
sample
target
positioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911102568.3A
Other languages
Chinese (zh)
Other versions
CN110726417B (en)
Inventor
吴跃进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911102568.3A priority Critical patent/CN110726417B/en
Publication of CN110726417A publication Critical patent/CN110726417A/en
Application granted granted Critical
Publication of CN110726417B publication Critical patent/CN110726417B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 — Route searching; Route guidance
    • G01C21/3407 — Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 — Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 — Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 — Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 — Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 — Determining position

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)

Abstract

The application discloses a vehicle yaw identification method, device, terminal and storage medium. The method comprises the following steps: acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle during the driving process and the characteristic information of the positioning points; acquiring the characteristics of the positioning points and the characteristics of the candidate observation roads of the positioning points, and determining, among the candidate observation roads, a target road matched with a target positioning point among the positioning points according to the characteristics of the positioning points and the characteristics of the candidate observation roads; and comparing a preset route with the target road to determine whether the target vehicle deviates from the preset route. In this way, whether the target vehicle deviates from the preset route can be identified more accurately and quickly during the driving process, and a more correct navigation route can be provided for the user.

Description

Vehicle yaw identification method, device, terminal and storage medium
Technical Field
The present application relates to the field of positioning technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for identifying yaw of a vehicle.
Background
Nowadays, it is increasingly common for users to navigate vehicle driving with terminal navigation software, which provides convenience for the user's driving.
During navigation, yaw identification is very important for the user's navigation experience: when the user deviates from the planned route, the yaw needs to be identified as early and as accurately as possible so that a new route can be planned, guiding the user to drive accurately and in a timely manner.
At present, a commonly used yaw identification method identifies yaw according to the relationship between the positioning data of the vehicle (including position, direction, speed, accuracy, and the like) and the planned route. This method depends too heavily on the quality of the positioning data: if the quality is low or position drift occurs, identification delays or identification errors easily follow. The user's vehicle may actually have deviated from the planned route while the terminal navigation software has not recognized it, so wrong guidance is still given according to the original route, interfering with the user's driving.
Disclosure of Invention
The application provides a vehicle yaw identification method, device, terminal and storage medium, which can identify more accurately and quickly whether yaw occurs during vehicle driving, so as to provide a more correct navigation route for the user.
In a first aspect, a vehicle yaw identification method is provided, including:
acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle in the driving process and the characteristics of the positioning points;
acquiring the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads;
and comparing a preset route with the target road to determine whether the target vehicle deviates from the preset route.
In a second aspect, there is provided a vehicle yaw identification apparatus comprising: a positioning module, a matching module and a determining module, wherein:
the positioning module is used for acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle in the driving process and characteristic information of the positioning points;
the matching module is used for acquiring the characteristics of the positioning points and the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads;
the determining module is used for comparing a preset route with the target road and determining whether the target vehicle deviates from the preset route.
In a third aspect, a terminal is provided, which includes an input device and an output device, and further includes: a processor adapted to implement one or more instructions; and a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the vehicle yaw identification method according to the first aspect.
In a fourth aspect, there is provided a computer storage medium storing one or more instructions adapted to be loaded by a processor and to perform the steps of the first aspect and any possible implementation thereof.
In the present application, positioning point data of a target vehicle is acquired, the positioning point data comprising a plurality of positioning points of the target vehicle during the driving process and the characteristic information of the positioning points; the characteristics of the positioning points and the characteristics of the candidate observation roads of the positioning points are acquired, and a target road matched with a target positioning point among the positioning points is determined among the candidate observation roads according to these characteristics; the preset route is then compared with the target road to determine whether the target vehicle deviates from the preset route. In this way, whether yaw occurs can be identified more accurately and quickly during vehicle driving, and a more correct navigation route can be provided for the user.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1A is a schematic view of a vehicle driving navigation scene according to an embodiment of the present disclosure;
fig. 1B is a schematic view of another vehicle driving navigation scenario provided in the embodiment of the present application;
FIG. 2 is a schematic flow chart illustrating a method for identifying yaw of a vehicle according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart diagram illustrating another method for identifying yaw of a vehicle according to an embodiment of the present application;
fig. 4A is a schematic diagram of a map corresponding to sample data according to an embodiment of the present disclosure;
fig. 4B is a schematic diagram of a map corresponding to another sample data provided in the embodiment of the present application;
fig. 5A is a schematic map diagram corresponding to another sample data provided in the embodiment of the present application;
FIG. 5B is a schematic view of a navigation map according to an embodiment of the present disclosure;
FIG. 5C is a schematic view of another exemplary navigation map provided in the present application;
FIG. 6A is a schematic yaw diagram provided by an embodiment of the present application;
FIG. 6B is another schematic yaw diagram provided by an embodiment of the present application;
FIG. 6C is a schematic representation of yet another yaw provided by an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a vehicle yaw recognition apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In some terminal navigation software, the user's vehicle has actually deviated from the planned route, but the navigation software does not recognize the deviation and still guides the user according to the original route, interfering with the user's driving.
For example, fig. 1A and 1B show two common vehicle driving navigation scenarios. In fig. 1A, the dotted line is the user's driving trajectory and the solid line is the planned route for navigation. In a main road and auxiliary road scene, the main road and the auxiliary road are close together, so yaw may go unidentified for a long time: the user's driving trajectory is not on the planned route, but guidance is still given according to the original route. In fig. 1B, the dotted line is again the user's driving trajectory and the solid line is the planned route; drift of the positioning points (GPS points) causes erroneous yaw identification. That is, the user's driving trajectory is identified wrongly and the user is mistakenly judged to have deviated from the original planned route (when in fact they have not); the water-drop marker indicates the identified yaw point, and the terminal navigation software provides a wrong new route to the user, seriously interfering with the user's driving.
Yaw identification is therefore very important for the user's navigation experience: when the user deviates from the planned route, the yaw should be identified as early and as accurately as possible, so that a new route can be planned and the user can be guided accurately and in a timely manner.
The embodiments of the present application will be described below with reference to the drawings.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a method for identifying a yaw of a vehicle according to an embodiment of the present application. The method can comprise the following steps:
201. the method comprises the steps of obtaining positioning point data of a target vehicle, wherein the positioning point data comprise a plurality of positioning points of the target vehicle in the driving process and the characteristics of the positioning points.
The execution subject in the embodiments of the present application is a terminal, which may also be referred to as a terminal device in specific implementations, including but not limited to portable devices such as a mobile phone, a laptop computer, or a tablet computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad); the positioning service and the navigation function may be implemented by an application program. It should also be understood that in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
The terminal navigation software can provide a planned route for a user to drive a vehicle and perform voice guidance along the road for the user during the driving process of the user.
The yaw in the embodiments of the present application refers to deviation from the planned route: if the position of the vehicle driven by the user deviates from the planned route provided by the terminal navigation software, the user is considered to have yawed during navigation. Typically, after identifying the yaw, the terminal navigation software will plan a new route for the user.
The positioning point data refers to the obtainable position information of the vehicle positioning points, including the positions of a plurality of positioning points, which can be located through the Global Positioning System (GPS). GPS is a high-precision satellite-based radio navigation positioning system that can provide accurate geographic position, vehicle speed and precise time information anywhere in the world and in near-earth space. The position information of a vehicle positioning point (GPS point) may be acquired via GPS; specifically, the terminal navigation software acquires GPS information through the GPS positioning system carried by the terminal to determine the current position of the target vehicle. GPS positions may have accuracy problems: it cannot be guaranteed that the real position of the vehicle is accurately reflected at any time and place (drift may occur, as described above).
The above-mentioned features of the positioning points can be understood as GPS attributes; that is, they can include position, direction, speed, altitude, accuracy, and the like.
The GPS information of the target vehicle, including a plurality of positioning points of the target vehicle during traveling and characteristics of the positioning points, may be obtained through a GPS positioning system, and step 202 may be performed.
202. And acquiring the characteristics of the candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads.
Optionally, before determining a target road matched with a target anchor point in the anchor points in the candidate observation roads according to the features of the anchor points and the features of the candidate observation roads, the method further includes:
and 2A, determining a first candidate observation road set corresponding to a target positioning point in the positioning points and a second candidate observation road set corresponding to a reference positioning point, wherein the reference positioning point is a previous positioning point of the target positioning point.
The candidate observation roads corresponding to a positioning point need to be confirmed in the navigation software. Since one positioning point matches only one correct road (indicating that the position is on that road), the candidate observation roads can be understood as the surrounding roads (possibly several) to which the position of a positioning point may correspond.
The method for determining the candidate observation roads of a positioning point may include: within a preset range centered on the positioning point, selecting roads in order of perpendicular distance from near to far, up to a preset number, as the candidate observation roads of the positioning point. The preset range and the preset number can be set and modified as needed.
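The selection step above can be sketched as follows. This is a minimal Python illustration, not the patent's implementation; the planar point-to-segment distance, the data layout, and all names are assumptions.

```python
import math

def select_candidate_roads(anchor, roads, search_radius, max_candidates):
    """Pick up to max_candidates roads within search_radius of the anchor
    (positioning) point, ordered by perpendicular distance from near to far."""
    def seg_distance(p, a, b):
        # Distance from point p to segment a-b (planar approximation).
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    scored = []
    for road in roads:
        # A road's distance is the nearest of its segments' distances.
        d = min(seg_distance(anchor, s, e) for s, e in road["segments"])
        if d <= search_radius:
            scored.append((d, road["id"]))
    scored.sort()                       # nearest first
    return [rid for _, rid in scored[:max_candidates]]
```

With a real road network, `search_radius` and `max_candidates` would correspond to the preset range and preset number mentioned above.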
The characteristics of a candidate observation road may include its length, speed limit, road grade, category, number of lanes, tolls, and the like, and may be obtained from a local database, a server, or a third-party system.
One candidate observation road (there may be only one road nearby) or multiple candidate observation roads may be selected around a positioning point as the candidate observation road set corresponding to that positioning point. The most recently obtained positioning point of the current vehicle may be used as the target positioning point and the previous positioning point as the reference positioning point; a first candidate observation road set corresponding to the target positioning point and a second candidate observation road set corresponding to the reference positioning point are determined respectively, and then steps 2B and 2C may be performed.
Further, the determining, among the candidate observation roads, a target road matched with a target positioning point among the positioning points according to the characteristics of the positioning points and the characteristics of the candidate observation roads includes:
2B, obtaining the observation probability of the target positioning point relative to the first candidate observation roads and the transition probability between the second candidate observation roads and the first candidate observation roads according to the characteristics of the positioning points and the characteristics of their candidate observation roads, wherein the observation probability is used to indicate the degree of matching between the target positioning point and a first candidate observation road, and the transition probability is used to indicate the probability of a transition from any of the second candidate observation roads to any of the first candidate observation roads;
and 2C, determining, in the first candidate observation road set, a target road matched with the target positioning point according to the observation probability and the transition probability.
In the embodiments of the present application, the obtained characteristics of each positioning point and of the candidate observation roads can be fed to a trained network model for prediction, and the relationship between positioning points and roads analyzed, to determine the correct road matched with the current vehicle's positioning point and the correct position of that point.
First, the observation probability of the target positioning point relative to each first candidate observation road can be predicted according to the characteristics of the positioning point and the characteristics of its candidate observation roads.
In general, the observation probability is a probability in [0, 1] output by the network model from the features (possibly integrated or converted feature values) between a positioning point and any one of its surrounding roads; the higher the probability, the more closely the positioning point fits the road. In practical application, based on the obtained features and the network model, the observation probability of each positioning point for each candidate observation road can be output.
The transition probability between the second candidate observation roads and the first candidate observation roads can be understood as follows: according to the features between the two positioning points, between the two roads, and between the positioning points and the roads, the network model outputs a probability in [0, 1]; the higher the probability, the higher the possibility of a transition from any of the second candidate observation roads to any of the first candidate observation roads.
From the observation probability and the transition probability, an output probability can be calculated and used as the basis for judging the matched target road. Specifically, for each candidate observation road of the target positioning point, the maximum output probability achievable over all candidate observation roads of the previous point may be calculated as that road's output probability, and the road with the highest output probability may be determined as the target road that best matches the target positioning point.
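This maximize-then-weight calculation resembles a single Viterbi step in a hidden Markov model. A minimal sketch, where the probability values are hypothetical placeholders for the outputs of the trained observation and transition models:

```python
def best_match(prev_scores, observation_prob, transition_prob, candidates):
    """For each candidate road of the target point, take the best score
    achievable from any candidate road of the previous point (weighted by
    the transition probability), then weight by the observation probability.
    Returns the best-matching road and all output probabilities."""
    scores = {}
    for cur in candidates:
        scores[cur] = observation_prob[cur] * max(
            prev_scores[prev] * transition_prob[(prev, cur)]
            for prev in prev_scores
        )
    # The road with the highest output probability is the matched target road.
    return max(scores, key=scores.get), scores
```

For example, with hypothetical values such as `prev_scores = {"main": 0.6, "side": 0.4}`, a high observation probability on the auxiliary road alone is not enough to win if the transition from the previous road is unlikely, which is how drifting GPS points can be kept on the correct road.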
In the embodiments of the present application, the target road is the correct road matched with the current target positioning point; that is, according to the above calculation, the current target vehicle should be located on the target road. Further, step 203 may be performed.
203. And comparing a preset route with the target road to determine whether the target vehicle deviates from the preset route.
The preset route is the planned route provided by the navigation software for the user's driving, and the target road can be understood as the road on which the user is actually driving, as determined by the above method. By comparing the preset route with the target road, whether the target vehicle deviates from the preset route can be determined.
Specifically, in the actual vehicle navigation process, the best-matching road and position of the vehicle's current positioning point can be calculated based on the above method and compared with the planned route. If the comparison result is consistent, no yaw is considered to have occurred; if it is inconsistent, yaw is considered to have occurred.
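The comparison step can be sketched as follows. The consecutive-mismatch threshold is an added assumption (the patent only specifies consistent/inconsistent comparison); it is shown here as one plausible way to damp a single noisy point:

```python
class YawDetector:
    """Compares each matched road against the planned route.
    threshold: number of consecutive off-route matches required before
    declaring yaw (an assumption added for noise robustness)."""
    def __init__(self, planned_route_ids, threshold=3):
        self.planned = set(planned_route_ids)
        self.threshold = threshold
        self.mismatches = 0

    def update(self, matched_road_id):
        if matched_road_id in self.planned:
            self.mismatches = 0
            return False          # consistent with planned route: no yaw
        self.mismatches += 1
        return self.mismatches >= self.threshold
```

On a yaw result (`True`), the navigation software would replan a route from the matched road and position, as described below.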
In an embodiment, after the step 203, the method may further include: and outputting yaw prompt information.
The yaw prompt information can be voice, image or text information used to prompt the user that the current driving route has deviated from the planned navigation route. Further optionally, the method may further include:
replanning a new route for the target vehicle based on the target position and the target road;
and displaying the changed route on a map.
Specifically, when deviation from the preset route is confirmed, a new driving route (that is, the changed route) may be planned for the user in combination with the current vehicle position and the surrounding road environment, displayed on the map, and accompanied by corresponding navigation voice output. By monitoring the user's actual driving route and adjusting in real time, a more accurate and faster navigation service is provided.
In summary, positioning point data of the target vehicle is acquired, comprising a plurality of positioning points of the target vehicle during the driving process and the characteristic information of the positioning points; the characteristics of the positioning points and the characteristics of their candidate observation roads are acquired, and a target road matched with a target positioning point is determined among the candidate observation roads according to these characteristics; the preset route is then compared with the target road to determine whether the target vehicle deviates from it. In this way, whether yaw occurs can be identified more accurately and quickly during vehicle driving, and a more correct navigation route can be provided for the user.
Referring to fig. 3, fig. 3 is a schematic flowchart of another vehicle yaw identification method according to an embodiment of the present application, which can be applied to step 202 and step 203 of the embodiment shown in fig. 2. As shown in fig. 3, the method may include:
301. and calling an observation calculation module to obtain observation probability according to the characteristics of the positioning points, the characteristics of the candidate observation roads and the communication relation among the candidate observation roads.
302. And calling a transition calculation module to obtain transition probability according to the features between the target positioning point and the reference positioning point, the features between the first candidate observation road and the second candidate observation road, the features between the target positioning point and the first candidate observation road and the features between the reference positioning point and the second candidate observation road.
Specifically, based on the description in the embodiment shown in fig. 2, the observation probability and the transition probability may be obtained by prediction through a trained model. Further, the observation calculation module and the transition calculation module can be called to obtain the observation probability and the transition probability respectively.
The execution subject in the embodiments of the present application is the terminal; in specific implementations, the terminal may also be referred to as a terminal device, including but not limited to portable devices such as a mobile phone, a laptop computer, or a tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad), which may provide positioning services and navigation functions. It should also be understood that in some embodiments, the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
The observation probability and the transition probability in the embodiment of the application can be obtained by means of the observation calculation module and the transition calculation module through a pre-trained network model prediction. Specifically, the method comprises the following steps:
in one embodiment, the observation calculation module includes an observation prediction model, and the training method of the observation prediction model includes:
obtaining first positioning point sample data, wherein, in the first positioning point sample data, each sample positioning point together with the first sample road to which it belongs is marked as a positive sample, and each sample positioning point together with any sample road other than the first sample road to which it belongs is marked as a negative sample;
and training the feature parameters of a network model with the first positioning point sample data to obtain the observation prediction model.
The method for obtaining the first positioning point sample data may include:
within a preset sample range centered on a sample positioning point, selecting roads in order of perpendicular distance from near to far, up to a preset sample number, as the sample roads of the sample positioning point;
acquiring the characteristics of the sample positioning point and the characteristics of the sample road;
determining a first sample road to which each sample positioning point belongs according to the positioning data of the historical driving track;
and marking the first sample road as a positive sample corresponding to the sample positioning point, and marking the sample roads other than the first sample road as negative samples corresponding to the sample positioning point, to obtain the first positioning point sample data.
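The labeling steps above can be sketched as follows; the data layout is an assumption, and in practice each `(point, road)` pair would be expanded into the feature values fed to the network model:

```python
def label_observation_samples(track_points, candidates_per_point, true_road_per_point):
    """Build training pairs for the observation prediction model: for each
    sample positioning point, the road it actually belongs to (known from
    the historical driving track) is the positive sample (label 1); every
    other candidate road of that point is a negative sample (label 0)."""
    samples = []
    for point in track_points:
        true_road = true_road_per_point[point]
        for road in candidates_per_point[point]:
            label = 1 if road == true_road else 0
            samples.append((point, road, label))
    return samples
```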
First, a batch of features (feature values) relating the positioning point data to the candidate observation roads, chosen based on industry experience, can be obtained; the observation prediction model is then trained on a batch of manually labeled positive and negative training samples. This is described in detail below.
Optionally, the transition calculation module includes a transition prediction model, and the training method of the transition prediction model includes:
obtaining second positioning point sample data, wherein the data of a transition from the sample road of each sample positioning point to the sample road of the adjacent next sample positioning point is labeled as a positive sample, the remaining positioning point sample data are labeled as negative samples, and neither the positive nor the negative samples include positioning point sample data whose sample roads are not connected to each other;
and training characteristic parameters of a network model by using the second positioning point sample data to obtain the transition prediction model.
The two models are described in detail below, beginning with the observation prediction model according to an embodiment of the present application.
Fig. 4A is a schematic diagram of a map corresponding to sample data, where the observation objects may be several roads around a positioning point.
Specifically, the candidate observation road set L_i = {l_i^1, l_i^2, …, l_i^{n_i}} can represent the n_i observation roads selected around positioning point g_i, and P(g_i|l_i^j) can represent the observation of positioning point g_i on road l_i^j.
The dots in fig. 4A are several schematic positioning points; the dashed line they form is the driving route inferred from the positioning points; road L1 is the correct road corresponding to these positioning points; and each arrow points to the actual position of a positioning point on the road. Such data can be used as positive samples for the observation prediction model.
The features used by the observation prediction model can be as shown in table 1, given here for illustration only:
TABLE 1
The positive sample data can come from an existing data set; the sample road corresponding to each sample positioning point, and the actual position point on that road, can be obtained through manual labeling.
Correspondingly, each sample positioning point corresponds to one correct first sample road, and the other candidate observation sample roads around it can be labeled as negative samples. The selection of sample roads as candidate observation roads is similar to the method described above: within a preset sample range centered on the sample positioning point, up to a preset sample number of roads are selected in order of increasing perpendicular distance (from near to far) as the sample roads of the sample positioning point.
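The nearest-first selection by perpendicular distance might be sketched as follows; road geometry is simplified here to single 2D segments, and all names are illustrative.

```python
def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to segment a-b (2D),
    clamped to the segment endpoints."""
    px, py = p
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:                       # degenerate segment
        t = 0.0
    else:
        # Parameter of the projection of p onto the line through a-b, clamped.
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    fx, fy = ax + t * dx, ay + t * dy         # closest point on the segment
    return ((px - fx) ** 2 + (py - fy) ** 2) ** 0.5

def select_candidate_roads(anchor, roads, max_candidates, search_radius):
    """Pick at most max_candidates roads, nearest first, within the radius."""
    scored = []
    for road_id, (a, b) in roads.items():
        d = point_segment_distance(anchor, a, b)
        if d <= search_radius:
            scored.append((d, road_id))
    scored.sort()
    return [road_id for _, road_id in scored[:max_candidates]]

roads = {"L1": ((0, 0), (10, 0)), "L2": ((0, 5), (10, 5)), "L3": ((0, 20), (10, 20))}
cands = select_candidate_roads((5, 1), roads, max_candidates=2, search_radius=10)
```

The same selection rule serves both training-sample construction and online candidate generation, which keeps the two feature distributions consistent.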
Fig. 4B is a schematic diagram of a map corresponding to another sample data set: the water-drop markers are positioning points, road L2 is the correct road, and the other roads L3, L4 and L5 can be regarded as negative samples.
As can be seen from the above figure, each anchor point may correspond to one road as a positive sample and a plurality of roads as negative samples.
Feature values are produced from the manually labeled training samples (including positive and negative samples), and the observation prediction model is trained through parameter tuning.
Specifically, after the observation prediction model is trained, in practical application, any positioning point g_i has n_i candidate observation roads L_i = {l_i^1, …, l_i^{n_i}} around it. For a given observation road l_i^j, a feature value list F_i^j = (f_1, f_2, …, f_K) of the observation prediction model is generated, with K features in total; substituting the feature value list F_i^j into the observation prediction model yields the observation probability P(g_i|l_i^j).
The trained observation prediction model can be applied to the calculation of the observation probability in the foregoing embodiment.
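How a trained model turns a feature value list into an observation probability can be illustrated with a logistic stand-in; the real observation prediction model could be any trained binary classifier, and the weights and features below are invented for the example.

```python
import math

def observation_probability(features, weights, bias):
    """Score one candidate road: sigmoid over a K-dimensional feature list.

    A single logistic layer is the simplest stand-in for the trained
    observation prediction model; any classifier producing a probability
    could take its place.
    """
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# K = 3 illustrative features, e.g. perpendicular distance, heading
# difference, GPS accuracy (all hypothetical), with hand-set weights.
p_near = observation_probability([0.5, 0.1, 1.0], weights=[-2.0, -1.0, 0.5], bias=1.0)
p_far  = observation_probability([5.0, 0.1, 1.0], weights=[-2.0, -1.0, 0.5], bias=1.0)
```

With a negative weight on distance, a nearby road scores a higher observation probability than a distant one, which matches the intended behavior of the model.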
The following describes the transition prediction model according to an embodiment of the present application.
The transition prediction model concerns: for two consecutive GPS points g_i and g_{i+1} with respective candidate observation road sets L_i and L_{i+1}, the transition (probability) from any road l_i^k in L_i to any road l_{i+1}^j in L_{i+1}.
For example, fig. 5A is a schematic map diagram corresponding to another sample data set; the dots in fig. 5A are two consecutive positioning points g_i and g_{i+1}, which should both match road P. P(l_i^k → l_{i+1}^j) can represent the transition probability from candidate observation road l_i^k of positioning point g_i to candidate observation road l_{i+1}^j of positioning point g_{i+1}.
A set of features (feature values) relating two positioning points, two roads, and a positioning point to a road can be defined based on industry experience; a batch of positive and negative training samples is manually labeled in the manner above, and the transition prediction model is trained on them. In practical application, given the actual feature values and the transition prediction model, the transition probability from a candidate observation road of a first positioning point to a candidate observation road of a second positioning point can be output.
In the embodiment of the application, the connection relation between roads can be understood as whether any two roads are connected in a drivable road topology network, i.e. whether a vehicle can drive from road A to road B. Specifically, for every two candidate observation roads it may be determined whether they are reachable from one another and whether the shortest driving distance is greater than a distance threshold; if they are reachable and the shortest driving distance does not exceed the threshold, the two candidate observation roads are determined to be connected, otherwise they are determined not to be connected, thereby obtaining the connection relation among all the candidate observation roads.
The term "connection between roads" used herein refers to restricted connection, specifically: the shortest driving distance from road A to road B must not exceed a threshold Lm. Even if road B can be reached by driving from road A, if the shortest driving distance from A to B is greater than the threshold Lm, the present application considers road A not connected to road B. Note: hereinafter, any reference to "connection between roads" and the like refers to this restricted connection.
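The restricted connection test (reachable by driving and shortest driving distance not exceeding Lm) can be sketched as a thresholded Dijkstra search over a road topology graph; the graph encoding below is an assumption for illustration.

```python
import heapq

def restricted_connected(graph, src, dst, threshold_lm):
    """Roads count as 'connected' here only if dst is reachable from src
    by driving AND the shortest driving distance does not exceed threshold_lm.

    graph: {road_id: [(neighbor_road_id, driving_distance), ...]}
    """
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > threshold_lm:          # everything still queued is farther
            return False
        if node == dst:
            return True
        if d > dist.get(node, float("inf")):
            continue                  # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")) and nd <= threshold_lm:
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return False

# A -> C -> B is drivable (300 m + 400 m); B cannot reach A (one-way edges).
graph = {"A": [("C", 300.0)], "C": [("B", 400.0)], "B": []}
```

Pruning paths at Lm keeps the search bounded, so the connectivity matrix over all candidate observation roads stays cheap to compute online.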
For example, in the navigation map diagram of fig. 5B, the two roads A and B are connected; by contrast, as shown in the schematic view of another navigation map in fig. 5C, the two roads A and B are not connected.
For the sample data of the transition prediction model, in an alternative embodiment, the features relating two positioning points, two roads, and a positioning point to a road may be summarized and selected based on accumulated industry experience and case analysis, as shown in table 2, given here for illustration only:
TABLE 2
Optionally, from the sample data labeled for the observation prediction model, the following can be reused: each positioning point corresponds to several candidate observation roads, one of which is the positive sample and the rest of which are negative samples; this sample data can also be carried over to train the transition prediction model.
Positive sample: the transition from the correct road l_i^* corresponding to the first positioning point g_i to the correct road l_{i+1}^* corresponding to the second positioning point g_{i+1}. Note: if road l_i^* is not connected to road l_{i+1}^*, the sample is discarded.
Negative sample: assume positioning point g_i corresponds to n_i candidate observation roads and positioning point g_{i+1} corresponds to n_{i+1} candidate observation roads; theoretically there are n_i × n_{i+1} transition samples, and excluding the single positive sample above leaves (n_i × n_{i+1} − 1) negative samples. Note: if the two roads of a pair are not connected, the sample is discarded.
Optionally, the numbers of positive and negative samples may be chosen as follows. As shown above, two adjacent positioning points correspond to one positive sample and several negative samples; the positive sample is always kept, but the negative samples are screened by rule. The specific rule is: if the number of transition negative samples for the two adjacent positioning points is not greater than a threshold Nf, all of them are kept; if it is greater than Nf, at most Nf negative samples are kept, ordered by the shortest connected distance between the two roads (from short to long).
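The transition-sample rules above (one positive pair, unconnected pairs discarded, at most Nf negatives kept by shortest connected distance) might look like this in code; the names and the distance-dictionary encoding are illustrative.

```python
def build_transition_samples(cands_i, cands_j, true_i, true_j,
                             shortest_dist, nf):
    """Label road-to-road transitions between two adjacent positioning points.

    cands_i / cands_j: candidate observation roads of g_i / g_(i+1).
    true_i / true_j:   the correct road for each positioning point.
    shortest_dist:     {(road_a, road_b): shortest driving distance};
                       pairs absent from the dict are treated as
                       unconnected and discarded.
    nf:                maximum number of negative samples to keep.
    """
    positive, negatives = None, []
    for a in cands_i:
        for b in cands_j:
            d = shortest_dist.get((a, b))
            if d is None:                       # not connected: discard
                continue
            if (a, b) == (true_i, true_j):
                positive = (a, b, d)            # the single positive pair
            else:
                negatives.append((a, b, d))
    # Keep at most nf negatives, preferring the shortest connected distance.
    negatives.sort(key=lambda s: s[2])
    return positive, negatives[:nf]

dists = {("L1", "L2"): 120.0, ("L1", "L4"): 80.0, ("L3", "L2"): 60.0}
pos, negs = build_transition_samples(
    ["L1", "L3"], ["L2", "L4"], true_i="L1", true_j="L2",
    shortest_dist=dists, nf=1)
```

Discarding unconnected pairs before counting is what keeps the theoretical n_i × n_{i+1} combinations from overwhelming the training set.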
Feature values are produced for these training samples (including positive and negative samples), and the transition prediction model is trained through parameter tuning. The trained transition prediction model can be applied to the calculation of the transition probability in the foregoing embodiment.
In one embodiment, after the transition prediction model is trained, in practical application, any two adjacent positioning points g_i and g_{i+1} correspond to n_i surrounding candidate observation roads L_i = {l_i^1, …, l_i^{n_i}} and n_{i+1} surrounding candidate observation roads L_{i+1} = {l_{i+1}^1, …, l_{i+1}^{n_{i+1}}}. For each of the n_i × n_{i+1} transition combinations from a road l_i^k to a road l_{i+1}^j: if the two roads are not connected, the combination is discarded; if they are connected, a feature value list T = (t_1, t_2, …, t_S) of the transition prediction model is generated, with S features in total, and substituting it into the transition prediction model yields the transition probability P(l_i^k → l_{i+1}^j).
In summary, the observation probability can be obtained from the observation prediction model and the transition probability from the transition prediction model. With these probabilities as parameters, step 303 can proceed to calculate the output probability.
303. And obtaining the output probability of the target positioning point corresponding to the first candidate observation road according to the observation probability and the transition probability.
Specifically, in an embodiment, the flow of the method for calculating the output probability is as follows:
(0) Initialization: let i = 1 and set the initial output probability Trg_0 = 0.
(1) For the current positioning point g_i, select its candidate observation road set L_i = {l_i^1, …, l_i^{n_i}} according to the rule for selecting surrounding candidate observation roads described above. Let j = 1 and go to step (2).
(2) Take the j-th road l_i^j from the road set L_i. If j > n_i, go to step (10); otherwise, compute the observation feature values F_i^j corresponding to road l_i^j and go to step (3).
(3) Substitute the observation feature values F_i^j into the observation prediction model to calculate the observation probability P(g_i|l_i^j). If i > 1, go to step (4); otherwise, go to step (9).
(4) Take out the candidate observation road set L_{i-1} corresponding to positioning point g_{i-1} and its output probabilities {Trg_{i-1}^k}. Let k = 1 and go to step (5).
(5) Take the k-th road l_{i-1}^k from L_{i-1}. If k > n_{i-1}, go to step (8); otherwise, go to step (6).
(6) Judge whether road l_{i-1}^k is connected to road l_i^j. If not, let k = k + 1 and return to step (5); otherwise, compute the transition feature values for the transition from road l_{i-1}^k to road l_i^j and go to step (7).
(7) Substitute the transition feature values into the transition prediction model to calculate the transition probability P(l_{i-1}^k → l_i^j), and compute the temporary output probability tmp_k = Trg_{i-1}^k × P(l_{i-1}^k → l_i^j). Let k = k + 1 and return to step (5).
(8) Calculate the final output probability of road l_i^j as Trg_i^j = max_k(tmp_k) × P(g_i|l_i^j). Let j = j + 1 and return to step (2).
(9) Calculate the final output probability of road l_i^j as Trg_1^j = P(g_1|l_1^j). Let j = j + 1 and return to step (2).
(10) Obtain the output probabilities {Trg_i^j | j = 1, …, n_i} of the current positioning point g_i for all its candidate observation roads.
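Steps (0) to (10) above amount to a Viterbi-style forward pass over the candidate roads. A compact sketch, assuming the per-road maximum over predecessor roads implied by the max-probability selection in step 304 (all callables and names are illustrative):

```python
def output_probabilities(anchors, candidates, obs_prob, trans_prob, connected):
    """For each positioning point, compute the output probability of every
    candidate observation road.

    candidates[i]     : candidate roads of positioning point i
    obs_prob(i, road) : observation probability of point i on road
    trans_prob(a, b)  : transition probability from road a to road b
    connected(a, b)   : restricted road-connectivity predicate
    Returns one {road: output probability} dict per positioning point.
    """
    trg = []
    for i in range(len(anchors)):
        cur = {}
        for road in candidates[i]:
            p_obs = obs_prob(i, road)
            if i == 0:
                cur[road] = p_obs                # step (9): first point
            else:
                best = 0.0                       # steps (4)-(8)
                for prev, p_prev in trg[i - 1].items():
                    if connected(prev, road):    # unconnected pairs skipped
                        best = max(best, p_prev * trans_prob(prev, road))
                cur[road] = best * p_obs
        trg.append(cur)
    return trg

# Toy example: two positioning points, hand-set probabilities.
obs = {(0, "A"): 0.9, (0, "B"): 0.1, (1, "C"): 0.8, (1, "D"): 0.2}
trans = {("A", "C"): 0.7, ("A", "D"): 0.3, ("B", "C"): 0.5}
probs = output_probabilities(
    anchors=[0, 1],
    candidates=[["A", "B"], ["C", "D"]],
    obs_prob=lambda i, r: obs[(i, r)],
    trans_prob=lambda a, b: trans[(a, b)],
    connected=lambda a, b: (a, b) in trans,
)
best_road = max(probs[-1], key=probs[-1].get)
```

In the toy run, road C of the second point accumulates 0.9 × 0.7 × 0.8 = 0.504 and is picked as the target road, matching the maximum-output-probability rule of step 304.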
304. And determining the first candidate observation road corresponding to the maximum output probability in the output probabilities as a target road.
The maximum output probability indicates that the corresponding road matches the target positioning point most closely, so that road is determined as the target road and can be used for more accurate yaw judgment.
Step 203 in the embodiment shown in fig. 2 may then be executed to determine whether the target vehicle is yawing, together with the follow-up navigation prompts and measures; refer to the related detailed description of the embodiment shown in fig. 2, which is not repeated here.
In an alternative embodiment, the foot of the perpendicular from the target positioning point to the target road may be determined in the map as the target position of the target positioning point on the road; the target position and the target road may then be marked on the map to display the current driving position to the user.
Specifically, the road with the largest output probability may be taken as the best matching road l_i^* of the current positioning point g_i, and the foot of the perpendicular from positioning point g_i to road l_i^* may be taken as the best position of the current positioning point g_i on the road. The above steps may then be repeated, i.e. let i = i + 1 and go to step (1).
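Taking the foot of the perpendicular as the matched position can be sketched for a road stored as a polyline: project the point onto each segment, clamp to the segment, and keep the closest foot (all names are illustrative).

```python
def match_position(anchor, polyline):
    """Foot of the perpendicular from the positioning point onto a road
    given as a polyline: the closest point on any of its segments."""
    px, py = anchor
    best, best_d2 = None, float("inf")
    for (ax, ay), (bx, by) in zip(polyline, polyline[1:]):
        dx, dy = bx - ax, by - ay
        len2 = dx * dx + dy * dy
        # Clamped projection parameter along the current segment.
        t = 0.0 if len2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / len2))
        fx, fy = ax + t * dx, ay + t * dy
        d2 = (px - fx) ** 2 + (py - fy) ** 2
        if d2 < best_d2:
            best, best_d2 = (fx, fy), d2
    return best

# An L-shaped road; the point sits nearest the horizontal leg.
pos = match_position((4.0, 2.0), [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)])
```

The clamping to [0, 1] matters near corners: without it the foot could land off the end of a segment, outside the road geometry.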
Optionally, according to the above steps, a final matching road (target road) and a matching position (target position) corresponding to any one of the anchor points in the series of anchor points may be calculated.
Specifically, during actual vehicle navigation, the best matching road and position of the current positioning point can be calculated as above and compared with the planned route; if they are consistent, no yaw is considered to have occurred, and if they are inconsistent, a yaw is considered to have occurred.
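The comparison with the planned route might be sketched as below; the consecutive-mismatch "patience" parameter is an assumption added to damp single-point GPS noise, not something specified in the text.

```python
def is_yawing(matched_roads, planned_route, patience=3):
    """Declare yaw only after the best-matched road has been off the
    planned route for `patience` consecutive positioning points.

    `patience` is an illustrative parameter: 1 reproduces the strict
    per-point comparison described above, larger values damp GPS noise.
    """
    planned = set(planned_route)
    off = 0
    for road in matched_roads:
        off = off + 1 if road not in planned else 0  # reset on any match
        if off >= patience:
            return True
    return False

route = ["R1", "R2", "R3"]
```

A single mismatched point (e.g. one noisy fix snapping to a side road) then does not trigger a yaw prompt, while a sustained departure does.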
For example, in the yaw schematic diagram shown in fig. 6A, the user should turn right according to the planned route in the map but instead drives straight. The "new algorithm yaw point" in the diagram is the position at which the yaw recognition method of the embodiment of the present application determines the user's yaw, which is earlier than the yaw point of the general rule-based algorithm.
Reference may also be made to the yaw schematic diagram shown in fig. 6B, where the user's trajectory is parallel to the planned route. The "new algorithm yaw point" is the position at which the yaw recognition method of the embodiment of the present application determines the user's yaw: this parallel-road yaw can be recognized, whereas the general rule-based algorithm may fail to recognize it.
Still referring to the yaw schematic view of fig. 6C, the "rule algorithm yaw point" is the position at which a general rule-based algorithm determines the user's yaw: because of positioning point offset, that algorithm judges that the user's trajectory differs from the planned route and misjudges it as a yaw.
It can be seen that the vehicle yaw recognition method of the embodiment of the application improves yaw recognition sensitivity and can recognize yaw behavior as early as possible. In particular, in parallel-road scenes, mainstream navigation products in the industry currently cannot recognize the yaw, while the present method can recognize most parallel-road yaws.
By relying on road connectivity and trained network models, the vehicle yaw identification method of the embodiment of the application reduces the false-yaw rate, avoids a large number of false-yaw cases caused by rule-based models, provides more accurate navigation service, and improves user experience.
Based on the description of the embodiment of the vehicle yaw identification method, the embodiment of the application further discloses a vehicle yaw identification device. The vehicle yaw identifying apparatus may perform the method shown in fig. 2 and/or fig. 3.
Referring to fig. 7, the vehicle yaw recognition apparatus 700 includes: a positioning module 610, a matching module 620, and a determining module 630, wherein:
the positioning module 610 is configured to obtain positioning point data of a target vehicle, where the positioning point data includes a plurality of positioning points of the target vehicle during a driving process and feature information of the positioning points;
the matching module 620 is configured to obtain the feature of the anchor point and the feature of the candidate observation road of the anchor point, and determine, according to the feature of the anchor point and the feature of the candidate observation road, a target road matched with a target anchor point in the candidate observation road;
the determining module 630 is configured to compare a preset route with the target road, and determine whether the target vehicle deviates from the preset route.
According to an embodiment of the present application, the steps involved in the methods shown in fig. 2 and fig. 3 may be performed by the modules in the vehicle yaw identifying apparatus 700 shown in fig. 7, and are not described herein again.
According to another embodiment of the present application, the modules in the vehicle yaw identifying apparatus 700 shown in fig. 7 may be separately or entirely combined into one or several additional modules, or some module(s) may be further split into functionally smaller modules, which can achieve the same operation without affecting the technical effect of the embodiment of the present application. The modules are divided based on logical functions; in practical application, the function of one module may be realized by a plurality of modules, or the functions of a plurality of modules may be realized by one module. In other embodiments of the present application, the apparatus may also include other modules, and in practical applications these functions may likewise be implemented with the assistance or cooperation of other modules.
Based on the description of the method embodiment and the device embodiment, the embodiment of the application also provides a terminal. Referring to fig. 8, the terminal 800 includes at least a processor 801, an input device 802, an output device 803, and a computer storage medium 804. The processor 801, the input device 802, the output device 803, and the computer storage medium 804 within the terminal may be connected by a bus or other means.
A computer storage medium 804 may be stored in the memory of the terminal, the computer storage medium 804 being configured to store a computer program comprising program instructions, and the processor 801 being configured to execute the program instructions stored by the computer storage medium 804. The processor 801 (or CPU) is a computing core and a control core of the terminal, and is adapted to implement one or more instructions, and in particular, is adapted to load and execute the one or more instructions so as to implement a corresponding method flow or a corresponding function; in one embodiment, the processor 801 described above in the embodiments of the present application may be configured to perform a series of processes, including: acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle in the driving process and the characteristics of the positioning points; acquiring the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads; and comparing a preset route with the target road, determining whether the target vehicle deviates from the preset route, and the like.
An embodiment of the present application further provides a computer storage medium (Memory), where the computer storage medium is a Memory device in a terminal and is used to store programs and data. It is understood that the computer storage medium herein may include a built-in storage medium in the terminal, and may also include an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), suitable for loading and execution by processor 801. The computer storage medium may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), such as at least one disk memory; and optionally at least one computer storage medium located remotely from the processor.
In one embodiment, one or more instructions stored in a computer storage medium may be loaded and executed by the processor 801 to implement the respective steps of the method in the above-described embodiments; in particular implementations, one or more instructions in a computer storage medium may be loaded and executed by processor 801 to perform any of the steps of fig. 2 and 3.
The terminal 800 of the embodiment of the application can obtain locating point data of a target vehicle, where the locating point data includes a plurality of locating points of the target vehicle in a driving process and feature information of the locating points; acquiring the characteristics of the positioning points and the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads; the preset route is compared with the target road to determine whether the target vehicle deviates from the preset route, so that whether yawing occurs can be more accurately and quickly identified in the vehicle driving process, and a more correct navigation route is provided for a user.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the division of the module is only one logical division, and other divisions may be possible in actual implementation, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not performed. The shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some interfaces, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are wholly or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on or transmitted over a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)), or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that includes one or more of the available media. The usable medium may be a read-only memory (ROM), or a Random Access Memory (RAM), or a magnetic medium, such as a floppy disk, a hard disk, a magnetic tape, a magnetic disk, or an optical medium, such as a Digital Versatile Disk (DVD), or a semiconductor medium, such as a Solid State Disk (SSD).

Claims (13)

1. A vehicle yaw identification method, comprising:
acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle in the driving process and the characteristics of the positioning points;
acquiring the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads;
and comparing a preset route with the target road to determine whether the target vehicle deviates from the preset route.
2. The method according to claim 1, wherein said method further comprises, before determining, among said candidate observed roads, a target road for which a target anchor point among said anchor points matches, based on said features of said anchor points and said features of said candidate observed roads:
determining a first candidate observation road set corresponding to a target positioning point in the positioning points and a second candidate observation road set corresponding to a reference positioning point, wherein the reference positioning point is a previous positioning point of the target positioning point;
the determining, according to the features of the anchor point and the features of the candidate observation roads, a target road matched with a target anchor point in the anchor points in the candidate observation roads includes:
according to the characteristics of the positioning point and the characteristics of the candidate observation road of the positioning point, obtaining the observation probability of the target positioning point relative to the first candidate observation road and the transition probability between the second candidate observation road and the first candidate observation road; the observation probability is used to represent the degree of matching between the target positioning point and the first candidate observation road, and the transition probability is used to represent the probability of transitioning from any of the second candidate observation roads to any of the first candidate observation roads;
and determining a target road matched with a target positioning point in the positioning points in the first candidate observation road set according to the observation probability and the transition probability.
3. The method according to claim 2, wherein said obtaining an observation probability of said target anchor point with respect to said first candidate observation road and a transition probability between said second candidate observation road and said first candidate observation road according to said features of said anchor point and said features of said candidate observation road of said anchor point comprises:
judging whether two candidate observation roads are communicated with each other or not and whether the shortest driving distance is greater than a distance threshold or not, if the two candidate observation roads are communicated with each other and the shortest driving distance is not greater than the distance threshold, determining that the two candidate observation roads are communicated with each other, and if not, determining that the two candidate observation roads are not communicated with each other so as to obtain the communication relation between all the candidate observation roads;
calling an observation calculation module to obtain the observation probability according to the characteristics of the positioning points, the characteristics of the candidate observation roads and the communication relation among the candidate observation roads;
and calling a transition calculation module to obtain the transition probability according to the features between the target positioning point and the reference positioning point, the features between the first candidate observation road and the second candidate observation road, the features between the target positioning point and the first candidate observation road and the features between the reference positioning point and the second candidate observation road.
4. The method according to claim 2 or 3, wherein determining, in the first set of candidate observation roads, a target road for which a target anchor point among the anchor points matches according to the observation probability and the transition probability comprises:
obtaining the output probability of the target positioning point corresponding to the first candidate observation road according to the observation probability and the transition probability;
and determining the first candidate observation road corresponding to the maximum output probability in the output probabilities as a target road.
5. A method according to any of claims 1-3, characterized in that the method of determining candidate observation roads for an anchor point comprises:
and selecting roads with the number less than the preset number as candidate observation roads of the positioning point from near to far according to the length of the vertical distance in a preset range taking the positioning point as the center.
6. The method of claim 2, further comprising:
determining the foot of the perpendicular from the target positioning point to the target road as the target position of the target positioning point on the road in the map;
marking the target position and the target road on the map.
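The foot-of-perpendicular computation of claim 6 reduces to projecting the positioning point onto the matched road segment. A minimal sketch in planar coordinates (segment endpoints as tuples are an assumed representation):

```python
def foot_of_perpendicular(p, a, b):
    """Project positioning point p onto road segment a-b; the foot of the
    perpendicular, clamped to the segment, is the on-road target position."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    l2 = dx * dx + dy * dy
    if l2 == 0.0:
        return a  # degenerate zero-length segment
    t = ((px - ax) * dx + (py - ay) * dy) / l2
    t = max(0.0, min(1.0, t))  # clamp so the foot stays on the segment
    return (ax + t * dx, ay + t * dy)
```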
7. The method of claim 6, wherein if the target vehicle deviates from the preset route, the method further comprises:
outputting yaw prompt information;
re-planning a route for the target vehicle based on the target position and the target road;
displaying the re-planned route on the map.
8. The method of claim 3, wherein the observation computation module comprises an observation prediction model, and the training method of the observation prediction model comprises:
obtaining first positioning-point sample data, wherein each sample positioning point together with the first sample road to which it belongs is marked as a positive sample, and each sample positioning point together with any sample road other than the first sample road to which it belongs is marked as a negative sample;
and training feature parameters of a network model with the first positioning-point sample data to obtain the observation prediction model.
9. The method according to claim 8, wherein obtaining the first positioning-point sample data comprises:
selecting, within a preset sample range centered on each sample positioning point, no more than a preset sample number of roads as the sample roads of the sample positioning point, ordered from near to far by perpendicular distance;
acquiring the features of the sample positioning point and the features of the sample roads;
determining the first sample road to which each sample positioning point belongs according to the positioning data of the historical driving track;
and marking the first sample road as a positive sample corresponding to the sample positioning point, and marking the sample roads other than the first sample road as negative samples corresponding to the sample positioning point, so as to obtain the first positioning-point sample data.
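The labeling scheme of claims 8-9 can be sketched as follows. Feature extraction is omitted; each sample here is a bare (point, road, label) triple, and the dictionary inputs are assumed representations of the candidate lists and the ground-truth roads recovered from historical tracks.

```python
def build_observation_samples(sample_points, candidates_of, true_road_of):
    """Mark the road each sample positioning point actually belongs to
    (from the historical driving track) as the positive sample, and every
    other candidate road of that point as a negative sample."""
    samples = []
    for pt in sample_points:
        true_road = true_road_of[pt]
        for road in candidates_of[pt]:
            samples.append((pt, road, 1 if road == true_road else 0))
    return samples
```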
10. The method according to claim 8 or 9, wherein the transition calculation module comprises a transition prediction model, and training the transition prediction model comprises:
obtaining second positioning-point sample data, wherein a transition from the sample road to which each sample positioning point belongs to the sample road to which the adjacent next sample positioning point belongs is marked as a positive sample, the remaining positioning-point sample data are marked as negative samples, and neither the positive samples nor the negative samples contain sample data in which the sample roads of adjacent sample positioning points are not connected;
and training the feature parameters of the network model with the second positioning-point sample data to obtain the transition prediction model.
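The transition-sample labeling of claim 10 can be sketched as below; note how unconnected road pairs are excluded from both the positive and the negative class, matching the claim. The input dictionaries and the `connected` predicate are assumed representations.

```python
def build_transition_samples(track, candidates_of, true_road_of, connected):
    """For each pair of adjacent sample positioning points, the
    true-road-to-true-road transition is the positive sample and every
    other connected candidate pair is a negative sample; pairs whose
    roads are not connected are dropped entirely."""
    samples = []
    for pt, nxt in zip(track, track[1:]):
        pos = (true_road_of[pt], true_road_of[nxt])
        for ra in candidates_of[pt]:
            for rb in candidates_of[nxt]:
                if not connected(ra, rb):
                    continue  # unconnected pairs excluded from both classes
                samples.append((ra, rb, 1 if (ra, rb) == pos else 0))
    return samples
```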
11. A vehicle yaw identification apparatus, comprising a positioning module, a matching module, and a determining module, wherein:
the positioning module is used for acquiring positioning point data of a target vehicle, wherein the positioning point data comprises a plurality of positioning points of the target vehicle in the driving process and characteristic information of the positioning points;
the matching module is used for acquiring the characteristics of the positioning points and the characteristics of candidate observation roads of the positioning points, and determining a target road matched with a target positioning point in the positioning points in the candidate observation roads according to the characteristics of the positioning points and the characteristics of the candidate observation roads;
the determining module is used for comparing a preset route with the target road and determining whether the target vehicle deviates from the preset route.
12. A terminal comprising an input device and an output device, further comprising:
a processor adapted to implement one or more instructions; and
a computer storage medium having stored thereon one or more instructions adapted to be loaded by the processor and to perform the vehicle yaw identification method of any one of claims 1-10.
13. A computer-readable storage medium having one or more instructions stored thereon, the one or more instructions adapted to be loaded by a processor and to perform the vehicle yaw identification method of any one of claims 1-10.
CN201911102568.3A 2019-11-12 2019-11-12 Vehicle yaw identification method, device, terminal and storage medium Active CN110726417B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911102568.3A CN110726417B (en) 2019-11-12 2019-11-12 Vehicle yaw identification method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911102568.3A CN110726417B (en) 2019-11-12 2019-11-12 Vehicle yaw identification method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110726417A true CN110726417A (en) 2020-01-24
CN110726417B CN110726417B (en) 2022-03-04

Family

ID=69224021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911102568.3A Active CN110726417B (en) 2019-11-12 2019-11-12 Vehicle yaw identification method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110726417B (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105444769A (en) * 2015-11-26 2016-03-30 北京百度网讯科技有限公司 Map matching method and device
CN106595680A (en) * 2016-12-15 2017-04-26 福州大学 Vehicle GPS data map matching method based on hidden markov model
CN108731691A (en) * 2017-04-19 2018-11-02 腾讯科技(深圳)有限公司 The determination method and apparatus of the yaw point of navigation equipment
US20190051153A1 (en) * 2017-08-11 2019-02-14 Here Global B.V. Updating maps and road status
CN109919518A (en) * 2019-03-29 2019-06-21 百度在线网络技术(北京)有限公司 Quality determination method, device, server and the medium of map path matching data
CN109959376A (en) * 2017-12-14 2019-07-02 腾讯科技(北京)有限公司 Track correcting method is related to the navigation routine method for drafting and device of interior wiring
CN109974718A (en) * 2019-04-09 2019-07-05 百度在线网络技术(北京)有限公司 Map-matching method, device, equipment and medium
CN110260870A (en) * 2019-07-18 2019-09-20 北京百度网讯科技有限公司 Map-matching method, device, equipment and medium based on hidden Markov model
CN110426050A (en) * 2019-08-07 2019-11-08 北京百度网讯科技有限公司 Map match correcting method, device, equipment and storage medium


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111341150A (en) * 2020-02-28 2020-06-26 长安大学 Reminding method and device for preventing ultrahigh vehicle from entering limited-height road section
CN111341150B (en) * 2020-02-28 2021-01-26 长安大学 Reminding method and device for preventing ultrahigh vehicle from entering limited-height road section
CN111220149A (en) * 2020-03-02 2020-06-02 腾讯科技(深圳)有限公司 Navigation method, device and equipment of mobile equipment and computer storage medium
CN111369819A (en) * 2020-03-02 2020-07-03 腾讯科技(深圳)有限公司 Method and device for selecting driving object
CN113532448A (en) * 2020-04-13 2021-10-22 广州汽车集团股份有限公司 Navigation method and system for automatically driving vehicle and driving control equipment
CN111735457A (en) * 2020-06-30 2020-10-02 北京百度网讯科技有限公司 Indoor navigation method and device, electronic equipment and readable storage medium
CN111735457B (en) * 2020-06-30 2022-06-17 北京百度网讯科技有限公司 Indoor navigation method and device, electronic equipment and readable storage medium
CN113984074A (en) * 2021-10-18 2022-01-28 北京中交兴路信息科技有限公司 Method, device, equipment and medium for identifying target vehicle navigation route yaw
CN117537842A (en) * 2024-01-10 2024-02-09 深圳依时货拉拉科技有限公司 Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device
CN117537842B (en) * 2024-01-10 2024-09-24 深圳依时货拉拉科技有限公司 Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device

Also Published As

Publication number Publication date
CN110726417B (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN110726417B (en) Vehicle yaw identification method, device, terminal and storage medium
CN113538919B (en) Lane departure recognition method, device, equipment and storage medium
CN106912018B (en) Map matching method and system based on signaling track
EP3435035A1 (en) Yawing recognition method, terminal and storage medium
CN106969782A (en) Method for pushing, device, equipment and the storage medium of navigation way
CN112050821B (en) Lane line polymerization method
CN104819726A (en) Navigation data processing method, navigation data processing device and navigation terminal
CN114413920B (en) Lane data processing method, navigation method and device
CN111831768B (en) Method and device for correcting driving track, storage medium and electronic equipment
JP2006242948A (en) Personal navigation system, and route guide method in the personal navigation system
CN111380540B (en) Map matching method and device, medium and terminal
CN115585816B (en) Lane-level map matching method and device
CN112689234B (en) Indoor vehicle positioning method, device, computer equipment and storage medium
CN111141301A (en) Navigation end point determining method, device, storage medium and computer equipment
CN106855878B (en) Historical driving track display method and device based on electronic map
KR102209076B1 (en) Method, system, and non-transitory computer readable record medium for correcting typing error of virtual keyboard
US20230063809A1 (en) Method for improving road topology through sequence estimation and anchor point detetection
WO2024037487A1 (en) Path correction method and apparatus applied to vehicle, and electronic device
CN117516531A (en) Unmanned plane control and navigation method, system, terminal and storage medium
JP7478831B2 (en) Autonomous driving based riding method, device, equipment and storage medium
CN111060110A (en) Robot navigation method, robot navigation device and robot
CN113411749B (en) Entrance position determining method and device
CN114659537A (en) Navigation starting point road determining method, device, equipment and storage medium
CN115265568A (en) Mobile terminal navigation method, software, server and equipment based on photographing positioning
CN112284405A (en) Method, apparatus, computing device and computer readable medium for navigation

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40020821

Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant