CN117537842A - Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device - Google Patents


Info

Publication number
CN117537842A
CN117537842A (application CN202410033990.2A)
Authority
CN
China
Prior art keywords
route
road network
sample
positioning sequence
yaw
Prior art date
Legal status
Pending
Application number
CN202410033990.2A
Other languages
Chinese (zh)
Inventor
韩璐
樊旭颖
赵宇航
李隽颖
Current Assignee
Shenzhen Yishi Huolala Technology Co Ltd
Original Assignee
Shenzhen Yishi Huolala Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yishi Huolala Technology Co Ltd filed Critical Shenzhen Yishi Huolala Technology Co Ltd
Priority to CN202410033990.2A priority Critical patent/CN117537842A/en
Publication of CN117537842A publication Critical patent/CN117537842A/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching

Abstract

The application is applicable to the field of freight transportation, and provides a route yaw recognition method, a route yaw recognition device, a computer-readable storage medium and a computer device. The route yaw recognition method includes the following steps: acquiring road network data, a navigation route and a positioning sequence corresponding to the current device; determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence; constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence; determining, in the route image, a reference route matched with the positioning sequence as a matched route; and comparing the navigation route with the matched route to obtain a route yaw recognition result. This scheme can improve the accuracy of yaw judgment.

Description

Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device
Technical Field
The application belongs to the field of freight transportation, and particularly relates to a route yaw identification method, a route yaw identification device, a computer-readable storage medium and computer equipment.
Background
While providing navigation service, a map navigation application obtains the user's real-time position from the positioning of a terminal device such as a mobile phone, and thereby provides real-time map navigation for the user. During actual navigation, if the positioning of the terminal device shows that the device is not on the planned navigation route, the application generally reminds the user, through voice prompts or other means, that the user has deviated from the navigation route, and either guides the user back to the navigation route or recalculates a navigation route for the user.
However, due to factors such as device positioning accuracy and environmental occlusion (for example, high-rise buildings and viaducts), yaw judgment errors occur, i.e., yaw is reported when the user has not actually deviated from the route. On the one hand, such errors cause voice broadcasts and user guidance that are inconsistent with the actual situation and confuse the user; on the other hand, they increase the number of yaw-state reports and route recalculations, wasting network traffic and increasing the resource overhead of the navigation service. Therefore, how to perform accurate yaw judgment is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The invention aims to provide a route yaw recognition method, a route yaw recognition device, a computer-readable storage medium and a computer device, so as to solve the problem of how to judge yaw accurately.
In a first aspect, the present application provides a route yaw identification method, including:
acquiring road network data, a navigation route and a positioning sequence corresponding to current equipment;
determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence;
constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence;
determining a reference route matched with the positioning sequence as a matched route in the route image;
and comparing the navigation route with the matched route to obtain a route yaw recognition result.
In a second aspect, the present application provides a route yaw identification device, comprising:
the acquisition module is used for acquiring road network data, navigation routes and positioning sequences corresponding to the current equipment;
the first determining module is used for determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence;
the construction module is used for constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence;
the second determining module is used for determining a reference route matched with the positioning sequence as a matched route in the route image;
and the comparison module is used for comparing the navigation route and the matching route to obtain a route yaw recognition result.
Optionally, in some embodiments of the present application, the second determining module includes:
the first acquisition unit is used for acquiring a preset route matching model;
the extraction unit is used for extracting global features, starting point features and end point features corresponding to each reference route from the route image;
the splicing unit is used for splicing the global features, the starting point features and the end point features to obtain splicing features corresponding to the reference route;
and the processing unit is used for processing the spliced characteristics based on the route matching model to obtain a matching route matched with the reference route.
Optionally, in some embodiments of the present application, the splicing unit is specifically configured to:
extracting global features corresponding to each reference route in the route image through a preset first extraction sub-network;
extracting features of each reference route in the route image within a preset range with the start point as the reference through a preset second extraction sub-network, to obtain start point features corresponding to the reference route; and
extracting features of each reference route in the route image within a preset range with the end point as the reference through the preset second extraction sub-network, to obtain end point features corresponding to the reference route.
Optionally, in some embodiments of the present application, a training unit is further included, where the training unit is configured to:
acquiring a basic matching model, a sample positioning sequence and sample road network data;
determining sample road network information in a preset range of each sample point in the sample positioning sequence according to the sample road network data;
determining at least one sample reference route corresponding to the sample positioning sequence based on the sample road network data and each sample point in the sample positioning sequence;
constructing a sample route image containing a sample reference route according to the sample road network information, the sample positioning sequence and the sample reference route;
inputting the sample route image into the basic matching model, and estimating a sample matching route corresponding to the sample positioning sequence;
and training the basic matching model according to the labeling information corresponding to the sample reference route and the sample matching route.
Optionally, in some embodiments of the present application, the building block includes:
the second acquisition unit is used for acquiring the reference road network information corresponding to the reference route from the road network data;
and the construction unit is used for constructing a route image corresponding to the reference route based on the reference road network information and the positioning points in the positioning sequence.
Optionally, in some embodiments of the present application, the building unit is specifically configured to:
acquiring the road network attribute of the reference road network information;
determining point location information corresponding to a locating point in the locating sequence;
and constructing a route image corresponding to the reference route based on the road network attribute and the point location information.
Optionally, in some embodiments of the present application, the first determining module is specifically configured to:
acquiring a preset route recall model;
and determining at least one reference route corresponding to the positioning sequence in the road network data based on the route recall model and the positioning sequence.
In a third aspect, the present application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the methods described above.
In a fourth aspect, the present application also provides a computer device comprising one or more processors; a memory; and one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, wherein the processor performs the steps of the method as described in any of the above.
In the present application, at least one reference route corresponding to a positioning sequence is determined from the road network data and the positioning sequence, and a route image corresponding to the reference route is constructed based on the road network data, the reference route and the positioning sequence. Then, the reference route matched with the positioning sequence is determined in the route image as the matched route; that is, the route image is used to perform a secondary verification of the reference route matched with the positioning sequence. Finally, the navigation route is compared with the matched route to obtain a route yaw recognition result. This avoids the misjudgment that easily occurs when a single yaw recognition algorithm is used while the positioning accuracy of the hardware device is poor, so the scheme of the present application can improve the accuracy of yaw judgment.
Drawings
Fig. 1 is a flowchart of a route yaw recognition method according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a route matching model in a route yaw recognition method according to an embodiment of the present application.
Fig. 3 is a functional block diagram of a route yaw recognition device according to an embodiment of the present application.
Fig. 4 is a specific structural block diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantageous effects of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
Referring to fig. 1, which is a flowchart of a route yaw recognition method according to an embodiment of the present application, the following description mainly takes application of the method to a computer device as an example. The route yaw recognition method according to an embodiment of the present application includes the following steps:
s101, acquiring road network data, a navigation route and a positioning sequence corresponding to current equipment.
A road network refers to a road system composed of various interconnected roads forming a mesh distribution in a certain area. The navigation route is a route moving from a navigation start point to a navigation end point in the road network. The positioning sequence corresponding to the current device is the sequence formed by the positions reported by the current device at each reporting time; it can be understood that the positioning sequence may include the positioning point at the start of route planning as well as the positioning points during route navigation. The positioning sequence can be a Global Positioning System (GPS) positioning sequence or a BeiDou Navigation Satellite System (BDS) positioning sequence, determined according to actual conditions.
The road network data can be acquired through a network, and the navigation route and the positioning sequence corresponding to the current device can be acquired through a navigation application. The current device can be a mobile phone, a tablet or a vehicle-mounted terminal, and the details are not described herein.
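As an illustrative aid (not part of the patent), the inputs acquired in step S101 can be sketched as simple data structures; all names and fields here are assumptions:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class AnchorPoint:
    """One reported position in the positioning sequence."""
    lon: float          # longitude
    lat: float          # latitude
    speed: float        # reported speed, m/s
    heading: float      # direction of travel, degrees
    confidence: float   # position confidence in [0, 1]

@dataclass
class Road:
    """One road segment in the road network."""
    road_id: int
    polyline: List[Tuple[float, float]]  # shape points (lon, lat)
    road_type: int                       # e.g. motorway vs. local road
    direction: int                       # allowed travel direction

# A positioning sequence is the ordered list of reported points,
# and a route is an ordered list of road ids.
PositioningSequence = List[AnchorPoint]
Route = List[int]
```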
S102, determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence.
The reference routes are the routes recalled based on the road network data and the positioning sequence; for example, 3 or 5 reference routes can be recalled, the number being chosen according to actual conditions.
Note that the recalled reference routes may be sorted in descending order of their matching degree with the positioning sequence, and a reference route may be a driving route, a riding route, a walking route, or the like.
Quantifying the relationships between all roads in a city is particularly important here. Research on quantifying inter-road relationships is mainly divided into two categories: data mining schemes and machine learning. The main idea of the data mining scheme is to mine co-occurrence relationships among roads from the route data of a large number of users; for example, the vehicle diversion probability at an intersection can be mined from a large amount of user route data, with or without start-and-end-point constraints. Machine learning approaches include the Hidden Markov Model (HMM), a probabilistic statistical model that optimizes its hidden unknown parameters from the input user route data to obtain a Markov process approximating the training set, which is used to estimate the transition probabilities between topologically connected roads; they also include deep learning methods based on recurrent neural networks (RNN) such as Long Short-Term Memory (LSTM) and its variants, which can estimate the co-occurrence probability between roads over a context interval of a certain length.
Optionally, in some embodiments, the application adopts an HMM model to determine at least one reference route corresponding to the positioning sequence, that is, the step of determining, according to the road network data and the positioning sequence, the at least one reference route corresponding to the positioning sequence may specifically include:
acquiring a preset route recall model;
and determining at least one reference route corresponding to the positioning sequence in the road network data based on the route recall model and the positioning sequence.
Specifically, the route recall model is an HMM model. Using a hidden Markov model, road network matching is performed by combining each positioning point with the connectivity relationships in the road network data, matching the positioning sequence into the road network data with high accuracy.
Specifically, for the positioning sequence, a hidden Markov model may be used, combined with the context between track segments in the positioning sequence, to determine the probability (i.e., similarity) between each route in the road network data and the positioning sequence; the segment with the highest probability is determined as the segment matched with the positioning sequence, and the matched segments are then spliced to form the reference route corresponding to the positioning sequence. It should be noted that, since positioning information such as GPS may drift, or the area may contain dense parallel roads, the reference route matched by the HMM may contain matching errors. Therefore, in the present application, the reference route matched by the HMM is further verified a second time, that is, steps S103 and S104 are performed.
It can be appreciated that the positioning sequence can be matched to a plurality of road network routes by the HMM model; in the present application, the top N (N is a positive integer) routes with the highest similarity can be determined as the reference routes.
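The HMM recall described above can be sketched as a small Viterbi pass. This is an illustrative toy, not the patent's actual model: the Gaussian emission noise, the candidate lists, and the hand-supplied transition probabilities are all assumptions.

```python
import math

def hmm_match(candidates, transition):
    """Viterbi-style road-network matching sketch (illustrative only).

    candidates: candidates[i] -> list of (road_id, distance_m) pairs for
        the i-th positioning point (candidate roads near that point).
    transition: dict mapping (road_a, road_b) -> transition probability.
    Returns the most probable road sequence for the positioning sequence.
    """
    sigma = 10.0  # assumed GPS noise (metres) for the emission model

    def emission(dist):
        # Gaussian emission: nearer candidate roads are more probable.
        return math.exp(-0.5 * (dist / sigma) ** 2)

    # Viterbi forward pass over log-probabilities.
    prev = {road: (math.log(emission(d) + 1e-12), [road])
            for road, d in candidates[0]}
    for cands in candidates[1:]:
        cur = {}
        for road, d in cands:
            best = max(
                (lp + math.log(transition.get((r, road), 1e-6))
                 + math.log(emission(d) + 1e-12), path)
                for r, (lp, path) in prev.items())
            cur[road] = (best[0], best[1] + [road])
        prev = cur
    return max(prev.values())[1]
```

For two points where the first clearly lies on road "a" and the second on road "b", the pass recovers the sequence ["a", "b"].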
S103, constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence.
For each matched reference route, a route image corresponding to the reference route may be constructed from the road network information within a preset range of the reference route. The road network information within the preset range may include data such as the position, direction and shape of each road, which facilitates the secondary verification of the reference route using the constructed route image. That is, optionally, in some embodiments, the step of constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence may specifically include:
acquiring reference road network information corresponding to a reference route from the road network data;
and constructing a route image corresponding to the reference route based on the reference road network information and the locating points in the locating sequence.
The reference road network information corresponding to the reference route may specifically include the position, direction, road type, road form, road function grade and similar information of all roads within the preset range of the reference route. The position of a road may be its position relative to the reference route, and the direction of a road may be the direction in which vehicles travel on it. For example, road a is located on the left side of the reference route and road b on the right side; road a runs from west to east (the same as the traveling direction of the reference route), and road b runs from east to west (opposite to the traveling direction of the reference route). Optionally, in some embodiments, the information of road b may be filtered out because its direction is opposite to the traveling direction of the reference route.
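The optional opposite-direction filtering can be sketched as follows; the 90-degree threshold and the degree-based heading representation are assumptions for illustration, not values given in the patent:

```python
def filter_opposite_roads(reference_heading, roads):
    """Keep only roads whose travel direction roughly agrees with the
    reference route (angular difference <= 90 degrees); drop the rest.

    roads: list of (heading_degrees, road_id) pairs.
    """
    kept = []
    for heading, road_id in roads:
        # Smallest angular difference, normalised into [0, 180].
        diff = abs((heading - reference_heading + 180) % 360 - 180)
        if diff <= 90:
            kept.append(road_id)
    return kept
```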
The constructed route image should carry not only the road network data but also the data of the positioning sequence. Therefore, during construction, the reference road network information corresponding to the reference route can be obtained from the road network data, and the positioning sequence is added into the reference road network information, using the reference road network information as the base, thereby forming the route image.
Specifically, labels can be assigned to the different kinds of information in advance: the speed of a positioning point corresponds to label s1, the direction of a positioning point to label s2, and the position confidence of a positioning point to label s3; in the road network data, the road type corresponds to label d1, the road form to label d2, the road function to label d3, and so on, set according to actual conditions. That is, optionally, in some embodiments, the step of constructing a route image corresponding to the reference route based on the reference road network information and the positioning points in the positioning sequence may specifically include:
acquiring road network attributes of reference road network information;
determining point location information corresponding to a locating point in a locating sequence;
and constructing a route image corresponding to the reference route based on the road network attribute and the point location information.
For example, the reference route and the positioning point sequence are drawn into a three-channel image according to their relative positions. The connection relationship between the reference route and the positioning points is represented by straight-line connections, and each positioning point is represented by a point with a direction and a length, where the direction represents the heading of the positioning point and the length represents its speed. In addition, the brightness of a positioning point can represent its confidence, with lower-confidence points drawn darker. The direction, position and length attributes of the surrounding road network information are likewise drawn into the three-channel image. Attribute information such as road type, road form and road function grade in the surrounding road network information is represented in the form of image channels: if an attribute has M categories in total, each category serves as one image channel; if a road belongs to category m1, the pixels at that road's position in the channel corresponding to category m1 are set to 1, and pixels at the positions of roads not belonging to category m1 are set to 0.
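The per-category channel encoding described above can be sketched as follows. This is an illustrative toy that marks individual pixels; a real implementation would rasterize full road polylines (and the positioning-point rendering) into the image:

```python
def rasterize_attribute_channels(roads, num_categories, width, height):
    """Build image[channel][y][x] with 1 where a road of that attribute
    category lies, 0 elsewhere (one channel per category).

    roads: list of (category, pixels) where pixels is a list of (x, y)
        positions the road occupies after projection into image space.
    """
    image = [[[0] * width for _ in range(height)]
             for _ in range(num_categories)]
    for category, pixels in roads:
        for x, y in pixels:
            image[category][y][x] = 1  # mark road presence in its channel
    return image
```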
S104, determining a reference route matched with the positioning sequence in the route image as a matched route.
During matching, a preset route matching model can be used; the route matching model can be a convolutional neural network model or another model. Optionally, in some embodiments, the present application provides a new route matching model: the route matching network extracts the global features, start point features and end point features corresponding to each reference route from the route image, and finally, based on these features, the reference route matched with the positioning sequence is determined in the route image as the matched route. That is, optionally, in some embodiments, the step of determining, in the route image, the reference route matched with the positioning sequence as the matched route may specifically include:
acquiring a preset route matching model;
extracting global features, starting point features and end point features corresponding to each reference route from the route image;
splicing the global features, the starting point features and the end point features to obtain splicing features corresponding to the reference route;
and processing the spliced characteristics based on a route matching model to obtain a matching route matched with the reference route.
Referring to fig. 2, the route matching model specifically includes a feature extraction network T1, a feature extraction network T2 and a classification network T3. Because the present application converts route matching into image matching, a deep residual network (ResNet), a Visual Geometry Group network (VGG) or a lightweight mobile network (MobileNet) can be used as the feature extraction networks T1 and T2. Optionally, in some embodiments, the feature extraction networks T1 and T2 are the same network but with different parameters.
Specifically, the route image is input into the feature extraction network T1, which extracts the global features corresponding to the route image. Meanwhile, the data within a preset range of the start point of the positioning sequence is cropped from the route image, and the feature extraction network T2 extracts the features of the cropped start-point image to obtain the start point features. Similarly, the data within a preset range of the end point of the positioning sequence is cropped from the route image, and the feature extraction network T2 extracts the features of the cropped end-point image to obtain the end point features. That is, optionally, in some embodiments, the step of extracting the global features, start point features and end point features corresponding to each reference route from the route image may specifically include:
global features corresponding to each reference route in the route image are extracted through a preset first extraction sub-network, features in a preset range of each reference route in the route image with a starting point as a reference are extracted through a preset second extraction sub-network, starting point features corresponding to the reference routes are obtained, and features in a preset range of each reference route in the route image with an ending point as a reference are extracted through a preset second extraction sub-network, so that end point features corresponding to the reference routes are obtained.
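The cropping of the preset range around the start (or end) point, before it is handed to the second extraction sub-network, can be sketched as below. The square window radius and zero-padding at image borders are assumptions for illustration:

```python
def crop_window(image, cx, cy, radius):
    """Crop a (2*radius+1) square window centred on (cx, cy) from a
    single-channel image given as image[y][x]; out-of-bounds pixels
    are zero-padded."""
    size = 2 * radius + 1
    window = [[0] * size for _ in range(size)]
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = cy + dy, cx + dx
            if 0 <= y < len(image) and 0 <= x < len(image[0]):
                window[dy + radius][dx + radius] = image[y][x]
    return window
```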
Then, the extracted global features, start point features and end point features are spliced; the splicing mode can be concatenation, combination or fusion, selected according to actual conditions. The significance of feature splicing is that combining different features makes the content represented by the final result more comprehensive and richer, so that subsequent matching using the spliced result is more accurate.
Finally, the splicing result is processed by using the classification network T3 in the route matching model to obtain a matching route matched with the reference route, and then step S105 is performed.
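The splice-and-classify step can be sketched with a single linear layer plus sigmoid standing in for classification network T3; this is an illustrative stand-in with assumed weights, not the actual network:

```python
import math

def match_probability(global_feat, start_feat, end_feat, weights, bias):
    """Concatenate the three feature vectors (the 'splice feature') and
    score the match with a linear layer + sigmoid standing in for T3."""
    spliced = list(global_feat) + list(start_feat) + list(end_feat)
    logit = sum(w * x for w, x in zip(weights, spliced)) + bias
    return 1.0 / (1.0 + math.exp(-logit))  # probability the route matches
```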
It should be noted that the route matching model is pre-constructed. Since the task performed by the route matching model is a classification task, i.e., determining whether routes match, the loss function used in training may be a classification loss function, such as the binary cross-entropy loss (BCELoss) or the binary cross-entropy loss with logits (BCEWithLogitsLoss). With BCELoss, the output of the last layer needs to be passed through a sigmoid before entering the loss function. BCEWithLogitsLoss combines the sigmoid and BCELoss into one step: it requires no final sigmoid and directly processes the logits, where logits refers to the raw outputs that have not been passed through a sigmoid or softmax. BCEWithLogitsLoss can also be applied to multi-label prediction; in this application there are only two labels (i.e., match and no match), so BCELoss is preferably used.
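The relationship between the two losses can be verified numerically. `bce_with_logits` below uses the standard numerically stable log-sum-exp form; for any logit z, it equals BCELoss applied to sigmoid(z):

```python
import math

def bce_loss(p, y):
    """Binary cross-entropy on a probability p in (0, 1), label y in {0, 1}."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_with_logits(z, y):
    """BCE applied directly to a logit z, in the numerically stable
    max(z, 0) - z*y + log(1 + exp(-|z|)) form (avoids overflow for large |z|)."""
    return max(z, 0.0) - z * y + math.log(1 + math.exp(-abs(z)))
```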
Specifically, during training, a basic matching model is first acquired, and then the basic matching model is trained using a sample positioning sequence and sample road network data to obtain the route matching model. That is, optionally, in some embodiments, the route yaw recognition method of the present application may further include:
acquiring a basic matching model, a sample positioning sequence and sample road network data;
determining sample road network information within a preset range of each sample point in a sample positioning sequence according to the sample road network data;
determining at least one sample reference route corresponding to the sample positioning sequence based on the sample road network data and each sample point in the sample positioning sequence;
constructing a sample route image containing a sample reference route according to the sample road network information, the sample positioning sequence and the sample reference route;
inputting the sample route image into a basic matching model, and predicting a sample matching route corresponding to the sample positioning sequence;
and training the basic matching model according to the sample matching route and the labeling information corresponding to the sample reference route.
The sample positioning sequence is a positioning sequence generated in the sample road network data, which ensures that the subsequent sample route image carries the data of the sample positioning sequence. The training process differs from the preceding embodiment of determining the matched route in that the sample reference route is labeled in advance, i.e., it is annotated with whether it corresponds to a yaw. During training, the parameters of the basic matching model are adjusted using the sample matching route corresponding to the sample reference route and the labeling information of the sample reference route, thereby completing the training of the basic matching model and obtaining the route matching model.
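One parameter update of such training can be sketched as a single BCE gradient step on a stand-in linear model (illustrative only; the actual model is the convolutional network described above, and the feature vector and learning rate here are assumptions):

```python
import math

def train_step(weights, bias, features, label, lr=0.1):
    """One gradient-descent step of BCE on a single (sample features,
    label) pair. label is 1 if the sample reference route is the true
    matched route, else 0."""
    logit = sum(w * x for w, x in zip(weights, features)) + bias
    pred = 1.0 / (1.0 + math.exp(-logit))
    err = pred - label  # dL/dlogit for BCE composed with sigmoid
    new_w = [w - lr * err * x for w, x in zip(weights, features)]
    new_b = bias - lr * err
    return new_w, new_b
```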
S105, comparing the navigation route with the matched route to obtain a route yaw recognition result.
After the matching route is obtained, it is compared with the navigation route; when it is determined that the user has yawed, the user is prompted, for example by voice or text. In the present application, the route image is used to perform secondary verification on the reference routes matched against the positioning sequence, so a more accurate matching route can be obtained. This improves the accuracy of the route yaw recognition result, avoids disturbing the user with yaw prompts that are inconsistent with the actual situation, and greatly reduces the number of route recalculations caused by yaw, thereby reducing wasted traffic and navigation service resources.
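The comparison rule itself is not specified further in the application. One plausible sketch, assuming both routes are represented as sequences of road-link ids and a yaw is declared when the verified matching route diverges from the navigation route beyond a tolerance (the function name, representation, and threshold are all illustrative assumptions):

```python
def detect_yaw(navigation_route, matching_route, tolerance=0.9):
    """Compare the planned navigation route with the verified matching route.

    Both routes are sequences of road-link ids. If the matching route's
    links largely lie on the navigation route (overlap >= tolerance), the
    user is considered on-route; otherwise a yaw is reported.
    """
    if not matching_route:
        return {"yaw": False, "overlap": 1.0}
    nav_links = set(navigation_route)
    overlap = sum(1 for link in matching_route if link in nav_links) / len(matching_route)
    return {"yaw": overlap < tolerance, "overlap": overlap}
```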
The route yaw recognition process provided by the present application is as described above.
After the road network data, navigation route, and positioning sequence corresponding to the current device are acquired, at least one reference route corresponding to the positioning sequence is determined according to the road network data and the positioning sequence. A route image corresponding to the reference route is then constructed based on the road network data, the reference route, and the positioning sequence; the reference route matching the positioning sequence is determined in the route image as the matching route; and finally the navigation route is compared with the matching route to obtain a route yaw recognition result. In the route yaw recognition scheme provided by the present application, the route image is used to perform secondary verification on the reference routes matched against the positioning sequence, and the navigation route is then compared with the matching route to obtain the route yaw recognition result. This avoids the misjudgments that a single yaw recognition algorithm is prone to when the positioning accuracy of the hardware device is poor.
Referring to fig. 3, the route yaw recognition device according to an embodiment of the present application may be a computer program or a piece of program code running on a computer device; for example, the route yaw recognition device may be application software. The route yaw recognition device can be used to execute the corresponding steps of the route yaw recognition method provided by the embodiments of the present application. The route yaw recognition device provided in an embodiment of the present application includes an obtaining module 201, a first determining module 202, a constructing module 203, a second determining module 204, and a comparing module 205, as follows:
the obtaining module 201 is configured to obtain road network data, a navigation route, and a positioning sequence corresponding to a current device.
The first determining module 202 is configured to determine at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence.
Optionally, in some embodiments, the first determining module 202 may specifically be configured to:
acquiring a preset route recall model;
and determining at least one reference route corresponding to the positioning sequence in the road network data based on the route recall model and the positioning sequence.
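The application leaves the route recall model unspecified. A simple geometric stand-in, which recalls the candidate routes whose polylines stay close to enough positioning points, might look like this (the function names, radius, and hit-ratio threshold are all illustrative assumptions, not the disclosed model):

```python
import math

def point_to_segment_dist(p, a, b):
    """Euclidean distance from point p to segment a-b (planar coordinates)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def recall_reference_routes(positioning_seq, candidate_routes, radius=15.0, min_hit_ratio=0.6):
    """Keep candidate routes for which enough positioning points lie within
    `radius` of the route polyline.

    candidate_routes: {route_id: [(x, y), ...]} polylines from the road network.
    Returns route ids sorted by hit ratio, best first.
    """
    scored = []
    for rid, polyline in candidate_routes.items():
        hits = 0
        for p in positioning_seq:
            d = min(point_to_segment_dist(p, polyline[i], polyline[i + 1])
                    for i in range(len(polyline) - 1))
            if d <= radius:
                hits += 1
        ratio = hits / len(positioning_seq)
        if ratio >= min_hit_ratio:
            scored.append((ratio, rid))
    return [rid for ratio, rid in sorted(scored, reverse=True)]
```

A learned recall model could replace the distance test while keeping the same interface: positioning sequence in, ranked reference routes out.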
The construction module 203 is configured to construct a route image corresponding to the reference route based on the road network data, the reference route, and the positioning sequence.
Optionally, in some embodiments, the building module 203 may specifically include:
the second acquisition unit is used for acquiring reference road network information corresponding to the reference route from the road network data;
and the construction unit is used for constructing a route image corresponding to the reference route based on the reference road network information and the positioning points in the positioning sequence.
Alternatively, in some embodiments, the construction unit may specifically be configured to: acquire the road network attributes of the reference road network information; determine the point location information corresponding to the positioning points in the positioning sequence; and construct a route image corresponding to the reference route based on the road network attributes and the point location information.
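A minimal sketch of such a construction, assuming planar coordinates and rasterizing the road network, the reference route, and the positioning points into separate channels of a grid image (the channel layout, grid size, and function names are illustrative assumptions):

```python
import numpy as np

def build_route_image(segments, anchor_points, ref_route_ids, size=64, extent=100.0):
    """Rasterize road network, one reference route, and positioning points into
    a 3-channel grid: ch0 = all roads, ch1 = reference route, ch2 = positioning points.

    segments: {seg_id: ((x1, y1), (x2, y2))} planar road segments
    ref_route_ids: set of segment ids making up the reference route
    """
    img = np.zeros((3, size, size), dtype=np.float32)

    def to_px(x, y):
        c = min(size - 1, max(0, int(x / extent * (size - 1))))
        r = min(size - 1, max(0, int(y / extent * (size - 1))))
        return r, c

    def draw(ch, p1, p2):
        # Sample the segment densely and mark every covered pixel.
        for t in np.linspace(0.0, 1.0, 2 * size):
            x = p1[0] + t * (p2[0] - p1[0])
            y = p1[1] + t * (p2[1] - p1[1])
            r, c = to_px(x, y)
            img[ch, r, c] = 1.0

    for sid, (p1, p2) in segments.items():
        draw(0, p1, p2)
        if sid in ref_route_ids:
            draw(1, p1, p2)
    for x, y in anchor_points:
        r, c = to_px(x, y)
        img[2, r, c] = 1.0
    return img
```

Road network attributes such as road class or direction could be encoded as additional channels or as per-pixel values instead of the binary 1.0 used here.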
A second determining module 204 is configured to determine, in the route image, that the reference route matching the positioning sequence is a matching route.
Optionally, in some embodiments, the second determining module 204 may specifically include:
the first acquisition unit is used for acquiring a preset route matching model;
the extraction unit is used for extracting global features, starting point features and end point features corresponding to each reference route from the route image;
the splicing unit is used for splicing the global features, the starting point features and the end point features to obtain splicing features corresponding to the reference route;
and the processing unit is used for processing the spliced characteristics based on the route matching model to obtain a matching route matched with the reference route.
Alternatively, in some embodiments, the extraction unit may specifically be configured to: extract, through a preset first extraction sub-network, the global features corresponding to each reference route in the route image; extract, through a preset second extraction sub-network, the features within a preset range referenced to the starting point, to obtain the starting point features corresponding to each reference route; and extract, through the preset second extraction sub-network, the features within a preset range referenced to the end point, to obtain the end point features corresponding to each reference route.
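A toy numeric stand-in for the extract-and-splice step, using average pooling in place of the learned extraction sub-networks (the real scheme uses preset first and second extraction sub-networks; the pooling, window size, and names here are illustrative assumptions):

```python
import numpy as np

def extract_and_concat_features(route_image, start_px, end_px, window=8):
    """Global average-pooled features plus mean-pooled local patches around
    the start and end points, spliced into one feature vector."""
    C, H, W = route_image.shape

    def local_patch_feature(r, c):
        r0, r1 = max(0, r - window), min(H, r + window)
        c0, c1 = max(0, c - window), min(W, c + window)
        return route_image[:, r0:r1, c0:c1].reshape(C, -1).mean(axis=1)

    global_feat = route_image.reshape(C, -1).mean(axis=1)  # stand-in for the first sub-network
    start_feat = local_patch_feature(*start_px)            # second sub-network, start window
    end_feat = local_patch_feature(*end_px)                # second sub-network, end window
    return np.concatenate([global_feat, start_feat, end_feat])
```

The spliced vector would then be fed to the route matching model to score each reference route.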
Optionally, in some embodiments, the route yaw recognition device further comprises a training unit, and the training unit may specifically be configured to:
acquiring a basic matching model, a sample positioning sequence and sample road network data;
determining sample road network information within a preset range of each sample point in a sample positioning sequence according to the sample road network data;
determining at least one sample reference route corresponding to the sample positioning sequence based on the sample road network data and each sample point in the sample positioning sequence;
constructing a sample route image containing a sample reference route according to the sample road network information, the sample positioning sequence and the sample reference route;
inputting the sample route image into the basic matching model, and predicting a sample matching route corresponding to the sample positioning sequence;
and training the basic matching model according to the labeling information corresponding to the sample reference route and the sample matching route.
And the comparison module 205 is used for comparing the navigation route with the matching route to obtain a route yaw recognition result.
The route yaw recognition device provided in an embodiment of the present application and the route yaw recognition method provided in an embodiment of the present application belong to the same inventive concept; the detailed implementation process is described throughout the specification and is not repeated here.
The present application provides a route yaw recognition device. After the obtaining module 201 acquires the road network data, navigation route, and positioning sequence corresponding to the current device, the first determining module 202 determines at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence. The construction module 203 then constructs a route image corresponding to the reference route based on the road network data, the reference route, and the positioning sequence; the second determining module 204 determines the reference route matching the positioning sequence in the route image as the matching route; and finally the comparison module 205 compares the navigation route with the matching route to obtain a route yaw recognition result. In the route yaw recognition scheme provided by the present application, the route image is used to perform secondary verification on the reference routes matched against the positioning sequence, and the navigation route is then compared with the matching route to obtain the route yaw recognition result. This avoids the misjudgments that a single yaw recognition algorithm is prone to when the positioning accuracy of the hardware device is poor.
An embodiment of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of a route yaw identification method as provided by an embodiment of the present application.
Fig. 4 shows a block diagram of a computer device according to an embodiment of the present application. The computer device 100 includes one or more processors 101, a memory 102, and one or more computer programs, where the processors 101 and the memory 102 are connected by a bus. The one or more computer programs are stored in the memory 102 and configured to be executed by the one or more processors 101; when executing a computer program, the processor 101 implements the steps of the route yaw recognition method provided by the embodiments of the present application. The computer device includes a server, a terminal, and the like, and may be a desktop computer, a mobile terminal, or a vehicle-mounted device, the mobile terminal including at least one of a cell phone, a tablet computer, a personal digital assistant, or a wearable device.
It should be understood that the steps in the embodiments of the present application are not necessarily performed sequentially in the order indicated by the step numbers. Unless explicitly stated herein, the steps are not strictly limited to the stated order and may be performed in other orders. Moreover, at least some of the steps in the various embodiments may include multiple sub-steps or stages that are not necessarily completed at the same time but may be performed at different times; these sub-steps or stages need not be performed in sequence, and may be performed in turn or in alternation with at least a portion of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware, where the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered within the scope of this specification.
The foregoing examples represent only a few embodiments of the invention, which are described in detail but are not to be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the invention. Accordingly, the scope of protection of the invention is to be determined by the appended claims.

Claims (10)

1. A method of route yaw identification, comprising:
acquiring road network data, a navigation route and a positioning sequence corresponding to current equipment;
determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence;
constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence;
determining a reference route matched with the positioning sequence as a matched route in the route image;
and comparing the navigation route with the matched route to obtain a route yaw recognition result.
2. The route yaw identification method of claim 1, wherein the determining in the route image that a reference route that matches the positioning sequence is a matching route includes:
acquiring a preset route matching model;
extracting global features, starting point features and end point features corresponding to each reference route from the route image;
splicing the global features, the starting point features and the end point features to obtain splicing features corresponding to the reference route;
and processing the spliced characteristics based on the route matching model to obtain a matching route matched with the reference route.
3. The route yaw recognition method of claim 2, wherein the extracting global features, start features, and end features corresponding to each reference route from the route image includes:
extracting, through a preset first extraction sub-network, global features corresponding to each reference route in the route image;
extracting, through a preset second extraction sub-network, features of each reference route in the route image within a preset range referenced to the starting point, to obtain starting point features corresponding to the reference route; and
extracting, through the preset second extraction sub-network, features of each reference route in the route image within a preset range referenced to the end point, to obtain end point features corresponding to the reference route.
4. The route yaw identification method of claim 2, further comprising:
acquiring a basic matching model, a sample positioning sequence and sample road network data;
determining sample road network information in a preset range of each sample point in the sample positioning sequence according to the sample road network data;
determining at least one sample reference route corresponding to the sample positioning sequence based on the sample road network data and each sample point in the sample positioning sequence;
constructing a sample route image containing a sample reference route according to the sample road network information, the sample positioning sequence and the sample reference route;
inputting the sample route image into the basic matching model, and estimating a sample matching route corresponding to the sample positioning sequence;
and training the basic matching model according to the labeling information corresponding to the sample reference route and the sample matching route.
5. The route yaw recognition method of claim 1, wherein the constructing a route image corresponding to the reference route based on the road network data, the reference route, and the positioning sequence includes:
acquiring reference road network information corresponding to the reference route from the road network data;
and constructing a route image corresponding to the reference route based on the reference road network information and the positioning points in the positioning sequence.
6. The route yaw recognition method of claim 5, wherein constructing a route image corresponding to the reference route based on the reference road network information and the positioning points in the positioning sequence includes:
acquiring the road network attribute of the reference road network information;
determining point location information corresponding to a locating point in the locating sequence;
and constructing a route image corresponding to the reference route based on the road network attribute and the point location information.
7. The route yaw identification method of any one of claims 1 to 6, wherein the determining at least one reference route corresponding to the positioning sequence from the road network data and the positioning sequence includes:
acquiring a preset route recall model;
and determining at least one reference route corresponding to the positioning sequence in the road network data based on the route recall model and the positioning sequence.
8. A route yaw recognition device, comprising:
the acquisition module is used for acquiring road network data, navigation routes and positioning sequences corresponding to the current equipment;
the first determining module is used for determining at least one reference route corresponding to the positioning sequence according to the road network data and the positioning sequence;
the construction module is used for constructing a route image corresponding to the reference route based on the road network data, the reference route and the positioning sequence;
the second determining module is used for determining a reference route matched with the positioning sequence as a matched route in the route image;
and the comparison module is used for comparing the navigation route and the matching route to obtain a route yaw recognition result.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the route yaw identification method of any one of claims 1 to 7.
10. A computer device, comprising:
one or more processors;
a memory; and one or more computer programs, the processor and the memory being connected by a bus, wherein the one or more computer programs are stored in the memory and configured to be executed by the one or more processors, characterized in that the processor, when executing the computer programs, implements the steps of the route yaw identification method of any one of claims 1 to 7.
CN202410033990.2A 2024-01-10 2024-01-10 Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device Pending CN117537842A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410033990.2A CN117537842A (en) 2024-01-10 2024-01-10 Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device


Publications (1)

Publication Number Publication Date
CN117537842A true CN117537842A (en) 2024-02-09

Family

ID=89792348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410033990.2A Pending CN117537842A (en) 2024-01-10 2024-01-10 Route yaw recognition method, route yaw recognition device, computer-readable storage medium and computer-readable storage device

Country Status (1)

Country Link
CN (1) CN117537842A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109459050A (en) * 2018-12-17 2019-03-12 北京百度网讯科技有限公司 The recommended method and its device of navigation routine
CN110726417A (en) * 2019-11-12 2020-01-24 腾讯科技(深圳)有限公司 Vehicle yaw identification method, device, terminal and storage medium
CN111435088A (en) * 2019-01-15 2020-07-21 北京嘀嘀无限科技发展有限公司 Road matching method and device, electronic equipment and storage medium
CN111460067A (en) * 2020-03-30 2020-07-28 滴图(北京)科技有限公司 Method and device for automatically updating navigation route and electronic equipment
CN111811533A (en) * 2020-07-06 2020-10-23 腾讯科技(深圳)有限公司 Yaw determination method and device and electronic equipment
CN113639741A (en) * 2020-04-27 2021-11-12 阿里巴巴集团控股有限公司 Yaw identification and navigation route planning method and device
CN114153934A (en) * 2021-12-13 2022-03-08 高德软件有限公司 Machine learning model training, navigation route recommendation method and computer storage medium
CN114459495A (en) * 2022-01-11 2022-05-10 腾讯科技(深圳)有限公司 Displacement information generation method, displacement information generation device and computer-readable storage medium
CN114459493A (en) * 2021-12-28 2022-05-10 高德软件有限公司 Navigation yaw confirmation method, device, equipment and storage medium
US20220291001A1 (en) * 2021-05-31 2022-09-15 Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. Method and apparatus for generating vehicle navigation path
CN115183789A (en) * 2022-06-28 2022-10-14 阿里巴巴(中国)有限公司 Navigation route determination method and device
CN116310499A (en) * 2023-02-06 2023-06-23 湖南星图空间信息技术有限公司 Ship yaw detection method for optical remote sensing image
CN116698075A (en) * 2023-08-07 2023-09-05 腾讯科技(深圳)有限公司 Road network data processing method and device, electronic equipment and storage medium
CN116972860A (en) * 2023-01-13 2023-10-31 腾讯科技(深圳)有限公司 Yaw recognition method and device, electronic equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination