CN117705141B - Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN117705141B
Authority
CN
China
Prior art keywords
navigation
navigation terminal
yaw
features
terminal
Prior art date
Legal status
Active
Application number
CN202410168273.0A
Other languages
Chinese (zh)
Other versions
CN117705141A (en)
Inventor
肖宁 (Xiao Ning)
Current Assignee
Tencent Technology (Shenzhen) Co., Ltd.
Original Assignee
Tencent Technology (Shenzhen) Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd.
Priority to CN202410168273.0A
Publication of CN117705141A
Application granted
Publication of CN117705141B


Abstract

Embodiments of the present application can be applied to the field of maps, and in particular provide a yaw recognition method, a yaw recognition device, a computer readable medium and electronic equipment. The yaw recognition method comprises the following steps: acquiring a positioning signal of a navigation terminal, wherein the positioning signal comprises a positioning position of the navigation terminal; identifying the road scene where the navigation terminal is located according to the positioning position, and matching a position point corresponding to the positioning position on a set navigation route; identifying a guiding direction for the navigation terminal according to the position point and the navigation route; and if the road scene where the navigation terminal is located is an intersection area, identifying the yaw condition of the navigation terminal according to the guiding direction and set navigation features. The technical solution of the embodiments of the application not only ensures the accuracy of yaw recognition, but also effectively reduces the amount of computation required for yaw recognition, meeting the navigation requirements of practical application scenarios.

Description

Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment
Technical Field
The present application relates to the field of computers and communication technologies, and in particular, to a yaw recognition method, a yaw recognition device, a computer readable medium, and an electronic apparatus.
Background
With the increasing number of automobiles and mobile terminals, the demand for map navigation services keeps growing. In the field of map navigation, accurate yaw judgment is very important: it enables the problem of a vehicle taking a wrong road to be identified in time and accurate, reasonable driving guidance to be given quickly, bringing a more comfortable driving experience. Yaw recognition schemes proposed in the related art are either low in accuracy or algorithmically complex, and are therefore difficult to meet the navigation requirements of practical application scenarios.
Disclosure of Invention
The embodiment of the application provides a yaw identification method, a yaw identification device, a computer readable medium and electronic equipment, which not only ensure the accuracy of yaw identification, but also can effectively reduce the calculated amount of yaw identification and meet the navigation requirement of actual application scenes.
Other features and advantages of the application will be apparent from the following detailed description, or may be learned by the practice of the application.
According to an aspect of an embodiment of the present application, there is provided a yaw recognition method, including: acquiring a positioning signal of a navigation terminal, wherein the positioning signal comprises a positioning position of the navigation terminal; identifying a road scene where the navigation terminal is located according to the positioning position, and matching a position point corresponding to the positioning position on a set navigation route; identifying a guiding direction for the navigation terminal according to the position points and the navigation route; and if the road scene where the navigation terminal is positioned is an intersection area, identifying the yaw condition of the navigation terminal according to the guiding direction and the set navigation characteristics.
According to an aspect of an embodiment of the present application, there is provided a yaw recognition apparatus including: the acquisition unit is configured to acquire a positioning signal of the navigation terminal, wherein the positioning signal comprises a positioning position of the navigation terminal; the identification unit is configured to identify a road scene where the navigation terminal is located according to the positioning position and match a position point corresponding to the positioning position on a set navigation route; a determination unit configured to identify a guiding direction for the navigation terminal based on the location point and the navigation route; and the processing unit is configured to identify the yaw condition of the navigation terminal according to the guiding direction and the set navigation characteristic if the road scene where the navigation terminal is positioned is an intersection area.
In some embodiments of the application, based on the foregoing, the identification unit is configured to: acquiring road network data at the positioning position from map data according to the positioning position, wherein the road network data comprises road section data; matching a target road section where the positioning position is located in the road network data; and identifying the road scene where the navigation terminal is located according to the target road section.
In some embodiments of the present application, based on the foregoing aspects, the identifying unit identifies, according to the target road segment, a road scene in which the navigation terminal is located, including at least one of the following manners: identifying a road scene where the navigation terminal is located according to the attribute data of the target road section, wherein the attribute data of the target road section is used for representing the road type of the target road section; and identifying the road scene where the navigation terminal is located according to the relation between the target road section and the adjacent road section.
In some embodiments of the application, based on the foregoing, the determining unit is configured to: intercept a route of a set length on the navigation route according to the position of the position point on the navigation route, the route of the set length including the position point; and identify the guiding direction for the navigation terminal according to the route of the set length, wherein the guiding direction comprises one or more of going straight, turning, and making a U-turn.
In some embodiments of the application, based on the foregoing, the navigation features include distance class features and angle class features; the processing unit is configured to: if the guiding direction is straight, determine that the navigation terminal is not yawing when the distance class feature is less than or equal to a distance threshold, or when the angle class feature is less than or equal to an angle threshold.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to: and adjusting the size of the distance threshold according to the angle class characteristics.
In some embodiments of the application, based on the foregoing, the navigation features include distance class features and angle class features; the processing unit is configured to: if the guiding direction is turning or making a U-turn, determine that the navigation terminal is not yawing when at least one of the following conditions is met: the steering trend of the sensor is consistent with the steering trend of the navigation route, the sensor being a sensor installed on the navigation terminal that can output a direction signal; the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route; the navigation terminal has a deceleration trend; the navigation terminal is located in the intersection area.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to: if the accumulated steering angle of the sensor in the intersection area is in a set angle interval, determining that the steering trend of the sensor is consistent with the steering trend of the navigation route; if the accumulated steering angle of the positioning signal of the navigation terminal in the intersection area is within a set angle interval, determining that the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route.
In some embodiments of the application, based on the foregoing, the navigation features include distance class features and angle class features; the processing unit is configured to: if the distance class feature is greater than a distance threshold and the angle class feature is greater than an angle threshold, determine that the navigation terminal is identified as yawing.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to perform at least one of the following: adjusting the size of the distance threshold according to the quality of the positioning signal, wherein the quality of the positioning signal and the size of the distance threshold form an inverse correlation relation;
And adjusting the size of the distance threshold according to the road scene, wherein the complexity of the road scene and the size of the distance threshold form a positive correlation.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to: acquiring multidimensional features of the navigation terminal in the navigation process; acquiring a set number of features from the multi-dimensional features according to the order of the importance of the multi-dimensional features from high to low; and generating the distance type features and the angle type features according to the set number of features.
In some embodiments of the application, based on the foregoing, the processing unit is configured to: acquiring features belonging to a distance type from the set number of features, and taking the minimum value of the features belonging to the distance type as the distance type feature; and acquiring the features belonging to the angle type from the set number of features, and taking the minimum value in the features belonging to the angle type as the angle type feature.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to: taking multi-dimensional characteristics contained in the historical navigation data as sample data, and taking yaw conditions corresponding to the historical navigation data as sample labels so as to train a machine learning model; the importance of the multi-dimensional features is determined based on the trained machine learning model.
In some embodiments of the application, based on the foregoing, the processing unit is further configured to: if the yaw of the navigation terminal is recognized according to the guiding direction and the set navigation characteristics, determining that the yaw of the navigation terminal does not occur actually when at least one of the following conditions is met:
The running speed of the navigation terminal is smaller than or equal to a set speed threshold value;
The navigation terminal is positioned in a set road scene;
The quality of the positioning signal of the navigation terminal is smaller than or equal to a set quality threshold;
The steering trend of the positioning signal of the navigation terminal is inconsistent with the steering trend of a sensor arranged on the navigation terminal;
The matching degree between the positioning position of the navigation terminal and the target road section matched in the road network data is smaller than or equal to a set precision threshold value;
the positioning position of the navigation terminal moves towards the navigation route.
According to an aspect of an embodiment of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a yaw recognition method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: one or more processors; storage means for storing one or more computer programs that, when executed by the one or more processors, cause the electronic device to implement the yaw recognition method as described in the embodiments above.
According to an aspect of an embodiment of the present application, there is provided a computer program product comprising a computer program stored in a computer readable storage medium. The processor of the electronic device reads and executes the computer program from the computer-readable storage medium, so that the electronic device performs the yaw recognition method provided in the various alternative embodiments described above.
In the technical solutions provided in some embodiments of the present application, the road scene where a navigation terminal is located may be identified according to the positioning position of the navigation terminal, and a position point corresponding to the positioning position may be matched on a set navigation route, so as to identify a guiding direction for the navigation terminal according to the position point and the navigation route; when the road scene where the navigation terminal is located is an intersection area, the yaw condition of the navigation terminal is then identified according to the guiding direction and the set navigation features. In this way, the technical solution of the embodiments can determine the guiding direction of the navigation terminal by identifying the road scene and matching against the navigation route, so that when the navigation terminal is identified as being in an intersection area, its yaw condition is identified according to the guiding direction and the navigation features. This ensures the accuracy of yaw recognition, effectively reduces the amount of computation required for yaw recognition, and meets the navigation requirements of practical application scenarios.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of an embodiment of the present application may be applied.
FIG. 2 illustrates a flow chart of a yaw recognition method according to one embodiment of the application.
FIG. 3 illustrates a schematic diagram of a yaw recognition system, according to one embodiment of the application.
Fig. 4 shows a three-axis definition schematic of a handset device according to an embodiment of the application.
Fig. 5 shows a three-axis definition schematic of a vehicle device according to an embodiment of the application.
Fig. 6 shows a three-axis definition schematic of a vehicle device according to an embodiment of the application.
FIG. 7 shows a schematic diagram of a navigation route according to one embodiment of the application.
Fig. 8 shows a schematic diagram of a navigation guidance direction according to an embodiment of the application.
FIG. 9 illustrates a schematic diagram of importance scores for features according to one embodiment of the application.
FIG. 10 illustrates a block diagram of a yaw recognition device, according to one embodiment of the application.
Fig. 11 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
Example embodiments are now described in a more complete manner with reference being made to the figures. However, the illustrated embodiments may be embodied in various forms and should not be construed as limited to only these examples; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics of the application may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the application. However, it will be recognized by one skilled in the art that the present inventive arrangements may be practiced without all of the specific details of the embodiments, that one or more specific details may be omitted, or that other methods, elements, devices, steps, etc. may be used.
In the present embodiment, the term "module" or "unit" refers to a computer program or a part of a computer program having a predetermined function and working together with other relevant parts to achieve a predetermined object, and may be implemented in whole or in part by using software, hardware (such as a processing circuit or a memory), or a combination thereof. Also, a processor (or multiple processors or memories) may be used to implement one or more modules or units. Furthermore, each module or unit may be part of an overall module or unit that incorporates the functionality of the module or unit.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
It should be noted that: references herein to "a plurality" means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., a and/or B may represent: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
It can be understood that, before and during the process of collecting the relevant data (such as the positioning signal of the navigation terminal, road network data, etc.), the present application may display a prompt interface or a popup window for prompting the user to collect the relevant data currently, so that the present application only starts to execute the relevant step of obtaining the relevant data after obtaining the confirmation operation sent by the user to the prompt interface or the popup window, or ends the relevant step of obtaining the relevant data (i.e. does not obtain the relevant data when the confirmation operation sent by the user to the prompt interface or the popup window is not obtained). In other words, all data collected by the present application is collected with the consent and authorization of the user, and the collection, use and processing of the relevant data requires compliance with the relevant laws and regulations and standards of the relevant country and region.
With the increasing number of automobiles and mobile terminals, the demand for map navigation services keeps growing. In the field of map navigation, accurate yaw judgment is very important: it enables the problem of a vehicle taking a wrong road to be identified in time and accurate, reasonable driving guidance to be given quickly, bringing a more comfortable driving experience. Among the yaw recognition schemes proposed in the related art, one recognizes yaw through a machine learning model, but its algorithm is highly complex; another recognizes yaw through road matching, but its accuracy is low.
On this basis, the embodiments of the present application provide a new yaw recognition scheme: the guiding direction of the navigation terminal is determined by identifying the road scene and matching against the navigation route, so that when the navigation terminal is identified as being in an intersection area, its yaw condition is identified according to the guiding direction and the navigation features. This ensures the accuracy of yaw recognition, effectively reduces the amount of computation required for yaw recognition, and meets the navigation requirements of practical application scenarios.
In the following, an application scenario of the technical solution of the embodiment of the present application will be described with reference to fig. 1, where as shown in fig. 1, the navigation terminal may be a vehicle terminal 101, and an electronic map application is installed in the vehicle terminal 101, and the vehicle terminal may perform driving according to a navigation route in an electronic map, for example, performing automatic driving, driving assistance, and so on. The vehicle terminal 101 has a positioning device disposed therein, which may be a satellite positioning device that can acquire observed positioning signal information. The satellite positioning device is used for tracking and processing satellite signals, and measuring geometrical distances between the device and satellites (pseudo-range observations) and Doppler effects of the satellite signals (Doppler observations). The satellite positioning device generally comprises an antenna, a satellite signal tracking loop, a baseband signal processing module and the like, and the terminal device integrated with the satellite positioning device can calculate the positioning position of the terminal device according to the pseudo-range observation value and the Doppler observation value.
Alternatively, the satellite positioning device may receive GNSS (Global Navigation Satellite System) positioning signals, such as positioning signals of one or more of the positioning satellites 103a, 103b shown in fig. 1. The GNSS positioning signals may be, for example, one or more of Global Positioning System (GPS) positioning signals, BeiDou Navigation Satellite System (BDS) positioning signals, GLONASS satellite navigation system positioning signals, and GALILEO satellite navigation system positioning signals.
In some alternative embodiments, after acquiring the observed positioning signal, the vehicle terminal 101 may acquire its positioning position from the positioning signal, then identify the road scene (such as a tunnel, an intersection, a service area, an overpass, etc.) where the vehicle terminal 101 is located according to the positioning position, and match a location point corresponding to the positioning position on the set navigation route; the vehicle terminal 101 may then identify a guiding direction (such as straight, turning, or making a U-turn) for itself according to the matched location point and the navigation route. Further, when the road scene where the vehicle terminal 101 is located is an intersection area, the yaw situation of the vehicle terminal 101 is identified according to the guiding direction and the set navigation features (such as the closest distance from the positioning position to the navigation route, the distance between the positioning position and the matched position point on the navigation route, etc.), that is, whether the vehicle terminal 101 has yawed is identified. If it is recognized that the vehicle terminal 101 has yawed, the vehicle terminal 101 may request the server 102 to re-plan the navigation route according to the location position and destination of the vehicle terminal 101.
In some alternative embodiments, after acquiring the observed positioning signal, the vehicle terminal 101 may also send it to the server 102. The server 102 may likewise identify the road scene (such as a tunnel, an intersection, a service area, an overpass, etc.) where the vehicle terminal 101 is located according to the positioning position, and match a location point corresponding to the positioning position on the navigation route of the vehicle terminal 101; the server 102 may then identify a guiding direction (such as straight, turning, or making a U-turn) for the vehicle terminal 101 according to the matched location point and the navigation route. Further, when the road scene where the vehicle terminal 101 is located is an intersection area, the yaw situation of the vehicle terminal 101 is identified according to the guiding direction and the set navigation features, that is, whether the vehicle terminal 101 has yawed is identified. If it is recognized that the vehicle terminal 101 is yawing, the server 102 may re-plan the navigation route according to the location position and destination of the vehicle terminal 101.
It should be noted that the server 102 may be an independent physical server, a server cluster or distributed system formed by at least two physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data and artificial intelligence platforms. The vehicle terminal 101 may also be a smart phone, a smart speaker, a smart speaker with a screen, a smart watch, a sensor, etc., but is not limited thereto; for example, the vehicle terminal 101 may also be replaced by a mobile terminal such as an aircraft. The vehicle terminals and servers may be directly or indirectly connected through wired or wireless communication, and the number of vehicle terminals and servers may each be one or at least two, which is not limited herein.
The implementation details of the technical scheme of the embodiment of the application are described in detail below:
Fig. 2 shows a flow chart of a yaw recognition method according to an embodiment of the present application, which may be performed by an electronic device having a calculation processing function, such as a navigation device (e.g. a vehicle terminal, a mobile terminal) performing a navigation function, or a server in communication with the navigation device. Referring to fig. 2, the yaw recognition method at least includes steps S210 to S240, and is described in detail as follows:
In step S210, a positioning signal of the navigation terminal is obtained, where the positioning signal includes a positioning position of the navigation terminal.
Alternatively, the navigation terminal may be a terminal device that has a positioning function and is capable of performing a navigation function, such as a vehicle terminal, a smart phone, a smart watch, or the like.
The positioning signal in the embodiment of the application may be a satellite positioning signal, for example, may be a GNSS positioning signal. The GNSS positioning signals may be, for example, one or more of GPS positioning signals, BDS positioning signals, GLONASS satellite navigation system positioning signals, GALILEO satellite navigation system positioning signals. The navigation terminal can determine an absolute position (i.e. a positioning position) according to longitude and latitude coordinate information contained in the received satellite positioning signals.
Alternatively, the positioning signal may also come from an auxiliary positioning device, for example a base station or a roadside device (Road Side Unit, RSU). It should be noted that if the positioning signal provided by the auxiliary positioning device indicates the relative position between the navigation terminal and the auxiliary positioning device, the absolute position (i.e., the positioning position) of the navigation terminal itself can be determined from that relative position together with the absolute position of the auxiliary positioning device.
In step S220, a road scene where the navigation terminal is located is identified according to the positioning position, and a position point corresponding to the positioning position is matched on the set navigation route.
In some optional embodiments, when identifying the road scene where the navigation terminal is located, road network data at the positioning position may be obtained from the map data according to the positioning position of the navigation terminal, where the road network data includes road segment data; the target road segment on which the positioning position lies is then matched within the road network data, and the road scene where the navigation terminal is located is identified according to the target road segment.
Optionally, road network data (Road Network Data) is a kind of geographic information data used to represent a road system. It generally contains basic elements such as roads, intersections, bridges and tunnels, together with attribute information related to these elements such as road names, types, widths, start/end position coordinates and speed limits. The most basic element in road network data is the road segment (Segment); several connected segments together form a link (Link); a Link connects to other Links only through its start and end points, and may be internally divided into different segments according to its shape; several connected Links in turn form a route (Route).
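To make these relationships concrete, the following is a minimal sketch of how Segment, Link and Route might be represented; all field names are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """Basic road segment: the smallest element of the road network."""
    segment_id: int
    shape_points: list        # [(lon, lat), ...] polyline describing the segment shape
    road_type: str            # e.g. "tunnel", "ramp", "main_road"
    road_grade: str           # e.g. "highway", "provincial", "rural"
    width_m: float
    speed_limit_kmh: float

@dataclass
class Link:
    """Chain of connected segments; joins other Links only at its endpoints."""
    link_id: int
    segments: list            # ordered Segments, possibly cut by shape
    start_node: int
    end_node: int

@dataclass
class Route:
    """Sequence of connected Links, e.g. a planned navigation route."""
    route_id: int
    links: list               # ordered Links
```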
In some alternative embodiments, when acquiring road network data at the location of the navigation terminal, road network data within a set range (for example, within 50 meters, 100 meters, etc.) around the positioning position of the navigation terminal may be acquired, taking the positioning position as the center.
In some alternative embodiments, the road scene where the navigation terminal is located may be identified according to the attribute data of the target road segment, where the attribute data of the target road segment is used to characterize the road type of the target road segment, and then the road scene where the navigation terminal is located may be identified according to the road type of the target road segment, for example, if the attribute data of the target road segment indicates that the road type of the target road segment is a tunnel, it may be determined that the navigation terminal is located in the tunnel scene.
In some alternative embodiments, the road scene in which the navigation terminal is located may be identified according to the relationship between the target road segment and its neighboring road segments. For example, if parallel roads at the same height exist around the target road segment, this indicates that the road scene where the navigation terminal is located is a main/auxiliary road scene; if parallel or crossing roads at different heights exist around the target road segment, this indicates that the road scene where the navigation terminal is located is an elevated-road scene.
In some alternative embodiments, the road scene where the navigation terminal is located may also be identified jointly according to two factors, namely the attribute data of the target road segment and the relationship between the target road segment and its neighboring road segments. For example, the road scene may be identified separately from each of the two factors; if the two identification results are consistent, the road scene where the navigation terminal is located is taken as identified; if the two identification results are inconsistent, the result from the factor with the higher priority may be selected according to the priority between the two factors.
In some alternative embodiments, matching the location point corresponding to the positioning location on the set navigation route is mainly based on the positioning location of the navigation terminal, and the possible location of the navigation terminal is identified on the set navigation route. For example, the distance between the positioning position and each point on the navigation route may be calculated, and then the point with the smallest distance is selected as the position point corresponding to the positioning position that is matched on the navigation route.
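As a minimal sketch of the nearest-point matching just described (assuming planar (x, y) coordinates; a production implementation would project longitude/latitude first and also interpolate along route edges):

```python
import math

def match_on_route(position, route_points):
    """Return (matched point, index, distance) for the route point nearest
    to the positioning position."""
    best_i = min(range(len(route_points)),
                 key=lambda i: math.hypot(position[0] - route_points[i][0],
                                          position[1] - route_points[i][1]))
    p = route_points[best_i]
    return p, best_i, math.hypot(position[0] - p[0], position[1] - p[1])
```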
In step S230, a guiding direction for the navigation terminal is identified based on the matched position point and the navigation route.
Alternatively, the guiding direction for the navigation terminal may be going straight, turning, making a U-turn, etc. Turning can further be divided into turning left, turning right, etc.; U-turns can be divided into U-turns to the right, U-turns to the left, etc.
In some alternative embodiments, when identifying the guiding direction for the navigation terminal, a route of a set length containing the matched position point may be intercepted on the navigation route according to the position of that point, and the guiding direction for the navigation terminal is then identified from the intercepted route. For example, the route of the set length may be the route within 5 meters or 10 meters around the position point (the values are only examples), so that whether the navigation terminal should go straight, turn, or make a U-turn can be identified from the intercepted route, as sketched below.
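The following sketch illustrates one way such a window could be intercepted and classified, using the accumulated heading change along the intercepted polyline; the window length and the 30/120-degree class boundaries are illustrative assumptions, not values from the patent:

```python
import math

def _clip(pts, idx, window):
    """Collect route points within `window` meters of pts[idx] along the polyline."""
    def walk(rng):
        out, acc, prev = [], 0.0, pts[idx]
        for j in rng:
            acc += math.hypot(pts[j][0] - prev[0], pts[j][1] - prev[1])
            if acc > window:
                break
            out.append(pts[j])
            prev = pts[j]
        return out
    return walk(range(idx - 1, -1, -1))[::-1] + [pts[idx]] + walk(range(idx + 1, len(pts)))

def guiding_direction(route_points, match_idx, window=10.0):
    """Classify the guidance from the accumulated heading change over the window."""
    pts = _clip(route_points, match_idx, window)
    total = 0.0
    for a, b, c in zip(pts, pts[1:], pts[2:]):
        h1 = math.atan2(b[1] - a[1], b[0] - a[0])
        h2 = math.atan2(c[1] - b[1], c[0] - b[0])
        # wrap each heading change to (-180, 180] before accumulating
        total += math.degrees(math.atan2(math.sin(h2 - h1), math.cos(h2 - h1)))
    if abs(total) >= 120.0:
        return "u_turn"
    if abs(total) >= 30.0:
        # positive accumulated angle = counterclockwise = left, with x east / y north
        return "turn_left" if total > 0 else "turn_right"
    return "straight"
```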
In step S240, if the road scene where the navigation terminal is located is an intersection area, the yaw situation of the navigation terminal is identified according to the guiding direction of the navigation terminal and the set navigation feature.
In some alternative embodiments, if the navigation terminal is not in an intersection area, it is assumed by default that the navigation terminal is not yawing and is traveling straight along the road. When the navigation terminal is located in an intersection area, yaw may occur due to a wrong turn or a wrong U-turn, so whether the navigation terminal is yawing needs to be identified in that case.
In some alternative embodiments, the navigation features relied upon in identifying the yaw condition of the navigation terminal may be features of the navigation terminal that are relevant during navigation, such as one or more of the following:
the speed of the navigation terminal;
the traveling direction of the navigation terminal;
the distance between the positioning position of the navigation terminal and the matched point in the map;
attributes of the road on which the navigation terminal is located, such as road grade and road width;
the distance between the positioning position of the navigation terminal and the matched point on the navigation route;
the local shape of the navigation route;
the nearest distance between the positioning position of the navigation terminal and the full navigation route;
the accumulated change of the heading angle obtained by a sensor on the navigation terminal within a certain time window;
the angle between the positioning signal of the navigation terminal and the map-matched road;
the angle between the positioning signal of the navigation terminal and the matched road on the navigation route;
the angle between the map-matched road and the matched road on the navigation route;
the quality evaluation score of the positioning signal of the navigation terminal;
the road scene in which the navigation terminal is located;
the distance between the map-matched point and the matched position point on the navigation route; and so on.
In some alternative embodiments, since the navigation terminal has many relevant features during navigation, suitable features may be selected to identify the yaw situation in order to reduce the amount of computation as much as possible. For example, after acquiring the multi-dimensional features of the navigation terminal during navigation (several of the features listed above, and features not listed, may be selected as the multi-dimensional features), the importance of the multi-dimensional features may be determined; a set number of features is then taken from the multi-dimensional features in order of importance from high to low, and the selected features are processed to obtain the distance class features and the angle class features.
In some alternative embodiments, machine learning techniques from artificial intelligence (Artificial Intelligence, AI) may be employed to identify the importance of features. Artificial intelligence uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, artificial intelligence is an integrated technology of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence is thus the study of the design principles and implementation methods of various intelligent machines, enabling machines to perceive, reason and make decisions.
Artificial intelligence is a comprehensive discipline involving a wide range of fields, at both the hardware and the software level. Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics, and the like. Artificial intelligence software technology mainly covers computer vision, speech processing, natural language processing, machine learning/deep learning, autonomous driving, intelligent transportation, and other directions. Machine learning (Machine Learning, ML) is a multi-field interdiscipline involving probability theory, statistics, approximation theory, convex analysis, algorithmic complexity theory and other disciplines. It specializes in studying how computers simulate or implement human learning behavior to acquire new knowledge or skills and reorganize existing knowledge structures so as to continuously improve their own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent; it is applied across all areas of artificial intelligence. Machine learning and deep learning typically include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from instruction.
Optionally, in an embodiment of the present application, the machine learning model may be trained by using the multi-dimensional features contained in historical navigation data as sample data and the yaw conditions corresponding to the historical navigation data as sample labels; the importance of the multi-dimensional features is then determined based on the trained machine learning model. Alternatively, the machine learning model may be a neural network model such as a convolutional neural network (Convolutional Neural Network, CNN) or a recurrent neural network (Recurrent Neural Network, RNN), or may be a support vector machine (Support Vector Machine, SVM) model, a Bayesian model, a decision tree model, or the like.
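As an illustrative realization of this training step (the text leaves the model family open; a gradient-boosted tree classifier is used here only because it exposes feature importances directly):

```python
from sklearn.ensemble import GradientBoostingClassifier

def rank_feature_importance(X_hist, y_hist, feature_names):
    """X_hist: multi-dimensional features from historical navigation data (samples);
    y_hist: 1 if the sample corresponds to a confirmed yaw, else 0 (sample labels).
    Returns (name, importance) pairs sorted from most to least important."""
    model = GradientBoostingClassifier().fit(X_hist, y_hist)
    return sorted(zip(feature_names, model.feature_importances_),
                  key=lambda t: t[1], reverse=True)
```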
In some alternative embodiments, when the acquired set number of features are processed to generate the distance class features and the angle class features, the features belonging to the distance class may be acquired from the acquired set number of features, and then the minimum value of the features belonging to the distance class is taken as the distance class feature. Alternatively, in other embodiments of the present application, an average value or median of features belonging to a distance type, or the like, may be used as the distance class feature, or the distance class feature may be determined in other manners.
Similarly, a feature belonging to an angle type may be acquired from the acquired set number of features, and then the minimum value among the features belonging to the angle type is taken as an angle type feature. Alternatively, in other embodiments of the present application, an average value or a median value or the like of the features belonging to the angle type may be used as the angle class feature, or the angle class feature may be determined in other manners.
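A minimal sketch of this aggregation step; reading group membership from a naming convention is purely illustrative, since the text specifies only that the minimum of each group is taken:

```python
def derive_class_features(selected_features):
    # selected_features: {name: value} for the top-k features by importance
    dist = [v for k, v in selected_features.items() if "dist" in k]    # distance-type group
    ang = [v for k, v in selected_features.items() if "angle" in k]    # angle-type group
    distance_class = min(dist) if dist else None   # minimum of the group, per the text
    angle_class = min(ang) if ang else None
    return distance_class, angle_class
```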
In some alternative embodiments, when the yaw situation of the navigation terminal is identified according to the guiding direction and the set navigation features, if the guiding direction for the navigation terminal is straight, it may be determined that the navigation terminal is not yawing when the distance class feature is less than or equal to a distance threshold (indicating that the degree of deviation between the navigation terminal and the navigation route is small), or when the angle class feature is less than or equal to an angle threshold (likewise indicating that the degree of deviation between the navigation terminal and the navigation route is small).
Optionally, for the case that the guiding direction of the navigation terminal is straight, the size of the distance threshold may be adjusted according to the angle class feature. For example, denoting the distance threshold by $d_{th}$ and the value of the angle class feature by $\alpha$, the distance threshold may be adjusted according to a formula of the form:

$d_{th} = \max(k_1,\; k_2 - k_3 \cdot \alpha)$

where $k_1$, $k_2$, $k_3$ are constant parameters and $\alpha$ represents the value of the angle class feature.
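A sketch of the straight-ahead decision, assuming the clamped linear form above; the constants and the angle threshold are illustrative placeholders, not values from the patent:

```python
def straight_distance_threshold(alpha, k1=15.0, k2=40.0, k3=0.5):
    # distance threshold (m) shrinks as the angle-class feature alpha (deg) grows
    return max(k1, k2 - k3 * alpha)

def straight_no_yaw(distance_feature, angle_feature, angle_threshold=20.0):
    # no yaw if EITHER class feature is at or below its threshold (the OR rule above)
    return (distance_feature <= straight_distance_threshold(angle_feature)
            or angle_feature <= angle_threshold)
```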
In some alternative embodiments, when the yaw situation of the navigation terminal is identified according to the guiding direction and the set navigation features, if the guiding direction for the navigation terminal is turning or making a U-turn, it is determined that the navigation terminal is not yawing when at least one of the following conditions is satisfied:
the steering trend of the sensor is consistent with the steering trend of the navigation route, and the sensor is a sensor which is arranged on the navigation terminal and can output a direction signal;
the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route;
The navigation terminal has a deceleration trend;
The navigation terminal is positioned in the intersection area.
Alternatively, the above-mentioned sensor may be a three-axis acceleration sensor, a three-axis magnetic sensor, a three-axis gyro sensor, or the like. Whether the navigation terminal has a deceleration trend can be determined by analyzing its speed over a set time period, or from its acceleration; for example, if the acceleration of the navigation terminal is negative, the navigation terminal has a deceleration trend. In addition, if the navigation terminal is still in the intersection area, the turning or U-turn operation has not yet been completed, so the terminal is likewise determined not to be yawing.
In some alternative embodiments, whether the steering trend of the sensor is consistent with the steering trend of the navigation route may be determined by checking whether the cumulative steering angle of the sensor within the intersection area is in a set angle interval: if the cumulative steering angle of the sensor in the intersection area is within the set angle interval, the steering trend of the sensor is consistent with the steering trend of the navigation route. Alternatively, if the guiding direction for the navigation terminal is a turn, the angle interval may be small, such as an interval around 30 degrees (e.g., 25-35 degrees); if the guiding direction for the navigation terminal is a U-turn, the angle interval may be large, such as an interval around 120 degrees (e.g., 115-125 degrees).
In some alternative embodiments, whether the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route may be determined in the same way, by checking whether the cumulative steering angle of the positioning signal within the intersection area is in a set angle interval: if it is, the steering trend of the positioning signal is consistent with the steering trend of the navigation route. Again, if the guiding direction for the navigation terminal is a turn, the angle interval may be small, such as an interval around 30 degrees (e.g., 25-35 degrees); if the guiding direction is a U-turn, the angle interval may be large, such as an interval around 120 degrees (e.g., 115-125 degrees). A sketch of this consistency check follows.
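A sketch of that cumulative-angle check, using the example intervals from the text (25-35 degrees for a turn, 115-125 degrees for a U-turn); the heading samples may come either from the sensor or from successive positioning fixes:

```python
def steering_consistent(headings_deg, guide):
    # accumulate signed heading change inside the intersection area
    total = 0.0
    for h1, h2 in zip(headings_deg, headings_deg[1:]):
        total += (h2 - h1 + 180.0) % 360.0 - 180.0  # wrap each step to [-180, 180)
    lo, hi = (25.0, 35.0) if guide == "turn" else (115.0, 125.0)  # else: "u_turn"
    return lo <= abs(total) <= hi
```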
In some alternative embodiments, when identifying the yaw condition of the navigation terminal according to the guiding direction and the set navigation features, if the distance class feature is greater than the distance threshold and the angle class feature is greater than the angle threshold, the deviation between the navigation terminal and the navigation route is judged to be large, and the navigation terminal is identified as yawing.
In some alternative embodiments, the magnitude of the distance threshold may also be adjusted according to the quality of the positioning signal, where the quality of the positioning signal is inversely related to the magnitude of the distance threshold, i.e. the worse the quality of the positioning signal, the greater the distance threshold. This is because, when the quality of the positioning signal is poor, a large influence may be caused on yaw recognition, so that in order to avoid misjudgment, the magnitude of the distance threshold may be set large, but a maximum value may be set, i.e., the distance threshold cannot exceed the maximum value; in order to improve accuracy of yaw recognition when the quality of the positioning signal is good, the magnitude of the distance threshold may be set smaller, but a minimum value may be set, i.e. the distance threshold cannot be smaller than the minimum value.
In some alternative embodiments, the magnitude of the distance threshold may also be adjusted according to the road scene, where the complexity of the road scene and the magnitude of the distance threshold are in positive correlation, i.e., the more complex the road scene, the larger the distance threshold. This is because a complex road scene may strongly affect yaw recognition, so to avoid misjudgment the distance threshold may be set large, though a maximum value may be set (the threshold cannot exceed the maximum). Conversely, to improve the accuracy of yaw recognition when the road scene is not complex, the distance threshold may be set small, though a minimum value may be set (the threshold cannot be smaller than the minimum).
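A sketch combining the two adjustments, assuming signal quality normalized to (0, 1] and a non-negative scene-complexity score; the reciprocal/linear form and all constants are illustrative:

```python
def adjust_distance_threshold(base, signal_quality, scene_complexity,
                              d_min=10.0, d_max=60.0):
    # inverse relation to signal quality, positive relation to scene complexity,
    # clamped to [d_min, d_max] as the text requires
    d = base * (1.0 + scene_complexity) / max(signal_quality, 1e-6)
    return min(max(d, d_min), d_max)
```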
In some alternative embodiments, to avoid yaw misidentification, after the navigation terminal has been identified as yawing according to the guiding direction and the set navigation features, it may be determined that the navigation terminal has not actually yawed if at least one of the following conditions is met (a sketch of this suppression logic follows the list):
the running speed of the navigation terminal is smaller than or equal to a set speed threshold value;
The navigation terminal is positioned in a set road scene, such as a tunnel, a parking lot, an intersection and the like;
The quality of the positioning signal of the navigation terminal is smaller than or equal to the set quality threshold, and the quality of the positioning signal is poor, so that the yaw recognition can be greatly influenced, and the navigation terminal can be considered to be not yawed;
The steering trend of the positioning signal of the navigation terminal is inconsistent with the steering trend of the sensor installed on the navigation terminal, and the situation is probably caused by the abnormality of the positioning signal or the sensor, so that the navigation terminal can be considered to be not yawed;
The matching degree between the positioning position of the navigation terminal and the target road section matched in the road network data is smaller than or equal to a set precision threshold value, which indicates that the quality of the positioning signal of the navigation terminal is possibly poor, and then the navigation terminal can be considered to be not yawed;
The positioning position of the navigation terminal is moving toward the navigation route, which means that the navigation terminal is correcting its deviation from the navigation route; it can therefore also be considered that the navigation terminal is not yawing.
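A sketch of the suppression check over these conditions; all keys and threshold names are illustrative:

```python
def suppress_yaw(ctx):
    # ctx: dict carrying the quantities named in the list above
    return any([
        ctx["speed"] <= ctx["speed_threshold"],
        ctx["road_scene"] in {"tunnel", "parking_lot", "intersection"},
        ctx["signal_quality"] <= ctx["quality_threshold"],
        not ctx["gnss_turn_matches_sensor_turn"],
        ctx["map_match_score"] <= ctx["accuracy_threshold"],
        ctx["moving_toward_route"],
    ])
```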
The technical scheme of the embodiment of the application not only ensures the accuracy of yaw identification, but also can effectively reduce the calculated amount of yaw identification and meet the navigation requirement of actual application scenes.
The implementation details of the technical solution of the embodiment of the present application are described in detail below with reference to fig. 3 to 9:
The technical solution of the embodiments of the present application mainly combines the current positioning position, sensor information, road data and navigation route of the navigation device to perform the navigation yaw judgment, so as to judge the yaw behavior of the navigation device efficiently and accurately. Specifically, the technical solution is mainly applied to identifying whether the navigation terminal is driving according to the navigation guidance in an intersection bifurcation area, and judging the yaw behavior of the navigation terminal in time so that the route can be recalculated. The main steps in an exemplary embodiment are as follows (a condensed sketch of the resulting loop follows the list of steps):
Step A: acquire historical positioning information from the GNSS system of the navigation terminal, including the absolute position (e.g., the longitude and latitude coordinates of the vehicle terminal, when the navigation terminal is a vehicle terminal) and the velocity (including speed and direction). Optionally, historical sensor information of the navigation terminal may also be obtained.
Step B: combine the current positioning information and sensor information of the navigation terminal with the current road data to obtain the real-time motion state of the navigation terminal and its relation to road features, so as to judge on which road the navigation terminal is currently most likely to be traveling, and determine the current road scene from the matched best road.
Step C: combine the current positioning information and sensor information of the navigation terminal with the navigation route information to obtain the best matching position of the navigation terminal on the route, and judge the current intersection guiding direction from the matching position.
Step D: based on the best-matching road information from step B, judge whether the terminal is currently in an intersection area; if so, proceed to step E, otherwise return to step A.
Step E: extract high-importance features, selected by the pre-trained model, from the current positioning information, the navigation route, the road network matching and other information; perform the yaw judgment based on the extracted features; if the navigation terminal is judged to be yawing, proceed to step F, otherwise return to step A.
Step F: apply yaw suppression logic for yaw error protection in special scenes; if the yaw protection is triggered, return to step A; otherwise the navigation terminal is recognized as yawing, and re-planning of the navigation route is requested.
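The six steps can be read as a single processing loop. A condensed sketch follows; every helper named here (match_best_road, judge_yaw, etc.) stands for a subsystem described elsewhere in this text and is assumed, not defined by the patent:

```python
def yaw_recognition_loop(nav):
    """nav bundles the positioning source, road network, route and reroute API."""
    while nav.active:
        fix = nav.get_positioning()                         # step A
        road = match_best_road(fix, nav.road_network)       # step B: map matching
        scene = classify_scene(road)
        match_pt = match_on_route(fix, nav.route)           # step C: route matching
        guide = guiding_direction_at(match_pt, nav.route)
        if scene != "intersection":                         # step D
            continue
        feats = extract_top_features(fix, nav.route, road)  # step E
        if not judge_yaw(guide, feats):
            continue
        if suppress_yaw(build_context(nav, fix, road)):     # step F
            continue
        nav.request_reroute(fix.position, nav.destination)  # re-plan the route
```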
Based on the above steps, in an exemplary embodiment of the present application, taking a case where the navigation terminal is a vehicle terminal as an example, as shown in fig. 3, the yaw recognition system may include: the system comprises a vehicle positioning module, a map data module, a map matching module, a navigation matching module, a yaw identification module and a yaw error suppression module. The processing of each module is described in detail below:
In some alternative embodiments, the vehicle positioning module is configured to acquire the GNSS positioning information of the navigation system, or fused positioning information (the GNSS positioning point fused with the sensor signals to obtain a final dead-reckoning point), and output the positioning point information at the current moment, which may include the longitude and latitude of the vehicle position, the time of the current positioning point, and the current speed information of the vehicle, including speed and direction.
Alternatively, when acquiring GNSS positioning information, the positioning point information may be obtained based on techniques such as ordinary GNSS positioning, Precise Point Positioning (PPP), and Real-Time Kinematic (RTK) positioning.
The general GNSS positioning technology is a positioning technology based on GNSS, and determines a position of a receiver by receiving GNSS satellite signals and measuring a distance between a satellite and the receiver. The PPP technology is a high-precision GNSS positioning technology, and utilizes a single GNSS receiver to realize millimeter-to-decimeter-level high-precision positioning based on carrier phase observation values by combining high-precision data such as precise ephemeris, satellite clock errors and the like. The RTK positioning technology is a high-precision positioning technology based on GNSS, and realizes centimeter-level high-precision positioning by processing carrier phase observation values between a reference station and a mobile station in real time.
Alternatively, sensor information of the vehicle may be obtained, which may include the results of the three-axis acceleration sensor, three-axis gyroscope and three-axis magnetic sensor of the navigation positioning device. If the navigation positioning device in the vehicle is a mobile phone, the three-axis definition of the phone is shown in fig. 4. Based on the sensor information, attitude estimation is carried out on the phone with an attitude and heading reference system (AHRS). The AHRS consists of a three-axis accelerometer, a three-axis magnetometer and a three-axis gyroscope; when the phone is placed stably, the AHRS can provide heading (yaw angle), roll (roll angle) and pitch (pitch angle) information for the device based on the sensor outputs.
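As a rough illustration of the static case only, the sketch below estimates pitch and roll from the gravity direction and a near-level heading from the magnetometer. It assumes an X-forward, Y-left, Z-up body frame in which a level device reads (0, 0, +g); axis and sign conventions vary between sensors, and a production AHRS would also tilt-compensate the magnetometer and fuse the gyroscope (e.g., with a complementary or Kalman filter):

```python
import math

def static_attitude(ax, ay, az, mx, my):
    """Rough attitude estimate for a stationary device, assuming an X-forward,
    Y-left, Z-up body frame where a level device reads (0, 0, +g); axis and
    sign conventions vary between sensors and must be checked per device."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))  # nose-up positive
    roll = math.degrees(math.atan2(ay, az))                   # left-side-up positive
    # Near-level heading from the horizontal magnetic field components; a full
    # AHRS would tilt-compensate these using pitch/roll before this step.
    yaw = math.degrees(math.atan2(my, mx)) % 360.0
    return pitch, roll, yaw
```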
If the navigation positioning device of the vehicle is integrated in the vehicle terminal, as shown in fig. 5, a three-dimensional coordinate system O-XYZ may be established with the center of the vehicle as the origin, the right side of the vehicle as the X axis, the forward direction as the Y axis, and the direction perpendicular to the vehicle plane, pointing upward, as the Z axis (the definition of the coordinate system is not limited; X forward, Y left, Z up is also possible). On this basis, rotation about the Z axis may be defined as the yaw angle, representing a change in the heading of the vehicle (its left-right heading direction); rotation about the X axis as the pitch angle, representing a change in the vehicle's pitch (its climbing or descending state); and rotation about the Y axis as the roll angle, representing the lateral inclination of the vehicle. The yaw, roll and pitch angles can also be represented by decomposing them into the X-Y, X-Z and Y-Z planes respectively, as shown in fig. 6.
In some alternative embodiments, the map data module is configured to obtain, according to the positioning information from the vehicle positioning module, local map information within a certain range around the current position. The obtained information includes road length, number and width of lanes, connectivity between roads, road shape-point representations, road attributes (overhead, ramp, main/auxiliary road, tunnel, etc.) and road grades (highway, provincial road, rural road, etc.); the map data is one of the important pieces of information on which the computation depends.
In some alternative embodiments, the map matching module is configured to obtain the local road-network data within a certain range from the map data module based on the GNSS positioning information, and to perform map matching by combining the GNSS positioning information, the sensor information and the local map data to obtain the optimal matching road. After the optimal matching road is obtained, special-area scenes can be recognized by judging the attributes of the current road (such as a tunnel, a service area, or a toll station), or from the relation between the current road and surrounding roads (such as main/auxiliary road areas at the same height nearby, or overhead areas at different heights nearby).
Alternatively, a variety of matching algorithms may be employed for map matching; for example, a hidden Markov model (HMM) may be used. An HMM describes a probability distribution over a sequence of unobservable states, where each hidden state emits an observation. In the map-matching problem, the HMM can match the observed position data to a path on the map. In one example, the map-matching process using an HMM may be as follows: set an initial-state probability matrix and a transition probability matrix, which describe the initial position on the map and the state-transition probabilities along the path; then compute, from the map data and the observation data, the probability of observing a particular position in each state, which can be realized by computing the distance and direction difference between each position on the map and the observed position; then use the forward algorithm to compute, given the observation sequence, the probability of reaching each state from the initial state, called the forward probability; further, from the forward probabilities and the transition probability matrix, compute the optimal path with the Viterbi algorithm, i.e. the path with the maximum probability of reaching the terminal state from the start state through a series of state transitions; and finally match the states of the optimal path to the paths on the map to obtain the final matching result.
More specifically, the observation (emission) probability can be defined by the distance and angle between the GNSS positioning point and each road: the smaller the distance to a road, the larger the probability, and the larger the distance, the smaller the probability; likewise, the larger the angle between the GNSS signal direction and a road, the smaller the probability, and the smaller the angle, the larger the probability. The transition probability can consider the connectivity between roads and how well the angle between connected roads agrees with the angle change of the sensor/GNSS signals: the better the agreement, the larger the transition probability, and the smaller otherwise. With the emission and transition probabilities, the current best-matching road can be obtained with the Viterbi algorithm, as sketched below.
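The following minimal sketch illustrates this emission/transition/Viterbi structure. The geometry helpers (`distance_to`, `angle_to`), the exponential decay scales and the connectivity table are all assumptions for illustration, not the patent's concrete parameters:

```python
import math

def viterbi_map_match(points, roads, connectivity):
    # Emission: probability decays with point-to-road distance and heading angle.
    def emission(pt, road):
        d = road.distance_to(pt)               # hypothetical geometry helper
        a = road.angle_to(pt.heading)          # hypothetical geometry helper
        return math.exp(-d / 30.0 - a / 45.0)  # assumed decay scales

    # delta[r]: best log-probability of a road sequence ending on road r so far.
    delta = {r: math.log(emission(points[0], r) + 1e-12) for r in roads}
    backpointers = []
    for pt in points[1:]:
        new_delta, back = {}, {}
        for r in roads:
            # Transition: connectivity gives P(prev -> r), near zero if unconnected.
            prev, score = max(((p, delta[p] + math.log(connectivity.get((p, r), 1e-9)))
                               for p in roads), key=lambda x: x[1])
            new_delta[r] = score + math.log(emission(pt, r) + 1e-12)
            back[r] = prev
        delta = new_delta
        backpointers.append(back)
    # Backtrack from the best final road to recover the matched road sequence.
    path = [max(delta, key=delta.get)]
    for back in reversed(backpointers):
        path.append(back[path[-1]])
    return list(reversed(path))
```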
In some alternative embodiments, the navigation matching module matches positioning points to the given navigation route, based primarily on the GNSS positioning signals and the sensor information. As shown in fig. 7, a navigation route can be understood as a series of connected line segments joining the start point and the end point. Matching against the navigation route is similar to the map-matching process and can likewise use an HMM; the difference is that the navigation route offers a unique road choice, so no probabilistic branching is needed. After the current route-matching result is obtained, a section of the navigation route can be intercepted around the current matching point to identify the navigation guidance direction of the current area.
Alternatively, as shown in fig. 8, the navigation guidance directions of the current area may be classified into three types: straight ahead, turning (left turn, right turn) and u-turn (left u-turn, right u-turn, collinear u-turn). This basically covers all navigation intersection scenes, but the directions can also be defined more finely, such as front-right turn, rear-left turn, rear-right turn, etc. A sketch of such a classification follows.
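As an illustration only, the guidance direction can be derived from the cumulative signed heading change along the intercepted route section; the 30- and 135-degree cut-offs below are assumed, not taken from the embodiment:

```python
import math

def turn_angle(p0, p1, p2):
    # Signed heading change (degrees) at vertex p1; positive = right turn
    # (points are (east, north) pairs, heading measured clockwise from north).
    h1 = math.atan2(p1[0] - p0[0], p1[1] - p0[1])
    h2 = math.atan2(p2[0] - p1[0], p2[1] - p1[1])
    return (math.degrees(h2 - h1) + 180.0) % 360.0 - 180.0

def guidance_direction(route_points):
    total = sum(turn_angle(p0, p1, p2)
                for p0, p1, p2 in zip(route_points, route_points[1:], route_points[2:]))
    if abs(total) < 30.0:
        return "straight"
    if abs(total) < 135.0:
        return "right-turn" if total > 0 else "left-turn"
    return "right-u-turn" if total > 0 else "left-u-turn"
```

For example, `guidance_direction([(0, 0), (0, 10), (10, 10)])` returns `"right-turn"`, since the section heads north and then bends 90 degrees to the east.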
In some alternative embodiments, the yaw recognition module is divided primarily into the following sections:
Part 1: model pre-training to obtain high-importance features
Firstly, based on the map-matching and navigation-matching results in the above embodiments, multi-dimensional features can be extracted as far as possible for model training. The embodiments of the present application do not particularly limit the features; any obtainable feature may be extracted, such as the speed of the positioning point, the direction of the positioning point, the distance from the positioning point to the map-matching point, the road grade/road width of the map-matched road, the distance from the positioning point to the navigation-matching point, the local shape of the navigation route, the nearest distance from the positioning point to the full navigation route, the direction of the navigation-matching point, the cumulative change of the heading (yaw) angle obtained by the sensor within a certain time window, the angle between the positioning signal and the map-matched road, the angle between the positioning signal and the navigation-matched segment, the quality evaluation score of the signal, the scene judgment result, the distance from the map-matching point to the route-matching point, and so on.
Then, based on the extracted features and the truth values (in this embodiment, whether the vehicle yawed or not), model training is performed. No specific requirements are placed on the learning model: neural networks such as CNN or RNN may be used, as may conventional SVM, Bayesian or decision-tree models (such as XGBoost or GBDT). When the model is trained to a reasonably ideal state, feature-importance analysis is performed on it to obtain the importance ranking of each feature in the model, so that the valuable features can be identified. In one example, as shown in fig. 9, the importance scores of the features may be displayed, and the top-N features (where N may be customized as needed) may be selected according to the feature-importance analysis, thereby realizing the construction of the yaw recognition model.
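A sketch of this pre-training stage is given below, using the gradient-boosted-tree route mentioned above. The feature names and the random stand-in data are placeholders for real historical navigation data, not values from the embodiment:

```python
# Train a gradient-boosted classifier on multi-dimensional features labelled
# yaw / no-yaw, then rank the features by importance and keep the top N.
import numpy as np
from xgboost import XGBClassifier

feature_names = ["d1", "d2", "d3", "d4", "a1", "a2", "a3", "y", "q", "v"]
X = np.random.rand(2000, len(feature_names))   # stand-in for extracted features
labels = np.random.randint(0, 2, 2000)         # truth values: 1 = yawed, 0 = not

model = XGBClassifier(n_estimators=200, max_depth=5)
model.fit(X, labels)

N = 10
top_n = sorted(zip(feature_names, model.feature_importances_),
               key=lambda kv: kv[1], reverse=True)[:N]
print(top_n)  # these top-N features feed the hand-crafted recognizer below
```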
Part 2: designing the yaw recognition model based on the top-N features
In some alternative embodiments, assume that the top-N features obtained are: the distance from the positioning point to the map-matching point (d1); the distance from the positioning point to the navigation-matching point (d2); the nearest distance from the positioning point to the full navigation route (d3); the cumulative change of the heading (yaw) angle obtained by the sensor within a certain time window (y) (if the device has no sensor, or the mobile phone is in an unstable state, this value can be computed from the cumulative angle change of the GNSS signal); the angle between the positioning signal and the map-matched road (a1); the angle between the positioning signal and the navigation-matched segment (a2); the angle between the map-matched road and the navigation-matched segment (a3); the signal quality evaluation score (q), which may be a score from 0 to 1 where a larger score means better quality; the distance from the map-matching point to the route-matching point (d4); and the speed of the positioning point (v).
The above features can be roughly divided into the following categories:
Distance-related features: d1, d2, d3, d4;
Angle-related features: a1, a2, a3;
Sensor-related features: y;
Signal-related features: q and v.
Based on the above four types of features, a yaw recognition model with high interpretability can be designed, and one specific example is as follows:
In some alternative embodiments, new features d and a may be obtained from the distance-related and angle-related features described above, where d = MIN(d1, d2, d3, d4) and a = MIN(a1, a2, a3), MIN() denoting the minimum operation.
A distance yaw threshold dTh for making the navigation yaw determination may then be defined, where different distance yaw thresholds dTh may be used for different navigation guidance directions.
In some alternative embodiments, if the current navigation guidance direction is straight ahead, then if d ≤ dTh or a ≤ aTh, no yaw reminder is issued (i.e. yaw is not determined) and the process exits directly; here dTh and aTh are the preset minimum thresholds for issuing a yaw reminder in the straight-ahead scene. Alternatively, the threshold dTh may be adjusted according to the angle difference: the specific formula (rendered as an image in the original) adjusts dTh as a function of the angle-class feature value using three constant parameters.
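A sketch of this straight-ahead branch is given below. Since the patent's adjustment formula is available only as an image, the exponential form, the constants c1 to c3 and the direction of the adjustment are assumptions for illustration; only the surrounding decision rule (either feature within its threshold suppresses the yaw reminder) comes from the text:

```python
import math

def straight_no_yaw(d, a, d_th, a_th, c1=1.0, c2=0.5, c3=0.05):
    # Assumed adjustment: the distance threshold varies smoothly with the
    # angle-class feature a via constants c1..c3 (the original formula is an
    # image in the source; this exponential form is illustrative only).
    d_th_adj = d_th * (c1 + c2 * math.exp(-c3 * a))
    # Rule from the text: either feature within threshold -> no yaw reminder.
    return d <= d_th_adj or a <= a_th
```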
In some alternative embodiments, if the current navigation guidance direction is turning (left or right), it is judged whether the turning trend of the sensor/GNSS signals is consistent with the navigation route; if so, no yaw reminder is issued. Alternatively, the definition of consistency here may be set according to actual needs: for example, if the planned route turns left/right and the cumulative turn of the sensor signal or GNSS signal to the same side exceeds a set angle threshold (rendered as an image in the original; 30 degrees, for example), no yaw reminder is issued and the process exits directly.
In some alternative embodiments, if the current navigation guidance direction is turning (left or right), it may be judged whether the GNSS signal speed shows a slowing trend; if so, the vehicle is considered to be still in the turning process and the turning action is not yet complete, so no yaw reminder is issued for the time being and the process exits directly.
In some alternative embodiments, if the current navigation guidance direction is turning (left or right) and the vehicle is judged, based on the map-matching result, to be in the intersection area, no yaw reminder is issued for the time being; the process exits directly and waits for the vehicle to leave the intersection area.
In some alternative embodiments, if the current navigation guidance direction is u-turn, it is judged whether the steering trend of the sensor/GNSS signals is consistent with the navigation route; if so, no yaw reminder is issued. Alternatively, the definition of consistency here may be set according to actual needs: for example, if the planned route turns left/right and the cumulative turn of the sensor signal or GNSS signal to the same side exceeds a set angle threshold (rendered as an image in the original; 120 degrees, for example), no yaw reminder is issued and the process exits directly.
In some alternative embodiments, if the current navigation guidance direction is u-turn, it may be judged whether the GNSS signal speed shows a slowing trend; if so, the vehicle is considered to be still in the turning process and the turning action is not yet complete, so no yaw reminder is issued for the time being and the process exits directly.
In some alternative embodiments, if the current navigation guidance direction is u-turn and the vehicle is judged, based on the map-matching result, to be in the intersection area, no yaw reminder is issued for the time being; the process exits directly and waits for the vehicle to leave the intersection area. These turn/u-turn suppression conditions are sketched together below.
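The sketch below combines the turn and u-turn branches, assuming signed cumulative turn angles (right positive) and the 30/120-degree example thresholds from the text; the deceleration test is a simplistic stand-in for a proper trend estimate:

```python
def turn_no_yaw(direction, planned_side, cum_turn_deg, speeds, in_intersection):
    """direction: 'turn' or 'u-turn'; planned_side: 'left' or 'right';
    cum_turn_deg: cumulative sensor/GNSS turn, signed, right positive;
    speeds: recent GNSS speed samples, oldest first."""
    threshold = 120.0 if direction == "u-turn" else 30.0  # example values from the text
    same_side = (cum_turn_deg > 0) == (planned_side == "right")
    if same_side and abs(cum_turn_deg) >= threshold:
        return True          # steering trend consistent with the planned route
    if len(speeds) >= 2 and speeds[-1] < speeds[0]:
        return True          # still decelerating: turn not yet complete
    return in_intersection   # still inside the intersection area: wait
```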
In some alternative embodiments, the distance threshold may be adjusted based on signal characteristics or scene characteristics; the specific adjustment logic is not particularly limited in the embodiments of the present application. In one example, the distance threshold dTh may be adjusted based on the GNSS signal quality score q; the formula (rendered as an image in the original) enlarges the distance threshold according to the signal quality by means of a preset parameter.
In another example of the present application, if it is judged from the map-matching result that the vehicle is currently in a complex scene such as an overhead road or parallel roads, the distance threshold dTh may also be adjusted; the formula (rendered as an image in the original) increases the yaw threshold by means of a preset parameter, improving accuracy by reducing the yaw sensitivity.
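Since both adjustment formulas exist only as images in the source, the multiplicative forms and the parameters k_q and k_s below are assumptions that merely reproduce the stated intent (worse quality or a more complex scene enlarges the threshold):

```python
def adjust_distance_threshold(d_th, q, complex_scene, k_q=1.0, k_s=1.5):
    # Assumed form: a lower quality score q (0..1) enlarges the threshold.
    d_th = d_th * (1.0 + k_q * (1.0 - q))
    # Assumed form: complex scenes (overhead/parallel roads) reduce yaw
    # sensitivity by further enlarging the threshold.
    if complex_scene:
        d_th = d_th * k_s
    return d_th
```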
In some alternative embodiments, if d is greater than dTh and a is greater than aTh, a yaw reminder is issued and the flow enters the yaw error suppression module; otherwise the process exits and no yaw reminder is issued.
In some alternative embodiments, the yaw error suppression module further verifies the yaw result identified by the yaw recognition module to ensure that the final yaw result is more accurate. The main idea is to add a series of suppression logics; hitting any one of them vetoes the yaw identified this time. The specific content of these suppression logics is not limited in the embodiments of the present application; some reference examples are given below:
If the speed v of the positioning point is less than vTh (a set low-speed threshold), no yaw reminder is issued. In special scenes, for example if the map-matched road lies in a tunnel, a parking lot, an intersection or the like, no yaw reminder is issued. If the signal quality is poor, i.e. q < qTh (a set quality threshold), no yaw reminder is issued. If the steering trend of the positioning signal is inconsistent with that of the sensor, no yaw reminder is issued. The matching degree between the current signal and the current map-matched road is judged, and if the matching degree is low, no yaw reminder is issued. Whether the signal shows a trend of moving back toward the navigation route is judged, and if so, no yaw reminder is issued.
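These checks compose naturally into a single veto function. The sketch below uses illustrative threshold values; vTh, qTh and the match-degree cut-off are not specified numerically in the text:

```python
def yaw_vetoed(v, q, scene, signal_turn, sensor_turn, match_degree, moving_to_route,
               v_th=2.0, q_th=0.3, match_th=0.5):
    # Any hit vetoes the yaw reminder issued by the recognition module.
    return (v < v_th                                    # too slow to judge reliably
            or scene in ("tunnel", "parking_lot", "intersection")
            or q < q_th                                 # poor positioning quality
            or (signal_turn > 0) != (sensor_turn > 0)   # signal vs sensor disagree
            or match_degree < match_th                  # weak road-match confidence
            or moving_to_route)                         # drifting back to the route
```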
If the yaw reminder issued by the yaw recognition module is suppressed in the yaw error suppression module, no yaw reminder is issued and the process exits directly; otherwise the yaw determination is deemed successful, the yaw reminder is issued, and re-planning of the navigation route is requested.
In summary, the technical solution of the embodiments of the present application provides a method for quickly recognizing yaw: important features are identified with a pre-trained model, high-importance features are extracted from the map-matching and navigation-matching results to design the yaw judgment algorithm, and some special scene conditions are then considered for erroneous-yaw suppression protection, thereby achieving fast and accurate yaw recognition and improving the user's navigation experience.
Compared with schemes that judge directly with a machine learning model, the technical solution of the embodiments of the present application uses the feature-importance analysis obtained from feature learning to extract the most effective key features from the multi-dimensional information and to perform highly interpretable algorithmic judgment. On the one hand, since no model needs to be loaded or run, the performance cost on the device (mobile phone, in-vehicle terminal, etc.), including memory and CPU usage, is reduced; on the other hand, by extracting the high-importance features of the model and designing a highly interpretable recognition algorithm, the recognition accuracy is not inferior to, and may even exceed, that of the machine-learning-model scheme. Meanwhile, compared with pure road-matching schemes, introducing the high-importance features of the pre-trained model makes it possible to identify the key components of the current information more effectively, design the algorithm efficiently, and achieve a better recognition effect.
The following describes an embodiment of the apparatus of the present application that may be used to perform the yaw recognition method of the above-described embodiment of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the yaw recognition method of the present application.
Fig. 10 shows a block diagram of a yaw recognition device according to an embodiment of the present application, which may be applied to an electronic device having computing and processing functions, such as a navigation device that performs a navigation function (e.g., a vehicle terminal or a mobile terminal), or a server that communicates with the navigation device.
Referring to fig. 10, a yaw recognition apparatus 1000 according to an embodiment of the present application includes: an acquisition unit 1002, an identification unit 1004, a determination unit 1006, and a processing unit 1008.
The acquiring unit 1002 is configured to acquire a positioning signal of a navigation terminal, where the positioning signal includes a positioning position of the navigation terminal; the identifying unit 1004 is configured to identify a road scene where the navigation terminal is located according to the positioning position, and match a position point corresponding to the positioning position on a set navigation route; the determining unit 1006 is configured to identify a guiding direction for the navigation terminal according to the location point and the navigation route; the processing unit 1008 is configured to identify a yaw condition of the navigation terminal according to the guiding direction and the set navigation feature if the road scene where the navigation terminal is located is an intersection area.
In some embodiments of the present application, based on the foregoing scheme, the identifying unit 1004 is configured to: acquire road-network data at the positioning position from the map data according to the positioning position, the road-network data comprising road-section data; match, in the road-network data, the target road section where the positioning position is located; and identify, according to the target road section, the road scene where the navigation terminal is located.
In some embodiments of the present application, based on the foregoing aspects, the identifying unit 1004 identifies, according to the target road segment, a road scene in which the navigation terminal is located, including at least one of the following manners: identifying a road scene where the navigation terminal is located according to the attribute data of the target road section, wherein the attribute data of the target road section is used for representing the road type of the target road section; and identifying the road scene where the navigation terminal is located according to the relation between the target road section and the adjacent road section.
In some embodiments of the present application, based on the foregoing scheme, the determining unit 1006 is configured to: intercept a route of a set length on the navigation route according to the position of the position point on the navigation route, the route of the set length comprising the position point; and identify a guiding direction for the navigation terminal according to the route of the set length, the guiding direction comprising one or more of straight ahead, turning and u-turn.
In some embodiments of the application, based on the foregoing, the navigation features include distance-type features and angle-type features, and the processing unit 1008 is configured to: if the guiding direction is straight ahead, determine that the navigation terminal has not yawed when the distance-type feature is smaller than or equal to a distance threshold, or when the angle-type feature is smaller than or equal to an angle threshold.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to adjust the size of the distance threshold according to the angle-type feature.
In some embodiments of the application, based on the foregoing, the navigation features include distance-type features and angle-type features, and the processing unit 1008 is configured to: if the guiding direction is turning or u-turn, determine that the navigation terminal has not yawed when at least one of the following conditions is met: the steering trend of the sensor is consistent with the steering trend of the navigation route, the sensor being a sensor arranged on the navigation terminal and capable of outputting a direction signal; the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route; the navigation terminal shows a deceleration trend; or the navigation terminal is located in the intersection area.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to: determine that the steering trend of the sensor is consistent with the steering trend of the navigation route if the accumulated steering angle of the sensor in the intersection area falls within a set angle interval; and determine that the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route if the accumulated steering angle of the positioning signal of the navigation terminal in the intersection area falls within a set angle interval.
In some embodiments of the application, based on the foregoing, the navigation features include distance-type features and angle-type features, and the processing unit 1008 is configured to: if the distance-type feature is larger than a distance threshold and the angle-type feature is larger than an angle threshold, determine that the navigation terminal is recognized as having yawed.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to perform at least one of the following: adjust the size of the distance threshold according to the quality of the positioning signal, the quality of the positioning signal being inversely correlated with the size of the distance threshold;
and adjust the size of the distance threshold according to the road scene, the complexity of the road scene being positively correlated with the size of the distance threshold.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to: acquire multi-dimensional features of the navigation terminal in the navigation process; acquire a set number of features from the multi-dimensional features in order of importance from high to low; and generate the distance-type features and the angle-type features from the set number of features.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is configured to: acquire the features belonging to the distance type from the set number of features and take the minimum value among them as the distance-type feature; and acquire the features belonging to the angle type from the set number of features and take the minimum value among them as the angle-type feature.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to: take the multi-dimensional features contained in historical navigation data as sample data and the yaw conditions corresponding to the historical navigation data as sample labels to train a machine learning model; and determine the importance of the multi-dimensional features based on the trained machine learning model.
In some embodiments of the present application, based on the foregoing, the processing unit 1008 is further configured to: if yaw of the navigation terminal is recognized according to the guiding direction and the set navigation features, determine that the navigation terminal has not actually yawed when at least one of the following conditions is met:
The running speed of the navigation terminal is smaller than or equal to a set speed threshold value;
The navigation terminal is positioned in a set road scene;
The quality of the positioning signal of the navigation terminal is smaller than or equal to a set quality threshold;
The steering trend of the positioning signal of the navigation terminal is inconsistent with the steering trend of a sensor arranged on the navigation terminal;
The matching degree between the positioning position of the navigation terminal and the target road section matched in the road network data is smaller than or equal to a set precision threshold value;
the positioning position of the navigation terminal moves towards the navigation route.
Fig. 11 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
It should be noted that, the computer system 1100 of the electronic device shown in fig. 11 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 11, the computer system 1100 may include a Central Processing Unit (CPU) 1101, which may perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1102 or a program loaded from a storage portion 1108 into a Random Access Memory (RAM) 1103. In the RAM 1103, various programs and data required for system operation are also stored. The CPU 1101, the ROM 1102 and the RAM 1103 are connected to each other by a bus 1104. An Input/Output (I/O) interface 1105 is also connected to the bus 1104.
The following components may be connected to the I/O interface 1105: an input portion 1106 including a keyboard, a mouse, and the like; an output portion 1107 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a speaker, and the like; a storage portion 1108 including a hard disk or the like; and a communication portion 1109 including a network interface card such as a Local Area Network (LAN) card or a modem. The communication portion 1109 performs communication processing via a network such as the Internet. A drive 1110 is also connected to the I/O interface 1105 as needed. A removable medium 1111, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 1110 as needed, so that a computer program read therefrom is installed into the storage portion 1108 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program being configured to perform the method shown in the flowchart. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 1109, and/or installed from the removable medium 1111. When executed by the Central Processing Unit (CPU) 1101, the computer program performs the various functions defined in the system of the present application.
It should be noted that the computer-readable medium shown in the embodiments of the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that contains or stores a computer program for use by or in connection with an instruction execution system, apparatus or device. In the present application, by contrast, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with a computer-readable program embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium that can send, propagate or transport a program for use by or in connection with an instruction execution system, apparatus or device. A computer program embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams or flowchart illustrations, and combinations of blocks therein, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer programs.
The units involved in the embodiments of the present application may be implemented in software or in hardware, and the described units may also be provided in a processor. In some cases, the names of the units do not constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more computer programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB disk, a removable hard disk, etc.) or on a network, and which comprises several instructions for causing an electronic device to perform the method according to the embodiments of the present application. For example, the electronic device may perform the yaw recognition method shown in fig. 2.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (15)

1. A yaw recognition method, comprising:
Acquiring a positioning signal of a navigation terminal, wherein the positioning signal comprises a positioning position of the navigation terminal;
Identifying a road scene where the navigation terminal is located according to the positioning position, and matching a position point corresponding to the positioning position on a set navigation route;
identifying a guiding direction for the navigation terminal according to the position points and the navigation route;
If the road scene where the navigation terminal is located is an intersection area, identifying the yaw condition of the navigation terminal according to the guiding direction and the set navigation characteristics; the navigation features include distance-type features and angle-type features;
Wherein, according to the guiding direction and the set navigation characteristic, the yaw condition of the navigation terminal is identified, comprising: if the distance type feature is larger than a distance threshold value and the angle type feature is larger than an angle threshold value, determining that the navigation terminal is identified to yaw;
the yaw recognition method further includes: taking multi-dimensional characteristics contained in historical navigation data of the navigation terminal as sample data and taking yaw conditions corresponding to the historical navigation data as sample labels so as to train a machine learning model; determining importance of the multi-dimensional feature based on the trained machine learning model;
acquiring multidimensional features of the navigation terminal in the navigation process; acquiring a set number of features from the multi-dimensional features of the navigation terminal in the navigation process according to the sequence from high to low of the importance of the multi-dimensional features; and generating the distance type features and the angle type features according to the set number of features.
2. The yaw recognition method of claim 1, wherein recognizing a road scene in which the navigation terminal is located based on the positioning position, comprises:
Acquiring road network data at the positioning position from map data according to the positioning position, wherein the road network data comprises road section data;
Matching a target road section where the positioning position is located in the road network data;
and identifying the road scene where the navigation terminal is located according to the target road section.
3. The yaw recognition method of claim 2, wherein recognizing a road scene in which the navigation terminal is located from the target link includes at least one of:
Identifying a road scene where the navigation terminal is located according to the attribute data of the target road section, wherein the attribute data of the target road section is used for representing the road type of the target road section;
and identifying the road scene where the navigation terminal is located according to the relation between the target road section and the adjacent road section.
4. The yaw recognition method of claim 1, wherein recognizing a guiding direction for the navigation terminal based on the location point and the navigation route includes:
Intercepting a route with a set length on the navigation route according to the position of the position point on the navigation route, wherein the route with the set length comprises the position point;
And identifying a guiding direction for the navigation terminal according to the route with the set length, wherein the guiding direction comprises one or more of straight running, steering and turning around.
5. The yaw recognition method of claim 1, wherein recognizing a yaw condition of the navigation terminal according to the guiding direction and the set navigation feature includes:
and if the guiding direction is straight, determining that the navigation terminal does not yaw when the distance class feature is smaller than or equal to a distance threshold value or when the angle class feature is smaller than or equal to an angle threshold value.
6. The yaw identification method of claim 5, further comprising:
And adjusting the size of the distance threshold according to the angle class characteristics.
7. The yaw recognition method of claim 1, wherein recognizing a yaw condition of the navigation terminal according to the guiding direction and the set navigation feature includes: if the guiding direction is steering or turning around, determining that the navigation terminal does not yaw when at least one of the following conditions is met:
the steering trend of the sensor is consistent with the steering trend of the navigation route, and the sensor is a sensor which is arranged on the navigation terminal and can output a direction signal;
The steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route;
The navigation terminal has a deceleration trend;
the navigation terminal is located in the intersection area.
8. The yaw identification method of claim 7, wherein the yaw identification method further comprises:
If the accumulated steering angle of the sensor in the intersection area is in a set angle interval, determining that the steering trend of the sensor is consistent with the steering trend of the navigation route;
If the accumulated steering angle of the positioning signal of the navigation terminal in the intersection area is within a set angle interval, determining that the steering trend of the positioning signal of the navigation terminal is consistent with the steering trend of the navigation route.
9. The yaw identification method of claim 1, further comprising at least one of:
Adjusting the size of the distance threshold according to the quality of the positioning signal, wherein the quality of the positioning signal and the size of the distance threshold form an inverse correlation relation;
And adjusting the size of the distance threshold according to the road scene, wherein the complexity of the road scene and the size of the distance threshold form a positive correlation.
10. The yaw recognition method of claim 1, wherein generating the distance-class feature and the angle-class feature from the set number of features comprises:
acquiring features belonging to a distance type from the set number of features, and taking the minimum value of the features belonging to the distance type as the distance type feature;
and acquiring the features belonging to the angle type from the set number of features, and taking the minimum value in the features belonging to the angle type as the angle type feature.
11. The yaw identification method of any one of claims 1 to 10, further comprising: if the yaw of the navigation terminal is recognized according to the guiding direction and the set navigation characteristics, determining that the yaw of the navigation terminal does not occur actually when at least one of the following conditions is met:
The running speed of the navigation terminal is smaller than or equal to a set speed threshold value;
The navigation terminal is positioned in a set road scene;
The quality of the positioning signal of the navigation terminal is smaller than or equal to a set quality threshold;
The steering trend of the positioning signal of the navigation terminal is inconsistent with the steering trend of a sensor arranged on the navigation terminal;
The matching degree between the positioning position of the navigation terminal and the target road section matched in the road network data is smaller than or equal to a set precision threshold value;
the positioning position of the navigation terminal moves towards the navigation route.
12. A yaw recognition apparatus, comprising:
The acquisition unit is configured to acquire a positioning signal of the navigation terminal, wherein the positioning signal comprises a positioning position of the navigation terminal;
the identification unit is configured to identify a road scene where the navigation terminal is located according to the positioning position and match a position point corresponding to the positioning position on a set navigation route;
A determination unit configured to identify a guiding direction for the navigation terminal based on the location point and the navigation route;
The processing unit is configured to identify the yaw condition of the navigation terminal according to the guiding direction and the set navigation characteristics if the road scene where the navigation terminal is located is an intersection area; the navigation features include distance-type features and angle-type features;
wherein the processing unit is configured to: if the distance type feature is larger than a distance threshold value and the angle type feature is larger than an angle threshold value, determining that the navigation terminal is identified to yaw;
The processing unit is further configured to: taking multi-dimensional characteristics contained in historical navigation data of the navigation terminal as sample data and taking yaw conditions corresponding to the historical navigation data as sample labels so as to train a machine learning model; determining importance of the multi-dimensional feature based on the trained machine learning model; acquiring multidimensional features of the navigation terminal in the navigation process; acquiring a set number of features from the multi-dimensional features of the navigation terminal in the navigation process according to the sequence from high to low of the importance of the multi-dimensional features; and generating the distance type features and the angle type features according to the set number of features.
13. A computer readable medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the yaw identification method of any one of claims 1 to 11.
14. An electronic device, comprising:
One or more processors;
a memory for storing one or more computer programs that, when executed by the one or more processors, cause the electronic device to implement the yaw recognition method of any one of claims 1 to 11.
15. A computer program product, characterized in that it comprises a computer program stored in a computer readable storage medium, from which a processor of an electronic device reads and executes the computer program, causing the electronic device to perform the yaw recognition method of any one of claims 1 to 11.
CN202410168273.0A 2024-02-06 2024-02-06 Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment Active CN117705141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410168273.0A CN117705141B (en) 2024-02-06 2024-02-06 Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410168273.0A CN117705141B (en) 2024-02-06 2024-02-06 Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN117705141A CN117705141A (en) 2024-03-15
CN117705141B true CN117705141B (en) 2024-05-07

Family

ID=90162897

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410168273.0A Active CN117705141B (en) 2024-02-06 2024-02-06 Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN117705141B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105890606A (en) * 2016-03-31 2016-08-24 百度在线网络技术(北京)有限公司 Navigation route recognition method and device
CN107228677A (en) * 2016-03-23 2017-10-03 腾讯科技(深圳)有限公司 Driftage recognition methods and device
CN108021984A (en) * 2016-11-01 2018-05-11 第四范式(北京)技术有限公司 Determine the method and system of the feature importance of machine learning sample
CN110006442A (en) * 2019-04-17 2019-07-12 北京百度网讯科技有限公司 Air navigation aid, device, equipment and medium
CN110702135A (en) * 2019-10-14 2020-01-17 广州小鹏汽车科技有限公司 Navigation method and device for vehicle, automobile and storage medium
CN111044056A (en) * 2018-10-15 2020-04-21 华为技术有限公司 Positioning method based on road matching, chip subsystem and electronic equipment
CN111811533A (en) * 2020-07-06 2020-10-23 腾讯科技(深圳)有限公司 Yaw determination method and device and electronic equipment
CN115585820A (en) * 2022-09-23 2023-01-10 阿里巴巴(中国)有限公司 Yaw guiding method, device, electronic equipment and computer program product
CN116972863A (en) * 2023-02-17 2023-10-31 腾讯科技(深圳)有限公司 Route yaw identification method, device, equipment, storage medium and product
CN116989816A (en) * 2023-09-05 2023-11-03 腾讯科技(深圳)有限公司 Yaw identification method and device and electronic equipment

Also Published As

Publication number Publication date
CN117705141A (en) 2024-03-15

Similar Documents

Publication Publication Date Title
Suhr et al. Sensor fusion-based low-cost vehicle localization system for complex urban environments
CN110686686B (en) System and method for map matching
Hashemi et al. A critical review of real-time map-matching algorithms: Current issues and future directions
EP3936822B1 (en) Vehicle positioning method and apparatus, and vehicle, and storage medium
CN101964941A (en) Intelligent navigation and position service system and method based on dynamic information
CN102208013A (en) Scene matching reference data generation system and position measurement system
CN104819726A (en) Navigation data processing method, navigation data processing device and navigation terminal
CN101571400A (en) Embedded onboard combined navigation system based on dynamic traffic information
CN106469505B (en) Floating car track deviation rectifying method and device
CN109515439B (en) Automatic driving control method, device, system and storage medium
CN110389995B (en) Lane information detection method, apparatus, device, and medium
CN111721306B (en) Road matching method and device, electronic equipment and readable storage medium
CN113447033A (en) Lane-level map matching method and system
CN112197780B (en) Path planning method and device and electronic equipment
EP3968609A1 (en) Control method, vehicle, and server
US11579628B2 (en) Method for localizing a vehicle
CN111401255B (en) Method and device for identifying bifurcation junctions
CN102082996A (en) Self-locating mobile terminal and method thereof
CN112633812B (en) Track segmentation method, device, equipment and storage medium for freight vehicle
CN113742437A (en) Map updating method and device, electronic equipment and storage medium
CN117705141B (en) Yaw recognition method, yaw recognition device, computer readable medium and electronic equipment
CN116793378A (en) Tunnel detection method and device, electronic equipment and storage medium
US20220026237A1 (en) Production of digital road maps by crowdsourcing
Xi et al. Map matching algorithm and its application
CN115112125A (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant