CN113033463A - Deceleration strip detection method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113033463A (application CN202110384644.5A)
- Authority
- CN
- China
- Prior art keywords
- data
- sub
- vehicle
- state
- period
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Abstract
The embodiment of the application discloses a deceleration strip detection method and device, electronic equipment and a storage medium, relating to the fields of traffic, maps, positioning, artificial intelligence, big data, cloud technology and the like. The method comprises the following steps: acquiring vehicle data corresponding to a current time period of a vehicle, wherein the vehicle data comprises driving data corresponding to at least one sub-period, and the driving data comprises inertial measurement unit (IMU) data and wheel speed data; inputting the driving data into a classification model, and classifying the driving data through the classification model to obtain a classification result corresponding to each sub-period; determining the driving state corresponding to each sub-period based on the corresponding classification result; and determining the driving state of the vehicle based on the driving state corresponding to each sub-period, wherein the driving state is a first state or a second state, the first state indicating that the vehicle passes through a deceleration strip and the second state indicating that it does not. This scheme improves the accuracy of determining whether the vehicle passes through a deceleration strip.
Description
Technical Field
The embodiment of the application relates to the technical fields of traffic, maps, positioning, artificial intelligence, big data and cloud, in particular to a deceleration strip detection method and device, electronic equipment and a storage medium.
Background
With the increase of people's travel demands, the accuracy of vehicle positioning has become more and more important. At present, vehicles are generally positioned using the Global Positioning System (GPS). Although GPS-based positioning meets most positioning requirements well, it may still fail to position a vehicle accurately in some scenes, for example when the GPS signal is weak or in other special situations.
To better meet actual requirements, other positioning methods have been proposed, for example positioning a vehicle based on deceleration strips. However, the effect of determining whether a vehicle passes through a deceleration strip is not yet ideal, and this determination still needs to be improved.
Disclosure of Invention
The embodiment of the application provides a deceleration strip detection method and device, electronic equipment and a storage medium, which improve the accuracy of determining whether a vehicle passes through a deceleration strip.
In one aspect, an embodiment of the present application provides a deceleration strip detection method, which includes:
acquiring vehicle data corresponding to a current time period of a vehicle, wherein the vehicle data comprises driving data corresponding to at least one sub-period, and the driving data comprises inertial measurement unit (IMU) data and wheel speed data;
inputting the driving data into a classification model, and classifying the driving data through the classification model to obtain a classification result corresponding to each sub-period, wherein, for one sub-period, the classification result corresponding to the sub-period represents the driving state of the vehicle in that sub-period;
determining the driving state corresponding to each sub-period based on the classification result corresponding to each sub-period;
and determining the driving state of the vehicle based on the driving state corresponding to each sub-period, wherein the driving state is a first state or a second state, the first state indicating that the vehicle passes through a deceleration strip and the second state indicating that the vehicle does not pass through the deceleration strip.
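Read as an algorithm, the four steps above amount to: classify each sub-period's driving data, then aggregate the per-sub-period states into a vehicle-level decision. The following is a minimal sketch only, with a stand-in `classify` function in place of the trained classification model; the function names, the count-based aggregation threshold, and the 0/1 encoding of the two states are illustrative assumptions rather than the patent's exact procedure:

```python
from typing import Callable, List

def detect_driving_state(
    sub_period_data: List[list],
    classify: Callable[[list], int],   # stand-in for the trained classification model
    min_consecutive: int = 2,          # assumed threshold; the patent uses a set time length
) -> int:
    """Classify each sub-period, then decide the vehicle-level driving state.

    Returns 1 (first state: the vehicle passed a deceleration strip) if enough
    consecutive sub-periods are classified as the first state, else 0.
    """
    states = [classify(d) for d in sub_period_data]
    run = best = 0
    for s in states:
        run = run + 1 if s == 1 else 0
        best = max(best, run)
    return 1 if best >= min_consecutive else 0
```

A count of consecutive sub-periods stands in here for the timestamp-based duration check described in the detailed embodiments below.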
In one aspect, the embodiment of the application provides a deceleration strip detection device, the device includes:
the vehicle data acquisition module is used for acquiring vehicle data corresponding to a current time period of a vehicle, wherein the vehicle data comprises driving data corresponding to at least one sub-period, and the driving data comprises inertial measurement unit (IMU) data and wheel speed data;
a classification result determining module, configured to input the driving data into a classification model, and classify the driving data through the classification model to obtain a classification result corresponding to each sub-period, where, for one sub-period, the classification result corresponding to the sub-period represents a driving state of the vehicle in the sub-period;
a driving state determining module, configured to determine a driving state corresponding to each sub-period based on a classification result corresponding to each sub-period;
the driving state determining module is configured to determine a driving state of the vehicle based on a driving state corresponding to each of the sub-periods, where the driving state is a first state or a second state, the first state indicates that the vehicle passes through a deceleration strip, and the second state indicates that the vehicle does not pass through the deceleration strip.
In a possible embodiment, the vehicle data further includes a time stamp of each sub-period, and the driving state determining module, when determining the driving state of the vehicle based on the driving state corresponding to each sub-period, is configured to:
determining each sub-period of which the corresponding driving state is the first state in each sub-period;
based on the time stamp of each sub-period in which the driving state is the first state, the driving state of each sub-period is adjusted as follows:
if the time length between any two first-state sub-periods is less than or equal to a first time length, determining the driving state of each sub-period between the two as the first state;
and determining the driving state of the vehicle based on the adjusted driving state corresponding to each sub-period.
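The gap-filling adjustment described above can be sketched as follows. Timestamps are assumed to be numeric (e.g. seconds), and states are encoded 1 for the first state and 0 for the second; the function name and encoding are illustrative assumptions:

```python
from typing import List

def fill_short_gaps(states: List[int], timestamps: List[float],
                    first_duration: float) -> List[int]:
    """If the time between two first-state sub-periods is <= first_duration,
    mark every sub-period in between as the first state as well."""
    states = list(states)
    idx = [i for i, s in enumerate(states) if s == 1]
    for a, b in zip(idx, idx[1:]):
        if timestamps[b] - timestamps[a] <= first_duration:
            for i in range(a + 1, b):
                states[i] = 1
    return states
```

For example, with a first time length of 3 seconds, two first-state sub-periods 3 seconds apart have the gap between them filled in, while a 7-second gap is left untouched.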
In a possible embodiment, when determining the driving state of the vehicle based on the driving state corresponding to each of the sub-periods, the driving state determination module is specifically configured to:
for a first target period in the adjusted sub-periods, determining the time length corresponding to the first target period according to the timestamp of each sub-period contained in the first target period, wherein the first target period is a run of at least two consecutive sub-periods whose driving state is the first state;
and if the time length corresponding to the first target period is less than or equal to a second time length, determining the driving state of the vehicle based on the driving states of all the sub-periods except those in the first target period.
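A sketch of this short-run filtering step, with illustrative names and the same 0/1 encoding as above. Note one assumption the patent does not fix: a run's duration is measured here from its first to its last timestamp, so a run consisting of a single sub-period has duration 0:

```python
from typing import List

def drop_short_runs(states: List[int], timestamps: List[float],
                    second_duration: float) -> List[int]:
    """Discard first-state runs whose total duration is <= second_duration
    (likely noise, e.g. a pothole, rather than a deceleration strip)."""
    states = list(states)
    i = 0
    while i < len(states):
        if states[i] == 1:
            j = i
            while j + 1 < len(states) and states[j + 1] == 1:
                j += 1                       # j is the last index of this run
            if timestamps[j] - timestamps[i] <= second_duration:
                for k in range(i, j + 1):
                    states[k] = 0            # zero out the too-short run
            i = j + 1
        else:
            i += 1
    return states
```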
In one possible embodiment, the acquisition time interval of the IMU data is a first time interval, the acquisition time interval of the wheel speed data is a second time interval, and if the first time interval is not equal to the second time interval, the vehicle data acquisition module is further configured to:
performing timestamp alignment on the IMU data and the wheel speed data based on the time length corresponding to either the first time interval or the second time interval, so as to obtain driving data with aligned timestamps;
the classification result determination module, when inputting the driving data into a classification model, is configured to:
and inputting the driving data with aligned timestamps into the classification model.
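The patent leaves the alignment method itself open. One plausible implementation, shown here purely as an assumed sketch, resamples the wheel-speed stream onto the IMU timestamps by linear interpolation, holding the boundary values where the streams do not overlap:

```python
from bisect import bisect_left
from typing import List

def align_wheel_speed(imu_ts: List[float], ws_ts: List[float],
                      ws_vals: List[float]) -> List[float]:
    """Linearly interpolate wheel-speed samples onto the IMU timestamps.

    ws_ts must be sorted ascending; boundary values are held constant.
    """
    out = []
    for t in imu_ts:
        j = bisect_left(ws_ts, t)
        if j == 0:
            out.append(ws_vals[0])           # before first wheel-speed sample
        elif j == len(ws_ts):
            out.append(ws_vals[-1])          # after last wheel-speed sample
        else:
            t0, t1 = ws_ts[j - 1], ws_ts[j]
            v0, v1 = ws_vals[j - 1], ws_vals[j]
            out.append(v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out
```

After this step every IMU sample has a wheel-speed value at the same timestamp, so the two streams can be stacked into one input vector per moment.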
In a possible embodiment, the apparatus further includes a time information determining module, configured to:
if the driving state of the vehicle is the first state, determining the time information of the vehicle passing through the deceleration strip based on the driving state corresponding to each sub-period and the timestamp of each sub-period, wherein the time information comprises at least one of the following items:
a start time;
an end time;
a passing duration.
In a possible embodiment, the vehicle data further includes initial position information of the vehicle, and the apparatus further includes a target position information determination module configured to:
if the driving state of the vehicle is the first state, acquiring map data within the preset range of the initial position information;
determining position information of a deceleration strip in the map data;
and determining the target position information of the vehicle according to the determined position information of the deceleration strip.
In a possible embodiment, the initial position information is the position information of the vehicle acquired at the time closest to the above time information.
In a possible embodiment, the classification model is obtained by a training module that trains a neural network model as follows:
acquiring a training data set, wherein the training data set comprises sample driving data with labels, and each label represents the real classification result corresponding to its sample driving data;
training the neural network model based on the sample driving data until the loss function corresponding to the neural network model converges, and taking the converged neural network model as the classification model;
the input of the neural network model is the sample driving data, the output is the predicted classification result corresponding to the sample driving data, and the value of the loss function represents the difference between the predicted classification result and the real classification result corresponding to the sample driving data.
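The training loop described above — minimise a loss measuring the gap between predicted and real classification results until the loss converges — can be illustrated with a toy logistic classifier standing in for the neural network. The patent's model is an encoder-decoder network; the classifier, learning rate, and convergence tolerance here are all illustrative assumptions:

```python
import math
from typing import List, Tuple

def train_until_converged(samples: List[List[float]], labels: List[int],
                          lr: float = 0.5, tol: float = 1e-9,
                          max_epochs: int = 10000) -> Tuple[List[float], float]:
    """Gradient descent on binary cross-entropy until the loss stops changing."""
    n, dim = len(samples), len(samples[0])
    w, b = [0.0] * dim, 0.0
    prev_loss = float("inf")
    for _ in range(max_epochs):
        loss, gw, gb = 0.0, [0.0] * dim, 0.0
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))       # predicted classification result
            loss -= y * math.log(p + 1e-12) + (1 - y) * math.log(1 - p + 1e-12)
            for i, xi in enumerate(x):
                gw[i] += (p - y) * xi            # gradient of the loss w.r.t. w
            gb += p - y
        loss /= n
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
        if abs(prev_loss - loss) < tol:          # loss has converged
            break
        prev_loss = loss
    return w, b

def predict(w: List[float], b: float, x: List[float]) -> int:
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
```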
In a possible embodiment, the driving state determining module is specifically configured to:
for a second target period in the sub-periods, determining the time length corresponding to the second target period according to the timestamp of each sub-period contained in the second target period, wherein the second target period is a run of at least two consecutive sub-periods whose driving state is the first state;
and if the time length corresponding to the second target period is greater than a set time length, determining that the driving state of the vehicle is the first state.
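This acceptance rule — the vehicle is deemed to have passed a deceleration strip only if some run of consecutive first-state sub-periods lasts longer than the set time length — can be sketched as follows (names and 0/1 state encoding are illustrative assumptions):

```python
from typing import List

def vehicle_passed_strip(states: List[int], timestamps: List[float],
                         set_duration: float) -> bool:
    """Return True if some run of consecutive first-state sub-periods
    lasts longer than set_duration."""
    i = 0
    while i < len(states):
        if states[i] == 1:
            j = i
            while j + 1 < len(states) and states[j + 1] == 1:
                j += 1                       # j is the last index of the run
            if timestamps[j] - timestamps[i] > set_duration:
                return True
            i = j + 1
        else:
            i += 1
    return False
```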
In one aspect, an embodiment of the present application provides an electronic device, which includes a processor and a memory, where the processor and the memory are connected to each other;
the memory is used for storing computer programs;
the processor is configured to invoke the computer program to perform the deceleration strip detection method provided in any optional embodiment above.
Optionally, the electronic device includes a vehicle-mounted terminal.
In one aspect, an embodiment of the present application provides a vehicle in which the above electronic device is installed.
In one aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the deceleration strip detection method provided in any one of the possible implementations above.
In one aspect, embodiments of the present application provide a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of an electronic device reads the computer instructions from the computer-readable storage medium and executes them, causing the electronic device to perform the method provided by any one of the possible embodiments of the deceleration strip detection method.
The scheme provided by the embodiment of the application has the beneficial effects that:
in an embodiment of the application, the method, the apparatus, the electronic device, and the storage medium for detecting a deceleration strip provided in the embodiment of the application obtain driving data of a vehicle in a current time period, where the driving data includes driving data corresponding to at least one sub-time period, then input the driving data into a classification model trained in advance, classify the driving data through the classification model to obtain a classification result corresponding to each sub-time period, determine a driving state corresponding to each sub-time period based on the classification result of each sub-time period, and then determine the driving state of the vehicle based on the driving state of each sub-time period, that is, determine whether the vehicle passes through the deceleration strip. By adopting the mode, when whether the vehicle passes through the deceleration strip or not is detected, the classification model can be used for classifying the driving data of the vehicle to obtain the classification result, whether the vehicle passes through the deceleration strip or not is determined based on the classification result, the mode that whether the vehicle passes through the deceleration strip or not is different from the mode that whether the vehicle passes through the deceleration strip or not is determined by adopting an image detection method or a laser radar detection method in the prior art, the used data is the driving data, the driving data is classified by adopting the classification model, compared with the mode that the image data is used, the influence of illumination is not limited, and compared with the mode that the laser radar is used for detection, the cost is low.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of a deceleration strip detection system to which an embodiment of the application is applied;
FIG. 2 is a schematic diagram of IMU data and wheel speed data provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of another deceleration strip detection system to which the embodiment of the application is applied;
FIG. 4 is a schematic flow chart of another alternative deceleration strip detection method provided in the embodiments of the present application;
FIG. 5 is a schematic diagram of driving data of a sub-period provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a driving state of a sub-period provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of another sub-period driving status provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a driving state of a further sub-period provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a driving state of a further sub-period provided by an embodiment of the present application;
fig. 10 is a schematic structural diagram of a deceleration strip detection device provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For a better understanding and explanation of the solutions provided in the embodiments of the present application, some related technical terms used in the embodiments are first described below:
the inertial measurement unit IMU, generally referred to as a combined unit consisting of 3 accelerometers and 3 gyroscopes, which are mounted on mutually perpendicular measurement axes of the vehicle, may be referred to as IMU data.
The deceleration strip detection method provided by the embodiment of the application relates to various fields of traffic, maps, positioning, artificial intelligence, big data and cloud technology, such as intelligent driving positioning in traffic, machine learning in artificial intelligence, cloud computing and cloud services in cloud technology, and related data computation and processing in the big data field. For example, the classification result corresponding to the driving data in the embodiment of the present application may be obtained by a classification model in the field of artificial intelligence, and the storage and processing of data (for example, vehicle data, sample driving data, and the like) may be realized by cloud technology.
Artificial Intelligence (AI) is a theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge and use that knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive technique of computer science that attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
Artificial intelligence technology is a comprehensive discipline covering a wide range of fields, involving both hardware-level and software-level technology. The artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems and mechatronics. Artificial intelligence software technology mainly comprises computer vision, speech processing, natural language processing, and machine learning/deep learning.
Machine Learning (ML) is a multi-domain interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory and other disciplines. It studies how a computer simulates or realizes human learning behavior to acquire new knowledge or skills, and how it reorganizes existing knowledge structures to continuously improve its own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent, and it is applied in all fields of artificial intelligence. Machine learning and deep learning generally include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from instruction.
Cloud technology is a hosting technology that unifies a series of resources such as hardware, software and network within a wide area network or a local area network to realize the computation, storage, processing and sharing of data. The deceleration strip detection method provided by the embodiment of the application can be realized based on cloud computing in cloud technology.
Cloud computing refers to obtaining required resources on demand and in an easily extensible manner through the network. It is a product of the development and fusion of traditional computing and network technologies such as grid computing, distributed computing, parallel computing, utility computing, network storage, virtualization and load balancing.
Artificial intelligence cloud service is also generally called AIaaS (AI as a Service). This is a service mode of an artificial intelligence platform: the AIaaS platform splits several types of common artificial intelligence services and provides independent or packaged services in the cloud, such as processing building-monitoring requests.
Big data refers to a data set that cannot be captured, managed and processed by conventional software tools within a certain time range; it is a massive, fast-growing and diversified information asset that requires new processing modes to deliver stronger decision-making power, insight discovery and process optimization. With the advent of the cloud era, big data has attracted more and more attention. The deceleration strip detection method provided by the embodiment relies on big-data technologies to be implemented effectively; technologies suitable for big data include massively parallel processing databases, data mining, distributed file systems, distributed databases, cloud computing and the like.
In some possible embodiments, the deceleration strip detection method can be applied to various scenes in which it is necessary to detect whether a vehicle passes through a deceleration strip and to perform vehicle navigation and positioning. For example, it can be used in scenes without a GPS signal or with a weak GPS signal, such as an underground parking lot, as well as in outdoor environments: the method detects whether a vehicle passes through a deceleration strip and then corrects the position of the vehicle based on the position of the strip. That is, the method does not limit the use environment; it can detect whether a vehicle passes through a deceleration strip in various environments and then position the vehicle using the position of the strip, which is not limited herein.
According to one aspect of the embodiments of the present application, a deceleration strip detection method is provided. For a better understanding of the solutions provided in the embodiments of the present application, an alternative embodiment is first described below with a specific example.
At present, in order to improve the accuracy of positioning a vehicle, a plurality of traffic elements that may affect positioning accuracy are used as references. For example, a deceleration strip can be used as a reference: the deceleration strip is identified and located, and the vehicle is then positioned based on it. Existing methods generally identify a deceleration strip in one of the following ways: processing an image through a visual target detection technology to determine whether a deceleration strip exists in the image and its position in the image; or judging, through a lidar target detection technology, whether a deceleration strip exists in a three-dimensional point cloud and its distance from the vehicle.
However, the above-described solutions of the prior art have the following problems:
when the vehicle passes over the deceleration strip, the strip is below the vehicle and outside the field of view of the image, so the time at which the vehicle passes the strip cannot be determined; moreover, image data is susceptible to illumination changes, and a reliable detection result cannot be produced in dark scenes.
Point cloud data scanned by a lidar is sparse, so the scanned deceleration strip is easily incomplete, causing missed detections; when the vehicle passes over the strip, the strip is below the vehicle and outside the scanning range of the lidar, so the passing time cannot be determined; and lidar is expensive to manufacture and does not scale well.
That is to say, in the prior art, when the vehicle is positioned using traffic elements such as deceleration strips, inaccurate detection of the strips leads to inaccurate vehicle positioning.
In order to solve the above problems, the deceleration strip detection method in the present application can be used to detect a deceleration strip and then locate the vehicle based on the strip's position. Referring to fig. 1, fig. 1 is a schematic structural diagram of a deceleration strip detection system to which an embodiment of the present application is applicable. As shown in fig. 1, the system includes an input data processing module 01, a detection network 02 (i.e., the classification model described above) comprising an encoder and a decoder, and a post-processing module 03. The functions of each module and the overall flow of the deceleration strip detection method are described in detail below.
The principle of the deceleration strip detection method in the embodiment of the application includes, but is not limited to, the detection of deceleration strips, and can also be used for the detection of other traffic elements which can be used for assisting the vehicle positioning. For convenience of description, the deceleration strip will be exemplified in the following description of some embodiments.
As shown in fig. 1, the vehicle positioning system in the present application mainly includes three modules:
input data processing module 01:
time stamp alignment is performed on the IMU data and wheel speed data.
Detection network 02:
The detection network comprises an encoder and a decoder. The encoder and decoder modules are commonly used neural networks, such as Convolutional Neural Networks (CNN); in practical applications they can also be other neural networks, which is not limited here. The output of the network describes the classification result of the input data at each moment (i.e., the classification result of each sub-period), and the moments belonging to the deceleration strip can be selected according to these per-moment classification results.
The post-processing module 03:
Since crossing a deceleration strip is a continuous process, no class jump should occur in the middle; the post-processing module is therefore used to merge broken classification results.
The main flow of the deceleration strip detection method is as follows: first, vehicle data is acquired; optionally, the vehicle data may include IMU data and wheel speed data of the vehicle. The input data processing module 01 performs timestamp alignment on the IMU data and wheel speed data and sends the aligned vehicle data to the deceleration strip detection network 02 (i.e., the classification model). Second, the detection network 02 receives the IMU data and wheel speed data and outputs a classification result after the computation of the encoder and decoder. The classification result is then sent to the post-processing module 03, which optimizes it to obtain the detection result. Finally, the start time and end time at which the vehicle passes the deceleration strip are obtained from the timestamps in the detection result, and the vehicle is matched against the deceleration strip in the map data according to these times, thereby positioning the vehicle.
Each step is described in detail below:
S1, obtaining IMU data and wheel speed data (i.e., driving data) of the vehicle in the current period.
It will be appreciated that the duration of a period may correspond to the duration of a time interval, for example, the time interval is t seconds and the duration of a period may be t seconds.
Wherein the IMU data includes 3-dimensional accelerometer data and 3-dimensional gyroscope data, for a total of 6 dimensions.
The wheel speed data comprises 2-dimensional data, namely the speed of the left rear wheel and the speed of the right rear wheel; in use, the average of the two can be taken and represented as 1-dimensional data.
And S2, if the first time interval for collecting the IMU data and the second time interval for collecting the wheel speed data are different, performing timestamp alignment processing on the obtained IMU data and the wheel speed data.
In practical applications, the acquisition frequency of the IMU data is usually different from that of the wheel speed data, and when in use, the IMU data and the wheel speed data need to be subjected to timestamp alignment processing. The timestamp alignment process may be performed based on a first time interval corresponding to the IMU data or a second time interval corresponding to the wheel speed data.
For example, the wheel speed data is adjusted by interpolation based on the time stamp of the first time interval of the collected IMU data, so as to obtain wheel speed mean data with the same frequency as the IMU data.
As shown in fig. 2, in the current time period, the collected IMU data includes IMU data corresponding to time 1 to time 5 (i.e., time corresponding to each sub-time period in the foregoing), and the collected wheel speed data includes wheel speed data corresponding to time 6 to time 11, it can be seen that the collection time interval of IMU data and the collection time interval of wheel speed data are different, and timestamp alignment processing is required. For example, the time stamp alignment process may be performed with reference to the time stamp of the IMU data, and the description will be given by taking the time stamp of the wheel speed data aligned to time 1 as an example:
in one manner, the wheel speed data at a time closest to the time at time 1 may be taken as the wheel speed data at time 1, e.g., the wheel speed data at time 6 closest to time 1 may be taken as the wheel speed data at time 1, i.e., the time stamp of the wheel speed data at time 6 is aligned to time 1.
In another manner, an average (or weighted average) of the wheel speed data at the two times closest to time 1 may be taken as the wheel speed data at time 1; for example, the average (or weighted average) of the wheel speed data at times 6 and 7, the two times closest to time 1, may be taken as the wheel speed data at time 1.
In the above manner, the wheel speed data can be aligned to the timestamps of the IMU data at times 1 to 5; thus, the timestamp-aligned IMU data and wheel speed data, that is, the timestamp-aligned running data, are obtained.
It is understood that the above is only an example, and the present embodiment is not limited thereto.
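The alignment step described above can be illustrated with a minimal Python sketch (the function names and the choice of linear interpolation versus nearest-sample selection are illustrative, not prescribed by the patent):

```python
import numpy as np

def align_wheel_speed_to_imu(imu_ts, wheel_ts, wheel_speed):
    """Align wheel speed samples to IMU timestamps by linear interpolation.

    imu_ts:      (N,) IMU timestamps in seconds (the reference time base)
    wheel_ts:    (M,) wheel speed timestamps in seconds
    wheel_speed: (M,) mean wheel speed samples
    Returns an (N,) array of wheel speeds resampled at the IMU timestamps.
    """
    return np.interp(imu_ts, wheel_ts, wheel_speed)

def align_nearest(imu_ts, wheel_ts, wheel_speed):
    """Alternative: take the wheel speed sample closest in time to each IMU stamp."""
    idx = np.abs(wheel_ts[None, :] - imu_ts[:, None]).argmin(axis=1)
    return wheel_speed[idx]
```

Either variant yields wheel speed data at the same frequency as the IMU data, matching the description of times 1 to 5 above.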
Specifically, assuming the acquisition frequency of the IMU data is h Hz and the acquisition duration (i.e., the duration of one period) is t seconds, a two-dimensional matrix representing the IMU data and the wheel speed mean data can be obtained. The matrix has t × h rows, representing the data amount (i.e., the travel data of the vehicle in one period), each row corresponding to the travel data of one sub-period; and 7 columns, representing the data dimension, each column corresponding to one dimension obtained by combining the 6-dimensional IMU data with the 1-dimensional wheel speed mean data.
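The assembly of the (t × h, 7) input matrix described above can be sketched as follows (a hypothetical helper; it assumes the per-wheel speeds have already been timestamp-aligned to the IMU samples):

```python
import numpy as np

def build_input_matrix(accel, gyro, left_speed, right_speed):
    """Assemble the (t*h, 7) input: 6-dim IMU data plus 1-dim wheel speed mean.

    accel, gyro:             (t*h, 3) accelerometer / gyroscope samples
    left_speed, right_speed: (t*h,) timestamp-aligned per-wheel speeds
    """
    wheel_mean = 0.5 * (left_speed + right_speed)         # 1-dim wheel speed mean
    return np.hstack([accel, gyro, wheel_mean[:, None]])  # shape (t*h, 7)
```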
And S3, inputting the aligned IMU data and wheel speed data into a detection network (namely a classification model), and classifying the IMU data and the wheel speed data to obtain a classification result.
The input of this module is a two-dimensional matrix (i.e., the running data) of size (t × h, 7). After passing through the encoder and decoder, the output is a two-dimensional matrix of size (t × h, c) (i.e., the classification result of the running data corresponding to each sub-period), where c is the total number of categories. Each row of the output matrix corresponds to the classification result of one sub-period, and the c values in a row represent the probabilities that the running data belongs to the various categories, which include but are not limited to deceleration strip, level road, and other bump types. After a row-by-row normalization operation, the output matrix describes the probability that each moment in the input data (i.e., the driving data corresponding to each sub-period) belongs to each category. The normalization is calculated as follows:
p_i = exp(z_i) / Σ_j exp(z_j), where z_i is the network's output response value for category i, and p_i is the resulting probability that the moment belongs to category i (i.e., the probability value, described above, of each moment belonging to each category). Finally, according to the normalization result, the category with the highest probability is selected row by row as the output result (i.e., the classification result), of size (t × h, 1).
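The row-wise normalization and per-row category selection can be sketched as follows (a standard softmax, which matches the description of the normalization; names are illustrative):

```python
import numpy as np

def classify(logits):
    """Row-wise softmax over the (t*h, c) network output, then pick the most
    probable category per row, giving a (t*h,) label vector and the (t*h, c)
    probability matrix."""
    z = logits - logits.max(axis=1, keepdims=True)   # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return probs.argmax(axis=1), probs
```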
And S4, inputting the classification result obtained by the detection network and the timestamp information of the vehicle data (namely IMU data and wheel speed data) subjected to timestamp alignment processing into a post-processing module, and optimizing the classification result through the post-processing module.
Given the classification result output by the detection network and the timestamp information of the vehicle data, the post-processing module can merge split classification results, guaranteeing the continuity of deceleration strip detection. This mainly comprises two steps: merging broken results and filtering false detections.
1. Merging broken results:
Since crossing a deceleration strip is a continuous process, no class change should occur in the middle, so the classification results need to be merged. Specifically, suppose t1 and t2 are two adjacent moments classified as deceleration strip. If t2 − t1 ≤ T_merge, then all data between t1 and t2 are reclassified as deceleration strip, where T_merge is the mergeable threshold (i.e., the first duration).
For example, it is assumed that the travel data after the time stamp alignment is time stamped from time 1 to time 5 as shown in fig. 2, where the travel state corresponding to time 1 (i.e., one time in sub-period 1 in the following) is the second state, the travel state corresponding to time 2 (i.e., one time in sub-period 2 in the following) is the first state, the travel state corresponding to time 3 (i.e., one time in sub-period 3 in the following) is the second state, the travel state corresponding to time 4 (i.e., one time in sub-period 4 in the following) is the first state, and the travel state corresponding to time 5 (i.e., one time in sub-period 5 in the following) is the first state.
Here, time 2 and time 4 are two adjacent moments whose classification result is deceleration strip. If the duration between time 2 and time 4 is less than the mergeable threshold (i.e., the first duration), this indicates that the detection result at time 3 is incorrect, and the detection result at time 3 may be changed from the second state to the first state.
By adjusting the driving state at each moment through the above process, the adjusted classification result (i.e., the adjusted driving state corresponding to each sub-period) is obtained; this can also be described as obtaining the detection result of the vehicle data in the current period.
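The merging rule above can be sketched in a few lines of Python (the names and the threshold value are illustrative; the patent refers to the threshold as the first duration):

```python
def merge_broken_segments(labels, timestamps, bump_class=1, merge_threshold=0.5):
    """Merge broken deceleration-strip detections: if two moments labeled as
    bump are separated by no more than merge_threshold seconds, relabel every
    moment between them as bump as well."""
    labels = list(labels)
    bump_idx = [i for i, c in enumerate(labels) if c == bump_class]
    for a, b in zip(bump_idx, bump_idx[1:]):
        if timestamps[b] - timestamps[a] <= merge_threshold:
            for i in range(a + 1, b):   # fill the gap between adjacent bump moments
                labels[i] = bump_class
    return labels
```

Applied to the example above (second, first, second, first, first), the lone second state between the two first states is relabeled as first.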
2. Filtering false detections:
After the classification results at all moments are merged, suppose t_s and t_e are respectively the start time and end time of a detected deceleration strip. If t_e − t_s ≤ T_min, the detection result is deleted, where T_min is the minimum response time threshold (i.e., the second duration). If the time taken to pass the deceleration strip is shorter than this minimum response time, the detection between t_s and t_e is a false detection.
For example, after merging the broken results, the driving states from time 2 to time 5 are all the first state. The start time of the deceleration strip in the detection result is then time 2 and the end time is time 5. If the duration between time 2 and time 5 is less than or equal to the minimum response time threshold (i.e., the second duration), the detection result is deleted and judged a false detection. If the duration is greater than that threshold, the detection results between time 2 and time 5 are all deceleration strip; that is, the vehicle passes over the deceleration strip between time 2 and time 5, the start time of the crossing being time 2 and the end time being time 5.
Through this process, erroneously detected results can be filtered out, false detections are largely avoided, and the accuracy of the detection result is improved.
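The false-detection filter can be sketched as follows (illustrative names; the minimum duration corresponds to the second duration described above, and non-bump moments are assumed to carry label 0):

```python
def filter_false_detections(labels, timestamps, bump_class=1, min_duration=0.3):
    """Drop bump segments shorter than min_duration seconds (the minimum
    response time threshold): a crossing that brief is treated as a false
    detection and relabeled as non-bump."""
    labels = list(labels)
    i, n = 0, len(labels)
    while i < n:
        if labels[i] == bump_class:
            j = i
            while j + 1 < n and labels[j + 1] == bump_class:
                j += 1                   # extend to the end of the bump segment
            if timestamps[j] - timestamps[i] <= min_duration:
                for k in range(i, j + 1):
                    labels[k] = 0        # segment too short: false detection
            i = j + 1
        else:
            i += 1
    return labels
```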
And S5, if the final detection result shows that the vehicle passes through the deceleration strip, acquiring initial position information of the vehicle at the time closest to the time stamp information of the deceleration strip by using a visual positioning technology or a laser radar positioning technology.
And S6, taking the initial position information as a center, acquiring map data within a preset range of the initial position information, and taking the middle position information of a deceleration strip closest to the initial position information in the map data as the current target position information of the vehicle.
The embodiments of the present application mainly have the following beneficial effects:
The method avoids the expensive lidar used for vehicle positioning in the prior art, enables vehicle positioning with a low-cost positioning system, and scales well;
in the environment without GPS signals or with weak GPS signals, the vehicle can still be accurately positioned, and the accuracy and the robustness of vehicle positioning are improved;
the time of the vehicle passing through the deceleration strip can be accurately obtained, and the accuracy and the real-time performance of vehicle positioning are improved.
As an example, fig. 3 shows a schematic structural diagram of a deceleration strip detection system to which the embodiment of the present application is applied, and it can be understood that the deceleration strip detection method provided by the embodiment of the present application can be applied to, but is not limited to, the application scenario shown in fig. 3.
In the present example, as shown in fig. 3, the deceleration strip detection system in this example may include, but is not limited to, a vehicle-mounted terminal 101, a network 102, and an electronic device (e.g., a server) 103. The vehicle-mounted terminal 101 may communicate with the electronic device 103 through the network 102, for example, the electronic device may be accessed to the vehicle positioning system, the vehicle data may be obtained through the vehicle-mounted terminal, the vehicle data may be transmitted to the electronic device, the vehicle data may be processed by the electronic device, whether the vehicle passes through a deceleration strip or not may be determined, and the vehicle may be positioned according to a detection result of the deceleration strip.
The main execution body of the deceleration strip detection method in the present application may be a vehicle-mounted terminal of a vehicle, or may be any electronic device (such as a server, a mobile phone, etc.), and the following description takes the main execution body as a server as an example.
As shown in fig. 3, the vehicle-mounted terminal is installed on the vehicle shown in fig. 1, the electronic device is a server 103, and a specific implementation process of the deceleration strip detection method in the present application may include steps S1-S4:
in step S1, the vehicle-mounted terminal 101 acquires vehicle data corresponding to the current time period, and transmits the vehicle data to the server 103 via the network 102, wherein the vehicle data includes travel data corresponding to at least one sub-time period, and the travel data includes IMU data and wheel speed data.
In step S2, after receiving the vehicle data of the current time period, the server 103 inputs the driving data into the classification model, and classifies the driving data by the classification model to obtain a classification result corresponding to each sub-time period, where for one sub-time period, the classification result corresponding to the sub-time period represents the driving state of the vehicle in the sub-time period.
In step S3, the server 103 determines the travel state corresponding to each sub-period based on the classification result corresponding to each sub-period.
And step S4, determining the driving state of the vehicle based on the driving state corresponding to each sub-period, wherein the driving state is a first state or a second state, the first state represents that the vehicle passes through the deceleration strip, and the second state represents that the vehicle does not pass through the deceleration strip.
It is understood that the above is only an example, and the embodiment of the present application is not limited herein, for example, the vehicle-mounted terminal may also directly process the vehicle data, and the vehicle-mounted terminal performs the steps S2-S4 to achieve positioning of the vehicle, and is not limited herein.
The vehicle-mounted terminal can be a vehicle-mounted navigation terminal, a vehicle-mounted computer and the like. The electronic device may be any electronic device with computing capability, for example, the electronic device may be a server or a user terminal. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server or a server cluster providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, middleware service, a domain name service, a security service, a CDN (Content Delivery Network), a big data and artificial intelligence platform. Such networks may include, but are not limited to: a wired network, a wireless network, wherein the wired network comprises: a local area network, a metropolitan area network, and a wide area network, the wireless network comprising: bluetooth, Wi-Fi, and other networks that enable wireless communication. The user terminal may be a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a notebook computer, a digital broadcast receiver, an MID (Mobile Internet Devices), a PDA (personal digital assistant), a desktop computer, a smart speaker, a smart watch, etc., and the vehicle-mounted terminal and the electronic device may be directly or indirectly connected through a wired or wireless communication manner, but are not limited thereto. The determination may also be based on the requirements of the actual application scenario, and is not limited herein.
Referring to fig. 4, fig. 4 is a flowchart illustrating another optional deceleration strip detection method provided in the embodiment of the present application, where the method may be executed by a vehicle-mounted terminal, or may be executed by the vehicle-mounted terminal and any electronic device in an interactive manner, where the electronic device may be a server or a user terminal, or may be the user terminal and the server, or may be executed by the vehicle-mounted terminal and the user terminal in an interactive manner, as shown in fig. 4, the deceleration strip detection method provided in the embodiment of the present application may include the following steps:
step S401, vehicle data corresponding to a current time interval of a vehicle are obtained, the vehicle data comprise running data corresponding to at least one sub-time interval, and the running data comprise inertial measurement unit IMU data and wheel speed data.
Step S402, inputting the driving data into a classification model, classifying the driving data through the classification model, and obtaining a classification result corresponding to each sub-period, wherein for one sub-period, the classification result corresponding to the sub-period represents a driving state of the vehicle in the sub-period.
Step S403, determining the driving state corresponding to each sub-period based on the classification result corresponding to each sub-period;
step S404, determining a driving state of the vehicle based on the driving state corresponding to each of the sub-periods, where the driving state is a first state or a second state, the first state indicates that the vehicle passes through a deceleration strip, and the second state indicates that the vehicle does not pass through the deceleration strip.
Optionally, the vehicle may be an ordinary vehicle or an autonomous vehicle, each installed with a vehicle-mounted terminal, such as a vehicle-mounted navigation terminal or a vehicle-mounted computer. Vehicle data, including travel data of the vehicle such as its acceleration, wheel speed and travel direction, is acquired by the vehicle-mounted terminal installed in the vehicle.
The driving data comprises IMU data and wheel speed data. The IMU data comprises 3-dimensional accelerometer data and 3-dimensional gyroscope data, for a total of 6 dimensions. The wheel speed data contains 2-dimensional data, namely the speed of the left rear wheel and the speed of the right rear wheel; in use, the average of the two can be taken and represented as 1-dimensional data.
Then, the vehicle-mounted terminal may transmit the acquired vehicle data to a user terminal, such as a mobile phone, communicatively connected to the vehicle-mounted terminal, and determine the driving state of the vehicle by performing the following processing on the vehicle data through the mobile phone:
The driving state of the vehicle may be determined using a classification model (i.e., the detection network). Specifically, the driving data of the vehicle is input into a pre-trained classification model, which classifies the driving data, that is, determines whether the driving data corresponding to each sub-period belongs to a deceleration strip, a level road, or another bump type, yielding a classification result for each sub-period. For any sub-period, the classification result represents the probability that the vehicle passes over a deceleration strip in that sub-period. The driving state of each sub-period is then determined from its classification result, and the driving state of the vehicle is in turn determined from the driving states of all sub-periods.
The driving state of the vehicle is determined from the driving data in the vehicle data and comprises two states, a first state and a second state: if the driving state of the vehicle is the first state, the vehicle passes over a deceleration strip; if it is the second state, the vehicle does not pass over a deceleration strip in the current period. Determining the driving state of the vehicle is therefore determining whether the vehicle passes over a deceleration strip.
For example, the vehicle data includes driving data corresponding to at least one sub-period, and as shown in fig. 5, the current period includes sub-period 1 to sub-period 5, and each sub-period may include driving data corresponding to the sub-period, where sub-period 1 corresponds to driving data 1, sub-period 2 corresponds to driving data 2, sub-period 3 corresponds to driving data 3, sub-period 4 corresponds to driving data 4, and sub-period 5 corresponds to driving data 5.
The driving state corresponding to each sub-period can be obtained based on a classification model, for example, as shown in fig. 6, for each sub-period in fig. 5, the driving state corresponding to sub-period 1 is the second state, the driving state corresponding to sub-period 2 is the first state, the driving state corresponding to sub-period 3 is the second state, the driving state corresponding to sub-period 4 is the first state, and the driving state corresponding to sub-period 5 is the first state.
Then, the running state of the vehicle is determined based on the running states corresponding to the respective sub-periods. For example, as shown in fig. 6, among the classified driving states, sub-periods 2, 4 and 5 are in the first state, sub-periods 1 and 3 are in the second state, and sub-period 3 (second state) lies between sub-periods 2 and 4 (first state). Since passing over a deceleration strip is a continuous process, no other driving state should occur in the middle, which indicates that the detection result for sub-period 3 is incorrect; it may therefore be determined that the driving state of the vehicle is the first state. In this way, the running state of the vehicle is determined from the running states of the sub-periods, and using the local information of each sub-period to determine the global information of the current period improves the accuracy of the determination.
In practical applications, the principle of the deceleration strip detection method in the embodiment of the present application may also be used to detect other traffic elements; the application is not limited to deceleration strips. Other relatively distinctive traffic elements may also be used, for example a traffic light: in general, a vehicle decelerates to a stop before a traffic light and stays there for a certain period of time (the waiting time at the light), so such an element can likewise be used to determine the running state of the vehicle and to locate it.
According to the embodiments of the present application, the driving data of the vehicle in the current period is obtained, the driving data comprising data corresponding to at least one sub-period; the driving data is input into a pre-trained classification model, which classifies it to obtain a classification result for each sub-period; the driving state of each sub-period is determined from its classification result; and the driving state of the vehicle, i.e., whether it passes over a deceleration strip, is determined from the driving states of the sub-periods. In this way, when detecting whether the vehicle passes over a deceleration strip, the classification model classifies the vehicle's driving data and the answer is determined from the classification result. Unlike the image detection and lidar detection methods of the prior art, the data used is the driving data: compared with image data it is not affected by illumination, and compared with lidar detection the cost is low.
In an optional embodiment, the determining the driving state of the vehicle based on the driving state corresponding to each of the sub-periods includes:
for a second target time interval in each sub-time interval, determining the time length corresponding to the second target time interval according to the time stamp of each sub-time interval contained in the second target time interval, wherein the second target time interval is at least two sub-time intervals of which the continuous driving state is the first state;
and if the time length corresponding to the second target time period is greater than the set time length, determining that the running state of the vehicle is the first state.
Optionally, the duration of the second target period may be determined based on the timestamps of its sub-periods, the second target period being at least two consecutive sub-periods whose driving state is the first state. If the duration of the second target period is greater than a set duration (which may be equal to the second duration), the driving state of the vehicle is the first state, that is, the vehicle passes over the deceleration strip.
In practical applications, the driving state of the vehicle may also be determined by other judgment criteria, which is not limited herein.
The process of determining the target position information of the vehicle based on the deceleration strip is detailed below.
In an optional embodiment, the vehicle data further includes initial position information of the vehicle, and the method further includes:
if the driving state of the vehicle is the first state, acquiring map data within the preset range of the initial position information;
determining position information of a deceleration strip in the map data;
and determining the target position information of the vehicle according to the determined position information of the deceleration strip.
Optionally, the acquired vehicle data may further include initial position information of the vehicle, and in the absence of a GPS signal or in the case of a poor environmental condition, the vehicle may be preliminarily positioned based on a visual positioning technology and/or a laser radar positioning technology to acquire the initial position information of the vehicle.
And then, based on the initial position information and the deceleration strip, correcting the initial position information to realize accurate positioning of the current target position information of the vehicle.
Specifically, the map data within the preset range of the initial position information may be acquired centering on the initial position information. Then, the position information of the deceleration strip is determined based on the map data and the initial position information, for example, a traffic element which is closest to the deceleration strip in the map data and is the same as the element type of the deceleration strip can be matched, and the position information of the traffic element can be used as the position information of the deceleration strip.
Then, target position information of the vehicle is determined according to the position information of the deceleration strip. As one example, the position information of the deceleration strip may be taken as the target position information of the vehicle. As another example, the position information of the deceleration strip and the initial position information may be fused, such as taking an average of the two, and taking the average as the target position information of the vehicle, etc., which is not limited herein.
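The matching-and-fusion steps above can be sketched as follows (a non-limiting illustration; the element representation, the `"deceleration_strip"` type tag, and averaging as the fusion rule are assumptions, averaging being only one of the options mentioned above):

```python
# Illustrative sketch (names assumed): find the nearest map element of the
# deceleration-strip type and fuse its position with the initial estimate,
# here by simple averaging.

def locate_by_strip(initial_pos, map_elements):
    """initial_pos: (x, y); map_elements: list of dicts with 'type' and 'pos'."""
    strips = [e for e in map_elements if e["type"] == "deceleration_strip"]
    if not strips:
        return initial_pos  # no strip nearby: keep the initial estimate
    # nearest deceleration strip to the initial position
    nearest = min(strips, key=lambda e: (e["pos"][0] - initial_pos[0]) ** 2
                                        + (e["pos"][1] - initial_pos[1]) ** 2)
    sx, sy = nearest["pos"]
    # fuse by averaging the strip position and the initial position
    return ((sx + initial_pos[0]) / 2, (sy + initial_pos[1]) / 2)

pos = locate_by_strip((10.0, 5.0), [
    {"type": "deceleration_strip", "pos": (12.0, 5.0)},
    {"type": "traffic_light", "pos": (10.5, 5.0)},
])
print(pos)  # (11.0, 5.0)
```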
Through this embodiment of the application, the vehicle can be positioned by means of the deceleration strip and the initial position information. In this process, the vehicle can still be accurately positioned in environments where the GPS signal is absent or weak, which improves both the stability and the accuracy of vehicle positioning.
In an alternative embodiment, the IMU data is acquired at a first time interval, the wheel speed data is acquired at a second time interval, and if the first time interval is not equal to the second time interval, the method further comprises:
performing time stamp alignment on the IMU data and the wheel speed data based on the time length corresponding to any one of the first time interval or the second time interval to obtain running data with aligned time stamps;
the inputting of the travel data into the classification model includes:
and inputting the running data with the aligned time stamps into the classification model.
Optionally, the driving data may include IMU data and wheel speed data. In practical applications, the acquisition frequencies of the IMU data and the wheel speed data are generally different, so the data needs to undergo timestamp alignment. That is, when the first time interval for acquiring IMU data differs from the second time interval for acquiring wheel speed data, timestamp alignment needs to be performed on the IMU data and the wheel speed data, using either the first time interval or the second time interval as the reference.
For the driving data of any sub-period, in an alternative example, the wheel speed data closest in time to the sub-period may be adjusted based on the IMU data of that sub-period to obtain the driving data of the sub-period (including the IMU data and the adjusted wheel speed data). For the specific adjustment manner, reference may be made to the description of fig. 2, which is not repeated here.
Then, the travel data with the aligned time stamps can be input into a classification model, the travel data with the aligned time stamps are classified by the classification model to obtain a classification result of each sub-period, the travel state of each sub-period is determined according to the classification result of each sub-period, and then the travel state of the vehicle is determined according to the travel state of each sub-period.
Through the embodiment of the application, the time stamp alignment processing can be performed, so that the time stamp information of the IMU data and the wheel speed data is consistent, and the convenience of data processing is improved.
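A minimal alignment sketch follows, assuming nearest-neighbour matching of timestamps (which may differ in detail from the adjustment of fig. 2; all names are illustrative):

```python
# Illustrative sketch: for each IMU timestamp, pick the wheel-speed sample
# whose timestamp is nearest, so both streams share the IMU time base.

import bisect

def align_to_imu(imu, wheel):
    """imu, wheel: lists of (timestamp, value) sorted by timestamp."""
    wheel_ts = [t for t, _ in wheel]
    aligned = []
    for t, imu_val in imu:
        i = bisect.bisect_left(wheel_ts, t)
        # choose the neighbour (i-1 or i) closest in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(wheel)]
        j = min(candidates, key=lambda k: abs(wheel_ts[k] - t))
        aligned.append((t, imu_val, wheel[j][1]))
    return aligned

imu = [(0.00, 9.8), (0.01, 9.7)]                       # e.g. 100 Hz IMU
wheel = [(0.000, 5.0), (0.004, 5.1), (0.008, 5.2), (0.012, 5.3)]  # e.g. 250 Hz
print(align_to_imu(imu, wheel))
```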
In an optional embodiment, the method further includes:
if the running state of the vehicle is the first state, determining time information of the vehicle passing through the deceleration strip based on the running state corresponding to each sub-period and the timestamp of each sub-period, wherein the time information comprises at least one of the following items:
a start time;
an end time;
the elapsed time period.
Optionally, the acquired vehicle data further includes a timestamp of each sub-period, and the time information of the vehicle passing through the deceleration strip may be determined based on the timestamp of each sub-period, for example, the start time, the end time, or the elapsed time of the vehicle passing through the deceleration strip may be determined.
For example, as shown in fig. 7, if the driving state in sub-period 1 is the second state, and the driving states in sub-periods 2 to 5 are the first states, it may be determined that the starting time of the vehicle passing through the deceleration strip is a certain time in sub-period 2, such as the time of acquiring IMU data in sub-period 2.
As shown in fig. 8, the driving states of sub-periods 1 to 4 are the first state, and the driving state of sub-period 5 is the second state, then it can be determined that the end time of the vehicle passing through the deceleration strip is a certain time in sub-period 5, such as the collection time of IMU data in sub-period 5.
As shown in fig. 9, if the driving state of sub-period 1 is the second state, the driving states of sub-periods 2 to 4 are the first state, and the driving state of sub-period 5 is the second state, then it can be determined that the starting time of the vehicle passing through the deceleration strip is a certain time in sub-period 2, such as the acquisition time of IMU data in sub-period 2, and the ending time of the vehicle passing through the deceleration strip is a certain time in sub-period 4, such as the acquisition time of IMU data in sub-period 4.
It is understood that the above is only an example, and the present embodiment is not limited thereto.
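The state-transition reading of figs. 7–9 can be sketched as follows. As an assumption for illustration, the IMU acquisition time of a sub-period is used as "a certain time in the sub-period", and the end time is taken in the first second-state sub-period after a run (the fig. 8 convention; fig. 9 instead uses the last first-state sub-period). All names are illustrative:

```python
# Illustrative sketch: derive start/end of a strip crossing from the
# per-sub-period driving states.

def crossing_times(states, imu_times):
    """states[i]: 1 = first state; imu_times[i]: IMU acquisition time of sub-period i."""
    start = end = None
    for i, s in enumerate(states):
        prev = states[i - 1] if i > 0 else 0
        if s == 1 and prev == 0 and start is None:
            start = imu_times[i]   # second state -> first state: crossing starts
        if s == 0 and prev == 1:
            end = imu_times[i]     # first state -> second state: crossing ends
    return start, end

# fig. 9 pattern: sub-periods 2-4 are in the first state
print(crossing_times([0, 1, 1, 1, 0], [0.1, 0.2, 0.3, 0.4, 0.5]))  # (0.2, 0.5)
```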
In an optional embodiment, the initial position information is the position information of the vehicle acquired at the time closest to the time information.
After the time information that the vehicle passes through the deceleration strip is determined, the position information of the vehicle at the moment closest to the time information can be acquired as the initial position information of the vehicle based on the time information, then the position information of the deceleration strip is determined based on the initial position information and the map data, and further, the positioning of the vehicle can be achieved based on the position information of the deceleration strip.
Through this embodiment of the application, the time information of the vehicle passing through the deceleration strip can be determined, the position information of the deceleration strip can be obtained based on this time information, and the vehicle can be positioned based on the position information of the deceleration strip, making the position information of the vehicle more accurate and improving the accuracy of vehicle positioning.
In an optional embodiment, the determining the driving state of the vehicle based on the driving state corresponding to each of the sub-periods further includes:
determining each sub-period of which the corresponding driving state is the first state in each sub-period;
based on the time stamp of each sub-period in which the driving state is the first state, the driving state of each sub-period is adjusted as follows:
if the time length between any two sub-periods of the first state is less than or equal to a first time length, determining the running state of each sub-period between any two sub-periods as the first state;
and determining the running state of the vehicle based on the adjusted corresponding running state of each sub-period.
Optionally, in practical applications, all detection results may be considered when determining the driving state of the vehicle, so the fragmented detection results need to be merged; for the specific merging manner, reference may be made to the foregoing description of merging fragmented detection results, which is not repeated here.
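The merging rule above can be sketched as follows (illustrative only; the names and the use of sub-period timestamps to measure the gap are assumptions):

```python
# Illustrative sketch: if the gap between two first-state sub-periods is at
# most `first_duration`, every sub-period in between is also set to the
# first state, merging the fragmented detections into one run.

def merge_runs(states, timestamps, first_duration):
    states = list(states)
    ones = [i for i, s in enumerate(states) if s == 1]
    for a, b in zip(ones, ones[1:]):
        if timestamps[b] - timestamps[a] <= first_duration:
            for k in range(a + 1, b):
                states[k] = 1   # fill the short gap between first-state sub-periods
    return states

print(merge_runs([1, 0, 1, 0, 0, 1], [0.0, 0.1, 0.2, 0.3, 0.4, 0.5], 0.25))
# [1, 1, 1, 0, 0, 1]
```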
In an optional embodiment, the determining the driving state of the vehicle based on the adjusted driving state corresponding to each of the sub-periods includes:
for a first target period among the adjusted sub-periods, determining a duration corresponding to the first target period according to the timestamps of the sub-periods contained in the first target period, where the first target period is at least two consecutive sub-periods whose driving state is the first state;
and if the duration corresponding to the first target period is less than or equal to a second duration, determining the driving state of the vehicle based on the driving states of the sub-periods other than each first target period.
Optionally, in practical applications, false detections may occur, so the detected driving state of the vehicle needs to be filtered; for the specific filtering manner, reference may be made to the foregoing detailed description of filtering false detections, which is not repeated here.
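The filtering rule can be sketched as follows (illustrative only; names assumed, and runs of a single sub-period are also discarded here, whereas the application defines the first target period as at least two consecutive sub-periods):

```python
# Illustrative sketch: runs of consecutive first-state sub-periods that last
# no longer than `second_duration` are treated as false detections and reset
# to the second state.

def filter_short_runs(states, timestamps, second_duration):
    states = list(states)
    i = 0
    while i < len(states):
        if states[i] == 1:
            j = i
            while j + 1 < len(states) and states[j + 1] == 1:
                j += 1
            # run i..j of first-state sub-periods; discard it if too short
            if timestamps[j] - timestamps[i] <= second_duration:
                for k in range(i, j + 1):
                    states[k] = 0
            i = j + 1
        else:
            i += 1
    return states

print(filter_short_runs([0, 1, 1, 0, 1, 1, 1, 1],
                        [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7], 0.15))
# [0, 0, 0, 0, 1, 1, 1, 1]
```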
In an alternative embodiment, the classification model is obtained by training a neural network model in the following manner:
acquiring a training data set, wherein the training data set comprises each sample driving data with a label, and the label represents a corresponding real classification result of the sample driving data;
training the neural network model based on the sample driving data until a loss function corresponding to the neural network model converges, and taking the converged neural network model as the classification model;
the input of the neural network model is the sample travel data, the output is the prediction classification result corresponding to the sample travel data, and the value of the loss function represents the difference between the prediction classification result and the real classification result corresponding to the sample travel data.
Optionally, the classification model needs to be trained in advance, and in the training process, a supervised learning manner may be adopted, and the specific training process is as follows:
and acquiring a training data set, wherein the training data set comprises a large amount of sample driving data, each sample driving data is provided with a label, and for the label corresponding to each sample driving data, the label represents a real classification result of the sample driving data.
Then, each sample driving data in the training data set is input into the neural network model one by one, a predicted classification result corresponding to each sample driving data is output by the neural network model, and a loss function is calculated from the real classification result and the predicted classification result corresponding to each sample driving data. The loss function may be a cross-entropy loss, and its value represents the difference between the predicted classification result and the real classification result corresponding to each sample driving data. If the loss function converges, the neural network model at convergence can be used as the classification model; if it does not converge, the neural network model continues to be trained on the training data set until the loss function converges.
In practical applications, the neural network model is not limited, and its network structure can be selected according to actual needs. For example, the neural network model may be a Long Short-Term Memory network (LSTM), a Bi-directional LSTM (Bi-LSTM, combining a forward LSTM and a backward LSTM), a Recurrent Neural Network (RNN), a Convolutional Neural Network (CNN), etc., without any limitation.
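The supervised training loop described above can be sketched with a plain logistic classifier standing in for the LSTM/RNN/CNN (an assumption purely for illustration, since the application does not fix the network structure; the toy features and all names are likewise assumed):

```python
# Illustrative sketch: stochastic gradient descent on the cross-entropy loss
# of a logistic classifier, in place of the unspecified neural network model.

import math

def train(samples, labels, epochs=200, lr=0.5):
    w = [0.0] * len(samples[0]); b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(first state)
            # gradient of the cross-entropy loss  -[y*log p + (1-y)*log(1-p)]
            g = p - y
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# toy driving feature: large vertical-acceleration variance -> first state
X = [[0.1], [0.2], [1.5], [1.8]]
y = [0, 0, 1, 1]
w, b = train(X, y)
pred = [1 if (w[0] * x[0] + b) > 0 else 0 for x in X]
print(pred)  # [0, 0, 1, 1]
```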
After the classification model is trained in the above manner, it can be used to classify the driving data of the vehicle, for example, to detect whether the vehicle passes through a deceleration strip within a period of time. Compared with the prior art, detection through the classification model relies on neither images nor laser radar to judge whether the vehicle passes through a deceleration strip. This avoids both the influence of environmental factors on image-based deceleration strip detection and the high manufacturing cost of laser-radar-based detection, so the classification model can be used in various environmental conditions and has strong adaptability.
Through this embodiment of the application, the classification result corresponding to the driving data can be obtained through the classification model. Compared with the image detection methods of the prior art, this approach is not affected by the environment, can detect deceleration strips in any environmental condition, and is not affected by occlusion of the field of view. Using the classification model to extract features of the driving data provides strong expressive ability, allows passing over a deceleration strip to be distinguished from other bumps, prevents false detection, and improves the accuracy of deceleration strip detection.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a deceleration strip detection apparatus provided in an embodiment of the present application. The deceleration strip detection device 1 that this application embodiment provided includes:
the vehicle data acquisition module 11 is configured to acquire vehicle data corresponding to a current time period of a vehicle, where the vehicle data includes travel data corresponding to at least one sub-time period, and the travel data includes inertial measurement unit IMU data and wheel speed data;
a classification result determining module 12, configured to input the driving data into a classification model, and classify the driving data through the classification model to obtain a classification result corresponding to each sub-period, where, for one sub-period, the classification result corresponding to the sub-period represents a driving state of the vehicle in the sub-period;
a driving state determining module 13, configured to determine a driving state corresponding to each sub-period based on a classification result corresponding to each sub-period;
the driving state determining module 13 is configured to determine a driving state of the vehicle based on the driving state corresponding to each of the sub-periods, where the driving state is a first state or a second state, the first state indicates that the vehicle passes through a deceleration strip, and the second state indicates that the vehicle does not pass through the deceleration strip.
In a possible embodiment, the vehicle data further includes a time stamp of each sub-period, and the driving state determining module, when determining the driving state of the vehicle based on the driving state corresponding to each sub-period, is configured to:
determining each sub-period of which the corresponding driving state is the first state in each sub-period;
based on the time stamp of each sub-period in which the driving state is the first state, the driving state of each sub-period is adjusted as follows:
if the time length between any two sub-periods of the first state is less than or equal to a first time length, determining the running state of each sub-period between any two sub-periods as the first state;
and determining the running state of the vehicle based on the adjusted corresponding running state of each sub-period.
In a possible embodiment, when determining the driving state of the vehicle based on the driving state corresponding to each of the sub-periods, the driving state determination module is specifically configured to:
for a first target period among the adjusted sub-periods, determine a duration corresponding to the first target period according to the timestamps of the sub-periods contained in the first target period, where the first target period is at least two consecutive sub-periods whose driving state is the first state;
and if the duration corresponding to the first target period is less than or equal to a second duration, determine the driving state of the vehicle based on the driving states of the sub-periods other than each first target period.
In one possible embodiment, the acquisition time interval of the IMU data is a first time interval, the acquisition time interval of the wheel speed data is a second time interval, and if the first time interval is not equal to the second time interval, the vehicle data acquisition module is further configured to:
performing time stamp alignment on the IMU data and the wheel speed data based on the time length corresponding to any one of the first time interval or the second time interval to obtain running data with aligned time stamps;
the classification result determination module, when inputting the travel data into a classification model, is configured to:
and inputting the running data with the aligned time stamps into the classification model.
In a possible embodiment, the apparatus further includes a time information determining module, configured to:
if the running state of the vehicle is the first state, determining time information of the vehicle passing through the deceleration strip based on the running state corresponding to each sub-period and the timestamp of each sub-period, wherein the time information comprises at least one of the following items:
a start time;
an end time;
the elapsed time period.
In a possible embodiment, the vehicle data further includes initial position information of the vehicle, and the apparatus further includes a target position information determination module configured to:
if the driving state of the vehicle is the first state, acquiring map data within the preset range of the initial position information;
determining position information of a deceleration strip in the map data;
and determining the target position information of the vehicle according to the determined position information of the deceleration strip.
In a possible embodiment, the initial position information is position information of the vehicle whose acquired time is closest to the time information.
In a possible embodiment, the classification model is obtained by the training module training the neural network model by:
acquiring a training data set, wherein the training data set comprises each sample driving data with a label, and the label represents a real classification result corresponding to the sample driving data;
training the neural network model based on the sample driving data until a loss function corresponding to the neural network model converges, and taking the converged neural network model as the classification model;
the input of the neural network model is the sample travel data, the output is the prediction classification result corresponding to the sample travel data, and the value of the loss function represents the difference between the prediction classification result and the real classification result corresponding to the sample travel data.
In a possible embodiment, the driving state determining module is specifically configured to:
for a second target period among the sub-periods, determine the duration corresponding to the second target period according to the timestamps of the sub-periods contained in the second target period, where the second target period is at least two consecutive sub-periods whose driving state is the first state;
and if the duration corresponding to the second target period is greater than the set duration, determine that the driving state of the vehicle is the first state.
In the embodiment of the application, vehicle data of the vehicle in the current time period is acquired, where the vehicle data includes driving data corresponding to at least one sub-period. The driving data is then input into a pre-trained classification model, which classifies it to obtain a classification result for each sub-period; the driving state of each sub-period is determined based on its classification result, and the driving state of the vehicle, that is, whether the vehicle passes through a deceleration strip, is then determined based on the driving states of the sub-periods. In this manner, when detecting whether the vehicle passes through a deceleration strip, the classification model classifies the driving data of the vehicle to obtain classification results, and whether the vehicle passes through the deceleration strip is determined from those results. Unlike the image detection or laser radar detection methods of the prior art, the data used is driving data classified by a classification model: compared with using image data, the approach is not limited by illumination, and compared with laser radar detection, the cost is low.
In a specific implementation, the apparatus 1 may execute the implementation manners provided in the steps in fig. 4 through the built-in functional modules, which may specifically refer to the implementation manners provided in the steps, and are not described herein again.
Referring to fig. 11, fig. 11 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 11, the electronic device 1000 in this embodiment may include a processor 1001, a network interface 1004, and a memory 1005, and the electronic device 1000 may further include a user interface 1003 and at least one communication bus 1002. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display (Display) and a keyboard (Keyboard), and optionally may also include standard wired and wireless interfaces. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, e.g., at least one disk memory. Optionally, the memory 1005 may also be at least one storage device located remotely from the processor 1001. As shown in fig. 11, the memory 1005, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the electronic device 1000 shown in fig. 11, the network interface 1004 may provide a network communication function; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005.
It should be understood that in some possible embodiments, the processor 1001 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The memory may include both read-only memory and random access memory, and provides instructions and data to the processor. A portion of the memory may also include non-volatile random access memory; for example, the memory may also store device type information.
In a specific implementation, the electronic device 1000 may execute the implementation manners provided in the steps in fig. 4 through the built-in functional modules, which may specifically refer to the implementation manners provided in the steps, and are not described herein again.
Optionally, the electronic device includes a vehicle-mounted terminal.
In one aspect, an embodiment of the present application provides a vehicle in which the above electronic device is installed.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and is executed by a processor to implement the method provided in each step in fig. 4, which may specifically refer to the implementation manner provided in each step, and is not described herein again.
The computer-readable storage medium may be an internal storage unit of the apparatus or electronic device provided in any of the foregoing embodiments, such as a hard disk or memory of the electronic device. The computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device. The computer-readable storage medium may further include a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), and the like. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the electronic device. The computer-readable storage medium is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the steps of fig. 4.
The terms "first", "second", and the like in the claims and in the description and drawings of the present application are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or electronic device that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or electronic device. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two; to clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not intended to limit the scope of the present application, which is defined by the appended claims.
Claims (12)
1. A deceleration strip detection method is characterized by comprising the following steps:
acquiring vehicle data corresponding to a current time period of a vehicle, wherein the vehicle data comprises running data corresponding to at least one sub time period, and the running data comprises Inertial Measurement Unit (IMU) data and wheel speed data;
inputting the driving data into a classification model, classifying the driving data through the classification model, and obtaining a classification result corresponding to each sub-period, wherein, for one sub-period, the classification result corresponding to the sub-period represents the driving state of the vehicle in the sub-period;
determining a driving state corresponding to each sub-period based on a classification result corresponding to each sub-period;
and determining the running state of the vehicle based on the running state corresponding to each sub-period, wherein the running state is a first state or a second state, the first state represents that the vehicle passes through a deceleration strip, and the second state represents that the vehicle does not pass through the deceleration strip.
2. The method of claim 1, wherein the vehicle data further comprises a timestamp for each sub-period, and wherein determining the driving status of the vehicle based on the driving status for each of the sub-periods comprises:
determining each sub-period of which the corresponding driving state is the first state in each sub-period;
based on the time stamp of each sub-period in which the driving state is the first state, the driving state of each sub-period is adjusted as follows:
if the time length between any two sub-periods of the first state is less than or equal to a first time length, determining the driving state of each sub-period between any two sub-periods as the first state;
determining the driving state of the vehicle based on the adjusted corresponding driving state of each sub-period.
3. The method of claim 2, wherein determining the driving state of the vehicle based on the adjusted driving state for each of the sub-periods comprises:
for a first target period among the adjusted sub-periods, determining a duration corresponding to the first target period according to the timestamps of the sub-periods contained in the first target period, wherein the first target period is at least two consecutive sub-periods whose driving state is a first state;
and if the duration corresponding to the first target period is less than or equal to a second duration, determining the driving state of the vehicle based on the driving states of the sub-periods other than each first target period.
4. The method of claim 1, wherein the IMU data is acquired at a first time interval and the wheel speed data is acquired at a second time interval, and wherein, if the first time interval is not equal to the second time interval, the method further comprises:
aligning the timestamps of the IMU data and the wheel speed data based on the duration corresponding to either the first time interval or the second time interval, to obtain driving data with aligned timestamps;
wherein the inputting the driving data into a classification model comprises:
inputting the driving data with the aligned timestamps into the classification model.
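The claim leaves the alignment scheme open; one common way to align two sensor streams sampled at different rates is to resample the slower stream onto the timestamps of the other by linear interpolation. The function name and the choice of interpolation are assumptions.

```python
# Illustrative sketch of claim-4 timestamp alignment (not part of the claims):
# resample wheel-speed samples onto the IMU timestamp grid.
import numpy as np

def align_wheel_speed_to_imu(imu_ts, wheel_ts, wheel_speed):
    """Linearly interpolate wheel-speed values at each IMU timestamp,
    yielding one wheel-speed sample per IMU sample."""
    return np.interp(imu_ts, wheel_ts, wheel_speed)
```

After alignment, each IMU sample has a wheel-speed value with the same timestamp, so the two streams can be stacked into one driving-data vector per time step.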
5. The method of claim 2, further comprising:
if the driving state of the vehicle is the first state, determining time information of the vehicle passing over the deceleration strip based on the driving state corresponding to each sub-period and the timestamp of each sub-period, wherein the time information comprises at least one of:
a start time;
an end time; and
a duration of the passing.
6. The method of claim 5, wherein the vehicle data further includes initial position information of the vehicle, the method further comprising:
if the driving state of the vehicle is the first state, obtaining map data within a preset range of the initial position information;
determining position information of a deceleration strip in the map data; and
determining target position information of the vehicle according to the determined position information of the deceleration strip.
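A simple way to realize this map-matching step is to take, among the deceleration-strip positions found in the map data within the preset range, the one nearest to the initial position as the vehicle's corrected target position. The 2-D coordinate representation and the `locate_on_strip` name are assumptions.

```python
# Illustrative sketch of the claim-6 position correction (not part of the
# claims): snap the vehicle to the nearest deceleration strip in the map.
import math

def locate_on_strip(initial_pos, strips):
    """initial_pos: (x, y); strips: list of (x, y) deceleration-strip
    positions from the map data. Returns the nearest strip position."""
    return min(strips, key=lambda p: math.dist(initial_pos, p))
```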
7. The method of claim 6, wherein the initial position information is position information of the vehicle whose acquisition time is closest to the time information.
8. The method of claim 1, wherein the classification model is trained on a neural network model by:
acquiring a training data set, wherein the training data set comprises sample driving data items each carrying a label, the label representing the true classification result corresponding to that sample driving data;
training the neural network model on the sample driving data until the loss function corresponding to the neural network model converges, and taking the converged neural network model as the classification model;
wherein the input of the neural network model is each item of sample driving data, the output is the predicted classification result corresponding to that item, and the value of the loss function represents the difference between the predicted classification result and the true classification result for each item of sample driving data.
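The train-until-the-loss-converges loop of claim 8 can be sketched compactly. The patent does not specify the network architecture or loss, so this sketch substitutes logistic regression with a binary cross-entropy loss as a minimal stand-in; the function name, learning rate, and convergence tolerance are all assumptions.

```python
# Illustrative training-loop sketch for claim 8 (not part of the claims).
# Logistic regression stands in for the unspecified neural network.
import numpy as np

def train_classifier(X, y, lr=0.1, tol=1e-6, max_iter=10000):
    """Gradient descent until the loss change falls below `tol`; the
    converged parameters play the role of the claim's classification model."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    prev_loss = np.inf
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # predicted classification
        # Binary cross-entropy: the difference between predicted and true labels.
        loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
        if abs(prev_loss - loss) < tol:              # loss has converged
            break
        prev_loss = loss
        grad = (p - y) / len(y)
        w -= lr * (X.T @ grad)
        b -= lr * grad.sum()
    return w, b
```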
9. The method according to any one of claims 1 to 8, wherein determining the driving state of the vehicle based on the driving state corresponding to each of the sub-periods comprises:
for a second target period among the sub-periods, determining a time length corresponding to the second target period according to the timestamps of the sub-periods contained in the second target period, wherein the second target period is a run of at least two consecutive sub-periods whose driving state is the first state; and
if the time length corresponding to the second target period is greater than a set time length, determining that the driving state of the vehicle is the first state.
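The claim-9 decision rule reduces to: the vehicle is judged to have passed a deceleration strip if any run of at least two consecutive first-state sub-periods spans more than the set time length. A sketch under the same assumed `(timestamp, state)` representation:

```python
# Illustrative sketch of the claim-9 decision rule (not part of the claims).
def vehicle_passed_bump(sub_periods, set_duration, FIRST=1):
    """Return True if any run of >= 2 consecutive first-state sub-periods
    spans more than `set_duration`."""
    times = [t for t, _ in sub_periods]
    states = [s for _, s in sub_periods]
    i, n = 0, len(states)
    while i < n:
        if states[i] == FIRST:
            j = i
            while j + 1 < n and states[j + 1] == FIRST:  # extend the run
                j += 1
            if j > i and times[j] - times[i] > set_duration:
                return True
            i = j + 1
        else:
            i += 1
    return False
```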
10. A deceleration strip detection device, comprising:
a vehicle data acquisition module configured to acquire vehicle data corresponding to a current period of a vehicle, wherein the vehicle data comprises driving data corresponding to at least one sub-period, and the driving data comprises inertial measurement unit (IMU) data and wheel speed data;
a classification result determining module configured to input the driving data into a classification model and classify the driving data through the classification model, obtaining a classification result corresponding to each sub-period, wherein, for one sub-period, the classification result represents the driving state of the vehicle in that sub-period; and
a driving state determining module configured to determine the driving state corresponding to each sub-period based on the classification result corresponding to each sub-period;
the driving state determining module being further configured to determine the driving state of the vehicle based on the driving state corresponding to each sub-period, wherein the driving state is a first state or a second state, the first state indicating that the vehicle passes over a deceleration strip and the second state indicating that the vehicle does not.
11. An electronic device, comprising a processor and a memory connected to each other;
wherein the memory is configured to store a computer program; and
the processor is configured to invoke the computer program to perform the method of any one of claims 1 to 9.
12. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110384644.5A CN113033463B (en) | 2021-04-09 | 2021-04-09 | Deceleration strip detection method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113033463A true CN113033463A (en) | 2021-06-25 |
CN113033463B CN113033463B (en) | 2023-08-01 |
Family
ID=76456228
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110384644.5A Active CN113033463B (en) | 2021-04-09 | 2021-04-09 | Deceleration strip detection method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113033463B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114970705A (en) * | 2022-05-20 | 2022-08-30 | 深圳市有一说一科技有限公司 | Driving state analysis method, device, equipment and medium based on multi-sensing data |
CN115615422A (en) * | 2022-12-20 | 2023-01-17 | 禾多科技(北京)有限公司 | Deceleration strip detection method and device, electronic equipment and computer readable medium |
CN116342111A (en) * | 2023-05-30 | 2023-06-27 | 中汽信息科技(天津)有限公司 | Intelligent transaction method and system for automobile parts based on big data |
CN117171701A (en) * | 2023-08-14 | 2023-12-05 | 陕西天行健车联网信息技术有限公司 | Vehicle running data processing method, device, equipment and medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150291177A1 (en) * | 2014-04-14 | 2015-10-15 | Hyundai Motor Company | Speed bump detection apparatus and navigation data updating apparatus and method using the same |
KR20160127996A (en) * | 2015-04-28 | 2016-11-07 | 영남대학교 산학협력단 | Apparatus for recognization and controlling system a speed bump of Autonomous Driving Vehicle |
US20180334166A1 (en) * | 2017-03-30 | 2018-11-22 | Baidu Usa Llc | Deceleration curb-based direction checking and lane keeping system for autonomous driving vehicles |
FR3090162A1 (en) * | 2018-12-17 | 2020-06-19 | Continental Automotive France | Detection of road retarders by automatic learning |
CN111649740A (en) * | 2020-06-08 | 2020-09-11 | 武汉中海庭数据技术有限公司 | Method and system for high-precision positioning of vehicle based on IMU |
US20200346654A1 (en) * | 2017-06-22 | 2020-11-05 | Nissan Motor Co., Ltd. | Vehicle Information Storage Method, Vehicle Travel Control Method, and Vehicle Information Storage Device |
CN112477851A (en) * | 2020-11-30 | 2021-03-12 | 广州小鹏自动驾驶科技有限公司 | Deceleration strip identification method and device, vehicle and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113033463B (en) | 2023-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113033463B (en) | Deceleration strip detection method and device, electronic equipment and storage medium | |
EP3137355B1 (en) | Device for designating objects to a navigation module of a vehicle equipped with said device | |
US10317240B1 (en) | Travel data collection and publication | |
CN110785719A (en) | Method and system for instant object tagging via cross temporal verification in autonomous vehicles | |
CN110869559A (en) | Method and system for integrated global and distributed learning in autonomous vehicles | |
CN110753953A (en) | Method and system for object-centric stereo vision in autonomous vehicles via cross-modality verification | |
CN113155173B (en) | Perception performance evaluation method and device, electronic device and storage medium | |
CN112328730A (en) | Map data updating method, related device, equipment and storage medium | |
CN115470884A (en) | Platform for perception system development of an autopilot system | |
CN115082690B (en) | Target recognition method, target recognition model training method and device | |
CN112149763A (en) | Method and device for improving road surface abnormity detection by using crowdsourcing concept | |
CN116972860A (en) | Yaw recognition method and device, electronic equipment and storage medium | |
CN115019060A (en) | Target recognition method, and training method and device of target recognition model | |
CN110909656A (en) | Pedestrian detection method and system with integration of radar and camera | |
CN116823884A (en) | Multi-target tracking method, system, computer equipment and storage medium | |
CN116434173A (en) | Road image detection method, device, electronic equipment and storage medium | |
CN113762030A (en) | Data processing method and device, computer equipment and storage medium | |
CN117633519B (en) | Lane change detection method, apparatus, electronic device and storage medium | |
CN118038397B (en) | Image processing method, device, electronic equipment and storage medium | |
KR20200072022A (en) | Apparatus and method for servicing personalized information based on user interest | |
US11879744B2 (en) | Inferring left-turn information from mobile crowdsensing | |
CN117593892B (en) | Method and device for acquiring true value data, storage medium and electronic equipment | |
EP4394664A1 (en) | Automated data generation by neural network ensembles | |
CN116434041A (en) | Mining method, device and equipment for error perception data and automatic driving vehicle | |
WO2024217656A1 (en) | Control device and method for testing and/or updating a program element of a vehicle function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40047312; Country of ref document: HK ||
GR01 | Patent grant | ||