CN115116161A - Vehicle data acquisition method and device, storage medium and vehicle


Info

Publication number
CN115116161A
Authority
CN
China
Prior art keywords
vehicle
frequency
data
recording
target
Prior art date
Legal status
Pending
Application number
CN202210726815.2A
Other languages
Chinese (zh)
Inventor
晏飞
于鸿达
Current Assignee
Xiaomi Automobile Technology Co Ltd
Original Assignee
Xiaomi Automobile Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Xiaomi Automobile Technology Co Ltd filed Critical Xiaomi Automobile Technology Co Ltd
Priority to CN202210726815.2A
Publication of CN115116161A

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 - Registering or indicating the working of vehicles
    • G07C5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 - Registering performance data

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The method determines, according to a driving state parameter of the vehicle, a target recording frequency at which the vehicle records target data, and records the target data at that frequency. The recording frequency of target data such as point cloud data, radar data, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data can thus change dynamically with the driving state parameters. In scenes where specific target data is needed, its recording frequency can be increased so that a sufficient amount of data is available for fault detection or accident scene reconstruction; in scenes where it is not needed, its recording frequency is reduced, lowering the consumption of storage resources.

Description

Vehicle data acquisition method and device, storage medium and vehicle
Technical Field
The present disclosure relates to the field of automatic driving technologies, and in particular, to a vehicle data acquisition method and device, a storage medium, and a vehicle.
Background
A vehicle stores various kinds of data generated while it travels. Generally, the vehicle stores each kind of data at a fixed frequency, and the stored data is overwritten after a preset time period. However, if the vehicle records all data at fixed frequencies, the amount of data may be insufficient to faithfully reproduce an accident scene when one needs to be reconstructed.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a vehicle data acquisition method, apparatus, storage medium, and vehicle.
According to a first aspect of embodiments of the present disclosure, there is provided a vehicle data acquisition method including:
acquiring a driving state parameter of a vehicle;
determining, according to the driving state parameter, a target recording frequency at which the vehicle records target data;
and recording the target data according to the target recording frequency.
Optionally, the determining, according to the driving state parameter, a target recording frequency at which the vehicle records target data includes:
using the driving state parameter as the input of a judgment model to obtain the target recording frequency at which the vehicle records the target data;
wherein the judgment model is obtained by training a machine learning model with historical driving state parameters labeled with recording frequencies.
Optionally, the method further comprises:
and sending the target data to a target terminal according to the target recording frequency.
Optionally, the driving state parameters include weather information, vehicle speed information, position information, lane information, and obstacle information;
the target data includes point cloud data collected by a laser radar, radar data collected by a radio radar, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data.
Optionally, the determining, according to the driving state parameter, a target recording frequency at which the vehicle records target data includes:
in a case that the driving state parameter indicates that the vehicle is in a parking state, determining the frequency at which the vehicle records the vehicle exterior image data as a first image recording frequency, and determining the frequency at which the vehicle records target data other than the vehicle exterior image data as a lowest recording frequency, wherein the first image recording frequency is less than a reference image recording frequency of the vehicle;
when the driving state parameter indicates that the number of obstacles within a preset range of the vehicle is greater than a preset obstacle threshold, determining the frequency at which the vehicle records the vehicle exterior image data as a second image recording frequency, wherein the second image recording frequency is greater than the reference image recording frequency;
and when the driving state parameter indicates that the vehicle is in a night driving state, determining the frequency at which the vehicle records the vehicle exterior image data as a third image recording frequency, wherein the third image recording frequency is less than the reference image recording frequency and greater than the first image recording frequency.
Optionally, the determining, according to the driving state parameter, a target recording frequency at which the vehicle records target data includes:
when the driving state parameter indicates that the confidence of the point cloud data collected by the laser radar is less than a preset confidence threshold, determining the frequency at which the vehicle records the point cloud data as a first point cloud recording frequency, wherein the first point cloud recording frequency is less than a reference point cloud recording frequency of the vehicle;
and when the driving state parameter indicates that the variation amplitude of the driving environment in which the vehicle is located is less than a preset amplitude, determining the frequency at which the vehicle records the point cloud data as a second point cloud recording frequency, wherein the second point cloud recording frequency is less than the reference point cloud recording frequency.
Optionally, the determining, according to the driving state parameter, a target recording frequency at which the vehicle records target data includes:
when the driving state parameter indicates that the vehicle speed of the vehicle is greater than a first vehicle speed threshold, determining the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as corresponding first frequencies;
when the driving state parameter indicates that the vehicle speed of the vehicle is less than a second vehicle speed threshold, determining the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as corresponding second frequencies;
when the driving state parameter indicates that the driving environment in which the vehicle is located causes the vehicle attitude data to be distorted, determining the frequency at which the vehicle records the vehicle attitude data as a third frequency;
wherein the second vehicle speed threshold is less than the first vehicle speed threshold, the second frequency is less than the first frequency, and the third frequency is less than the second frequency.
According to a second aspect of the embodiments of the present disclosure, there is provided a vehicle data acquisition apparatus including:
the acquisition module is configured to acquire a driving state parameter of the vehicle;
a determination module configured to determine, according to the driving state parameter, a target recording frequency at which the vehicle records target data;
and the recording module is configured to record the target data according to the target recording frequency.
According to a third aspect of the embodiments of the present disclosure, there is provided a vehicle including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the steps of the method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method according to the first aspect of the present disclosure.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects: the target recording frequency at which the vehicle records target data is determined according to the driving state parameters of the vehicle, and the target data is recorded at the target recording frequency. The recording frequency of target data such as point cloud data, radar data, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data can thus change dynamically with the driving state parameters. In scenes where specific target data is needed, its recording frequency can be increased so that a sufficient amount of data is available for fault detection or accident scene reconstruction; in scenes where it is not needed, its recording frequency is reduced, lowering the consumption of storage resources.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a vehicle data collection method according to an exemplary embodiment.
FIG. 2 is a block diagram illustrating a vehicle data collection device according to an exemplary embodiment.
FIG. 3 is a functional block diagram of a vehicle according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
It should be noted that all acquisition of signals, information, or data in the present application is performed in compliance with the data protection laws and policies of the relevant jurisdiction and with the authorization of the owner of the corresponding device.
FIG. 1 is a flow chart illustrating a vehicle data collection method according to an exemplary embodiment. As shown in fig. 1, the method is used in a vehicle and may include the following steps.
In step 110, the driving state parameters of the vehicle are acquired.
Here, the driving state parameters may include weather information, vehicle speed information, position information, lane information, and obstacle information. The weather information refers to the weather of the environment in which the vehicle is located and can be obtained from a real-time weather forecast. The vehicle speed information refers to the real-time speed of the vehicle and can be obtained from a speed sensor. The position information refers to the real-time position of the vehicle and can be obtained from a positioning module, for example, via GPS (Global Positioning System) or the BeiDou satellite navigation system. The type of road on which the vehicle is located may be determined based on the position information; for example, it may be determined whether the vehicle is on an expressway, a national road, an urban road, an above-ground parking lot, an underground parking lot, or the like. The lane information includes the number of lanes on the road where the vehicle is currently located (for example, a single lane, two lanes, or three lanes), the position of the lane in which the vehicle is located (for example, the middle lane of three lanes), and the lane type (for example, whether the lane has a guardrail). The obstacle information includes the numbers of other vehicles, pedestrians, and non-motor vehicles within a preset range of the vehicle.
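As a minimal illustrative sketch (not part of the patent text), these parameters could be grouped into a single structure; all field names and types below are assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DrivingState:
    """Hypothetical container for the driving state parameters described above."""
    weather: str                   # e.g. "clear", "rain", "fog", from a weather service
    speed_kmh: float               # real-time vehicle speed from the speed sensor
    position: Tuple[float, float]  # (latitude, longitude) from GPS/BeiDou
    road_type: str                 # e.g. "expressway", "urban", "underground_parking_lot"
    lane_count: int                # number of lanes on the current road
    lane_position: str             # e.g. "middle of three lanes"
    lane_has_guardrail: bool       # lane type
    is_night: bool                 # derived from time or weather information
    n_vehicles: int                # obstacles within a preset range of the vehicle
    n_pedestrians: int
    n_non_motor_vehicles: int
```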
In step 120, a target recording frequency at which the vehicle records target data is determined according to the driving state parameter.
Here, the target data includes point cloud data collected by a laser radar, radar data collected by a radio radar, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data. The vehicle exterior image data is image data of the surroundings of the vehicle captured by camera devices arranged around the vehicle, and the vehicle interior image data is image data of the interior of the vehicle captured by a camera device arranged inside the vehicle. The vehicle body signal data includes a vehicle speed signal, a wheel speed signal, a steering wheel angle signal, and the like, which may be acquired by sensors connected to the vehicle body controller. The vehicle attitude data includes yaw angle, pitch angle, and roll angle information of the vehicle, which may be acquired by a gyroscope or an IMU (Inertial Measurement Unit).
It should be understood that different driving state parameters characterize different types of driving scenes in which the vehicle is located, and the target recording frequencies of the corresponding target data differ accordingly.
In step 130, the target data is recorded according to the target recording frequency.
Here, after the target recording frequency is determined, the vehicle records the target data at the target recording frequency. For different target data, the corresponding target recording frequencies may be the same or different. For example, when the position information indicates that the vehicle is traveling on an expressway, the vehicle may record the vehicle body signal data and the vehicle attitude data at the highest frequency while reducing the frequency at which the point cloud data and the vehicle exterior image data are recorded.
For example, recording the target data at the target recording frequency means that the vehicle stores the target data in memory at the target recording frequency.
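A sketch of what per-channel recording could look like, assuming each kind of target data is a named channel with its own timer; the `read_sample` and `store` callables are hypothetical placeholders for the vehicle's sensor and storage interfaces:

```python
import time

def record_loop(channel_hz, read_sample, store):
    """Store each channel's latest sample at its own target recording frequency.

    channel_hz: dict mapping channel name -> recording frequency in Hz
                (0 Hz disables recording for that channel).
    read_sample(name): returns the latest sample for a channel (hypothetical).
    store(name, sample): writes the sample to the vehicle's memory (hypothetical).
    """
    next_due = {name: 0.0 for name in channel_hz}
    while True:
        now = time.monotonic()
        for name, hz in channel_hz.items():
            if hz > 0 and now >= next_due[name]:
                store(name, read_sample(name))
                next_due[name] = now + 1.0 / hz
        time.sleep(0.001)  # simple polling; a real recorder would be event-driven

# Example frequency table matching the expressway example above: body signal and
# attitude data at the highest rate, point cloud and exterior images reduced.
channel_hz = {"point_cloud": 2, "radar": 15, "exterior_image": 2,
              "interior_image": 30, "body_signal": 100, "attitude": 100}
```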
In this way, the target recording frequency at which the vehicle records target data is determined according to the driving state parameters of the vehicle, and the target data is recorded at the target recording frequency. The recording frequency of target data such as point cloud data, radar data, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data can thus change dynamically with the driving state parameters. In scenes where specific target data is needed, its recording frequency can be increased so that a sufficient amount of data is available for fault detection or accident scene reconstruction; in scenes where it is not needed, its recording frequency is reduced, lowering the consumption of storage resources.
In some possible implementations, the target recording frequency at which the vehicle records target data may be obtained by using the driving state parameter as the input of a judgment model.
Here, after the driving state parameters of the vehicle are acquired, they are input into the judgment model, which outputs the target recording frequency for each kind of target data.
The judgment model is obtained by training a machine learning model with historical driving state parameters labeled with recording frequencies. The judgment model may be a neural network model, such as a convolutional neural network (CNN).
The historical driving state parameters may be driving state parameters collected from vehicles under actual driving conditions. The recording frequency of each kind of target data is then labeled for each historical driving state parameter, and the machine learning model is trained with the labeled historical driving state parameters; the trained machine learning model is the judgment model.
It should be understood that the samples used to train the machine learning model may be different feature vectors [weather information, vehicle speed information, position information, lane information, obstacle information] with corresponding labels [point cloud data recording frequency, radar data recording frequency, vehicle exterior image data recording frequency, vehicle interior image data recording frequency, vehicle body signal data recording frequency, vehicle attitude data recording frequency]. For example, for the historical driving state parameters "clear weather, vehicle speed 90 km/h, located on an expressway, traveling in the middle lane of three lanes, few other vehicles around, no pedestrians or non-motor vehicles", the corresponding recording frequencies might be "point cloud data: 2 Hz, radar data: 15 Hz, vehicle exterior image data: 10 Hz, vehicle interior image data: 30 Hz, vehicle body signal data: 100 Hz, vehicle attitude data: 100 Hz".
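A sketch of such training, assuming the driving state parameters have already been encoded as numeric vectors and scikit-learn is available; the description mentions a CNN, but any supervised multi-output regressor fits it, so a random forest stands in here for brevity:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

# X: encoded historical driving state parameters, one row per sample, e.g.
#    [weather_code, speed_kmh, road_type_code, lane_code, n_obstacles]
# y: labeled recording frequencies in Hz, one row per sample, e.g.
#    [point_cloud, radar, exterior_image, interior_image, body_signal, attitude]
X = np.array([[0, 90.0, 1, 2, 3]])         # the clear-expressway example above
y = np.array([[2, 15, 10, 30, 100, 100]])  # its labeled frequencies

judgment_model = MultiOutputRegressor(RandomForestRegressor(n_estimators=100))
judgment_model.fit(X, y)                   # train on the labeled history

target_frequencies = judgment_model.predict(X)  # per-channel target frequencies
```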
Therefore, by inputting the driving state parameters of the vehicle into the judgment model, the frequency at which the vehicle records each kind of target data can be set in a targeted manner for complex driving road conditions, so that the recording frequencies of the target data meet the requirements of the vehicle's driving scene.
In some implementations, the target data may further be sent to a target terminal according to the target recording frequency.
Here, the target terminal may be a user terminal associated with the vehicle, such as a mobile terminal; it may also be a supervising terminal. While the vehicle records the target data at the target recording frequency, it can also send the target data to the target terminal according to the target recording frequency, so that the target terminal can judge whether the vehicle is abnormal based on the received target data. For example, when the vehicle is in a parking state, the vehicle may acquire vehicle exterior image data at the target recording frequency and send it to the target terminal, so that the target terminal can determine from the vehicle exterior image data whether the vehicle is being stolen.
For example, the vehicle may send the target data to the target terminal at the target recording frequency itself. Alternatively, a new transmission frequency may be derived from the target recording frequency, and the target data sent to the target terminal at that transmission frequency. The transmission frequency is positively correlated with the target recording frequency, so that the target data is sent to the target terminal frequently in abnormal situations and less often in normal situations.
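For instance, the positive correlation could be as simple as a scaled, capped mapping; the scale factor and cap below are illustrative assumptions, not values from the patent:

```python
def transmission_frequency(record_hz: float, scale: float = 0.5,
                           max_hz: float = 30.0) -> float:
    """Derive a transmission frequency that grows with the recording frequency
    (positive correlation), capped by an assumed uplink budget."""
    return min(record_hz * scale, max_hz)

# e.g. a channel recorded at 100 Hz is uploaded at 30 Hz; one recorded at 2 Hz at 1 Hz.
```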
In this way, sending the target data to the target terminal both backs up the target data and allows the target terminal to keep track of the vehicle's condition in time.
In some possible implementations, in step 120, in a case that the driving state parameter indicates that the vehicle is in a parking state, the frequency at which the vehicle records the vehicle exterior image data is determined as a first image recording frequency, and the frequency at which the vehicle records target data other than the vehicle exterior image data is determined as a lowest recording frequency, wherein the first image recording frequency is less than a reference image recording frequency of the vehicle.
Here, the reference image recording frequency of the vehicle may be the default frequency at which the vehicle stores the vehicle exterior image data collected by the exterior cameras, for example 5 Hz.
When the vehicle speed is 0 km/h and the position information indicates that the vehicle is in a parking lot, the vehicle is determined to be in a parking state. At this time, the vehicle stores the vehicle exterior image data at a first image recording frequency that is less than the reference image recording frequency, for example 1 Hz. Meanwhile, the frequency at which the vehicle records the point cloud data, radar data, vehicle interior image data, vehicle body signal data, and vehicle attitude data is the lowest recording frequency, for example 0 Hz.
It is worth mentioning that when the vehicle is in a parking state, it does not need to record point cloud data, radar data, vehicle interior image data, vehicle body signal data, or vehicle attitude data, while a 1 Hz frequency is retained for recording the vehicle exterior image data.
Therefore, when the vehicle is in a parking state, recording of the point cloud data, radar data, vehicle interior image data, vehicle body signal data, and vehicle attitude data is stopped, reducing the vehicle's power consumption and storage resource consumption, while the vehicle exterior image data is still recorded at a low first image recording frequency to ensure the safety of the vehicle.
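Using the example values from this section (1 Hz for exterior images, 0 Hz for everything else), the parking rule could be expressed as a simple frequency table; the channel names are the same hypothetical ones used in the earlier sketches:

```python
# Parking-state frequency table: only exterior images keep a low recording rate.
PARKING_HZ = {
    "exterior_image": 1.0,  # first image recording frequency (< 5 Hz reference)
    "point_cloud": 0.0,     # lowest recording frequency: recording stopped
    "radar": 0.0,
    "interior_image": 0.0,
    "body_signal": 0.0,
    "attitude": 0.0,
}
```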
In some embodiments, when the driving state parameter indicates that the number of obstacles within the preset range of the vehicle is greater than a preset obstacle threshold, the frequency at which the vehicle records the vehicle exterior image data is determined as a second image recording frequency, the second image recording frequency being greater than the reference image recording frequency.
Here, the vehicle may determine the obstacle information from the collected vehicle exterior image data, and when the obstacle information indicates that the number of obstacles within the preset range is greater than the preset obstacle threshold, the vehicle is controlled to record the vehicle exterior image data at the second image recording frequency.
It should be appreciated that a number of obstacles greater than the preset obstacle threshold indicates that the vehicle is traveling under complex road conditions. For example, when the vehicle is traveling on an urban road with many other vehicles, pedestrians, and non-motor vehicles nearby, the vehicle exterior image data is recorded at the second image recording frequency.
The second image recording frequency may be the highest frequency at which the vehicle records vehicle exterior image data, such as 30 Hz. Of course, the second image recording frequency may be set according to actual conditions; the goal is to record the vehicle exterior image data at a higher frequency when the driving conditions are more complex than the normal conditions corresponding to the reference image recording frequency.
Therefore, under complex road conditions, the vehicle collects more vehicle exterior image data, so that when an accident occurs, the truth of the accident can be reconstructed from that data.
In some embodiments, when the driving state parameter indicates that the vehicle is in a night driving state, the frequency at which the vehicle records the vehicle exterior image data is determined as a third image recording frequency, which is less than the reference image recording frequency and greater than the first image recording frequency.
Here, the vehicle may determine whether it is in a night driving state from time information or weather information. When the vehicle is driving at night, the exterior cameras may fail to capture useful vehicle exterior image data because of poor lighting, so the vehicle exterior image data can be recorded at a third image recording frequency that is less than the reference image recording frequency and greater than the first image recording frequency, for example 2 Hz.
It should be understood that recording the vehicle exterior image data at a frequency lower than the reference image recording frequency during night driving essentially reduces the recording frequency when the confidence of the image data captured by the exterior cameras has decreased.
Therefore, when the confidence of the image data captured by the exterior cameras decreases, reducing the frequency at which the vehicle records the vehicle exterior image data saves storage resources for other data that has more reference value for scene reconstruction.
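Taken together with the parking rule above, the exterior-image rules of this and the preceding subsections could be sketched as follows; the obstacle threshold of 10 is an illustrative assumption, while the frequencies are the example values from the text:

```python
def exterior_image_hz(state: "DrivingState", reference_hz: float = 5.0,
                      obstacle_threshold: int = 10) -> float:
    """Exterior-image recording frequency for a moving vehicle
    (the parking case is covered by PARKING_HZ above)."""
    n_obstacles = (state.n_vehicles + state.n_pedestrians
                   + state.n_non_motor_vehicles)
    if n_obstacles > obstacle_threshold:
        return 30.0  # second image recording frequency: complex road conditions
    if state.is_night:
        return 2.0   # third image recording frequency: low light, lower confidence
    return reference_hz
```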
In some embodiments, when the driving state parameter indicates that the confidence of the point cloud data collected by the laser radar is less than a preset confidence threshold, the frequency at which the vehicle records the point cloud data is determined as a first point cloud recording frequency, wherein the first point cloud recording frequency is less than a reference point cloud recording frequency of the vehicle.
Here, the reference point cloud recording frequency may be the default frequency at which the vehicle stores the point cloud data collected by the laser radar, for example 5 Hz.
Whether the confidence of the point cloud data collected by the laser radar is less than the preset confidence threshold can be judged from the weather information acquired by the vehicle. For example, when the weather information indicates rain or fog, the confidence of the point cloud data is determined to be less than the preset confidence threshold.
It should be understood that when the vehicle travels in rain or fog, the laser radar cannot work normally, so the collected point cloud data is inaccurate and its confidence falls below the preset confidence threshold. An accident scene reconstructed from such point cloud data would likewise be inaccurate. Thus, the vehicle may record the point cloud data at a first point cloud recording frequency that is less than the reference point cloud recording frequency, for example 2 Hz.
Therefore, when the confidence of the point cloud data collected by the laser radar is less than the preset confidence threshold, reducing the frequency at which the vehicle records the point cloud data reserves more storage resources for other, more valuable data such as vehicle body signal data and vehicle attitude data.
In some embodiments, when the driving state parameter indicates that the variation amplitude of the driving environment in which the vehicle is located is less than a preset amplitude, the frequency at which the vehicle records the point cloud data is determined as a second point cloud recording frequency, wherein the second point cloud recording frequency is less than the reference point cloud recording frequency.
Here, a variation amplitude of the driving environment below the preset amplitude indicates that the vehicle is traveling in a scene with little change, so the point cloud data collected by the laser radar is largely repetitive.
For example, whether the variation amplitude of the driving environment is less than the preset amplitude may be determined from at least one of the position information of the vehicle, the vehicle exterior image data, and the point cloud data. For instance, when the position information indicates that the vehicle is on an expressway, the variation amplitude of the driving environment is determined to be less than the preset amplitude. As another example, when the vehicle exterior image data and/or point cloud data collected over a continuous preset period are largely repetitive and few vehicles are present, the variation amplitude of the driving environment is determined to be less than the preset amplitude.
It should be understood that when the vehicle travels in a scene whose variation amplitude is below the preset amplitude, a large amount of repetitive point cloud data is collected, so the frequency of recording the point cloud data can be reduced, for example to 2 Hz.
Therefore, by reducing the frequency at which the vehicle records the point cloud data, more storage resources can be reserved for other, more valuable data such as vehicle body signal data and vehicle attitude data.
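Both point cloud rules reduce the frequency to the same example value; a sketch, where the confidence and variation thresholds are illustrative assumptions:

```python
def point_cloud_hz(confidence: float, env_variation: float,
                   reference_hz: float = 5.0,
                   conf_threshold: float = 0.8,
                   variation_threshold: float = 0.2) -> float:
    """Point cloud recording frequency per the two rules above; the 2 Hz and
    5 Hz values are the text's examples, the thresholds are assumptions."""
    if confidence < conf_threshold:
        return 2.0  # first point cloud recording frequency (e.g. rain or fog)
    if env_variation < variation_threshold:
        return 2.0  # second point cloud recording frequency (repetitive scenery)
    return reference_hz
```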
In some embodiments, when the driving state parameter indicates that the vehicle speed is greater than a first vehicle speed threshold, the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data are determined as corresponding first frequencies.
Here, whether the vehicle speed is greater than the first vehicle speed threshold may be determined from the vehicle speed information acquired by the vehicle. The first vehicle speed threshold may be, for example, 60 km/h. It should be appreciated that a vehicle speed greater than the first vehicle speed threshold indicates that the vehicle is driving fast, for example on an expressway.
When the vehicle travels faster than the first vehicle speed threshold, it records the radar data, vehicle body signal data, vehicle attitude data, and vehicle interior image data at their corresponding first frequencies. A first frequency may be the highest frequency at which the corresponding target data is recorded. For example, the first frequency for radar data is 15 Hz, for vehicle body signal data 100 Hz, for vehicle attitude data 100 Hz, and for vehicle interior image data 30 Hz.
It is worth noting that when the vehicle is traveling at high speed, the data most important for scene reconstruction are the radar data, vehicle body signal data, vehicle attitude data, and vehicle interior image data, so these are recorded at the higher first frequencies.
In some embodiments, when the driving state parameter indicates that the vehicle speed is less than a second vehicle speed threshold, the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data are determined as corresponding second frequencies.
Here, the second vehicle speed threshold is less than the first vehicle speed threshold and may be, for example, 40 km/h. A vehicle speed below the second vehicle speed threshold indicates that the vehicle is driving slowly or stuck in traffic. At this time, the vehicle attitude, vehicle body signals, radar data, and interior images change little, so the frequencies of recording the radar data, vehicle body signal data, vehicle attitude data, and vehicle interior image data can be reduced.
Accordingly, when the driving state parameter indicates that the vehicle speed is less than the second vehicle speed threshold, the radar data, vehicle body signal data, vehicle attitude data, and vehicle interior image data may be recorded at second frequencies that are less than the first frequencies. For example, radar data is recorded at 5 Hz, vehicle body signal data at 20 Hz, vehicle attitude data at 20 Hz, and vehicle interior image data at 5 Hz.
In some embodiments, when the driving state parameter indicates that the driving environment in which the vehicle is located causes the vehicle attitude data to be distorted, the frequency at which the vehicle records the vehicle attitude data is determined as a third frequency.
Here, whether the driving environment causes distortion of the vehicle attitude data may be determined from the position information of the vehicle. For example, when the vehicle travels in an underground parking lot, the collected vehicle attitude data may be distorted, and the vehicle attitude data may then be recorded at a third frequency that is less than the second frequency, for example 10 Hz.
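A sketch combining the speed and distortion rules of the last three subsections, using the example thresholds (60 km/h and 40 km/h) and frequencies from the text; the behavior between the two thresholds is not specified by the description, so this sketch simply keeps the second frequencies there as an assumption:

```python
FIRST_HZ  = {"radar": 15, "body_signal": 100, "attitude": 100, "interior_image": 30}
SECOND_HZ = {"radar": 5,  "body_signal": 20,  "attitude": 20,  "interior_image": 5}

def motion_channel_hz(speed_kmh: float, attitude_distorted: bool) -> dict:
    """Frequencies for radar, body signal, attitude, and interior image data."""
    if speed_kmh > 60:
        hz = dict(FIRST_HZ)   # fast driving: record at the highest rates
    else:
        hz = dict(SECOND_HZ)  # slow driving/congestion (assumed for 40-60 km/h too)
    if attitude_distorted:
        hz["attitude"] = 10   # third frequency, e.g. in an underground parking lot
    return hz
```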
It is worth noting that the essence of the above embodiments is to determine the type of driving scene from the driving state parameters of the vehicle, and then determine the target recording frequency at which the vehicle records each kind of target data from that scene type: the recording frequency is increased for target data that is more valuable in the current scene type, so that more of it is recorded, and reduced for target data that is less valuable, so that less of it is recorded.
Of course, in practical applications, the target recording frequency of the target data may be determined by combining one or more of the conditions set forth in the above embodiments.
For example, when the vehicle is driving at night at a speed greater than the first vehicle speed threshold, the night is clear, the vehicle is on an expressway, and few vehicles are around, the vehicle exterior image data captured by the exterior cameras is of little value, while the point cloud data collected by the laser radar records the environment well. The frequency of recording the vehicle exterior image data can therefore be reduced and the frequency of recording the point cloud data increased. Moreover, since the vehicle speed is greater than the first vehicle speed threshold, the radar data, vehicle body signal data, vehicle attitude data, and vehicle interior image data may be recorded at the first frequencies. For example: vehicle exterior image data at 2 Hz, point cloud data at 10 Hz, radar data at 15 Hz, vehicle attitude data at 100 Hz, vehicle body signal data at 100 Hz, and vehicle interior image data at 30 Hz.
As another example, when the vehicle is driving in the daytime at a speed greater than the first vehicle speed threshold, the weather is clear, the vehicle is on an expressway, and few vehicles are around, the variation amplitude of the surrounding environment is less than the preset amplitude; the collected vehicle exterior image data and point cloud data are largely repetitive and of little reference value for scene reconstruction, while the vehicle interior image data, vehicle body signal data, and vehicle attitude data are more valuable. Therefore, the vehicle records the vehicle exterior image data at 10 Hz, the point cloud data at 2 Hz, the radar data at 15 Hz, the vehicle attitude data at 100 Hz, the vehicle body signal data at 100 Hz, and the vehicle interior image data at 30 Hz.
As another example, when the vehicle is driving in the daytime at a speed less than the second vehicle speed threshold, the weather is clear, the vehicle is on an expressway, and many other vehicles are present, the host vehicle is in a complex environment with many other vehicles, so more information about the other vehicles should be recorded and the frequency of recording the vehicle exterior image data can be increased. Because the vehicle speed is slow, the vehicle's own position, body signals, interior conditions, and attitude change little, so the frequencies of recording the vehicle interior image data, vehicle body signal data, and vehicle attitude data can be reduced. For example, the vehicle records the vehicle exterior image data at 20 Hz, the point cloud data at 2 Hz, the radar data at 5 Hz, the vehicle attitude data at 20 Hz, the vehicle body signal data at 20 Hz, and the vehicle interior image data at 5 Hz.
As another example, when the vehicle is driving in the daytime at a speed greater than the second vehicle speed threshold, the weather is clear, the vehicle is on an urban road in a two-way lane without a guardrail, and many obstacles (other vehicles, pedestrians, non-motor vehicles) are present, the vehicle is in complex driving road conditions. The vehicle then records the point cloud data, radar data, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data at the highest frequencies to collect as much data as possible.
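For illustration only, the first three worked examples above can be collected into a preset table (the fourth scenario records every channel at its highest frequency); the scenario names are hypothetical:

```python
# Per-channel recording frequencies in Hz for the worked examples above.
SCENARIO_HZ = {
    "night_expressway_fast": {"exterior_image": 2, "point_cloud": 10, "radar": 15,
                              "attitude": 100, "body_signal": 100, "interior_image": 30},
    "day_expressway_fast":   {"exterior_image": 10, "point_cloud": 2, "radar": 15,
                              "attitude": 100, "body_signal": 100, "interior_image": 30},
    "day_expressway_slow":   {"exterior_image": 20, "point_cloud": 2, "radar": 5,
                              "attitude": 20, "body_signal": 20, "interior_image": 5},
}
```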
FIG. 2 is a block diagram illustrating a vehicle data collection device according to an exemplary embodiment. Referring to fig. 2, the apparatus 200 includes:
an obtaining module 210 configured to obtain a driving state parameter of a vehicle;
a determining module 220 configured to determine, according to the driving state parameter, a target recording frequency at which the vehicle records target data;
a recording module 230 configured to record the target data according to the target recording frequency.
Optionally, the determining module 220 is specifically configured to:
use the driving state parameter as the input of a judgment model to obtain the target recording frequency at which the vehicle records the target data;
wherein the judgment model is obtained by training a machine learning model with historical driving state parameters labeled with recording frequencies.
Optionally, the apparatus 200 further comprises:
and the sending module is configured to send the target data to a target terminal according to the target recording frequency.
Optionally, the driving state parameters include weather information, vehicle speed information, position information, lane information, and obstacle information;
the target data includes point cloud data collected by a laser radar, radar data collected by a radio radar, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data.
Optionally, the determining module 220 is specifically configured to:
in a case that the driving state parameter indicates that the vehicle is in a parking state, determine the frequency at which the vehicle records the vehicle exterior image data as a first image recording frequency, and determine the frequency at which the vehicle records target data other than the vehicle exterior image data as a lowest recording frequency, wherein the first image recording frequency is less than a reference image recording frequency of the vehicle;
when the driving state parameter indicates that the number of obstacles within a preset range of the vehicle is greater than a preset obstacle threshold, determine the frequency at which the vehicle records the vehicle exterior image data as a second image recording frequency, wherein the second image recording frequency is greater than the reference image recording frequency;
when the driving state parameter indicates that the vehicle is in a night driving state, determine the frequency at which the vehicle records the vehicle exterior image data as a third image recording frequency, wherein the third image recording frequency is less than the reference image recording frequency and greater than the first image recording frequency.
Optionally, the determining module 220 is specifically configured to:
when the driving state parameter indicates that the confidence of the point cloud data collected by the laser radar is less than a preset confidence threshold, determine the frequency at which the vehicle records the point cloud data as a first point cloud recording frequency, wherein the first point cloud recording frequency is less than a reference point cloud recording frequency of the vehicle;
and when the driving state parameter indicates that the variation amplitude of the driving environment in which the vehicle is located is less than a preset amplitude, determine the frequency at which the vehicle records the point cloud data as a second point cloud recording frequency, wherein the second point cloud recording frequency is less than the reference point cloud recording frequency.
Optionally, the determining module 220 is specifically configured to:
when the driving state parameter indicates that the vehicle speed of the vehicle is greater than a first vehicle speed threshold, determine the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as corresponding first frequencies;
when the driving state parameter indicates that the vehicle speed of the vehicle is less than a second vehicle speed threshold, determine the frequencies at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as corresponding second frequencies;
when the driving state parameter indicates that the driving environment in which the vehicle is located causes the vehicle attitude data to be distorted, determine the frequency at which the vehicle records the vehicle attitude data as a third frequency;
wherein the second vehicle speed threshold is less than the first vehicle speed threshold, the second frequency is less than the first frequency, and the third frequency is less than the second frequency.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the vehicle data acquisition method provided by the present disclosure.
Referring to fig. 3, fig. 3 is a functional block diagram of a vehicle according to an exemplary embodiment. The vehicle 600 may be configured in a fully or partially autonomous driving mode. For example, the vehicle 600 may acquire environmental information of its surroundings through the sensing system 620 and derive an automatic driving strategy based on an analysis of the surrounding environmental information to implement full automatic driving, or present the analysis result to the user to implement partial automatic driving.
Vehicle 600 may include various subsystems such as infotainment system 610, perception system 620, decision control system 630, drive system 640, and computing platform 650. Alternatively, vehicle 600 may include more or fewer subsystems, and each subsystem may include multiple components. In addition, each of the sub-systems and components of the vehicle 600 may be interconnected by wire or wirelessly.
In some embodiments, the infotainment system 610 may include a communication system 611, an entertainment system 612, and a navigation system 613.
The communication system 611 may comprise a wireless communication system that communicates wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication. The wireless communication system may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols are also possible, such as various vehicular communication systems; for example, the wireless communication system may include one or more Dedicated Short Range Communications (DSRC) devices for public and/or private data communication between vehicles and/or roadside stations.
The entertainment system 612 may include a display device, a microphone, and speakers. Based on the entertainment system, a user may listen to the radio or play music in the car; alternatively, a mobile phone may connect to the vehicle and project its screen onto the display device. The display device may be a touch screen, which the user can operate by touch.
In some cases, the user's voice signal may be captured through the microphone, and certain controls of the vehicle 600, such as adjusting the in-car temperature, may be carried out according to an analysis of the voice signal. In other cases, music may be played to the user through the speakers.
The navigation system 613 may include a map service provided by a map provider to offer route navigation for the vehicle 600, and it may be used in conjunction with the vehicle's global positioning system 621 and inertial measurement unit 622. The map provided by the map service can be a two-dimensional map or a high-precision map.
The sensing system 620 may include several types of sensors that sense information about the environment surrounding the vehicle 600. For example, the sensing system 620 may include a global positioning system 621 (which may be GPS, BeiDou, or another positioning system), an inertial measurement unit (IMU) 622, a laser radar 623, a millimeter wave radar 624, an ultrasonic radar 625, and a camera device 626. The sensing system 620 may also include sensors that monitor internal systems of the vehicle 600 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function for the safe operation of the vehicle 600.
Global positioning system 621 is used to estimate the geographic location of vehicle 600.
The inertial measurement unit 622 is used to sense a pose change of the vehicle 600 based on the inertial acceleration. In some embodiments, inertial measurement unit 622 may be a combination of accelerometers and gyroscopes.
Lidar 623 utilizes laser light to sense objects in the environment in which vehicle 600 is located. In some embodiments, lidar 623 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The millimeter-wave radar 624 utilizes radio signals to sense objects within the surrounding environment of the vehicle 600. In some embodiments, in addition to sensing objects, the millimeter-wave radar 624 may also be used to sense the speed and/or heading of objects.
The ultrasonic radar 625 may sense objects around the vehicle 600 using ultrasonic signals.
The camera device 626 is used to capture image information of the surroundings of the vehicle 600. The camera device 626 may include a monocular camera, a binocular camera, a structured light camera, a panoramic camera, and the like, and the image information it acquires may include still images or video streams.
The decision control system 630 includes a computing system 631 that makes analytical decisions based on the information acquired by the sensing system 620. The decision control system 630 further includes a vehicle control unit 632 that controls the powertrain of the vehicle 600, as well as a steering system 633, a throttle 634, and a brake system 635 for controlling the vehicle 600.
The computing system 631 may process and analyze the various information acquired by the sensing system 620 to identify targets, objects, and/or features in the environment surrounding the vehicle 600. The targets may include pedestrians or animals, and the objects and/or features may include traffic signals, road boundaries, and obstacles. The computing system 631 may use techniques such as object recognition algorithms, Structure from Motion (SFM) algorithms, and video tracking. In some embodiments, the computing system 631 may be used to map the environment, track objects, estimate the speed of objects, and so forth. The computing system 631 may analyze the acquired information and derive a control strategy for the vehicle.
The vehicle control unit 632 may coordinate control of the vehicle's power battery and engine 641 to improve the power performance of the vehicle 600.
The steering system 633 is operable to adjust the heading of the vehicle 600; in one embodiment, for example, it may be a steering wheel system.
The throttle 634 is used to control the operating speed of the engine 641 and thus the speed of the vehicle 600.
The brake system 635 is used to control the deceleration of the vehicle 600. The brake system 635 may use friction to slow the wheels 644. In some embodiments, the brake system 635 may convert the kinetic energy of the wheels 644 into electric current. The brake system 635 may also take other forms to slow the rotational speed of the wheels 644 so as to control the speed of the vehicle 600.
The drive system 640 may include components that provide powered motion to the vehicle 600. In one embodiment, the drive system 640 may include an engine 641, an energy source 642, a transmission 643, and wheels 644. The engine 641 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine consisting of a gasoline engine and an electric motor, a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 641 converts the energy source 642 into mechanical energy.
Examples of energy sources 642 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 642 may also provide energy to other systems of the vehicle 600.
The transmission 643 may transmit mechanical power from the engine 641 to the wheels 644. The transmission 643 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 643 may also include other devices, such as clutches. Wherein the drive shaft may include one or more axles that may be coupled to one or more wheels 644.
Some or all of the functionality of the vehicle 600 is controlled by the computing platform 650. Computing platform 650 can include at least one processor 651, which processor 651 can execute instructions 653 stored in a non-transitory computer-readable medium, such as memory 652. In some embodiments, the computing platform 650 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 600 in a distributed manner.
The processor 651 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor 651 may include a graphics processing unit (GPU), a field programmable gate array (FPGA), a system on chip (SOC), an application-specific integrated circuit (ASIC), or a combination thereof. Although fig. 3 functionally illustrates the processor, memory, and other elements of a computer in the same block, those skilled in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard drive or another storage medium located in a housing different from that of the computer. Thus, references to a processor or computer are to be understood as including references to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only computations related to that component's specific function.
In the disclosed embodiment, the processor 651 may perform the vehicle data collection method described above.
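By way of a non-limiting illustration, the three steps of the method, acquiring a driving state parameter, determining a target recording frequency, and recording target data at that frequency, could be organized as in the following Python sketch. The channel names, reference frequencies, and rules below are assumptions made for illustration, not the disclosed implementation.

from dataclasses import dataclass, field
from typing import Callable, Dict

# Hypothetical reference recording frequencies (Hz) per data channel; the
# disclosure does not give concrete values.
REFERENCE_HZ = {"exterior_image": 30.0, "point_cloud": 10.0, "body_signal": 50.0}

def decide_frequency(state: Dict, channel: str) -> float:
    """Step 2: map the driving state parameter to a target recording frequency."""
    base = REFERENCE_HZ[channel]
    if state.get("parked"):
        # While parked, keep a reduced exterior-image frequency and the lowest
        # frequency for the remaining channels (in the spirit of claim 5).
        return 0.2 * base if channel == "exterior_image" else 0.05 * base
    if channel == "exterior_image" and state.get("obstacle_count", 0) > 5:
        return 2.0 * base  # dense-obstacle scene: record exterior frames faster
    return base

@dataclass
class Recorder:
    """Step 3: record each channel no faster than its target frequency."""
    last_sample: Dict[str, float] = field(default_factory=dict)

    def tick(self, state: Dict, now: float, sample: Callable[[str], None]) -> None:
        for channel in REFERENCE_HZ:
            hz = decide_frequency(state, channel)
            if now - self.last_sample.get(channel, float("-inf")) >= 1.0 / hz:
                sample(channel)  # persist one sample for this channel
                self.last_sample[channel] = now

Calling Recorder.tick periodically with a freshly acquired driving state parameter (step 1) makes each channel's effective recording frequency track the driving state, which is the dynamic behavior described above.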
In various aspects described herein, the processor 651 may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the memory 652 may contain instructions 653 (e.g., program logic), which instructions 653 may be executed by the processor 651 to perform various functions of the vehicle 600. The memory 652 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the infotainment system 610, the perception system 620, the decision control system 630, the drive system 640.
In addition to instructions 653, memory 652 may also store data such as road maps, route information, the location, direction, speed, and other such vehicle data of the vehicle, as well as other information. Such information may be used by the vehicle 600 and the computing platform 650 during operation of the vehicle 600 in autonomous, semi-autonomous, and/or manual modes.
Computing platform 650 may control functions of vehicle 600 based on inputs received from various subsystems (e.g., drive system 640, perception system 620, and decision control system 630). For example, computing platform 650 may utilize input from decision control system 630 in order to control steering system 633 to avoid obstacles detected by perception system 620. In some embodiments, the computing platform 650 is operable to provide control over many aspects of the vehicle 600 and its subsystems.
Optionally, one or more of these components described above may be mounted or associated separately from the vehicle 600. For example, the memory 652 may exist partially or completely separate from the vehicle 600. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example; in practical applications, components in the above modules may be added or removed according to actual needs, and fig. 3 should not be construed as limiting the embodiments of the present disclosure.
An autonomous automobile traveling on a roadway, such as the vehicle 600 above, may identify objects within its surrounding environment to determine an adjustment to its current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and separation from the vehicle, may be used to determine the speed to which the autonomous vehicle is to be adjusted.
Optionally, the vehicle 600 or a sensory and computing device associated with the vehicle 600 (e.g., computing system 631, computing platform 650) may predict the behavior of an identified object based on the characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, because the behaviors of the identified objects may depend on one another, the behavior of a single identified object may also be predicted by considering all of the identified objects together. The vehicle 600 is able to adjust its speed based on the predicted behavior of the identified objects. In other words, the autonomous vehicle is able to determine what steady state the vehicle needs to adjust to (e.g., accelerate, decelerate, or stop) based on the predicted behavior of the object. In this process, other factors may also be considered to determine the speed of the vehicle 600, such as the lateral position of the vehicle 600 in the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
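As a rough, hypothetical illustration of such a speed decision, the predicted behavior of surrounding objects might be reduced to a predicted minimum gap and combined with road curvature; none of the names or thresholds below come from the disclosure.

def adjust_speed(current_speed: float,
                 predicted_min_gap_m: float,
                 road_curvature: float,
                 speed_limit: float) -> float:
    """Return the steady-state speed (m/s) the vehicle should adjust toward."""
    if predicted_min_gap_m < 5.0:
        return 0.0                                    # predicted gap too small: stop
    if predicted_min_gap_m < 20.0:
        return min(current_speed * 0.5, speed_limit)  # decelerate
    # Cap speed on tighter curves, then accelerate gently toward the limit.
    curve_cap = speed_limit / (1.0 + 10.0 * abs(road_curvature))
    return min(current_speed + 2.0, curve_cap, speed_limit)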
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 600 to cause the autonomous vehicle to follow a given trajectory and/or maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., vehicles in adjacent lanes on the road).
The vehicle 600 may be any type of vehicle, such as a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a recreational vehicle, or a train, and the disclosed embodiments are not particularly limited in this regard.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable device, the computer program having code portions for performing the above-mentioned vehicle data acquisition method when executed by the programmable device.
In another exemplary embodiment, there is also provided a chip comprising a processor and an interface; the processor is configured to read instructions to perform the vehicle data acquisition method described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A vehicle data collection method, comprising:
acquiring a driving state parameter of a vehicle;
determining a target recording frequency of the vehicle recording target data according to the driving state parameter;
and recording the target data according to the target recording frequency.
2. The method of claim 1, wherein determining a target recording frequency of the vehicle recording target data according to the driving state parameter comprises:
using the driving state parameter as an input to a judgment model to obtain the target recording frequency at which the vehicle records the target data;
wherein the judgment model is obtained by training a machine learning model with historical driving state parameters labeled with recording frequencies.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
and sending the target data to a target terminal according to the target recording frequency.
4. The method according to claim 1 or 2, characterized in that the driving state parameter includes weather information, vehicle speed information, position information, lane information, and obstacle information;
the target data includes point cloud data collected by a laser radar, radar data collected by a radio radar, vehicle exterior image data, vehicle interior image data, vehicle body signal data, and vehicle attitude data.
5. The method of claim 4, wherein determining a target recording frequency of the vehicle recording target data according to the driving state parameter comprises:
in a case where the driving state parameter indicates that the vehicle is in a parking state, determining the frequency at which the vehicle records the vehicle exterior image data as a first image recording frequency, and determining the frequency at which the vehicle records target data other than the vehicle exterior image data as a lowest recording frequency, wherein the first image recording frequency is less than a reference image recording frequency of the vehicle;
when the driving state parameter indicates that the number of obstacles within a preset range of the vehicle is greater than a preset obstacle threshold, determining the frequency at which the vehicle records the vehicle exterior image data as a second image recording frequency, wherein the second image recording frequency is greater than the reference image recording frequency;
when the driving state parameter indicates that the vehicle is in a night driving state, determining the frequency at which the vehicle records the vehicle exterior image data as a third image recording frequency, wherein the third image recording frequency is less than the reference image recording frequency and greater than the first image recording frequency.
6. The method of claim 4, wherein determining a target recording frequency of the vehicle recording target data according to the driving state parameter comprises:
when the driving state parameter indicates that a confidence of the point cloud data collected by the laser radar is less than a preset confidence threshold, determining the frequency at which the vehicle records the point cloud data as a first point cloud recording frequency, wherein the first point cloud recording frequency is less than a reference point cloud recording frequency of the vehicle;
and when the driving state parameter indicates that a variation amplitude of the driving environment in which the vehicle is located is less than a preset amplitude, determining the frequency at which the vehicle records the point cloud data as a second point cloud recording frequency, wherein the second point cloud recording frequency is less than the reference point cloud recording frequency.
7. The method of claim 4, wherein determining a target recording frequency of the vehicle recording target data according to the driving state parameter comprises:
when the driving state parameter indicates that the vehicle speed of the vehicle is greater than a first vehicle speed threshold, determining the frequency at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as a corresponding first frequency;
when the driving state parameter indicates that the vehicle speed of the vehicle is less than a second vehicle speed threshold, determining the frequency at which the vehicle records the radar data, the vehicle body signal data, the vehicle attitude data, and the vehicle interior image data as a corresponding second frequency;
when the driving state parameter indicates that the driving environment in which the vehicle is located causes the vehicle attitude data to be distorted, determining the frequency at which the vehicle records the vehicle attitude data as a third frequency;
wherein the second vehicle speed threshold is less than the first vehicle speed threshold, the second frequency is less than the first frequency, and the third frequency is less than the second frequency.
8. A vehicle data collection device, comprising:
an acquisition module configured to acquire a driving state parameter of a vehicle;
a determination module configured to determine a target recording frequency of the vehicle recording target data according to the driving state parameter;
and the recording module is configured to record the target data according to the target recording frequency.
9. A vehicle, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method of any one of claims 1 to 7.
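For illustration only, and not as part of the claims, the judgment model recited in claim 2 could be a regression model fitted to historical driving state parameters labeled with recording frequencies. The feature set, values, and model family below are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Historical driving state parameters, one row per sample.
# Columns: [vehicle_speed_kph, obstacle_count, is_night, is_parked]
X_hist = np.array([
    [120.0, 2, 0, 0],
    [ 30.0, 8, 0, 0],
    [  0.0, 0, 1, 1],
    [ 60.0, 1, 1, 0],
])
# Each historical sample is labeled with the recording frequency (Hz) used.
y_hist = np.array([50.0, 60.0, 1.0, 20.0])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_hist, y_hist)

# At run time, the current driving state parameter yields a target frequency.
current_state = np.array([[45.0, 6, 0, 0]])
target_hz = float(model.predict(current_state)[0])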
CN202210726815.2A 2022-06-23 2022-06-23 Vehicle data acquisition method and device, storage medium and vehicle Pending CN115116161A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210726815.2A CN115116161A (en) 2022-06-23 2022-06-23 Vehicle data acquisition method and device, storage medium and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210726815.2A CN115116161A (en) 2022-06-23 2022-06-23 Vehicle data acquisition method and device, storage medium and vehicle

Publications (1)

Publication Number Publication Date
CN115116161A true CN115116161A (en) 2022-09-27

Family

ID=83329358

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210726815.2A Pending CN115116161A (en) 2022-06-23 2022-06-23 Vehicle data acquisition method and device, storage medium and vehicle

Country Status (1)

Country Link
CN (1) CN115116161A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009099033A (en) * 2007-10-18 2009-05-07 Denso Corp Vehicle peripheral image photographing controller and program used therefor
US20180130271A1 (en) * 2016-11-04 2018-05-10 Hitachi, Ltd. Vehicle Operation Data Collection Apparatus, Vehicle Operation Data Collection System, and Vehicle Operation Data Collection Method
CN110930689A (en) * 2018-09-20 2020-03-27 比亚迪股份有限公司 Road condition data dynamic acquisition method and device based on automobile data recorder
CN109857002A (en) * 2019-01-15 2019-06-07 北京百度网讯科技有限公司 Collecting method, device, equipment and computer readable storage medium
CN111986347A (en) * 2020-07-20 2020-11-24 汉海信息技术(上海)有限公司 Device management method, device, electronic device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117155737A (en) * 2023-10-30 2023-12-01 泉州市搏浪科技集团有限公司 Vehicle data acquisition and analysis system based on CAN bus
CN117155737B (en) * 2023-10-30 2024-06-04 泉州市搏浪科技集团有限公司 Vehicle data acquisition and analysis system based on CAN bus

Similar Documents

Publication Publication Date Title
CN113128303A (en) Automatic driving method, related equipment and computer readable storage medium
WO2021065626A1 (en) Traffic control system, traffic control method, and control device
CN115042821B (en) Vehicle control method, vehicle control device, vehicle and storage medium
CN115123257B (en) Pavement deceleration strip position identification method and device, vehicle, storage medium and chip
CN114779790B (en) Obstacle recognition method and device, vehicle, server, storage medium and chip
CN115100377A (en) Map construction method and device, vehicle, readable storage medium and chip
CN115035494A (en) Image processing method, image processing device, vehicle, storage medium and chip
CN115202234B (en) Simulation test method and device, storage medium and vehicle
CN112654547A (en) Driving reminding method, device and system
CN114842440B (en) Automatic driving environment sensing method and device, vehicle and readable storage medium
CN115056784B (en) Vehicle control method, device, vehicle, storage medium and chip
CN115042814A (en) Traffic light state identification method and device, vehicle and storage medium
CN115871523A (en) Battery heating method, device, vehicle, readable storage medium and chip
CN115221151A (en) Vehicle data transmission method and device, vehicle, storage medium and chip
CN115203457A (en) Image retrieval method, image retrieval device, vehicle, storage medium and chip
CN115205848A (en) Target detection method, target detection device, vehicle, storage medium and chip
CN115100630A (en) Obstacle detection method, obstacle detection device, vehicle, medium, and chip
CN115116161A (en) Vehicle data acquisition method and device, storage medium and vehicle
CN115115707B (en) Vehicle falling water detection method, vehicle, computer readable storage medium and chip
CN115139946B (en) Vehicle falling water detection method, vehicle, computer readable storage medium and chip
CN114822216B (en) Method and device for generating parking space map, vehicle, storage medium and chip
CN115063639B (en) Model generation method, image semantic segmentation device, vehicle and medium
CN114852092B (en) Steering wheel hands-off detection method and device, readable storage medium and vehicle
CN114802435B (en) Vehicle control method, device, vehicle, storage medium and chip
CN115407344B (en) Grid map creation method, device, vehicle and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination