CN114492679A - Vehicle data processing method and device, electronic equipment and medium - Google Patents


Info

Publication number: CN114492679A
Application number: CN202210401134.9A
Authority: CN (China)
Legal status: Granted; Active
Prior art keywords: vehicle; data; fusion; acquisition; determining
Other languages: Chinese (zh)
Other versions: CN114492679B
Inventors: 汪骏, 张富国, 董红荣
Assignee (current and original): Beijing China Tsp Technology Co ltd
Application filed by Beijing China Tsp Technology Co ltd; priority to CN202210401134.9A; application granted; publication of CN114492679A and CN114492679B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/25: Fusion techniques
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867: Combination of radar systems with cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides a vehicle data processing method, apparatus, electronic device, and medium, applied to a processing system comprising a processing platform and multiple fusion acquisition devices, each of which comprises an integrally packaged radar sensor and camera. The method comprises: determining the acquisition priority of each fusion acquisition device according to the vehicle driving data, and controlling the fusion acquisition devices to acquire radar data and images according to that priority, with devices of different priorities sampling at different frequencies; fusing the radar data and images collected at different sampling moments to obtain target information for at least one target object within a preset range around the vehicle, for use by the intelligent driving system; and determining a target image from the acquired images and adding reminder marks for target objects that meet a marking condition, producing a composite image for the user to browse. Computing resources are thereby allocated more reasonably, and the real-time performance of sensor data processing is improved.

Description

Vehicle data processing method and device, electronic equipment and medium
Technical Field
The application relates to the field of intelligent driving, and in particular to a vehicle data processing method and apparatus, an electronic device, and a medium.
Background
Intelligent driving is an important direction in today's automotive development, and driver-assistance systems such as lane keeping and brake assist have already achieved solid research results and practical application. When making decisions, an intelligent driving system determines its control strategy mainly by analyzing and processing large amounts of sensor data collected by various sensors such as cameras, ultrasonic sensors, millimeter-wave radar, and lidar. Intelligent driving places high demands on the immediacy of sensor data processing, so a more reasonable sensor data processing strategy is needed.
Disclosure of Invention
In view of this, an object of the present application is to provide a vehicle data processing method, apparatus, electronic device, and medium that adjust the sensor data acquisition and processing strategy according to actual requirements, so as to allocate computing resources more reasonably and improve the real-time performance of sensor data processing.
The vehicle data processing method is applied to a processing system comprising a processing platform and multiple kinds of fusion acquisition devices arranged on a vehicle. Each fusion acquisition device comprises an integrally packaged radar sensor and camera, and different kinds of fusion acquisition devices differ in the type of radar sensor and/or the type of camera. The method comprises the following steps:
determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data, and controlling the corresponding fusion acquisition devices to acquire radar data and images within a preset range around the vehicle according to the acquisition priority, with devices of different acquisition priorities sampling at different frequencies;
fusing the radar data and images from the fusion acquisition devices at different sampling moments to obtain target information for at least one target object within the preset range around the vehicle, the target information being provided to the intelligent driving system;
determining a target image from the images acquired by the various fusion acquisition devices, and adding a reminder mark to the target image for each target object whose target information meets a marking condition, to obtain a composite image.
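As an illustrative sketch only (not part of the application's claims), the marking condition in the step above could be a simple distance threshold; the field names and the 50 m value are assumptions:

```python
def select_marks(target_infos, max_distance_m=50.0):
    """Return the target objects whose target information meets an
    assumed marking condition (here: closer than max_distance_m)."""
    return [t for t in target_infos if t["distance"] <= max_distance_m]

# A pedestrian at 12 m would be marked on the target image; a car at 80 m would not.
marks = select_marks([{"id": "ped", "distance": 12.0},
                      {"id": "car", "distance": 80.0}])
```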
In some embodiments, determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data, and controlling the corresponding fusion acquisition devices to acquire radar data and images within the preset range around the vehicle according to the acquisition priority, comprises the following steps:
acquiring the vehicle's speed data and steering wheel angle data as the vehicle driving data, and determining the acquisition priority corresponding to each fusion acquisition device from the speed data and steering wheel angle data;
and determining the sampling frequency of each fusion acquisition device from its acquisition priority, and controlling the fusion acquisition devices to acquire radar data and images within the preset range around the vehicle at their respective sampling frequencies.
In some embodiments, determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data comprises the following steps:
determining the speed level corresponding to the vehicle speed data and the angle level corresponding to the steering wheel angle data, where different speed levels correspond to different vehicle speed threshold ranges and different angle levels correspond to different angle threshold ranges;
determining the acquisition priority of each fusion acquisition device according to the speed level and the angle level: the higher the speed level, the higher the acquisition priority of the fusion acquisition devices using lidar; the higher the angle level, the higher the acquisition priority of the fusion acquisition devices using millimeter-wave radar.
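The grading-and-priority logic described above can be sketched as follows. The threshold values and device names are illustrative assumptions; the application specifies only that levels correspond to threshold ranges, not concrete numbers:

```python
def speed_level(speed_kmh: float) -> int:
    """Map vehicle speed to a discrete level (higher = faster).
    The km/h boundaries are assumed for illustration."""
    thresholds = [40, 80, 120]
    return sum(speed_kmh >= t for t in thresholds)

def angle_level(steering_deg: float) -> int:
    """Map absolute steering wheel angle to a discrete level.
    The degree boundaries are assumed for illustration."""
    thresholds = [15, 45, 90]
    return sum(abs(steering_deg) >= t for t in thresholds)

def acquisition_priorities(speed_kmh: float, steering_deg: float) -> dict:
    """Higher speed level raises the lidar device's priority; higher
    angle level raises the millimeter-wave device's priority."""
    s, a = speed_level(speed_kmh), angle_level(steering_deg)
    return {
        "lidar_camera": 1 + s,   # base priority plus speed level
        "mmwave_camera": 1 + a,  # base priority plus angle level
    }
```

At 130 km/h with a nearly straight wheel, the lidar device outranks the millimeter-wave device; at low speed in a sharp turn, the ranking reverses.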
In some embodiments, acquiring the vehicle's speed data and steering wheel angle data as the vehicle driving data and determining the acquisition priority corresponding to each fusion acquisition device from them comprises the following steps:
acquiring environmental data;
determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data;
and correcting the acquisition priority of each fusion acquisition device according to the environmental data to obtain the corrected acquisition priority.
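A minimal sketch of the environment-based correction step: the rule that poor weather demotes lidar in favour of millimeter-wave radar follows the sensor comparison given later in the description, but the adjustment amounts and weather labels are assumptions:

```python
def correct_priorities(priorities: dict, weather: str) -> dict:
    """Demote lidar devices and promote millimeter-wave devices in
    rain/fog/snow, where laser pulses propagate poorly."""
    corrected = dict(priorities)
    if weather in ("rain", "fog", "snow"):
        for name in corrected:
            if "lidar" in name:
                corrected[name] = max(1, corrected[name] - 1)
            elif "mmwave" in name:
                corrected[name] += 1
    return corrected
```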
In some embodiments, determining a target image from the images acquired by the various fusion acquisition devices comprises the following steps:
obtaining vehicle operation data, and determining a target fusion acquisition device according to the vehicle operation data;
and determining the target image from the images acquired by the target fusion acquisition device at multiple sampling moments within a preset time period, so that synthesizing the image occupies fewer computing resources than acquiring the target information.
In some embodiments, fusing the radar data and images from the fusion acquisition devices at different sampling moments to obtain target information for at least one target object within the preset range around the vehicle comprises the following steps:
fusing the radar data and image acquired by a fusion acquisition device at one sampling moment to obtain first information for at least one target object at that moment, the first information comprising distance, position, shape, size, and angle-range information;
and fusing the first information of the target objects across multiple sampling moments to obtain the target information for at least one target object within the preset range around the vehicle.
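The second, temporal stage of this two-stage fusion can be sketched as follows. Matching objects by a shared id and averaging their measurements are illustrative assumptions; the application does not prescribe a particular association or fusion rule:

```python
from collections import defaultdict
from statistics import mean

def fuse_over_time(samples):
    """samples: list of per-moment dicts mapping object_id to first
    information, e.g. {"car1": {"distance": 10.0, "position": (1.0, 2.0)}}.
    Returns per-object target information averaged across moments."""
    tracks = defaultdict(list)
    for sample in samples:                 # group first information by object
        for obj_id, info in sample.items():
            tracks[obj_id].append(info)
    target_info = {}
    for obj_id, infos in tracks.items():   # fuse each object's track
        target_info[obj_id] = {
            "distance": mean(i["distance"] for i in infos),
            "position": tuple(mean(axis) for axis in
                              zip(*(i["position"] for i in infos))),
        }
    return target_info
```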
In some embodiments, the fusion acquisition device comprises a radar sensor, a camera, a combined connector, and a wiring harness; the radar sensor and the camera are each connected to the combined connector and transmit radar data and images, respectively, to the processing platform through the wiring harness.
In some embodiments, a vehicle data processing apparatus is further provided, applied to a processing system comprising a processing platform and multiple kinds of fusion acquisition devices arranged on a vehicle, where each fusion acquisition device comprises an integrally packaged radar sensor and camera, and different kinds of fusion acquisition devices differ in radar sensor type and/or camera type. The apparatus comprises:
a determining module, configured to determine the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data and to control the corresponding fusion acquisition devices to acquire radar data and images within a preset range around the vehicle according to the acquisition priority, with devices of different priorities sampling at different frequencies;
a fusion module, configured to fuse the radar data and images from the fusion acquisition devices at different sampling moments to obtain target information for at least one target object within the preset range around the vehicle, the target information being provided to the intelligent driving system;
and a marking module, configured to determine a target image from the images acquired by the various fusion acquisition devices and to add a reminder mark to the target image for each target object whose target information meets the marking condition, to obtain a composite image.
In some embodiments, there is also provided an electronic device comprising: a processor, a memory and a bus, the memory storing machine readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operating, the machine readable instructions when executed by the processor performing the steps of the vehicle data processing method.
In some embodiments, a computer-readable storage medium is also provided, on which a computer program is stored which, when being executed by a processor, performs the steps of the vehicle data processing method.
By means of the vehicle driving data, the application determines the intelligent driving system's need for different kinds of sensor data, and hence the acquisition priority corresponding to each kind of fusion acquisition device; the different fusion acquisition devices are then controlled to collect information around the vehicle at different sampling frequencies. The sensor data most needed for the real-time control strategy thus becomes richer and more immediate, while relatively less data is collected from the other sensor types, so a more suitable control strategy is made from richer, more immediate sensor data and, at the same time, computing resources are used more effectively.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting its scope; those skilled in the art can derive other related drawings from them without inventive effort.
FIG. 1 illustrates a method flow diagram of a vehicle data processing method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a method for controlling a corresponding fusion acquisition device to acquire radar data and images within a preset range around a vehicle according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for determining an acquisition priority corresponding to each type of fusion acquisition device according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for obtaining target information of at least one target object within a preset range around a vehicle according to an embodiment of the present application;
FIG. 5 is a flow chart illustrating a method for determining a target image from images acquired by a plurality of fusion acquisition devices according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a vehicle data processing device according to an embodiment of the present application;
FIG. 7 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Intelligent driving is an important direction in today's automotive development, and driver-assistance systems such as lane keeping and brake assist have already achieved solid research results and practical application. When making decisions, an intelligent driving system determines its control strategy mainly by analyzing and processing large amounts of sensor data collected by various sensors such as infrared cameras, ordinary cameras, ultrasonic sensors, millimeter-wave radar, and lidar.
Different sensors have different strengths and weaknesses. In general, a lidar sensor acquires information by emitting pulses (waveforms); a high-resolution lidar sensor can accurately resolve single-point positions and detect long-range targets, but it is difficult for it to capture the whole scene around the vehicle at once. Moreover, the pulses emitted by lidar have poor penetration: in good conditions (clear weather) it can detect distant targets, but its detection performance degrades in bad conditions (rain, fog, ice, and snow). Millimeter-wave radar has high angular resolution and angle-measurement accuracy, and its propagation loss in cloud, smoke, and dust is much smaller than that of a laser signal, whose propagation loss is far more severe. Millimeter-wave radar is therefore superior to lidar in rain, fog, and similar conditions.
An infrared camera captures thermal images and can produce clear images of people even at night; an ordinary camera can capture images of the environment around the vehicle, but without lighting at night the captured images are blurry.
In actual driving, the sensor data required by intelligent driving or driver assistance also differs across driving scenarios.
For example, at high vehicle speed, a sensor with a long detection range must be selected and information must be acquired at a shorter detection period, so that the intelligent driving system can make accurate, immediate decisions. Conversely, at lower speed, a sensor with a shorter detection range suffices, and information can be acquired at a longer detection period.
Likewise, when the steering wheel angle is large, a sensor with a wide detection range is needed to acquire information around the vehicle at a relatively short period; when the angle is small, a sensor with a smaller detection range can acquire information near the vehicle at a relatively long period.
On this basis, the present application provides a vehicle data processing method applied to a processing system comprising a processing platform and multiple kinds of fusion acquisition devices arranged on a vehicle. Each fusion acquisition device comprises an integrally packaged radar sensor and camera, and different kinds of fusion acquisition devices differ in the type of radar sensor and/or the type of camera. As shown in FIG. 1, the method comprises:
S101: determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data, and controlling the corresponding fusion acquisition devices to acquire radar data and images within a preset range around the vehicle according to the acquisition priority, with devices of different priorities sampling at different frequencies;
S102: fusing the radar data and images from the fusion acquisition devices at different sampling moments to obtain target information for at least one target object within the preset range around the vehicle, so that the vehicle's intelligent driving system controls the vehicle according to the target information;
S103: determining a target image from the images acquired by the various fusion acquisition devices, and adding a reminder mark to the target image for each target object whose target information meets the marking condition, so that the marked target image can be displayed.
Here, from the vehicle driving data the present application determines the intelligent driving system's need for different kinds of sensor data, and hence the acquisition priority of each kind of fusion acquisition device; the different fusion acquisition devices are then controlled to collect information around the vehicle at different sampling frequencies. The sensor data most needed for the real-time control strategy thus becomes richer and more immediate, while relatively less data is collected from the other sensor types, so a more suitable control strategy is made from richer, more immediate sensor data and, at the same time, computing resources are used more effectively.
The sensor data required by the real-time control strategy is the data that most influences that strategy in the current driving scenario: the more of it there is, and the richer and more immediate it is, the more accurate the real-time control strategy becomes.
For example, at high speed, obstacles, pedestrians, sensitive areas, and the like far ahead of the vehicle must be detected in time. Taking a speed of 120 km/h as an example, the vehicle travels 33.3 m every second; if the sampling frequency during detection is low, the effective detection distance shortens, and the lag introduced by data transmission and processing delays the control strategy further. Because a high-resolution lidar sensor can accurately resolve single-point positions and detect long-range targets, the sensor data required by the real-time control strategy in this scenario is determined to be that collected by the lidar sensor.
When the vehicle's steering angle is large, targets to the side of and across the vehicle's path must be detected in time. Because a millimeter-wave radar sensor has high angular resolution and angle-measurement accuracy, the sensor data required by the real-time control strategy in this scenario is determined to be that collected by the millimeter-wave radar sensor.
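A worked check of the arithmetic above (120 km/h is 120 000 m per 3 600 s, i.e. about 33.3 m/s) also shows how the distance covered between consecutive samples shrinks as the sampling frequency rises; the function name is illustrative:

```python
def metres_per_sample(speed_kmh: float, sampling_hz: float) -> float:
    """Distance the vehicle travels between two consecutive samples."""
    metres_per_second = speed_kmh * 1000 / 3600  # 120 km/h -> ~33.3 m/s
    return metres_per_second / sampling_hz
```

At 120 km/h, a 1 Hz device only sees the road every 33.3 m travelled, while a 10 Hz device sees it every 3.3 m, which is why high-priority devices are sampled faster.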
It should be noted that the intelligent driving system described in the embodiments of the present application may also be called a driver-assistance system, depending on the specific application. It is a system that formulates vehicle control strategies, such as braking and deceleration, according to sensors arranged on the vehicle.
Processing systems currently on the market also comprise multiple sensors arranged on a vehicle, but these are independent sensors: each has its own mounting position and communicates with the processing platform through its own connector and wiring harness.
However, that arrangement carries high material, machining, installation, and maintenance costs. Each additional sensor requires a mounting hole and a corresponding wire channel in a structural member of the vehicle, making the structural member more complex and more expensive. With the continuing development of the electronics industry, sensor costs keep falling, and the machining cost of forming mounting holes and wire channels plus the installation cost may now exceed the cost of the sensor itself. In addition, a fixed connector is subjected to long-term vibration while the vehicle runs, so it easily loosens, falls off, or deforms, causing poor contact with the sensor. The more connectors there are, the higher the cost; to control cost, lower-quality connectors may be chosen, making sensor data transmission unstable over long-term use and degrading the decision accuracy of the intelligent driving system.
For this reason, in the present application the processing system comprises a processing platform and multiple kinds of fusion acquisition devices arranged on the vehicle; each fusion acquisition device comprises an integrally packaged radar sensor and camera, and different kinds of fusion acquisition devices differ in radar sensor type and/or camera type.
Specifically, the integrally packaged fusion acquisition device comprises a radar sensor, a camera, a combined connector, and a wiring harness; the radar sensor and the camera are each connected to the combined connector and transmit radar data and images, respectively, to the processing platform through the wiring harness.
The combined connector comprises a first plug-in area and a second plug-in area that are isolated from each other; the first plug-in area connects the radar sensor, and the second connects the camera. The wiring harness comprises a first signal wire, for data transmission between the radar sensor and the processing platform, and a second signal wire, for data transmission between the camera and the processing platform.
In this way, by integrally packaging the camera and the radar sensor and connecting these front-end sensors to the processing platform through a combined connector and a single wiring harness, fewer mounting holes and wire channels are needed in the vehicle's structural members, reducing the complexity and cost of the structural members as well as the installation cost of the sensors. With fewer connectors, better-quality connectors can be chosen at the original cost, improving the reliability of data transmission over long-term use.
Specifically, the radar sensor in a fusion acquisition device may be a lidar sensor or a millimeter-wave radar sensor, and the camera may be an infrared camera or an ordinary camera. That is, the fusion acquisition devices include the following combinations: lidar sensor + infrared camera, lidar sensor + ordinary camera, millimeter-wave radar sensor + infrared camera, and millimeter-wave radar sensor + ordinary camera.
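The four combinations listed above can be enumerated as data; the identifiers are illustrative, since the application fixes no naming:

```python
from itertools import product

RADARS = ("lidar", "mmwave")          # the two radar sensor types
CAMERAS = ("infrared", "standard")    # the two camera types

# Cartesian product yields exactly the four fusion device combinations.
FUSION_DEVICE_TYPES = [f"{radar}+{camera}"
                       for radar, camera in product(RADARS, CAMERAS)]
```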
Which fusion acquisition devices a given vehicle carries can be flexibly matched to the high-, mid-, or low-end positioning of the vehicle model and to customer preference.
In step S101, the acquisition priority corresponding to each fusion acquisition device is determined according to the vehicle driving data, and the corresponding fusion acquisition devices are controlled to acquire radar data and images within the preset range around the vehicle according to that priority; devices with different acquisition priorities sample at different frequencies.
The higher a fusion acquisition device's acquisition priority, the higher its sampling frequency, so more sensor data is collected and the information around the vehicle is reflected more promptly and accurately, allowing the intelligent driving system to formulate a more reasonable strategy.
In the vehicle data processing method of the embodiment of the present application, as shown in FIG. 2, determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data, and controlling the corresponding devices to acquire radar data and images within the preset range around the vehicle according to that priority, comprises the following steps:
S201: acquiring the vehicle's speed data and steering wheel angle data as the vehicle driving data, and determining the acquisition priority corresponding to each fusion acquisition device from them;
S202: determining the sampling frequency of each fusion acquisition device from its acquisition priority, and controlling the fusion acquisition devices to acquire radar data and images within the preset range around the vehicle at their respective sampling frequencies.
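Step S202 can be sketched as a mapping from priority to sampling frequency. The linear mapping and the 5 Hz base rate are assumptions for illustration; the application only requires that higher priority yields a higher frequency:

```python
def sampling_frequency_hz(priority: int, base_hz: float = 5.0) -> float:
    """Higher-priority devices sample proportionally faster (assumed
    linear rule; the patent fixes no particular formula)."""
    return base_hz * priority

def schedule(priorities: dict) -> dict:
    """Derive a per-device sampling frequency from its priority."""
    return {name: sampling_frequency_hz(p) for name, p in priorities.items()}
```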
Here, the higher the vehicle speed data is, the higher the acquisition priority of the fusion acquisition device adopting the laser radar is; the higher the acquisition priority of the steering wheel corner data is, the higher the acquisition priority of the fusion acquisition equipment adopting the millimeter wave radar is.
The vehicle speed data is high, and obstacles, pedestrians, sensitive areas and the like in a long distance in front of the vehicle need to be detected more timely, so that the intelligent driving system can make a decision according to the target information of the detected target object. Therefore, the acquisition priority of the fusion acquisition equipment adopting the laser radar is higher, and the sampling frequency of the fusion acquisition equipment adopting the laser radar is high, so that the detection time interval of the road condition in the front long distance is shortened, and the target information of a target object is found or updated in time; meanwhile, in the same time period, the detected information around the vehicle is richer, and the accuracy of the target information of the target object can be improved after the detected information is subjected to space-time fusion.
Similarly, when the steering angle of the vehicle is large, targets to the side of the vehicle need to be detected in time, so the fusion acquisition device using the millimeter-wave radar is given a higher acquisition priority. This shortens the detection interval for lateral road conditions and allows the target information of a target object to be found or updated in time; likewise, the richer data collected around the vehicle within the same period improves the accuracy of the target information after spatio-temporal fusion.
Based on this method, the road condition information most needed in the current driving scene is collected selectively and with emphasis for different driving scenes. Information around the vehicle is therefore acquired in a more targeted and timely manner, the computing power of the processing system is allocated reasonably, and the timeliness of the intelligent driving system's information processing and decision making is improved from both sides: information is collected sooner and processed sooner. With more data fused, the accuracy of the target information of the target object is further improved.
Here, determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data of the vehicle includes the following steps:
determining a speed grade corresponding to the vehicle speed data of the vehicle and determining an angle grade corresponding to the steering wheel angle data; different speed levels correspond to different vehicle speed threshold ranges, and different angle levels correspond to different angle threshold ranges;
determining the acquisition priority of each fusion acquisition device according to the speed level and the angle level; the higher the speed level, the higher the acquisition priority of the fusion acquisition device using the laser radar; the higher the angle level, the higher the acquisition priority of the fusion acquisition device using the millimeter-wave radar.
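The grading step can be sketched as follows; the 10 km/h and 10-degree level widths follow the example ranges given later in this application, while the function and key names are assumptions:

```python
import math

def grade(value: float, step: float) -> int:
    """Return the 1-based level of a value for fixed-width threshold ranges,
    so that 0-10 maps to level 1, 11-20 to level 2, and so on."""
    return max(1, math.ceil(value / step))

def acquisition_priorities(speed_kmh: float, steering_deg: float) -> dict:
    """Derive per-device acquisition priorities from speed and steering angle."""
    speed_level = grade(speed_kmh, 10.0)           # higher speed level ...
    angle_level = grade(abs(steering_deg), 10.0)   # higher angle level ...
    return {
        "lidar_fusion_device": speed_level,    # ... raises lidar priority
        "mmwave_fusion_device": angle_level,   # ... raises mm-wave priority
    }
```

For example, fast highway driving with a nearly straight wheel yields a high lidar priority and a low millimeter-wave priority, matching the behavior described above.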
That is, in step S101, the acquisition priority corresponding to each fusion acquisition device is determined according to the vehicle driving data, the sampling frequency of each device is determined according to its acquisition priority, and the fusion acquisition devices are controlled to acquire radar data and images within a preset range around the vehicle at the corresponding sampling frequencies.
Illustratively, a Kalman filtering method is adopted to determine the sampling frequency of each fusion acquisition device according to the vehicle speed data and the steering wheel angle data of the vehicle.
The specific steps are as follows: first, a weight matrix W of the Kalman filtering method is established, in which each row represents one weight attribute and each attribute is parameterized per level.
The weight parameters in the weight matrix are as follows:
weight parameters v1, v2, ..., vm correspond to the vehicle speed: v1 is the weight parameter for the first speed level (0-10 km/h), v2 is the weight parameter for the second speed level (11-20 km/h), and so on, up to vm for the m-th speed level;
weight parameters a1, a2, ..., am correspond to the steering wheel angle: a1 is the weight parameter for the first angle level (0-10 degrees), a2 is the weight parameter for the second angle level (11-20 degrees), and so on, up to am for the m-th angle level;
weight parameters f1, f2, ..., fm correspond to the sampling frequency: f1 is the weight parameter for the first frequency level (100 Hz), f2 is the weight parameter for the second frequency level (200 Hz), and so on, up to fm for the m-th frequency level;
weight parameters o1, o2, ..., om correspond to other parameters such as longitudinal distance and lateral distance: o1 is the weight parameter for the first level of the other parameters, o2 for the second level, and so on, up to om for the m-th level.
The number of weight parameters can be expanded, and the matrix content extended, according to the complexity of the algorithm, and the parameters are adjusted during development.
The weight matrix W of the Kalman filtering method is then:

    W = | v1  v2  ...  vm |
        | a1  a2  ...  am |
        | f1  f2  ...  fm |
        | o1  o2  ...  om |
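The application gives no concrete weight values, so the sketch below only illustrates the shape of such a weight matrix with the four attribute rows described above and m = 4 placeholder levels; all numeric values and the lookup function are assumptions:

```python
# Placeholder weight matrix: rows are speed, steering angle, sampling
# frequency, and other parameters; columns are the m = 4 levels.
# The values themselves are invented for illustration.
W = [
    [0.1, 0.2, 0.3, 0.4],   # v1..vm : speed-level weights
    [0.1, 0.3, 0.5, 0.7],   # a1..am : steering-angle-level weights
    [0.2, 0.4, 0.6, 0.8],   # f1..fm : sampling-frequency-level weights
    [0.1, 0.1, 0.2, 0.2],   # o1..om : weights for other parameters
]

def combined_weight(speed_level: int, angle_level: int) -> float:
    """Combine the weights of the current speed and angle levels (1-based)."""
    return W[0][speed_level - 1] + W[1][angle_level - 1]
```

A fuller implementation would feed these weights into the Kalman filter's update step rather than the direct lookup shown here; this is only the simplest possible use of the matrix.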
In the vehicle data processing method, as shown in fig. 3, vehicle speed data and steering wheel angle data of the vehicle are acquired as vehicle driving data, and the acquisition priority corresponding to each type of fusion acquisition device is determined according to these data; this includes the following steps:
S301, acquiring environmental data;
S302, determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and steering wheel angle data of the vehicle (the vehicle driving data);
S303, correcting the acquisition priority of each fusion acquisition device according to the environmental data to obtain the corrected acquisition priority.
That is to say, the sampling frequency of a fusion acquisition device is also related to the environment. For example, in rainy or snowy weather the accuracy of the data acquired by the fusion acquisition device using the laser radar decreases, so its acquisition priority is lowered and its sampling frequency reduced; meanwhile, the acquisition priority of the fusion acquisition device using the millimeter-wave radar is raised and its sampling frequency increased.
Using the environmental data as reference data for the vehicle's usage scene makes the acquired sensor data better match actual needs.
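A minimal sketch of the correction step, assuming a simple plus/minus-one priority adjustment in rain or snow; the adjustment size and the device keys are assumptions, not from this application:

```python
# In rain or snow, lidar accuracy degrades, so its priority is lowered and
# the millimeter-wave device's priority is raised. The +/-1 step is assumed.
def correct_priorities(priorities: dict, weather: str) -> dict:
    corrected = dict(priorities)
    if weather in ("rain", "snow"):
        corrected["lidar_fusion_device"] = max(1, corrected["lidar_fusion_device"] - 1)
        corrected["mmwave_fusion_device"] += 1
    return corrected
```

In clear weather the priorities pass through unchanged, matching steps S302-S303 above.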
In step S102, the radar data and images of the fusion acquisition devices at different sampling times are fused to obtain target information of at least one target object within a preset range around the vehicle; the target information of the target object is used by the intelligent driving system. Specifically, the target information of a target object includes: the position of the target object, the target distance between the target object and the vehicle, and the shape contour of the target object.
The target object includes obstacles and target areas. Illustratively, an obstacle may be a nearby vehicle, a pedestrian, a roadside railing, a signal light, and the like. A target area can be obtained by processing images shot by the camera: for example, a sign image is extracted from the images, the target area is identified, and the direction, target distance and other information of the sign are determined in combination with the radar data.
In some embodiments, the target information of the target object may be represented in a coordinate system, and the shape contour of the target object may be coordinates of a point on the contour of the target object.
The intelligent driving system makes decisions according to the target information of the target object; for example, when the target information indicates that a person is in front of the vehicle, the vehicle speed can be reduced or braking can be applied.
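As a hedged illustration of the target information described above, the fields and a decision rule of this kind might be represented as follows; the field names and the 5 m braking threshold are assumptions, not values from this application:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TargetInfo:
    """Target information: type, position, distance, and a shape contour
    expressed as coordinates of points on the target object's contour."""
    kind: str                                   # e.g. "pedestrian", "vehicle"
    position: Tuple[float, float]               # (x, y) in the vehicle frame
    distance_m: float                           # target distance from the vehicle
    contour: List[Tuple[float, float]] = field(default_factory=list)

def needs_braking(target: TargetInfo, threshold_m: float = 5.0) -> bool:
    """Example decision: brake if a person is closer than the threshold."""
    return target.kind == "pedestrian" and target.distance_m < threshold_m
```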
In the embodiment of the application, as shown in fig. 4, in the vehicle data processing method, radar data and images of fusion acquisition equipment at different sampling moments are subjected to fusion processing, and target information of at least one target object in a preset range around a vehicle is obtained; the method comprises the following steps:
S401, fusing the radar data and images acquired by a fusion acquisition device at one sampling moment to obtain first information of at least one target object at that sampling moment; the first information includes distance information, position information, shape information, size information and angle range information;
S402, fusing the first information of the target objects at a plurality of sampling moments to obtain the target information of at least one target object within a preset range around the vehicle.
By fusing the radar data and images of the fusion acquisition devices at different sampling moments, the obtained target information not only reflects the state of a target object but also enables behavior prediction and motion detection of the target object.
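A minimal sketch of the second fusion stage (S402), assuming each per-moment "first information" record carries a timestamp and a position; fusing records across moments yields a velocity estimate, which is what enables the behavior prediction and motion detection mentioned above:

```python
# Fuse per-timestamp records of one target object across sampling moments.
# Record format (t_seconds, (x, y)) is an assumption for illustration.
def fuse_over_time(first_info: list) -> dict:
    (t0, (x0, y0)), (t1, (x1, y1)) = first_info[0], first_info[-1]
    dt = t1 - t0
    return {
        "position": (x1, y1),  # latest fused position
        "velocity": ((x1 - x0) / dt, (y1 - y0) / dt) if dt > 0 else (0.0, 0.0),
    }
```

A real tracker would filter every intermediate sample (for example with the Kalman filter mentioned earlier) rather than differencing only the first and last records.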
In step S103, a target image is determined from the images acquired by the multiple fusion acquisition devices, and a reminder mark is added to the target image according to the target objects whose target information meets the marking condition, to obtain a composite image for the driver to browse. Here, it is determined which fusion acquisition device's image serves as the target image: one image is selected as the target image from the images acquired by the multiple devices, rather than every image being a target image.
The target object whose target information satisfies the marking condition may be a target object whose type matches a predetermined type, a target object whose distance from the vehicle satisfies a predetermined distance condition, or a target object satisfying both the predetermined type and the predetermined distance condition.
For example, a pedestrian or a vehicle is taken as a high-priority target object, and when the processing platform identifies the pedestrian according to the radar data and the image and the pedestrian is within a preset distance from the vehicle, the pedestrian is determined as the target object whose target information meets the marking condition.
A reminder mark is added to the target image according to the target objects whose target information meets the marking condition to obtain a composite image, which is sent to the vehicle-mounted system in the cabin and displayed on its screen for the driver to browse.
Different types of target objects correspond to different reminder marks added in the target image. For example, when there is a person around the vehicle, a human-shaped sign is displayed; when there is a vehicle around the vehicle, a car-shaped sign is displayed.
Normally, only the image shot by one camera is displayed on the screen of the vehicle-mounted system, such as the image directly in front of or directly behind the vehicle. In some embodiments, the target objects corresponding to the reminder marks are therefore not only those present in the target image but also those satisfying the marking condition in every direction near the vehicle, which increases the amount of information conveyed by the reminder.
In some embodiments, the reminder mark further includes target information of the corresponding target object, for example the target distance and the driving direction; the driving direction may be represented by a directional arrow.
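The mark selection could be sketched as follows, assuming a simple type-to-icon table; the icon names and dictionary layout are illustrative, not from this application:

```python
# Map target-object types to reminder icons: human-shaped sign for a person,
# car-shaped sign for a vehicle, each annotated with the target distance.
MARK_BY_TYPE = {"pedestrian": "human_icon", "vehicle": "car_icon"}

def reminder_marks(targets: list) -> list:
    """targets: list of dicts with 'kind' and 'distance_m'; returns mark specs
    for the compositor to draw onto the target image."""
    marks = []
    for t in targets:
        icon = MARK_BY_TYPE.get(t["kind"])
        if icon is not None:
            marks.append({"icon": icon, "label": f"{t['distance_m']:.0f} m"})
    return marks
```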
In the embodiment of the present application, as shown in fig. 5, determining a target image from images acquired by multiple fusion acquisition devices includes the following steps:
S501, obtaining vehicle operation data, and determining the target fusion acquisition device according to the vehicle operation data;
S502, determining target images from the images acquired by the target fusion acquisition device at a plurality of sampling moments within a preset time period, so that the computing resources occupied by synthesizing the images are less than the computing resources occupied by obtaining the target information.
The vehicle operation data covers, for example, a reversing operation, a steering operation, a parking operation, and the like. Under different vehicle operations, different target images need to be displayed on the screen of the vehicle-mounted system: during normal driving, an image of the area ahead of the vehicle; when reversing, an image of the area far behind the vehicle; when parking, an image of the ground behind the vehicle; and so on.
Therefore, different target fusion acquisition devices are determined based on different vehicle operation data.
And determining a target image from the images acquired by the target fusion acquisition equipment at a plurality of sampling moments in a preset time period of the equipment, so that the calculation resource occupied by synthesizing the images is less than that occupied by acquiring the target information.
Here, not every image acquired by the target fusion acquisition device is synthesized and displayed; instead, target images are selected according to a certain rule. Lowering the priority and frame rate of the composite images browsed by the user reduces the load on the processing platform, so that more computing power is assigned to high-priority tasks and the processing in step S102 is prioritized. The intelligent driving system can thus obtain and process the target information of target objects in time, and the driving assistance and driving safety functions are guaranteed.
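A hedged sketch of steps S501-S502: the operation-to-device table and the keep-every-third-frame decimation below are assumptions chosen only to illustrate how compositing can be kept cheaper than the fusion pipeline:

```python
# Hypothetical mapping from vehicle operation to the device whose images are
# displayed; names are illustrative, not from this application.
DEVICE_BY_OPERATION = {
    "driving": "front_camera_device",
    "reversing": "rear_camera_device",
    "parking": "rear_ground_camera_device",
}

def select_target_images(operation: str, frames: list, keep_every: int = 3) -> tuple:
    """Pick the target fusion device for the current operation and subsample
    its frames so display compositing stays cheaper than the fusion pipeline."""
    device = DEVICE_BY_OPERATION.get(operation, "front_camera_device")
    return device, frames[::keep_every]
```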
The embodiment of the application also provides a vehicle data processing device which is applied to a processing system, wherein the processing system comprises a processing platform and a plurality of fusion acquisition devices arranged on a vehicle, each fusion acquisition device comprises an integrally packaged radar sensor and a camera, and the types of the radar sensors or the types of the cameras in different types of fusion acquisition devices are different; as shown in fig. 6, the apparatus includes:
the determining module 601 is configured to determine an acquisition priority corresponding to each type of fusion acquisition device according to vehicle driving data, and control the corresponding fusion acquisition device to acquire radar data and images within a preset range around a vehicle according to the acquisition priority; the sampling frequencies of the fusion acquisition equipment are different;
the fusion module 602 is configured to perform fusion processing on the radar data and the images of the fusion acquisition device at different sampling times to obtain target information of at least one target object in a preset range around the vehicle; the target information of the target object is used for an intelligent driving system;
the marking module 603 is configured to determine a target image from images acquired by multiple fusion acquisition devices, and add a reminding mark to the target image according to a target object whose target information meets a marking condition, so as to obtain a composite image.
In some embodiments, the determining module in the vehicle data processing apparatus, when determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data and controlling the corresponding fusion acquisition device to acquire radar data and images within a preset range around the vehicle according to the acquisition priority, is specifically configured to: acquire vehicle speed data and steering wheel angle data of the vehicle as vehicle driving data, and determine the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data;
and determining the sampling frequency of each fusion acquisition device according to the acquisition priority corresponding to each fusion acquisition device, and controlling the fusion acquisition devices to acquire radar data and images in a preset range around the vehicle according to the corresponding sampling frequency.
In some embodiments, the determining module in the vehicle data processing apparatus, when determining the acquisition priority corresponding to each type of fusion acquisition device according to the vehicle speed data and the steering wheel angle data of the vehicle, is specifically configured to: determining a speed grade corresponding to the vehicle speed data of the vehicle and determining an angle grade corresponding to the steering wheel angle data; different speed levels correspond to different vehicle speed threshold ranges, and different angle levels correspond to different angle threshold ranges;
determining the priority of each fusion acquisition device according to the speed level and the angle level; the higher the speed level is, the higher the acquisition priority of the fusion acquisition equipment adopting the laser radar is; the higher the angle level is, the higher the acquisition priority of the fusion acquisition device adopting the millimeter wave radar is.
In some embodiments, when the determining module in the vehicle data processing apparatus acquires vehicle speed data and steering wheel angle data of the vehicle as vehicle driving data and determines the acquisition priority corresponding to each fusion acquisition device according to these data, it is specifically configured to:
acquire environmental data;
determine the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and steering wheel angle data of the vehicle (the vehicle driving data);
and correcting the acquisition priority of each fusion acquisition device according to the environmental data to obtain the corrected acquisition priority.
In some embodiments, the marking module in the vehicle data processing apparatus, when determining the target image from the images acquired by the plurality of fusion acquisition devices, is specifically configured to:
obtaining vehicle operation data, and determining target fusion acquisition equipment according to the vehicle operation data;
and determining a target image from the images acquired by the target fusion acquisition equipment at a plurality of sampling moments in a preset time period of the equipment, so that the calculation resource occupied by synthesizing the images is less than that occupied by acquiring the target information.
In some embodiments, a fusion module in the vehicle data processing apparatus is configured to perform fusion processing on the radar data and the images at different sampling times of the fusion acquisition device to obtain target information of at least one target object in a preset range around the vehicle; the method comprises the following steps:
fusing radar data and images acquired by fusion acquisition equipment at a sampling moment to acquire first information of at least one target object at the sampling moment; the first information comprises distance information, position information, shape information, size information and angle range information;
and fusing the first information of the target objects at the plurality of sampling moments to obtain the target information of at least one target object in the preset range around the vehicle.
In some embodiments, the fusion acquisition equipment in the processing system applying the device comprises a radar sensor, a camera, a combined plug connector and a wire harness, wherein the radar sensor and the camera are respectively connected with the combined plug connector and respectively transmit radar data and images to the processing platform through the wire harness.
As shown in fig. 7, an embodiment of the present application further provides an electronic device 700, including: a processor 702, a memory 701 and a bus. The memory 701 stores machine-readable instructions executable by the processor 702; when the electronic device operates, the processor 702 communicates with the memory 701 via the bus, and the machine-readable instructions, when executed by the processor 702, perform the steps of the vehicle data processing method.
An embodiment of the present application further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the vehicle data processing method.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to corresponding processes in the method embodiments, and are not described in detail in this application. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some communication interfaces, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a platform server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A vehicle data processing method is characterized by being applied to a processing system, wherein the processing system comprises a processing platform and a plurality of fusion acquisition devices arranged on a vehicle, the fusion acquisition devices comprise integrally packaged radar sensors and cameras, and the types of the radar sensors and/or the types of the cameras in different types of fusion acquisition devices are different; the method comprises the following steps:
determining the acquisition priority corresponding to each fusion acquisition equipment according to the vehicle running data, and controlling the corresponding fusion acquisition equipment to acquire radar data and images in a preset range around the vehicle according to the acquisition priority; the sampling frequencies of the fusion acquisition equipment are different;
fusing the radar data and the images of the fusion acquisition equipment at different sampling moments to acquire target information of at least one target object in a preset range around the vehicle; the target information of the target object is used for an intelligent driving system;
determining a target image from images acquired by various fusion acquisition devices, and adding a reminding mark in the target image according to a target object of which target information meets a marking condition to obtain a composite image.
2. The vehicle data processing method according to claim 1, wherein the step of determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle driving data and controlling the corresponding fusion acquisition device to acquire the radar data and the image within the preset range around the vehicle according to the acquisition priority comprises the following steps:
acquiring vehicle speed data and steering wheel angle data of a vehicle as vehicle driving data, and determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data of the vehicle;
and determining the sampling frequency of each fusion acquisition device according to the acquisition priority corresponding to each fusion acquisition device, and controlling the fusion acquisition devices to acquire radar data and images in a preset range around the vehicle according to the corresponding sampling frequency.
3. The vehicle data processing method according to claim 2, wherein the step of determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and the steering wheel angle data of the vehicle comprises the following steps:
determining a speed grade corresponding to the vehicle speed data of the vehicle and determining an angle grade corresponding to the steering wheel angle data; different speed levels correspond to different vehicle speed threshold ranges, and different angle levels correspond to different angle threshold ranges;
determining the priority of each fusion acquisition device according to the speed level and the angle level; the higher the speed level is, the higher the acquisition priority of the fusion acquisition equipment adopting the laser radar is; the higher the angle level is, the higher the acquisition priority of the fusion acquisition device adopting the millimeter wave radar is.
4. The vehicle data processing method according to claim 2, characterized by acquiring vehicle speed data and steering wheel angle data of a vehicle as vehicle driving data, and determining acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and steering wheel angle data of the vehicle; the method comprises the following steps:
acquiring environmental data;
determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle speed data and steering wheel angle data of the vehicle (the vehicle driving data);
and correcting the acquisition priority of each fusion acquisition device according to the environmental data to obtain the corrected acquisition priority.
5. The vehicle data processing method according to claim 1, wherein determining the target image from the images acquired by the plurality of fusion acquisition devices comprises the steps of:
obtaining vehicle operation data, and determining target fusion acquisition equipment according to the vehicle operation data;
and determining a target image from the images acquired by the target fusion acquisition equipment at a plurality of sampling moments in a preset time period of the equipment, so that the calculation resource occupied by synthesizing the images is less than that occupied by acquiring the target information.
6. The vehicle data processing method according to claim 1, characterized in that the radar data and the images of the fusion acquisition device at different sampling moments are subjected to fusion processing to obtain target information of at least one target object in a preset range around the vehicle; the method comprises the following steps:
fusing radar data and images acquired by fusion acquisition equipment at a sampling moment to acquire first information of at least one target object at the sampling moment; the first information comprises distance information, position information, shape information, size information and angle range information;
and fusing the first information of the target objects at the plurality of sampling moments to obtain the target information of at least one target object in the preset range around the vehicle.
7. The vehicle data processing method according to claim 1, wherein the fusion acquisition device comprises a radar sensor, a camera, a combined plug connector and a wire harness, wherein the radar sensor and the camera are respectively connected with the combined plug connector and send radar data and images to a processing platform through the wire harness.
8. A vehicle data processing device is characterized by being applied to a processing system, wherein the processing system comprises a processing platform and a plurality of fusion acquisition devices arranged on a vehicle, the fusion acquisition devices comprise integrally packaged radar sensors and cameras, and the types of the radar sensors or the types of the cameras in different types of fusion acquisition devices are different; the device comprises:
the determining module is used for determining the acquisition priority corresponding to each fusion acquisition device according to the vehicle running data, and controlling the corresponding fusion acquisition device to acquire radar data and images in a preset range around the vehicle according to the acquisition priority; the sampling frequencies of the fusion acquisition equipment are different;
the fusion module is used for fusing the radar data and the images of the fusion acquisition equipment at different sampling moments to acquire target information of at least one target object in a preset range around the vehicle; the target information of the target object is used for an intelligent driving system;
and the marking module is used for determining a target image from the images acquired by the various fusion acquisition devices, and adding a reminding mark in the target image according to a target object of which the target information meets the marking condition to obtain a composite image.
9. An electronic device, comprising a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device is operating, the processor communicates with the memory through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the vehicle data processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, and the computer program, when executed by a processor, implements the steps of the vehicle data processing method according to any one of claims 1 to 7.
CN202210401134.9A 2022-04-18 2022-04-18 Vehicle data processing method and device, electronic equipment and medium Active CN114492679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210401134.9A CN114492679B (en) 2022-04-18 2022-04-18 Vehicle data processing method and device, electronic equipment and medium


Publications (2)

Publication Number Publication Date
CN114492679A true CN114492679A (en) 2022-05-13
CN114492679B CN114492679B (en) 2022-08-02

Family

ID=81489464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210401134.9A Active CN114492679B (en) 2022-04-18 2022-04-18 Vehicle data processing method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN114492679B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114915646A (en) * 2022-06-16 2022-08-16 上海伯镭智能科技有限公司 Data grading uploading method and device for unmanned mine car
CN115346362A (en) * 2022-06-10 2022-11-15 斑马网络技术有限公司 Driving data processing method and device, electronic equipment and storage medium
WO2024093093A1 (en) * 2022-10-31 2024-05-10 河北网新科技集团股份有限公司 Decision-making method and system based on multi-modal information perception in external environment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110025548A1 (en) * 2009-07-31 2011-02-03 Gm Global Technology Operations, Inc. System and method for vehicle sensor fusion
CN109017802A (en) * 2018-06-05 2018-12-18 长沙智能驾驶研究院有限公司 Intelligent driving environment perception method, device, computer equipment and storage medium
CN110850413A (en) * 2019-11-26 2020-02-28 奇瑞汽车股份有限公司 Method and system for detecting front obstacle of automobile
CN112519801A (en) * 2020-12-23 2021-03-19 临工集团济南重机有限公司 Safety protection system and method for running vehicle
CN113920089A (en) * 2021-10-11 2022-01-11 广东汇天航空航天科技有限公司 Target detection method and device and electronic controller



Also Published As

Publication number Publication date
CN114492679B (en) 2022-08-02

Similar Documents

Publication Publication Date Title
EP3854646B1 (en) Systems and methods for navigating with safe distances
CN114492679B (en) Vehicle data processing method and device, electronic equipment and medium
EP3544856B1 (en) Determining a road surface characteristic
US9507345B2 (en) Vehicle control system and method
CN111201787B (en) Imaging apparatus, image processing apparatus, and image processing method
US9747800B2 (en) Vehicle recognition notification apparatus and vehicle recognition notification system
US11100806B2 (en) Multi-spectral system for providing precollision alerts
US20190049992A1 (en) System and Method for Improved Obstable Awareness in Using a V2x Communications System
CN112466141A (en) Vehicle-road-collaboration-oriented intelligent network connection end equipment interaction method, system and storage medium
US10369995B2 (en) Information processing device, information processing method, control device for vehicle, and control method for vehicle
US20070016372A1 (en) Remote Perspective Vehicle Environment Observation System
CN112313537A (en) Information processing apparatus and information processing method, imaging apparatus, computer program, information processing system, and mobile body apparatus
DE112018004507T5 (en) INFORMATION PROCESSING DEVICE, MOTION DEVICE AND METHOD AND PROGRAM
US20240142607A1 (en) Information processing device, information processing method, computer program, and mobile device
CN111736153A (en) Environment detection system, method, apparatus, and medium for unmanned vehicle
CN109927629B (en) Display control apparatus, display control method, and vehicle for controlling projection apparatus
CN110962744A (en) Vehicle blind area detection method and vehicle blind area detection system
CN113696907A (en) L3-level automatic driving system
CN114228491A (en) Head-up display system and method with night vision enhanced virtual reality
CN112534297A (en) Information processing apparatus, information processing method, computer program, information processing system, and mobile apparatus
CN111169381A (en) Vehicle image display method and device, vehicle and storage medium
US11845429B2 (en) Localizing and updating a map using interpolated lane edge data
CN113771845B (en) Method and device for predicting vehicle track, vehicle and storage medium
CN109470491A (en) Blind monitoring road test evaluation system
US20210179115A1 (en) Method and apparatus for monitoring a yaw sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant