CN112684450B - Sensor deployment method and device, electronic equipment and storage medium - Google Patents

Sensor deployment method and device, electronic equipment and storage medium

Info

Publication number
CN112684450B
Authority
CN
China
Prior art keywords
sensor
sensing
performance
candidate deployment
combination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011506638.4A
Other languages
Chinese (zh)
Other versions
CN112684450A (en)
Inventor
马涛
刘知正
李怡康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Original Assignee
Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sensetime Lingang Intelligent Technology Co Ltd filed Critical Shanghai Sensetime Lingang Intelligent Technology Co Ltd
Priority to CN202011506638.4A
Publication of CN112684450A
Application granted
Publication of CN112684450B
Legal status: Active
Anticipated expiration


Abstract

The disclosure provides a sensor deployment method and device, an electronic device, and a storage medium. The method comprises the following steps: determining, according to the body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor, at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations for each sensor combination; determining, according to the perception performance of each sensor combination under each candidate deployment parameter combination, the candidate deployment parameter combination of the sensor combination with the optimal perception performance; and deploying the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination. The method improves perception performance while saving sensor resources, so that the collected sensor data can better serve subsequent applications.

Description

Sensor deployment method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of automatic driving, and in particular to a sensor deployment method and device, an electronic device, and a storage medium.
Background
In the field of automatic driving, a large amount of vehicle data usually needs to be collected for subsequent applications, for example, for training and tuning models, or for map creation. The vehicle data may be data collected by sensors (e.g., radar detectors, image sensors) mounted on the autonomous vehicle.
To better support these subsequent applications, the relevant sensors usually need to be deployed on the autonomous vehicle before data acquisition takes place. In the related art, the type and installation position of the sensors to be installed are determined by the user based on experience. As a result, either too many sensors are deployed, wasting resources, or too few sensors are deployed or mounted at unreasonable positions, so that the sensing requirements of the autonomous vehicle cannot be met.
Disclosure of Invention
Embodiments of the present disclosure provide at least a sensor deployment method and device, an electronic device, and a storage medium, which can improve perception performance while saving sensor resources and ensure that the acquired sensor data better serve subsequent applications.
In a first aspect, embodiments of the present disclosure provide a method of sensor deployment, the method comprising:
determining at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations for each sensor combination according to the body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor;
determining, according to the perception performance of each sensor combination under each candidate deployment parameter combination, the candidate deployment parameter combination of the sensor combination with the optimal perception performance;
and deploying the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
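The three steps above amount to a search over (sensor combination, deployment-parameter combination) pairs scored by a perception-performance evaluation. A minimal sketch of that search loop follows; all function and argument names are illustrative, not taken from the disclosure:

```python
def choose_best_deployment(sensor_combinations, candidate_params, evaluate):
    """Return (best score, sensor combination, parameter combination).

    sensor_combinations: iterable of sensor-combination identifiers
    candidate_params:    dict mapping each combination to its list of
                         candidate deployment-parameter combinations
    evaluate:            callable scoring one (combination, params) pair
    """
    best = None
    for combo in sensor_combinations:
        for params in candidate_params[combo]:
            score = evaluate(combo, params)
            if best is None or score > best[0]:
                best = (score, combo, params)
    return best
```

The selected parameter combination is then used to physically mount the sensors on the driving device.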
With this sensor deployment method, at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations for each sensor combination can be determined according to the body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor. The candidate deployment parameter combination with the optimal perception performance can then be determined according to the perception performance of each sensor combination under each candidate deployment parameter combination; here, a larger perception performance value of a sensor combination under a selected candidate deployment parameter combination indicates better perception. The method can therefore determine, to a certain extent, the candidate deployment parameter combination with the optimal perception performance and deploy the sensors accordingly, so that the deployed sensors achieve good perception performance with a small number of sensors, that is, better perception at lower sensor cost.
In one possible embodiment, determining the perceived performance of each sensor combination at each candidate deployment parameter combination comprises:
for each candidate deployment parameter combination of each sensor combination, determining the perceived performance of the sensor combination under the candidate deployment parameter combination based on the perceived performance of each sensor included in the sensor combination under the respective candidate deployment parameters.
Here, the perception performance of a sensor combination under a candidate deployment parameter combination can be decomposed into the perception performance of each individual sensor under its own candidate deployment parameters. Once the perception performance of each sensor included in the combination is determined, the perception performance of the whole combination under the candidate deployment parameter combination can be obtained by combining these individual performances.
In one possible implementation, determining the perceived performance of each sensor included in a sensor combination under the respective candidate deployment parameters includes:
dividing the range that the driving device needs to perceive into a plurality of sensing areas according to a preset division size; the plurality of sensing areas includes at least one close-range sensing area and at least one far-range sensing area;
for each sensor included in the sensor combination, performing information sensing on the at least one close-range sensing area and on the at least one far-range sensing area based on each candidate deployment parameter of the sensor, and determining each first sensing performance of the sensor for the at least one close-range sensing area and each second sensing performance of the sensor for the at least one far-range sensing area;
obtaining, based on the first sensing performances and the second sensing performances, the sensing performance of each sensor included in the sensor combination under the candidate deployment parameters.
Considering the influence of distance on a sensor's perception: in general, the farther the target, the worse the sensing performance. To evaluate sensing performance under different conditions, the range that the driving device needs to perceive may be divided into a plurality of sensing areas according to a preset division size; then, for each sensor included in a sensor combination, the first sensing performance for the close-range sensing areas and the second sensing performance for the far-range sensing areas are determined, and the sensing performance of the sensor under each candidate deployment parameter is obtained from both. In other words, by evaluating close-range and far-range sensing separately, the overall sensing performance of the sensor at different distances is obtained, which allows a better evaluation of the sensor's perception.
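The division into sensing areas can be pictured as a regular grid centred on the driving device. A minimal sketch under that assumption (the rectangular extent and cell size are illustrative choices, not values from the disclosure):

```python
def divide_sensing_range(x_half, y_half, cell):
    """Divide a rectangular sensing range centred on the driving device
    (placed at the origin) into square cells of side `cell`.
    Each sensing area is returned as its lower-left corner (x, y)."""
    xs = int(2 * x_half / cell)
    ys = int(2 * y_half / cell)
    return [(-x_half + i * cell, -y_half + j * cell)
            for i in range(xs) for j in range(ys)]
```

Each returned cell is one candidate sensing area, later classified as close-range or far-range by its distance from the device.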
In one possible implementation, the information sensing of the at least one close range sensing area based on each candidate deployment parameter of a sensor, and determining each first sensing performance of the sensor for the at least one close range sensing area includes:
determining a synthesized region corresponding to the at least one close-range sensing area; wherein the synthesized region is formed by merging the at least one close-range sensing area;
and performing information sensing on the synthesized region based on each candidate deployment parameter of the sensor, to obtain each first sensing performance of the sensor for the synthesized region.
Here, considering a sensor's strong perception capability for close-range targets, a target can often be identified even if sensor data covering only part of it is acquired; that is, the sensor's close-range perceived performance does not depend strongly on specific target objects. Information sensing can therefore be performed on the synthesized region based only on the candidate deployment parameters of each sensor, without placing target objects. This simplifies the operation and makes the determination of the first perceived performance better suited to the requirements of close-range sensing.
In one possible implementation, in a case where the sensor is a radar device, performing information sensing on the synthesized region based on each candidate deployment parameter of one sensor to obtain each first sensing performance of the sensor for the synthesized region includes:
determining, for a candidate deployment parameter of the radar device, a first number of close-range sensing areas in the synthesized region into which the radio beams fall when the radar device transmits radio beams to the synthesized region after being deployed according to the candidate deployment parameter;
and determining, according to the first number, a first perception performance of the radar device for the synthesized region under the candidate deployment parameter.
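Counting the close-range cells that the radar's beams fall into can be sketched as follows; the beam endpoints are assumed to be pre-simulated ground-plane points, and all names are illustrative:

```python
def radar_close_performance(beam_points, close_cells, cell):
    """First perceived performance of a radar under one candidate
    deployment parameter: the number of close-range cells that at
    least one simulated beam point falls into.

    beam_points: (x, y) ground intersections of the transmitted beams
    close_cells: lower-left corners of the close-range sensing cells
    cell:        side length of each square cell
    """
    hit = set()
    for bx, by in beam_points:
        for cx, cy in close_cells:
            if cx <= bx < cx + cell and cy <= by < cy + cell:
                hit.add((cx, cy))
                break
    return len(hit)
```

A larger count means the candidate mounting pose covers the synthesized close-range region more completely.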
In one possible implementation, in a case where the sensor is an image acquisition device, performing information sensing on the synthesized region based on each candidate deployment parameter of one sensor to obtain each first sensing performance of the sensor for the synthesized region includes:
determining, for a candidate deployment parameter of the image acquisition device, a second number of close-range sensing areas in the synthesized region contained in the acquired image when the image acquisition device acquires an image of the synthesized region after being deployed according to the candidate deployment parameter;
and determining, according to the second number, a first perception performance of the image acquisition device for the synthesized region under the candidate deployment parameter.
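For a camera, "contained in the acquired image" can be approximated by a horizontal field-of-view test on each cell centre. This is a simplifying assumption (it ignores the vertical field of view and occlusion), and all names are illustrative:

```python
import math

def camera_close_performance(close_cells, cell, cam_yaw, half_fov):
    """Second number for an image acquisition device: how many
    close-range cells have their centre inside the camera's horizontal
    field of view (cam_yaw +/- half_fov, both in radians), used here
    as a stand-in for 'contained in the acquired image'."""
    count = 0
    for cx, cy in close_cells:
        ang = math.atan2(cy + cell / 2, cx + cell / 2)
        diff = (ang - cam_yaw + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half_fov:
            count += 1
    return count
```

The angle difference is wrapped into (-pi, pi] so a camera facing any direction is handled correctly.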
In one possible implementation, the information sensing of the at least one remote sensing area based on the respective candidate deployment parameters of a sensor, and determining the respective second sensing performance of the sensor for the at least one remote sensing area includes:
for a candidate deployment parameter of a sensor, for each remote sensing area in the at least one remote sensing area, performing information sensing on target objects in the remote sensing area based on the candidate deployment parameter of the sensor, and determining sensing performance of the sensor for the remote sensing area under the candidate deployment parameter;
and obtaining a second perception performance of the sensor for the at least one remote perception region under the candidate deployment parameters based on the perception performance of the sensor for each remote perception region under the candidate deployment parameters.
Here, considering a sensor's weak perception capability for distant targets, a target may fail to be identified even if sensor data covering all of it is acquired; that is, the sensor's far-range perceived performance depends strongly on the distant targets themselves. Information sensing may therefore be performed on a target object placed in each far-range sensing area, which makes the determination of the second perceived performance better suited to the requirements of far-range sensing.
In a possible implementation manner, in a case that the sensor is a radar device, the information sensing is performed on a target object in the remote sensing area based on a candidate deployment parameter of a sensor, and determining a sensing performance of the sensor for the remote sensing area under the candidate deployment parameter includes:
determining, for each of the at least one remote sensing area, a number of radio beams of the radar device that hit a target object within the remote sensing area upon transmitting the radio beams to the target object after deployment according to the candidate deployment parameters;
based on the number of radio beams striking the target object, a perceived performance of the sensor for the remote perceived area under the candidate deployment parameters is determined.
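The beam-hit count for one far-range region can be sketched by modelling the target object as an axis-aligned box; the box model and names are illustrative assumptions:

```python
def radar_far_performance(beam_points, target_box):
    """Perceived performance of a radar for one far-range sensing area:
    the number of simulated beam points that land on the target object,
    modelled here as an axis-aligned box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = target_box
    return sum(1 for bx, by in beam_points
               if x0 <= bx <= x1 and y0 <= by <= y1)
```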
In a possible implementation manner, in a case that the sensor is an image acquisition device, based on a candidate deployment parameter of a sensor, performing information sensing on a target object in the remote sensing area, determining sensing performance of the sensor for the remote sensing area under the candidate deployment parameter includes:
Determining, for each of the at least one remote sensing region, duty cycle information of a target object in an image acquired by the image acquisition device when the image acquisition device acquires the image of the target object in the remote sensing region after deployment according to the candidate deployment parameters;
and determining the perception performance of the sensor for the remote perception region under the candidate deployment parameters based on the duty ratio information.
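The duty-cycle measure for a camera is simply the fraction of image pixels the target's projection occupies; a one-line sketch with illustrative names:

```python
def camera_far_performance(target_pixel_count, image_width, image_height):
    """Duty-cycle of the target in the captured image: the fraction of
    image pixels occupied by the target's projection. A larger value
    means the distant target is easier to perceive."""
    return target_pixel_count / (image_width * image_height)
```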
In a possible implementation manner, the obtaining, based on the perceived performance of the sensor for each remote sensing area under the candidate deployment parameter, a second perceived performance of the sensor for the at least one remote sensing area under the candidate deployment parameter includes:
performing a weighted summation based on the sensing performance of the sensor for each far-range sensing area under the candidate deployment parameters and the in-area sensing weight set for each far-range sensing area, to obtain the second sensing performance of the sensor for the at least one far-range sensing area under the candidate deployment parameters.
Considering that, in an actual application scenario, different far-range sensing areas have different sensing requirements, the set in-area sensing weights can weight the sensing performance of each far-range sensing area, so that the determined second sensing performance better matches the sensing requirements of the actual scene.
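The weighted summation over far-range regions reduces to a dot product of per-region scores and their in-area weights; a sketch with illustrative names:

```python
def second_performance(region_scores, region_weights):
    """Second perceived performance of one sensor: the weighted sum of
    its per-far-region scores using each region's in-area sensing
    weight (e.g. the forward region may carry a larger weight)."""
    return sum(s * w for s, w in zip(region_scores, region_weights))
```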
In one possible embodiment, the at least one close-range sensing area and the at least one far-range sensing area are determined as follows:
determining, for each of the plurality of sensing areas, the distance between the sensing area and the driving device;
determining that the sensing area belongs to the close-range sensing areas if the determined distance is smaller than a preset threshold; or,
determining that the sensing area belongs to the far-range sensing areas if the determined distance is greater than or equal to the preset threshold.
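The threshold rule above can be sketched directly; the device is assumed to sit at the origin and each sensing area is represented by its centre point (illustrative names):

```python
import math

def classify_regions(cell_centres, threshold):
    """Partition sensing-area centre points into close-range and
    far-range sets by their Euclidean distance from the driving
    device at the origin."""
    close = [p for p in cell_centres if math.hypot(*p) < threshold]
    far = [p for p in cell_centres if math.hypot(*p) >= threshold]
    return close, far
```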
In one possible embodiment, where different types of sensors are included in a sensor combination, determining the perceived performance of a sensor combination under a candidate deployment parameter combination of the sensor combination based on the perceived performance of each sensor in the sensor combination under the candidate deployment parameter combination, comprising:
based on the perceived performance of each sensor in the sensor combination under the candidate deployment parameter combination of the sensor combination and the type perceived weight set for each sensor, weighted summation results in the perceived performance of the sensor combination under the candidate deployment parameter combination.
Here, considering that an actual application scenario places different requirements on the sensing performance of different types of sensors, different type perception weights may be assigned to different sensor types, so as to meet the sensing-performance requirements of the actual application scenario.
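The type-weighted combination score is another weighted sum, this time over the member sensors of a combination; a sketch with illustrative names and weights:

```python
def combination_performance(sensor_scores, type_weights):
    """Perception performance of a sensor combination: a weighted sum
    of each member sensor's score, weighted by its sensor type.

    sensor_scores: list of (sensor_type, score) pairs
    type_weights:  dict mapping sensor type to its perception weight
    """
    return sum(score * type_weights[sensor_type]
               for sensor_type, score in sensor_scores)
```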
In one possible implementation, the perceived performance of each sensor combination at each candidate deployment parameter combination is determined as follows:
for each sensor included in each sensor combination, determining a candidate deployment parameter corresponding to the sensor from each candidate deployment parameter combination of the sensor combination, and dividing the parameter value of each parameter in the candidate deployment parameters into a plurality of value intervals;
and determining the perception performance of the sensor under the candidate deployment parameter combination based on the designated parameter values in the multiple value intervals after the candidate deployment parameters corresponding to each sensor are divided.
Here, to reduce the large amount of computation that perceptual performance evaluation would otherwise require due to the many candidate deployment parameter combinations, the value of each parameter in a sensor's candidate deployment parameters can be divided into value intervals, and information sensing performed based on the divided intervals. This greatly reduces the evaluation workload and improves evaluation efficiency.
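Dividing a parameter's range into intervals and evaluating one designated value per interval can be sketched as follows (taking the midpoint as the designated value is an illustrative choice, not specified by the disclosure):

```python
def interval_midpoints(lo, hi, n):
    """Divide the value range [lo, hi] of one deployment parameter into
    n equal intervals and return the designated (midpoint) value of
    each, so only n values need evaluating instead of the whole range."""
    step = (hi - lo) / n
    return [lo + step * (i + 0.5) for i in range(n)]
```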
In one possible implementation, determining the candidate deployment parameter combination of the sensor combination with optimal perceived performance includes:
determining the value interval corresponding to each sensor's candidate deployment parameter in the candidate deployment parameter combination that makes the perception performance optimal;
determining the sensing performance of the sensor combination under the candidate deployment parameter combination again, based on a plurality of parameter values taken within the value interval corresponding to each sensor's candidate deployment parameter;
and determining the parameter value group with the optimal perception performance as the candidate deployment parameter combination of the sensor combination with optimal perception performance.
Here, a preferable parameter-value scheme can first be found based on the division into value intervals, and a further search can then be performed within the selected value interval, so that the accuracy of the perception-performance evaluation is improved while the amount of computation is kept low.
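This coarse-to-fine refinement can be illustrated for a single parameter (the disclosure searches over full parameter combinations; the one-dimensional form below is a simplified sketch with illustrative names):

```python
def coarse_to_fine(score, lo, hi, n=4, rounds=3):
    """Coarse-to-fine 1-D parameter search: evaluate the midpoint of
    each of n intervals of [lo, hi], keep the best-scoring interval,
    and re-divide it; repeat `rounds` times, returning the final
    midpoint as the selected parameter value."""
    for _ in range(rounds):
        step = (hi - lo) / n
        mids = [lo + step * (i + 0.5) for i in range(n)]
        best = max(range(n), key=lambda i: score(mids[i]))
        lo, hi = lo + best * step, lo + (best + 1) * step
    return (lo + hi) / 2
```

Each round shrinks the searched range by a factor of n, so accuracy improves geometrically while the number of evaluations grows only linearly in `rounds`.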
In a second aspect, embodiments of the present disclosure also provide an apparatus for sensor deployment, the apparatus comprising:
a first determining module, configured to determine at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations for each sensor combination according to the body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor;
a second determining module, configured to determine the candidate deployment parameter combination of the sensor combination with optimal perception performance according to the perception performance of each sensor combination under each candidate deployment parameter combination;
and a deployment module, configured to deploy the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method of sensor deployment according to the first aspect and any of its various embodiments.
In a fourth aspect, the presently disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of sensor deployment as described in the first aspect and any of its various embodiments.
For the effects of the sensor deployment device, the electronic device, and the computer-readable storage medium, reference is made to the description of the sensor deployment method, which is not repeated here.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. The drawings are incorporated in and constitute a part of the specification; they show embodiments consistent with the present disclosure and, together with the description, serve to illustrate the technical solutions of the present disclosure. It is to be understood that the following drawings illustrate only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; for a person of ordinary skill in the art, other related drawings may be obtained from these drawings without inventive effort.
FIG. 1 illustrates a flow chart of a method of sensor deployment provided by embodiments of the present disclosure;
FIG. 2 illustrates an application schematic of a method of sensor deployment provided by embodiments of the present disclosure;
FIG. 3 illustrates a schematic diagram of an apparatus for sensor deployment provided by embodiments of the present disclosure;
fig. 4 shows a schematic diagram of an electronic device provided by an embodiment of the disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" herein merely describes an association relationship, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set consisting of A, B, and C.
It has been found that, in the related art of automatic driving, sensors act as the eyes of a vehicle and play a vital role in sensing information in the vehicle's surrounding environment in real time.
For example, a sensor such as a lidar can capture rich three-dimensional (3D) information about various objects in the environment and, based on object reflectivity, distinguish high-reflectivity objects from low-reflectivity ones; it is widely applied in related directions such as positioning, mapping, and perception. However, due to the number and distribution characteristics of the laser beams, the point cloud data is sparse, and the sensing distance is usually limited to about 100 meters. As another example, although a camera sensor has natural disadvantages in acquiring depth information, it can provide rich semantic and color information, which is very helpful for object detection and classification. As yet another example, a millimeter-wave radar can reach a detection distance of about 250 meters and can perceive the real-time speed of an object through the Doppler effect; however, its returned data is sparser than the point cloud returned by a lidar, resulting in lower positioning accuracy for target objects.
To achieve environmental perception with as full coverage around the vehicle as possible, multiple kinds of sensors usually need to be arranged based on manual experience, with the type and installation position of the sensors to be installed determined by the user. As a result, either too many sensors are deployed, wasting resources, or too few are deployed, failing to meet the perception requirements of the autonomous vehicle.
Based on the above researches, the disclosure provides a method, a device, an electronic device and a storage medium for deploying a sensor, which can improve the perception performance on the premise of saving sensor resources and ensure that acquired sensor data can better serve subsequent applications.
To facilitate understanding of the present embodiments, a sensor deployment method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the sensor deployment method provided in the embodiments of the present disclosure is generally a computer device with a certain computing capability, including, for example, a terminal device, a server, or another processing device. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, etc. In some possible implementations, the sensor deployment method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to fig. 1, a flowchart of a method for deploying a sensor according to an embodiment of the disclosure is shown, where the method includes steps S101 to S103, where:
S101, determining at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations for each sensor combination according to the body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor;
S102, determining the candidate deployment parameter combination of the sensor combination with optimal perception performance according to the perception performance of each sensor combination under each candidate deployment parameter combination;
S103, deploying the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
Here, to facilitate understanding of the sensor deployment method provided by the embodiments of the present disclosure, its application scenario is first described in detail. The method is mainly applicable to the technical field of automatic driving: when sensors are deployed on a vehicle, a robot, or another driving device using this method, the surrounding environment of the driving device can be better perceived, so that the driving device can be controlled according to the perceived surroundings.
The sensor may be a radar device that captures 3D information of various objects in the surrounding environment, an image acquisition device that captures semantic information and color information, a millimeter wave radar device for distance detection, or a sensor having other functions, which is not particularly limited in this disclosure.
For better sensor deployment, the method for sensor deployment provided by the embodiments of the present disclosure may first determine at least one sensor combination to be deployed on a driving apparatus and a plurality of candidate deployment parameter combinations for each sensor combination.
A candidate deployment parameter combination of a sensor combination can be obtained from the candidate deployment parameters of each sensor in the combination; that is, one candidate deployment parameter can be drawn from each sensor, and the drawn candidate deployment parameters of the sensors combined to form one candidate deployment parameter combination.
Here, it is considered that one sensor generally has a plurality of parameters. For example, for a radar device, the parameters may be the installation position, installation angle, horizontal resolution, and so on; for an image acquisition device, the parameters may be the camera's intrinsic and extrinsic parameters, installation position, etc.; for a millimeter-wave radar device, the parameters may be the field-of-view angle, installation height, and the like. Based on the above, various combinations of candidate deployment parameters can be formed across the sensors, so each sensor combination may correspond to a plurality of candidate deployment parameter combinations.
It should be noted that each parameter of a sensor may take a plurality of values. For example, the installation height of a radar device may be 0.1 meter above the vehicle roof, or 0.5 meter above the vehicle roof. A parameter value may also be a range. For example, installation position 1, installation angle range (angle A to angle B), and horizontal resolution a form a first candidate deployment parameter of the radar device; installation position 2, installation angle range (angle A to angle B), and horizontal resolution a form a second candidate deployment parameter; installation position 3, installation angle range (angle C to angle D), and horizontal resolution b form a third candidate deployment parameter; and so on. The candidate deployment parameters of the individual sensors in a sensor combination are then combined to obtain the plurality of candidate deployment parameter combinations of that sensor combination.
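As an illustration of how such combinations can be enumerated, the following sketch forms the Cartesian product of each sensor's candidate deployment parameters (all parameter names and values here are hypothetical, not taken from the disclosure):

```python
from itertools import product

# Hypothetical candidate deployment parameters per sensor: each sensor
# maps to the list of its candidate parameter sets.
sensor_candidates = {
    "lidar": [
        {"position": 1, "angle_range": ("A", "B"), "h_res": "a"},
        {"position": 2, "angle_range": ("A", "B"), "h_res": "a"},
        {"position": 3, "angle_range": ("C", "D"), "h_res": "b"},
    ],
    "camera": [
        {"position": "front", "fov_deg": 60},
        {"position": "front", "fov_deg": 120},
    ],
}

# A candidate deployment parameter combination of the sensor combination
# picks one candidate per sensor: the Cartesian product of the lists.
names = list(sensor_candidates)
combos = [dict(zip(names, choice))
          for choice in product(*(sensor_candidates[n] for n in names))]

print(len(combos))  # 3 lidar candidates x 2 camera candidates = 6
```

Each element of `combos` is one candidate deployment parameter combination, ready to be scored for sensing performance.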
In the embodiment of the disclosure, the sensor combination scheme can be determined according to the body parameters of the driving equipment, the range to be perceived, the set number of sensors, and the isolated sensing performance of each sensor. The isolated sensing performance of a sensor may be its nominal factory performance, or performance obtained by testing independently of any deployment environment.
This mainly considers that the sensor deployment method provided by the embodiments of the present disclosure depends on the driving equipment: different driving equipment needs to perceive different ranges in different driving environments, and the variety and number of sensors actually available are large. A deployment policy tied to the driving equipment is therefore provided, so that the feasible sensor combinations can be determined given the preset number of sensors to be deployed, the body parameters of the driving equipment, and the isolated sensing performance of each sensor.
The driving equipment may be a car, a bus, or other equipment requiring sensor deployment. Taking a car as an example, its body parameters may include the length, width, and height of the car, and may also include the mass, minimum ground clearance, and other information.
In addition, the sensing performance of a sensor is limited by its type, and the isolated sensing performance differs between sensors. For example, a 360° scanning radar device can perceive targets in all surrounding directions, whereas the sensing range of an image acquisition device is generally limited to a smaller region.
Therefore, in practical application, if the range to be perceived by a driving equipment is large and the set number of sensors is large, radar devices whose sensing range matches the range to be perceived can perform radar sensing, supplemented by more image acquisition devices for image sensing; if the range to be perceived is large but the set number of sensors is small, several radar devices whose combined sensing range matches the range to be perceived can perform radar sensing. In this way, the one or more sensors to be deployed on the driving equipment are selected.
The sensors selected for one sensor combination may be all of the sensors provided in advance, or a subset of them; for example, 8 sensors may be selected from 10 available sensors.
Whether information sensing is performed based on all of the selected sensors or on a subset of them, the sensing performance of each sensor combination under each candidate deployment parameter combination can be determined, and the candidate deployment parameter combination of the sensor combination with the optimal sensing performance can be determined from these results.
Considering that each candidate deployment parameter combination of a sensor combination is derived from the candidate deployment parameters of a plurality of sensors, the overall sensing performance of the sensor combination can be determined from the sensing performance of each included sensor under its respective candidate deployment parameter; the higher the sensing performance, the higher the probability that the sensor combination is selected for deployment on the driving equipment.
In the embodiment of the disclosure, the sensors are deployed on the driving equipment according to the candidate deployment parameter combination of the sensor combination with optimal sensing performance, so that better sensing performance can be obtained with fewer sensors, that is, at a lower sensor cost.
Considering the influence of distance on sensing performance, the embodiment of the disclosure may divide the range to be perceived by the driving equipment into a plurality of sensing areas, including close-range sensing areas and long-range sensing areas, and determine the sensing performance separately for the two kinds of areas, so as to determine the sensing performance of each sensor included in a sensor combination. The steps are as follows:
Step one, dividing the range to be perceived by the driving equipment into a plurality of sensing areas according to a preset dividing size, the plurality of sensing areas including at least one close-range sensing area and at least one long-range sensing area;
Step two, for each sensor included in the sensor combination, performing information sensing on the at least one close-range sensing area and on the at least one long-range sensing area based on each candidate deployment parameter of the sensor, and determining each first sensing performance of the sensor for the at least one close-range sensing area and each second sensing performance of the sensor for the at least one long-range sensing area;
Step three, obtaining the sensing performance of each sensor included in the sensor combination under each candidate deployment parameter based on each first sensing performance and each second sensing performance.
The range to be perceived may be preset for different driving equipment; for example, a cuboid space with the vehicle center point as the origin, extending 150 meters to the front and 150 meters to the rear, about 50 meters in width, and 6 meters in height, may be defined as the spatial sensing range.
In the embodiment of the present disclosure, the range to be perceived may be divided into a plurality of sensing areas according to a preset dividing size. For example, the above cuboid space may be divided into small cubic cells (corresponding to the plurality of sensing areas) at intervals of 0.1 meter. A sensing area may be a close-range sensing area near the driving equipment, or a long-range sensing area far from it.
In the embodiment of the present disclosure, the distance between each sensing area and the driving equipment may be determined, for example as the distance between the center of the sensing area and the center of the vehicle. If the distance is smaller than a preset threshold, the sensing area is determined to belong to the close-range sensing areas; otherwise, it belongs to the long-range sensing areas.
The preset threshold should be neither too large nor too small: too large a threshold prevents the performance of genuinely distant sensing areas from being well reflected, while too small a threshold prevents the performance of genuinely near sensing areas from being well reflected. In the embodiment of the disclosure, the preset threshold may be determined in combination with the preset dividing size: a larger dividing size calls for a larger threshold, and a smaller dividing size for a smaller one. For example, with a dividing size of 0.1 meter, the preset threshold may be set to 1 meter.
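The area division and near/far classification described above can be sketched as follows, in a simplified two-dimensional (top-view) form; the grid extents and threshold are toy values chosen for illustration, not the 0.1 m / 1 m settings of the example:

```python
import math

def classify_regions(half_x, half_y, cell, threshold):
    """Divide the (top-view) range to be perceived into square cells and
    label each cell near or far by the distance from its centre to the
    vehicle centre at the origin."""
    near, far = [], []
    nx, ny = int(2 * half_x / cell), int(2 * half_y / cell)
    for i in range(nx):
        for j in range(ny):
            cx = -half_x + (i + 0.5) * cell
            cy = -half_y + (j + 0.5) * cell
            if math.hypot(cx, cy) < threshold:
                near.append((cx, cy))
            else:
                far.append((cx, cy))
    return near, far

# A 4 m x 4 m patch with 1 m cells and a 1.5 m near/far threshold:
near, far = classify_regions(half_x=2.0, half_y=2.0, cell=1.0, threshold=1.5)
print(len(near), len(far))  # the 4 central cells are "near", the other 12 "far"
```

A full implementation would use three-dimensional cells over the whole cuboid, but the distance-threshold rule is the same.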
Fig. 2 shows the division result of the range to be perceived in a top view, where the rectangle in the middle represents the driving equipment. The four areas A, B, C, D represent 4 close-range sensing area groups at the front, rear, left, and right (each group corresponding to a plurality of close-range sensing areas). Area E is the front main long-range sensing area group (comprising a plurality of long-range sensing areas) and ranks highest in importance. The three areas F, G, H represent 3 long-range sensing area groups at the left and right rear (each corresponding to a plurality of long-range sensing areas), slightly lower in importance than area E. Area I represents the sensing areas of above-ground objects (including both long-range and close-range sensing areas), such as traffic lights and traffic signs.
With the long-range and close-range sensing areas determined as above, information sensing may be performed on the at least one close-range sensing area and on the at least one long-range sensing area based on each candidate deployment parameter of each sensor included in a sensor combination, determining each first sensing performance of the sensor for the close-range areas and each second sensing performance for the long-range areas; the sensing performance of each sensor over the whole range to be perceived can then be determined from these first and second sensing performances.
When different candidate deployment parameters are selected for a sensor, the corresponding first and second sensing performances also differ, so both can be determined per candidate deployment parameter of the sensor.
In practical applications, different application scenarios place different requirements on the sensing performance of different sensing areas: some simple scenarios focus more on close-range performance, while some complex scenarios focus more on long-range performance. Based on this, in the embodiment of the disclosure, when determining the sensing performance of a sensor over the whole range to be perceived, corresponding inter-area sensing weights are assigned to the first and second sensing performances; that is, the first and second sensing performances of the sensor under a candidate deployment parameter are weighted and summed to obtain the sensing performance of the sensor under that candidate deployment parameter, thereby better adapting to the requirements of different scenarios.
Considering that a sensor has a high perceiving capability for close-range targets, even partial sensor data of a target may suffice to identify it; that is, close-range sensing performance has no strong dependence on the target object itself. Therefore, information sensing can be performed on the synthesized area corresponding to the at least one close-range sensing area based only on each candidate deployment parameter of each sensor, without placing a target object. This keeps the operation simple, and the first sensing performance so determined is well suited to the requirements of close-range sensing.
The synthesized area may be the combination of the at least one close-range sensing area; for the example shown in fig. 2, the synthesized area may correspond to the combination of the close-range sensing area groups A, B, C, D.
In the embodiments of the present disclosure, the method for determining the first sensing performance differs for different types of sensors; two cases are described below.
First aspect: in the case where the sensor is a radar device, the first perceived performance of the sensor for the composite region may be determined as follows:
step one, determining, for a candidate deployment parameter of a radar device, a first number of short-range sensing areas in which radio beams fall in a synthetic area in a case where the radar device transmits the radio beams to the synthetic area after deployment according to the candidate deployment parameter;
and step two, determining first perception performance of the radar equipment for the synthesized area under the candidate deployment parameters according to the first quantity.
Here, when radar devices deployed according to different candidate deployment parameters transmit radio beams toward the synthesized area, the first number of close-range sensing areas into which the beams fall differs. The larger the first number, the better the current deployment parameters (position, height, and so on) of the radar device perceive the close-range sensing areas; conversely, the smaller the number, the worse the sensing effect. The corresponding first sensing performance can therefore be determined from the first number, the two being positively correlated.
In practical application, the sensing effect can be determined directly from a first ratio between the first number and the total number of close-range sensing areas contained in the synthesized area: the higher the ratio, the better the sensing effect of the corresponding point cloud; the lower the ratio, the worse the effect.
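A minimal sketch of this first ratio, assuming a set of simulated beam hit points in the top-view plane (the cell indices and hit coordinates below are invented for illustration):

```python
def radar_first_performance(near_cells, beam_points, cell):
    """First sensing performance of a radar deployment for the synthesized
    close-range area: the proportion of close-range cells into which at
    least one simulated beam point falls (the 'first ratio' above)."""
    def cell_of(pt):
        # Map a 2-D point to the index of the grid cell containing it.
        return (int(pt[0] // cell), int(pt[1] // cell))
    covered = {cell_of(p) for p in beam_points}
    hit = sum(1 for c in near_cells if c in covered)
    return hit / len(near_cells)

near_cells = [(0, 0), (0, 1), (1, 0), (1, 1)]        # cell indices of the area
beam_points = [(0.2, 0.3), (1.4, 0.6), (1.1, 1.9)]   # simulated beam hits
print(radar_first_performance(near_cells, beam_points, cell=1.0))  # 0.75
```

In a real evaluation, `beam_points` would come from a lidar simulator run with the candidate deployment parameters (installation position, angle, resolution).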
It should be noted that the radar device in the embodiment of the present disclosure may be a rotary scanning lidar, in which case the emitted radio beams are laser beams; other types of lidar, or other types of radar devices, are also possible, and this is not particularly limited herein.
In practical applications, the radio beams may be simulated using simulation technology, and placing target objects in the long-range sensing areas may likewise be realized in simulation; using simulation greatly reduces the deployment cost.
Second aspect: in the case where the sensor is an image acquisition device, the first sensing performance of the sensor for the synthesized area may be determined as follows:
Step one, for a candidate deployment parameter of the image acquisition device, determining a second number of close-range sensing areas of the synthesized area contained in an image acquired when the image acquisition device, deployed according to the candidate deployment parameter, captures the synthesized area;
Step two, determining the first sensing performance of the image acquisition device for the synthesized area under the candidate deployment parameter according to the second number.
Similarly, when image acquisition devices deployed according to different candidate deployment parameters capture images of the synthesized area, the second number of close-range sensing areas contained in the acquired images differs. The larger the second number, the better the current deployment parameters (position, height, and so on) of the image acquisition device perceive the close-range sensing areas; conversely, the smaller the number, the worse the sensing effect. The corresponding first sensing performance can therefore be determined from the second number, the two being positively correlated.
In practical applications, the sensing effect may be determined directly from a second ratio between the second number and the total number of close-range sensing areas contained in the synthesized area: the higher the ratio, the better the sensing effect of the corresponding image; the lower the ratio, the worse the effect.
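For the camera case, a very simplified stand-in for "cells contained in the acquired image" is a horizontal field-of-view test on the cell centres; a real simulator would project each cell into the image through the camera intrinsic and extrinsic parameters. The yaw and FOV values below are hypothetical:

```python
import math

def camera_first_performance(near_centres, cam_yaw_deg, fov_deg):
    """First sensing performance of a camera deployment for the synthesized
    close-range area: the proportion of close-range cell centres falling
    inside the camera's horizontal field of view (a simplified coverage
    test standing in for full image projection)."""
    half = fov_deg / 2.0
    def visible(c):
        ang = math.degrees(math.atan2(c[1], c[0])) - cam_yaw_deg
        ang = (ang + 180.0) % 360.0 - 180.0   # wrap to (-180, 180]
        return abs(ang) <= half
    seen = sum(1 for c in near_centres if visible(c))
    return seen / len(near_centres)

# Four cell centres around the vehicle; a forward-facing 120-degree camera:
centres = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (-1.0, 0.0)]
print(camera_first_performance(centres, cam_yaw_deg=0.0, fov_deg=120.0))  # 0.5
```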
Similarly, in practical application, the above process may be implemented based on a simulation technique, which is not described herein.
It should be noted that, in the embodiments of the present disclosure, the first sensing performance may be determined for radar devices and image acquisition devices based on the above two aspects, and may also be determined for other sensors in a similar manner, which is not described herein.
Considering that a sensor has a weak perceiving capability for long-range targets, even sensor data covering an entire target may not suffice to identify it; that is, long-range sensing performance depends strongly on the target object. Information sensing is therefore performed with a target object placed in each long-range sensing area: for one candidate deployment parameter of a sensor and for each long-range sensing area, information sensing is performed on that area based on the candidate deployment parameter, and the sensing performance of the sensor for that long-range sensing area under the candidate deployment parameter is determined.
In the embodiments of the present disclosure, the method for determining the second sensing performance differs for different types of sensors; two cases are described below.
First aspect: in the case where the sensors are radar devices, the second perceived performance of each sensor for a remote perceived area under a candidate deployment parameter may be determined as follows:
step one, determining, for each of at least one remote sensing area, the number information of radio beams striking a target object of the radar device in the case of transmitting the radio beams to the target object in the remote sensing area after deployment according to the candidate deployment parameters;
and step two, determining the perception performance of the sensor for the remote perception area under the candidate deployment parameters based on the number information of radio beams of the target object.
The target object in the embodiments of the present disclosure may be a human body, such as an adult, a child, or the like, or may be an object, such as a vehicle, a building, or the like.
To determine the sensing performance corresponding to a long-range sensing area, the target object may be placed in that area; when the radar device, deployed according to one candidate deployment parameter, transmits radio beams toward that area, the sensing performance corresponding to the area is determined from the number of beams striking the target object.
To some extent, the more radio beams strike the target object, the better the sensing performance of the radar device.
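A rough sketch of how the beam count on a target could be estimated from the radar's horizontal and vertical angular resolution (the resolutions, target size, and distance below are invented; a real count would come from the beam simulation):

```python
import math

def beams_on_target(h_res_deg, v_res_deg, target_w, target_h, dist):
    """Approximate count of radar beams striking a target: the target's
    angular width and height at the given distance, divided by the beam
    grid spacing (horizontal x vertical angular resolution)."""
    ang_w = math.degrees(2 * math.atan(target_w / (2 * dist)))
    ang_h = math.degrees(2 * math.atan(target_h / (2 * dist)))
    return int(ang_w / h_res_deg) * int(ang_h / v_res_deg)

# A 0.5 m x 1.7 m pedestrian-sized target at 50 m, with a hypothetical
# 0.1-degree horizontal and 0.4-degree vertical resolution:
m = beams_on_target(0.1, 0.4, 0.5, 1.7, 50.0)
print(m)  # 20
```

The count shrinks rapidly with distance, which is why the long-range second sensing performance is evaluated per area with a target object in place.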
Second aspect: in the case where the sensor is an image acquisition device, the second sensing performance of the sensor for a long-range sensing area under a candidate deployment parameter may be determined as follows:
Step one, for each of the at least one long-range sensing area, determining the proportion of the acquired image occupied by the target object when the image acquisition device, deployed according to the candidate deployment parameter, captures the target object in that long-range sensing area;
Step two, determining the sensing performance of the sensor for the long-range sensing area under the candidate deployment parameter based on that proportion.
To determine the sensing performance corresponding to a long-range sensing area, the target object may be placed in that area; when the image acquisition device, deployed according to one candidate deployment parameter, captures an image of the target object in that area, the sensing performance corresponding to the area is determined from the proportion of the acquired image occupied by the target object.
To some extent, the larger the proportion of the acquired image occupied by the target object, the greater the probability that the target object is detected by the image acquisition device, and hence the better the sensing performance of the image acquisition device.
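Under a simple pinhole-camera assumption, that image proportion can be sketched as follows (focal length, image size, target size, and distance are all illustrative values, not parameters from the disclosure):

```python
def pixel_ratio(target_w, target_h, dist, focal_px, img_w, img_h):
    """Fraction of the image occupied by a target under a pinhole model:
    projected pixel width x height over the total image area."""
    pw = focal_px * target_w / dist   # projected width in pixels
    ph = focal_px * target_h / dist   # projected height in pixels
    return (pw * ph) / (img_w * img_h)

# A 4.5 m x 1.5 m car silhouette at 60 m, 1000 px focal length, 1920x1080 image:
r = pixel_ratio(4.5, 1.5, 60.0, 1000.0, 1920, 1080)
print(round(r, 6))
```

The ratio falls off with the square of distance, matching the intuition that long-range sensing performance depends strongly on the target.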
Similarly, in the embodiments of the present disclosure, the second sensing performance may be determined for radar devices and image acquisition devices based on the above two aspects, and may also be determined for other sensors, which is not described herein.
Considering that the sensing requirements for different long-range sensing areas are not the same, in practical application the second sensing performance of a sensor for the at least one long-range sensing area under a candidate deployment parameter can be obtained as a weighted sum of the sensing performance of the sensor for each long-range sensing area under that parameter, using the in-area sensing weight set for each long-range sensing area, so that the determined second sensing performance better meets the requirements of the actual application scenario.
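The weighted sum over long-range sensing area groups might look like the following sketch (the performance values and weights are invented for illustration):

```python
def second_performance(per_region_perf, region_weights):
    """Second sensing performance for the long-range areas: a weighted sum
    of the per-area performance values using the in-area sensing weights
    (e.g. a higher weight for the front main long-range group)."""
    assert len(per_region_perf) == len(region_weights)
    return sum(p * w for p, w in zip(per_region_perf, region_weights))

# Hypothetical performances for area groups E (front main), F, G, H:
perf = [0.9, 0.6, 0.5, 0.4]
weights = [0.4, 0.2, 0.2, 0.2]  # front main group weighted highest
print(round(second_performance(perf, weights), 2))
```

Raising the weight of the front main group E reflects its higher importance ranking in fig. 2.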
In a specific application, a sensor combination may mix sensors of multiple types. To determine the sensing performance of such a combination under a candidate deployment parameter combination, a sensing performance of the corresponding type may first be determined for each sensor separately, and the sensing performance of the combination then determined by weighted summation.
In addition, there may be one or more target objects in the embodiments of the present disclosure. In practical application, different target objects present different reflectivity and other properties to the same sensor, so the resulting sensing performance differs. Different object sensing weights can therefore be set for different target objects or classes of target objects, and the sensing performance comprehensively evaluated in combination with the sensing area division above and the different sensor types.
In the embodiment of the disclosure, the target objects that may appear in different scenarios are taken as the subjects, and sensing performance is embodied as the three-dimensional detection capability of the sensors for these objects, that is, the ranging performance and the recognition performance for the objects.
Specifically, in the embodiment of the disclosure, the whole range to be perceived is first discretized into different sensing areas; different objects are then placed in the sensing areas; and for each sensor combination, after the results of the individual sensors in the combination are fused, the detection performance of the combination for the object at that position, and its total perceiving capability for the object, are evaluated. Finally, the sensing performance over different objects and different positions is weighted and averaged to obtain the final total sensing score.
To further understand the above-described method of sensor deployment, a description may be provided below in connection with formulas.
Assume that the current application scenario has m classes of objects to be perceived, denoted c_1, c_2, ..., c_m; that the range to be perceived contains n sensing areas, denoted x_1, x_2, ..., x_n; and that the number of sensors is t, the sensors being denoted p_1, p_2, ..., p_t.
Here, for any sensor p_k, the sensing performance for an object c_i placed at x_j can be expressed as a function Sen(c_i, x_j, p_k). The total sensing performance over the whole range to be perceived (which evaluates the probability that targets are detected) is:
Total = Σ_{i=1..m} Σ_{j=1..n} W(c_i, x_j) × F(Sen(c_i, x_j, p_1), ..., Sen(c_i, x_j, p_t))
where the F (Fusion) function fuses the results of the different sensors and outputs their total sensing performance, and W is the weight of the sensing performance for different objects and different sensing areas (integrating the object sensing weight and the inter-area sensing weight).
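A runnable sketch of this total-performance computation; the weight, Sen, and fusion functions below are toy placeholders, not the patent's fitted detection models:

```python
from functools import reduce
from operator import mul

def total_perception(objects, regions, sensors, W, Sen, F):
    """Total sensing performance: sum over object classes c_i and sensing
    regions x_j of W(c_i, x_j) times the fused per-sensor performance
    F(Sen(c_i, x_j, p_1), ..., Sen(c_i, x_j, p_t))."""
    return sum(W(c, x) * F([Sen(c, x, p) for p in sensors])
               for c in objects for x in regions)

# Toy setup: one object class, two region labels, two sensors whose
# detection probability is scaled by a per-sensor quality factor, and a
# probability-style fusion F = 1 - prod(1 - s):
F = lambda ss: 1.0 - reduce(mul, (1.0 - s for s in ss), 1.0)
Sen = lambda c, x, p: {"near": 0.8, "far": 0.5}[x] * p
score = total_perception(["car"], ["near", "far"], [1.0, 0.5],
                         W=lambda c, x: 0.5, Sen=Sen, F=F)
print(round(score, 4))
```

With real inputs, `Sen` would be the fitted per-sensor curves described below and `W` would combine the object and inter-area sensing weights.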
Given that sensing performance in the close-range sensing areas depends little on the object itself, it may be determined using the first ratio described above (the proportion of the relevant synthesized area scanned by the radar device) or the second ratio (the proportion of the relevant synthesized area captured by the image acquisition device). The following describes how the sensing performance for a target object in a long-range sensing area is evaluated.
Taking a lidar device as an example, the sensing performance for a target object is positively correlated with the number of point-cloud points the radar casts onto the object. Thus, for a target object C at pose X, the number of beams M striking the target is first obtained based on the horizontal and vertical angular resolution of the laser.
Since the detection capability for the target is not directly proportional to M, a point cloud detection model may be used as a reference to balance the relationship between the beam count and the detection capability. The detection model gives the probability that different targets are detected at different distances for a particular radar pose. For a given placement, the beam counts striking targets at different distances can be tallied, yielding the relationship between a target's beam count and its detection probability. Specifically, it can be determined by fitting a curve as follows:
Sen(c, x, p = lidar device) = F1(log(M))
The calculation of the sensing performance of an image acquisition device is similar to that of the lidar: the sensing performance for a target is positively correlated with the pixel size of the object in the image. Therefore, the relationship between the pixel size N of different targets and the probability of being detected at a specific camera placement can be obtained from a camera detection model. Here, the following quadratic curve relationship between the detection probability and the logarithm of the pixel size can be obtained:
Sen(c, x, p = image acquisition device) = F2(log(N))
Here, for a plurality of sensors of the same type, calculation may be performed in a pre-fusion manner. For example, for lidar devices, the beam counts striking the target may be added before applying the curve, giving the overall sensing performance of the same-type sensors. Taking two radar devices as an example: Sen(lidar devices) = F1(log(M1 + M2)).
In the embodiment of the disclosure, the sensing performance of a plurality of sensors of the same type can also be calculated in a post-fusion manner. For image acquisition devices, since their shooting positions differ, the determined pixel proportions also differ; the sensing performance of each image acquisition device under its pixel proportion can be determined first, and the performances then fused, giving the overall sensing performance of the same-type image acquisition devices. Taking two image acquisition devices as an example:
Sen(image acquisition devices) = F2(log(N1)) + F2(log(N2)) − F2(log(N1)) × F2(log(N2))
For sensors of different types, calculation can likewise be performed in a post-fusion manner, with the corresponding formula:
F(Sen1, Sen2) = Sen1 + Sen2 − Sen1 × Sen2
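The pre-fusion and post-fusion rules above can be sketched together as follows; the fitted curves F1 and F2 are stand-ins with made-up slopes, since the patent's actual fitted models are not given:

```python
import math

def f1(log_m):
    """Hypothetical fitted lidar curve: detection probability vs. log beam count."""
    return min(1.0, max(0.0, 0.2 * log_m))

def f2(log_n):
    """Hypothetical fitted camera curve: detection probability vs. log pixel size."""
    return min(1.0, max(0.0, 0.1 * log_n))

def fuse(s1, s2):
    """Post-fusion F(Sen1, Sen2) = Sen1 + Sen2 - Sen1 x Sen2: detected if
    at least one sensor detects the target (independence assumed)."""
    return s1 + s2 - s1 * s2

# Pre-fusion for two lidars: add the beam counts, apply the curve once.
sen_lidar = f1(math.log(64 + 36))
# Post-fusion for two cameras, then post-fusion across the two sensor types.
sen_cam = fuse(f2(math.log(400)), f2(math.log(900)))
total = fuse(sen_lidar, sen_cam)
print(round(sen_lidar, 3), round(sen_cam, 3), round(total, 3))
```

Note that the fused value is never smaller than either input and never exceeds 1, consistent with reading the Sen values as detection probabilities.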
Based on the above, the detection probability can be used as the evaluation index of sensing performance at the application stage; that is, the sensing performance evaluation value over the whole range to be perceived can be determined from the detection probability. To some extent, the larger the detection probability, the better the sensing performance of the corresponding sensors, and conversely, the worse.
When a sensor parameter takes values over a range, the number of candidate deployment parameter combinations, and hence of sensing performance evaluations, can be very large. To reduce this computation, the parameter value range can be divided into a plurality of intervals; the embodiment of the disclosure can thus divide the candidate deployment parameters into value intervals and perform information sensing based on the divided intervals, as follows:
Step one, for each sensor combination, determining the candidate deployment parameter corresponding to each sensor from each candidate deployment parameter combination of the sensor combination, and dividing the parameter values of each parameter in the candidate deployment parameter into a plurality of value intervals;
Step two, determining the sensing performance of the sensor combination under the candidate deployment parameter combination based on designated parameter values within the plurality of value intervals obtained by dividing the candidate deployment parameter corresponding to each sensor.
In the embodiment of the disclosure, information sensing can be performed several times, from coarse to fine. For coarse information sensing, the parameter values of each parameter in the candidate deployment parameter of each sensor in the combination are divided into value intervals, and a designated parameter value is then selected from each interval to participate in information sensing; for example, an endpoint or midpoint value of an interval is selected as the designated value for determining the sensing performance. Compared with letting all parameter values of all sensors' candidate deployment parameters participate in information sensing, which is computationally expensive, this interval division preserves the completeness of parameter sampling while greatly reducing the computation.
For refined information sensing, the value interval corresponding to the candidate deployment parameter of each sensor in the candidate deployment parameter combination that makes the sensing performance optimal can be determined first; then, information sensing of the range to be perceived by the driving device is performed again based on a plurality of parameter values within the value interval corresponding to each determined candidate deployment parameter. The parameter value group with optimal sensing performance can then be determined as the candidate deployment parameter combination of the sensor combination with optimal sensing performance.
In this way, according to the embodiment of the disclosure, coarse information sensing first locates a preferable parameter value scheme, and refined information sensing then searches nearby parameter value schemes around it, improving the evaluation accuracy of the sensing performance while reducing the amount of computation.
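For illustration only, the coarse-to-fine search above can be sketched in a few lines of Python. This is a minimal sketch, not the claimed implementation: `evaluate` stands in for the perception performance evaluation of a single deployment parameter, and the interval count, the choice of the midpoint as the designated parameter value, and the fine-stage sample count are all assumptions made for the example.

```python
def split_intervals(lo, hi, n):
    """Divide a parameter's value range [lo, hi] into n equal value intervals."""
    step = (hi - lo) / n
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n)]

def coarse_to_fine_search(param_range, evaluate, n_intervals=4, n_fine=9):
    """Coarse stage: evaluate one designated value (the midpoint) per interval.
    Fine stage: re-evaluate several parameter values inside the best interval."""
    # Coarse information sensing: one designated parameter value per interval
    intervals = split_intervals(*param_range, n_intervals)
    best_iv = max(intervals, key=lambda iv: evaluate((iv[0] + iv[1]) / 2))
    # Refined information sensing: sample several values within the best interval
    lo, hi = best_iv
    candidates = [lo + i * (hi - lo) / (n_fine - 1) for i in range(n_fine)]
    return max(candidates, key=evaluate)

# Toy performance surface peaking at 3.7 (illustrative only)
best = coarse_to_fine_search((0.0, 10.0), lambda x: -(x - 3.7) ** 2)
```

On this toy surface the coarse stage selects the interval (2.5, 5.0) and the fine stage returns 3.75, close to the true optimum, while evaluating only 13 parameter values instead of a dense grid over the whole range.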
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Based on the same inventive concept, the embodiment of the present disclosure further provides a sensor deployment device corresponding to the sensor deployment method, and since the principle of solving the problem by the device in the embodiment of the present disclosure is similar to that of the sensor deployment method in the embodiment of the present disclosure, the implementation of the device may refer to the implementation of the method, and the repetition is omitted.
Referring to fig. 3, a schematic diagram of an apparatus for sensor deployment according to an embodiment of the disclosure is shown, where the apparatus includes: a first determination module 301, a second determination module 302, and a deployment module 303; wherein,
A first determining module 301, configured to determine at least one sensor combination deployed on the driving device and a plurality of candidate deployment parameter combinations of each sensor combination according to the main body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor;
a second determining module 302, configured to determine, according to the perceived performance of each sensor combination under each candidate deployment parameter combination, a candidate deployment parameter combination of the sensor combination with optimal perceived performance;
a deployment module 303, configured to deploy the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
With this sensor deployment device, once at least one sensor combination and a plurality of candidate deployment parameter combinations of each sensor combination have been determined, the device can determine the candidate deployment parameter combination of the sensor combination with optimal perception performance according to the perception performance of each sensor combination under each candidate deployment parameter combination. The perception performance characterizes the sensing result of each sensor combination; that is, to a certain extent, the larger the performance value, the better the sensing. The device can therefore deploy the sensors according to the determined candidate deployment parameter combination, so that the deployed sensors achieve better perception performance with a smaller number of sensors, that is, better perception performance at lower sensor cost.
In one possible implementation, the second determining module 302 is configured to determine the perceived performance of each sensor combination under each candidate deployment parameter combination according to the following steps:
for each candidate deployment parameter combination of each sensor combination, determining the perceived performance of the sensor combination under the candidate deployment parameter combination based on the perceived performance of each sensor included in the sensor combination under the respective candidate deployment parameters.
In one possible implementation, the second determining module 302 is configured to determine the perceived performance of each sensor included in a sensor combination under the respective candidate deployment parameters according to the following steps:
dividing the range that the driving device needs to perceive into a plurality of sensing regions according to a preset division size; the plurality of sensing regions includes at least one close-range sensing region and at least one far-range sensing region;
for each sensor included in the sensor combination, respectively performing information sensing on the at least one close-range sensing region and on the at least one far-range sensing region based on each candidate deployment parameter of the sensor, and determining each first sensing performance of the sensor for the at least one close-range sensing region and each second sensing performance of the sensor for the at least one far-range sensing region;
based on the first sensing performance and the second sensing performance, obtaining the sensing performance of each sensor included in the sensor combination under the respective candidate deployment parameters.
In a possible implementation manner, the second determining module 302 is configured to perform information sensing on at least one close-range sensing area based on each candidate deployment parameter of a sensor according to the following steps, and determine each first sensing performance of the sensor for at least one close-range sensing area:
determining a synthesized region corresponding to the at least one close-range sensing region, wherein the synthesized region is formed by combining the at least one close-range sensing region;
and performing information sensing on the synthesized region based on each candidate deployment parameter of the sensor, to obtain each first sensing performance of the sensor for the synthesized region.
In a possible implementation manner, in the case where the sensor is a radar device, the second determining module 302 is configured to perform information sensing on the synthesized region based on each candidate deployment parameter of a sensor according to the following steps, to obtain each first sensing performance of the sensor for the synthesized region:
determining, for a candidate deployment parameter of the radar device, a first number of close-range sensing regions in the synthesized region into which radio beams fall when the radar device, deployed according to the candidate deployment parameter, transmits radio beams toward the synthesized region;
and determining, according to the first number, a first perception performance of the radar device for the synthesized region under the candidate deployment parameter.
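The beam-coverage count for a radar device can be sketched as follows. This is an illustrative 2D approximation, not the claimed implementation: beams are assumed to be evenly spaced within the radar's field of view, each beam is sampled along its length, and the first number is the count of close-range grid cells (the synthesized region) crossed by at least one sample point.

```python
import math

def radar_first_performance(pos, yaw_deg, fov_deg, n_beams, max_range, cells, cell_size):
    """Count how many close-range cells (the synthesized region) are crossed
    by at least one radio beam; the count is the first perception performance.
    2D sketch: each beam is sampled along its length at cell_size/2 steps."""
    hit = set()
    for i in range(n_beams):
        # Evenly spaced beam azimuths across the field of view
        ang = math.radians(yaw_deg - fov_deg / 2 + fov_deg * i / max(n_beams - 1, 1))
        r = 0.0
        while r <= max_range:
            x = pos[0] + r * math.cos(ang)
            y = pos[1] + r * math.sin(ang)
            cell = (int(x // cell_size), int(y // cell_size))
            if cell in cells:
                hit.add(cell)
            r += cell_size / 2
    return len(hit)
```

A candidate deployment parameter (position, yaw, field of view) that lets more beams fall into distinct close-range cells yields a larger first number and thus a better first perception performance.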
In a possible implementation manner, in the case where the sensor is an image acquisition device, the second determining module 302 is configured to perform information sensing on the synthesized region based on each candidate deployment parameter of a sensor according to the following steps, to obtain each first sensing performance of the sensor for the synthesized region:
determining, for a candidate deployment parameter of the image acquisition device, a second number of close-range sensing regions of the synthesized region contained in the acquired image when the image acquisition device, deployed according to the candidate deployment parameter, acquires an image of the synthesized region;
and determining, according to the second number, a first perception performance of the image acquisition device for the synthesized region under the candidate deployment parameter.
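The second number for an image acquisition device can be approximated as below. This is a simplified 2D field-of-view test, not the claimed implementation: a close-range cell is assumed to appear in the acquired image when its center lies within the camera's horizontal field of view.

```python
import math

def camera_first_performance(pos, yaw_deg, hfov_deg, cells, cell_size):
    """Count how many close-range cells of the synthesized region would appear
    in the acquired image, approximated by checking whether each cell center
    lies inside the camera's horizontal field of view (2D sketch)."""
    count = 0
    for (cx, cy) in cells:
        # Cell center relative to the camera position
        x = (cx + 0.5) * cell_size - pos[0]
        y = (cy + 0.5) * cell_size - pos[1]
        ang = math.degrees(math.atan2(y, x)) - yaw_deg
        ang = (ang + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
        if abs(ang) <= hfov_deg / 2:
            count += 1
    return count
```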
In a possible implementation manner, the second determining module 302 is configured to perform information sensing on at least one remote sensing area based on each candidate deployment parameter of a sensor according to the following steps, and determine each second sensing performance of the sensor for at least one remote sensing area:
For a candidate deployment parameter of a sensor, for each remote sensing area in at least one remote sensing area, performing information sensing on target objects in the remote sensing area based on the candidate deployment parameter of the sensor, and determining the sensing performance of the sensor for the remote sensing area under the candidate deployment parameter;
and obtaining second perception performance of the sensor for at least one remote perception region under the candidate deployment parameters based on the perception performance of the sensor for each remote perception region under the candidate deployment parameters.
In a possible implementation manner, in the case where the sensor is a radar device, the second determining module 302 is configured to perform information sensing on the target object in a remote sensing region based on a candidate deployment parameter of the sensor according to the following steps, and determine the perception performance of the sensor for the remote sensing region under the candidate deployment parameter:
determining, for each remote sensing region of the at least one remote sensing region, the number of radio beams that hit the target object when the radar device, deployed according to the candidate deployment parameters, transmits radio beams toward the target object within the remote sensing region;
and determining, based on the number of radio beams hitting the target object, the perception performance of the sensor for the remote sensing region under the candidate deployment parameters.
In a possible implementation manner, in the case where the sensor is an image acquisition device, the second determining module 302 is configured to perform information sensing on the target object in a remote sensing region based on a candidate deployment parameter of the sensor according to the following steps, and determine the perception performance of the sensor for the remote sensing region under the candidate deployment parameter:
determining, for each remote sensing region of the at least one remote sensing region, duty-ratio information of the target object, that is, the proportion of the acquired image occupied by the target object, when the image acquisition device, deployed according to the candidate deployment parameters, acquires an image of the target object in the remote sensing region;
and determining, based on the duty-ratio information, the perception performance of the sensor for the remote sensing region under the candidate deployment parameters.
In a possible implementation manner, the second determining module 302 is configured to obtain, based on the perceived performance of the sensor for each remote sensing area under the candidate deployment parameter, a second perceived performance of the sensor for at least one remote sensing area under the candidate deployment parameter according to the following steps:
performing weighted summation based on the perception performance of the sensor for each remote sensing region under the candidate deployment parameters and the in-region sensing weight set for each remote sensing region, to obtain the second perception performance of the sensor for the at least one remote sensing region under the candidate deployment parameters.
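The weighted summation over remote sensing regions is straightforward; a minimal sketch follows. The region names and weights are illustrative assumptions; only the weighted-sum form comes from the text above.

```python
def second_performance(region_perf, region_weight):
    """Weighted summation of the sensor's per-region perception performance
    using the in-region sensing weight set for each far-range region."""
    return sum(region_perf[r] * region_weight[r] for r in region_perf)

# Example: two remote regions, the front region weighted more heavily
score = second_performance({"front": 0.8, "rear": 0.5}, {"front": 0.7, "rear": 0.3})
```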
In a possible implementation manner, the second determining module 302 is configured to determine at least one close-range sensing area and at least one far-range sensing area according to the following steps:
determining, for each sensing region of the plurality of sensing regions, the distance between the sensing region and the driving device;
determining that the sensing region belongs to the close-range sensing regions in the case that the determined distance is smaller than a preset threshold; or,
determining that the sensing region belongs to the far-range sensing regions in the case that the determined distance is greater than or equal to the preset threshold.
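The threshold-based classification above can be sketched as follows; the region centers, device position, and threshold value are illustrative assumptions.

```python
import math

def classify_regions(regions, device_pos, threshold):
    """Split sensing regions into close-range and far-range by comparing each
    region's distance to the driving device against a preset threshold."""
    near, far = [], []
    for center in regions:
        d = math.dist(center, device_pos)
        (near if d < threshold else far).append(center)
    return near, far
```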
In one possible implementation, in the case where different types of sensors are included in a sensor combination, the second determining module 302 is configured to determine the perception performance of the sensor combination under a candidate deployment parameter combination, based on the perception performance of each sensor in the sensor combination under that candidate deployment parameter combination, according to the following steps:
performing weighted summation based on the perception performance of each sensor in the sensor combination under the candidate deployment parameter combination and the type sensing weight set for each sensor, to obtain the perception performance of the sensor combination under the candidate deployment parameter combination.
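The type-weighted fusion across a mixed sensor combination can be sketched as below; the sensor types and weight values are illustrative assumptions, while the weighted-sum form is the one described above.

```python
def combination_performance(sensor_perf, type_weight):
    """Weighted summation over the sensors in a combination, using the type
    sensing weight set per sensor type (e.g. radar vs. image acquisition)."""
    return sum(perf * type_weight[stype] for stype, perf in sensor_perf)

# Example: one radar and one camera, equal type weights
score = combination_performance([("radar", 0.9), ("camera", 0.6)],
                                {"radar": 0.5, "camera": 0.5})
```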
In one possible implementation, the second determining module 302 is configured to determine the perceived performance of each sensor combination under each candidate deployment parameter combination according to the following steps:
for each sensor included in each sensor combination, determining a candidate deployment parameter corresponding to the sensor from each candidate deployment parameter combination of the sensor combination, and dividing the parameter value of each parameter in the candidate deployment parameters into a plurality of value intervals;
and determining the perception performance of the sensor combination under the candidate deployment parameter combination based on designated parameter values selected from the value intervals obtained by dividing the candidate deployment parameters corresponding to each sensor.
In one possible implementation, the second determining module 302 is configured to determine a candidate deployment parameter combination of the sensor combination with optimal perceived performance according to the following steps:
determining the value interval corresponding to the candidate deployment parameter of each sensor in the candidate deployment parameter combination that makes the perception performance optimal;
determining the perception performance of the sensor combination under the candidate deployment parameter combination again based on a plurality of parameter values within the determined value interval corresponding to the candidate deployment parameter of each sensor;
and determining the parameter value group with optimal perception performance as the candidate deployment parameter combination of the sensor combination with optimal perception performance.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
The embodiment of the disclosure further provides an electronic device. As shown in fig. 4, which is a schematic structural diagram of the electronic device provided by the embodiment of the disclosure, the electronic device includes: a processor 401, a memory 402, and a bus 403. The memory 402 stores machine-readable instructions executable by the processor 401 (e.g., execution instructions corresponding to the first determining module 301, the second determining module 302, and the deployment module 303 in the sensor deployment device of fig. 3). When the electronic device runs, the processor 401 communicates with the memory 402 via the bus 403, and the machine-readable instructions, when executed by the processor 401, perform the following process:
determining at least one sensor combination deployed on the driving device and a plurality of candidate deployment parameter combinations of each sensor combination according to main body parameters of the driving device, a range to be perceived, a set number of sensors, and the isolated perception performance of each sensor;
According to the perception performance of each sensor combination under each candidate deployment parameter combination, determining the candidate deployment parameter combination of the sensor combination with optimal perception performance;
and deploying the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
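The overall selection process executed above can be sketched end to end. This is a minimal illustration under stated assumptions: `evaluate` stands in for the full perception performance evaluation of a sensor combination under one candidate deployment parameter combination, and the enumeration-with-argmax structure is the only part taken from the text.

```python
def select_deployment(sensor_combinations, candidate_params, evaluate):
    """For each sensor combination, evaluate its perception performance under
    every candidate deployment parameter combination, then return the
    (combination, parameter combination) pair with optimal performance."""
    best = None
    for combo in sensor_combinations:
        for params in candidate_params[combo]:
            score = evaluate(combo, params)
            if best is None or score > best[0]:
                best = (score, combo, params)
    return best[1], best[2]
```

The pair returned by `select_deployment` is what the deployment step then applies to the driving device.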
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of sensor deployment described in the method embodiments above. Wherein the storage medium may be a volatile or nonvolatile computer readable storage medium.
Embodiments of the present disclosure further provide a computer program product carrying program code, where the instructions included in the program code may be used to perform the steps of the sensor deployment method described in the foregoing method embodiments; for details, reference may be made to the foregoing method embodiments, which are not repeated here.
Wherein the above-mentioned computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the foregoing method embodiments, and are not described here again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in essence or a part contributing to the prior art or a part of the technical solution, or in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (16)

1. A method of sensor deployment, the method comprising:
determining at least one sensor combination deployed on a driving device and a plurality of candidate deployment parameter combinations of each sensor combination according to main body parameters of the driving device, a range to be perceived, a set number of sensors, and isolated perception performance of each sensor;
dividing the range which the driving device needs to sense into a plurality of sensing regions according to a preset division size; the plurality of sensing regions includes at least one close-range sensing region and at least one far-range sensing region;
for each sensor included in the sensor combination, respectively performing information sensing on the at least one close-range sensing region and information sensing on the at least one far-range sensing region based on each candidate deployment parameter of the sensor, and determining each first sensing performance of the sensor for the at least one close-range sensing region and each second sensing performance of the sensor for the at least one far-range sensing region;
based on the first sensing performance and the second sensing performance, obtaining the sensing performance of each sensor included in the sensor combination under the candidate deployment parameters;
According to the perception performance of each sensor combination under each candidate deployment parameter combination, determining the candidate deployment parameter combination of the sensor combination with optimal perception performance;
and deploying the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
2. The method of claim 1, wherein determining the perceived performance of each sensor combination at each candidate deployment parameter combination comprises:
for each candidate deployment parameter combination of each sensor combination, determining the perceived performance of the sensor combination under the candidate deployment parameter combination based on the perceived performance of each sensor included in the sensor combination under the respective candidate deployment parameters.
3. The method of claim 1, wherein performing information sensing on the at least one close-range sensing region based on the respective candidate deployment parameters of a sensor, and determining respective first sensing performances of the sensor for the at least one close-range sensing region, comprises:
determining a synthesis region corresponding to the at least one close-range sensing region; wherein the synthesis region is formed by combining the at least one close-range sensing region;
And based on each candidate deployment parameter of the sensor, performing information sensing on the synthesized region to obtain each first sensing performance of the sensor for the synthesized region.
4. A method according to claim 3, wherein, in the case where the sensor is a radar device, the information sensing of the composite area based on each candidate deployment parameter of a sensor, to obtain each first sensing performance of the sensor for the composite area, includes:
determining, for a candidate deployment parameter of a radar device, a first number of short-range-aware areas in the composite area into which the radio beams fall if the radar device transmits radio beams to the composite area after deployment according to the candidate deployment parameter;
and determining a first perception performance of the radar device for the synthesized region under the candidate deployment parameters according to the first quantity.
5. A method according to claim 3, wherein, in the case where the sensor is an image capturing device, based on each candidate deployment parameter of the one sensor, performing information sensing on the synthesized region to obtain each first sensing performance of the sensor for the synthesized region, including:
Determining, for a candidate deployment parameter of an image acquisition device, a second number of close-range perceived areas in the synthesized area contained in the acquired image when the image acquisition device acquires the image of the synthesized area after deployment according to the candidate deployment parameter;
and determining a first perception performance of the image acquisition device for the synthesized region under the candidate deployment parameters according to the second quantity.
6. The method of claim 1, wherein performing information sensing on the at least one remote sensing region based on the respective candidate deployment parameters of a sensor, and determining respective second sensing performances of the sensor for the at least one remote sensing region, comprises:
for a candidate deployment parameter of a sensor, for each remote sensing area in the at least one remote sensing area, performing information sensing on target objects in the remote sensing area based on the candidate deployment parameter of the sensor, and determining sensing performance of the sensor for the remote sensing area under the candidate deployment parameter;
and obtaining a second perception performance of the sensor for the at least one remote perception region under the candidate deployment parameters based on the perception performance of the sensor for each remote perception region under the candidate deployment parameters.
7. The method of claim 6, wherein, in the case where the sensor is a radar device, performing information sensing on the target object in the remote sensing area based on a candidate deployment parameter of a sensor, determining sensing performance of the sensor for the remote sensing area under the candidate deployment parameter, comprises:
determining, for each of the at least one remote sensing area, a number of radio beams of the radar device that hit a target object within the remote sensing area upon transmitting the radio beams to the target object after deployment according to the candidate deployment parameters;
based on the number of radio beams striking the target object, a perceived performance of the sensor for the remote perceived area under the candidate deployment parameters is determined.
8. The method of claim 6, wherein, in the case where the sensor is an image capturing device, performing information sensing on the target object in the remote sensing area based on a candidate deployment parameter of a sensor, determining sensing performance of the sensor for the remote sensing area under the candidate deployment parameter, includes:
Determining, for each of the at least one remote sensing region, duty cycle information of a target object in an image acquired by the image acquisition device when the image acquisition device acquires the image of the target object in the remote sensing region after deployment according to the candidate deployment parameters;
and determining the perception performance of the sensor for the remote perception region under the candidate deployment parameters based on the duty ratio information.
9. The method of claim 6, wherein the deriving a second perceived performance of the sensor for the at least one remote perceived region at the candidate deployment parameter based on the perceived performance of the sensor for each remote perceived region at the candidate deployment parameter comprises:
based on the sensing performance of the sensor for each remote sensing area under the candidate deployment parameters and the in-area sensing weight set for each remote sensing area, the weighted summation obtains the second sensing performance of the sensor for the at least one remote sensing area under the candidate deployment parameters.
10. The method according to claim 1, wherein the at least one close-range sensing region and the at least one far-range sensing region are determined according to the following steps:
determining, for each of the plurality of sensing areas, a distance between the sensing area and the driving device;
under the condition that the determined distance is smaller than a preset threshold value, determining that the sensing area belongs to the short-distance sensing area; or,
and under the condition that the determined distance is greater than or equal to a preset threshold value, determining that the sensing area belongs to the long-distance sensing area.
11. The method of claim 1, wherein, in the case where different types of sensors are included in one sensor combination, determining the perceived performance of the sensor combination under a candidate deployment parameter combination based on the perceived performance of each sensor in the sensor combination under the candidate deployment parameter combination comprises:
based on the perceived performance of each sensor in the sensor combination under the candidate deployment parameter combination of the sensor combination and the type perceived weight set for each sensor, weighted summation results in the perceived performance of the sensor combination under the candidate deployment parameter combination.
12. The method of any of claims 1-11, wherein the perceived performance of each sensor combination at each candidate deployment parameter combination is determined as follows:
For each sensor included in each sensor combination, determining a candidate deployment parameter corresponding to the sensor from each candidate deployment parameter combination of the sensor combination, and dividing the parameter value of each parameter in the candidate deployment parameters into a plurality of value intervals;
and determining the perception performance of the sensor under the candidate deployment parameter combination based on the designated parameter values in the multiple value intervals after the candidate deployment parameters corresponding to each sensor are divided.
13. The method of claim 12, wherein the determining the candidate deployment parameter combinations for the sensor combination that is optimal in perceived performance comprises:
determining a value interval corresponding to the candidate deployment parameter of each sensor in the candidate deployment parameter combination of the sensor which enables the perception performance to be optimal;
determining the sensing performance of the sensor under the candidate deployment parameter combination again based on the determined values of a plurality of parameters in the value interval corresponding to the candidate deployment parameter of each sensor;
and determining one parameter value group with optimal perception performance as a candidate deployment parameter combination of the sensor with optimal perception performance.
14. An apparatus for sensor deployment, the apparatus comprising:
a first determining module, configured to determine at least one sensor combination to be deployed on the driving device and a plurality of candidate deployment parameter combinations of each sensor combination according to the main body parameters of the driving device, the range to be perceived, the set number of sensors, and the isolated perception performance of each sensor;
a second determining module, configured to divide the range to be perceived by the driving device into a plurality of perception areas according to a preset division size, where the plurality of perception areas includes at least one near-range perception area and at least one far-range perception area; for each sensor included in the sensor combination, determine, based on the respective candidate deployment parameters of the sensor, the first perception performance of the sensor for each near-range perception area and the second perception performance of the sensor for each far-range perception area; obtain, based on the first perception performance and the second perception performance, the perception performance of the sensor under the candidate deployment parameters; and further determine the candidate deployment parameter combination of the sensor combination with optimal perception performance according to the perception performance of each sensor combination under each candidate deployment parameter combination;
and a deployment module, configured to deploy the sensors on the driving device according to the determined candidate deployment parameter combination of the sensor combination.
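The near/far-area evaluation performed by the second determining module can be sketched as follows. The per-area scores and the aggregation (a simple average) are hypothetical placeholders: the claim only specifies that the range is divided by a preset size and that the first and second performances are combined into one per-sensor score.

```python
# Hedged sketch of combining near-range and far-range area performances
# into a single per-sensor score. The averaging is one possible aggregation,
# not the one mandated by the claims.

def sensor_performance(near_scores, far_scores):
    """near_scores: first perception performances over the near-range areas;
    far_scores: second perception performances over the far-range areas."""
    scores = list(near_scores) + list(far_scores)
    return sum(scores) / len(scores)

print(sensor_performance([0.9, 0.8], [0.5, 0.4]))  # average of the four area scores
```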
15. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is running, the machine-readable instructions, when executed by the processor, performing the steps of the method of sensor deployment according to any of claims 1 to 13.
16. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when executed by a processor, performs the steps of the method of sensor deployment according to any of claims 1 to 13.
CN202011506638.4A 2020-12-18 2020-12-18 Sensor deployment method and device, electronic equipment and storage medium Active CN112684450B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011506638.4A CN112684450B (en) 2020-12-18 2020-12-18 Sensor deployment method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112684450A CN112684450A (en) 2021-04-20
CN112684450B true CN112684450B (en) 2024-03-22

Family

ID=75449791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011506638.4A Active CN112684450B (en) 2020-12-18 2020-12-18 Sensor deployment method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112684450B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751823A (en) * 2019-10-25 2020-02-04 上海商汤临港智能科技有限公司 Monitoring method and device for automatic driving fleet
CN111177869A (en) * 2020-01-02 2020-05-19 北京百度网讯科技有限公司 Method, device and equipment for determining sensor layout scheme

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN110462543B (en) * 2018-03-08 2022-09-30 百度时代网络技术(北京)有限公司 Simulation-based method for evaluating perception requirements of autonomous vehicles

Similar Documents

Publication Publication Date Title
Ouaknine et al. Carrada dataset: Camera and automotive radar with range-angle-doppler annotations
CN102763420B (en) depth camera compatibility
CN110046640B (en) Distributed representation learning for correlating observations from multiple vehicles
CN102741887B (en) depth camera compatibility
EP2555166A1 (en) Space error parameter for 3D buildings and terrain
CN113111513B (en) Sensor configuration scheme determining method and device, computer equipment and storage medium
CN111739099B (en) Falling prevention method and device and electronic equipment
CN115147333A (en) Target detection method and device
CN112911249A (en) Target object tracking method and device, storage medium and electronic device
Matzner et al. ThermalTracker-3D: A thermal stereo vision system for quantifying bird and bat activity at offshore wind energy sites
Fernandes et al. Automatic early detection of wildfire smoke with visible light cameras using deep learning and visual explanation
CN112684450B (en) Sensor deployment method and device, electronic equipment and storage medium
Muckenhuber et al. Sensors for automated driving
CN107816990B (en) Positioning method and positioning device
CN116466307B (en) Millimeter wave Lei Dadian cloud simulation method and device based on depth map
CN116467848B (en) Millimeter wave radar point cloud simulation method and device
Braga et al. Estimation of UAV position using LiDAR images for autonomous navigation over the ocean
CN113895482B (en) Train speed measuring method and device based on trackside equipment
CN112578369B (en) Uncertainty estimation method and device, electronic equipment and storage medium
CN115471574A (en) External parameter determination method and device, storage medium and electronic device
CN112487984B (en) Point cloud data lightweight rapid generation method
Egi et al. Classified 3D mapping and deep learning-aided signal power estimation architecture for the deployment of wireless communication systems
CN111316119A (en) Radar simulation method and device
CN116577762B (en) Simulation radar data generation method, device, equipment and storage medium
CN113465614B (en) Unmanned aerial vehicle and generation method and device of navigation map thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant