CN114399788A - Object detection method and system - Google Patents


Info

Publication number
CN114399788A
CN114399788A (application CN202111600753.2A; granted as CN114399788B)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111600753.2A
Other languages
Chinese (zh)
Other versions
CN114399788B (en)
Inventor
张洪昌
黄宇涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University of Technology WUT
Original Assignee
Wuhan University of Technology WUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University of Technology WUT filed Critical Wuhan University of Technology WUT
Priority to CN202111600753.2A
Publication of CN114399788A
Application granted
Publication of CN114399788B
Legal status: Active

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The application provides an object detection method and system. The method includes: obtaining object information of an object to be detected as detected by a plurality of detection devices, and obtaining a plurality of kinds of prediction results; determining, based on the object information, a prediction probability matrix of the object to be detected corresponding to the prediction results; determining a weight vector of the detection devices and a weight vector of the object information; and selecting the detection result of the object to be detected from the prediction results based on the prediction probability matrix, the weight vector of the detection devices, and the weight vector of the object information. This can improve the accuracy of pedestrian detection to a certain extent.

Description

Object detection method and system
Technical Field
The application relates to the field of intelligent driving, in particular to an object detection method and system.
Background
With the development of science and technology, intelligent connected vehicle technology has become a focus of global technological innovation and, after new-energy vehicles, another high ground of the automobile industry; within it, pedestrian detection has long been a hot research problem.
At present, pedestrian detection mostly relies on traffic monitoring equipment for regional detection. This approach, however, suffers from low contrast, heavy noise, and low operational reliability, which make the resulting pedestrian detection inaccurate.
Disclosure of Invention
The application aims to provide an object detection method and device, which can improve the accuracy of pedestrian detection at least to a certain extent.
According to an aspect of an embodiment of the present application, there is provided an object detection method, including: acquiring object information of an object to be detected, which is detected by a plurality of detection devices, and acquiring various prediction results; determining a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results based on the object information; determining a weight vector of the detection device and determining a weight vector of the object information; and selecting the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection equipment and the weight vector of the object information.
According to an aspect of the embodiments of the present application, there is provided an object detection system including a plurality of detection apparatuses and a central processing unit, the central processing unit including: the acquisition module is configured to acquire object information of the object to be detected, which is detected by the detection devices, and acquire various prediction results; the matrix determination module is configured to determine a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results based on the object information; a vector determination module configured to determine a weight vector of the detection device and determine a weight vector of the object information; and the selection module is configured to select the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection device and the weight vector of the object information.
In an embodiment of the present application, based on the foregoing solution, the vector determination module is configured to: acquiring a plurality of preset equipment weights corresponding to each detection equipment; grouping the preset device weights to obtain a plurality of preset device weight groups; selecting a group median of a group in which the maximum frequency is located in the preset equipment weight groups as the weight of each detection equipment; combining the weights of a plurality of detection devices to obtain the weight vector of the detection device.
In an embodiment of the present application, based on the foregoing solution, the vector determination module is configured to: determining a maximum device weight and a minimum device weight of the plurality of preset device weights; determining a device weight group distance of the preset device weight based on the maximum device weight and the minimum device weight; and grouping the preset equipment weights based on the equipment weight group distance.
In an embodiment of the present application, based on the foregoing scheme, there are a plurality of types of object information of the object to be detected, and the vector determination module is configured to: acquiring a plurality of preset information weights corresponding to various object information; grouping the preset information weights to obtain a plurality of preset information weight groups; selecting a group median of a group in which the maximum frequency is located in a plurality of preset information weight groups as the weight of various object information; and combining the weights of the object information corresponding to the same detection equipment to obtain a weight vector of the object information.
In an embodiment of the present application, based on the foregoing solution, the vector determination module is configured to: determining a maximum information weight and a minimum information weight in the plurality of preset information weights; determining an information weight group distance of the preset information weight based on the maximum information weight and the minimum information weight; and grouping the preset information weight based on the information weight group distance.
In an embodiment of the present application, based on the foregoing solution, the selecting module is configured to: performing synthetic operation on the probability matrix, the weight vector of the detection device and the weight vector of the object information to obtain a synthetic operation result; carrying out normalization processing on the synthetic operation result to obtain the probability of matching the object to be detected with the multiple prediction results; and taking the prediction result corresponding to the maximum probability in the probabilities corresponding to the plurality of prediction results as the detection result.
In an embodiment of the present application, based on the foregoing scheme, there are a plurality of types of object information of the object to be detected, and the matrix module is configured to: respectively determining the prediction probability of the object to be detected matched with the multiple prediction results based on various object information; combining the prediction probabilities corresponding to the same kind of object information to obtain a prediction probability vector; and combining the prediction probability vectors corresponding to the object information to obtain the prediction probability matrix.
In an embodiment of the present application, based on the foregoing solution, the matrix module is configured to: acquiring a membership function corresponding to the object information; and substituting the object information into a membership function corresponding to the object information type to obtain a function result, wherein the membership function accords with Gaussian distribution, and the prediction probability is determined based on the function result.
In an embodiment of the application, based on the foregoing, before determining the prediction probability, the matrix module is further configured to: acquiring environment information of the environment where the object to be detected is located; and correcting the membership function corresponding to the object information by using the environment information, and substituting the object information into the corrected membership function to obtain a function result.
According to an aspect of embodiments of the present application, there is provided a computer program medium storing computer program instructions which, when executed by a computer, cause the computer to perform the method of any one of the above.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: a processor; a memory having computer readable instructions stored thereon which, when executed by the processor, implement the method of any of the above.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
in the technical solutions provided in some embodiments of the present application, object information of an object to be detected, which is detected by a plurality of detection devices, is obtained, and a plurality of prediction results are obtained; determining a prediction probability matrix of the object to be detected corresponding to various prediction results based on the object information, so as to fuse the object information of the object to be detected, which is detected by a plurality of detection devices; determining a weight vector of the detection equipment and determining a weight vector of the object information; and selecting the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection equipment and the weight vector of the object information, so that the obtained detection result is more accurate.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 shows a schematic diagram of an exemplary system architecture to which aspects of embodiments of the present application may be applied;
FIG. 2 schematically shows a flow diagram of an object detection method according to an embodiment of the present application;
FIG. 3 schematically shows a flow diagram of an object detection method according to an embodiment of the present application;
FIG. 4 schematically illustrates a flow diagram of a design object detection model according to one embodiment of the present application;
FIG. 5 schematically shows a schematic diagram of a membership function image according to an embodiment of the present application;
FIG. 6 schematically shows a block diagram of an object detection apparatus according to an embodiment of the present application;
FIG. 7 is a hardware diagram illustrating an electronic device according to an example embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the application.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a schematic diagram of an exemplary system architecture 100 to which the technical solutions of the embodiments of the present application can be applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 101 (which may be one or more of a smartphone, a tablet, a laptop, a desktop computer, or a detection device), a network 102, and a server 103. The detection devices may include sensors such as thermal imaging sensors and lidar, and the network 102 provides the medium for a communication link between the terminal device 101 and the server 103. The network 102 may include various connection types, such as wired and wireless communication links.
It should be understood that the number of terminal devices 101, networks 102, and servers 103 in fig. 1 is merely illustrative. There may be any number of terminal devices 101, networks 102, and servers 103, as desired for implementation. For example, the server 103 may be a server cluster composed of a plurality of servers.
In an embodiment of the present application, the server 103 may obtain object information of an object to be detected, which is detected by a plurality of detection devices, and obtain a plurality of prediction results; determining a prediction probability matrix of the object to be detected corresponding to various prediction results based on the object information, so as to fuse the object information of the object to be detected, which is detected by a plurality of detection devices; determining a weight vector of the detection equipment and determining a weight vector of the object information; and selecting the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection equipment and the weight vector of the object information, so that the obtained detection result is more accurate.
It should be noted that the object detection method provided in the embodiment of the present application is generally executed by the server 103, and accordingly, the object detection apparatus is generally disposed in the server 103. However, in other embodiments of the present application, the terminal device 101 may also have a similar function to the server 103, so as to execute the object detection method provided in the embodiments of the present application.
The implementation details of the technical solution of the embodiment of the present application are set forth in detail below:
fig. 2 schematically shows a flowchart of an object detection method according to an embodiment of the present application, where an execution subject of the object detection method may be a server, such as the server 103 shown in fig. 1.
Referring to fig. 2, the object detection method at least includes steps S210 to S240, which are described in detail as follows:
in step S210, object information of the object to be detected by the plurality of detection devices is acquired, and a plurality of kinds of prediction results are acquired.
In one embodiment of the present application, the object to be detected may be an object encountered by the automobile during driving, such as another vehicle, a building, a plant, an animal, or a pedestrian, and the plurality of prediction results may include: the detected object is human, the detected object is non-human, the object is unrecognizable, and so on.
In an embodiment of the present application, there may be only one object to be detected, and there may be one or more object information of the object to be detected, the number of each object information may be one or more, and the type and the number of the object information may be determined according to the type and the number of the detection device, for example, when the detection device includes a thermal imaging sensor, the object information includes object temperature and shape aspect ratio information, and the like; when the detection device includes a laser radar, the object information includes the moving speed and height information of the object. In other embodiments of the present application, there may be a plurality of objects to be detected.
In step S220, a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results is determined based on the object information.
In an embodiment of the application, when the object information of the object to be detected is multiple, the prediction probability of the object to be detected matching with multiple prediction results can be determined based on various object information respectively; and determining a prediction probability matrix based on the prediction probability of the detection object matched with the plurality of prediction results.
In an embodiment of the application, when the object information of the object to be detected is multiple, the prediction probability of the object to be detected matching with multiple prediction results can be determined based on various object information respectively; combining the prediction probabilities corresponding to the same kind of object information to obtain a prediction probability vector; and combining the prediction probability vectors corresponding to the various object information to obtain a prediction probability matrix.
In an embodiment of the present application, a membership function corresponding to an object information type may be obtained; and substituting the object information into a membership function corresponding to the object information type to obtain a function result, and determining the prediction probability based on the function result, wherein the membership function accords with Gaussian distribution.
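The Gaussian membership function described above can be sketched as follows; the parameters (a pedestrian speed centred at 1.4 m/s with spread 1.0 m/s) are hypothetical values for illustration, not taken from the patent:

```python
import math

def gaussian_membership(x, mean, sigma):
    # Gaussian membership degree in [0, 1]; equals 1 exactly at the mean.
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

# Hypothetical parameters: pedestrian movement speed centred at 1.4 m/s.
at_mean = gaussian_membership(1.4, mean=1.4, sigma=1.0)  # 1.0
far_off = gaussian_membership(9.0, mean=1.4, sigma=1.0)  # close to 0
```

A reading near the mean yields a function result close to 1, while an implausible reading yields a result close to 0, which is what lets the function result be compared against a set range in the steps below.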
In an embodiment of the present application, if the same detection device provides a plurality of readings of the same kind of object information, the prediction result matching each reading may be determined from its function result, and the fraction of readings matched to each prediction result may be used as the prediction probability that the object to be detected matches that prediction result based on this kind of object information. For example, if multiple movement speeds of the object to be detected are available, the fraction of speed readings matching the prediction "human" out of the total number of speed readings may be used as the speed-based prediction probability that the object is a human.
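The fraction-based prediction probability above can be sketched as follows; the speed readings and the "human" speed range used here are assumptions for illustration only:

```python
def prediction_probability(readings, matches):
    # Fraction of readings matched to a given prediction result.
    return sum(1 for r in readings if matches(r)) / len(readings)

# Hypothetical repeated speed readings (m/s) for one object; the
# assumed "human" range is 0-2.5 m/s.
speeds = [1.2, 1.5, 1.3, 6.8]
p_human = prediction_probability(speeds, lambda v: 0.0 <= v <= 2.5)  # 0.75
```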
In an embodiment of the present application, the function result may be compared with a set range to obtain a prediction result matching each object information, for example, if the function result is within the set range, the prediction result may be determined as a human.
In an embodiment of the application, before determining the prediction probability, environment information of an environment where an object to be detected is located may also be obtained; and correcting the membership function corresponding to the object information by using the environment information, and substituting the object information into the corrected membership function to obtain a function result, so that the environment information influencing the operation of the detection equipment can be considered, the obtained prediction probability is more accurate, and the obtained detection result is more accurate.
In an embodiment of the present application, after obtaining the membership function corresponding to the object information, environmental data may also be collected; and modifying the membership function corresponding to the object information by using the collected environment data, substituting the object information into the modified membership function to obtain a function result, so that the environment information influencing the operation of the detection equipment can be considered, the obtained prediction probability is more accurate, and the obtained detection result is more accurate.
With continued reference to fig. 2, in step S230, a weight vector of the detection device is determined, and a weight vector of the object information is determined.
In an embodiment of the present application, weights occupied by each detection device in a plurality of detection devices may be obtained, and the weights occupied by each detection device in the plurality of detection devices may be combined to obtain a weight vector of the detection device.
In an embodiment of the present application, a plurality of preset device weights corresponding to each detection device may be obtained; the preset device weights are grouped to obtain a plurality of preset device weight groups; the group median of the group with the maximum frequency among the preset device weight groups is selected as the weight of each detection device; and the weights of the plurality of detection devices are combined to obtain the weight vector of the detection devices, where the group median is the midpoint of the group interval.
In one embodiment of the present application, the preset device weights may be set empirically, for example by experts.
In one embodiment of the present application, a maximum device weight and a minimum device weight of a plurality of preset device weights may be determined; determining a device weight group distance of preset device weights based on the maximum device weight and the minimum device weight; and grouping the preset device weights based on the device weight group distance.
In an embodiment of the present application, if the maximum device weight is denoted M and the minimum device weight is denoted m, a suitable positive integer Q may be selected and the group distance calculated as (M − m)/Q; the weights are then grouped from small to large, and the frequency count and frequency of each weight group are calculated, so as to determine the group with the maximum frequency among the plurality of preset device weight groups.
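The grouping procedure above can be sketched in Python. The patent does not detail its "ternary frequency statistical method", so the function name, the preset weights, and the choice of Q below are illustrative assumptions; the group median is taken as the bin midpoint, as described:

```python
from collections import Counter

def weight_by_frequency(preset_weights, q):
    # Group the weights into q bins of width (M - m) / q and return the
    # midpoint ("group median") of the bin with the maximum frequency.
    m, big_m = min(preset_weights), max(preset_weights)
    width = (big_m - m) / q
    bins = Counter()
    for w in preset_weights:
        idx = min(int((w - m) / width), q - 1)  # clamp M into the last bin
        bins[idx] += 1
    top_bin, _ = bins.most_common(1)[0]
    return m + (top_bin + 0.5) * width

# Hypothetical expert-assigned preset weights for one detection device.
weights = [0.30, 0.32, 0.35, 0.50, 0.33, 0.31]
device_weight = weight_by_frequency(weights, q=4)  # 0.325
```

Here four of the six preset weights fall in the first bin [0.30, 0.35), so its midpoint is selected as the device weight.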
In an embodiment of the present application, weights of various object information corresponding to the same detection device in all kinds of object information corresponding to the detection device may be obtained, and the weights of the various object information corresponding to the same detection device in all kinds of object information corresponding to the detection device may be combined to obtain a weight vector of the object information.
In an embodiment of the present application, a plurality of preset information weights corresponding to various object information may be obtained; grouping the preset information weights to obtain a plurality of preset information weight groups; selecting a group median of a group in which the maximum frequency is located in a plurality of preset information weight groups as the weight of various object information; and combining the weights of the object information corresponding to the same detection equipment to obtain a weight vector of the object information.
In one embodiment of the present application, the preset information weights may be set empirically, for example by experts.
In one embodiment of the present application, a maximum information weight and a minimum information weight of a plurality of preset information weights may be determined; determining an information weight group distance of preset information weight based on the maximum information weight and the minimum information weight; and grouping the preset information weights based on the information weight group distance, wherein the calculation mode of the group distance can refer to the calculation mode of the group distance in the preset equipment weight group.
In an embodiment of the present application, a weight vector of object information may be used as a weight of a detection device corresponding to the object information, and a weight vector of a detection device may be obtained by combining weight vectors of a plurality of types of object information.
With continued reference to fig. 2, in step S240, a detection result of the object to be detected is selected from the plurality of kinds of prediction results based on the prediction probability matrix, the weight vector of the detection device, and the weight vector of the object information.
In an embodiment of the present application, the probability matrix, the weight vector of the detection device, and the weight vector of the object information may be subjected to a synthesis operation to obtain a synthesis operation result; and selecting a detection result of the object to be detected from the plurality of prediction results based on a synthesis operation result, wherein the synthesis operation can be a fuzzy synthesis operation.
In one embodiment of the present application, the probability that the object to be detected matches with multiple prediction results may be determined based on the result of the synthesis operation; and taking the prediction result corresponding to the maximum probability in the probabilities corresponding to the plurality of types of prediction results as the detection result.
In an embodiment of the present application, the result of the synthesis operation may be normalized to obtain the probability that the object to be detected matches with multiple prediction results.
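The synthesis, normalization, and selection steps above can be sketched as follows. The patent does not spell out the exact fuzzy synthesis operator, so this sketch uses a plain weighted-sum composition, and all numbers are hypothetical:

```python
import numpy as np

def fuzzy_evaluate(P_per_device, info_weights, device_weights):
    # Weight each device's prediction probability matrix by its object
    # information weight vector, combine the device-level scores with
    # the device weight vector, normalize, and pick the most probable
    # prediction result.
    device_scores = [a @ P for a, P in zip(info_weights, P_per_device)]
    b = device_weights @ np.vstack(device_scores)
    b = b / b.sum()  # normalization step
    return b, int(np.argmax(b))

# Hypothetical numbers: device 1 = lidar (speed, height rows),
# device 2 = thermal sensor (temperature, aspect-ratio rows);
# columns = {human, non-human, unrecognizable}.
P1 = np.array([[0.7, 0.2, 0.1], [0.6, 0.3, 0.1]])
P2 = np.array([[0.8, 0.1, 0.1], [0.5, 0.4, 0.1]])
probs, best = fuzzy_evaluate(
    [P1, P2],
    [np.array([0.5, 0.5]), np.array([0.6, 0.4])],
    np.array([0.5, 0.5]),
)
```

With these numbers the prediction "human" (index 0) receives the maximum normalized probability and is returned as the detection result.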
In the embodiment shown in fig. 2, object information of an object to be detected by a plurality of detection devices is obtained, and a plurality of prediction results are obtained; determining a prediction probability matrix of the object to be detected corresponding to various prediction results based on the object information, so as to fuse the object information of the object to be detected, which is detected by a plurality of detection devices; determining a weight vector of the detection equipment and determining a weight vector of the object information; and selecting the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection equipment and the weight vector of the object information, so that the obtained detection result is more accurate.
In an embodiment of the present application, an object detection system is provided, and for details not disclosed in the embodiment of the system of the present application, please refer to the embodiment of the object detection method described above in the present application.
In one embodiment of the present application, the system includes a plurality of detection units, an interface unit, a central processing unit, a power supply, and an acousto-optic warning unit, wherein the plurality of detection units may include a thermal imaging sensor and a laser radar, and the central processing unit may include: the acquisition module is configured to acquire object information of the object to be detected, which is detected by the detection devices, and acquire various prediction results; the matrix determination module is configured to determine a prediction probability matrix of the object to be detected corresponding to various prediction results based on the object information; the vector determination module is configured to determine a weight vector of the detection device and determine a weight vector of the object information; and the selection module is configured to select the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection device and the weight vector of the object information.
In one embodiment of the application, the thermal imaging sensor may be connected to the interface unit through a USB cable, the lidar to the interface unit through a network cable, the interface unit to the central processing unit through an internal bus, the central processing unit to the acousto-optic warning unit, and the power supply to all the modules. The thermal imaging sensor transmits thermal imaging picture data to the interface unit, the lidar transmits lidar point-cloud data to the interface unit, the interface unit filters the data and transmits it to the central processing unit, and the central processing unit controls the acousto-optic warning unit by transmitting level signals. The thermal imaging sensor is used to detect temperature and shape aspect-ratio information, the lidar to detect movement speed and height information, the central processing unit to run all the algorithms, and the acousto-optic warning unit to alert monitoring personnel.
In one embodiment of the present application, the operation of the object detection system may comprise: obtaining two kinds of object information, movement speed and height, of the object through the lidar, and two further kinds, temperature and shape aspect ratio, through the thermal imaging sensor; a fuzzy comprehensive evaluation algorithm with variable membership functions is then applied to fuse the multi-source object information from the detection devices into a comprehensive judgment system that determines whether the detected object is a pedestrian.
In an embodiment of the present application, as shown in fig. 3, which schematically shows a flowchart of an object detection method according to an embodiment of the present application, an object detection model may be designed according to the correlation and combination of the pedestrian speed (i.e., moving speed), height information, body temperature (i.e., temperature), and body aspect ratio (i.e., shape aspect ratio) information acquired by the thermal imaging sensor and the laser radar; a Gaussian function that can accurately describe the distribution of pedestrian speed, height, body temperature, and body aspect ratio information is selected as the membership function, and the membership function is modified according to the main environmental variation rules affecting sensor operation, forming a membership function with a variable mean; a judgment matrix is constructed through a multi-source information data fusion processing algorithm with variable membership functions; the weight of each sensor is determined by a ternary frequency statistical method, and a comprehensive evaluation is then performed to realize detection of the object.
In an embodiment of the present application, the multiple types of object information collected by the laser radar X and the thermal imaging sensor Y may be detected, correlated, combined, and the like, and an object detection model M is designed according to the processed speed V, height H, temperature T, and shape aspect ratio Z. As shown in fig. 4, which schematically shows a flowchart for designing an object detection model according to an embodiment of the present application, this may improve the accuracy of object detection.
In one embodiment of the present application, the inventors found that: pedestrian speed is approximately distributed in the range of 0-10 m/s, body temperature is distributed around 36 °C, height is 1-2.5 m, and the shape aspect ratio is about 4. According to naive Bayes theory, a detected object that matches these given characteristics is considered to belong to the category "human" (because of identification error, the fuzzy judgment "unrecognizable" is also provided herein). According to fuzzy theory, the membership functions of the pedestrian speed V, body temperature T, height H, and aspect ratio Z can be determined by an intuition method, and the four kinds of object information are uniformly described by Gaussian functions. In actual use, the thermal imaging sensor and the laser radar are easily disturbed by the environment, which may affect the detection result; the application therefore provides a correction function f(Δ) related to the variation rule of the main environmental factors affecting sensor operation, and corrects the mean value of each membership function (the corresponding correction function is added directly to the mean of the function before correction, i.e., the original membership function is translated by f(Δ) units to the left), where Δ is a main environmental factor affecting sensor operation, such as temperature or wind speed. Let the corrected mean values be the speed v', body temperature t', height h', and aspect ratio z', respectively. As shown in fig. 5, which schematically shows membership function images according to an embodiment of the present application, y in fig. 5 represents the membership degree and y ∈ [0, 1]; α1, α3, α5, α7 are the selected upper bounds, α2, α4, α6, α8 are the selected lower bounds, and the abscissa is the corrected parameter.
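The Gaussian membership function with an environment-corrected mean described above can be sketched as follows; the specific σ value and the correction amount f(Δ) used in the example are hypothetical placeholders, since the text does not fix them:

```python
import math

def gaussian_membership(x, mean, sigma):
    """Membership degree y in [0, 1] for a Gaussian-shaped membership function."""
    return math.exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def corrected_membership(x, mean, sigma, f_delta):
    """Shift the mean by the environment correction f(delta) before evaluating,
    which translates the whole membership curve as described in the text."""
    return gaussian_membership(x, mean + f_delta, sigma)

# Hypothetical example: speed membership with mean 1.4 m/s and sigma 0.5
y = gaussian_membership(1.4, 1.4, 0.5)   # exactly at the mean -> 1.0
```

A sample exactly at the (corrected) mean always yields membership 1, and membership decays smoothly toward 0 away from the mean.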
In an embodiment of the present application, a prediction result set V = {the recognition result is non-human, unrecognizable, the recognition result is human} is defined. The object information collected for the four indicators is substituted into the corresponding membership function shown in fig. 2 (for example, the height information collected by the laser radar is substituted into the membership function for height determined in step 2, and after calculation a y value corresponding to that object information is obtained). The y value obtained from each piece of object information is then compared with the upper bound α_upper and lower bound α_lower selected for the corresponding membership function: if y ∈ (0, α_lower], the judgment for that object information is "the recognition result is non-human"; if y ∈ (α_lower, α_upper], the judgment is "unrecognizable"; and if y ∈ (α_upper, 1], the judgment is "the recognition result is human". Each piece of object information from each sensor is substituted into its membership function; after P calculations (P depends on the amount of object information collected by each sensor), the percentages of "the recognition result is non-human", "unrecognizable", and "the recognition result is human" among the pieces of object information collected by each sensor are counted and assembled into the 2 × 3 matrices X and Y, that is:
X = | x11 x12 x13 |
    | x21 x22 x23 |

Y = | y11 y12 y13 |
    | y21 y22 y23 |
wherein X and Y respectively represent the prediction probability matrices of the laser radar and the thermal imaging sensor corresponding to the prediction result set V; x11 indicates the percentage of the speed data collected by the laser radar corresponding to "the recognition result is non-human", x12 the percentage corresponding to "unrecognizable", and x13 the percentage corresponding to "the recognition result is human"; x2j represents the percentages of the height data collected by the laser radar corresponding to the three prediction results; y1j represents the percentages of the temperature data collected by the thermal imaging sensor corresponding to the three prediction results; and y2j represents the percentages of the shape aspect ratio data collected by the thermal imaging sensor corresponding to the three prediction results. The remaining symbols have similar meanings and are not repeated herein.
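The thresholding and counting step can be sketched as below, assuming the interval assignment (0, α_lower] → non-human, (α_lower, α_upper] → unrecognizable, (α_upper, 1] → human, which is one consistent reading of the text; function and variable names are illustrative:

```python
def classify(y, alpha_lower, alpha_upper):
    """Map a membership degree y to one of the three predictions in V."""
    if y <= alpha_lower:
        return "non-human"
    if y <= alpha_upper:
        return "unrecognizable"
    return "human"

def prediction_row(samples, membership, alpha_lower, alpha_upper):
    """Fractions of the P collected samples falling into each prediction,
    forming one row of the 2 x 3 matrix X or Y."""
    labels = ["non-human", "unrecognizable", "human"]
    counts = dict.fromkeys(labels, 0)
    for s in samples:
        counts[classify(membership(s), alpha_lower, alpha_upper)] += 1
    p = len(samples)
    return [counts[k] / p for k in labels]
```

Each row returned this way sums to 1, so stacking the two rows per sensor yields the 2 × 3 matrices X and Y directly.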
In an embodiment of the present application, since the accuracy of each detection device and the state of the measured parameters differ, a ternary frequency statistical method is proposed herein to determine the weight of each detection device. Based on the laser radar and thermal imaging sensor provided in the above embodiment, expert experience is used to provide i groups of weight assignments, and, according to the given weight assignments, weight statistics are performed for each sensor in the following steps:
1) find the maximum value M and the minimum value m among the given i groups of weights for a certain sensor;
2) select a suitable positive integer Q, calculate the group width as (M − m)/Q, and group the weights from small to large;
3) calculate the count and relative frequency of each group of weights;
4) select the group median of the group with the maximum frequency as the weight xi of that sensor, thereby obtaining a weight vector N = [x1, x2, …, xn]; then normalize the obtained weight vector:
xi′ = xi / (x1 + x2 + … + xn), i = 1, 2, …, n
5) the above operation is repeated for the other sensors.
The weight vectors determined by the above method are N = [n1 n2], N1 = [n11 n12], and N2 = [n21 n22], where N represents the weight vector of the two sensors X and Y, N1 represents the weight vector of the speed and height indicators of sensor X, and N2 represents the weight vector of the temperature and shape aspect ratio indicators of sensor Y.
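A minimal sketch of steps 1)–5) of the ternary frequency statistical method; the expert weight values in the example and the choice to clamp the maximum into the last group are illustrative assumptions:

```python
def ternary_frequency_weight(expert_weights, q):
    """Steps 1)-4): find M and m, split [m, M] into Q groups of width
    (M - m)/Q, and return the midpoint (group median) of the most
    frequent group as the sensor's weight."""
    m, big_m = min(expert_weights), max(expert_weights)
    width = (big_m - m) / q
    counts = [0] * q
    for w in expert_weights:
        idx = min(int((w - m) / width), q - 1)  # clamp M itself into the last group
        counts[idx] += 1
    best = counts.index(max(counts))
    return m + (best + 0.5) * width

def normalize(weights):
    """Normalize the weight vector N = [x1, ..., xn] so its entries sum to 1."""
    total = sum(weights)
    return [w / total for w in weights]
```

For example, with hypothetical expert weights [0.40, 0.41, 0.42, 0.45, 0.60] and Q = 4, the most frequent group starts at 0.40 with width 0.05, so its midpoint 0.425 becomes that sensor's weight; step 5) simply repeats this per sensor.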
In one embodiment of the present application, the comprehensive evaluation step may include: performing a synthesis operation on the probability matrix, the weight vector of the detection devices, and the weight vector of the object information, that is, calculating by the following formula:
A = N ∘ [ N1 ∘ X
          N2 ∘ Y ]

wherein ∘ denotes the synthesis operation. Similar to matrix multiplication, it represents a composite relationship; for example, for a weight vector W = [w1 w2] and a 2 × 3 matrix R = [r11 r12 r13; r21 r22 r23]:

W ∘ R = [w1·r11 + w2·r21, w1·r12 + w2·r22, w1·r13 + w2·r23]

A = [a1 a2 a3] is then normalized:

aj′ = aj / (a1 + a2 + a3), j = 1, 2, 3
the results show that there is P1% probability judgment of recognition result is non-human, with P2% probability is not identifiable, and P3% probability of identifying the object as human, and taking PmaxAnd the corresponding judgment result of% is the final judgment result, and if the result is unidentifiable, manual investigation is carried out.
The application provides an object detection method that identifies pedestrians based on a thermal imaging sensor and a laser radar. It has good fault tolerance, can eliminate the influence of erroneous data on the correct result during fusion, and can reduce data redundancy; it can analyze and process multi-source comprehensive information, including detecting, correlating, and combining various kinds of object information. The application also provides a method for determining the membership functions of the movement speed V, temperature T, height H, and shape aspect ratio Z according to Bayes theory, and a membership function with a variable mean, which reduces the influence of detection errors caused by environmental changes on the detection result and makes the obtained fuzzy input signal more accurate. The application further provides a method for constructing the judgment matrix using a multi-object-information fusion processing algorithm with variable membership functions.
Embodiments of the apparatus of the present application are described below, which may be used to perform the object detection methods of the above-described embodiments of the present application. For details that are not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the object detection method described above in the present application.
Fig. 6 schematically shows a block diagram of an object detection apparatus according to an embodiment of the present application.
Referring to fig. 6, an object detection apparatus 600 according to an embodiment of the present application includes an obtaining module 601, a matrix determining module 602, a vector determining module 603, and a selecting module 604.
In some embodiments of the present application, based on the foregoing scheme, the obtaining module 601 is configured to obtain object information of an object to be detected, which is detected by a plurality of detecting devices, and obtain a plurality of prediction results; the matrix determination module 602 is configured to determine a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results based on the object information; the vector determination module 603 is configured to determine a weight vector of the detection device and determine a weight vector of the object information; the selecting module 604 is configured to select a detection result of the object to be detected from the plurality of prediction results based on the prediction probability matrix, the weight vector of the detection device, and the weight vector of the object information.
In one embodiment of the present application, based on the foregoing scheme, the vector determination module 603 is configured to: acquiring a plurality of preset equipment weights corresponding to each detection equipment; grouping the multiple preset equipment weights to obtain multiple preset equipment weight groups; selecting a group median of a group where the maximum frequency is located in a plurality of preset equipment weight groups as the weight of each detection equipment; and combining the weights of the plurality of detection devices to obtain a weight vector of the detection device.
In one embodiment of the present application, based on the foregoing scheme, the vector determination module 603 is configured to: determining a maximum device weight and a minimum device weight in a plurality of preset device weights; determining a device weight group distance of preset device weights based on the maximum device weight and the minimum device weight; and grouping the preset device weights based on the device weight group distance.
In an embodiment of the present application, based on the foregoing scheme, there are multiple types of object information of the object to be detected, and the vector determination module 603 is configured to: acquiring a plurality of preset information weights corresponding to various object information; grouping the preset information weights to obtain a plurality of preset information weight groups; selecting a group median of a group in which the maximum frequency is located in a plurality of preset information weight groups as the weight of various object information; and combining the weights of the object information corresponding to the same detection equipment to obtain a weight vector of the object information.
In one embodiment of the present application, based on the foregoing scheme, the vector determination module 603 is configured to: determining a maximum information weight and a minimum information weight in a plurality of preset information weights; determining an information weight group distance of preset information weight based on the maximum information weight and the minimum information weight; and grouping the preset information weights based on the information weight grouping distance.
In an embodiment of the present application, based on the foregoing solution, the selecting module 604 is configured to: performing synthetic operation on the probability matrix, the weight vector of the detection equipment and the weight vector of the object information to obtain a synthetic operation result; carrying out normalization processing on the synthetic operation result to obtain the probability that the object to be detected is matched with various prediction results; and taking the prediction result corresponding to the maximum probability in the probabilities corresponding to the plurality of types of prediction results as the detection result.
In an embodiment of the present application, based on the foregoing scheme, there are multiple types of object information of the object to be detected, and the matrix module 602 is configured to: respectively determining the prediction probability of the object to be detected matched with various prediction results based on various object information; combining the prediction probabilities corresponding to the same kind of object information to obtain a prediction probability vector; and combining the prediction probability vectors corresponding to the plurality of object information to obtain a prediction probability matrix.
In an embodiment of the present application, based on the foregoing solution, the matrix module 602 is configured to: acquiring a membership function corresponding to the object information; and substituting the object information into a membership function corresponding to the object information type to obtain a function result, wherein the membership function accords with Gaussian distribution, and determining the prediction probability based on the function result.
In an embodiment of the present application, based on the foregoing scheme, before determining the prediction probability, the matrix module 602 is further configured to: acquiring environment information of an environment where an object to be detected is located; and correcting the membership function corresponding to the object information by using the environment information, and substituting the object information into the corrected membership function to obtain a function result.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method, or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 70 according to this embodiment of the present application is described below with reference to fig. 7. The electronic device 70 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 70 is embodied in the form of a general purpose computing device. The components of the electronic device 70 may include, but are not limited to: at least one processing unit 71, at least one storage unit 72, a bus 73 connecting different system components (including the storage unit 72 and the processing unit 71), and a display unit 74.
Wherein the storage unit stores program code executable by the processing unit 71 to cause the processing unit 71 to perform the steps according to various exemplary embodiments of the present application described in the section "example methods" above in this specification.
The storage unit 72 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)721 and/or a cache memory unit 722, and may further include a read only memory unit (ROM) 723.
The memory unit 72 may also include a program/utility 724 having a set (at least one) of program modules 725, such program modules 725 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 73 can be any one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 70 may also communicate with one or more external devices (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 70, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 70 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 75. Also, the electronic device 70 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 76. As shown, the network adapter 76 communicates with the other modules of the electronic device 70 via the bus 73. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 70, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiments of the present application.
There is also provided, in accordance with an embodiment of the present application, a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the present application may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the present application described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
In one embodiment of the present application, a program product for implementing the above method is provided, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present application, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. An object detection method, comprising:
acquiring object information of an object to be detected, which is detected by a plurality of detection devices, and acquiring various prediction results;
determining a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results based on the object information;
determining a weight vector of the detection device and determining a weight vector of the object information;
and selecting the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection equipment and the weight vector of the object information.
2. The object detection method of claim 1, wherein the determining the weight vector for the detection device comprises:
acquiring a plurality of preset equipment weights corresponding to each detection equipment;
grouping the preset device weights to obtain a plurality of preset device weight groups;
selecting a group median of a group in which the maximum frequency is located in the preset equipment weight groups as the weight of each detection equipment;
combining the weights of a plurality of detection devices to obtain the weight vector of the detection device.
3. The object detection method of claim 2, wherein grouping the plurality of preset device weights to obtain a plurality of preset device weight groups comprises:
determining a maximum device weight and a minimum device weight of the plurality of preset device weights;
determining a device weight group distance of the preset device weight based on the maximum device weight and the minimum device weight;
and grouping the preset equipment weights based on the equipment weight group distance.
4. The object detection method according to claim 1, wherein there are a plurality of object information of the object to be detected, and the determining the weight vector of the object information includes:
acquiring a plurality of preset information weights corresponding to various object information;
grouping the preset information weights to obtain a plurality of preset information weight groups;
selecting a group median of a group in which the maximum frequency is located in a plurality of preset information weight groups as the weight of various object information;
and combining the weights of the object information corresponding to the same detection equipment to obtain a weight vector of the object information.
5. The object detection method of claim 4, wherein grouping the plurality of preset information weights to obtain a plurality of preset information weight groups comprises:
determining a maximum information weight and a minimum information weight in the plurality of preset information weights;
determining an information weight group distance of the preset information weight based on the maximum information weight and the minimum information weight;
and grouping the preset information weight based on the information weight group distance.
6. The object detection method according to claim 1, wherein the selecting the detection result of the object to be detected from the plurality of kinds of prediction results based on the probability matrix, the weight vector of the detection device, and the weight vector of the object information includes:
performing synthetic operation on the probability matrix, the weight vector of the detection device and the weight vector of the object information to obtain a synthetic operation result;
carrying out normalization processing on the synthetic operation result to obtain the probability of matching the object to be detected with the multiple prediction results;
and taking the prediction result corresponding to the maximum probability in the probabilities corresponding to the plurality of prediction results as the detection result.
7. The object detection method according to claim 1, wherein there are a plurality of types of object information of the object to be detected, and the determining the prediction probability matrix of the object to be detected corresponding to the plurality of types of prediction results based on the object information includes:
respectively determining the prediction probability of the object to be detected matched with the multiple prediction results based on various object information;
combining the prediction probabilities corresponding to the same kind of object information to obtain a prediction probability vector;
and combining the prediction probability vectors corresponding to the object information to obtain the prediction probability matrix.
8. The object detection method according to claim 7, wherein the determining the prediction probability that the object to be detected matches a plurality of prediction results based on the respective object information comprises:
acquiring a membership function corresponding to the object information;
substituting the object information into a membership function corresponding to the object information type to obtain a function result, wherein the membership function accords with Gaussian distribution;
based on the function result, the prediction probability is determined.
9. The object detection method of claim 8, wherein prior to determining the prediction probability, the method further comprises:
acquiring environment information of the environment where the object to be detected is located;
and correcting the membership function corresponding to the object information by using the environment information, and substituting the object information into the corrected membership function to obtain a function result.
10. An object detection system comprising a plurality of detection devices and a central processing unit, the central processing unit comprising:
the acquisition module is configured to acquire object information of the object to be detected, which is detected by the detection devices, and acquire various prediction results;
the matrix determination module is configured to determine a prediction probability matrix of the object to be detected corresponding to the plurality of prediction results based on the object information;
a vector determination module configured to determine a weight vector of the detection device and determine a weight vector of the object information;
and the selection module is configured to select the detection result of the object to be detected from the multiple prediction results based on the prediction probability matrix, the weight vector of the detection device and the weight vector of the object information.
CN202111600753.2A 2021-12-24 Object detection method and system Active CN114399788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111600753.2A CN114399788B (en) 2021-12-24 Object detection method and system


Publications (2)

Publication Number Publication Date
CN114399788A true CN114399788A (en) 2022-04-26
CN114399788B CN114399788B (en) 2024-09-03

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109670141A (en) * 2018-11-30 2019-04-23 网易传媒科技(北京)有限公司 Prediction technique, system, medium and electronic equipment
CN111739060A (en) * 2019-08-01 2020-10-02 北京京东尚科信息技术有限公司 Identification method, device and storage medium
WO2020211388A1 (en) * 2019-04-16 2020-10-22 深圳壹账通智能科技有限公司 Behavior prediction method and device employing prediction model, apparatus, and storage medium
CN112183299A (en) * 2020-09-23 2021-01-05 成都佳华物链云科技有限公司 Pedestrian attribute prediction method and device, electronic equipment and storage medium
WO2021135566A1 (en) * 2019-12-31 2021-07-08 华为技术有限公司 Vehicle control method and apparatus, controller, and smart vehicle

Similar Documents

Publication Publication Date Title
US11487941B2 (en) Techniques for determining categorized text
CN110689043A (en) Vehicle fine granularity identification method and device based on multiple attention mechanism
CN116881832B (en) Construction method and device of fault diagnosis model of rotary mechanical equipment
US11379685B2 (en) Machine learning classification system
CN115082920B (en) Deep learning model training method, image processing method and device
CN110852881A (en) Risk account identification method and device, electronic equipment and medium
EP3971783A1 (en) Combining data driven models for classifying data
CN114048468A (en) Intrusion detection method, intrusion detection model training method, device and medium
CN114399321A (en) Business system stability analysis method, device and equipment
CN116340796A (en) Time sequence data analysis method, device, equipment and storage medium
CN117041017A (en) Intelligent operation and maintenance management method and system for data center
CN113487223B (en) Risk assessment method and system based on information fusion
CN117269742A (en) Method, device and medium for evaluating health state of circuit breaker in high-altitude environment
CN111091099A (en) Scene recognition model construction method, scene recognition method and device
CN116956197B (en) Deep learning-based energy facility fault prediction method and device and electronic equipment
Miller et al. Hyperparameter Tuning of Support Vector Machines for Wind Turbine Detection Using Drones
CN117668737A (en) Pipeline detection data fault early warning checking method and related device
CN110580483A (en) indoor and outdoor user distinguishing method and device
CN117036732A (en) Electromechanical equipment detection system, method and equipment based on fusion model
CN114399788B (en) Object detection method and system
CN114757097B (en) Line fault diagnosis method and device
CN111031042A (en) Network anomaly detection method based on improved D-S evidence theory
CN114399788A (en) Object detection method and system
CN116188445A (en) Product surface defect detection and positioning method and device and terminal equipment
CN115984646A (en) Distributed target detection method and device for remote sensing cross-satellite observation and satellite

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant