CN115620069A - Target identification method and device, terminal equipment and storage medium - Google Patents

Target identification method and device, terminal equipment and storage medium

Info

Publication number
CN115620069A
Authority
CN
China
Prior art keywords
target
sensor
prediction
prediction probability
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211372177.5A
Other languages
Chinese (zh)
Inventor
吴金英
马冰
王亚军
刘建超
张其俊
王邓江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Wanji Iov Technology Co ltd
Original Assignee
Suzhou Wanji Iov Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Wanji Iov Technology Co ltd filed Critical Suzhou Wanji Iov Technology Co ltd
Priority to CN202211372177.5A priority Critical patent/CN115620069A/en
Publication of CN115620069A publication Critical patent/CN115620069A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07Target detection

Abstract

The embodiments of the present application relate to the technical field of the Internet of Vehicles, and provide a target identification method and device, a terminal device, and a storage medium, where the terminal device is provided with different types of sensors. The method includes: acquiring current weather information; for any one sensor of each type, acquiring target data of a target object collected by the sensor; inputting the target data into the target recognition model corresponding to the sensor to obtain a first prediction position and a first prediction category of the target object; determining, according to a preset influence relationship of weather information on the target data collected by the sensor, the target reliability with which the target recognition model outputs the first prediction position and the first prediction category; and determining the target position and target category of the target object according to all the target reliabilities and the corresponding first prediction positions and first prediction categories. With this method, the accuracy of identifying the target object can be improved.

Description

Target identification method and device, terminal equipment and storage medium
Technical Field
The present application belongs to the technical field of the Internet of Vehicles, and in particular relates to a target identification method, a target identification device, a terminal device, and a storage medium.
Background
With the rapid development of modern society and the economy and the acceleration of urbanization, automobiles play an increasingly important role in people's production and daily life. To improve vehicle traffic safety, vehicle-road cooperation systems and intelligent driver-assistance systems are deployed. For example, a vehicle-road cooperation system or an intelligent driver-assistance system can perceive traffic information such as the positions of surrounding vehicles, the number of vehicles, and the positions of pedestrians or obstacles through the sensing devices on the vehicle, and then accurately plan the driving path according to the perceived traffic information, thereby improving traffic safety.
At present, traffic information is mainly perceived by visual sensors or laser radar. However, in severe weather such as heavy fog or heavy rain, images captured by the camera become unclear and the point cloud scanned by the laser radar suffers missing points, so the perception accuracy of traffic information decreases and safety hazards arise.
Disclosure of Invention
The embodiments of the present application provide a target identification method, a target identification device, a terminal device, and a storage medium, which can alleviate the problem of low perception accuracy of traffic information.
In a first aspect, an embodiment of the present application provides a target identification method, which is applied to a terminal device, where the terminal device is configured with different types of sensors, and the method includes:
acquiring current weather information;
for any one sensor of each type, acquiring target data of a target object acquired by the sensor;
inputting target data into a target recognition model corresponding to a sensor to obtain a first prediction position and a first prediction category of a target object;
determining the target reliability when the target recognition model outputs a first prediction position and a first prediction category according to the influence relation of preset weather information on the target data collected by the sensor;
and determining the target position and the target type of the target object according to all the target credibility and the corresponding first prediction positions and first prediction types.
In a second aspect, an embodiment of the present application provides an object recognition apparatus, which is applied to a terminal device, where the terminal device is configured with different types of sensors, and the apparatus includes:
the first acquisition module is used for acquiring current weather information;
the second acquisition module is used for acquiring, for any one sensor of each type, the target data of the target object collected by the sensor;
the input module is used for inputting target data into a target recognition model corresponding to the sensor to obtain a first prediction position and a first prediction category of a target object;
the first determination module is used for determining the target reliability when the target recognition model outputs a first prediction position and a first prediction category according to the influence relation of preset weather information on the target data collected by the sensor;
and the second determining module is used for determining the target position and the target category of the target object according to all the target credibility and the corresponding first predicted position and first predicted category.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the method according to the first aspect is implemented.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to execute the method of the first aspect.
In a sixth aspect, an embodiment of the present application provides a terminal device, where multiple types of sensors and a target recognition device are disposed in the terminal device, the target recognition device is connected to the multiple types of sensors, and the target recognition device is configured to perform the method of the first aspect.
Compared with the prior art, the embodiment of the application has the beneficial effects that: the terminal device may first obtain current weather information and target data of the target object respectively acquired by each type of sensor. Then, for any sensor of each type, the acquired target data is input into a target recognition model corresponding to the sensor, and a first prediction position and a first prediction category of the target object are obtained. And then, quantifying the influence of the current weather information on the sensor to acquire the target data according to the influence relation of the preset weather information on the sensor to acquire the target data so as to determine the target reliability when the target recognition model outputs the first prediction position and the first prediction category. And finally, determining the target position and the target type of the target object according to all the target credibility and the corresponding first prediction position and first prediction type. Based on the above, by combining the first prediction type and the first prediction position output by the target recognition model corresponding to the different types of sensors, the accuracy of the acquired target position and target type can be improved. In addition, when the target type and the target position are determined, the influence of the current weather information on the target data collected by the sensor is also considered, so that the accuracy of identifying the target object is further improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart illustrating an implementation of a target identification method according to an embodiment of the present application;
fig. 2 is a schematic diagram illustrating an implementation manner of obtaining target data in a target identification method according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating an implementation manner of determining a target reliability in a target identification method according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating an implementation manner of determining a target location and a target category in a target identification method according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating an implementation of a target recognition method according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of an object recognition apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
At present, perception devices serve as the "eyes" of vehicle-road cooperation systems and autonomous vehicles for perceiving external traffic information, and their perception accuracy greatly influences the whole vehicle-road cooperation system. The perception devices are mainly camera devices and radar devices (millimeter-wave radar sensors or laser radar sensors).
However, in severe weather such as heavy fog or heavy rain, information perceived by the perception device is disturbed, so that the perception accuracy is degraded. For example, in severe weather, an image shot by the camera device is not clear, or point cloud data acquired by the radar device is missing, so that the detection range of the radar device is reduced, and the perception accuracy of traffic information is reduced. Therefore, the vehicle-road cooperation system has certain potential safety hazards when the path planning is carried out based on the traffic information with low perception precision.
Based on this, in order to improve the accuracy of the sensing device in sensing the traffic information, the embodiment of the present application provides a target identification method, which may be applied to a terminal device. The terminal device may be a vehicle-mounted terminal on a vehicle, for example, a driving assistance terminal, an automatic driving terminal, a commercial vehicle intelligent terminal, a navigation terminal, and the like, which is not limited thereto. It will be appreciated that a plurality of different types of sensors are also typically deployed on a vehicle. For example, an image sensor (image pickup device) and a radar sensor (laser radar device or millimeter wave radar device) may be connected to the vehicle-mounted terminal to upload the collected information to the vehicle-mounted terminal.
In another embodiment, the terminal device may also be a road side device. Specifically, the roadside device includes a roadside computing device, a roadside communication device and a roadside sensing device.
The roadside computing equipment is deployed along the road and cooperates with other traffic systems to complete traffic information processing and decision making; it includes devices such as multi-access edge computing units or data processing units. The roadside communication device is configured to communicate wirelessly with the vehicle-mounted terminal; the communication modes include, but are not limited to, Vehicle-to-Everything (V2X) communication, Dedicated Short Range Communication (DSRC), and the like. The roadside sensing devices, which perceive the traffic environment and the road traffic state, are generally image sensors and radar sensors installed at the roadside.
In the present embodiment, the type of the sensor configured by the terminal device and the number of each type of the sensor are not limited. For convenience of explanation, the present embodiment is described taking as an example that the types of sensors are an image sensor and a radar sensor, respectively, and the number of the image sensors and the radar sensors is 1.
Referring to fig. 1, fig. 1 is a flowchart illustrating an implementation of a target identification method according to an embodiment of the present application, where the method includes the following steps:
and S101, acquiring current weather information.
In one embodiment, the weather information includes, but is not limited to, information formed by one or more weather factors such as light, temperature, humidity, and weather category. The weather category may be one or more of rain, snow, fog, or the like. The current weather information is information formed by collecting the one or more weather factors at the current moment.
The terminal device may obtain the current weather information via its internet connection, in real time or at preset time intervals, or may determine the current weather information according to the current location information and the corresponding weather sensing device.
Specifically, the terminal device may obtain the current location information, and determine the weather sensing device located in the location range corresponding to the current location information; and finally, sending a weather information acquisition request to the weather sensing equipment, and receiving the current weather information returned by the weather sensing equipment based on the weather information acquisition request.
The current location information is location information of the terminal device, which can be determined according to a preset positioning device. For example, when the terminal device is an in-vehicle terminal, it may determine the current location information from a positioning device on the vehicle. When the terminal device is a roadside device, the position information stored in advance may be determined as the current position information. The current location information may be information formed by the longitude and latitude where the terminal device is located.
In an embodiment, the weather sensing devices include, but are not limited to, anemometers, wind vanes, rain gauges, hygrometers, and barometers. The weather sensing devices and their position ranges may be set in advance by staff according to actual conditions, and the position information of the weather sensing devices may be stored in the terminal device or published externally, so that the terminal device can obtain the position information of the weather sensing devices and determine the corresponding weather sensing device according to the current position information.
In another embodiment, the weather sensing devices may also be set up by staff of the meteorological bureau to collect weather information across the country. The weather sensing devices then send the collected weather information to local weather observation stations, which publish it externally. Based on this, the terminal device may generate a weather information acquisition request containing the current position information and send the request to a weather observation station. The weather observation station then determines the corresponding current weather information according to the current position information and returns it to the terminal device. In this embodiment, the manner in which the terminal device obtains the current weather information from the weather sensing device is not limited.
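For illustration only, the acquisition flow above can be sketched as follows. This is a minimal sketch: the WeatherInfo fields and the station objects' covers()/query() calls are hypothetical names introduced for the sketch, not interfaces defined by this embodiment.

```python
from dataclasses import dataclass

@dataclass
class WeatherInfo:
    light: float        # illuminance
    temperature: float  # degrees Celsius
    humidity: float     # relative humidity, percent
    category: str       # e.g. "rain", "snow", "fog", "clear"

def get_current_weather(current_position, stations):
    """Select the weather sensing device whose position range covers the
    current position information, send it a weather information
    acquisition request, and return the current weather information."""
    lat, lon = current_position
    station = next(s for s in stations if s.covers(lat, lon))  # position-range match
    return station.query(lat, lon)  # hypothetical request/response call, returns WeatherInfo
```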
S102, for any one sensor of each type, acquiring target data of the target object collected by the sensor.
In one embodiment, the different types of sensors typically collect different target data. Illustratively, when the sensor is an image sensor, the target data it collects is an image; when the sensor is a radar sensor, the acquired target data is point cloud data.
In an embodiment, the target object includes, but is not limited to, a vehicle, a pedestrian, other dynamic obstacle or static obstacle, and the like. The sensor can acquire target data containing a target object in real time or at preset time intervals and then send the target data to the terminal equipment.
It should be noted that, when the data collected by a sensor does not contain the target object, the terminal device need not perform the following steps on that data.
It should be added that the sensor is usually affected by the weather when collecting the target data, so the accuracy of the collected data may not be high. Based on this, in order to obtain target data with higher precision, the terminal device may process the data according to steps S201 to S203 shown in fig. 2, detailed as follows:
s201, acquiring initial data of the target object acquired by the sensor.
S202, determining a target processing mode corresponding to the initial data under the current weather information according to the type of the sensor and the association relationship between preset weather information and preset processing modes.
And S203, processing the initial data by adopting a target processing mode to obtain target data.
In an embodiment, the initial data is the data collected by the sensor on the target object, and its accuracy may be low. The association relationship characterizes the relationship among the sensor type, the preset weather information, and the preset processing mode. Based on this association relationship, the terminal device may determine the target processing mode for processing the initial data according to the type of the sensor that collected the initial data and the current weather information.
In an embodiment, the target processing manner includes, but is not limited to, one or more of snow removal, rain removal, fog removal, and the like for the image to be recognized, and the processing manner is not limited thereto.
It is understood that, because the types of data collected by the different types of sensors are different, the target processing manner determined by the different types of sensors under the current weather information is also different, and the detailed description is omitted here.
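A minimal sketch of S201-S203 follows. The table entries and processor names (derain, defog, clutter_filter) are illustrative assumptions; the embodiment does not fix the concrete association relationship.

```python
# Hypothetical association relationship:
# (sensor type, weather category) -> preset processing mode(s).
PROCESSING_TABLE = {
    ("image", "rain"): ["derain"],
    ("image", "fog"):  ["defog"],
    ("image", "snow"): ["desnow"],
    ("radar", "rain"): ["clutter_filter"],  # e.g. remove rain-induced noise points
    ("radar", "snow"): ["clutter_filter"],
}

def preprocess(initial_data, sensor_type, weather_category, processors):
    """S201-S203: select the target processing mode for this sensor type
    under the current weather information, then apply it to the initial
    data to obtain the target data."""
    steps = PROCESSING_TABLE.get((sensor_type, weather_category), [])
    data = initial_data
    for step in steps:
        data = processors[step](data)  # each processor is a callable
    return data
```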
S103, inputting the target data into a target recognition model corresponding to the sensor to obtain a first prediction position and a first prediction type of the target object.
In an embodiment, the target recognition model is a model trained in advance, and the number of the models corresponds to the types of the sensors one to one. For example, when the sensor is an image sensor, the target data thereof is an image, and thus, the corresponding target recognition model may be a target image recognition model; when the sensor is a radar sensor, the target data is point cloud data, and thus, the corresponding target identification model can be a target point cloud identification model.
It should be understood that, for target data acquired by different types of sensors on the same target object, after the target data is input to the corresponding target recognition models, the first predicted positions output by each target recognition model may be the same or different; and the first prediction category may also be the same, or different.
In addition, when the target recognition model outputs the first prediction position and the first prediction category of the target object, it also outputs a first prediction probability corresponding to the first prediction position and a second prediction probability corresponding to the first prediction category, which will not be elaborated here.
It should be noted that, when there are multiple sensors of the same type, there will also be multiple target data of the same type collected by the sensors. At this time, the target recognition model processes each target data, and correspondingly outputs a first predicted position and a first predicted category.
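The per-sensor dispatch of S103 can be sketched as follows; the Detection tuple and the model call signature are assumptions made for this sketch rather than interfaces prescribed by the embodiment.

```python
from typing import NamedTuple

class Detection(NamedTuple):
    position: tuple    # first prediction position, e.g. (x, y, z)
    category: str      # first prediction category
    p_position: float  # first prediction probability
    p_category: float  # second prediction probability

def recognize(target_data, sensor_type, models):
    """S103: dispatch the target data to the target recognition model
    matching the sensor type (image -> image recognition model,
    radar -> point cloud recognition model)."""
    model = models[sensor_type]  # one pre-trained model per sensor type
    pos, cat, p_pos, p_cat = model(target_data)
    return Detection(pos, cat, p_pos, p_cat)
```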
And S104, determining the target reliability when the target recognition model outputs the first prediction position and the first prediction type according to the influence relation of the preset weather information on the target data collected by the sensor.
In one embodiment, the target confidence level may be used to quantify the impact that the sensor has on collecting target data under the current weather information.
In an embodiment, the influence relationship may be set in advance by a worker and stored in the terminal device, or may be determined by the terminal device according to sampling data acquired by each type of sensor on the target object, which is not limited herein.
Specifically, for any sensor, the terminal device may acquire sampling data collected by the sensor for the target object under each preset weather information; then input the sampling data into the target recognition model corresponding to the sensor to obtain the target output result corresponding to the sampling data; and then determine, using the target output results and the pre-labeled real results of the sampling data under each preset weather information, the prediction accuracy of the target recognition model on the sampling data under each preset weather information. Finally, the terminal device may establish the influence relationship according to the prediction accuracy corresponding to each preset weather information. In this case, the influence relationship is established from factors such as the preset weather information, the sensor, and the prediction accuracy, and the terminal device may take the prediction accuracy as the target reliability.
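A minimal sketch of how such an influence relationship could be built from labeled sampling data follows; the layout of samples and the equality test on model outputs are simplifying assumptions of this sketch.

```python
def build_influence_relation(model, samples):
    """Estimate, for one sensor, the prediction accuracy of its target
    recognition model under each preset weather information.  `samples`
    maps a weather key to (sampling_data, real_result) pairs collected
    under that weather; the accuracy then serves as the influence value
    / target reliability for that weather."""
    relation = {}
    for weather, pairs in samples.items():
        correct = sum(1 for data, truth in pairs if model(data) == truth)
        relation[weather] = correct / len(pairs)  # prediction accuracy
    return relation
```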
It should be noted that, as described in S101 above, the weather information includes, but is not limited to, information formed by one or more weather factors such as light, temperature, humidity, and weather category. Therefore, the preset weather information should also be formed of one or more of these weather factors. Based on this, when establishing the above influence relationship, the terminal device may specifically establish the influence relationship among each preset weather factor, the sensor, and the prediction accuracy. In that case, when determining the target reliability from the influence relationship, the terminal device may proceed according to S301 to S303 shown in fig. 3, detailed as follows:
s301, determining the influence value of each weather factor in the current weather information on the sensor according to the influence relation.
In an embodiment, as can be seen from the above description, when the preset weather information is formed by one or more of the weather factors, the terminal device may establish an influence relationship between each preset weather factor, the sensor, and the prediction accuracy. At this time, the terminal device may determine the prediction accuracy as the influence value of each weather factor on the sensor.
Based on the above, the terminal device may determine, according to the influence relationship, an influence value caused by each weather factor in the current weather information when the sensor acquires the target data.
S302, determining a preset number of target influence values from the plurality of influence values; the target influence values are greater than the non-target influence values.
In an embodiment, because the current weather information may include many weather factors and the influence value of each weather factor on the sensor's collection of target data may differ, the terminal device may further select, according to the magnitudes of the plurality of influence values, a preset number of target influence values to participate in subsequent processing. The target influence values are greater than the non-target influence values.
And S303, determining the target reliability according to the target influence value.
In an embodiment, the terminal device may determine the weighted sum of all target influence values as the target reliability, where the weights of all target influence values follow a normal distribution. Alternatively, the sum of all target influence values may be determined as the target reliability; this is not limited.
In one embodiment, the normal distribution can be regarded as a distribution over the weights: when the specific weight of each target influence value is unknown, weights are assigned to the target influence values according to a normal distribution. The target reliability obtained in this way is more objective and can characterize the influence of the current weather information on the sensor.
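A minimal sketch of S301-S303 follows. The Gaussian weight shape (standard normal density sampled at evenly spaced points, then normalized) is an illustrative assumption; the embodiment only requires that the weights follow a normal distribution.

```python
import math

def target_reliability(influence_values, k):
    """S302-S303: keep the k largest influence values and combine them
    into the target reliability as a weighted sum whose weights follow
    a normal distribution."""
    top = sorted(influence_values, reverse=True)[:k]  # target influence values
    # Sample a bell-shaped (normal) curve at k evenly spaced points,
    # then normalize so the weights sum to one.
    raw = [math.exp(-0.5 * ((i - (k - 1) / 2) ** 2)) for i in range(k)]
    total = sum(raw)
    weights = [r / total for r in raw]
    return sum(w * v for w, v in zip(weights, top))
```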
And S105, determining the target position and the target type of the target object according to all the target credibility and the corresponding first prediction position and first prediction type.
In one embodiment, based on S104 above, the first predicted location and the first predicted category output by each target recognition model will correspond to a target confidence level. Based on this, the terminal device may determine the target position and the target class of the target object according to S401-S404 as shown in fig. 4. The details are as follows:
s401, aiming at any sensor, a first prediction probability of a first prediction position output by the target recognition model is obtained, and a second prediction probability of a first prediction category is output.
In one embodiment, in the above S103, when the target recognition model outputs the first predicted position and the first predicted category of the target object, the target recognition model also generally outputs a first predicted probability corresponding to the first predicted position and a second predicted probability corresponding to the first predicted category. Therefore, the terminal device can directly acquire the first prediction probability and the second prediction probability.
S402, correcting the first prediction probability according to the target reliability to obtain a third prediction probability.
And S403, correcting the second prediction probability according to the target reliability to obtain a fourth prediction probability.
In an embodiment, the correction may specifically be: multiplying the target reliability by the first prediction probability to obtain the third prediction probability; and multiplying the target reliability by the second prediction probability to obtain the fourth prediction probability.
S404, determining the target position and the target category of the target object according to the third prediction probability and the fourth prediction probability corresponding to all the sensors.
In an embodiment, after obtaining the third prediction probability and the fourth prediction probability corresponding to the target data acquired by each sensor, the terminal device may determine the first prediction position corresponding to the maximum value of all the third prediction probabilities as the target position; and determining the first prediction category corresponding to the maximum value in all the fourth prediction probabilities as the target category.
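The correction and selection of S401-S404 reduce to a few lines; the tuple layout of detections is an assumption of this sketch.

```python
def fuse(detections):
    """S401-S404: each entry of `detections` is a tuple
    (target_reliability, position, category, p_position, p_category)
    for one sensor.  The first and second prediction probabilities are
    corrected by multiplying with the target reliability (giving the
    third/fourth prediction probabilities), and the position and
    category with the largest corrected probabilities are returned."""
    best_pos = max(detections, key=lambda d: d[0] * d[3])  # max third prediction probability
    best_cat = max(detections, key=lambda d: d[0] * d[4])  # max fourth prediction probability
    return best_pos[1], best_cat[2]  # target position, target category
```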
In this embodiment, the terminal device may first obtain current weather information and target data of the target object respectively acquired by each type of sensor. Then, for any sensor of each type, the acquired target data is input into a target recognition model corresponding to the sensor, and a first prediction position and a first prediction category of the target object are obtained. And then, quantifying the influence of the current weather information on the sensor to acquire the target data according to the influence relation of the preset weather information on the sensor to acquire the target data so as to determine the target reliability when the target recognition model outputs the first prediction position and the first prediction category. And finally, determining the target position and the target type of the target object according to all the target credibility and the corresponding first prediction positions and first prediction types. Based on the above, by combining the first prediction type and the first prediction position output by the target recognition model corresponding to different types of sensors, the accuracy of the acquired target position and the target type can be improved. In addition, when the target type and the target position are determined, the influence of the current weather information on the target data collected by the sensor is also considered, so that the accuracy of identifying the target object is further improved.
In another embodiment, when the radar sensor collects the point cloud data of the target object, the corresponding position of each point cloud can be directly collected. Therefore, after the point cloud belonging to the target object is identified by the target identification model corresponding to the radar sensor, the first predicted position of the target object can be directly determined. However, the target recognition model corresponding to the image sensor needs to determine a target object from the image to be recognized, and perform world coordinate conversion according to the position of the target object in the image to be recognized to obtain a first predicted position of the target object.
Therefore, when predicting the position of the target object, it is generally considered that the accuracy of the first predicted position output by the target recognition model corresponding to the image sensor is lower than the accuracy of the first predicted position output by the target recognition model corresponding to the radar sensor.
Similarly, when determining the target class, the image sensor directly senses the image of the target object, and the radar sensor senses the shape of the target object. Therefore, when predicting the type of the target object, it is generally considered that the accuracy of the first prediction type output by the target recognition model corresponding to the image sensor is higher than the accuracy of the first prediction type output by the target recognition model corresponding to the radar sensor.
Based on this, in order to be able to further improve the recognition accuracy of the target object, the terminal device may also determine the target position and the target category according to S501-S505 shown in fig. 5. The details are as follows:
s501, if the sensor corresponding to the target recognition model outputting the first prediction probability is an image sensor, weighting the first prediction probability and the first preset weight to obtain a fifth prediction probability.
S502, if the sensor corresponding to the target recognition model outputting the first prediction probability is an image sensor, weighting the first prediction probability and a second preset weight to obtain a fifth prediction probability; the first preset weight is less than the second preset weight.
As can be seen from the above description, when the target position is identified, the accuracy of the first predicted position output based on the target data collected by the radar sensor and the corresponding target identification model is higher. Therefore, the terminal device may set the first preset weight to a value smaller than the second preset weight.
Then, the terminal device may weight the first prediction probability corresponding to the image sensor with a first preset weight to obtain a fifth prediction probability corresponding to the image sensor; and weighting the first prediction probability corresponding to the radar sensor and a second preset weight with a larger numerical value to obtain a fifth prediction probability corresponding to the radar sensor.
And S503, if the sensor corresponding to the target recognition model outputting the second prediction probability is an image sensor, weighting the second prediction probability with a third preset weight to obtain a sixth prediction probability.
S504, if the sensor corresponding to the target recognition model outputting the second prediction probability is a radar sensor, weighting the second prediction probability and a fourth preset weight to obtain a sixth prediction probability; the third preset weight is greater than the fourth preset weight.
Similarly, when the target type is identified, the accuracy of the first prediction type output based on the target data collected by the image sensor and the corresponding target identification model is higher. Therefore, the terminal device may set the third preset weight to a value larger than the fourth preset weight.
Then, the terminal device may weight the second prediction probability corresponding to the image sensor with the third preset weight, which has the larger value, to obtain the sixth prediction probability corresponding to the image sensor; and weight the second prediction probability corresponding to the radar sensor with the fourth preset weight to obtain the sixth prediction probability corresponding to the radar sensor.
And S505, determining the target position and the target type of the target object according to all the target credibility and the corresponding fifth prediction probability and sixth prediction probability.
In an embodiment, the step S505 is similar to the step S105, and will not be described again.
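A minimal sketch of S501-S505 follows, assuming illustrative numeric weights that respect only the orderings required above (first less than second, third greater than fourth); the concrete values are not fixed by the embodiment.

```python
# Illustrative preset weights (assumed values):
W_POSITION = {"image": 0.3, "radar": 0.7}  # first / second preset weight
W_CATEGORY = {"image": 0.7, "radar": 0.3}  # third / fourth preset weight

def fuse_with_sensor_weights(detections):
    """S501-S505: each entry of `detections` is a tuple
    (sensor_type, target_reliability, position, category, p_pos, p_cat).
    The prediction probabilities are first weighted by the preset weight
    of the sensor type (giving the fifth/sixth prediction probabilities)
    and then fused with the target reliability as in S105."""
    best_pos = max(detections, key=lambda d: d[1] * W_POSITION[d[0]] * d[4])
    best_cat = max(detections, key=lambda d: d[1] * W_CATEGORY[d[0]] * d[5])
    return best_pos[2], best_cat[3]  # target position, target category
```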
Referring to fig. 6, fig. 6 is a block diagram of a target identification device according to an embodiment of the present disclosure. The target identification apparatus in this embodiment includes modules for executing the steps in the embodiments corresponding to fig. 1 to 5; for related descriptions, please refer to those embodiments. For convenience of explanation, only the portions related to the present embodiment are shown. The target identification device is applied to a terminal device provided with multiple types of sensors. Referring to fig. 6, the object recognition apparatus 600 may include: a first obtaining module 610, a second obtaining module 620, an input module 630, a first determining module 640, and a second determining module 650, wherein:
the first obtaining module 610 is configured to obtain current weather information.
And a second obtaining module 620, configured to obtain, for any one sensor of each type, target data of the target object acquired by the sensor.
The input module 630 is configured to input the target data into a target recognition model corresponding to the sensor, so as to obtain a first predicted position and a first predicted category of the target object.
The first determining module 640 is configured to determine, according to an influence relationship of preset weather information on target data collected by the sensor, a target reliability when the target recognition model outputs the first predicted position and the first predicted category.
A second determining module 650, configured to determine a target location and a target category of the target object according to all target reliabilities and the corresponding first predicted locations and first predicted categories.
In an embodiment, the first obtaining module 610 is further configured to:
acquiring current position information; determining weather sensing equipment in a position range corresponding to the current position information; sending a weather information acquisition request to the weather sensing equipment, and receiving current weather information returned by the weather sensing equipment based on the weather information acquisition request.
In an embodiment, the second obtaining module 620 is further configured to:
acquiring initial data of the target object collected by the sensor; determining a target processing mode corresponding to the initial data under the current weather information according to the type of the sensor and the association relationship between preset weather information and preset processing modes; and processing the initial data in the target processing mode to obtain the target data.
In one embodiment, the weather information includes a plurality of weather factors; the first determining module 640 is further configured to:
determining the influence value of each weather factor in the current weather information on the sensor according to the influence relation; determining a preset number of target influence values from the plurality of influence values; the target impact value is greater than the non-target impact value; and determining the target reliability according to the target influence value.
In an embodiment, the first determining module 640 is further configured to:
determining the weighted sum of all target influence values as the target reliability; the weights of all target influence values follow a normal distribution.
In an embodiment, the second determining module 650 is further configured to:
for any one sensor, acquiring the first prediction probability of the first prediction position output by the target recognition model and the second prediction probability of the first prediction category; correcting the first prediction probability according to the target reliability to obtain a third prediction probability; correcting the second prediction probability according to the target reliability to obtain a fourth prediction probability; and determining the target position and the target category of the target object according to the third prediction probabilities and fourth prediction probabilities corresponding to all the sensors.
In an embodiment, the second determining module 650 is further configured to:
determining a first prediction position corresponding to the maximum value of the third prediction probability as a target position; and determining the first prediction category corresponding to the maximum value of the fourth prediction probability as the target category.
In an embodiment, the sensor includes at least an image sensor and a radar sensor; the object recognition apparatus 600 further includes:
and the first weighting module is used for weighting the first prediction probability and the first preset weight to obtain a fifth prediction probability if the sensor corresponding to the target recognition model outputting the first prediction probability is an image sensor.
The second weighting module is used for weighting the first prediction probability and a second preset weight if the sensor corresponding to the target recognition model outputting the first prediction probability is a radar sensor to obtain a fifth prediction probability; the first preset weight is less than the second preset weight.
And the third weighting module is used for weighting the second prediction probability and a third preset weight to obtain a sixth prediction probability if the sensor corresponding to the target recognition model outputting the second prediction probability is an image sensor.
The fourth weighting module is used for weighting the second prediction probability and a fourth preset weight to obtain a sixth prediction probability if the sensor corresponding to the target recognition model outputting the second prediction probability is a radar sensor; the third preset weight is greater than the fourth preset weight.
And the third determining module is used for determining the target position and the target category of the target object according to all the target credibility and the corresponding fifth prediction probability and sixth prediction probability.
It should be understood that, in the structural block diagram of the target identification apparatus shown in fig. 6, each module is used to execute each step in the embodiments corresponding to fig. 1 to fig. 5, and each of those steps has been explained in detail in the above embodiments; specific reference is made to the relevant descriptions in the embodiments corresponding to fig. 1 to fig. 5, which are not repeated here.
Fig. 7 is a block diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device 700 of this embodiment includes: a processor 710, a memory 720 and a computer program 730, such as a program for an object recognition method, stored in the memory 720 and executable on the processor 710. The processor 710, when executing the computer program 730, implements the steps in the embodiments of the object recognition methods described above, such as S101 to S105 shown in fig. 1. Alternatively, the processor 710, when executing the computer program 730, implements the functions of the modules in the embodiment corresponding to fig. 6, for example, the functions of the modules 610 to 650 shown in fig. 6, and refer to the related description in the embodiment corresponding to fig. 6 specifically.
Illustratively, the computer program 730 may be divided into one or more modules, which are stored in the memory 720 and executed by the processor 710 to implement the object recognition methods provided by the embodiments of the present application. One or more of the modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 730 in the terminal device 700. For example, the computer program 730 may implement the object recognition method provided by the embodiment of the present application.
Terminal device 700 can include, but is not limited to, a processor 710, a memory 720. Those skilled in the art will appreciate that fig. 7 is merely an example of a terminal device 700 and does not constitute a limitation of terminal device 700 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The processor 710 may be a central processing unit, or another general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The storage 720 may be an internal storage unit of the terminal device 700, such as a hard disk or a memory of the terminal device 700. The memory 720 may also be an external storage device of the terminal device 700, such as a plug-in hard disk, a smart card, a flash memory card, etc. provided on the terminal device 700. Further, the memory 720 may also include both internal and external memory units of the terminal device 700.
The embodiments of the present application provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the target recognition method in the above embodiments is implemented.
The embodiment of the present application provides a computer program product, which, when running on a terminal device, enables the terminal device to execute the target identification method in the foregoing embodiments.
The embodiment of the application also provides another terminal device, wherein the terminal device is provided with various types of sensors and a target recognition device, the target recognition device is connected with the various types of sensors, and the target recognition device is used for executing the target recognition method in each embodiment.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An object recognition method is applied to a terminal device, wherein the terminal device is provided with different types of sensors, and the method comprises the following steps:
acquiring current weather information;
for any one of the sensors of each type, acquiring target data of a target object acquired by the sensor;
inputting the target data into a target recognition model corresponding to the sensor to obtain a first prediction position and a first prediction category of the target object;
determining the target reliability of the target recognition model when the first prediction position and the first prediction category are output according to the influence relation of preset weather information on the target data collected by the sensor;
and determining the target position and the target type of the target object according to all the target credibility and the corresponding first prediction positions and first prediction types.
2. The method of claim 1, wherein the obtaining current weather information comprises:
acquiring current position information;
determining weather sensing equipment in a position range corresponding to the current position information;
sending a weather information acquisition request to the weather sensing equipment, and receiving the current weather information returned by the weather sensing equipment based on the weather information acquisition request.
3. The method of claim 1, wherein the acquiring target data of the target object acquired by the sensor comprises:
acquiring initial data of the target object acquired by the sensor;
determining a target processing mode corresponding to the initial data under the current weather information according to the type of the sensor and the association relationship between the preset weather information and a preset processing mode;
and processing the initial data by adopting the target processing mode to obtain the target data.
4. The method of claim 1, wherein the weather information includes a plurality of weather factors; the determining, according to an influence relationship of preset weather information on the sensor collecting the target data, a target reliability when the target recognition model outputs the first predicted position and the first predicted category includes:
determining the influence value of each weather factor in the current weather information on the sensor according to the influence relation;
determining a preset number of target influence values from a plurality of the influence values; the target impact value is greater than a non-target impact value;
and determining the target reliability according to the target influence value.
5. The method of claim 4, wherein said determining said target confidence level based on said target impact value comprises:
determining a weighted sum of all the target impact values as the target confidence; and the weights of all the target influence values are subjected to normal distribution.
6. The method of any one of claims 1-5, wherein determining the target location and the target class of the target object based on all of the target trustworthiness and the corresponding first predicted locations and first predicted classes comprises:
for any one of the sensors, acquiring the first prediction probability of the first prediction position output by the target recognition model and the second prediction probability of the first prediction category;
correcting the first prediction probability according to the target reliability to obtain a third prediction probability;
correcting the second prediction probability according to the target reliability to obtain a fourth prediction probability;
and determining the target position and the target category of the target object according to the third prediction probability and the fourth prediction probability corresponding to all the sensors.
7. The method of claim 6, wherein the sensors comprise at least an image sensor and a radar sensor; the method further comprises the following steps:
if the sensor corresponding to the target recognition model outputting the first prediction probability is the image sensor, weighting the first prediction probability and a first preset weight to obtain a fifth prediction probability;
if the sensor corresponding to the target recognition model outputting the first prediction probability is the radar sensor, weighting the first prediction probability and a second preset weight to obtain a fifth prediction probability; the first preset weight is smaller than the second preset weight;
if the sensor corresponding to the target recognition model outputting the second prediction probability is the image sensor, weighting the second prediction probability and a third preset weight to obtain a sixth prediction probability;
if the sensor corresponding to the target recognition model outputting the second prediction probability is the radar sensor, weighting the second prediction probability and a fourth preset weight to obtain a sixth prediction probability; the third preset weight is greater than the fourth preset weight;
and determining the target position and the target category of the target object according to all the target credibility and the corresponding fifth prediction probability and sixth prediction probability.
8. An object recognition apparatus applied to a terminal device equipped with sensors of different types, the apparatus comprising:
the first acquisition module is used for acquiring current weather information;
the second acquisition module is used for acquiring target data of a target object acquired by the sensor aiming at any one sensor of each type;
the input module is used for inputting the target data into a target recognition model corresponding to the sensor to obtain a first prediction position and a first prediction category of the target object;
the first determination module is used for determining the target reliability of the target recognition model when the first prediction position and the first prediction category are output according to the influence relation of preset weather information on the target data collected by the sensor;
and the second determining module is used for determining the target position and the target category of the target object according to all the target credibility and the corresponding first predicted position and first predicted category.
9. A terminal device provided with a plurality of types of sensors and an object recognition means connected to the plurality of types of sensors, the object recognition means being configured to perform the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202211372177.5A 2022-11-03 2022-11-03 Target identification method and device, terminal equipment and storage medium Pending CN115620069A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211372177.5A CN115620069A (en) 2022-11-03 2022-11-03 Target identification method and device, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211372177.5A CN115620069A (en) 2022-11-03 2022-11-03 Target identification method and device, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115620069A true CN115620069A (en) 2023-01-17

Family

ID=84875883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211372177.5A Pending CN115620069A (en) 2022-11-03 2022-11-03 Target identification method and device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115620069A (en)

Similar Documents

Publication Publication Date Title
CN108318043B (en) Method, apparatus, and computer-readable storage medium for updating electronic map
CN113642633B (en) Method, device, equipment and medium for classifying driving scene data
CN111382768B (en) Multi-sensor data fusion method and device
CN111770451B (en) Road vehicle positioning and sensing method and device based on vehicle-road cooperation
CN112700470A (en) Target detection and track extraction method based on traffic video stream
US10803751B2 (en) Processing device
CN110703739B (en) Vehicle diagnosis method, roadside unit, on-board unit, system, and storage medium
CN110304068B (en) Method, device, equipment and storage medium for collecting automobile driving environment information
CN110388929B (en) Navigation map updating method, device and system
CN110807924A (en) Multi-parameter fusion method and system based on full-scale full-sample real-time traffic data
US20220215197A1 (en) Data processing method and apparatus, chip system, and medium
CN112949782A (en) Target detection method, device, equipment and storage medium
CN113537362A (en) Perception fusion method, device, equipment and medium based on vehicle-road cooperation
CN114639085A (en) Traffic signal lamp identification method and device, computer equipment and storage medium
CN111947669A (en) Method for using feature-based positioning maps for vehicles
CN117372979A (en) Road inspection method, device, electronic equipment and storage medium
CN113033493A (en) Target object inspection method and device, electronic equipment and storage medium
CN117130010A (en) Obstacle sensing method and system for unmanned vehicle and unmanned vehicle
CN113050081A (en) Method and device for detecting shelter, radar, vehicle and storage medium
CN115620069A (en) Target identification method and device, terminal equipment and storage medium
CN114895274A (en) Guardrail identification method
CN114968189A (en) Platform for perception system development of an autopilot system
CN115331480A (en) Vehicle early warning method and device and computing equipment
US11815362B2 (en) Map data generation apparatus
CN115775329A (en) Target identification method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination