CN210270946U - Target identification system based on multi-sensor information fusion - Google Patents

Target identification system based on multi-sensor information fusion

Info

Publication number
CN210270946U
CN210270946U (application CN201921152880.9U)
Authority
CN
China
Prior art keywords
target
monitoring
sensor
information fusion
monitoring area
Prior art date
Legal status
Active
Application number
CN201921152880.9U
Other languages
Chinese (zh)
Inventor
蔡锦华
熊健
兰童玲
张返立
熊勰
吴包琳
Current Assignee
Jiangxi Lianchuang Precision Electromechanics Co ltd
Original Assignee
Jiangxi Lianchuang Precision Electromechanics Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangxi Lianchuang Precision Electromechanics Co ltd filed Critical Jiangxi Lianchuang Precision Electromechanics Co ltd
Priority to CN201921152880.9U
Application granted
Publication of CN210270946U


Abstract

The utility model relates to a target identification system based on multi-sensor information fusion. An information fusion target identification device is deployed at a monitoring center; a monitoring device, consisting of an interconnected detection/sensor unit, a portable photovoltaic system and a portable battery, is deployed in the rear-end monitoring area; the front-end monitoring area is likewise provided with a monitoring device. The monitoring device of the rear-end monitoring area communicates with the monitoring device of the front-end monitoring area by wire, and with the information fusion target identification device of the monitoring center wirelessly. The utility model uses the D-S combination rule to fuse the individual evidence bodies into a new evidence body that better reflects the target to be identified; a decision is then made according to the decision rule, so that the target is identified and the recognition rate is improved.

Description

Target identification system based on multi-sensor information fusion
Technical Field
The utility model relates to information fusion and target identification techniques applicable to unattended three-dimensional monitoring systems in complex border and coastal defense environments, and in particular to a target identification system based on multi-sensor information fusion.
Background Art
Border and coastal defense areas feature long patrol routes, wide patrol ranges, complex geographic environments and harsh weather conditions. The traditional manual patrol mode is constrained by limited personnel; fatigue from long patrols and a limited observation range mean that incursions by hostile forces and terrorists are often missed, with serious consequences. Existing border and coastal defense monitoring equipment is of a single type and generates a large amount of invalid data; the mass of data that monitoring personnel must face every day stands in sharp contrast to the limited manpower available, so relevant information about an intruding target cannot be acquired effectively and the monitored area remains at risk of intrusion. A target identification device is therefore urgently needed to meet the requirement of unattended three-dimensional monitoring for border and coastal defense and to broadly guarantee its security.
Border and coastal defense areas adjoin the territory of multiple countries, the regional situation is complex and varied, and the periphery faces severe security challenges. These areas mainly have the following characteristics: 1) flat and open terrain; 2) river boundaries and rugged mountain roads; 3) perilous ground with towering mountain ranges; 4) dense jungle with frequent thunder and lightning; 5) drifting yellow sand, Gobi desert and oases.
Target information takes many forms, covers all manner of content and is vast in quantity. Against this background, a single sensor can only acquire a certain attribute of the target information, reflecting one facet of the target to be detected, and cannot obtain omni-directional information about the target. Detecting the target with multiple sensors therefore allows different types of sensors to complement one another in time, space and function; when a single sensor fails or cannot work normally during detection, the other sensors can still effectively acquire multi-dimensional target information; meanwhile, the detection result of a single sensor can be confirmed by the other sensors and fused with theirs, improving the confidence of the detection information and the fault tolerance of the system. Compared with a single sensor, the integrity and accuracy of the target information are thus better guaranteed, which facilitates decision-making on the target type and improves the recognition rate.
Multi-sensor information fusion technology was first applied in the military field and, with the development of disciplines such as artificial intelligence, machine vision, image processing, artificial neural networks and fuzzy control, and the rapid emergence of sensors of different types and performance levels, it has become a hotspot of current research. According to the degree of abstraction of the information, multi-sensor information fusion can be divided into low-level data-layer fusion, middle-level feature-layer fusion and high-level decision-layer fusion; according to the scope and data type of the information processing and the location of the fusion, it can be divided into centralized, distributed and hybrid fusion structure models. Information fusion methods can be roughly classified into optimal estimation theory methods (least squares, Kalman filtering and maximum likelihood estimation), statistical methods (Bayesian theory, evidence theory and Markov processes), information theory methods (entropy methods and minimum description length) and artificial intelligence methods (artificial neural networks, genetic algorithms and fuzzy logic). Multi-sensor information fusion technology is now widely applied, mainly in civil fields such as medical diagnosis, traffic monitoring, fire monitoring, water quality monitoring, industrial control and ocean monitoring.
Target identification is one of the main applications of multi-sensor information fusion. Its essence is to extract features from an observed unknown target through real-time signal processing, build a target feature vector, and match it against a database of known target features to achieve identification. If only a single sensor is used, the information it collects is one-sided, local and uncertain, which lowers the target recognition rate. Multiple sensors detect the target in multiple dimensions, from multiple angles and over multiple attributes, so the target information is richer and more complete; extracting several feature vectors of the target and fusing them effectively improves the recognition rate.
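To make the matching step concrete, the following is a minimal sketch of nearest-neighbour matching of an extracted feature vector against a known-target feature database, written in C++ to match the implementation language mentioned later in this document; the feature dimensionality, class labels, distance metric and numeric values are assumptions chosen for the example and are not taken from the utility model.

```cpp
#include <iostream>
#include <limits>
#include <string>
#include <vector>

// One known-target template: a class label plus a reference feature vector.
struct TargetTemplate {
    std::string label;                 // e.g. "person", "vehicle", "low-altitude small target"
    std::vector<double> features;      // reference feature vector (assumed dimensionality)
};

// Squared Euclidean distance between two feature vectors of equal length.
static double squaredDistance(const std::vector<double>& a, const std::vector<double>& b) {
    double d = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        const double diff = a[i] - b[i];
        d += diff * diff;
    }
    return d;
}

// Return the label of the closest template (simple nearest-neighbour matching).
std::string matchTarget(const std::vector<double>& observed,
                        const std::vector<TargetTemplate>& database) {
    double best = std::numeric_limits<double>::max();
    std::string bestLabel = "unknown";
    for (const auto& t : database) {
        const double d = squaredDistance(observed, t.features);
        if (d < best) { best = d; bestLabel = t.label; }
    }
    return bestLabel;
}

int main() {
    // Hypothetical 3-dimensional feature vectors, for illustration only.
    std::vector<TargetTemplate> db = {
        {"person",  {0.2, 0.8, 0.1}},
        {"vehicle", {0.9, 0.3, 0.5}},
    };
    std::cout << matchTarget({0.25, 0.75, 0.15}, db) << std::endl;  // prints "person"
    return 0;
}
```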
Disclosure of Invention
An object of the utility model is to provide a target identification system based on multi-sensor information fusion. It addresses the lack of deep fusion of the information detected by individual sensors, can identify personnel, vehicles and low-altitude small targets, and offers a solution for unattended border and coastal defense regions.
The purpose of the utility model is achieved by the following technical scheme. A target identification system based on multi-sensor information fusion comprises a monitoring center, a rear-end monitoring area and a front-end monitoring area. An information fusion target identification device is deployed at the monitoring center; a monitoring device, consisting of an interconnected detection/sensor unit, a portable photovoltaic system and a portable battery, is deployed in the rear-end monitoring area; the front-end monitoring area is likewise provided with a monitoring device. The monitoring device of the rear-end monitoring area communicates with the monitoring device of the front-end monitoring area by wire, and with the information fusion target identification device of the monitoring center wirelessly.
The utility model uses a plurality of detection/sensors to collect attribute information of the target to be identified and to generate target BPAs (basic probability assignments); the feature-attribute data are then associated and fused. The data fusion is based on the D-S evidence theory: measurements of the target are generated from the target feature information acquired by each sensor, forming the evidence bodies of the theory; corresponding basic probability assignment functions are constructed from these evidence bodies, each single sensor assigning a degree of belief to the identified target; the D-S combination rule then fuses the single evidence bodies into a new evidence body that better reflects the target to be identified; finally a decision is made according to the decision rule, so that the target is identified and the recognition rate is improved.
Drawings
FIG. 1 is a system connection diagram of the present invention;
FIG. 2 is a flow chart of the present invention;
FIG. 3 is a flow chart of a multi-sensor information fusion method of the present invention;
fig. 4 is a block diagram of the multi-target recognition model of the present invention.
In the figures: 100. monitoring center; 200. rear-end monitoring area; 300. front-end monitoring area; 1. information fusion target identification device; 2. monitoring device; 21. detection/sensor; 22. portable photovoltaic system; 23. portable battery.
Detailed Description
The technical implementation of the utility model is described in detail below with reference to the accompanying drawings and embodiments. Referring to fig. 1 to 4, a target identification system based on multi-sensor information fusion includes a monitoring center 100, a rear-end monitoring area 200 and a front-end monitoring area 300. The monitoring center 100 is deployed with an information fusion target identification device 1; a monitoring device 2, composed of an interconnected detection/sensor 21, portable photovoltaic system 22 and portable battery 23, is deployed in the rear-end monitoring area 200; the front-end monitoring area 300 is likewise provided with a monitoring device 2. The monitoring device 2 of the rear-end monitoring area 200 communicates with the monitoring device 2 of the front-end monitoring area 300 by wire, and with the information fusion target identification device 1 of the monitoring center 100 wirelessly.
In the target identification system based on multi-sensor information fusion of the utility model, information about the target to be identified is first acquired by the multiple sensors, the target information is preprocessed, and target features are extracted; each single sensor then performs a preliminary identification of the target to form a target BPA (basic probability assignment), and the information fusion target identification device, having obtained the BPAs computed by the single sensors, performs data fusion and calculates the target belief function according to an improved D-S evidence theory method; finally the target is identified through the decision logic rule.
Embodiment:
Fig. 1 is the system connection diagram of the utility model. The system is designed for unattended border and coastal defense monitoring areas, and the various sensors can be freely combined and flexibly configured for different monitoring environments. For identifying the target types of interest (personnel, vehicles and low-altitude small targets), fig. 1 adopts a combination of an acoustic array sensor, a vibration sensor, a visible light detector and an infrared thermal image detector; this is only an example combination. Alternative embodiments may incorporate other types of detection/sensors 21 (e.g., buried fiber grating vibration sensors, buried fiber-optic acoustic sensors, magnetic sensors, microwave sensors, laser sensors, etc.), or omit some of the sensors; the principle of operation remains the same. The front-end monitoring area 300 is provided with the vibration sensor, which reports identified target information to the information fusion target identification device 1 over wired communication; the rear-end monitoring area 200 is provided with the acoustic array sensor, the visible light detector and the infrared thermal image detector, which report identified target information to the information fusion target identification device 1 over wireless communication. The information fusion target identification device 1 is deployed at the monitoring center 100 and identifies targets, determining their type and bearing.
The multiple detection/sensors 21 are deployed intelligently according to the characteristics of the different sensor types and the operating environment. The vibration sensor is a passive detection technology and can reconnoitre targets through 360 degrees around it; it is not limited by visibility conditions and is a fully passive, omni-directional reconnaissance device. It has strong survivability and is not easily subject to electronic interference, but its drawbacks include a short detection range, low positioning accuracy and susceptibility to wind, rain and other environmental influences. The vibration sensor is therefore deployed in the front-end monitoring area, with a spacing chosen according to actual needs (typically within the interval of 30-100 meters), and can be used to identify personnel, vehicles and the like.
The acoustic array sensor consists of several microphones with good sensitivity and phase consistency. It receives the acoustic signals emitted by a moving target; because these signals reach the individual microphones with phase (time) differences, a high-resolution array signal processing algorithm can determine the direction of the target, while the target type is judged from the time-domain and frequency-domain characteristics of the acoustic signal. The acoustic array sensor offers a flat response characteristic, high sensitivity, small nonlinear distortion and a high recognition rate. It is therefore deployed in the rear-end monitoring area and can be used to identify low-altitude small targets and the like.
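As a rough illustration of the phase/time-difference principle described above (the utility model does not disclose its actual array-processing algorithm, which is stated only to be a high-resolution method), the sketch below estimates the bearing of a far-field source from the time difference of arrival at two microphones; the microphone spacing, sound speed and delay value are assumed for the example.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>

// Estimate the bearing (relative to the array broadside) of a far-field acoustic source
// from the time difference of arrival (TDOA) at two microphones.
//   tdoaSeconds : measured arrival-time difference between the two microphones
//   spacingM    : microphone spacing in meters
//   soundSpeed  : speed of sound in m/s
double bearingFromTdoa(double tdoaSeconds, double spacingM, double soundSpeed = 343.0) {
    // Far-field model: tdoa = (spacing / c) * sin(theta)  =>  theta = asin(c * tdoa / spacing)
    double s = soundSpeed * tdoaSeconds / spacingM;
    s = std::max(-1.0, std::min(1.0, s));        // clamp against measurement noise
    return std::asin(s);
}

int main() {
    const double kPi = 3.14159265358979323846;
    const double tdoa = 0.0005;                  // 0.5 ms delay (hypothetical measurement)
    const double spacing = 0.5;                  // 0.5 m microphone spacing (assumed)
    std::cout << "estimated bearing: "
              << bearingFromTdoa(tdoa, spacing) * 180.0 / kPi << " degrees\n";  // ~20 degrees
    return 0;
}
```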
The visible light detector senses the texture (shape, contour and illumination changes) and color characteristics of the target and can acquire rich target features. The infrared thermal image detector measures thermal radiation temperature data and also senses characteristics such as the target's shape and contour, compensating for the visible light detector's weaker recognition at night and in foggy conditions. The visible light detector and the infrared thermal image detector are therefore deployed in the rear-end monitoring area and can be used to identify personnel, vehicles, low-altitude small targets and the like.
Fig. 2 is the flow chart of the utility model. The type and number of detection/sensors 21 to be deployed are selected according to the characteristics of the geographic environment, and the sensors are deployed intelligently; after deployment, the sensors are started and initialized; each sensor acquires target attribute information, extracts target features, performs a preliminary identification of the target and generates a target BPA; data fusion is then carried out according to the improved D-S evidence theory to generate the belief function; finally the target is identified through the decision logic rule. The decision rule is as follows:
Let

m(A1) = max{ m(Ai) : Ai ⊂ Θ },  m(A2) = max{ m(Ai) : Ai ⊂ Θ, Ai ≠ A1 },

and suppose the following conditions are satisfied:

m(A1) − m(A2) > ε1,  m(U) < ε2,  m(A1) > m(U);

then proposition A1 is the final result of the criterion, where ε1 and ε2 are set thresholds and U represents the uncertain proposition.
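A minimal sketch of how this decision rule could be applied to a fused BPA is given below; the proposition names, mass values and thresholds are assumptions for illustration only.

```cpp
#include <iostream>
#include <map>
#include <string>

// Apply the decision rule above to a fused BPA: take the proposition with the largest
// mass m(A1), require it to exceed the runner-up m(A2) by eps1, and require the mass
// m(U) on the uncertain proposition "U" to stay below eps2 and below m(A1).
std::string decide(const std::map<std::string, double>& bpa, double eps1, double eps2) {
    std::string best = "uncertain";
    double m1 = 0.0, m2 = 0.0;
    for (const auto& [prop, mass] : bpa) {
        if (prop == "U") continue;                       // skip the uncertainty proposition
        if (mass > m1) { m2 = m1; m1 = mass; best = prop; }
        else if (mass > m2) { m2 = mass; }
    }
    const double mU = bpa.count("U") ? bpa.at("U") : 0.0;
    if (m1 - m2 > eps1 && mU < eps2 && m1 > mU) return best;
    return "uncertain";                                  // the rule rejects a firm decision
}

int main() {
    // Hypothetical fused BPA over {person, vehicle, low-altitude target, U}.
    std::map<std::string, double> fused = {
        {"person", 0.72}, {"vehicle", 0.15}, {"low-altitude target", 0.05}, {"U", 0.08}};
    std::cout << decide(fused, 0.1, 0.1) << std::endl;   // prints "person"
    return 0;
}
```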
Fig. 3 shows the multi-sensor information fusion method of the utility model. The basic concepts of D-S evidence theory are first recalled. Basic probability assignment (BPA): let Θ be the frame of discernment; a function m: 2^Θ → [0,1] (2^Θ being the set of all subsets of Θ) satisfying

m(∅) = 0 and ∑_{A⊆Θ} m(A) = 1

is a basic probability assignment, and m(A) is the BPA of A, representing the degree of belief assigned directly to A. Belief function (Bel): with Θ as the frame of discernment, the function Bel: 2^Θ → [0,1],

Bel(A) = ∑_{B⊆A} m(B),

is called the belief function and represents the sum of the basic probabilities of all subsets of A. Plausibility function (Pl): with Θ as the frame of discernment, the function Pl: 2^Θ → [0,1],

Pl(A) = ∑_{B∩A≠∅} m(B) = 1 − Bel(Ā),

is called the plausibility function and represents the degree to which A is not negated; Bel(A) ≤ Pl(A) always holds. The interval [0, Bel(A)] expresses the support for proposition A being true, [Bel(A), Pl(A)] is the range within which A can neither be confirmed nor denied, [0, Pl(A)] expresses the degree to which A is not negated, and [Pl(A), 1] expresses the support for A being false. Evidence combination rule: for two pieces of evidence E1 and E2 under the frame of discernment Θ, where the BPA and focal elements of E1 are m1 and A1, …, Ak and those of E2 are m2 and B1, …, Br, the combination rule is

m(A) = (1 / (1 − K)) · ∑_{Ai∩Bj=A} m1(Ai) m2(Bj) for A ≠ ∅, with m(∅) = 0,

where

K = ∑_{Ai∩Bj=∅} m1(Ai) m2(Bj)

reflects the degree of conflict between the pieces of evidence and is called the conflict factor. The combination rule can also be generalized to the fusion of more than two pieces of evidence.
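The combination rule above can be sketched in code as follows, with propositions represented as subsets of a small frame of discernment; the class names and mass values are assumptions for illustration, and the sketch assumes the conflict factor K is strictly less than 1.

```cpp
#include <iostream>
#include <map>
#include <set>
#include <string>

using Proposition = std::set<std::string>;          // subset of the frame of discernment
using Bpa = std::map<Proposition, double>;          // basic probability assignment

// Intersection of two propositions.
static Proposition intersect(const Proposition& a, const Proposition& b) {
    Proposition out;
    for (const auto& x : a)
        if (b.count(x)) out.insert(x);
    return out;
}

// Dempster's rule: m(A) = (1 / (1 - K)) * sum over Ai ∩ Bj = A of m1(Ai) * m2(Bj),
// with K = sum over Ai ∩ Bj = ∅ of m1(Ai) * m2(Bj) the conflict factor (assumed < 1).
Bpa combine(const Bpa& m1, const Bpa& m2) {
    Bpa fused;
    double conflict = 0.0;
    for (const auto& [a, ma] : m1) {
        for (const auto& [b, mb] : m2) {
            const Proposition c = intersect(a, b);
            if (c.empty()) conflict += ma * mb;              // mass falling on the empty set
            else fused[c] += ma * mb;
        }
    }
    for (auto& [prop, mass] : fused) mass /= (1.0 - conflict);  // normalize by 1 - K
    return fused;
}

int main() {
    // Hypothetical BPAs from two sensors over the frame {person, vehicle}.
    Bpa m1 = {{{"person"}, 0.6}, {{"vehicle"}, 0.1}, {{"person", "vehicle"}, 0.3}};
    Bpa m2 = {{{"person"}, 0.5}, {{"vehicle"}, 0.2}, {{"person", "vehicle"}, 0.3}};
    for (const auto& [prop, mass] : combine(m1, m2)) {
        for (const auto& x : prop) std::cout << x << ' ';
        std::cout << "-> " << mass << '\n';                  // person mass rises to ~0.76
    }
    return 0;
}
```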
The information fusion method of the multi-sensor information fusion target identification system is described in detail by the following steps:
Step 1: the frame of discernment is Θ = {person, vehicle, low-altitude small target}; the sensor types comprise an acoustic array sensor, a vibration sensor, a visible light detector and an infrared thermal image detector, and target feature information 1, 2, 3 and 4 is obtained from the respective sensors;
Step 2: according to the preliminary identification result of each sensor, the BPAs of the 4 sensors over {person, vehicle, low-altitude small target, uncertain} are calculated, generating evidence bodies 1, 2, 3 and 4 respectively;
Step 3: evidence bodies 1, 2, 3 and 4 are fused according to the improved D-S evidence theory fusion rule, generating new evidence bodies 1 and 2 and then the target evidence body;
Step 4: the type of the identified target is determined from the fusion result according to the decision rule.
Experiments show that this information fusion algorithm has clear advantages over other fusion algorithms and markedly improves the target recognition rate.
Fig. 4 is the block diagram of the multi-target recognition model of the utility model. The multi-sensor information fusion target identification system can identify targets in multiple batches, and the implementation principle is the same as for single-target identification. The detection/sensors extract target features 1, …, n from target objects 1, …, n respectively, and target recognition results Q1, …, Qn are obtained following the single-target recognition process.
The multi-sensor information fusion target identification system is implemented in VC++ together with the OpenCV open-source library, version V1.00.00. Target identification based on multi-sensor information fusion is suitable for unattended three-dimensional monitoring systems in complex border and coastal defense environments, and can also be used in civil fields (such as medical diagnosis, traffic monitoring, fire monitoring, water quality monitoring, industrial control, ocean monitoring, autonomous driving, etc.). The system mainly comprises a target information data input module, an equipment parameter management module, an information fusion module, a fusion result display module and a decision judgment module.
The responsibilities of the functional modules are as follows:
The target information data input module acquires the target information data preliminarily identified by the sensors in the front-end and rear-end monitoring areas, together with the parameter information of each sensor. This information is stored in a database for device management and log management by the application. Each individual sensor is capable of performing a preliminary identification of the target on its own.
The equipment parameter management module manages the parameters of each sensor and implements functions such as communication with the sensors, work/sleep control, battery level readout and parameter configuration.
The information fusion module receives the target information data provided by the target information data input module, preprocesses the data, performs information fusion according to the improved D-S evidence theory method, and stores and outputs the fusion result.
The fusion result display module displays the first, second and third fusion results in tabular form, the table contents chiefly expressing the target's degree of belief numerically, and presents the final fusion result, such as the recognition rate, as a graph.
The decision judgment module evaluates the information fusion result according to the decision rule and generates the final target recognition result.
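To make the division of responsibilities concrete, the following hedged sketch expresses the five modules as C++ interfaces; every class and method name here is an assumption for illustration and is not disclosed by the utility model.

```cpp
#include <map>
#include <string>
#include <vector>

// Illustrative interfaces only; names and signatures are assumed, not from the patent.
struct SensorReport {                        // one sensor's preliminary identification
    int sensorId;
    std::map<std::string, double> bpa;       // BPA over {person, vehicle, low-altitude target, U}
};

class TargetDataInput {                      // target information data input module
public:
    virtual std::vector<SensorReport> collect() = 0;   // gather reports from both monitoring areas
    virtual ~TargetDataInput() = default;
};

class DeviceParameterManager {               // equipment parameter management module
public:
    virtual void configure(int sensorId, const std::map<std::string, std::string>& params) = 0;
    virtual void setSleeping(int sensorId, bool sleep) = 0;   // work / sleep control
    virtual double readBatteryLevel(int sensorId) = 0;
    virtual ~DeviceParameterManager() = default;
};

class InformationFusion {                    // information fusion module (improved D-S method)
public:
    virtual std::map<std::string, double> fuse(const std::vector<SensorReport>& reports) = 0;
    virtual ~InformationFusion() = default;
};

class FusionResultDisplay {                  // fusion result display module
public:
    virtual void showTable(const std::map<std::string, double>& fusedResult) = 0;
    virtual void showCurve(const std::vector<double>& recognitionRates) = 0;
    virtual ~FusionResultDisplay() = default;
};

class DecisionJudgment {                     // decision judgment module
public:
    virtual std::string decide(const std::map<std::string, double>& fusedResult) = 0;
    virtual ~DecisionJudgment() = default;
};
```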

Claims (1)

1. A target identification system based on multi-sensor information fusion, comprising a monitoring center, a rear-end monitoring area and a front-end monitoring area, characterized in that an information fusion target identification device is deployed at the monitoring center; a monitoring device consisting of an interconnected detection/sensor, portable photovoltaic system and portable battery is deployed in the rear-end monitoring area; the front-end monitoring area is likewise provided with a monitoring device; the monitoring device of the rear-end monitoring area is in wired communication with the monitoring device of the front-end monitoring area; and the monitoring device of the rear-end monitoring area is in wireless communication with the information fusion target identification device of the monitoring center.
CN201921152880.9U 2019-07-22 2019-07-22 Target identification system based on multi-sensor information fusion Active CN210270946U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201921152880.9U CN210270946U (en) 2019-07-22 2019-07-22 Target identification system based on multi-sensor information fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201921152880.9U CN210270946U (en) 2019-07-22 2019-07-22 Target identification system based on multi-sensor information fusion

Publications (1)

Publication Number Publication Date
CN210270946U (en) 2020-04-07

Family

ID=70011882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201921152880.9U Active CN210270946U (en) 2019-07-22 2019-07-22 Target identification system based on multi-sensor information fusion

Country Status (1)

Country Link
CN (1) CN210270946U (en)


Legal Events

Date Code Title Description
GR01 Patent grant