CN114419825A - High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera - Google Patents


Info

Publication number
CN114419825A
Authority
CN
China
Prior art keywords: point cloud, image, millimeter wave radar, camera
Prior art date
Legal status: Granted
Application number
CN202210315202.XA
Other languages
Chinese (zh)
Other versions
CN114419825B (en)
Inventor
孙琪皓
刘桂卫
崔庆国
张璇钰
王飞
王衍汇
Current Assignee
China Railway Design Corp
Original Assignee
China Railway Design Corp
Priority date
Filing date
Publication date
Application filed by China Railway Design Corp
Priority to CN202210315202.XA
Publication of CN114419825A
Application granted
Publication of CN114419825B
Legal status: Active


Classifications

    • G08B 13/19602 — Intruder alarms using television cameras; image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G01S 13/867 — Combination of radar systems with cameras
    • G06F 18/24323 — Pattern recognition; tree-organised classifiers
    • G06F 18/25 — Pattern recognition; fusion techniques
    • G06N 3/08 — Neural networks; learning methods
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G08B 13/181 — Intruder alarms using active radiation detection systems


Abstract

The invention discloses a high-speed rail perimeter intrusion monitoring device and method based on a millimeter wave radar and a camera, belonging to the technical field of high-speed rail perimeter security. The sensing part comprises an imaging millimeter wave radar and a camera. The information processing part receives point cloud data and image data from the sensing part; it first preprocesses both data streams and performs foreign-object feature recognition on the point cloud data and on the image data, then makes a preliminary judgment and identification of intruding foreign objects, and finally fuses the point cloud data and image data to judge and identify the intruding foreign objects a second time. The response part receives the output of the information processing part and triggers alarm and information-display actions accordingly. By fusing the sensing information of the imaging millimeter wave radar and an ultra-high-definition camera, the invention overcomes the limitations of any single monitoring means, enabling fast and accurate identification of intruding foreign objects such as pedestrians, animals and falling rocks in complex environments, with alarm information issued automatically.

Description

High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera
Technical Field
The invention belongs to the technical field of high-speed rail perimeter security and protection, and particularly relates to a high-speed rail perimeter intrusion monitoring device and method based on a millimeter wave radar and a camera.
Background
At present, the operating mileage of high-speed railways in China exceeds 40,000 kilometers, making China the country with the longest operating mileage, the highest operating speed and the most complex operating scenarios in the world. With the rapid growth of high-speed railway mileage, perimeter intrusion has become an important railway-safety problem that urgently needs to be solved. Potential threats to the high-speed rail perimeter fall into three main types: illegal intrusion, man-made destruction and natural destruction. Because high-speed rail stations and the areas along the track contain much important equipment, the required safety level is high, and any intrusion damage can cause serious economic and social consequences. At present, operation and maintenance departments still rely mainly on manual inspection to monitor perimeter intrusion; this method is time-consuming, labor-intensive and inefficient, and sudden, random perimeter intrusion events are difficult to prevent.
In recent years, relatively extensive research on railway perimeter intrusion monitoring technology and equipment has been carried out at home and abroad. Perimeter security systems were first developed and applied in Australia, Japan, the United States and elsewhere. Research institutions such as the Chinese Academy of Sciences and Fudan University have also conducted research on perimeter intrusion monitoring based on optical fiber, laser, image and other technologies. However, judging from test and application results, every single perimeter intrusion detection method has disadvantages: vibration optical fiber, leaky cable, optical-wave correlation detection and multi-dimensional vibration detection technologies have poor anti-interference capability and high false alarm rates; pulse electronic fences, tension fences, microwave arrays and laser radar have high construction and operating costs; and infrared detection and video monitoring adapt poorly to the environment. These measures cannot suit the complex environment along high-speed rail lines or meet the application requirements of "no missed alarms, few false alarms and low cost" for high-speed rail perimeter security. Therefore, an appropriate monitoring technology is urgently needed, and a high-speed railway perimeter intrusion monitoring and alarm device with high precision, a low false alarm rate, strong anti-interference capability and high cost-effectiveness should be developed to effectively improve the operational safety of high-speed railways.
Disclosure of Invention
The invention aims to solve the technical problems in the known technology by providing a high-speed rail perimeter intrusion monitoring device and method based on a millimeter wave radar and a camera. By fusing the sensing information of an imaging millimeter wave radar and an ultra-high-definition camera, it overcomes the limitations of any single monitoring means and meets the application requirements of the different scenes, illumination intensities and weather conditions found on the high-speed rail perimeter; based on a multi-source data fusion model and a deep learning algorithm, it rapidly and accurately identifies intruding foreign objects such as pedestrians, animals and falling rocks in complex environments and automatically issues alarm information.
The invention provides a high-speed rail perimeter intrusion monitoring device based on a millimeter wave radar and a camera, which at least comprises:
the sensing module comprises an imaging millimeter wave radar and a camera;
the information processing part receives the point cloud data and the image data of the sensing part, and first performs preprocessing and foreign-object feature recognition on the point cloud data and on the image data; it then makes a preliminary judgment and identification of intruding foreign objects; finally, the point cloud data and image data are fused, and the intruding foreign objects are judged and identified again;
and the response part receives the output result of the information processing part and performs actions of alarming and information display according to the result.
Preferably, the point cloud data and image data fusion processing proceeds as follows: the coordinate system of the millimeter wave radar is converted into the image coordinate system to achieve spatial fusion of point cloud data and image data; the data acquisition frequencies of the millimeter wave radar and the camera are both set to 20 Hz so that point cloud data and image data at corresponding moments can be fused; and, based on the principles of distance and spatial perspective transformation, the rectangular frame of a foreign-object target detected by the millimeter wave radar is projected onto the corresponding position of the image in two-dimensional image pixel coordinates, information of the same foreign-object target is matched, and fusion of the detected foreign-object feature information is achieved.
Preferably, the imaging millimeter wave radar operates in the 77 GHz band with a frequency-modulated continuous wave scheme, has an effective detection distance of 150 m and a maximum blind-zone range of 30 m, and is mounted 3.2 m above the ground; the camera is an 8-megapixel unit with an effective detection distance of 100 m in sunny weather and a maximum blind-zone range of 8 m, mounted 3.7 m above the ground.
Preferably, the information processing part comprises a point cloud preprocessing module, a point cloud intelligent processing module, an image preprocessing module, an image intelligent processing module, a data storage module, a data fusion module, a comprehensive intelligent detection module, a control module and a communication module.
The second purpose of the invention is to provide a high-speed rail perimeter intrusion monitoring method based on a millimeter wave radar and a camera, which comprises the following steps:
s1, arranging imaging millimeter wave radar and cameras on the periphery of the high-speed rail;
s2, establishing a point cloud database and an image database of different foreign body characteristics;
s3, collecting point cloud and image data in real time on site at the same frequency;
s4, preprocessing point cloud data and image data;
s5, identifying foreign body features based on point cloud data; the method specifically comprises the following steps: step one, determining whether foreign objects exist or not by adopting a constant false alarm detector; secondly, filtering false targets by adopting a target life cycle method; thirdly, estimating the motion parameters of the foreign object target by adopting a Kalman filtering method; fourthly, training a random forest classifier based on the characteristic information to realize automatic classification of the detected foreign object target;
s6, identifying foreign body features based on the image data; the method specifically comprises the following steps: identifying foreign object targets by using different types of foreign object characteristic information in S2 and adopting an algorithm combining an image interframe difference method and a background difference method, identifying effective foreign object target information in image data and simultaneously removing ineffective target information; the formula of the algorithm is as follows:
D_k(x, y) = 1, if |f_{k+1}(x, y) − f_k(x, y)| > T; otherwise D_k(x, y) = 0,
wherein f_k(x, y) and f_{k+1}(x, y) respectively represent the pixel values of the k-th frame and the (k+1)-th frame image, T is the binarization threshold of the differential image, and D_k(x, y) is the differential image pixel value;
s7, judging whether an invasive foreign body exists or not;
s8, fusing point cloud data and image data; the method specifically comprises the following steps: converting a coordinate system of the millimeter wave radar into an image coordinate system to realize spatial fusion of the point cloud data acquired at S5 and the image data acquired at S6; uniformly setting the data acquisition frequencies of the millimeter wave radar and the camera to be 20Hz, and realizing the fusion processing of point cloud data and image data at corresponding moments; based on the distance and space perspective transformation principle, projecting a foreign object target rectangular frame detected by a millimeter wave radar to a corresponding position of an image in a two-dimensional image pixel coordinate mode, and performing information matching of the same foreign object target to realize fusion of detected foreign object feature information;
s9, intelligently detecting foreign body targets based on the fusion model;
and S10, when the foreign matter is detected, alarming in real time and transmitting an alarm image to a high-speed rail safety control center, otherwise, not alarming.
Preferably, S2 is specifically: foreign objects of different materials and sizes are selected, a simulated high-speed rail perimeter intrusion test is carried out on site, point cloud data and image data are collected separately by the sensing part, foreign-object feature information is extracted with a deep learning algorithm, and a standard point cloud database and image database of perimeter-intrusion foreign objects are established and stored.
Preferably, S4 is specifically: first, the point cloud data and image data are segmented and the monitoring range is delimited; then the point cloud data from S3 are preprocessed, including point cloud filtering and feature enhancement, and the image data from S3 are preprocessed, including graying, filtering and histogram equalization.
Preferably, S7 is specifically: identification information is returned according to the foreign-object feature recognition results of S5 and S6; if at least one of S5 and S6 identifies a foreign object, an intruding foreign object is judged to exist, and if neither identifies a foreign object, no intruding foreign object is judged to exist.
Preferably, S9 is specifically: and establishing a foreign body feature fusion model based on the foreign body feature fusion information obtained in the step S8, highlighting the fused foreign body target by adopting a distance conversion method, comprehensively obtaining the type, distance and movement speed information of the foreign body target and tracking the target.
Preferably, S10 is specifically: when S9 detects the invading foreign body, it sends alarm signal to the high-speed rail safety control center in real time through the control box communication module, and returns the image information triggering alarm to the high-speed rail safety control center.
The invention has the advantages and positive effects that:
the invention utilizes an imaging millimeter wave radar and an ultra-high-definition camera to realize a high-speed rail perimeter intrusion monitoring device and method based on multi-source perception information fusion. The imaging millimeter wave radar has the advantages of all weather, interference resistance and high precision, the ultrahigh-definition camera is more superior in the aspects of intelligence, high efficiency and accuracy, the imaging millimeter wave radar and the ultrahigh-definition camera perception information are organically fused, and the real-time accurate detection of the invasion foreign matters such as pedestrians, animals and rockfall is realized based on a deep learning algorithm. Compared with the prior art, the detection efficiency, accuracy and precision of the invention are greatly improved, the application requirements of different scenes, illumination intensity and weather conditions of the high-speed rail perimeter can be met, all-weather, full-coverage, high-precision and intelligent perimeter intrusion monitoring is effectively realized, the method is scientific and feasible, the device is simple to install and arrange, the cost investment is controllable, the operation safety level of the high-speed railway can be greatly improved, and the invention has very wide application and popularization prospects.
Drawings
FIG. 1 is a schematic diagram of the structure of the apparatus in the preferred embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a sensing portion according to a preferred embodiment of the present invention;
FIG. 3 is a flow chart of a preferred embodiment of the present invention;
FIG. 4 is a diagram of the arrangement of monitoring devices on the perimeter of a high speed rail in a preferred embodiment of the present invention;
FIG. 5 is a flowchart of a method for detecting an intruding foreign object based on an imaging millimeter wave radar;
FIG. 6 is a schematic diagram of a CA-CFAR detection algorithm;
FIG. 7 is a comparison graph of foreign object target detection results of an actual scene and a point cloud;
FIG. 8 is a flow chart of the method for detecting the target of the invading foreign body based on the ultra-high definition camera of the invention;
FIG. 9 is a comparison of actual scene and image-based foreign object detection results;
fig. 10 is a flowchart of an intelligent detection method for an invading foreign body target based on a fusion model.
Wherein: 1. monitoring the vertical rod; 2. an ultra-high definition camera; 3. imaging a millimeter wave radar; 4. a control box; 5. an intelligent processor; 6. a data transmission line.
Detailed Description
In order to further understand the contents, features and effects of the present invention, the following embodiments are illustrated and described in detail with reference to the accompanying drawings:
referring to fig. 1 to 10, the technical solution of the present invention is:
a high-speed rail perimeter intrusion monitoring device based on a millimeter wave radar and a camera comprises a sensing part, an information processing part and a response part. As shown in fig. 1 and 2, the sensing part and the information processing part are both installed on the monitoring upright rod 1, and the response part is arranged in a high-speed rail safety control center; the sensing part is connected with the information processing part through a data transmission line 6 so as to realize data interaction, and the information processing part is connected with the response part through the data transmission line 6.
The sensing part comprises an imaging millimeter wave radar 3 and an ultra-high-definition camera 2, which are connected in parallel and mounted on separate brackets of the monitoring upright rod 1; the combined structure is shown in fig. 2. The imaging millimeter wave radar adopts a frequency-modulated continuous wave scheme in the 77 GHz band, achieving high-precision measurement of targets with a small antenna; it has a simple structure, stable performance and low cost, with an effective detection distance of about 150 m and a maximum blind-zone range of 30 m. The ultra-high-definition camera is an 8-megapixel unit with backlight compensation, strong-light suppression and infrared fill-light functions; it is dustproof, waterproof and highly reliable, with an effective detection distance of about 100 m in sunny weather and a maximum blind-zone range of 8 m. The upright rod is 4 m tall, the camera bracket sits at 3.7 m and the millimeter wave radar bracket at 3.2 m; the camera is mounted slightly higher than the millimeter wave radar to obtain a larger field of view and a longer detection distance.
The information processing part comprises an intelligent processor 5 housed in a control box 4 on the monitoring upright rod. The intelligent processor comprises a point cloud preprocessing module, a point cloud intelligent processing module, an image preprocessing module, an image intelligent processing module, a data storage module, a data fusion module, a comprehensive intelligent detection module, a control module and a communication module, all arranged in parallel. The control box is mounted about 2.5 m above the ground on the upright rod to protect it from damage by animals or vandals.
The response part comprises a real-time alarm system and a real-time display screen. The two are connected in series and are arranged in a high-speed rail safety control center.
The imaging millimeter wave radar and the ultra-high-definition camera are respectively connected with the intelligent processor through data transmission lines, and the intelligent processor is connected with the bus through a data connection line and is connected with the real-time alarm system and the real-time display screen through the bus.
The high-speed rail perimeter intrusion monitoring device is powered by an external power supply.
In order to achieve the above object, as another aspect of the present invention, there is also provided a high-speed rail perimeter intrusion monitoring method based on the fusion of imaging millimeter wave radar and ultra-high-definition camera sensing information, comprising: S1, arranging imaging millimeter wave radar and ultra-high-definition camera sensing equipment on the high-speed rail perimeter; S2, establishing a point cloud database and an image database of different foreign-object characteristics; S3, collecting point cloud data and image data on site in real time at the same frequency; S4, preprocessing the point cloud data and image data; S5, performing foreign-object feature recognition based on the point cloud data; S6, performing foreign-object feature recognition based on the image data; S7, judging from the results of S5 and S6 whether an intruding foreign object exists; S8, fusing the point cloud data and image data; S9, intelligently detecting foreign-object targets based on the fusion model; and S10, when a foreign object is found, alarming in real time and transmitting the alarm image to the high-speed rail safety control center, and otherwise not alarming.
The S1 specifically includes: erecting upright posts at a road shoulder at one side of the high-speed rail according to a certain distance, installing an imaging millimeter wave radar and an ultra-high definition camera, and adjusting and fixing the orientation direction, angle and initial parameters of each device so as to cover the whole range of the rail and the road shoulders at two sides, thereby realizing the full-coverage and dead-angle-free monitoring of the section. The front end sensing equipment is connected with the control box through a transmission line.
The S2 specifically includes: foreign matters of different materials and sizes are selected, a simulation test of perimeter intrusion of a high-speed rail is carried out on site, massive point cloud data and image data are collected respectively by using front-end sensing equipment installed in S1, foreign matter characteristic information is extracted through a deep learning algorithm, a perimeter intrusion foreign matter standard point cloud database and an image database are established and stored in an intelligent processor in a control box.
The S3 specifically includes: and (4) acquiring point cloud and image data in a monitoring range in real time at the same frequency by using the point cloud and image acquisition device installed in the S1, and uploading the point cloud and image data to an intelligent processor in the control box in real time through a transmission line.
The S4 specifically includes: a preprocessing algorithm module in an intelligent processor is utilized to firstly segment point cloud and image data, define a monitoring range and reduce invalid data generated by non-dangerous targets and non-concerned areas as far as possible. And preprocessing the point cloud data acquired in real time by S3, including point cloud filtering and feature enhancement. And preprocessing the image data acquired in real time by S3, including graying, filtering and histogram equalization.
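The image-side preprocessing chain of S4 (graying, filtering, histogram equalization) can be sketched with NumPy alone. This is a minimal stand-in, not the patented module: the BT.601 luma weights and the 3×3 mean-filter kernel are conventional choices assumed here, not values given in the patent.

```python
import numpy as np

def preprocess_frame(rgb):
    """Grayscale -> 3x3 mean filter -> histogram equalization (uint8 frame)."""
    # Graying with ITU-R BT.601 luma weights.
    gray = (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1]
            + 0.114 * rgb[..., 2]).astype(np.uint8)
    # 3x3 mean filter via a padded neighborhood sum (simple noise smoothing).
    p = np.pad(gray.astype(np.float32), 1, mode="edge")
    smooth = sum(p[i:i + gray.shape[0], j:j + gray.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
    smooth = smooth.astype(np.uint8)
    # Histogram equalization: remap intensities via the cumulative histogram.
    hist = np.bincount(smooth.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.round((cdf - cdf_min) / max(cdf[-1] - cdf_min, 1) * 255)
    return lut.astype(np.uint8)[smooth]
```

Equalization stretches the intensity distribution, which helps the later difference-based detection stay stable under uneven illumination along the track.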
The S5 specifically includes: step one, determining whether foreign objects exist or not by adopting a constant false alarm detector; secondly, filtering false objects which appear in a very short time and disappear quickly by adopting an object life cycle method; thirdly, accurately estimating the motion parameters of the foreign object target by adopting a Kalman filtering method; and fourthly, training a random forest classifier based on the characteristic information to realize automatic classification of the detected foreign object target.
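The third step's Kalman filtering of target motion parameters can be illustrated with a minimal constant-velocity filter over planar position measurements. The state layout, noise levels, and the 0.05 s interval (matching the 20 Hz acquisition rate mentioned elsewhere in the patent) are illustrative assumptions, not the patented design.

```python
import numpy as np

class ConstantVelocityKalman:
    """Kalman filter with state [x, y, vx, vy] for tracking a radar target."""

    def __init__(self, dt=0.05, q=0.1, r=0.5):
        self.x = np.zeros(4)                    # state estimate
        self.P = np.eye(4) * 10.0               # state covariance
        self.F = np.eye(4)                      # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                   # we observe position only
        self.Q = np.eye(4) * q                  # process noise
        self.R = np.eye(2) * r                  # measurement noise

    def step(self, z):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with position measurement z = [x, y].
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x
```

Fed a sequence of noisy radar positions, the filter yields smoothed position plus a velocity estimate, which is the motion-parameter output this step needs before classification.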
The S6 specifically includes: and identifying the foreign object target by using the characteristics of different types of foreign objects extracted in the S2 and adopting an algorithm principle of combining an image interframe difference method and a background difference method, identifying effective foreign object target information in image data, and simultaneously removing ineffective target information such as passing trains, inspection vehicles, inspection workers and the like. The principle is shown in the following formula:
D_k(x, y) = 1, if |f_{k+1}(x, y) − f_k(x, y)| > T; otherwise D_k(x, y) = 0,
wherein f_k(x, y) and f_{k+1}(x, y) respectively represent the pixel values of the k-th frame and the (k+1)-th frame image, T is the binarization threshold of the differential image, and D_k(x, y) is the differential image pixel value.
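The combination of the inter-frame difference with a background difference can be sketched as follows, using NumPy only. This is a hedged illustration: the patent gives one binarization threshold T and does not specify how the two masks are combined or what the background model is, so the separate thresholds, the AND combination, and the static background frame below are assumptions.

```python
import numpy as np

def detect_motion(frame_k, frame_k1, background, t_diff=25, t_bg=40):
    """Combine inter-frame difference with background subtraction.

    A pixel is flagged as foreground only if it changed between consecutive
    frames AND differs from the background model, which suppresses both
    static clutter and slow illumination drift.
    """
    f0 = frame_k.astype(np.int16)      # widen so subtraction cannot wrap
    f1 = frame_k1.astype(np.int16)
    bg = background.astype(np.int16)
    inter = np.abs(f1 - f0) > t_diff   # D_k(x, y) from the formula above
    sub = np.abs(f1 - bg) > t_bg       # background-difference mask
    return (inter & sub).astype(np.uint8)
```

In practice a moving train would trigger both masks, so the invalid-target removal described above would still rely on the feature database from S2 rather than on thresholding alone.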
The S7 specifically includes: foreign-object targets are identified by S5 and S6 running in parallel, and identification information is returned; if at least one of S5 and S6 identifies a foreign object, an intruding foreign object is judged to exist, and if neither S5 nor S6 identifies a foreign object, no intruding foreign object is judged to exist.
The S8 specifically includes: and converting the coordinate system of the millimeter wave radar into an image coordinate system, and realizing spatial fusion of the point cloud data acquired at S5 and the image data acquired at S6. The data acquisition frequencies of the millimeter wave radar and the camera are uniformly set to be 20Hz, and the fusion processing of point cloud data and image data at corresponding moments is realized. Based on the distance and space perspective transformation principle, a foreign object target rectangular frame detected by a millimeter wave radar is projected to a corresponding position of an image in a two-dimensional image pixel coordinate mode, information matching of the same foreign object target is carried out, and fusion of detected foreign object feature information is achieved.
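The coordinate-system conversion in S8 amounts to a standard extrinsic-plus-intrinsic projection from the radar frame into pixel coordinates. The sketch below assumes a pinhole camera model with a rotation R, translation t, and intrinsic matrix K obtained from an offline radar-camera calibration; the patent does not specify the calibration procedure, so these inputs are assumptions.

```python
import numpy as np

def project_radar_to_image(points_radar, R, t, K):
    """Project radar points (N, 3) into pixel coordinates (N, 2).

    R (3x3) and t (3,) transform points from the radar frame into the
    camera frame; K (3x3) is the camera intrinsic matrix.
    """
    cam = points_radar @ R.T + t        # radar frame -> camera frame
    uvw = cam @ K.T                     # camera frame -> homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]     # perspective divide -> (u, v)
```

Projecting the corners of a radar-detected bounding box with this function places the rectangle at the corresponding image position, after which the same-target information matching described above can proceed in pixel space.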
The S9 specifically includes: and establishing a foreign body feature fusion model based on the foreign body feature fusion information obtained in the step S8, highlighting the fused foreign body target by adopting a distance conversion method, comprehensively obtaining information such as the type, distance and movement speed of the foreign body target, and tracking the target.
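The "distance conversion method" used in S9 to highlight the fused target is, in essence, a distance transform of the detection mask: interior pixels of the target receive large values and so "light up" toward the target's center. The brute-force sketch below (adequate only for small masks; real systems would use an optimized two-pass transform) is an illustrative stand-in, not the patented algorithm.

```python
import numpy as np

def distance_highlight(mask):
    """Brute-force Euclidean distance transform of a small binary mask.

    Each foreground pixel gets its distance to the nearest background
    pixel, so the fused target's interior is emphasized most strongly
    at its center.
    """
    ys, xs = np.nonzero(mask == 0)            # background coordinates
    if len(ys) == 0:                          # no background: all infinite
        return np.full(mask.shape, np.inf)
    out = np.zeros(mask.shape, dtype=np.float64)
    fy, fx = np.nonzero(mask)
    for y, x in zip(fy, fx):                  # distance to nearest background
        out[y, x] = np.sqrt(((ys - y) ** 2 + (xs - x) ** 2).min())
    return out
```

The peak of the highlighted region gives a stable point to anchor the target's type, distance, and speed information for tracking across frames.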
The S10 specifically includes: when S9 detects an intruding foreign object such as a pedestrian, animal or rockfall, an alarm signal is sent in real time to the high-speed rail safety control center through the communication module of the control box, and the image that triggered the alarm is transmitted back to the center; after confirmation, staff take timely measures to eliminate the potential safety hazard.
A high-speed rail perimeter intrusion monitoring method based on a millimeter wave radar and a camera comprises the following steps:
step S1: and imaging millimeter wave radars and ultra-high definition camera sensing equipment are arranged on the periphery of the high-speed rail.
The front-end sensing equipment adopted in the invention comprises an imaging millimeter wave radar and an ultra-high-definition camera. Before the equipment is installed and erected, initial parameters such as the detection viewing angle and acquisition frequency of the millimeter wave radar and the focal length and acquisition frequency of the camera are first set, and the orientation and pitch angle of both devices are adjusted during installation; through repeated debugging, the most suitable installation position and equipment attitude are determined to achieve the best monitoring effect. As shown in figure 4, the vertical poles carrying the monitoring equipment are erected in a staggered manner on the road shoulders on both sides of the high-speed rail, with a spacing of 100 m between adjacent poles, so that the devices cover each other's blind areas and achieve full-coverage, dead-angle-free monitoring of the track and the shoulders on both sides.
Step S2: and establishing a point cloud database and an image database of different foreign body characteristics.
In the invention, a perimeter intrusion comprehensive test field is established in advance in a real high-speed rail operation and maintenance scene; using step S1, millimeter wave radar and ultra-high-definition camera sensing equipment are arranged in the test field, and each piece of equipment independently carries out perimeter foreign object intrusion simulation tests during maintenance-window ("skylight") periods, so as to collect a large amount of point cloud and image sample data respectively. Typical foreign objects such as pedestrians, animals and rockfall are selected, and field simulation tests are carried out under various environmental background conditions and different intrusion modes, covering both negative samples in the normal operation state and positive samples in the simulated foreign object intrusion state, providing a basis for the development of the subsequent foreign object detection algorithm and for comparative data analysis.
The point cloud database is built on a deep-learning VGG model: 8-dimensional characteristic information reflecting foreign object targets is extracted from the point cloud data, including the number of target point clouds, the radial distance, the average height, the length and width of the minimum rectangular target frame, the relative movement speed, the azimuth angle and the signal-to-noise ratio; the characteristic foreign object data samples are labeled, and negative samples of invalid targets such as passing trains and inspection vehicles are labeled as well, forming a point cloud database containing different foreign object characteristic information, which is stored in the storage module of the intelligent processor.
The image database is built on a deep-learning VGG model that extracts characteristic information reflecting foreign object targets in images, including contour boundaries, sizes, distances, colors, brightness, edges, textures and pixel counts; typical foreign object data samples such as pedestrians, animals and rockfall are labeled, and negative samples of invalid targets such as passing trains and inspection vehicles are labeled as well, forming an image database containing different foreign object characteristic information, which is stored in the storage module of the intelligent processor.
Typical foreign matter includes pedestrians, animals, rockfall, etc. of different forms, sizes, colors. The foreign matter invasion mode comprises entering the monitoring area at different distances, speeds and angles, and the entering mode comprises jumping, falling and crossing. The environmental background conditions comprise sunny days, rainstorm days, snowy days, dust-raising days, thick fog days and dark night.
Step S3: and point cloud and image data are collected in real time on site at the same frequency.
In the invention, the millimeter wave radar and the camera which are installed in the step S1 are used for respectively collecting point cloud and image data.
The method for collecting the point cloud data comprises the following steps: before the millimeter wave radar starts working, the sampling frequency is set to be 20Hz, the detection visual angle and the initial sampling time are set, the attitude of the equipment is adjusted and fixed, point cloud data covering the whole range of the rail and the shoulders on two sides are collected in real time, and the point cloud data are transmitted to a point cloud preprocessing module of an intelligent processor in real time.
The method for acquiring the image data comprises the following steps: before the camera starts to work, the sampling frequency is set to be 20Hz, the initial sampling time identical to that of a millimeter wave radar is set based on real time obtained through networking, the same-frequency sampling is achieved, the posture of equipment is adjusted and fixed, image data covering the whole range of a rail and the whole range of road shoulders on two sides are collected in real time, and the image data are transmitted to an image preprocessing module of an intelligent processor in real time.
Step S4: and (4) point cloud and image data preprocessing.
Point cloud and image data preprocessing are carried out respectively. The point cloud and image data acquired in step S3 are segmented with the module's built-in segmentation algorithm so that the monitoring range only contains the area of concern, i.e. the track and the road shoulders on both sides.
After the point cloud segmentation is finished, the point cloud preprocessing module automatically and quickly processes the point cloud data with a built-in algorithm. A bilateral filtering method is adopted to smooth small-scale fluctuations and noise and to perform feature enhancement, eliminating systematic errors caused by the instrument as well as isolated and scattered points caused by object characteristics, thereby improving data accuracy. The specific steps include:

First, for any data point $p_i$, determine its $k$-neighborhood points $p_j \in N(p_i)$.

Second, compute the parameter $d_c = \lVert p_i - p_j \rVert$ of the bilateral filtering spatial-domain weight function $w_c$ and the parameter $d_s = \langle n_i,\, p_i - p_j \rangle$ of the frequency-domain weight function $w_s$, where $n_i$ and $n_j$ are the normal vectors of the points.

Third, compute $w_c$ and $w_s$ respectively using the formulas:

$$w_c(d_c) = e^{-d_c^2 / 2\sigma_c^2}, \qquad w_s(d_s) = e^{-d_s^2 / 2\sigma_s^2}$$

where $\sigma_c$ and $\sigma_s$ are the standard deviations of the spatial-domain and frequency-domain weight functions, respectively.

Fourth, compute the bilateral filtering factor $\alpha$ according to the formula:

$$\alpha = \frac{\sum_{p_j \in N(p_i)} w_c(d_c)\, w_s(d_s)\, d_s}{\sum_{p_j \in N(p_i)} w_c(d_c)\, w_s(d_s)}$$

Fifth, compute the updated data point after filtering using the formula:

$$p_i' = p_i + \alpha\, n_i$$
After the image segmentation is finished, the image preprocessing module automatically and quickly processes image data by using a built-in algorithm, grays the image by using a weighted average value method, suppresses noise by using a bilateral filtering method, enables the image to tend to be smooth, and improves the image contrast by using a histogram equalization method. The method eliminates irrelevant information in the image by utilizing the algorithm, enhances the detectability of useful real information, and simplifies the data to the maximum extent so as to improve the reliability of image feature matching and identification.
The image data graying processing comprises the following steps: first, extract the R, G, B values of each pixel; second, calculate the gray value of the pixel with the weighted average algorithm, namely Gray = R·a + G·b + B·c, where a, b and c are 0.3, 0.59 and 0.11 respectively; third, replace the original R, G, B values of the pixel with the calculated gray value.
The bilateral filtering method is consistent with the point cloud preprocessing.
The image data histogram equalization processing includes: first, determine the gray levels $r_k\ (k = 0, 1, \ldots, L-1)$ of the original image; second, calculate the original histogram using the formula

$$p(r_k) = \frac{n_k}{N}$$

where $N$ is the total number of pixels and $n_k$ is the number of pixels at gray level $r_k$; third, calculate the cumulative histogram of the original image using the formula

$$s_k = \sum_{j=0}^{k} p(r_j)$$

fourth, calculate the pixel mapping using the formula

$$t_k = \operatorname{round}\!\big[(g_{\max} - g_{\min})\, s_k + g_{\min}\big]$$

where $g_{\max}$ and $g_{\min}$ are the maximum and minimum gray values of the original image, respectively; fifth, use the mapping relation to calculate the equalized histogram, mapping each pixel to a new pixel.
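A minimal sketch of the graying and histogram-equalization steps, assuming 8-bit images with 256 gray levels and the rounding-based mapping described above:

```python
import numpy as np

def grayscale(rgb):
    # Weighted-average graying: Gray = 0.3*R + 0.59*G + 0.11*B
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (0.3 * r + 0.59 * g + 0.11 * b).astype(np.uint8)

def equalize(gray):
    # Steps 1-2: original histogram p(r_k) = n_k / N over 256 levels
    hist = np.bincount(gray.ravel(), minlength=256) / gray.size
    # Step 3: cumulative histogram s_k
    cdf = np.cumsum(hist)
    # Step 4: pixel mapping t_k = round((g_max - g_min) * s_k + g_min)
    g_min, g_max = int(gray.min()), int(gray.max())
    mapping = np.round((g_max - g_min) * cdf + g_min).astype(np.uint8)
    # Step 5: map every pixel through t_k
    return mapping[gray]
```

Pixels of the dominant gray level are spread toward the upper end of the range, which is what raises the contrast of a low-contrast frame.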
Step S5: and identifying foreign objects based on the point cloud data.
The foreign object target recognition based on the point cloud data is carried out by the point cloud intelligent processing module, as shown in fig. 5, the steps include: step one, determining whether foreign objects exist or not by adopting a constant false alarm detector; secondly, filtering false objects which appear in a very short time and disappear quickly by adopting an object life cycle method; thirdly, accurately estimating the motion parameters of the foreign object target by adopting a Kalman filtering method; and fourthly, training a random forest classifier based on the characteristic information to realize automatic classification of the detected foreign object target.
Whether a foreign object exists is judged by adopting a constant false alarm detector, and the method specifically comprises the following steps: firstly, determining a dynamic threshold value capable of changing in a self-adaptive manner by adopting a CA-CFAR detection algorithm; and step two, comparing the input end signal with the threshold value, if the input end signal exceeds the threshold value, a foreign object target exists, otherwise, no foreign object target exists.
The principle of the CA-CFAR detection method is as follows: the background power is estimated using the average of the sampled data over a reference window, resulting in a varying detection threshold. As shown in fig. 6, D is the cell under test, the cells on either side of it are guard cells, X and Y are the leading and lagging reference windows, and K is the threshold scaling factor. The detailed implementation steps include: the low-frequency signal is first detected from the input signal with a detector; the reference cell data are then summed and averaged by the processing unit to obtain the background estimate Z = (X + Y)/2; the dynamic detection threshold T = K·Z is obtained through the threshold scaling factor K; finally the cell under test D is compared with T: if D > T, the detection result is that a foreign object target exists, otherwise no foreign object target exists, and the result is output.
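A sketch of CA-CFAR detection on a one-dimensional power profile; the window sizes and scaling factor below are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def ca_cfar(signal, num_ref=8, num_guard=2, k=3.0):
    """Cell-averaging CFAR: flag cells whose power exceeds K times the
    average of the surrounding reference cells (guard cells excluded)."""
    n = len(signal)
    detections = np.zeros(n, dtype=bool)
    half = num_ref // 2
    for d in range(half + num_guard, n - half - num_guard):
        lead = signal[d - num_guard - half : d - num_guard]          # X window
        lag = signal[d + num_guard + 1 : d + num_guard + 1 + half]   # Y window
        z = (lead.sum() + lag.sum()) / num_ref    # background estimate Z
        t = k * z                                 # dynamic threshold T = K*Z
        detections[d] = signal[d] > t
    return detections
```

Because the threshold adapts to the local noise estimate, a strong return stands out even when the clutter level drifts along the range profile.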
A target life cycle method is adopted to filter false targets that appear and disappear within a very short time. The life cycle of an intruding target is divided into three phases: a formation phase, a persistence phase, and a disappearance phase. Filtering is performed mainly through two parameters: the occurrence count A, which records how many times the same foreign object has been detected, and the loss count B, which records how many consecutive times it has been missed. The thresholds of A and B are determined through field tests and are set before the device enters formal operation. A false foreign object target appears briefly and disappears quickly, so its short duration cannot satisfy the A threshold and it never enters the formation phase; conversely, if a strongly fluctuating real foreign object is lost only briefly, B does not reach the disappearance threshold during the persistence phase, so the foreign object is still considered valid and stable tracking continues.
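The A/B counting logic above can be sketched as follows; the threshold values are placeholders, since the patent determines them through field tests:

```python
class TargetLifeCycle:
    """Life-cycle filter: a target is confirmed only after being detected
    A_THRESH times, and dropped only after B_THRESH consecutive misses.
    Both thresholds are assumed values for illustration."""
    A_THRESH = 3   # detections needed to pass the formation phase
    B_THRESH = 5   # consecutive misses needed to declare disappearance

    def __init__(self):
        self.hits = 0       # occurrence count A
        self.misses = 0     # consecutive-loss count B
        self.confirmed = False

    def update(self, detected):
        if detected:
            self.hits += 1
            self.misses = 0
            if self.hits >= self.A_THRESH:
                self.confirmed = True    # persistence phase
        else:
            self.misses += 1
            if self.misses >= self.B_THRESH:
                self.confirmed = False   # disappearance phase
        return self.confirmed
```

A one-frame flicker never reaches the A threshold, while a confirmed target survives a one-frame dropout, matching the behavior the text describes.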
The extended Kalman filter applies the ideas of the Kalman filtering algorithm to a nonlinear system: based on the assumption that any system obeying physical laws evolves continuously, the current state is predicted and estimated from measurements of the previous state. The concrete steps are as follows: first, calculate the predicted mean of the current state from the optimal estimate of the previous state and the control quantity to be applied; second, calculate the covariance of the predicted value; third, calculate the Kalman gain from the measured value, the covariance of the predicted value, the covariance of the measurement, and the state proportionality coefficient; fourth, calculate the state estimate through the Kalman gain.
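The four steps map directly onto the standard Kalman recursion; the sketch below shows the linear case and omits the Jacobian linearization that the extended filter adds around the current estimate:

```python
import numpy as np

def kalman_step(x, P, z, F, B, u, H, Q, R):
    """One predict/update cycle of a (linear) Kalman filter, mirroring
    the four steps in the text. x: state, P: covariance, z: measurement."""
    # Step 1: predicted mean from the previous optimum and applied control
    x_pred = F @ x + B @ u
    # Step 2: covariance of the prediction
    P_pred = F @ P @ F.T + Q
    # Step 3: Kalman gain from prediction and measurement covariances
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Step 4: state estimate through the Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For the tracking use here, the state would hold the foreign object's position and velocity, with F encoding a constant-velocity motion model.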
The specific implementation steps of training the random forest classifier based on the feature information are as follows: first, the extracted 8-dimensional point cloud feature information is standardized with the min-max (maximum value) method to eliminate the influence of inconsistent numerical ranges and dimensions between feature data, using the formula:

$$x' = \frac{x - x_{\min}}{x_{\max} - x_{\min}}$$

where $x'$ is the normalized value of the feature $x$, and $x_{\min}$ and $x_{\max}$ are respectively the minimum and maximum of the feature $x$.
Second, the C4.5 algorithm is selected to construct the decision tree model, with the information gain rate as the criterion for selecting branches, using the following formulas:

$$H(S) = -\sum_{i} p_i \log_2 p_i$$

$$\mathrm{SplitInfo}(x) = -\sum_{j} \frac{|S_j|}{|S|} \log_2 \frac{|S_j|}{|S|}$$

$$\mathrm{Gain}(x) = H(S) - \sum_{j} \frac{|S_j|}{|S|}\, H(S_j)$$

$$\mathrm{GainRatio}(x) = \frac{\mathrm{Gain}(x)}{\mathrm{SplitInfo}(x)}$$

where $H(S)$ is the entropy value of the sample set, $p_i$ is the probability of the $i$-th class, $\mathrm{SplitInfo}(x)$ is the splitting information of the feature $x$, $|S_j|$ is the number of samples of the $j$-th sub-dataset, $|S|$ is the total number of samples of the original dataset, $\mathrm{Gain}(x)$ is the sample set information gain after splitting on the feature $x$, and $\mathrm{GainRatio}(x)$ is the information gain rate after splitting on $x$.
And thirdly, setting parameters of a random forest, and setting the number of subtrees as 100, so as to ensure the stability of the point cloud classification result and relatively high training operation speed.
Based on the method, effective foreign object targets are identified and invalid target information such as passing trains, inspection vehicles and inspection workers is eliminated; the comprehensive classification accuracy for point cloud data exceeds 95%. The comparison between the actual scene and the point cloud identification result is shown in fig. 7.
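The gain-ratio criterion used to grow each C4.5 decision tree can be sketched as follows, in a simplified form for a discrete feature (continuous features would first be binned at candidate thresholds):

```python
import numpy as np

def entropy(labels):
    # H(S) = -sum_i p_i * log2(p_i)
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(feature, labels):
    """C4.5 splitting criterion: information gain divided by the
    split information of the feature."""
    h_before = entropy(labels)
    values, counts = np.unique(feature, return_counts=True)
    weights = counts / counts.sum()
    # Gain(x): entropy reduction after splitting the set on x
    h_after = sum(w * entropy(labels[feature == v])
                  for v, w in zip(values, weights))
    gain = h_before - h_after
    # SplitInfo(x) = -sum_j (|S_j|/|S|) * log2(|S_j|/|S|)
    split_info = -np.sum(weights * np.log2(weights))
    return gain / split_info if split_info > 0 else 0.0
```

At each node the feature with the highest gain ratio is chosen; the random forest then aggregates 100 such trees, each grown on a bootstrap sample of the normalized 8-dimensional features.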
Step S6: foreign object identification based on image data.
Foreign object target recognition based on image data is carried out in the image intelligent processing module; the flow is shown in fig. 8. Considering the complex background of the high-speed rail perimeter environment, the recognition adopts an algorithm combining the image inter-frame difference method and the background difference method, whose principle is given by the following formula:

$$D_k(x, y) = \begin{cases} 1, & \left| f_{k+1}(x, y) - f_k(x, y) \right| > T \\ 0, & \text{otherwise} \end{cases}$$

where $f_k(x, y)$ and $f_{k+1}(x, y)$ respectively represent the pixel values of the $k$-th and $(k+1)$-th frame images, $T$ is the binarization threshold of the difference image, and $D_k(x, y)$ is the difference image pixel value.
The concrete implementation steps include: first, the first 100 frames of the video sequence are differenced pairwise to obtain a difference image, i.e. a foreground image containing the foreign object target, and the difference image is then subtracted from the current frame image to obtain a background image model without the foreign object target; second, after differencing the current frame image against the background image to obtain the foreground image, mathematical morphology and binarization operations are applied to obtain the target area containing the foreign object; third, the effective foreign object target and its information are identified with the Adaboost algorithm, based on the image foreign object feature information extracted in the field tests of step S2; fourth, the difference image between the current frame and the background image is binarized again, and each pixel of the image is judged independently with a background mask method: if the pixel value is 1 it is a foreground pixel and the background is updated with the pixel value of the current background image; if the pixel value is 0 it is a background pixel and the pixel value of that point is kept unchanged, yielding an instantaneous background image. The instantaneous background image and the current background image are then added with weights to update the background model in real time, using the following formula:

$$B_{k+1}(x, y) = (1 - \beta)\, B_k(x, y) + \beta\, IB_k(x, y)$$

where $B_k(x, y)$ is the $k$-th frame background image, $IB_k(x, y)$ is the $k$-th frame instantaneous background image, and $\beta$ is the background update rate. The comparison between the actual scene and the image recognition result is shown in fig. 9.
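The core differencing and background-update operations reduce to a few array operations; the threshold T and update rate β below are illustrative values:

```python
import numpy as np

def frame_difference(f_k, f_k1, t=25):
    # D_k = 1 where |f_{k+1} - f_k| > T, else 0
    # (cast to int first so uint8 subtraction cannot wrap around)
    return (np.abs(f_k1.astype(int) - f_k.astype(int)) > t).astype(np.uint8)

def update_background(bg, instant_bg, beta=0.05):
    # B_{k+1} = (1 - beta) * B_k + beta * IB_k, beta = update rate
    return (1 - beta) * bg + beta * instant_bg
```

A small β makes the background model adapt slowly, so a transient foreign object stays visible in the foreground while gradual illumination changes are absorbed.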
The main process of identifying foreign objects with the Adaboost algorithm includes: first, prepare the training data set

$$T = \{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$$

and initialize the weight distribution of the training data, giving each sample the weight $1/N$; second, perform multiple rounds of iterative computation: in round $m$, learn a basic classifier $G_m(x)$ from the data set under the current weights, calculate its classification error rate $e_m$ and its weight $\alpha_m$ in the final classifier, and update the weight distribution of the data set with the result, using the formulas:

$$e_m = \sum_{i=1}^{N} w_{m,i}\, I\big(G_m(x_i) \neq y_i\big), \qquad \alpha_m = \frac{1}{2} \ln \frac{1 - e_m}{e_m}$$

$$w_{m+1,i} = \frac{w_{m,i}}{Z_m} \exp\big(-\alpha_m\, y_i\, G_m(x_i)\big)$$

where $Z_m$ is a normalization factor; third, linearly combine the weak classifiers by weighted voting to obtain the final classifier, using the formula:

$$G(x) = \operatorname{sign}\!\left(\sum_{m} \alpha_m\, G_m(x)\right)$$
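A minimal AdaBoost sketch over a pool of decision stumps, matching the weight-update and linear-combination steps above; the stump pool is a stand-in for the image-feature weak classifiers the patent trains:

```python
import numpy as np

def adaboost(X, y, stumps, rounds=10):
    """Minimal AdaBoost over candidate weak classifiers.
    X: (N, d) samples; y: labels in {-1, +1};
    stumps: list of functions mapping X -> predictions in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # initial weight distribution 1/N
    alphas, chosen = [], []
    for _ in range(rounds):
        # pick the base classifier with the lowest weighted error rate e_m
        errs = [np.sum(w * (h(X) != y)) for h in stumps]
        m = int(np.argmin(errs))
        e = max(errs[m], 1e-10)          # guard against log(0)
        if e >= 0.5:
            break                        # no better-than-chance learner left
        a = 0.5 * np.log((1 - e) / e)    # classifier weight alpha_m
        pred = stumps[m](X)
        # update and renormalize the weight distribution
        w = w * np.exp(-a * y * pred)
        w /= w.sum()
        alphas.append(a)
        chosen.append(stumps[m])
    # final classifier: sign of the weighted linear combination
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, chosen)))
```

Misclassified samples gain weight each round, forcing later weak classifiers to concentrate on the hard cases.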
step S7: and judging whether the invasive foreign matters exist or not.
In the invention, foreign object identification is carried out through the parallel steps S5 and S6, and the identification information is returned to the control module of the intelligent processor, which issues instructions according to the returned information. When a foreign object is identified in at least one of steps S5 and S6, the control module judges that an invading foreign object exists and issues an instruction to continue with step S8, where the data of the two sensors are fused before a comprehensive decision is made; when no foreign object is identified in either step S5 or step S6, the control module judges that no invading foreign object exists and issues an instruction to return to step S3 and continue collecting point cloud and image data at the front end.
Step S8: and (5) point cloud and image data fusion processing.
After receiving the instruction sent by the control module, the invention carries out quick intelligent fusion of space, time and foreign object target information on the point cloud data acquired by the imaging millimeter wave radar and the image data acquired by the ultra-high definition camera in the data fusion module of the intelligent processor, thereby realizing information complementation and data optimization combination of the two sensors on the level and the space.
The data space fusion unifies the data of the two sensors into one coordinate system. Because the two sensors are mounted at different positions and the data of each sensor use its own coordinate system, accurate conversion relations must be established between the coordinate systems, including the imaging millimeter wave radar coordinate system, the three-dimensional world coordinate system, the ultra-high-definition camera coordinate system, the image coordinate system and the pixel coordinate system. Data coordinate conversion then proceeds in the following steps: first, convert coordinates in the millimeter wave radar coordinate system into the three-dimensional world coordinate system centered on the ultra-high-definition camera through rotation, scaling and translation operations using a coordinate system conversion formula; second, convert coordinates in the three-dimensional world coordinate system into the ultra-high-definition camera coordinate system using a coordinate system conversion formula; third, convert camera coordinate system coordinates into the image coordinate system. Through this transformation process, the point cloud coordinates are converted to the corresponding positions of the images, realizing the spatial fusion of the data of the two sensors.
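The three conversion steps can be sketched as a chain of matrix operations; the rotation/translation matrices and camera intrinsics would come from an offline calibration and are assumptions here:

```python
import numpy as np

def radar_to_pixel(p_radar, R_rw, t_rw, R_wc, t_wc, K):
    """Project a radar point into image pixel coordinates through the
    coordinate-system chain described above.

    p_radar    : (3,) point in the millimeter wave radar coordinate system
    R_rw, t_rw : rotation/translation radar -> camera-centered world frame
    R_wc, t_wc : rotation/translation world -> camera frame
    K          : 3x3 camera intrinsic matrix (focal lengths, principal point)
    """
    p_world = R_rw @ p_radar + t_rw     # step 1: radar -> world
    p_cam = R_wc @ p_world + t_wc       # step 2: world -> camera
    uvw = K @ p_cam                     # step 3: camera -> image plane
    return uvw[:2] / uvw[2]             # perspective divide -> pixels (u, v)
```

Projecting the corners of the radar's rectangular target frame through this chain yields the image-plane box used for the per-target matching described below.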
The data time fusion ensures that the two sensors sample at the same frequency at corresponding times. Before data acquisition, the data acquisition frequencies of the millimeter wave radar and the camera are uniformly set to 20 Hz, an accurate clock is provided by the network communication system, a timestamp is added to each frame of point cloud data and image data, and the same initial time T0 is set for the millimeter wave radar and the camera in the system before operation starts, ensuring the synchronism of the collected data and realizing the time fusion of the data of the two sensors.
The foreign object target information fusion is the fusion of the foreign object target detection results obtained by the two sensors, in the following steps: using the pixel distance of the foreign object targets identified by the two sensors in two-dimensional space and deep foreign object information such as the transverse and longitudinal sizes of the foreign object, and based on the spatial perspective transformation principle, the rectangular target frame detected by the millimeter wave radar is projected to the corresponding position of the image in two-dimensional image pixel coordinates; information matching and weighted fusion of the same target are then carried out to realize information fusion of the foreign object targets at the decision level, with the fusion formula:

$$F(x, y) = \omega\, C(x, y) + (1 - \omega)\, R(x, y)$$

where $F(x, y)$ represents the gray-scale image formed by the fusion, $\omega$ represents the fusion coefficient, $C(x, y)$ represents the ultra-high-definition camera image after noise reduction, and $R(x, y)$ represents the image formed by coordinate conversion of the imaging millimeter wave radar point cloud.
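The weighted fusion formula amounts to a per-pixel blend of the two aligned images; the coefficient value below is an assumption:

```python
import numpy as np

def fuse(camera_img, radar_img, omega=0.6):
    """Pixel-wise weighted fusion F = omega*C + (1 - omega)*R of the
    noise-reduced camera image C and the radar-derived image R,
    assuming both are already spatially and temporally aligned."""
    return omega * camera_img + (1 - omega) * radar_img
```

A larger ω trusts the camera more; in poor visibility the weight could be shifted toward the radar image, though the patent does not specify such a scheme.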
Step S9: and (3) intelligently detecting foreign object targets based on the fusion model.
In the invention, based on the fusion information obtained in the step S8, a foreign object target fusion model is established in a data fusion module, a distance transformation method is adopted to highlight the fused foreign object target, and information such as the type, distance, movement speed and the like of the foreign object target is comprehensively obtained and the target tracking is carried out.
The distance conversion method uses the following formula:

$$D(x, y) = \min_{(p, q) \in B} \sqrt{(x - p)^2 + (y - q)^2}$$

where $(x, y)$ is a foreground target point, $D(x, y)$ is its pixel value after the distance conversion, and $(p, q)$ is a boundary point between the foreground target and the background region $B$. The farther a fused data point is from the boundary point, the larger its gray value after the distance conversion, so the brightness of the foreign object target in the fused image is increased.
Step S10: and alarming in real time and transmitting an alarm image to a high-speed rail safety control center.
When step S9 detects an invading foreign object such as a pedestrian, animal or rockfall, the control module sends an instruction, which is transmitted in real time to the high-speed rail safety control center through the communication module. The real-time alarm system sends an alarm signal to prompt the relevant managers, and the fused image that triggered the alarm and its information are simultaneously returned to the high-speed rail safety control center. The managers take timely measures after confirming the alarm information on the real-time display screen to eliminate the potential safety hazard. After the hidden danger is eliminated, the alarm signal is cleared and normal real-time monitoring resumes.
It should be emphasized that the examples described herein are illustrative rather than limiting; the present invention therefore includes, but is not limited to, the examples given in this detailed description, and is applicable not only to high-speed rail lines but also to ordinary railway lines. Other embodiments derived by those skilled in the art from the technical solutions of the present invention also fall within the protection scope of the present invention.

Claims (10)

1. A high-speed rail perimeter intrusion monitoring device based on a millimeter wave radar and a camera, characterized by comprising at least:
the sensing module comprises an imaging millimeter wave radar and a camera;
the information processing part, which receives the point cloud data and the image data of the sensing part, first performs preprocessing and foreign object feature recognition on the point cloud data and preprocessing and foreign object feature recognition on the image data, then performs preliminary judgment and identification of invading foreign objects, and finally fuses the point cloud data and the image data and judges and identifies the invading foreign objects again;
and the response part receives the output result of the information processing part and performs actions of alarming and information display according to the result.
2. The high-speed rail perimeter intrusion monitoring device based on the millimeter-wave radar and the camera as claimed in claim 1, wherein the process of the point cloud data and the image data fusion processing is as follows: converting a coordinate system of the millimeter wave radar into an image coordinate system to realize spatial fusion of point cloud data and image data; uniformly setting the data acquisition frequencies of the millimeter wave radar and the camera to be 20Hz, and realizing the fusion processing of point cloud data and image data at corresponding moments; based on the distance and space perspective transformation principle, a foreign object target rectangular frame detected by a millimeter wave radar is projected to a corresponding position of an image in a two-dimensional image pixel coordinate mode, information matching of the same foreign object target is carried out, and fusion of detected foreign object feature information is achieved.
3. The high-speed rail perimeter intrusion monitoring device based on the millimeter wave radar and the camera as claimed in claim 1, wherein the imaging millimeter wave radar operates in the 77 GHz frequency band with a frequency-modulated continuous wave system, with an effective detection distance of 150 m, a maximum blind-zone range of 30 m, and a height of 3.2 m above the ground; the camera is of 8-megapixel level, with an effective detection distance of 100 m on sunny days, a maximum blind-zone range of 8 m, and a height of 3.7 m above the ground.
4. The millimeter wave radar and camera based high-speed rail perimeter intrusion monitoring device according to claim 1, wherein the information processing part comprises a point cloud preprocessing module, a point cloud intelligent processing module, an image preprocessing module, an image intelligent processing module, a data storage module, a data fusion module, an integrated intelligent detection module, a control module and a communication module.
5. A high-speed rail perimeter intrusion monitoring method based on a millimeter wave radar and a camera is characterized by comprising the following steps:
s1, arranging imaging millimeter wave radar and cameras on the periphery of the high-speed rail;
s2, establishing a point cloud database and an image database of different foreign body characteristics;
s3, collecting point cloud and image data in real time on site at the same frequency;
s4, preprocessing point cloud data and image data;
s5, identifying foreign body features based on point cloud data; the method specifically comprises the following steps: step one, determining whether foreign objects exist or not by adopting a constant false alarm detector; secondly, filtering false targets by adopting a target life cycle method; thirdly, estimating the motion parameters of the foreign object target by adopting a Kalman filtering method; fourthly, training a random forest classifier based on the characteristic information to realize automatic classification of the detected foreign object target;
s6, identifying foreign body features based on the image data; the method specifically comprises the following steps: identifying foreign object targets by using different types of foreign object characteristic information in S2 and adopting an algorithm combining an image interframe difference method and a background difference method, identifying effective foreign object target information in image data and simultaneously removing ineffective target information; the formula of the algorithm is as follows:
D_k(x, y) = 1, if |f_{k+1}(x, y) − f_k(x, y)| > T
D_k(x, y) = 0, otherwise

wherein f_k(x, y) and f_{k+1}(x, y) respectively represent the pixel values of the k-th frame and the (k+1)-th frame images, T is the binarization threshold of the differential image, and D_k(x, y) is the differential image pixel value;
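The binarization of the inter-frame difference translates directly into code. A pure-Python sketch on grayscale frames stored as nested lists (illustrative only; the patent does not prescribe an implementation):

```python
def frame_difference(frame_k, frame_k1, threshold):
    """Binarize |f_{k+1}(x, y) - f_k(x, y)| against threshold T.

    Returns the differential image D_k: 1 where the absolute pixel
    difference exceeds T (candidate motion), 0 elsewhere.
    """
    return [
        [1 if abs(b - a) > threshold else 0
         for a, b in zip(row_k, row_k1)]
        for row_k, row_k1 in zip(frame_k, frame_k1)
    ]

prev_frame = [[10, 10, 10],
              [10, 10, 10]]
next_frame = [[10, 90, 10],
              [10, 10, 80]]
print(frame_difference(prev_frame, next_frame, 30))
# [[0, 1, 0], [0, 0, 1]]
```

In the combined scheme the claim describes, the same binarization would also be applied against a maintained background image, and the two masks merged, so that slowly moving and newly appeared objects are both captured.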
S7, judging whether an intruding foreign object exists;
S8, fusing the point cloud data and the image data, specifically: converting the coordinate system of the millimeter wave radar into the image coordinate system to spatially fuse the point cloud data from S5 with the image data from S6; setting the data acquisition frequencies of the millimeter wave radar and the camera uniformly to 20 Hz so that point cloud data and image data at corresponding moments are fused in time; based on the distance and spatial perspective transformation principle, projecting the rectangular bounding box of the foreign object target detected by the millimeter wave radar onto the corresponding position of the image in two-dimensional pixel coordinates, and matching the information of the same foreign object target to fuse the detected foreign object feature information;
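The radar-to-image conversion described in S8 is conventionally a rigid-body extrinsic transform followed by a pinhole intrinsic projection. A minimal sketch, assuming the rotation matrix, translation vector, and camera intrinsics (fx, fy, cx, cy) are known from calibration; all numeric values below are illustrative, not from the patent:

```python
def project_radar_point(point_radar, rotation, translation, fx, fy, cx, cy):
    """Project a 3-D point from the radar frame into image pixel coordinates.

    Applies the radar-to-camera extrinsics (camera = R * point + t), then a
    pinhole intrinsic model: u = fx * Xc/Zc + cx, v = fy * Yc/Zc + cy.
    """
    xc = sum(rotation[0][j] * point_radar[j] for j in range(3)) + translation[0]
    yc = sum(rotation[1][j] * point_radar[j] for j in range(3)) + translation[1]
    zc = sum(rotation[2][j] * point_radar[j] for j in range(3)) + translation[2]
    if zc <= 0:
        return None  # point is behind the camera, not visible in the image
    # Perspective division and pixel scaling.
    return (fx * xc / zc + cx, fy * yc / zc + cy)

# Toy example: identity extrinsics, as if radar and camera were co-located.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.0]
print(project_radar_point([2.0, 1.0, 10.0], R, t, fx=800, fy=800, cx=640, cy=360))
# (800.0, 440.0)
```

Projecting the four corners of a radar-detected bounding box this way yields the image-plane rectangle used for matching radar and camera detections of the same target.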
S9, intelligently detecting foreign object targets based on the fusion model;
S10, when a foreign object is detected, raising an alarm in real time and transmitting the alarm image to the high-speed rail safety control center; otherwise, raising no alarm.
6. The high-speed rail perimeter intrusion monitoring method based on the millimeter wave radar and the camera as claimed in claim 5, wherein S2 specifically comprises: selecting foreign objects of different materials and sizes, carrying out a simulated high-speed rail perimeter intrusion test on site, collecting point cloud data and image data respectively with the sensing part, extracting foreign object feature information through a deep learning algorithm, and establishing and storing a standard point cloud database and an image database of perimeter-intrusion foreign objects.
7. The high-speed rail perimeter intrusion monitoring method based on the millimeter wave radar and the camera as claimed in claim 6, wherein S4 specifically comprises: first, segmenting the point cloud data and the image data and defining the monitoring range; then preprocessing the point cloud data from S3, including point cloud filtering and feature enhancement, and preprocessing the image data from S3, including graying, filtering, and histogram equalization.
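Of the image preprocessing steps listed in claim 7, histogram equalization is the most involved. A pure-Python sketch of the standard CDF-based equalization on a grayscale image stored as nested lists (illustrative; the patent does not fix an implementation):

```python
def equalize_histogram(image, levels=256):
    """Histogram equalization of a grayscale image.

    Maps each gray level through the normalized cumulative histogram so
    the output levels spread across the full dynamic range, improving
    contrast before foreign-object feature extraction.
    """
    pixels = [p for row in image for p in row]
    n = len(pixels)
    # Histogram and cumulative distribution function.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, running = [0] * levels, 0
    for g in range(levels):
        running += hist[g]
        cdf[g] = running
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:  # flat image: nothing to equalize
        return [row[:] for row in image]
    # Standard equalization mapping.
    def remap(p):
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
    return [[remap(p) for p in row] for row in image]

print(equalize_histogram([[50, 50], [100, 150]]))
# [[0, 0], [128, 255]]
```

The same preprocessing chain in practice would first convert to grayscale and apply a smoothing filter, then equalize, in the order the claim lists.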
8. The high-speed rail perimeter intrusion monitoring method based on the millimeter wave radar and the camera as claimed in claim 7, wherein S7 specifically comprises: returning identification information according to the foreign object feature identification results of S5 and S6; if at least one of S5 and S6 identifies a foreign object, judging that an intruding foreign object exists; if neither identifies a foreign object, judging that no intruding foreign object exists.
9. The high-speed rail perimeter intrusion monitoring method based on the millimeter wave radar and the camera as claimed in claim 8, wherein S9 specifically comprises: establishing a foreign object feature fusion model based on the fused foreign object feature information obtained in S8, highlighting the fused foreign object target by a distance transform method, comprehensively obtaining the type, distance and movement speed of the foreign object target, and tracking the target.
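The distance transform used in claim 9 to highlight a fused target can be illustrated with a two-pass chamfer distance transform over a binary detection mask. This is a sketch under the assumption of a city-block metric; the patent does not specify which metric or implementation is used:

```python
def distance_transform(mask):
    """Two-pass chamfer distance transform (city-block metric).

    Each cell receives its distance to the nearest nonzero cell of
    `mask`; thresholding or scaling this map is one way to emphasize a
    detected target region against its surroundings.
    """
    rows, cols = len(mask), len(mask[0])
    inf = rows + cols  # upper bound on any city-block distance here
    dist = [[0 if mask[y][x] else inf for x in range(cols)] for y in range(rows)]
    # Forward pass: propagate distances from the top-left.
    for y in range(rows):
        for x in range(cols):
            if y > 0:
                dist[y][x] = min(dist[y][x], dist[y - 1][x] + 1)
            if x > 0:
                dist[y][x] = min(dist[y][x], dist[y][x - 1] + 1)
    # Backward pass: propagate distances from the bottom-right.
    for y in range(rows - 1, -1, -1):
        for x in range(cols - 1, -1, -1):
            if y < rows - 1:
                dist[y][x] = min(dist[y][x], dist[y + 1][x] + 1)
            if x < cols - 1:
                dist[y][x] = min(dist[y][x], dist[y][x + 1] + 1)
    return dist

mask = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(distance_transform(mask))
# [[2, 1, 2], [1, 0, 1], [2, 1, 2]]
```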
10. The high-speed rail perimeter intrusion monitoring method based on the millimeter wave radar and the camera as claimed in claim 9, wherein S10 specifically comprises: when S9 detects an intruding foreign object, sending an alarm signal to the high-speed rail safety control center in real time through the communication module of the control box, and returning the image information that triggered the alarm to the high-speed rail safety control center.
CN202210315202.XA 2022-03-29 2022-03-29 High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera Active CN114419825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210315202.XA CN114419825B (en) 2022-03-29 2022-03-29 High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera

Publications (2)

Publication Number Publication Date
CN114419825A true CN114419825A (en) 2022-04-29
CN114419825B CN114419825B (en) 2022-06-21

Family

ID=81262903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210315202.XA Active CN114419825B (en) 2022-03-29 2022-03-29 High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera

Country Status (1)

Country Link
CN (1) CN114419825B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040208375A1 (en) * 2002-10-15 2004-10-21 Digicomp Research Corporation Automatic intrusion detection system for perimeter defense
GB201301101D0 (en) * 2012-01-24 2013-03-06 Accipiter Radar Technologies Inc Personal electronic target vision system, device and method
CN104821056A (en) * 2015-04-30 2015-08-05 湖南华诺星空电子技术有限公司 Intelligent guarding method based on radar and video integration
CN105389920A (en) * 2015-11-30 2016-03-09 北京正安维视科技股份有限公司 Perimeter alarm method based on two-dimensional laser radar and video analysis
US10062255B1 (en) * 2017-11-13 2018-08-28 National Technology & Engineering Solutions Of Sandia, Llc VMD fused radar—a hyper-volumetric ultra-low NAR sensor system
CN109920185A (en) * 2019-04-16 2019-06-21 中科九度(北京)空间信息技术有限责任公司 One kind merging the mobile mesh calibration method of detection with video data based on millimetre-wave radar
WO2019194091A1 (en) * 2018-04-02 2019-10-10 Panasonic Corporation Intrusion detection system and intrusion detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAI Haitao: "Improved clutter-map CFAR detection technique for surface movement surveillance radar", Command Information System and Technology, 31 December 2019 (2019-12-31), pages 32 - 37 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114882661A (en) * 2022-04-29 2022-08-09 广州小鹏汽车科技有限公司 Outdoor early warning method, device, system and computer readable storage medium
CN115083088A (en) * 2022-05-11 2022-09-20 长江慧控科技(武汉)有限公司 Railway perimeter intrusion early warning method
CN115100519A (en) * 2022-06-23 2022-09-23 郑州儒慧信息技术有限责任公司 Method for identifying hidden danger along high-speed rail
CN115100519B (en) * 2022-06-23 2024-04-26 郑州儒慧信息技术有限责任公司 Method for identifying hidden danger objects along high-speed rail
CN115311534A (en) * 2022-08-26 2022-11-08 中国铁道科学研究院集团有限公司 Laser radar-based railway perimeter intrusion identification method and device and storage medium
CN115497242A (en) * 2022-09-07 2022-12-20 东南大学 Intelligent monitoring system and monitoring method for foreign matter invasion in railway business line construction
CN115497242B (en) * 2022-09-07 2023-11-17 东南大学 Intelligent foreign matter invasion monitoring system and monitoring method for railway business line construction
CN115394065A (en) * 2022-10-31 2022-11-25 之江实验室 AI-based automatic identification packet loss behavior alarm method and device
CN115620239A (en) * 2022-11-08 2023-01-17 国网湖北省电力有限公司荆州供电公司 Point cloud and video combined power transmission line online monitoring method and system
CN115620239B (en) * 2022-11-08 2024-01-30 国网湖北省电力有限公司荆州供电公司 Point cloud and video combined power transmission line online monitoring method and system
CN115457253A (en) * 2022-11-11 2022-12-09 浪潮金融信息技术有限公司 Object detection method, system, equipment and medium based on multiple camera modules

Also Published As

Publication number Publication date
CN114419825B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
CN114419825B (en) High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera
WO2022141914A1 (en) Multi-target vehicle detection and re-identification method based on radar and video fusion
CN111369541B (en) Vehicle detection method for intelligent automobile under severe weather condition
Marquez et al. Intra-hour DNI forecasting based on cloud tracking image analysis
CN108983219A (en) A kind of image information of traffic scene and the fusion method and system of radar information
CN110532889A (en) Track foreign matter detecting method based on rotor unmanned aircraft and YOLOv3
GB2619196A (en) Multi-target vehicle detection and re-identification method based on radar and video fusion
CN104063882B (en) Vehicle video speed measuring method based on binocular camera
Ahmad et al. A novel method for vegetation encroachment monitoring of transmission lines using a single 2D camera
CN113791410B (en) Road environment comprehensive cognition method based on multi-sensor information fusion
CN115034324B (en) Multi-sensor fusion perception efficiency enhancement method
CN111667655A (en) Infrared image-based high-speed railway safety area intrusion alarm device and method
CN106845359A (en) Tunnel portal driving prompt apparatus and method based on infrared emission
CN114875877A (en) Ship lockage safety detection method
CN115035470A (en) Low, small and slow target identification and positioning method and system based on mixed vision
CN113253289A (en) Unmanned aerial vehicle detection tracking system implementation method based on combination of laser radar and vision
CN116846059A (en) Edge detection system for power grid inspection and monitoring
Chen et al. Ship target detection algorithm for maritime surveillance video based on gaussian mixture model
Sezgin et al. Safe autonomous driving in adverse weather: Sensor evaluation and performance monitoring
Liu et al. Manhole cover detection from natural scene based on imaging environment perception
CN104867331A (en) Microwave-based traffic incident detection method and apparatus
CN111784738A (en) Extremely dark and weak moving target correlation detection method based on fluctuation analysis
CN105551017A (en) Transmission line forest fire target extraction method on the basis of spatio-temporal union
CN109543553A (en) The photoelectricity recognition and tracking method of low small slow target based on machine learning
CN115436937A (en) Radar photoelectric integrated security platform system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant