CN111381232A - River channel safety control method based on photoelectric integration technology

River channel safety control method based on photoelectric integration technology

Info

Publication number
CN111381232A
Authority
CN
China
Prior art keywords
target
radar
monitoring
video
tracking
Prior art date
Legal status
Pending
Application number
CN202010229622.7A
Other languages
Chinese (zh)
Inventor
黄琼
李财金
苏东旭
林健
卢秉彦
Current Assignee
Shenzhen Shenshui Water Resources Consulting Co ltd
Original Assignee
Shenzhen Shenshui Water Resources Consulting Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Shenshui Water Resources Consulting Co ltd filed Critical Shenzhen Shenshui Water Resources Consulting Co ltd
Priority to CN202010229622.7A
Publication of CN111381232A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66 - Radar-tracking systems; Analogous systems
    • G01S 13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S 13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking, by using numerical data
    • G01S 13/726 - Multiple target tracking
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 - Combination of radar systems with cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 - Determining position
    • G01S 19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S 19/46 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement, the supplementary measurement being of a radio-wave signal type
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 3/00 - Control of position or direction
    • G05D 3/12 - Control of position or direction using feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A river channel safety control method based on photoelectric integration technology mainly comprises the following steps. Step 1: lay out radar and video monitors in the river warning area. Step 2: the video monitoring acquires image information of the monitored target and analyzes the target type, and the radar acquires related information about the monitored target. Step 3: perform fusion analysis on the data obtained by video monitoring and radar monitoring. Step 4: discover and pre-judge monitoring targets within the monitored area, and track, monitor and/or snapshot them for evidence. The invention aims to solve the technical problems that traditional river channel safety control technology is strongly affected by environmental factors, cannot accurately monitor intruding objects, and cannot respond to intrusions in time.

Description

River channel safety control method based on photoelectric integration technology
Technical Field
The invention relates to the technical field of river channel safety control, and in particular to a river channel safety control method based on photoelectric integration technology.
Background
Traditional river channel safety control relies on manual patrols supplemented by video monitoring. This approach suffers from short monitoring distances, poor visibility at night and strong weather influence, and its labor cost is high. Under such a control mode, supervisors cannot respond in time when acts such as illegal entry occur. Against the background of "smart water affairs", improving river safety control technology helps raise the level of water conservancy informatization, reduces the workload of management personnel, improves supervision quality, and advances the river and lake chief system.
Disclosure of Invention
The invention aims to solve the technical problems that traditional river channel safety control technology is strongly affected by environmental factors, cannot accurately monitor intruding objects, and cannot respond to intrusions in time.
In order to solve the technical problems, the invention provides the following technical scheme:
a river channel safety control method based on photoelectric integration technology comprises the following steps,
step 1: laying a radar and a video monitor in a river warning area;
step 2: the video monitoring acquires image information of a monitored target and analyzes the type of the monitored target, and the radar acquires the related information of the monitored target;
and step 3: performing fusion analysis on data obtained by video monitoring and radar monitoring;
and 4, step 4: and (3) finding and prejudging reasonable monitoring targets in the monitoring area, and tracking, monitoring and/or snapshotting for evidence obtaining.
In step 3, the complementary strengths of the two sensors are combined for fusion analysis of the monitoring data: the video accurately identifies the target, while the radar accurately measures the target's speed, angle and distance.
When the monitoring data are fused and analyzed, the following steps are adopted:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image (a sketch of the step 2 measurement computations follows these steps);
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion;
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded.
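For illustration, the measurement relations in step 2) can be computed as follows. This is a minimal Python sketch; the radar parameters (wavelength, sweep settings, antenna spacing) are assumed example values, not values specified by the patent.

```python
import math

# Assumed example FMCW radar parameters (not specified in the patent)
WAVELENGTH = 0.0125              # m, e.g. a 24 GHz radar
SWEEP_BANDWIDTH = 250e6          # Hz, sweep bandwidth B
SWEEP_PERIOD = 1e-3              # s, sweep period T
ANTENNA_SPACING = WAVELENGTH / 2 # m, spacing d of the receive antennas
C = 3.0e8                        # m/s, speed of light

def radial_velocity(doppler_shift_hz):
    """Radial velocity from the Doppler relation f_d = 2*v_r/lambda."""
    return doppler_shift_hz * WAVELENGTH / 2.0

def target_distance(beat_freq_hz):
    """Target distance from the beat-frequency spectrum, R = c*f_b*T/(2*B)."""
    return C * beat_freq_hz * SWEEP_PERIOD / (2.0 * SWEEP_BANDWIDTH)

def target_angle(phase_diff_rad):
    """Target angle from the inter-antenna phase difference,
    theta = arcsin(lambda*dphi/(2*pi*d)), clamped for numerical safety."""
    s = WAVELENGTH * phase_diff_rad / (2.0 * math.pi * ANTENNA_SPACING)
    return math.asin(max(-1.0, min(1.0, s)))
```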
In step 4, the radar acquires the relevant information of the monitored target, including the distance, speed and angle information of the moving target, and tracks and monitors the position and speed of the moving target.
When tracking monitoring is carried out, the method comprises the following steps:
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants (one plausible construction of the model matrices is sketched after these steps);
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain;
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
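As a concrete reading of steps 1) and 2), the sketch below builds one plausible discrete state-space model for the state X_k = [x_k, y_k, x'_k, y'_k] with position-and-velocity observations. The scan interval dt and the noise levels in Q and R are assumed illustrative values; the varying acceleration mentioned in step 2) is absorbed into the process noise Q.

```python
import numpy as np

dt = 0.1  # s, assumed radar scan interval

# State transition matrix A for X_k = [x, y, x', y']: positions advance by velocity*dt
A = np.array([[1.0, 0.0, dt,  0.0],
              [0.0, 1.0, 0.0, dt ],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

# Observation matrix H: Z_k = [px, py, vx', vy'] observes the full state directly
H = np.eye(4)

# Process noise Q absorbs the unmodelled varying acceleration;
# R is the constant measurement-noise matrix assumed from the motion model
Q = 0.01 * np.eye(4)
R = 0.10 * np.eye(4)
```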
In step 4, on the basis of continuous tracking, the motion trajectory of the radar-tracked target is converted into plane coordinates and displayed in real time on the monitoring center's satellite map; a warning area is preset and trajectory pre-judgment is performed to determine whether a target is inside the warning area. When a pedestrian is predicted to intrude into the warning area, alarm information pops up or is sent to staff by short message, and the camera simultaneously receives the radar-converted coordinate information to monitor and photograph the intruding object for evidence.
A method for fusing video monitoring data and radar monitoring data adopts the following steps when the monitoring data are fused and analyzed:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image;
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion;
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded (this fallback classification and UID assignment is sketched after these steps).
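One way to realize the UID assignment and the weather fallback of step 4) is sketched below. The class labels and the reflection-area threshold separating pedestrians from vehicles are assumptions for illustration, not values given by the patent.

```python
import itertools
from typing import Optional

_uid_counter = itertools.count(1)
RCS_VEHICLE_THRESHOLD = 5.0  # m^2, assumed: larger reflection area => vehicle

def register_target(video_label: Optional[str], reflection_area: float) -> Optional[dict]:
    """Return a tracked-target record with a fresh UID, or None for staff.

    video_label is the still-image classifier output ('staff', 'pedestrian',
    'vehicle'), or None when weather or range defeats video identification.
    """
    if video_label == "staff":
        return None  # staff are recognized and not assigned a UID
    if video_label is None:
        # Video failed: fall back to distinguishing by radar reflection area
        video_label = "vehicle" if reflection_area > RCS_VEHICLE_THRESHOLD else "pedestrian"
    return {"uid": next(_uid_counter), "class": video_label, "track": []}
```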
A target trajectory tracking and monitoring method, in which the radar acquires relevant information about the monitored target, including the distance, speed and angle of the moving target, and tracks and monitors the target's position and speed;
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants;
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain (a simple gating scheme for this check is sketched after these steps);
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
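Step 5) above asks whether the corrected value matches an existing moving-target trajectory. The patent does not spell out the matching rule; a simple nearest-neighbour scheme with a distance gate, as one possible implementation, is sketched below (the gate radius is an assumed value).

```python
import math

GATE_RADIUS = 2.0  # m, assumed association gate

def match_trajectory(x_corr, tracks):
    """Return the index of the existing track whose last position is nearest
    to the corrected state x_corr = [x, y, x', y'], or None if no track lies
    within the gate (in which case a new track should be started)."""
    best_idx, best_dist = None, GATE_RADIUS
    for idx, track in enumerate(tracks):
        last_x, last_y = track[-1][0], track[-1][1]
        dist = math.hypot(x_corr[0] - last_x, x_corr[1] - last_y)
        if dist < best_dist:
            best_idx, best_dist = idx, dist
    return best_idx
```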
Compared with the prior art, the invention has the following beneficial effects:
The invention uses radar and video monitoring to realize all-weather automatic safety supervision, and improves the timeliness, accuracy and effectiveness of supervision by setting a warning boundary that triggers automatic alarms. The system supports remote supervision and multi-platform access, which is of great significance for river safety protection, for raising the level of water conservancy informatization, and for advancing the river and lake chief system.
Drawings
Fig. 1 is a schematic view of the working framework of the river channel safety control device based on photoelectric integration;
Fig. 2 is a schematic view of the photoelectric integration apparatus;
Fig. 3 is a flow chart of the photoelectric integration monitoring technique;
Fig. 4 is a plan view of the layout of the photoelectric integration monitoring facility;
Fig. 5 is a trajectory tracking and correction diagram for a photoelectric integration monitoring target based on Kalman filtering.
Detailed Description
A river channel safety control method based on photoelectric integration technology comprises the following steps:
Step 1: lay out radar and video monitors in the river warning area;
Step 2: the video monitoring acquires image information of the monitored target and analyzes the target type, and the radar acquires related information about the monitored target;
Step 3: perform fusion analysis on the data obtained by video monitoring and radar monitoring;
Step 4: discover and pre-judge monitoring targets within the monitored area, and track, monitor and/or snapshot them for evidence.
In step 3, the complementary strengths of the two sensors are combined for fusion analysis of the monitoring data: the video accurately identifies the target, while the radar accurately measures the target's speed, angle and distance. Specifically, when the monitoring data are fused and analyzed, the following steps are adopted:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image (a coordinate-conversion sketch follows these steps);
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion;
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded.
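The spatial fusion of steps 1) and 2) reduces to converting the radar polar measurement to plane coordinates and re-expressing it in the camera's polar frame. A minimal sketch follows, under the simplifying assumption that both sensor origins lie in one horizontal plane; the GPS-measured origin coordinates used here are example values.

```python
import math

def radar_polar_to_plane(r, theta, radar_origin):
    """Convert a radar polar measurement (R_r, theta_r) to plane coordinates."""
    x0, y0 = radar_origin
    return (x0 + r * math.cos(theta), y0 + r * math.sin(theta))

def plane_to_camera_polar(point, camera_origin):
    """Re-express a plane point as polar coordinates (R_v, theta_v)
    with the video sensor as the coordinate origin."""
    dx, dy = point[0] - camera_origin[0], point[1] - camera_origin[1]
    return (math.hypot(dx, dy), math.atan2(dy, dx))

# Example: a target 80 m from the radar at 30 degrees
radar_origin, camera_origin = (0.0, 0.0), (5.0, 0.0)  # assumed GPS-measured origins
target_xy = radar_polar_to_plane(80.0, math.radians(30.0), radar_origin)
r_v, theta_v = plane_to_camera_polar(target_xy, camera_origin)
```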
In step 4, the radar acquires relevant information about the monitored target, including the distance, speed and angle of the moving target, and tracks and monitors the target's position and speed. Specifically, tracking monitoring comprises the following steps (a runnable sketch of the full recursion follows the steps):
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants;
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain;
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
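Putting steps 3) to 7) together, the recursion can be written as one predict-correct function. This is a minimal runnable sketch: the matrices A, H, Q and R are supplied by the caller (for example as constructed earlier), and the control term B·u is taken as zero since no control input acts on the monitored targets, an assumption not stated in the patent.

```python
import numpy as np

def kalman_step(x_prev, P_prev, z, A, H, Q, R):
    """One predict-correct cycle implementing steps 3) to 6).

    x_prev, P_prev: posterior estimate and covariance at time k-1;
    z: current observation Z_k = [px, py, vx', vy'].
    Returns the corrected state x_(k|k) and covariance P_(k|k).
    """
    x_pred = A @ x_prev                      # step 3): x_(k|k-1) = A*x_(k-1|k-1)
    P_pred = A @ P_prev @ A.T + Q            # prior covariance P_(k|k-1)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # step 4): Kalman gain K_k
    x_corr = x_pred + K @ (z - H @ x_pred)   # step 5): corrected value x_(k|k)
    P_corr = (np.eye(len(x_prev)) - K @ H) @ P_pred  # step 6): posterior covariance
    return x_corr, P_corr                    # step 7): feed back as next initial value
```

Feeding x_corr and P_corr back in as x_prev and P_prev at each radar refresh yields the continuous trajectory tracking and correction of step 7).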
In step 4, on the basis of continuous tracking, the motion trajectory of the radar-tracked target is converted into plane coordinates and displayed in real time on the monitoring center's satellite map; a warning area is preset and trajectory pre-judgment is performed to determine whether a target is inside the warning area. Specifically, the PNPOLY algorithm, as sketched below, can be used to judge whether the target lies inside the warning area. When a pedestrian is predicted to intrude into the warning area, alarm information pops up or is sent to staff by short message, and the camera simultaneously receives the radar-converted coordinate information to monitor and photograph the intruding object for evidence.
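The PNPOLY check mentioned above is the classic even-odd ray-crossing test attributed to W. R. Franklin; a direct Python transcription follows, with the warning area given as a polygon in plane coordinates (the example polygon is illustrative).

```python
def point_in_polygon(px, py, polygon):
    """PNPOLY even-odd test: a point is inside if a ray cast from it crosses
    the polygon boundary an odd number of times."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > py) != (yj > py) and px < (xj - xi) * (py - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Example: test a tracked target against a rectangular warning area
warning_area = [(0.0, 0.0), (100.0, 0.0), (100.0, 40.0), (0.0, 40.0)]
print(point_in_polygon(50.0, 20.0, warning_area))  # True: raise the alarm
```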
In terms of equipment, as shown in Fig. 2, the invention employs a photoelectric integration apparatus comprising a lightning rod 1, a radar 2, a high-definition camera 3 and a power supply 4; the dashed box represents the control center, which is mainly responsible for monitoring data, processing data, coordinating control and sending instructions.
As shown in Fig. 4, in the layout of the photoelectric integration monitoring facility the reference numerals denote: a pedestrian 5 outside the monitoring area; a radar scanning area 6; a monitoring area 7; a monitored object 8 within the area; the distance 9 between the monitored target and the radar origin; the monitoring range 10 after the camera receives the radar information; and the angle 11 between the monitored target and the coordinate axis.
The invention also comprises a method for fusing video monitoring data and radar monitoring data, which adopts the following steps when the monitoring data are fused and analyzed:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image;
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion (a timestamp-alignment sketch follows these steps);
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded.
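Step 3) of this method (temporal fusion) can be realized by pairing each radar refresh with the first video frame at or after it by timestamp. A minimal sketch, assuming both streams carry monotonically increasing timestamps in seconds; the 10 Hz radar rate and 25 fps video rate are assumed example values.

```python
import bisect

def align_radar_to_video(radar_times, video_times):
    """Pair each radar refresh with the next monitored video frame,
    so that refreshed radar data coincides with that frame."""
    pairs = []
    for t_radar in radar_times:
        i = bisect.bisect_left(video_times, t_radar)
        if i < len(video_times):
            pairs.append((t_radar, video_times[i]))
    return pairs

# Example: 10 Hz radar against 25 fps video
radar_times = [k * 0.10 for k in range(5)]
video_times = [k * 0.04 for k in range(20)]
print(align_radar_to_video(radar_times, video_times))
```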
The invention also comprises a target trajectory tracking and monitoring method, in which the radar acquires relevant information about the monitored target, including the distance, speed and angle of the moving target, and tracks and monitors the target's position and speed;
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants;
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain;
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
The invention uses radar and video monitoring to realize all-weather automatic safety supervision, and improves the timeliness, accuracy and effectiveness of supervision by setting a warning boundary that triggers automatic alarms. The system supports remote supervision and multi-platform access, which is of great significance for river safety protection, for raising the level of water conservancy informatization, and for advancing the river and lake chief system.

Claims (8)

1. A river channel safety control method based on photoelectric integration technology, characterized by comprising the following steps:
step 1: laying out radar and video monitors in the river warning area;
step 2: the video monitoring acquiring image information of a monitored target and analyzing the target type, and the radar acquiring related information about the monitored target;
step 3: performing fusion analysis on the data obtained by video monitoring and radar monitoring;
step 4: discovering and pre-judging monitoring targets within the monitored area, and tracking, monitoring and/or snapshotting them for evidence.
2. The river channel safety control method based on photoelectric integration technology as claimed in claim 1, wherein in step 3 the video accurately identifies the target while the radar accurately measures the target's speed, angle and distance, and the advantages of video monitoring and radar are combined to perform fusion analysis on the monitoring data.
3. The river channel safety control method based on photoelectric integration technology as claimed in claim 2, wherein the following steps are adopted when the monitoring data are fused and analyzed:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image;
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion;
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded.
4. The river channel safety control method based on photoelectric integration technology as claimed in claim 1, 2 or 3, wherein in step 4 the radar acquires relevant information about the monitored target, including the distance, speed and angle of the moving target, and tracks and monitors the position and speed of the moving target.
5. The river channel safety control method based on photoelectric integration technology as claimed in claim 4, wherein tracking monitoring comprises the following steps:
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants;
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain;
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
6. The river channel safety control method based on photoelectric integration technology as claimed in claim 5, wherein in step 4, on the basis of continuous tracking, the motion trajectory of the radar-tracked target is converted into plane coordinates and displayed in real time on the monitoring center's satellite map; a warning area is preset and trajectory pre-judgment is performed to determine whether a target is inside the warning area; when a pedestrian is predicted to intrude into the warning area, alarm information pops up or is sent to staff by short message, and the camera simultaneously receives the radar-converted coordinate information to monitor and photograph the intruding object for evidence.
7. A method for fusing video monitoring data and radar monitoring data, characterized in that the following steps are adopted when the monitoring data are fused and analyzed:
step 1) establish coordinate systems: the radar and video coordinate systems use the polar coordinates (R_r, θ_r) and (R_v, θ_v) respectively, the origin of the radar sensor is (x_r0, y_r0, z_r0), the origin of the camera is (x_v0, y_v0, z_v0), and both origin coordinates are measured by GPS;
step 2) use the Doppler relation
f_d = 2·v_r/λ
to determine the relationship between the Doppler frequency shift f_d and the radial velocity v_r, where λ is the radar wavelength, and measure the target distance from the frequency spectrum as
R = c·f_b·T/(2·B),
where f_b is the beat frequency, T the sweep period, B the sweep bandwidth and c the speed of light; because the radar receives signals with multiple antennas, the target angle θ is calculated from the phase difference Δφ between antennas spaced d apart:
Δφ = 2π·d·sin θ/λ,
θ = arcsin(λ·Δφ/(2π·d)).
From these measured values the plane coordinates of the radar-monitored target can be determined; converting between polar and plane coordinates then yields the polar coordinates with the video sensor as the coordinate origin, completing the spatial fusion that matches the radar-monitored target onto the video image;
step 3) taking the radar refresh time as the reference, make each refreshed radar data set coincide with the next video frame, so that radar data and video data remain time-synchronized, completing the temporal fusion;
step 4) the information captured by the radar, including the target reflection area and distance, is sent to the video sensor as guidance; the control center steers the camera through the video pan-tilt head to identify and capture the corresponding target, classifies the target further with a still-image classification and recognition algorithm to distinguish staff from non-staff, assigns each non-staff target a unique serial number (UID), and the radar sensor tracks and records the trajectory of each UID in real time; if video identification fails because of weather or other factors, pedestrians and vehicles are distinguished by reflection area, assigned serial numbers, and tracked and recorded.
8. A target trajectory tracking and monitoring method, characterized in that: the radar acquires relevant information about the monitored target, including the distance, speed and angle of the moving target, and tracks and monitors the position and speed of the moving target;
step 1) establish a state equation and an observation equation to describe the state variables of the moving target, defining the state variable of the tracked target at time k as X_k = [x_k, y_k, x'_k, y'_k] and the observation variable as Z_k = [px_k, py_k, vx'_k, vy'_k], where x_k, y_k and x'_k, y'_k are the position and velocity components of the target in the x and y directions at time k, px_k and py_k are the observed position components in the x and y directions at time k, and vx'_k and vy'_k are the corresponding observed velocity components;
step 2) because the radar scan interval is short, pedestrian movement over the controlled area can be treated approximately as straight-line motion with varying acceleration, and a state equation is established to describe the relationship between the state variables at adjacent instants;
step 3) substitute the initial position x_(0|0) of the moving target into the prediction equation
x_(k|k-1) = A·x_(k-1|k-1) + B·u_(k-1)
to obtain the predicted position x_(k|k-1) for the next instant, where x_(k-1|k-1) is the previous estimate, A is the state transition matrix and B is an adjustment parameter;
step 4) substitute x_(k|k-1) and the prior error covariance P_(k|k-1), initialized from P_(k|k-1) = A·P_(k-1|k-1)·A^T + Q, into the gain equation
K_k = P_(k|k-1)·H^T·(H·P_(k|k-1)·H^T + R)^(-1)
to calculate the Kalman gain K_k, where R is a constant matrix assumed from the actual motion model, P_(k|k-1) is the covariance of the prior estimate and H is the measurement-system matrix;
step 5) substitute x_(k|k-1), K_k and Z_k into the update equation
x_(k|k) = x_(k|k-1) + K_k·(Z_k - H·x_(k|k-1))
to obtain the corrected value x_(k|k) and judge whether it matches an existing moving-target trajectory, K_k being the Kalman gain;
step 6) substitute P_(k|k-1) and K_k into P_(k|k) = (I - K_k·H)·P_(k|k-1) to obtain the posterior error covariance P_(k|k);
step 7) use x_(k|k) as the initial value for the next instant and return to step 3), thereby realizing continuous trajectory tracking and correction;
in this way, the estimate for the next instant is predicted from the estimate at the previous instant and the actual measurement at the current instant, completing target trajectory tracking and correction.
CN202010229622.7A 2020-03-27 2020-03-27 River channel safety control method based on photoelectric integration technology Pending CN111381232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010229622.7A CN111381232A (en) 2020-03-27 2020-03-27 River channel safety control method based on photoelectric integration technology


Publications (1)

Publication Number Publication Date
CN111381232A true CN111381232A (en) 2020-07-07

Family

ID=71221702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010229622.7A Pending CN111381232A (en) 2020-03-27 2020-03-27 River channel safety control method based on photoelectric integration technology

Country Status (1)

Country Link
CN (1) CN111381232A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100594513B1 (en) * 2005-08-04 2006-06-30 한국전력공사 Image monitoring system connected with close range radar
US20110102237A1 (en) * 2008-12-12 2011-05-05 Lang Hong Fusion Algorithm for Vidar Traffic Surveillance System
CN102508246A (en) * 2011-10-13 2012-06-20 吉林大学 Method for detecting and tracking obstacles in front of vehicle
CN108205144A (en) * 2018-03-28 2018-06-26 李强 A kind of road work vehicle collision prewarning device, road work vehicle and anti-collision warning method
CN108965809A (en) * 2018-07-20 2018-12-07 长安大学 The video linkage monitoring system and control method of radar vectoring
CN109241839A (en) * 2018-07-31 2019-01-18 安徽四创电子股份有限公司 A kind of camera shooting radar joint deployment implementation method based on face recognition algorithms

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
刘煜 et al.: Sparse Representation: Fundamental Theory and Typical Applications, National University of Defense Technology Press, 31 October 2014 *
郭壮 et al.: "Research on moving-target tracking based on Halcon", Modern Electronics Technique *
黄海: "Discussion on the application of video and radar data fusion in perimeter intrusion alarm", Intelligent Building & Smart City *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112114305A (en) * 2020-08-17 2020-12-22 西安电子科技大学 Non-contact river channel radar monitoring method, system, device and application
CN112066977A (en) * 2020-09-15 2020-12-11 中国人民解放军63660部队 Photoelectric measurement network multi-target matching and cataloguing method
CN112066977B (en) * 2020-09-15 2024-02-27 中国人民解放军63660部队 Multi-target matching and cataloging method for photoelectric measurement network
CN112702571A (en) * 2020-12-18 2021-04-23 福建汇川物联网技术科技股份有限公司 Monitoring method and device
CN112702571B (en) * 2020-12-18 2022-10-25 福建汇川物联网技术科技股份有限公司 Monitoring method and device
CN113567972A (en) * 2021-08-03 2021-10-29 广州海事科技有限公司 Radar-based marine monitoring method, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111381232A (en) River channel safety control method based on photoelectric integration technology
CN108965809B (en) Radar-guided video linkage monitoring system and control method
WO2022141914A1 (en) Multi-target vehicle detection and re-identification method based on radar and video fusion
CN112836737A (en) Roadside combined sensing equipment online calibration method based on vehicle-road data fusion
CN109459750A (en) A kind of more wireless vehicle trackings in front that millimetre-wave radar is merged with deep learning vision
CN111523465A (en) Ship identity recognition system based on camera calibration and deep learning algorithm
CN104378582A (en) Intelligent video analysis system and method based on PTZ video camera cruising
CN112687127B (en) Ship positioning and snapshot method based on AIS and image analysis assistance
KR20150049529A (en) Apparatus and method for estimating the location of the vehicle
CN109828267A (en) The Intelligent Mobile Robot detection of obstacles and distance measuring method of Case-based Reasoning segmentation and depth camera
CN111047879A (en) Vehicle overspeed detection method
CN108711172A (en) Unmanned plane identification based on fine grit classification and localization method
CN115019512A (en) Road event detection system based on radar video fusion
CN105141887A (en) Submarine cable area video alarming method based on thermal imaging
CN104063863A (en) Pitch-down type binocular vision system for watercourse monitoring and image processing method
CN116266360A (en) Vehicle target detection tracking method based on multi-source information fusion
CN112348882A (en) Low-altitude target tracking information fusion method and system based on multi-source detector
Wu et al. A new multi-sensor fusion approach for integrated ship motion perception in inland waterways
CN115060343B (en) Point cloud-based river water level detection system and detection method
CN110458089A (en) A kind of naval target interconnected system and method based on the observation of height rail optical satellite
CN114298163A (en) Online road condition detection system and method based on multi-source information fusion
CN107607939B (en) Optical target tracking and positioning radar device based on real map and image
Hautière et al. Experimental validation of dedicated methods to in-vehicle estimation of atmospheric visibility distance
CN111177297B (en) Dynamic target speed calculation optimization method based on video and GIS
CN105403886A (en) Automatic extraction method for airborne SAR scaler image position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200707