WO2022022023A1 - Networking control method for laser terminal-guided aircraft (激光末制导飞行器组网控制方法) - Google Patents

Networking control method for laser terminal-guided aircraft (激光末制导飞行器组网控制方法)

Info

Publication number
WO2022022023A1
Authority
WO
WIPO (PCT)
Prior art keywords
sub
image
aircraft
target
block
Prior art date
Application number
PCT/CN2021/094853
Other languages
English (en)
French (fr)
Inventor
王辉
林德福
李涛
王伟
王江
宋韬
袁亦方
Original Assignee
北京理工大学 (Beijing Institute of Technology)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京理工大学 (Beijing Institute of Technology)
Priority to JP2023506252A, published as JP2023536866A (ja)
Publication of WO2022022023A1 (zh)

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808Evaluating distance, position or velocity data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image

Definitions

  • The invention relates to a control method for laser terminal-guided aircraft, and in particular to a networking control method for laser terminal-guided aircraft.
  • The basic working principle of a laser-guided aircraft is as follows: in the terminal phase of the trajectory, a laser irradiator begins to illuminate the target, and the laser detector on the munition detects in real time the laser signal diffusely reflected by the target; once the target enters the detector's field of view, the laser detector uses the signal of the target's deviation from the center of the field of view to control the corresponding pulse engines or steering gear, correct the flight trajectory, and achieve a precise strike on the target, greatly improving the aircraft's point-target kill capability.
  • For these reasons, the present inventors have conducted in-depth research on existing networking control methods, with the aim of designing a new networking control method for laser terminal-guided aircraft that solves the above problems.
  • To that end, the inventors have carried out diligent research and designed a networking control method for laser terminal-guided aircraft in which a lure aircraft first enters the target area and induces the targets there to start working and move.
  • An observation drone cruising over the target area then cooperates with the lure aircraft to obtain target position information in a timely and accurate manner, after which the observation drone emits a laser to guide subsequent aircraft to the target, thereby completing the present invention.
  • Specifically, the purpose of the present invention is to provide the following: a networking control method for laser terminal-guided aircraft, the method comprising the following steps:
  • Step 1: launch at least two aircraft 2 towards the target area via the launch unit 1, the first aircraft arriving at the target area at least 5–10 seconds earlier than the other aircraft;
  • Step 2: capture radar wave signals via the radar signal receiving module 21 mounted on the first aircraft, and obtain the position of the radar transmitting vehicle accordingly;
  • Step 3: control the observation drone 3 to cruise over the target area in real time, and search for targets by photographing the target area in real time with its onboard camera 31;
  • Step 4: illuminate the target with the laser irradiator 32 mounted on the observation drone 3.
  • The camera 31 on the observation drone 3 photographs the target before and after the aircraft lands and transmits the photos to the command unit 4, which then judges the aircraft's point of impact and the damage to the target.
  • In step 2, the first aircraft sends the obtained position of the radar transmitting vehicle to the observation drone 3 so that the observation drone can find and lock onto the target.
  • After obtaining the position of the radar transmitting vehicle, the first aircraft steers itself towards the radar transmitting vehicle;
  • at least one of the other aircraft is guided by a laser irradiator 32 to fly towards the radar transmitting vehicle.
  • When two or more targets are found, each target is illuminated by its own laser irradiator 32, and the individual laser irradiators 32 emit illuminating lasers of different frequencies.
  • The command unit 4 computes accurate countdown information in real time and, based on this countdown, controls the laser irradiator 32 to emit the illuminating laser 1–3 seconds before the aircraft enters the terminal guidance phase.
  • Step 3 comprises the following sub-steps:
  • Sub-step 1: the observation drone 3 continuously obtains photos of the target area through the camera 31 while moving;
  • Sub-step 2: preprocess the photos obtained by the camera 31;
  • Sub-step 3: convert the preprocessed image into a grayscale image;
  • Sub-step 4: establish a transformation model from the grayscale images; the transformation model converts the earlier of two adjacent frames into a matching image whose background is the same as the background of the current frame;
  • Sub-step 5: compute the target optical flow field from the matching image and the current frame, and thereby determine the target.
  • Establishing the transformation model comprises the following sub-sub-steps:
  • Sub-sub-step a: establish the transformation model as equation (1): x′ = a·x + b·y + c, y′ = d·x + e·y + f, where
  • x′ represents the X-axis coordinate of a point in the matching image;
  • y′ represents the Y-axis coordinate of a point in the matching image;
  • x represents the X-axis coordinate of a point in the previous frame image;
  • y represents the Y-axis coordinate of a point in the previous frame image;
  • a, b, c, d, e and f all represent conversion parameters.
  • Sub-sub-step b: retrieve the current frame image and the previous frame image, and divide both frames by the same method into multiple sub-blocks that do not completely overlap.
  • Sub-sub-step c: find, among the sub-blocks of the previous frame image, the best matching block for each sub-block of the current frame image;
  • (x_i, y_i) represents the center coordinates of the i-th sub-block in the current frame image;
  • (x′_i, y′_i) represents the center coordinates of the best matching block of the i-th sub-block in the previous frame image.
  • Sub-sub-step d: use the least squares method to solve for the conversion parameters of equation (1), as shown in equation (2): min over (a, b, c, d, e, f) of Σ_{i=1..N} [(x′_i − a·x_i − b·y_i − c)² + (y′_i − d·x_i − e·y_i − f)²], where
  • N represents the number of sub-blocks into which the current frame image is divided.
  • In sub-sub-step c, a sub-block of the current frame image is selected arbitrarily, and equation (3) is used to compute, one by one for each sub-block of the previous frame image, the sum of the absolute values of the grayscale differences over all pixels between that sub-block and the selected current-frame sub-block;
  • the previous-frame sub-block giving the smallest sum is selected as the best matching block: D = Σ_{m=1..p} Σ_{n=1..q} |I_current(m, n) − I_match(m, n)|    (3), where
  • I_current(m, n) represents the gray value of the pixel at position (m, n) in the current-frame image sub-block;
  • I_match(m, n) represents the gray value of the pixel at position (m, n) in the previous-frame image sub-block;
  • p represents the number of pixels of a sub-block in the X-axis direction;
  • q represents the number of pixels of a sub-block in the Y-axis direction.
  • In sub-step 5, the minimum of the energy function is obtained via equation (4): min(E(p)) = min(E_m + E_s), where E(p) represents the energy function over the matching image and the current frame image;
  • E_m represents the optical flow constraint term: E_m = ∬_Ω (f_x·u + f_y·v + f_t)² dx dy;
  • E_s represents the smoothness constraint term: E_s = α·∬_Ω ((∂u/∂x)² + (∂u/∂y)² + (∂v/∂x)² + (∂v/∂y)²) dx dy;
  • Ω represents the whole region of the current frame image;
  • the function f represents the position (x, y) of any pixel in the image at a given instant; f_x represents the partial derivative of f in the X-axis direction; f_y represents the partial derivative of f in the Y-axis direction; f_t represents the partial derivative of f with respect to time t;
  • u represents the velocity component of any pixel in the image in the X-axis direction;
  • v represents the velocity component of any pixel in the image in the Y-axis direction;
  • dx denotes the differential symbol;
  • α is a positive number representing the weight of the smoothness constraint term E_s.
  • The position of the radar transmitting vehicle in the target area is captured by using the first aircraft as a lure aircraft;
  • through cooperation between the observation drone and the first aircraft, the target position can be found and locked accurately and in a timely manner after the target moves, and the guiding laser then controls subsequent aircraft to fly to the target.
  • Fig. 1 shows the overall logic diagram of a networking control method for laser terminal-guided aircraft according to a preferred embodiment of the present invention;
  • Fig. 2 shows a schematic diagram of the signal connections between the components in a networking control method for laser terminal-guided aircraft according to a preferred embodiment of the present invention;
  • Fig. 3 shows a schematic diagram of the motion trajectories in an embodiment of the present invention;
  • Fig. 4 shows a partial enlargement of Fig. 3.
  • In practice, the targets engaged by the aircraft are often hidden under bunkers or camouflage, and finding and locking onto them is difficult.
  • Once a target enters its working state, however, different targets exhibit different stress responses.
  • A radar vehicle emits radar signals when it enters its working state.
  • When the target is a command vehicle or an interceptor vehicle, upon entering its working state it maneuvers continuously at high speed or changes station at predetermined intervals; a target is easier to detect while transitioning from stationary to moving or from moving to stationary, and a radar vehicle is likewise easier to detect while emitting detection radar.
  • In view of this real-time behavior, the present invention provides a networking control method for laser terminal-guided aircraft. As shown in Fig. 1, the method comprises the following steps:
  • Step 1: launch at least two aircraft 2 towards the target area via the launch unit 1, the first aircraft arriving at the target area at least 5–10 seconds earlier than the other aircraft;
  • Step 2: capture radar wave signals via the radar signal receiving module 21 mounted on the first aircraft, and obtain the position of the radar transmitting vehicle accordingly;
  • Step 3: control the observation drone 3 to cruise over the target area in real time, and search for targets by photographing the target area in real time with its onboard camera 31;
  • Step 4: illuminate the target with the laser irradiator 32 mounted on the observation drone 3.
  • The target area referred to in this application is the larger area in which the target may be present, generally a fan-shaped area of 3×10 to 3×20 km².
  • The aircraft may be launched at predetermined intervals, or launched simultaneously with their arrival times at the target area staggered by adjusting their respective flight speeds.
  • Preferably, the first aircraft arrives at the target area 5 seconds earlier than the second. When the first aircraft reaches the target area, it is very likely to be discovered by the radar vehicle there; its discovery triggers a chain reaction in which enemy targets such as interceptor vehicles and command vehicles are very likely to start moving, which makes it easier for the observation drone to find them.
  • The radar signal receiving module may be the one described in Zhang Jiaoyun, Research on Modeling and Simulation of Monopulse Radar Seeker [D], Xi'an: Xidian University, 2006, which locates the radar transmitting vehicle by receiving its radar signal.
  • The observation drone may begin cruising over the target area before the aircraft are launched. Because the observation drone is small, it is difficult for the radar transmitting vehicle to detect it; conversely, because the drone cannot fly too close to the ground, it is difficult for the drone to find targets that are stationary and camouflaged.
  • Said step 3 comprises the following sub-steps:
  • Sub-step 1: the observation drone 3 continuously obtains photos of the target area through the camera 31 while moving;
  • Sub-step 2: preprocess the photos obtained by the camera 31; specifically, reduce random noise by median filtering and enhance image sharpness by image sharpening (a Python preprocessing sketch follows sub-step 5 below);
  • Sub-step 3: convert the preprocessed image into a grayscale image;
  • Sub-step 4: establish a transformation model from the grayscale images; the transformation model converts the earlier of two adjacent frames into a matching image whose background is the same as the background of the current frame;
  • Sub-step 5: compute the target optical flow field from the matching image and the current frame, and thereby determine the target.
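As a concrete illustration of sub-steps 2 and 3, the sketch below shows one plausible preprocessing chain in Python with OpenCV; the 5×5 median aperture and the sharpening kernel are assumed values chosen for illustration, since the patent names the operations but not their parameters.

```python
import cv2
import numpy as np

def preprocess(frame_bgr: np.ndarray) -> np.ndarray:
    """Median-filter, sharpen, and gray-convert one camera frame (sub-steps 2-3)."""
    # Sub-step 2a: median filtering suppresses random (salt-and-pepper) noise.
    # The 5x5 aperture is an assumption; the patent does not specify a size.
    denoised = cv2.medianBlur(frame_bgr, 5)
    # Sub-step 2b: a common Laplacian-style kernel sharpens edges.
    sharpen_kernel = np.array([[ 0, -1,  0],
                               [-1,  5, -1],
                               [ 0, -1,  0]], dtype=np.float32)
    sharpened = cv2.filter2D(denoised, -1, sharpen_kernel)
    # Sub-step 3: convert the preprocessed image to a grayscale image.
    return cv2.cvtColor(sharpened, cv2.COLOR_BGR2GRAY)
```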
  • Establishing the transformation model comprises the following sub-sub-steps:
  • Sub-sub-step a: establish the transformation model as equation (1): x′ = a·x + b·y + c, y′ = d·x + e·y + f, where
  • x′ represents the X-axis coordinate of a point in the matching image;
  • y′ represents the Y-axis coordinate of a point in the matching image;
  • x represents the X-axis coordinate of a point in the previous frame image;
  • y represents the Y-axis coordinate of a point in the previous frame image;
  • a, b, c, d, e and f all represent conversion parameters.
  • Sub-sub-step b: retrieve the current frame image and the previous frame image, and divide both frames by the same rule into multiple partially overlapping sub-blocks.
  • Sub-sub-step c: find, among the sub-blocks of the previous frame image, the best matching block for each sub-block of the current frame image;
  • (x_i, y_i) represents the center coordinates of the i-th sub-block in the current frame image, and (x′_i, y′_i) represents the center coordinates of the best matching block of the i-th sub-block in the previous frame image.
  • Sub-sub-step d: use the least squares method to solve for the conversion parameters of equation (1), as shown in equation (2): min over (a, b, c, d, e, f) of Σ_{i=1..N} [(x′_i − a·x_i − b·y_i − c)² + (y′_i − d·x_i − e·y_i − f)²], where N represents the number of sub-blocks into which the current frame image is divided.
  • The six parameters affect one another, so the combination of each parameter's individually optimal value is not the global optimum; equation (2) is therefore optimized iteratively with the aid of a computer.
  • One solution method is to enumerate many parameter tuples (a, b, c, d, e, f) over the global range, substitute them into equation (2), and take the tuple giving the smallest output value as the optimal solution.
  • Once the optimal solution is substituted into it, equation (1) can be used to convert the previous frame image into the matching image.
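Because equation (2) is linear in the six parameters, it also admits a direct least-squares solve; the sketch below uses numpy's lstsq instead of the enumeration described above (function and variable names are mine, not the patent's):

```python
import numpy as np

def fit_affine(centers_cur: np.ndarray, centers_match: np.ndarray):
    """Solve equation (2): fit x' = a*x + b*y + c and y' = d*x + e*y + f.

    centers_cur   -- (N, 2) centers (x_i, y_i) of the current-frame sub-blocks
    centers_match -- (N, 2) centers (x'_i, y'_i) of their best matching blocks
    Returns the conversion parameters (a, b, c, d, e, f).
    """
    n = centers_cur.shape[0]
    A = np.hstack([centers_cur, np.ones((n, 1))])  # rows [x_i, y_i, 1]
    # The two linear systems share the design matrix A, so solve them separately.
    (a, b, c), *_ = np.linalg.lstsq(A, centers_match[:, 0], rcond=None)
    (d, e, f), *_ = np.linalg.lstsq(A, centers_match[:, 1], rcond=None)
    return a, b, c, d, e, f
```

The fitted parameters, substituted into equation (1), warp every pixel coordinate of the previous frame into the matching image.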
  • The method for dividing the sub-blocks is as follows: obtain the total pixel count P×Q of the image, i.e. the rectangular image has P pixels in the X-axis direction and Q pixels in the Y-axis direction.
  • The sub-blocks are also rectangular image blocks, each with P/10 pixels in the X-axis direction and Q/10 pixels in the Y-axis direction.
  • The lower-right pixel of the first sub-block coincides with the lower-right pixel of the current frame image (or previous frame image); the lower-right pixel of the second sub-block is offset from that of the first by P/1000 pixels in the X-axis direction and/or Q/1000 pixels in the Y-axis direction;
  • the lower-right pixel of the third sub-block is offset from that of the second by P/1000 pixels in the X-axis direction and/or Q/1000 pixels in the Y-axis direction; continuing by this rule, all sub-blocks satisfying the condition are selected.
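A sketch of this division rule follows; it assumes P and Q are multiples of 10 and at least 1000 (the patent leaves rounding at the borders unspecified), and it anchors each block by its lower-right pixel exactly as described:

```python
import numpy as np

def divide_subblocks(gray: np.ndarray):
    """Enumerate (x0, y0, x1, y1) sub-block boxes of a grayscale frame.

    gray has Q rows (Y axis) and P columns (X axis). Each sub-block is
    P/10 x Q/10 pixels; successive lower-right anchors step by P/1000
    (X) and Q/1000 (Y), producing heavily overlapping blocks.
    """
    Q, P = gray.shape
    bw, bh = P // 10, Q // 10                      # block size along X, Y
    sx, sy = max(P // 1000, 1), max(Q // 1000, 1)  # anchor step (at least 1 px)
    boxes = []
    for y1 in range(Q, bh - 1, -sy):      # lower-right anchor, walking up
        for x1 in range(P, bw - 1, -sx):  # and to the left
            boxes.append((x1 - bw, y1 - bh, x1, y1))
    return boxes
```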
  • In sub-sub-step c, a sub-block of the current frame image is selected arbitrarily, and equation (3) is used to compute, one by one for each sub-block of the previous frame image,
  • the sum of the absolute values of the grayscale differences over all pixels between that sub-block and the selected current-frame sub-block; the previous-frame sub-block giving the smallest value is selected as the best matching block: D = Σ_{m=1..p} Σ_{n=1..q} |I_current(m, n) − I_match(m, n)|    (3), where
  • I_current(m, n) represents the gray value of the pixel at position (m, n) in the current-frame image sub-block (the current block), and I_match(m, n) represents the gray value of the pixel at position (m, n) in the previous-frame image sub-block (the matching block);
  • p represents the number of pixels of a sub-block in the X-axis direction;
  • q represents the number of pixels of a sub-block in the Y-axis direction.
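Equation (3) is a sum-of-absolute-differences (SAD) criterion, and the search it describes is exhaustive; a minimal sketch (names are mine):

```python
import numpy as np

def sad(block_cur: np.ndarray, block_prev: np.ndarray) -> int:
    """Equation (3): sum of absolute grayscale differences over a p x q block."""
    return int(np.abs(block_cur.astype(np.int32)
                      - block_prev.astype(np.int32)).sum())

def best_match(block_cur: np.ndarray, prev_gray: np.ndarray, boxes):
    """Return the previous-frame sub-block box minimizing equation (3)."""
    best_box, best_cost = None, float("inf")
    for x0, y0, x1, y1 in boxes:  # candidate previous-frame sub-blocks
        cost = sad(block_cur, prev_gray[y0:y1, x0:x1])
        if cost < best_cost:
            best_box, best_cost = (x0, y0, x1, y1), cost
    return best_box
```

The centers ((x0+x1)/2, (y0+y1)/2) of each current-frame block and its best match are exactly the pairs (x_i, y_i), (x′_i, y′_i) fed into equation (2).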
  • In sub-step 5, the minimum of the energy function expression is obtained via equation (4): min(E(p)) = min(E_m + E_s).
  • E(p) represents the energy function over the matching image and the current frame image;
  • both E_m and E_s are obtained by integrating the values at every point of the image;
  • E_m represents the optical flow constraint term, whose purpose is to make the image sequence satisfy the constant-brightness optical flow constraint: E_m = ∬_Ω (f_x·u + f_y·v + f_t)² dx dy;
  • E_s represents the smoothness constraint term, whose purpose is to keep the optical flow field of the image sequence globally smooth: E_s = α·∬_Ω (u_x² + u_y² + v_x² + v_y²) dx dy, where u_x, u_y, v_x and v_y are the partial derivatives of u and v in the X- and Y-axis directions;
  • Ω represents the whole region of the current frame/matching image;
  • the function f represents the position (x, y) of any pixel in the image at a given instant;
  • f_x represents the partial derivative of the function f in the X-axis direction;
  • f_y represents the partial derivative of the function f in the Y-axis direction;
  • f_t represents the partial derivative of the function f with respect to time t;
  • u represents the velocity component of any pixel in the image in the X-axis direction;
  • v represents the velocity component of any pixel in the image in the Y-axis direction;
  • dx denotes the differential symbol;
  • α is a positive number representing the weight of the smoothness constraint term E_s; the smaller its value, the more complex the corresponding optical flow field.
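Equation (4), with a brightness-constancy term E_m and a weighted smoothness term E_s, is the classical Horn–Schunck variational problem. The sketch below applies the standard Horn–Schunck iteration to the matching image and the current frame; the update rule, the neighbourhood-average kernel, and the iteration count are the textbook scheme, an assumption on my part since the patent does not spell out the solver:

```python
import numpy as np
from scipy.ndimage import convolve

def optical_flow(match_gray: np.ndarray, cur_gray: np.ndarray,
                 alpha: float = 10.0, n_iter: int = 100):
    """Minimize E_m + E_s (equation (4)) for the flow field (u, v)."""
    I0 = match_gray.astype(np.float32)
    I1 = cur_gray.astype(np.float32)
    # Derivatives f_x, f_y, f_t of the image function f.
    fx = np.gradient(I0, axis=1)
    fy = np.gradient(I0, axis=0)
    ft = I1 - I0
    u = np.zeros_like(I0)
    v = np.zeros_like(I0)
    avg = np.array([[0, 0.25, 0], [0.25, 0, 0.25], [0, 0.25, 0]], np.float32)
    for _ in range(n_iter):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        # Horn-Schunck update from the Euler-Lagrange equations of (4);
        # the classical form weights the smoothness term by alpha**2.
        common = (fx * u_bar + fy * v_bar + ft) / (alpha**2 + fx**2 + fy**2)
        u = u_bar - fx * common
        v = v_bar - fy * common
    return u, v
```

Because the matching image already compensates the camera's own motion, pixels with a large residual flow magnitude √(u² + v²) indicate moving targets.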
  • The method further comprises step 5: photograph the target before and after the aircraft lands with the camera 31 on the observation drone 3, and transmit the photos to the command unit 4, which then judges the aircraft's point of impact and the damage to the target.
  • Specifically, the camera 31 photographs the target in real time, and the observation drone 3 sends the target photos to the computing module of the command unit in real time.
  • The computing module evaluates the damage effect from the degree of change in the pixel grayscale of the target photos taken before and after the aircraft lands.
  • The pixel values of the target photo after landing are those of a photo taken 10 to 15 seconds after the aircraft lands, preferably 12 seconds after.
  • The camera 31 photographs the target continuously starting 10 seconds after the aircraft lands; a target photo is a photo of a circular area 3 to 5 meters in diameter containing the target.
  • The camera 31 can also judge directly from the target photos whether the target moves: if the target moves, the damage effect can be deemed short of expectations; if the target does not move, the target photo taken 12 seconds after landing is collected for further analysis and evaluation.
  • The further analysis and evaluation proceed as follows: first, the grayscale change of the target photo is solved via equation (5): H_b = (1/N_b) · Σ_x |P_t0(x) − P_t1(x)|, where
  • P_t0 is the pixel value of the target photo before the aircraft lands;
  • P_t1 is the pixel value of the target photo after the aircraft lands;
  • N_b is the number of pixels in the target photo;
  • H_b is the mean grayscale change of the target photo.
  • H_b is used as the criterion for judging the degree of grayscale change of the photo's pixels: for each pixel x of the target photo, if |P_t0(x) − P_t1(x)| ≥ H_b, that pixel is judged to belong to the damaged part of the target. The total number of such pixels is S_HS, and the damage effect is evaluated as S = S_HS/N_b. When S ≥ 80%, the aircraft is judged to have met the damage requirement for the target,
  • and the command unit displays the target damage effect value.
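Equation (5) and the thresholding rule translate directly into a few lines of numpy; a minimal sketch, assuming the before/after target photos are registered grayscale arrays of identical shape:

```python
import numpy as np

def damage_effect(photo_before: np.ndarray, photo_after: np.ndarray) -> float:
    """Return S = S_HS / N_b, the fraction of pixels judged damaged."""
    diff = np.abs(photo_before.astype(np.float32)
                  - photo_after.astype(np.float32))
    h_b = diff.mean()                # equation (5): mean grayscale change H_b
    s_hs = int((diff >= h_b).sum())  # pixels whose change reaches H_b
    return s_hs / diff.size          # N_b is the total pixel count

# Acceptance rule from the text: damage requirement met when S >= 0.8.
```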
  • In step 2, the first aircraft sends the obtained position of the radar transmitting vehicle to the observation drone 3 so that the observation drone can find and lock onto the target.
  • The observation drone 3 determines the position of the radar transmitting vehicle in its photos through coordinate transformation, and can then quickly lock onto the target.
  • After obtaining the position of the radar transmitting vehicle, the first aircraft steers itself towards that vehicle; since the radar transmitting vehicle has already discovered the first aircraft, the probability that the first aircraft will be intercepted is relatively high.
  • At least one of the other aircraft flies to the radar transmitting vehicle, i.e. the radar transmitting vehicle is treated as a high-priority strike target.
  • Each target is illuminated by its own laser irradiator 32, and the individual laser irradiators 32 emit illuminating lasers of different frequencies.
  • The observation drone 3 sends the captured target information to the command unit 4 in real time; based on the target information and the target damage status, the command unit can add aircraft on the fly, ordering the launch unit to launch more aircraft towards the target area.
  • A laser encoder is pre-stored in each aircraft; it can randomly select from a family of pseudo-random frequencies and control the laser irradiator 32 to illuminate the target with laser light of the selected frequency.
  • The pseudo-random frequency family simultaneously reduces the probability that the target detects the laser signal and the probability that the laser signal is actively jammed.
  • The aircraft's laser seeker is equipped with a laser frequency decoder that computes the frequency emitted by the laser irradiator from the same coding rule, so that the seeker can capture the guiding laser in time and complete the laser terminal guidance.
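One plausible realization of the shared coding rule is for the irradiator's encoder and the seeker's decoder to derive the illumination frequency from the same seeded pseudo-random generator, so no radio link is needed at the terminal moment. The frequency table, seed handling, and names below are all illustrative assumptions, not details from the patent:

```python
import random

PRF_TABLE_HZ = [10.0, 12.5, 15.0, 17.5, 20.0]  # assumed candidate pulse frequencies

def illumination_frequency(shared_seed: int, salvo_index: int) -> float:
    """Run by both the laser encoder and the seeker's decoder: the same
    seed and salvo index always yield the same pseudo-random frequency."""
    rng = random.Random(shared_seed * 10_007 + salvo_index)
    return rng.choice(PRF_TABLE_HZ)
```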
  • The command unit 4 computes accurate countdown information in real time and, based on it, controls the laser irradiator 32 to emit the illuminating laser 1–3 seconds before the aircraft enters the terminal guidance phase.
  • The command unit 4 computes the countdown from the target position information and the aircraft's position and speed information. Preferably, the aircraft enters the terminal guidance phase exactly 1 second after the countdown ends, at which point its laser seeker starts working; the laser irradiator 32 on the observation drone 3 starts working at the same moment, so the aircraft captures the target position information just in time and is controlled to fly to the target.
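A back-of-the-envelope version of the countdown the command unit solves, assuming straight-line flight at constant speed (the geometry simplification and all names are mine):

```python
def laser_on_delay(dist_to_target_m: float, speed_mps: float,
                   terminal_range_m: float, lead_s: float = 1.0) -> float:
    """Seconds from now until the irradiator should switch on.

    The aircraft enters the terminal guidance phase at terminal_range_m
    from the target; the laser is switched on lead_s seconds (1-3 s in
    the text) before that moment.
    """
    time_to_terminal = (dist_to_target_m - terminal_range_m) / speed_mps
    return max(time_to_terminal - lead_s, 0.0)
```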
  • Three aircraft are launched from the launch unit towards a target area 20 km away; the first aircraft arrives at the target area at least 5 seconds earlier than the other two, and the second and third aircraft arrive at essentially the same time. The aircraft's effective flight range is known to be 25 km. The launch unit comprises three launch vehicles, and the three aircraft are launched by the three launch vehicles respectively.
  • The radar signal receiving module mounted on the first aircraft captures a radar wave signal on entering the target area and obtains the position of the radar transmitting vehicle accordingly; 3 seconds later, the interceptor vehicle in the target area launches an interceptor and starts to move, and the radar transmitting vehicle also starts to move.
  • The observation drone photographs the target area in real time through its onboard camera; after the radar transmitting vehicle and the interceptor vehicle move, it finds their positions and, 1 second before the second and third aircraft enter the terminal guidance phase, emits illuminating lasers of different frequencies at the two targets respectively, guiding the second and third aircraft to them.
  • The motion trajectories of the first, second and third aircraft, the radar transmitting vehicle, and the interceptor vehicle are shown in Figures 3 and 4, where Figure 4 is a partial enlargement of Figure 3 that mainly shows the trajectories of the three aircraft as they approach impact and the trajectories of the two targets. As can be seen from the figures, the first aircraft was intercepted, the second hit the radar transmitting vehicle, and the third hit the interceptor vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A networking control method for laser terminal-guided aircraft, the method comprising the following steps: Step 1, launching at least two aircraft (2) towards the target area via a launch unit (1), the first aircraft (2) arriving at the target area at least 5–10 seconds earlier than the other aircraft (2); Step 2, capturing radar wave signals via a radar signal receiving module (21) mounted on the first aircraft (2), and obtaining the position of the radar transmitting vehicle accordingly; Step 3, controlling an observation drone (3) to cruise over the target area in real time, and searching for targets by photographing the target area in real time with a camera (31) mounted thereon; Step 4, illuminating the target with a laser irradiator (32) mounted on the observation drone (3); Step 5, photographing the target before and after the aircraft (2) lands with the camera (31) on the observation drone (3) and transmitting the photos to a command unit (4), so as to judge the point of impact of the aircraft (2) and the damage to the target.

Description

Networking control method for laser terminal-guided aircraft

Technical Field

The present invention relates to a control method for laser terminal-guided aircraft, and in particular to a networking control method for laser terminal-guided aircraft.

Background Art

The basic working principle of a laser-guided aircraft is as follows: in the terminal phase of the trajectory, a laser irradiator begins to illuminate the target, and the laser detector on the munition detects in real time the laser signal diffusely reflected by the target; once the target enters the detector's field of view, the laser detector uses the signal of the target's deviation from the center of the field of view to control the corresponding pulse engines or steering gear and correct the flight trajectory, achieving a precise strike on the target and greatly improving the aircraft's point-target kill capability.

To achieve a given objective, multiple aircraft often need to work together, assisting one another and acting synergistically to form an information networking and transmission system.

In the actual control process, one particularly important task is to obtain, quickly and accurately, the position of the target that is to be illuminated by the laser; only once accurate position information has been determined can subsequent operations proceed smoothly. In the prior art, schemes in which the networked platforms assist one another to quickly obtain target position information are still immature, and it is difficult to obtain the desired target position information quickly and accurately.
For the above reasons, the present inventors have conducted in-depth research on existing networking control methods, with the aim of designing a new networking control method for laser terminal-guided aircraft that solves the above problems.

Summary of the Invention

To overcome the above problems, the inventors have carried out diligent research and designed a networking control method for laser terminal-guided aircraft, in which a lure aircraft first enters the target area and induces the targets there to start working and move; an observation drone cruising over the target area then cooperates with the lure aircraft to obtain target position information in a timely and accurate manner, after which the observation drone emits an illuminating laser to guide subsequent aircraft to the target. The present invention has thereby been completed.

Specifically, the object of the present invention is to provide the following: a networking control method for laser terminal-guided aircraft, the method comprising the following steps:

Step 1: launch at least two aircraft 2 towards the target area via the launch unit 1, the first aircraft arriving at the target area at least 5–10 seconds earlier than the other aircraft;

Step 2: capture radar wave signals via the radar signal receiving module 21 mounted on the first aircraft, and obtain the position of the radar transmitting vehicle accordingly;

Step 3: control the observation drone 3 to cruise over the target area in real time, and search for targets by photographing the target area in real time with its onboard camera 31;

Step 4: illuminate the target with the laser irradiator 32 mounted on the observation drone 3.
Wherein the camera 31 on the observation drone 3 photographs the target before and after the aircraft lands and transmits the photos to the command unit 4, which then judges the aircraft's point of impact and the damage to the target.

Wherein, in step 2, the first aircraft sends the obtained position of the radar transmitting vehicle to the observation drone 3 so that the observation drone can find and lock onto the target.

Wherein, after obtaining the position of the radar transmitting vehicle, the first aircraft steers itself towards the radar transmitting vehicle;

and at least one of the other aircraft flies towards the radar transmitting vehicle under the guidance of a laser irradiator 32.

Wherein, when two or more targets are found in step 3, each target is illuminated in step 4 by its own laser irradiator 32, and the individual laser irradiators 32 emit illuminating lasers of different frequencies.

Wherein the command unit 4 computes accurate countdown information in real time and, based on this countdown, controls the laser irradiator 32 to emit the illuminating laser 1–3 seconds before the aircraft enters the terminal guidance phase.
Wherein said step 3 comprises the following sub-steps:

Sub-step 1: the observation drone 3 continuously obtains photos of the target area through the camera 31 while moving;

Sub-step 2: preprocess the photos obtained by the camera 31;

Sub-step 3: convert the preprocessed image into a grayscale image;

Sub-step 4: establish a transformation model from the grayscale images, the transformation model being used to convert the earlier of two adjacent frames into a matching image whose background is the same as that of the current frame;

Sub-step 5: compute the target optical flow field from the matching image and the current frame, and thereby determine the target.
Wherein establishing the transformation model comprises the following sub-sub-steps:

Sub-sub-step a: establish the transformation model as equation (1):

x′ = a·x + b·y + c,  y′ = d·x + e·y + f    (1)

where x′ represents the X-axis coordinate of a point in the matching image and y′ represents the Y-axis coordinate of a point in the matching image;

x represents the X-axis coordinate of a point in the previous frame image and y represents the Y-axis coordinate of a point in the previous frame image;

a, b, c, d, e and f all represent conversion parameters.

Sub-sub-step b: retrieve the current frame image and the previous frame image, and divide both frames by the same method into multiple sub-blocks that do not completely overlap.

Sub-sub-step c: find, among the sub-blocks of the previous frame image, the best matching block for each sub-block of the current frame image; (x_i, y_i) represents the center coordinates of the i-th sub-block in the current frame image, and (x′_i, y′_i) represents the center coordinates of the best matching block of the i-th sub-block in the previous frame image.

Sub-sub-step d: use the least squares method to solve for the conversion parameters of equation (1), as shown in equation (2):

min over (a, b, c, d, e, f) of Σ_{i=1..N} [ (x′_i − a·x_i − b·y_i − c)² + (y′_i − d·x_i − e·y_i − f)² ]    (2)

where N represents the number of sub-blocks into which the current frame image is divided.

Wherein, in said sub-sub-step c, a sub-block of the current frame image is selected arbitrarily, and equation (3) is used to compute, one by one for each sub-block of the previous frame image, the sum of the absolute values of the grayscale differences over all pixels between that sub-block and the selected current-frame sub-block; the previous-frame sub-block giving the smallest value is selected as the best matching block:

D = Σ_{m=1..p} Σ_{n=1..q} | I_current(m, n) − I_match(m, n) |    (3)

where I_current(m, n) represents the gray value of the pixel at position (m, n) in the current-frame image sub-block, and I_match(m, n) represents the gray value of the pixel at position (m, n) in the previous-frame image sub-block; p represents the number of pixels of a sub-block in the X-axis direction, and q represents the number of pixels of a sub-block in the Y-axis direction.

After the best matching block of one current-frame sub-block has been determined, another current-frame sub-block is selected and equation (3) is applied again, until best matching blocks have been found for all sub-blocks of the current frame image.
Wherein, in sub-step 5, the minimum of the energy function expression is obtained via equation (4):

min(E(p)) = min(E_m + E_s)    (4)

where E(p) represents the energy function over the matching image and the current frame image,

E_m represents the optical flow constraint term:

E_m = ∬_Ω (f_x·u + f_y·v + f_t)² dx dy

E_s represents the smoothness constraint term:

E_s = α·∬_Ω ((∂u/∂x)² + (∂u/∂y)² + (∂v/∂x)² + (∂v/∂y)²) dx dy

where Ω represents the whole region of the current frame image;

the function f represents the position (x, y) of any pixel in the image at a given instant; f_x represents the partial derivative of f in the X-axis direction; f_y represents the partial derivative of f in the Y-axis direction; f_t represents the partial derivative of f with respect to time t;

u represents the velocity component of any pixel in the image in the X-axis direction, and v represents the velocity component of any pixel in the image in the Y-axis direction;

dx denotes the differential symbol; α is a positive number representing the weight of the smoothness constraint term E_s.
The beneficial effects of the present invention include the following:

(1) According to the networking control method for laser terminal-guided aircraft provided by the present invention, the first aircraft serves as a lure aircraft to capture the position of the radar transmitting vehicle in the target area;

(2) According to the networking control method for laser terminal-guided aircraft provided by the present invention, through cooperation between the observation drone and the first aircraft, the target position is found and locked accurately and in a timely manner after the target moves, and the guiding laser then controls subsequent aircraft to fly to the target.

Brief Description of the Drawings

Fig. 1 shows the overall logic diagram of a networking control method for laser terminal-guided aircraft according to a preferred embodiment of the present invention;

Fig. 2 shows a schematic diagram of the signal connections between the components in a networking control method for laser terminal-guided aircraft according to a preferred embodiment of the present invention;

Fig. 3 shows a schematic diagram of the motion trajectories in an embodiment of the present invention;

Fig. 4 shows a partial enlargement of Fig. 3.

Reference numerals:

1 - launch unit
2 - aircraft
21 - radar signal receiving module
3 - observation drone
31 - camera
32 - laser irradiator
4 - command unit
Detailed Description of the Invention

The present invention is described in further detail below with reference to the drawings and embodiments. Through these descriptions, its features and advantages will become clearer.

The word "exemplary" is used here to mean "serving as an example, embodiment, or illustration". Any embodiment described here as "exemplary" is not necessarily to be construed as preferable to or better than other embodiments. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless specifically indicated.

In practice, the targets engaged by the aircraft are usually hidden under bunkers or camouflage, and finding and locking onto them is difficult. Once a target enters its working state, however, different targets exhibit different stress responses: a radar vehicle emits radar signals when it starts working, while a command vehicle or interceptor vehicle, once working, maneuvers continuously at high speed or changes station at predetermined intervals. A target is easier to detect while transitioning from stationary to moving or from moving to stationary, and a radar vehicle is likewise easier to detect while emitting detection radar. For this real-time situation, the present invention provides a networking control method for laser terminal-guided aircraft; as shown in Fig. 1, the method comprises the following steps:

Step 1: launch at least two aircraft 2 towards the target area via the launch unit 1, the first aircraft arriving at the target area at least 5–10 seconds earlier than the other aircraft;

Step 2: capture radar wave signals via the radar signal receiving module 21 mounted on the first aircraft, and obtain the position of the radar transmitting vehicle accordingly;

Step 3: control the observation drone 3 to cruise over the target area in real time, and search for targets by photographing the target area in real time with its onboard camera 31;

Step 4: illuminate the target with the laser irradiator 32 mounted on the observation drone 3.

The target area referred to in this application is the larger area in which the target may be present, generally a fan-shaped area of 3×10 to 3×20 km².

The multiple aircraft may be launched at predetermined intervals, or launched simultaneously with their arrival times at the target area staggered by adjusting their respective flight speeds. Preferably, the first aircraft arrives at the target area 5 seconds earlier than the second. When an aircraft reaches the target area, it is very likely to be discovered by the radar vehicle there; its discovery triggers a chain reaction in which enemy targets such as interceptor vehicles and command vehicles are very likely to start moving, which makes it easier for the observation drone to find them.

Preferably, the radar signal receiving module may be the one described in Zhang Jiaoyun, Research on Modeling and Simulation of Monopulse Radar Seeker [D], Xi'an: Xidian University, 2006, which can locate the radar transmitting vehicle by receiving its radar signal.

The observation drone may begin cruising over the target area before the aircraft are launched. Because the observation drone is small, it is difficult for the radar transmitting vehicle to detect it; and because the observation drone cannot fly too close to the ground, it is difficult for the drone to find targets that are stationary and camouflaged.

By using the first aircraft to lure the targets into starting work and moving, favorable conditions are created for the observation drone to find the targets, and cooperation with the subsequent aircraft then yields a good strike effect.
In a preferred embodiment, said step 3 comprises the following sub-steps:

Sub-step 1: the observation drone 3 continuously obtains photos of the target area through the camera 31 while moving;

Sub-step 2: preprocess the photos obtained by the camera 31; specifically, reduce random noise by median filtering and enhance image sharpness by image sharpening;

Sub-step 3: convert the preprocessed image into a grayscale image;

Sub-step 4: establish a transformation model from the grayscale images, the transformation model being used to convert the earlier of two adjacent frames into a matching image whose background is the same as that of the current frame;

Sub-step 5: compute the target optical flow field from the matching image and the current frame, and thereby determine the target.

Preferably, establishing the transformation model comprises the following sub-sub-steps:

Sub-sub-step a: establish the transformation model as equation (1):

x′ = a·x + b·y + c,  y′ = d·x + e·y + f    (1)

where x′ represents the X-axis coordinate of a point in the matching image and y′ represents the Y-axis coordinate of a point in the matching image;

x represents the X-axis coordinate of a point in the previous frame image and y represents the Y-axis coordinate of a point in the previous frame image;

a, b, c, d, e and f all represent conversion parameters.

Sub-sub-step b: retrieve the current frame image and the previous frame image, and divide both frames by the same rule into multiple partially overlapping sub-blocks.

Sub-sub-step c: find, among the sub-blocks of the previous frame image, the best matching block for each sub-block of the current frame image; (x_i, y_i) represents the center coordinates of the i-th sub-block in the current frame image, and (x′_i, y′_i) represents the center coordinates of the best matching block of the i-th sub-block in the previous frame image.

Sub-sub-step d: use the least squares method to solve for the conversion parameters of equation (1), as shown in equation (2):

min over (a, b, c, d, e, f) of Σ_{i=1..N} [ (x′_i − a·x_i − b·y_i − c)² + (y′_i − d·x_i − e·y_i − f)² ]    (2)

where N represents the number of sub-blocks into which the current frame image is divided.

These six parameters affect one another, so the combination of each parameter's individually optimal value is not the global optimum; equation (2) is therefore optimized iteratively with the aid of a computer. There are many concrete solution methods; the simplest, though computationally slow, is to enumerate many parameter tuples (a, b, c, d, e, f) over the global range, substitute them into equation (2), and take the tuple giving the smallest output value as the optimal solution. Once the optimal solution is obtained and substituted directly into equation (1), equation (1) can be used to convert the previous frame image into the matching image.

Preferably, in sub-sub-step b, the sub-blocks are divided as follows: obtain the total pixel count P×Q of the image, i.e. the rectangular image has P pixels in the X-axis direction and Q pixels in the Y-axis direction. The sub-blocks are also rectangular image blocks, each with P/10 pixels in the X-axis direction and Q/10 pixels in the Y-axis direction. The lower-right pixel of the first sub-block coincides with the lower-right pixel of the current frame image (or previous frame image); the lower-right pixel of the second sub-block is offset from that of the first by P/1000 pixels in the X-axis direction and/or Q/1000 pixels in the Y-axis direction; the lower-right pixel of the third sub-block is offset from that of the second by P/1000 pixels in the X-axis direction and/or Q/1000 pixels in the Y-axis direction; continuing by this rule, all sub-blocks satisfying the condition are selected.

In a preferred embodiment, in said sub-sub-step c, a sub-block of the current frame image is selected arbitrarily, and equation (3) is used to compute, one by one for each sub-block of the previous frame image, the sum of the absolute values of the grayscale differences over all pixels between that sub-block and the selected current-frame sub-block; the previous-frame sub-block giving the smallest value is selected as the best matching block:

D = Σ_{m=1..p} Σ_{n=1..q} | I_current(m, n) − I_match(m, n) |    (3)

where I_current(m, n) represents the gray value of the pixel at position (m, n) in the current-frame image sub-block (the current block), I_match(m, n) represents the gray value of the pixel at position (m, n) in the previous-frame image sub-block (the matching block), p represents the number of pixels of a sub-block in the X-axis direction, and q represents the number of pixels of a sub-block in the Y-axis direction.

After the best matching block of one current-frame sub-block has been determined, another current-frame sub-block is selected and equation (3) is applied again, until best matching blocks have been found for all sub-blocks of the current frame image.
In a preferred embodiment, in sub-step 5 the minimum of the energy function expression is obtained via equation (4):

min(E(p)) = min(E_m + E_s)    (4)

where E(p) represents the energy function over the matching image and the current frame image, and both E_m and E_s are obtained by integrating the values at every point of the image;

E_m represents the optical flow constraint term, whose purpose is to make the image sequence satisfy the constant-brightness optical flow constraint:

E_m = ∬_Ω (f_x·u + f_y·v + f_t)² dx dy

E_s represents the smoothness constraint term, whose purpose is to keep the optical flow field of the image sequence globally smooth:

E_s = α·∬_Ω (u_x² + u_y² + v_x² + v_y²) dx dy

where Ω represents the whole region of the current frame/matching image; the function f represents the position (x, y) of any pixel in the image at a given instant; f_x = ∂f/∂x represents the partial derivative of the function f in the X-axis direction;

f_y = ∂f/∂y represents the partial derivative of the function f in the Y-axis direction;

f_t = ∂f/∂t represents the partial derivative of the function f with respect to time t;

u represents the velocity component of any pixel in the image in the X-axis direction, and v represents the velocity component of any pixel in the image in the Y-axis direction; dx denotes the differential symbol;

α is a positive number representing the weight of the smoothness constraint term E_s; the smaller its value, the more complex the corresponding optical flow field;

u_x = ∂u/∂x represents the partial derivative of u in the X-axis direction, u_y = ∂u/∂y represents the partial derivative of u in the Y-axis direction, v_x = ∂v/∂x represents the partial derivative of v in the X-axis direction, and v_y = ∂v/∂y represents the partial derivative of v in the Y-axis direction.
In a preferred embodiment, as shown in Fig. 1, the method further comprises step 5: photograph the target before and after the aircraft lands with the camera 31 on the observation drone 3, and transmit the photos to the command unit 4, which then judges the aircraft's point of impact and the damage to the target.

Specifically, the camera 31 photographs the target in real time, and the observation drone 3 sends the target photos to the computing module of the command unit in real time. The computing module evaluates the damage effect from the degree of change in the pixel grayscale of the target photos taken before and after the aircraft lands. Preferably, the pixel values of the target photo after landing are those of a photo taken 10–15 seconds after the aircraft lands, preferably 12 seconds after. The inventors have found that after 10–15 seconds most of the factors that interfere with photo acquisition, such as the fire and smoke caused by the aircraft's impact, have dissipated, so a recognizable photo can be obtained.

Further preferably, the camera 31 photographs the target continuously starting 10 seconds after the aircraft lands; a target photo is a photo of a circular area 3–5 meters in diameter containing the target. The camera 31 can also judge directly from the target photos whether the target moves: if the target moves, the damage effect can be deemed short of expectations; if the target does not move, the target photo taken 12 seconds after landing is collected for further analysis and evaluation. The further analysis and evaluation proceed as follows. First, the grayscale change of the target photo is solved via equation (5):

H_b = (1/N_b) · Σ_x | P_t0(x) − P_t1(x) |    (5)

where P_t0 is the pixel value of the target photo before the aircraft lands, P_t1 is the pixel value of the target photo after the aircraft lands, N_b is the number of pixels in the target photo, and H_b is the mean grayscale change of the target photo.

Next, the number of pixels in the damaged part of the target photo is counted: using H_b as the criterion for the degree of grayscale change, for each pixel x of the target photo, if |P_t0(x) − P_t1(x)| ≥ H_b, that pixel is judged to belong to the damaged part of the target; the total number of such pixels is S_HS.

The change in the number of damaged pixels before and after landing is then used to evaluate the damage effect: S = S_HS / N_b, where S represents the target damage effect.

When the target damage effect S ≥ 80%, the aircraft is judged to have met the damage requirement for the target, and the command unit displays the damage effect value.
In a preferred embodiment, in step 2 the first aircraft sends the obtained position of the radar transmitting vehicle to the observation drone 3 so that the observation drone can find and lock onto the target. The observation drone 3 determines the position of the radar transmitting vehicle in its photos through coordinate transformation, and can then quickly lock onto the target.

In a preferred embodiment, after obtaining the position of the radar transmitting vehicle, the first aircraft steers itself towards the radar transmitting vehicle; since the radar transmitting vehicle has already discovered the first aircraft, the probability that the first aircraft will be intercepted is relatively high.

At least one of the other aircraft flies to the radar transmitting vehicle under the guidance of a laser irradiator 32, i.e. the radar transmitting vehicle is treated as a high-priority strike target.

Preferably, when two or more targets are found in step 3, each target is illuminated in step 4 by its own laser irradiator 32, and the individual laser irradiators 32 emit illuminating lasers of different frequencies.

The observation drone 3 sends the captured target information to the command unit 4 in real time; based on the target information and the target damage status, the command unit can add aircraft on the fly, ordering the launch unit to launch more aircraft towards the target area.

A laser encoder is pre-stored in each aircraft; it can randomly select from a family of pseudo-random frequencies and control the laser irradiator 32 to illuminate the target with laser light of the selected frequency. The pseudo-random frequency family simultaneously reduces the probability that the target detects the laser signal and the probability that the laser signal is actively jammed. The aircraft's laser seeker is provided with a laser frequency decoder that computes the frequency emitted by the laser irradiator from the same coding rule, so that the seeker can capture the guiding laser in time and complete the laser terminal guidance.

Preferably, the command unit 4 computes accurate countdown information in real time and, based on it, controls the laser irradiator 32 to emit the illuminating laser 1–3 seconds before the aircraft enters the terminal guidance phase. The command unit 4 computes the countdown from the target position information and the aircraft's position and speed information. Preferably, the aircraft enters the terminal guidance phase exactly 1 second after the countdown ends, at which point its laser seeker starts working; the laser irradiator 32 on the observation drone 3 also starts working at that moment, so the aircraft captures the target position information just in time and is controlled to fly to the target.
Example:

Three aircraft are launched from the launch unit towards a target area 20 km away; the first aircraft arrives at the target area at least 5 seconds earlier than the other two, and the second and third aircraft arrive at essentially the same time. The aircraft's effective flight range is known to be 25 km. The launch unit comprises three launch vehicles, and the three aircraft are launched by the three launch vehicles respectively.

The radar signal receiving module mounted on the first aircraft captures a radar wave signal on entering the target area and obtains the position of the radar transmitting vehicle accordingly. Three seconds later, the interceptor vehicle in the target area launches an interceptor and starts to move, and the radar transmitting vehicle also starts to move.

The observation drone photographs the target area in real time through its onboard camera; after the radar transmitting vehicle and the interceptor vehicle move, it finds their positions and, 1 second before the second and third aircraft enter the terminal guidance phase, emits illuminating lasers of different frequencies at the two targets respectively, guiding the second and third aircraft to them.

The motion trajectories of the first, second and third aircraft, the radar transmitting vehicle, and the interceptor vehicle are shown in Figs. 3 and 4, where Fig. 4 is a partial enlargement of Fig. 3 that mainly shows the trajectories of the three aircraft as they approach impact and the trajectories of the two targets. As can be seen from the figures, the first aircraft was intercepted, the second hit the radar transmitting vehicle, and the third hit the interceptor vehicle.

The present invention has been described above with reference to preferred embodiments, but these embodiments are merely exemplary and serve only as illustrations. On this basis, various substitutions and improvements may be made to the present invention, and all of these fall within the scope of protection of the present invention.

Claims (10)

  1. A networking control method for laser terminal-guided aircraft, characterized in that the method comprises the following steps:
    Step 1: launch at least two aircraft (2) towards the target area via the launch unit (1), the first aircraft arriving at the target area at least 5–10 seconds earlier than the other aircraft;
    Step 2: capture radar wave signals via the radar signal receiving module (21) mounted on the first aircraft, and obtain the position of the radar transmitting vehicle accordingly;
    Step 3: control the observation drone (3) to cruise over the target area in real time, and search for targets by photographing the target area in real time with its onboard camera (31);
    Step 4: illuminate the target with the laser irradiator (32) mounted on the observation drone (3).
  2. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that the method further comprises step 5:
    photograph the target before and after the aircraft lands with the camera (31) on the observation drone (3), and transmit the photos to the command unit (4), which then judges the aircraft's point of impact and the damage to the target.
  3. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that,
    in step 2, the first aircraft sends the obtained position of the radar transmitting vehicle to the observation drone (3) so that the observation drone can find and lock onto the target.
  4. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that,
    after obtaining the position of the radar transmitting vehicle, the first aircraft steers itself towards the radar transmitting vehicle;
    and at least one of the other aircraft flies towards the radar transmitting vehicle under the guidance of a laser irradiator (32).
  5. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that,
    when two or more targets are found in step 3, each target is illuminated in step 4 by its own laser irradiator (32), and the individual laser irradiators (32) emit illuminating lasers of different frequencies.
  6. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that
    the command unit (4) computes accurate countdown information in real time and, based on this countdown, controls the laser irradiator (32) to emit the illuminating laser 1–3 seconds before the aircraft enters the terminal guidance phase.
  7. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that
    said step 3 comprises the following sub-steps:
    Sub-step 1: the observation drone (3) continuously obtains photos of the target area through the camera (31) while moving;
    Sub-step 2: preprocess the photos obtained by the camera (31);
    Sub-step 3: convert the preprocessed image into a grayscale image;
    Sub-step 4: establish a transformation model from the grayscale images, the transformation model being used to convert the earlier of two adjacent frames into a matching image whose background is the same as that of the current frame;
    Sub-step 5: compute the target optical flow field from the matching image and the current frame, and thereby determine the target.
  8. The networking control method for laser terminal-guided aircraft according to claim 7, characterized in that
    establishing the transformation model comprises the following sub-sub-steps:
    Sub-sub-step a: establish the transformation model as equation (1):
    x′ = a·x + b·y + c,  y′ = d·x + e·y + f    (1)
    where x′ represents the X-axis coordinate of a point in the matching image and y′ represents the Y-axis coordinate of a point in the matching image;
    x represents the X-axis coordinate of a point in the previous frame image and y represents the Y-axis coordinate of a point in the previous frame image;
    a, b, c, d, e and f all represent conversion parameters;
    Sub-sub-step b: retrieve the current frame image and the previous frame image, and divide both frames by the same method into multiple sub-blocks that do not completely overlap;
    Sub-sub-step c: find, among the sub-blocks of the previous frame image, the best matching block for each sub-block of the current frame image; (x_i, y_i) represents the center coordinates of the i-th sub-block in the current frame image, and (x′_i, y′_i) represents the center coordinates of the best matching block of the i-th sub-block in the previous frame image;
    Sub-sub-step d: use the least squares method to solve for the conversion parameters of equation (1), as shown in equation (2):
    min over (a, b, c, d, e, f) of Σ_{i=1..N} [ (x′_i − a·x_i − b·y_i − c)² + (y′_i − d·x_i − e·y_i − f)² ]    (2)
    where N represents the number of sub-blocks into which the current frame image is divided.
  9. The networking control method for laser terminal-guided aircraft according to claim 8, characterized in that,
    in said sub-sub-step c, a sub-block of the current frame image is selected arbitrarily, and equation (3) is used to compute, one by one for each sub-block of the previous frame image, the sum of the absolute values of the grayscale differences over all pixels between that sub-block and the selected current-frame sub-block; the previous-frame sub-block giving the smallest value is selected as the best matching block;
    D = Σ_{m=1..p} Σ_{n=1..q} | I_current(m, n) − I_match(m, n) |    (3)
    where I_current(m, n) represents the gray value of the pixel at position (m, n) in the current-frame image sub-block, and I_match(m, n) represents the gray value of the pixel at position (m, n) in the previous-frame image sub-block; p represents the number of pixels of a sub-block in the X-axis direction, and q represents the number of pixels of a sub-block in the Y-axis direction;
    after the best matching block of one current-frame sub-block has been determined, another current-frame sub-block is selected and equation (3) is applied again, until best matching blocks have been found for all sub-blocks of the current frame image.
  10. The networking control method for laser terminal-guided aircraft according to claim 1, characterized in that,
    in sub-step 5, the minimum of the energy function expression is obtained via equation (4):
    min(E(p)) = min(E_m + E_s)    (4)
    where E(p) represents the energy function over the matching image and the current frame image,
    E_m represents the optical flow constraint term:
    E_m = ∬_Ω (f_x·u + f_y·v + f_t)² dx dy
    E_s represents the smoothness constraint term:
    E_s = α·∬_Ω ((∂u/∂x)² + (∂u/∂y)² + (∂v/∂x)² + (∂v/∂y)²) dx dy
    where Ω represents the whole region of the current frame image;
    the function f represents the position (x, y) of any pixel in the image at a given instant; f_x represents the partial derivative of f in the X-axis direction; f_y represents the partial derivative of f in the Y-axis direction; f_t represents the partial derivative of f with respect to time t;
    u represents the velocity component of any pixel in the image in the X-axis direction, and v represents the velocity component of any pixel in the image in the Y-axis direction;
    dx denotes the differential symbol; α is a positive number representing the weight of the smoothness constraint term E_s.
PCT/CN2021/094853 2020-07-28 2021-05-20 Networking control method for laser terminal-guided aircraft WO2022022023A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023506252A JP2023536866A (ja) 2020-07-28 2021-05-20 Method for controlling the networking of laser terminal-guided aircraft

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010740505.7 2020-07-28
CN202010740505.7A CN114002700A (zh) 2020-07-28 2020-07-28 Networking control method for laser terminal-guided aircraft

Publications (1)

Publication Number Publication Date
WO2022022023A1 (zh)

Family

ID=79920579

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/094853 WO2022022023A1 (zh) 2020-07-28 2021-05-20 激光末制导飞行器组网控制方法

Country Status (3)

Country Link
JP (1) JP2023536866A (zh)
CN (1) CN114002700A (zh)
WO (1) WO2022022023A1 (zh)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020080061A1 (en) * 2000-12-11 2002-06-27 Rafael - Armament Development Authority Ltd. Method and system for active laser imagery guidance of intercepting missiles
CN104698453A (zh) * 2015-03-15 2015-06-10 西安电子科技大学 Passive localization method for radar signals based on a synthetic aperture antenna array
CN105791398A (zh) * 2016-02-29 2016-07-20 北京航空航天大学 Multi-task component communication method for unmanned aerial vehicles
CN106950984A (zh) * 2017-03-16 2017-07-14 中国科学院自动化研究所 Remote cooperative reconnaissance-and-strike method for unmanned aerial vehicles
CN107977987A (zh) * 2017-11-20 2018-05-01 北京理工大学 Airborne multi-target detection, tracking and indication system and method for unmanned aerial vehicles
CN109508032A (zh) * 2018-12-12 2019-03-22 北京理工大学 Guided aircraft system with auxiliary unmanned aerial vehicle and guidance method
CN111079556A (zh) * 2019-11-25 2020-04-28 航天时代飞鸿技术有限公司 Method for detecting and classifying changed regions in multi-temporal unmanned aerial vehicle video images


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111624A (zh) * 2023-10-23 2023-11-24 江苏苏启智能科技有限公司 Anti-UAV method and system based on electromagnetic countermeasure technology
CN117111624B (zh) * 2023-10-23 2024-02-02 江苏苏启智能科技有限公司 Anti-UAV method and system based on electromagnetic countermeasure technology

Also Published As

Publication number Publication date
CN114002700A (zh) 2022-02-01
JP2023536866A (ja) 2023-08-30

Similar Documents

Publication Publication Date Title
CN107883817B Unmanned helicopter control system with hybrid-guidance weapons and control method
KR101664618B1 Unmanned aerial system equipped with a capture device and capture method using the same
CN106950984B Remote cooperative reconnaissance-and-strike method for unmanned aerial vehicles
CN111123983B Unmanned aerial vehicle interception and net-capture control system and control method
CN107111958A Commercial and general aircraft avoidance using light, sound and/or multi-spectral pattern detection
CN110988819B System for evaluating the decoy effect of laser deception jamming equipment, based on an unmanned aerial vehicle formation
CN107054679A Highly maneuverable active-capture anti-drone system and method
WO2022022023A1 Networking control method for laser terminal-guided aircraft
CN107885230A Unmanned helicopter control system with laser-guided weapons and control method therefor
CN107870631A Unmanned helicopter airborne system and control method therefor
CN105468018A Unmanned aerial vehicle target characteristic simulation system
CN111044989A Field evaluation system for the decoy effect of laser deception jamming equipment
CN114508966A Ground-air combined multi-layer interception accompanying defense system
CN114820701A Multi-template-based target capture and tracking method for infrared imaging seekers
CN112902959B Command and control system and method for laser-guided aircraft
KR102349818B1 Autonomous road-crack-detection flying drone based on an improved CNN
CN111637797A Automatic target-scoring device and method for artillery live-fire shooting
CN219247855U Intelligent capture device for unauthorized ("black-flying") drones
CN206797767U Highly maneuverable active-capture anti-drone system
CN115328201A Intelligent capture device for unauthorized drones and intelligent capture method therefor
CN110579138B Live-fire target-scoring method, system and device for anti-aircraft guns
CN112461059B Ground launch method for image-homing guided missiles
CN109885101B Method and system for simulating missile terminal guidance with an unmanned aerial vehicle
CN113671981A Remote laser-guided aircraft control system and control method therefor
RU2750924C1 Device for capturing unmanned aerial vehicles

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21849173

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023506252

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/06/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21849173

Country of ref document: EP

Kind code of ref document: A1