CN105424006B - UAV hovering accuracy measurement method based on binocular vision


Info

Publication number
CN105424006B
CN105424006B (application CN201510736167.9A)
Authority
CN
China
Prior art keywords
camera
Prior art date
Legal status
Active
Application number
CN201510736167.9A
Other languages
Chinese (zh)
Other versions
CN105424006A (en)
Inventor
王万国
刘俍
刘越
张方正
董罡
雍军
吴观斌
慕世友
傅孟潮
魏传虎
张飞
李建祥
赵金龙
Current Assignee
State Grid Intelligent Technology Co Ltd
Original Assignee
State Grid Corp of China SGCC
Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd
Shandong Luneng Intelligence Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by State Grid Corp of China SGCC, Electric Power Research Institute of State Grid Shandong Electric Power Co Ltd, and Shandong Luneng Intelligence Technology Co Ltd
Priority to CN201510736167.9A
Publication of CN105424006A
Application granted
Publication of CN105424006B
Status: Active


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 — Interpretation of pictures
    • G01C 11/06 — Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 11/08 — Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a UAV hovering accuracy measurement method based on binocular vision, comprising the following steps. Calibration stage: the cameras are calibrated with Zhang's checkerboard calibration method to determine the calibration parameters and define the calibration result parameters. Positioning stage: for a UAV hovering accuracy measurement, the slide rail is placed directly below the UAV's hover point; the binocular camera is fixed on the rail, parallel and at a set distance apart, and can move along the rail; the lenses point vertically upward, the two image planes lie in the same plane, and the optical axes are parallel. The left and right cameras each capture images of the UAV and transmit them to a computer, which computes the UAV's three-dimensional position coordinates from the left and right images and the calibration result parameters. After hovering ends, the hovering accuracy is computed from the UAV's three-dimensional trajectory. Beneficial effects of the invention: detection, tracking, accurate matching, and three-dimensional localization of the UAV target are achieved.

Description

UAV hovering accuracy measurement method based on binocular vision
Technical field
The present invention relates to a UAV hovering accuracy measurement method based on binocular vision.
Background art
UAV hovering accuracy is an important indicator of UAV performance: it reflects the stability and accuracy of the UAV's core, the flight control system. At present, in inspection and testing of UAV flight functions, hovering accuracy is measured by manual observation, which cannot guarantee safety, objectivity, or standardization.
UAV three-dimensional positioning techniques fall into two categories: positioning based on onboard equipment and positioning based on ground equipment.
Depending on the type of onboard equipment, there are three main onboard positioning techniques: based on GPS equipment (Fig. 4(a)), based on onboard video (Fig. 4(b)), and based on an inertial navigation unit (Fig. 4(c)). The onboard equipment is integrated into each UAV's flight control system rather than sitting outside the UAV, so flexibility is poor, and these techniques are unsuitable for positioning tasks across different UAVs.
Positioning based on ground equipment avoids these problems. Depending on the type of ground equipment, it can be divided into three techniques: ultrasonic rangefinders (Fig. 4(d)), laser rangefinders (Fig. 4(e)), and machine vision (Fig. 4(f)). Ultrasonic and laser rangefinders measure only the distance to the target, and three-dimensional laser scanners sweep slowly, so they are mainly used for reconstructing static three-dimensional scenes and cannot recover a target's motion trajectory.
Summary of the invention
The purpose of the present invention is to solve the above problems by providing a UAV hovering accuracy measurement method based on binocular vision, which computes the UAV's three-dimensional flight trajectory in real time and calculates the hovering accuracy automatically, improving the accuracy and standardization of the measurement.
To achieve this, the present invention adopts the following technical scheme:
A UAV hovering accuracy measurement method based on binocular vision comprises the following steps:
Step (1): calibration stage: calibrate the cameras with Zhang's checkerboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters;
Step (2): positioning stage: for a UAV hovering accuracy measurement, place the slide rail directly below the UAV's hover point and fix the binocular camera on the rail, parallel and at a set distance apart; the binocular camera can move along the rail, the lenses point vertically upward, the two image planes should lie in the same plane, and the optical axes should be parallel. The left and right cameras each capture images of the UAV and transmit them to a computer. The computer combines the captured left and right images with the calibration result parameters from step (1) to compute the UAV's three-dimensional position coordinates. After hovering ends, the hovering accuracy is computed from the UAV's three-dimensional trajectory.
Step (1) comprises:
Step (1-1): fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their center points is L;
Step (1-2): calibrate the cameras with Zhang's checkerboard calibration method and record the calibration result parameters result = {Mleft, Dleft, Mright, Dright, R, T}, where result denotes the calibration result parameters, Mleft and Dleft the camera matrix and distortion coefficient vector of the left camera, Mright and Dright those of the right camera, and R and T the rotation matrix and translation vector between the two cameras. For each camera,

$$M = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where M is the camera matrix, f_x and f_y are the focal lengths in pixels, and (c_x, c_y) is the principal point.
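For illustration only, a minimal Python/OpenCV sketch of this calibration step is given below. It assumes the 19 x 17-square board described in the embodiment, synchronized left/right image lists in which the board is detected in every frame, and OpenCV's standard implementation of Zhang's method; all names and parameters are illustrative, not values fixed by the patent.

```python
import cv2
import numpy as np

PATTERN = (18, 16)   # inner corners of a 19 x 17-square board (assumption)
SQUARE = 0.020       # 20 mm square width, in metres

# 3D corner positions of the board in its own plane (Z = 0)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

def collect_corners(images):
    """Detect and refine checkerboard corners in each calibration image."""
    pts = []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        ok, corners = cv2.findChessboardCorners(gray, PATTERN)
        assert ok, "sketch assumes the board is visible in every image"
        pts.append(cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)))
    return pts

def calibrate_stereo(left_imgs, right_imgs, image_size):
    pts_l, pts_r = collect_corners(left_imgs), collect_corners(right_imgs)
    obj = [objp] * len(pts_l)
    # Per-camera intrinsics first, then the stereo extrinsics R, T
    _, Ml, Dl, _, _ = cv2.calibrateCamera(obj, pts_l, image_size, None, None)
    _, Mr, Dr, _, _ = cv2.calibrateCamera(obj, pts_r, image_size, None, None)
    _, Ml, Dl, Mr, Dr, R, T, _, _ = cv2.stereoCalibrate(
        obj, pts_l, pts_r, Ml, Dl, Mr, Dr, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return {"Mleft": Ml, "Dleft": Dl, "Mright": Mr, "Dright": Dr,
            "R": R, "T": T}
```

The returned dictionary matches the calibration result parameters result = {Mleft, Dleft, Mright, Dright, R, T} defined above.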
Step (2) comprises:
Step (2-1): for a UAV hovering accuracy measurement, place the slide rail directly below the UAV's hover point and fix the binocular camera parallel on the rail; the lenses point vertically upward, the two image planes should lie in the same plane, and the optical axes should be parallel; the left camera captures the left image of the UAV and the right camera the right image;
Step (2-2): target region localization: manually select the target region in the left image;
Step (2-3): target tracking: track the target in the left image with the TLD algorithm;
Step (2-4): target matching: in the right image, find the matching region most similar to the target region of the left image;
Step (2-5): corresponding-point extraction: take the center points of the rectangular target regions in the left and right images as corresponding points;
Step (2-6): three-dimensional coordinate computation: establish the camera coordinate system and, with the calibration result parameters from step (1), compute the target point's three-dimensional coordinates in the camera coordinate system;
Step (2-7): hovering accuracy evaluation: compute the UAV's hovering accuracy from the trajectory of the target point's three-dimensional coordinates.
Step (2-2) proceeds as follows: let t = 0 at the moment target positioning starts. The target region is first framed manually; it is a rectangle with B_L as its top-left corner, height h, and width w.
Step (2-3) proceeds as follows: starting from the left-image target region determined at t = 0, the target in the left image is tracked with the TLD algorithm at times t = 1 and thereafter.
Step (2-4) proceeds as follows: each time the target region in the left image is obtained, the matching region most similar to it is sought in the right image; the matching region is a rectangle with B_R as its top-left corner, height h, and width w.
Target matching is then expressed as

$$\min_{x_R,\,y_R} \sum_{i=0}^{h}\sum_{j=0}^{w}\left|I_{left}[x_L+i][y_L+j]-I_{right}[x_R+i][y_R+j]\right| \tag{1}$$

where I_left denotes the left image gray values, I_right the right image gray values, (x_L, y_L) the coordinates of point B_L, and (x_R, y_R) the coordinates of point B_R. The search area is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h, y_L + s_h], where s_h is the height of the search area. Once the B_R coordinates (x_R, y_R) minimizing formula (1) are obtained, the disparity is d = x_L − x_R.
At times t ≥ 1, after the TLD algorithm yields a new target region in the left image, the search area in the right image is updated to x_R ∈ [x_L − d − s_w, x_L − d + s_w], y_R ∈ [y_L − s_h, y_L + s_h], where s_w is the width of the search area, and the target region in the right image is determined from formula (1). By analogy, the target region in each left frame, and the corresponding region of the same target in the right image, are computed.
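The minimization in formula (1) is a plain sum-of-absolute-differences (SAD) search over candidate top-left corners. A minimal sketch follows, assuming grayscale images as NumPy arrays and following the patent's indexing convention, in which the first index spans the height h; it is an illustration, not the patent's own implementation.

```python
import numpy as np

def match_right(I_left, I_right, xL, yL, h, w, x_range, y_range):
    """Return the (xR, yR) minimising formula (1) over the given search area."""
    tmpl = I_left[xL:xL + h, yL:yL + w].astype(np.int32)
    best, best_xy = None, None
    for xR in range(*x_range):
        for yR in range(*y_range):
            cand = I_right[xR:xR + h, yR:yR + w].astype(np.int32)
            if cand.shape != tmpl.shape:   # window falls outside the image
                continue
            sad = np.abs(tmpl - cand).sum()
            if best is None or sad < best:
                best, best_xy = sad, (xR, yR)
    return best_xy

# t = 0: initial search area; sh is the search-area height
# xR, yR = match_right(Il, Ir, xL, yL, h, w, (0, xL + 1), (yL - sh, yL + sh + 1))
# d = xL - xR   # disparity, reused to narrow the search at t >= 1
```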
Step (2-5) proceeds as follows:
The center point of the target region in the left image, P_L = (x_L + w/2, y_L + h/2), is used as the corresponding point of the left target, and the center point of the target region in the right image, P_R = (x_R + w/2, y_R + h/2), as the corresponding point of the right target.
Corresponding points are the pixels in the left and right images that map to the same position on the real target; it must be ensured that corresponding points across successive frames and across the left and right images refer to the same position on the real target.
Step (2-6) proceeds as follows:
The camera coordinate system takes the optical center O_L of the left camera as origin, with the XO_LY plane parallel to the imaging plane and the optical axis as the Z axis. From the calibrated camera parameters result, the reprojection matrix is

$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x^l \\ 0 & 1 & 0 & -c_y^l \\ 0 & 0 & 0 & f^l \\ 0 & 0 & -1/T_x & (c_x^l-c_x^r)/T_x \end{bmatrix}$$

where (c_x^l, c_y^l) is the principal point of the left camera and (c_x^r, c_y^r) that of the right camera; T_x is the X-axis component of the translation between the two cameras; f^l is the left camera focal length.
With the left and right optical axes parallel, given the left-image corresponding-point coordinates P_L(x_L, y_L) and the right-image corresponding-point coordinates P_R(x_R, y_R), compute the disparity d = x_L − x_R of the target point between the two views, then let

$$\begin{bmatrix} \hat{x}_c \\ \hat{y}_c \\ \hat{z}_c \\ \hat{w}_c \end{bmatrix} = Q\begin{bmatrix} x_L \\ y_L \\ d \\ 1 \end{bmatrix} = \begin{bmatrix} x_L-c_x^l \\ y_L-c_y^l \\ f^l \\ (-d+c_x^l-c_x^r)/T_x \end{bmatrix}$$

to obtain the target point's three-dimensional coordinates in the camera coordinate system:

$$P_c=(x_c,y_c,z_c)=(\hat{x}_c/\hat{w}_c,\ \hat{y}_c/\hat{w}_c,\ \hat{z}_c/\hat{w}_c)$$

where x̂_c, ŷ_c, ẑ_c, and ŵ_c are intermediate result variables, and x_c, y_c, and z_c are the X, Y, and Z coordinates of the target point P_c in the camera coordinate system.
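As a worked illustration of this step, the reprojection and homogeneous normalization can be written directly from the formulas above; variable names mirror the patent's symbols and are otherwise assumptions.

```python
import numpy as np

def make_Q(cx_l, cy_l, cx_r, f_l, Tx):
    """Reprojection matrix Q assembled from the calibration parameters."""
    return np.array([
        [1.0, 0.0, 0.0, -cx_l],
        [0.0, 1.0, 0.0, -cy_l],
        [0.0, 0.0, 0.0, f_l],
        [0.0, 0.0, -1.0 / Tx, (cx_l - cx_r) / Tx]])

def triangulate(Q, xL, yL, xR):
    """Camera-frame coordinates (xc, yc, zc) of the matched target point."""
    d = xL - xR                            # disparity between the two views
    xh, yh, zh, wh = Q @ np.array([xL, yL, d, 1.0])
    return np.array([xh, yh, zh]) / wh     # divide out the homogeneous term
```

OpenCV's cv2.stereoRectify produces a reprojection matrix of the same form, which could replace make_Q after rectification.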
Step (2-7) proceeds as follows:
While the UAV hovers, its three-dimensional position coordinates are computed in real time with the preceding steps, yielding its flight trajectory;
suppose the set of trajectory points is P = [P_1, P_2, ..., P_N], containing N points, with
P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N;
the barycenter of the flight trajectory point set is

$$P_m=(x_m,y_m,z_m)=\frac{1}{N}\sum_{n=1}^{N}P_n.$$

For a UAV hovering accuracy test, the prescribed height of the hover above ground is denoted H_0.
For the hovering accuracy test, the binocular camera is placed directly below the hover point; specifically, the left camera is placed at the hover point's ground projection with its optical axis perpendicular to the horizontal plane. Hovering accuracy comprises the horizontal deviation E^h_Hovering and vertical deviation E^v_Hovering of the hover-point accuracy, and the horizontal deviation E^h_Control and vertical deviation E^v_Control of the hovering control accuracy.
Since O_L is the coordinate origin, the formulas are

$$E_{Hovering}^{h}=\sqrt{x_m^2+y_m^2},\qquad E_{Hovering}^{v}=\left|z_m-H_0\right|,$$
$$E_{Control}^{h}=\frac{\sqrt{e_X^2+e_Y^2}}{2},\qquad E_{Control}^{v}=\frac{\max_{n\in[1,N]}(z_n)-\min_{n\in[1,N]}(z_n)}{2},$$

where e_X and e_Y are the ranges of movement along the X and Y axes of the camera coordinate system with the left camera as origin during the UAV's hover, and z_n is the Z coordinate of flight trajectory point P_n.
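These four deviations reduce to a few lines once the trajectory is collected. A minimal sketch, assuming the trajectory is an N x 3 NumPy array of camera-frame coordinates; dictionary keys are illustrative names for the four quantities above.

```python
import numpy as np

def hover_accuracy(P, H0):
    """Hovering accuracy from trajectory P (N x 3) and prescribed height H0."""
    xm, ym, zm = P.mean(axis=0)          # barycentre of the trajectory points
    e_X = P[:, 0].max() - P[:, 0].min()  # range of movement along X
    e_Y = P[:, 1].max() - P[:, 1].min()  # range of movement along Y
    return {
        "E_hovering_h": np.hypot(xm, ym),            # sqrt(xm^2 + ym^2)
        "E_hovering_v": abs(zm - H0),
        "E_control_h": np.hypot(e_X, e_Y) / 2,
        "E_control_v": (P[:, 2].max() - P[:, 2].min()) / 2,
    }
```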
Beneficial effects of the present invention:
1. The UAV's three-dimensional flight trajectory is computed in real time and the hovering accuracy is computed automatically, improving the accuracy and standardization of the measurement.
2. No modification of the UAV is required, and the system is easy to extend.
3. The whole measurement process requires almost no human participation; the degree of automation is high.
4. The equipment is easy to use: the binocular camera only needs to be placed below the preset hover point, and the UAV's flight trajectory and hovering accuracy are obtained at the computer.
5. Machine-vision-based three-dimensional positioning combines target detection, tracking, matching, and three-dimensional localization into one process; the cameras only need to be placed at the specified location, and all related algorithms run and display on the computer. For the inspection and testing task, an effective approach is therefore to use binocular vision, combined with the special conditions and requirements of UAV inspection and testing, to achieve detection, tracking, accurate matching, and three-dimensional localization of the UAV target.
Brief description of the drawings
Fig. 1 is the overall flow chart of the present invention;
Fig. 2 is the calibration phase flow chart of the present invention;
Fig. 3 is the positioning stage flow chart of the present invention;
Fig. 4(a) is the positioning method based on GPS onboard equipment;
Fig. 4(b) is the positioning method based on onboard video equipment;
Fig. 4(c) is the positioning method based on an onboard inertial navigation unit;
Fig. 4(d) is the positioning method based on a ground ultrasonic rangefinder;
Fig. 4(e) is the positioning method based on a ground laser rangefinder;
Fig. 4(f) is the positioning method based on ground machine vision;
Fig. 5 is the binocular-vision-based UAV hovering accuracy measurement system;
Fig. 6 is a schematic diagram of binocular camera calibration;
Fig. 7 illustrates target tracking and matching;
Fig. 8 illustrates corresponding points across the left and right images and successive frames;
Fig. 9 is a schematic diagram of the binocular positioning system coordinate system;
Fig. 10 is a schematic diagram of hovering error.
Embodiment
The invention will be further described below with reference to the accompanying drawings and embodiments.
Hardware: 2 industrial cameras, 1 camera slide rail, 1 industrial computer with display, and 1 checkerboard calibration board. The overall system architecture is shown in Fig. 5.
1, 2: industrial cameras. Lens focal length 5 mm, resolution 1384x1032, frame rate 16 frames/s.
3: camera slide rail. Length 1.2 m, graduated, with a scale division of 1 mm.
4: industrial computer and display. ADLINK MXC-6000 industrial computer.
5: checkerboard calibration board. 19 x 17 squares, each square 20 mm wide.
6: UAV. An unmanned helicopter or a multi-rotor UAV.
As shown in Fig. 1, the whole process is divided into two stages: a calibration stage and a positioning stage. In the calibration stage, the cameras are calibrated with Zhang's checkerboard calibration method [1]. In the positioning stage, the three-dimensional position of the UAV target is computed from the calibration parameters.
3.2 Calibration stage:
The theoretical questions behind calibration algorithms have essentially all been solved. Current camera calibration methods fall into three classes: traditional calibration, active-vision calibration, and self-calibration. Traditional calibration computes the camera parameters from a calibration template with known geometric constraints; its equipment is simple and its accuracy high, making it the most common method at present. Active-vision calibration fixes the camera on a mechanism such as a pan-tilt head and strictly constrains the camera's rotations and translations; its accuracy is high, but the equipment is complex and calibration takes long. Self-calibration relies only on the correspondences between multiple images of a scene to compute the camera parameters; calibration is flexible, but it is nonlinear and not very robust.
The present invention uses traditional calibration, extending the method of reference [1]. Define the calibration result result = {Mleft, Dleft, Mright, Dright, R, T}, where result denotes the calibration result, Mleft and Dleft the camera matrix and distortion coefficient vector of the left camera, Mright and Dright those of the right camera, and R and T the rotation matrix and translation vector between the two cameras. As shown in Figs. 2 and 6, the calibration steps are:
1. Fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their center points is L.
2. Calibrate the cameras with Zhang's checkerboard calibration method and record the calibration result parameters in the form
result = {Mleft, Dleft, Mright, Dright, R, T}
where result denotes the calibration result, Mleft and Dleft the camera matrix and distortion coefficient vector of the left camera, Mright and Dright those of the right camera, and R and T the rotation matrix and translation vector between the two cameras.
3.3 Positioning stage:
As shown in Fig. 5, for a UAV hovering accuracy measurement the binocular camera is fixed parallel on the slide rail; the two image planes should lie in the same plane as far as possible, and the optical axes should be parallel. The rail is placed directly below the UAV's hover point, with the camera lenses pointing vertically upward. The left and right cameras capture images of the UAV and transmit them over a GigE gigabit network to a portable computer, which locates the UAV's three-dimensional position coordinates from the left and right images using the algorithms below. After hovering ends, the hovering accuracy is computed from the UAV's three-dimensional trajectory.
As shown in Fig. 3, the three-dimensional positioning algorithm for the UAV is divided into the following steps: target region localization and tracking, left-right target matching, corresponding-point extraction, three-dimensional coordinate computation, and hovering accuracy evaluation.
Target region localization and tracking
The target region can be obtained by manual selection or automatic detection. Because the background of the UAV target image is static and uniform, salient-object detection can be used to obtain the target region. As shown in Fig. 7, at time t = 0, after the target location in the left image (solid rectangle) is obtained by manual selection or automatic detection, it is tracked with the TLD (Tracking-Learning-Detection) algorithm [2]. The advantage of TLD is that when the target leaves and re-enters the image region, the algorithm can still detect and track it.
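As an illustration, the tracking loop can be sketched with the TLD implementation shipped in opencv-contrib-python, where the tracker lives under cv2.legacy in recent 4.x builds; the window name and frame source are assumptions, not details fixed by the patent.

```python
import cv2

def track_left(frames):
    """Yield the (x, y, w, h) target box for each left-camera frame."""
    frames = iter(frames)
    first = next(frames)
    box = cv2.selectROI("select UAV", first)   # manual selection at t = 0
    cv2.destroyWindow("select UAV")
    tracker = cv2.legacy.TrackerTLD_create()
    tracker.init(first, box)
    yield box
    for frame in frames:
        ok, box = tracker.update(frame)
        if ok:                                  # target found in this frame
            yield tuple(int(v) for v in box)
```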
Left-right target matching
Suppose the target region of the left image is first determined at time t = 0; as shown in Fig. 7, the target region is a rectangle with B_L as its top-left corner, height h, and width w. Left-right target matching consists of finding, in the right image, the matching region most similar to the left target region; the matching region is a rectangle with B_R as its top-left corner, height h, and width w. Target matching can then be expressed as the problem

$$\min_{x_R,\,y_R} \sum_{i=0}^{h}\sum_{j=0}^{w}\left|I_{left}[x_L+i][y_L+j]-I_{right}[x_R+i][y_R+j]\right| \tag{1}$$

where I_left and I_right are the left and right image gray values, and (x_L, y_L) and (x_R, y_R) are the coordinates of B_L and B_R, respectively. The search area is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h, y_L + s_h], shown as the gray region in the t = 0 image of Fig. 7. Once the (x_R, y_R) minimizing formula (1) is obtained, the disparity is d = x_L − x_R.
At time t + 1, the TLD algorithm yields a new target region in the left image, and the search area in the right image is updated to x_R ∈ [x_L − d − s_w, x_L − d + s_w], y_R ∈ [y_L − s_h, y_L + s_h], shown as the gray region in the t + 1 image. The target region in the right image is then determined from formula (1). By analogy, the target regions of the left and right images are computed for each frame.
Corresponding points
Corresponding points are the pixels in the left and right images that map to the same position on the real target. As shown in Fig. 8, we must ensure that corresponding points across successive frames and across the left and right images refer to the same position on the real target. A simple approach is to use the center points of the target regions, i.e. P_L = (x_L + w/2, y_L + h/2) and P_R = (x_R + w/2, y_R + h/2), as the corresponding points of the left and right targets.
Three-dimensional coordinate computation
The camera coordinate system takes the optical center O_L of the left camera as origin, with the XO_LY plane parallel to the imaging plane and the optical axis as the Z axis, as shown in Fig. 9. From the calibrated camera parameters, the reprojection matrix is

$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x^l \\ 0 & 1 & 0 & -c_y^l \\ 0 & 0 & 0 & f^l \\ 0 & 0 & -1/T_x & (c_x^l-c_x^r)/T_x \end{bmatrix} \tag{2}$$

where (c_x^l, c_y^l) and (c_x^r, c_y^r) are the principal points of the left and right cameras (c_y^r does not appear in the formula); T_x is the X-axis component of the translation between the two cameras; f^l is the left camera focal length. With the left and right optical axes parallel, given the left and right corresponding-point coordinates P_L(x_L, y_L) and P_R(x_R, y_R), the disparity of the target point between the two views is d = x_L − x_R; then let

$$\begin{bmatrix} \hat{x}_c \\ \hat{y}_c \\ \hat{z}_c \\ \hat{w}_c \end{bmatrix} = Q\begin{bmatrix} x_L \\ y_L \\ d \\ 1 \end{bmatrix} = \begin{bmatrix} x_L-c_x^l \\ y_L-c_y^l \\ f^l \\ (-d+c_x^l-c_x^r)/T_x \end{bmatrix} \tag{3}$$

which yields the target point's three-dimensional coordinates in the camera coordinate system:

$$P_c=(x_c,y_c,z_c)=(\hat{x}_c/\hat{w}_c,\ \hat{y}_c/\hat{w}_c,\ \hat{z}_c/\hat{w}_c) \tag{4}$$
Hovering accuracy evaluation
While the UAV hovers, its three-dimensional position coordinates are computed in real time with the above method, yielding its flight trajectory. Suppose the set of trajectory points is P = [P_1, P_2, ..., P_N], containing N points, with P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N. The points in Fig. 10 are the projections of the flight trajectory points onto the horizontal plane. The barycenter of the flight trajectory point set is

$$P_m=(x_m,y_m,z_m)=\frac{1}{N}\sum_{n=1}^{N}P_n.$$

For the hovering accuracy test, the binocular camera is placed directly below the hover point; specifically, the left camera is placed at the hover point's ground projection with its optical axis perpendicular to the horizontal plane, and the prescribed hover height above ground is denoted H_0. Hovering accuracy comprises the horizontal deviation E^h_Hovering and vertical deviation E^v_Hovering of the hover-point accuracy, and the horizontal deviation E^h_Control and vertical deviation E^v_Control of the hovering control accuracy. Since O_L is the coordinate origin, the formulas are

$$E_{Hovering}^{h}=\sqrt{x_m^2+y_m^2},\qquad E_{Hovering}^{v}=\left|z_m-H_0\right|,$$
$$E_{Control}^{h}=\frac{\sqrt{e_X^2+e_Y^2}}{2},\qquad E_{Control}^{v}=\frac{\max_{n\in[1,N]}(z_n)-\min_{n\in[1,N]}(z_n)}{2}.$$
References:
[1] Zhang Z. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.
[2] Kalal Z, Mikolajczyk K, Matas J. Tracking-learning-detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(7): 1409-1422.
Although embodiments of the present invention have been described above with reference to the accompanying drawings, they do not limit the scope of protection of the present invention. Those of ordinary skill in the art should understand that various modifications or variations that can be made without creative effort on the basis of the technical scheme of the present invention remain within the scope of protection of the present invention.

Claims (8)

1. A UAV hovering accuracy measurement method based on binocular vision, characterized by comprising the following steps:
Step (1): calibration stage: calibrate the cameras with Zhang's checkerboard calibration method, thereby determining the calibration parameters and defining the calibration result parameters;
Step (2): positioning stage: for a UAV hovering accuracy measurement, place the slide rail directly below the UAV's hover point and fix the binocular camera on the rail, parallel and at a set distance apart; the binocular camera can move along the rail, the lenses point vertically upward, the two image planes lie in the same plane, and the optical axes are parallel; the left and right cameras each capture images of the UAV and transmit them to a computer; the computer combines the captured left and right images with the calibration result parameters from step (1) to compute the UAV's three-dimensional position coordinates; after hovering ends, the hovering accuracy is computed from the UAV's three-dimensional trajectory;
step (2) comprises:
Step (2-1): for a UAV hovering accuracy measurement, place the slide rail directly below the UAV's hover point and fix the binocular camera parallel on the rail; the lenses point vertically upward, the two image planes lie in the same plane, and the optical axes are parallel; the left camera captures the left image of the UAV and the right camera the right image;
Step (2-2): target region detection: obtain the target region in the left image by salient-object detection;
Step (2-3): target tracking: track the target in the left image with the TLD algorithm;
Step (2-4): target matching: in the right image, find the matching region most similar to the target region of the left image;
Step (2-5): corresponding-point extraction: take the center points of the rectangular target regions in the left and right images as corresponding points;
Step (2-6): three-dimensional coordinate computation: establish the camera coordinate system and, with the calibration result parameters from step (1), compute the target point's three-dimensional coordinates in the camera coordinate system;
Step (2-7): hovering accuracy evaluation: compute the UAV's hovering accuracy from the trajectory of the target point's three-dimensional coordinates.
2. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (1) comprises:
Step (1-1): fix the two cameras on the same slide rail, define a distance L, and adjust the positions of the two cameras on the rail so that the distance between their center points is L;
Step (1-2): calibrate the cameras with Zhang's checkerboard calibration method and record the calibration result parameters result = {Mleft, Dleft, Mright, Dright, R, T}; result denotes the calibration result, Mleft and Dleft the camera matrix and distortion coefficient vector of the left camera, Mright and Dright those of the right camera, and R and T the rotation matrix and translation vector between the two cameras; for each camera,

$$M = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where M is the camera matrix, f_x and f_y are the focal lengths in pixels, and c_x and c_y are the abscissa and ordinate of the camera's principal point.
3. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-2) comprises: let t = 0 at the moment target positioning starts; the target region is first obtained by salient-object detection and is a rectangle with B_L as its top-left corner, height h, and width w.
4. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-3) comprises: starting from the left-image target region determined at time t = 0, track the target in the left image with the TLD algorithm at times t = 1 and thereafter.
5. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-4) comprises: each time the target region in the left image is obtained, find in the right image the matching region most similar to the target region of the left image; the matching region is a rectangle with B_R as its top-left corner, height h, and width w;
target matching is then expressed as

$$\min_{x_R,\,y_R} \sum_{i=0}^{h}\sum_{j=0}^{w}\left|I_{left}[x_L+i][y_L+j]-I_{right}[x_R+i][y_R+j]\right| \tag{1}$$

where I_left denotes the left image gray values, I_right the right image gray values, (x_L, y_L) the coordinates of point B_L, and (x_R, y_R) the coordinates of point B_R; the search area is x_R ∈ [0, x_L], y_R ∈ [y_L − s_h, y_L + s_h], where s_h is the height of the search area and s_w its width; once the B_R coordinates (x_R, y_R) minimizing formula (1) are obtained, the disparity is
d = x_L − x_R;
at times t ≥ 1, after the TLD algorithm yields a new target region in the left image, the search area in the right image is updated to x_R ∈ [x_L − d − s_w, x_L − d + s_w], y_R ∈ [y_L − s_h, y_L + s_h], and the target region in the right image is determined from formula (1); by analogy, the target region in each left frame, and the corresponding region of the same target in the right image, are computed.
6. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-5) comprises:
take the center point of the target region in the left image, P_L = (x_L + w/2, y_L + h/2), as the corresponding point of the left target, and the center point of the target region in the right image, P_R = (x_R + w/2, y_R + h/2), as the corresponding point of the right target.
7. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-6) comprises:
the camera coordinate system takes the optical center O_L of the left camera as origin, with the XO_LY plane parallel to the imaging plane and the optical axis as the Z axis; from the calibrated camera parameters, obtain the reprojection matrix

$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x^l \\ 0 & 1 & 0 & -c_y^l \\ 0 & 0 & 0 & f^l \\ 0 & 0 & -1/T_x & (c_x^l-c_x^r)/T_x \end{bmatrix} \tag{2}$$

where (c_x^l, c_y^l) is the principal point of the left camera and (c_x^r, c_y^r) that of the right camera; T_x is the X-axis component of the translation matrix between the two cameras; f^l is the left camera focal length;
with the left and right optical axes parallel, given the left-image point coordinates P_L(x_L, y_L) and the right-image corresponding-point coordinates P_R(x_R, y_R), compute the disparity d = x_L − x_R of the target point between the left and right views, then let

$$\begin{bmatrix} \hat{x}_c \\ \hat{y}_c \\ \hat{z}_c \\ \hat{w}_c \end{bmatrix} = Q\begin{bmatrix} x_L \\ y_L \\ d \\ 1 \end{bmatrix} = \begin{bmatrix} x_L-c_x^l \\ y_L-c_y^l \\ f^l \\ (-d+c_x^l-c_x^r)/T_x \end{bmatrix} \tag{3}$$

to obtain the target point's three-dimensional coordinates in the camera coordinate system:

$$P_c=(x_c,y_c,z_c)=(\hat{x}_c/\hat{w}_c,\ \hat{y}_c/\hat{w}_c,\ \hat{z}_c/\hat{w}_c) \tag{4}$$

where x̂_c, ŷ_c, ẑ_c, and ŵ_c are intermediate result variables, and x_c, y_c, and z_c are the X, Y, and Z coordinates of the target point P_c in the camera coordinate system.
8. The UAV hovering accuracy measurement method based on binocular vision of claim 1, characterized in that step (2-7) comprises:
while the UAV hovers, compute its three-dimensional position coordinates in real time using the preceding steps, obtaining its flight trajectory;
suppose the set of trajectory points is P = [P_1, P_2, ..., P_N], containing N points, with
P_n = [x_n, y_n, z_n]^T, n = 1, 2, ..., N;
the barycenter of the flight trajectory point set is

$$P_m=(x_m,y_m,z_m)=\frac{1}{N}\sum_{n=1}^{N}P_n;$$

for a UAV hovering accuracy test, the prescribed height of the hover above ground is denoted H_0;
for the hovering accuracy test, the binocular camera is placed directly below the hover point; specifically, the left camera is placed at the hover point's ground projection with its optical axis perpendicular to the horizontal plane; hovering accuracy comprises the horizontal deviation E^h_Hovering and vertical deviation E^v_Hovering of the hover-point accuracy, and the horizontal deviation E^h_Control and vertical deviation E^v_Control of the hovering control accuracy;
since O_L is the coordinate origin, the formulas are

$$E_{Hovering}^{h}=\sqrt{x_m^2+y_m^2},\qquad E_{Hovering}^{v}=\left|z_m-H_0\right|,$$
$$E_{Control}^{h}=\frac{\sqrt{e_X^2+e_Y^2}}{2},\qquad E_{Control}^{v}=\frac{\max_{n\in[1,N]}(z_n)-\min_{n\in[1,N]}(z_n)}{2},$$

where e_X and e_Y are the ranges of movement along the X and Y axes of the camera coordinate system with the left camera as origin during the UAV's hover, and z_n is the Z coordinate of flight trajectory point P_n.
CN201510736167.9A 2015-11-02 2015-11-02 UAV hovering accuracy measurement method based on binocular vision Active CN105424006B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510736167.9A CN105424006B (en) 2015-11-02 2015-11-02 UAV hovering accuracy measurement method based on binocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510736167.9A CN105424006B (en) 2015-11-02 2015-11-02 UAV hovering accuracy measurement method based on binocular vision

Publications (2)

Publication Number Publication Date
CN105424006A CN105424006A (en) 2016-03-23
CN105424006B true CN105424006B (en) 2017-11-24

Family

ID=55502384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510736167.9A Active CN105424006B (en) UAV hovering accuracy measurement method based on binocular vision

Country Status (1)

Country Link
CN (1) CN105424006B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107036625A (en) * 2016-02-02 2017-08-11 中国电力科学研究院 A kind of flying quality detection method of power transmission line unmanned helicopter patrol inspection system
CN105957109A (en) * 2016-04-29 2016-09-21 北京博瑞爱飞科技发展有限公司 Target tracking method and device
CN106020218B (en) * 2016-05-16 2018-11-13 国家电网公司 A kind of the hovering method for testing precision and system of unmanned plane
CN106153008B (en) * 2016-06-17 2018-04-06 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
CN107300377B (en) * 2016-11-01 2019-06-14 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion
CN106709955B (en) * 2016-12-28 2020-07-24 天津众阳科技有限公司 Space coordinate system calibration system and method based on binocular stereo vision
CN108965651A (en) * 2017-05-19 2018-12-07 深圳市道通智能航空技术有限公司 A kind of drone height measurement method and unmanned plane
CN107424156B (en) * 2017-06-28 2019-12-06 北京航空航天大学 Unmanned aerial vehicle autonomous formation accurate measurement method based on visual attention of barn owl eyes
CN109211185A (en) * 2017-06-30 2019-01-15 北京臻迪科技股份有限公司 A kind of flight equipment, the method and device for obtaining location information
CN107490375B (en) * 2017-09-21 2018-08-21 重庆鲁班机器人技术研究院有限公司 Spot hover accuracy measuring device, method and unmanned vehicle
CN108489454A (en) * 2018-03-22 2018-09-04 沈阳上博智像科技有限公司 Depth distance measurement method, device, computer readable storage medium and electronic equipment
CN109211573B (en) * 2018-09-12 2021-01-08 北京工业大学 Method for evaluating hovering stability of unmanned aerial vehicle
CN109360240B (en) * 2018-09-18 2022-04-22 华南理工大学 Small unmanned aerial vehicle positioning method based on binocular vision
CN109855822B (en) * 2019-01-14 2019-12-06 中山大学 unmanned aerial vehicle-based high-speed rail bridge vertical dynamic disturbance degree measuring method
CN109813509B (en) * 2019-01-14 2020-01-24 中山大学 Method for realizing measurement of vertical dynamic disturbance degree of high-speed rail bridge based on unmanned aerial vehicle
CN110986891B (en) * 2019-12-06 2021-08-24 西北农林科技大学 System for accurately and rapidly measuring crown width of tree by using unmanned aerial vehicle
CN111688949B (en) * 2020-06-24 2022-06-28 天津大学 Unmanned aerial vehicle hovering attitude measuring device and method
CN112188112A (en) * 2020-09-28 2021-01-05 苏州臻迪智能科技有限公司 Light supplement control method, light supplement control device, storage medium and electronic equipment
CN112365526B (en) * 2020-11-30 2023-08-25 湖南傲英创视信息科技有限公司 Binocular detection method and system for weak and small targets
CN114877876B (en) * 2022-07-12 2022-09-23 南京市计量监督检测院 Unmanned aerial vehicle hovering precision evaluation method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489149B (en) * 2008-12-25 2010-06-23 清华大学 Binocular tri-dimensional video collecting system
US9071819B2 (en) * 2010-03-23 2015-06-30 Exelis Inc. System and method for providing temporal-spatial registration of images
CN101876532B (en) * 2010-05-25 2012-05-23 大连理工大学 Camera on-field calibration method in measuring system
CN102967305B (en) * 2012-10-26 2015-07-01 南京信息工程大学 Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square
US9501218B2 (en) * 2014-01-10 2016-11-22 Microsoft Technology Licensing, Llc Increasing touch and/or hover accuracy on a touch-enabled device
CN104006803B (en) * 2014-06-20 2016-02-03 中国人民解放军国防科学技术大学 The photographing measurement method of spin stabilization spacecraft rotational motion parameter
CN104932523A (en) * 2015-05-27 2015-09-23 深圳市高巨创新科技开发有限公司 Positioning method and apparatus for unmanned aerial vehicle

Also Published As

Publication number Publication date
CN105424006A (en) 2016-03-23

Similar Documents

Publication Publication Date Title
CN105424006B (en) UAV hovering accuracy measurement method based on binocular vision
US8655094B2 (en) Photogrammetry system and method for determining relative motion between two bodies
WO2022170878A1 (en) System and method for measuring distance between transmission line and image by unmanned aerial vehicle
CN103761737B (en) Robot motion&#39;s method of estimation based on dense optical flow
WO2018103408A1 (en) Aerial image capturing method and system for unmanned aerial vehicle to survey traffic accident scene
US7196730B2 (en) Method and system for complete 3D object and area digitizing
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
CN108168521A (en) One kind realizes landscape three-dimensional visualization method based on unmanned plane
CN107255443A (en) Binocular vision sensor field calibration method and device under a kind of complex environment
CA2961921A1 (en) Camera calibration method using a calibration target
CN103226838A (en) Real-time spatial positioning method for mobile monitoring target in geographical scene
US20090154793A1 (en) Digital photogrammetric method and apparatus using intergrated modeling of different types of sensors
CN109238235B (en) Method for realizing rigid body pose parameter continuity measurement by monocular sequence image
CN104089628B (en) Self-adaption geometric calibration method of light field camera
CN104034305B (en) A kind of monocular vision is the method for location in real time
CN103136747A (en) Automotive camera system and its calibration method and calibration program
CN102927908A (en) Robot eye-on-hand system structured light plane parameter calibration device and method
CN102693543B (en) Method for automatically calibrating Pan-Tilt-Zoom in outdoor environments
CN102519434A (en) Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN106408601A (en) GPS-based binocular fusion positioning method and device
CN106403900A (en) Flyer tracking and locating system and method
CN104931070B (en) A kind of optical signal injected simulation method
CN105758623A (en) TDI-CCD-based large-aperture long-focal length remote sensing camera distortion measurement device and measurement method
CN106096207A (en) A kind of rotor wing unmanned aerial vehicle wind resistance appraisal procedure based on multi-vision visual and system
Mouget et al. Photogrammetric archaeological survey with UAV

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: No. 2000, Wang Yue Central Road, Ji'nan City, Shandong Province, 250002

Co-patentee after: National Network Intelligent Technology Co., Ltd.

Patentee after: Electric Power Research Institute of State Grid Shandong Electric Power Company

Co-patentee after: State Grid Corporation of China

Address before: No. 2000, Wang Yue Central Road, Ji'nan City, Shandong Province, 250002

Co-patentee before: Shandong Luneng Intelligent Technology Co., Ltd.

Patentee before: Electric Power Research Institute of State Grid Shandong Electric Power Company

Co-patentee before: State Grid Corporation of China

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201029

Address after: 250101 Electric Power Intelligent Robot Production Project 101 in Jinan City, Shandong Province, South of Feiyue Avenue and East of No. 26 Road (ICT Industrial Park)

Patentee after: National Network Intelligent Technology Co.,Ltd.

Address before: No. 2000, Wang Yue Central Road, Ji'nan City, Shandong Province, 250002

Patentee before: ELECTRIC POWER RESEARCH INSTITUTE OF STATE GRID SHANDONG ELECTRIC POWER Co.

Patentee before: National Network Intelligent Technology Co.,Ltd.

Patentee before: STATE GRID CORPORATION OF CHINA