CN114200397A - Tethered unmanned aerial vehicle photoelectric positioning system independent of satellite navigation and distance-measuring equipment - Google Patents


Info

Publication number
CN114200397A
CN114200397A
Authority
CN
China
Prior art keywords
ground
solid-state camera
airborne
laser target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111282905.9A
Other languages
Chinese (zh)
Inventor
张润哲
王全喜
李伟
李迅
李庶中
王泽众
罗军
曾浩
鉴福升
李洁
李越强
赵鹏鹏
张毅
赵东伟
闫鹏浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Yidian Aviation Technology Co ltd
Unit 91977 Of Pla
Original Assignee
Sichuan Yidian Aviation Technology Co ltd
Unit 91977 Of Pla
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Yidian Aviation Technology Co., Ltd. and Unit 91977 of PLA
Priority to CN202111282905.9A
Publication of CN114200397A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a photoelectric positioning system for a tethered unmanned aerial vehicle that does not depend on satellite navigation or distance-measuring equipment. The system comprises an airborne optical alignment device deployed on the tethered unmanned aerial vehicle, a ground optical alignment device deployed at a ground control station, and a resolving and positioning module, wherein the airborne optical alignment device comprises an airborne solid-state camera and an airborne laser target, and the ground optical alignment device comprises a ground solid-state camera and a ground laser target. The airborne optical alignment device images the ground laser target with the airborne solid-state camera to obtain the coordinate values of the ground laser target on the target surface of the airborne solid-state camera; the ground optical alignment device images the airborne laser target with the ground solid-state camera to obtain the coordinate values of the airborne laser target on the target surface of the ground solid-state camera; and the resolving and positioning module resolves these coordinate values according to the optical geometric relationships to obtain the position information of the unmanned aerial vehicle, thereby realizing positioning.

Description

Tethered unmanned aerial vehicle photoelectric positioning system independent of satellite navigation and distance-measuring equipment
Technical Field
The invention belongs to the field of optoelectronic technology, and particularly relates to a photoelectric positioning system for a tethered unmanned aerial vehicle that does not depend on satellite navigation or distance-measuring equipment.
Background
In a tethered unmanned aerial vehicle system, the unmanned aerial vehicle platform is connected to a ground control station through a mooring cable. The ground control station transmits electric energy, control commands and other information to the tethered unmanned aerial vehicle through the mooring cable, and the tethered unmanned aerial vehicle transmits acquired state and target position information back to the ground control station through the mooring cable, thereby realizing functions such as target indication and attack guidance for ground weapon systems. Such a system has the advantages of long endurance, high unmanned aerial vehicle positioning accuracy, long detection range and high detection accuracy.
The premise for a tethered unmanned aerial vehicle system to achieve functions such as high-precision target indication is that the tethered unmanned aerial vehicle platform itself is positioned with high precision. At present, differential satellite navigation is the main technique used to achieve high-precision positioning; in particular, RTK satellite navigation can achieve centimeter-level positioning accuracy of the platform and has therefore become the positioning technique widely adopted by tethered unmanned aerial vehicle platforms.
Although differential satellite navigation has advantages such as a wide application range and extremely high positioning accuracy, it still has defects that are difficult to overcome in practical use. For example, in densely built-up areas the GPS signal is easily blocked and reflected by buildings, so that the receiver cannot receive the signal or receives multipath reflections, which degrades positioning real-time performance and accuracy; when the RTK base station cannot acquire satellites or loses its data link, the system falls back to ordinary single-point satellite navigation, and the positioning accuracy of the tethered unmanned aerial vehicle can no longer be maintained; in addition, once the satellite navigation signal is jammed, the whole system cannot work. In summary, there is a need to develop a tethered unmanned aerial vehicle system that achieves precise positioning without relying on satellite navigation technology.
Disclosure of Invention
The invention aims to overcome the defects of tethered unmanned aerial vehicle systems that use satellite navigation technology. It provides a tethered unmanned aerial vehicle photoelectric positioning system that is independent of satellite navigation and distance-measuring equipment, accurately measures the coordinate values, heading and pitch of the tethered unmanned aerial vehicle in the tethered-platform coordinate system using a photoelectric mutual-aiming (cross-pointing) technique, and solves the problem that the unmanned aerial vehicle platform in existing tethered unmanned aerial vehicle systems can only achieve high-precision positioning by means of satellite navigation.
To achieve the above object, the invention provides a tethered unmanned aerial vehicle photoelectric positioning system that does not rely on satellite navigation or ranging equipment. The system comprises: an airborne optical alignment device deployed on the tethered unmanned aerial vehicle, a ground optical alignment device deployed at the ground control station, and a resolving and positioning module; wherein the airborne optical alignment device comprises an airborne solid-state camera and an airborne laser target, and the ground optical alignment device comprises a ground solid-state camera and a ground laser target;
the airborne optical alignment device is used for imaging the ground laser target through the airborne solid-state camera to obtain the coordinate value of the ground laser target on the target surface of the airborne solid-state camera;
the ground optical alignment device is used for imaging the airborne laser target through the ground solid-state camera to obtain the coordinate value of the airborne laser target on the target surface of the ground solid-state camera;
and the resolving and positioning module is used for resolving the coordinate values to obtain the position information of the unmanned aerial vehicle according to the optical geometric relationship, so that the positioning is realized.
As an improvement of the above system, the onboard solid-state camera employs a CMOS or CCD sensor, the ground solid-state camera employs a CMOS or CCD sensor, and the distance resolution between the onboard solid-state camera and the ground solid-state camera during normal operation is not less than a preset threshold.
As an improvement of the above system, the airborne laser target comprises not less than 2 light sources, which are respectively installed on the left and right sides of the azimuth axis of the tethered unmanned aerial vehicle and are equidistant from the azimuth axis of the unmanned aerial vehicle; the line connecting the 2 light sources passes through the geometric center of the tethered unmanned aerial vehicle, and the pointing directions of the light sources are all parallel to the optical axis direction of the airborne solid-state camera;
the ground laser target comprises not less than 2 light sources; the line connecting the 2 light sources and its perpendicular bisector both pass through the optical axis of the camera, the geometric center of the ground laser target coincides with the optical axis of the ground solid-state camera, and the reference line direction of the light sources is parallel to the optical axis direction of the ground solid-state camera.
As an improvement of the above system, the specific processing procedure of the resolving positioning module is as follows:
establishing a coordinate system with a ground control station as an origin;
receiving the abscissas m_p and n_p of the 2 light sources of the airborne laser target on the ground solid-state camera target surface;
receiving the ordinates k_p and l_p of the 2 light sources of the airborne laser target on the ground solid-state camera target surface;
receiving the abscissas m_d and n_d of the 2 light sources of the ground laser target on the airborne solid-state camera target surface;
receiving the ordinates k_d and l_d of the 2 light sources of the ground laser target on the airborne solid-state camera target surface;
calculating, from the abscissas m_p and n_p and the ordinates k_p and l_p, the azimuth α_d-p and pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera;
calculating, from the abscissas m_d and n_d and the ordinates k_d and l_d, the azimuth α_p-d and pitch β_p-d of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera;
calculating the slant distance R between the ground solid-state camera and the airborne solid-state camera, in combination with the roll υ output by the airborne inertial navigation equipment of the tethered unmanned aerial vehicle;
and calculating the coordinates, heading and pitch of the tethered unmanned aerial vehicle from the azimuth, pitch and slant distance according to the optical geometric relationships.
As an improvement of the above system, the calculation, from the abscissas m_p and n_p and the ordinates k_p and l_p, of the azimuth α_d-p and pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera specifically comprises:
according to the abscissas m_p and n_p, the abscissa x_0-p of the geometric center position of the airborne laser target on the ground solid-state camera target surface is calculated as:
x_0-p = (m_p + n_p) / 2
the azimuth α_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera is then calculated as:
α_d-p = x_0-p · Δθ_p
wherein Δθ_p is the angular resolution of the ground solid-state camera,
Δθ_p = N_p / f_p
f_p is the focal length of the ground solid-state camera, and the single pixel size of the ground solid-state camera is N_p × N_p;
according to the ordinates k_p and l_p, the ordinate y_0-p of the geometric center position of the airborne laser target on the ground solid-state camera target surface is calculated as:
y_0-p = (k_p + l_p) / 2
and the pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera is calculated as:
β_d-p = y_0-p · Δθ_p
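For illustration only, the pixel-to-angle conversion described above can be sketched in a few lines of Python. The function and the example pixel coordinates below are the author's own (hypothetical) and assume image coordinates measured from the optical-axis point of the target surface; the same routine applies to the airborne camera with its own Δθ_d.

```python
import math

def offset_angles(m, n, k, l, pixel_size_m, focal_length_m):
    """Azimuth and pitch (rad) of the laser-target geometric center off the
    camera optical axis, from the target-surface coordinates (in pixels,
    relative to the optical-axis point) of its two light sources."""
    delta_theta = pixel_size_m / focal_length_m      # angular resolution, rad per pixel
    x0 = (m + n) / 2.0                               # abscissa of the geometric center
    y0 = (k + l) / 2.0                               # ordinate of the geometric center
    return x0 * delta_theta, y0 * delta_theta        # (azimuth, pitch)

# Hypothetical image coordinates with the embodiment's camera (f = 135 mm, 4 um pixels)
alpha_dp, beta_dp = offset_angles(m=110.0, n=130.0, k=-42.0, l=-38.0,
                                  pixel_size_m=4e-6, focal_length_m=135e-3)
print(math.degrees(alpha_dp), math.degrees(beta_dp))
```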
as an improvement to the above system, said reference m is based on the abscissadAnd ndAnd ordinate kdAnd ldAnd calculating the direction alpha of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camerap-dAnd pitch betap-d(ii) a The method specifically comprises the following steps:
according to the abscissa mdAnd ndAnd calculating the abscissa x of the geometric center position of the ground laser target on the target surface of the airborne solid-state camera according to the following formula0-dComprises the following steps:
Figure BDA0003331795590000034
calculating the deviation of the geometric center position of the ground laser target from the optical axis orientation alpha of the airborne solid-state camera according to the following formulap-dComprises the following steps:
αp-d=x0-d·Δθd
wherein, Delta thetadFor the angular resolution of the on-board solid-state camera,
Figure BDA0003331795590000041
fdfor the focal length of the onboard solid-state camera, the size of a single phase element of the onboard solid-state camera is Nd×Nd
According to ordinate kdAnd ldAnd calculating the ordinate y of the geometric center position of the ground laser target on the target surface of the airborne camera according to the following formula0-dComprises the following steps:
Figure BDA0003331795590000042
calculating the pitching beta of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera according to the following formulap-dComprises the following steps:
βp-d=y0-d·Δθd
as an improvement of the system, the roll upsilon output by the airborne inertial navigation device of the tethered unmanned aerial vehicle is combined to calculate the slope distance R of the ground solid-state camera and the airborne solid-state camera; the method specifically comprises the following steps:
Figure BDA0003331795590000043
wherein F (. alpha.) represents the value of p.alpha.p-dAnd when upsilon is known, the projection length of the interval rho of the 2 light sources of the airborne laser target on a plane parallel to the target surface of the ground solid-state camera at the position of the captive unmanned aerial vehicle.
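The slant-range relation above can be sketched as follows. The patent defines F(ρ, α_p-d, υ) only through the projection geometry of fig. 6 and gives no closed form at this point, so the sketch accepts it as a caller-supplied function; the stand-in F used in the example simply returns ρ and is an assumption for demonstration, not the patent's projection.

```python
from typing import Callable

def slant_range(rho_prime_px: float, delta_theta_p: float, rho_m: float,
                alpha_pd: float, roll: float,
                F: Callable[[float, float, float], float]) -> float:
    """Slant distance R between the ground and airborne cameras.

    rho_prime_px : image spacing of the two airborne light sources on the
                   ground camera target surface, in pixels
    delta_theta_p: angular resolution of the ground camera, rad per pixel
    rho_m        : true spacing of the two airborne light sources, m
    alpha_pd     : azimuth of the ground target off the airborne optical axis, rad
    roll         : roll of the UAV from the onboard inertial navigation, rad
    F            : projection function giving the baseline length projected
                   onto a plane parallel to the ground camera target surface
    """
    rho_0 = F(rho_m, alpha_pd, roll)
    return rho_0 / (rho_prime_px * delta_theta_p)

# Stand-in F that ignores alpha_pd and roll; with the embodiment's numbers
# (rho = 2 m, ~238.9 px image spacing, 0.0296 mrad/px) this returns ~282.8 m.
R = slant_range(238.9, 0.0296e-3, 2.0, 0.0, 0.0, lambda rho, a, r: rho)
print(R)
```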
As an improvement of the above system, the coordinates, heading and pitch of the tethered unmanned aerial vehicle are calculated from the azimuth, pitch and slant distance according to the optical geometric relationships, specifically as follows:
the coordinates (x_d, y_d, z_d) of the tethered unmanned aerial vehicle are:
x_d = R · cos(β_0 + β_d-p) · cos(α_0 + α_d-p′)
y_d = R · cos(β_0 + β_d-p) · sin(α_0 + α_d-p′)
z_d = R · sin(β_0 + β_d-p)
wherein α_d-p′ is the orientation of the geometric center position of the airborne laser target relative to the ground solid-state camera, α_0 is the azimuth angle of the optical axis of the ground solid-state camera, β_0 is the pitch angle of the optical axis of the ground solid-state camera, β_d-p is the pitch of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera, and R is the slant distance between the ground solid-state camera and the airborne solid-state camera;
the heading α_d of the tethered unmanned aerial vehicle relative to the ground control station is:
α_d = 360° - α_p-d′
wherein α_p-d′ is the orientation of the geometric center position of the ground laser target relative to the airborne solid-state camera;
the pitch β_d of the tethered unmanned aerial vehicle is:
β_d = -(-γ_0 + β_p-d)
wherein γ_0 is the pitch of the optical axis of the airborne solid-state camera deviating from the reference line of the tethered unmanned aerial vehicle, -90° ≤ β_d, β_p-d, γ_0 ≤ 90°, and β_d, β_p-d, γ_0 are all measured from the horizontal plane, positive upward and negative downward.
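A minimal sketch of the final position, heading and pitch computation follows; it simply evaluates the formulas above. All input values in the example call are arbitrary placeholders chosen by the author, not measurements from the patent.

```python
import math

def uav_state(R, alpha0, beta0, alpha_dp_rel, beta_dp, alpha_pd_rel, beta_pd, gamma0):
    """Coordinates (m), heading (deg) and pitch (rad) of the tethered UAV
    in the ground coordinate system, per the relations stated above.
    All angle arguments are in radians."""
    x = R * math.cos(beta0 + beta_dp) * math.cos(alpha0 + alpha_dp_rel)
    y = R * math.cos(beta0 + beta_dp) * math.sin(alpha0 + alpha_dp_rel)
    z = R * math.sin(beta0 + beta_dp)
    heading_deg = 360.0 - math.degrees(alpha_pd_rel)
    pitch = -(-gamma0 + beta_pd)
    return (x, y, z), heading_deg, pitch

# Placeholder inputs only, to exercise the formulas.
position, heading, pitch = uav_state(
    R=282.8, alpha0=math.radians(30.0), beta0=math.radians(45.0),
    alpha_dp_rel=math.radians(0.2), beta_dp=math.radians(-0.1),
    alpha_pd_rel=math.radians(0.3), beta_pd=math.radians(0.5),
    gamma0=math.radians(-45.0))
print(position, heading, math.degrees(pitch))
```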
As an improvement of the system, the ground optical alignment device further comprises a ground reference table for the initial calibration of the optical axis pointing direction of the ground solid-state camera.
Compared with the prior art, the invention has the following advantages:
1. High-precision positioning is achieved without relying on satellite navigation technology. The invention realizes high-precision positioning, heading and pitch measurement of the tethered unmanned aerial vehicle relative to the tethered platform through a photoelectric mutual-aiming technique, avoiding the defects of satellite-navigation-based solutions such as the inability to guarantee high-precision positioning at all times, RTK base-station signal interruption, and jamming of the satellite navigation equipment;
2. It is little affected by environmental factors such as illumination intensity and cloudy or foggy weather. A high-definition solid-state camera is used together with a laser target, and accurate positioning, heading and pitch measurement are realized through air-ground mutual aiming; the laser target has extremely high visibility, so mutual aiming can be achieved under strong daytime illumination, at night and in bad weather;
3. The device has a simple structure and low cost. The key components are only 2 high-definition solid-state cameras and matching laser targets, no additional distance-measuring device needs to be installed, the structure is simple, the purchase cost is low, and installation, debugging and use are easy.
Drawings
FIG. 1 is a general block diagram of the tethered drone optoelectronic positioning system of the present invention that does not rely on satellite navigation and ranging equipment;
FIG. 2 is a schematic view of an optical alignment apparatus;
fig. 3 is a schematic view of the operating position of the tethered drone system;
FIG. 4 is a schematic illustration of the principle of laser target imaging;
FIG. 5 is a schematic diagram illustrating a conversion relationship between the azimuth angle of the target surface of the camera and the real azimuth angle;
FIG. 6 is a schematic view of the airborne laser target projection geometry;
FIG. 7 is a schematic diagram of the imaging geometry of an airborne laser target on a ground solid-state camera;
fig. 8 is a tethered drone pitch solution schematic;
FIG. 9 is a schematic view of an onboard optical alignment device installation;
fig. 10 is a schematic diagram of the accuracy with which the system can identify the change in the position of a tethered drone.
Reference numerals
1. Tethered unmanned aerial vehicle; 2. Ground control station
3. Airborne optical alignment device; 4. Ground optical alignment device
5. Ground power supply; 6. Signal processing equipment
7. Airborne solid-state camera; 8. Airborne laser target
9. Ground solid-state camera; 10. Ground laser target
11. Ground reference table; 12. Mooring cable
Detailed Description
The invention provides a tethered unmanned aerial vehicle photoelectric positioning system independent of satellite navigation and distance measurement equipment, which comprises: the system comprises an airborne optical alignment device deployed on a tethered unmanned aerial vehicle, a ground optical alignment device deployed on a ground control station and a resolving and positioning module; wherein the airborne optical alignment device comprises an airborne solid-state camera and an airborne laser target, and the ground optical alignment device comprises a ground solid-state camera and a ground laser target;
the airborne optical alignment device is used for imaging the ground laser target through the airborne solid-state camera to obtain the coordinate value of the ground laser target on the target surface of the airborne solid-state camera;
the ground optical alignment device is used for imaging the airborne laser target through the ground solid-state camera to obtain the coordinate value of the airborne laser target on the target surface of the ground solid-state camera;
and the resolving and positioning module is used for resolving the coordinate values to obtain the position information of the unmanned aerial vehicle according to the optical geometric relationship, so that the positioning is realized.
The core of the system is that the tethered unmanned aerial vehicle and the tethered platform aim at each other, i.e. image each other, through the high-definition solid-state cameras and laser targets respectively installed on them, so as to measure the azimuth and pitch of the tethered unmanned aerial vehicle; the tethered platform observes the laser target carried by the tethered unmanned aerial vehicle to measure the slant distance between the tethered unmanned aerial vehicle and the tethered platform. From this information the system resolves and outputs the coordinates of the tethered unmanned aerial vehicle in the tethered-platform coordinate system, determines the precise position of the tethered unmanned aerial vehicle relative to the tethered platform, and provides positioning data for the tethered unmanned aerial vehicle.
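For readers who prefer a concrete picture of the data involved, the following sketch lists the quantities exchanged per measurement epoch. It is purely illustrative; the class and field names are the author's, not part of the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MutualAimingEpoch:
    """One epoch of air-ground mutual-aiming data (illustrative names).

    The ground side reports the image coordinates of the two airborne
    light sources on the ground camera target surface; the airborne side
    reports the image coordinates of the two ground light sources on the
    airborne camera target surface, plus the roll from the onboard
    inertial navigation.  No dedicated ranging hardware is involved."""
    ground_cam_px: Tuple[Tuple[float, float], Tuple[float, float]]    # ((m_p, k_p), (n_p, l_p))
    airborne_cam_px: Tuple[Tuple[float, float], Tuple[float, float]]  # ((m_d, k_d), (n_d, l_d))
    roll_rad: float                                                   # UAV roll, rad
```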
The present invention will be described in further detail with reference to the following drawings and specific examples.
Examples
As shown in fig. 1, an embodiment of the present invention provides a tethered unmanned aerial vehicle photoelectric positioning system that does not rely on satellite navigation and ranging equipment. The system comprises: a tethered unmanned aerial vehicle 1, a ground control station 2, an airborne optical alignment device 3, a ground optical alignment device 4, a ground power supply 5, a signal processing device 6 and a mooring cable 12. The tethered unmanned aerial vehicle 1 carries the airborne optical alignment device 3; an unmanned aerial vehicle meeting the requirements is selected according to the working requirements of the tethered unmanned aerial vehicle system, such as the type of task to be executed and the load weight. The ground control station 2 serves as the carrier of the ground equipment; the ground power supply 5 supplies power to the platform and payload of the tethered unmanned aerial vehicle 1; the signal processing device 6 analyzes the angle, distance and other data acquired by the system; and the mooring cable 12 connects the tethered unmanned aerial vehicle 1 with the ground control station 2, transmitting control commands, angle and distance information, and electric power to the tethered unmanned aerial vehicle 1.
As shown in fig. 2, the airborne optical alignment device 3 comprises: an airborne solid-state camera 7, used to aim at the ground laser target 10 and determine the azimuth α_p-d and pitch β_p-d of the center position of the ground laser target 10 deviating from the optical axis direction of the airborne solid-state camera 7; and an airborne laser target 8, to be observed by the ground solid-state camera 9.
The ground optical alignment device 4 comprises: a ground solid-state camera 9, used to aim at the airborne laser target 8 and determine the azimuth α_d-p and pitch β_d-p of the center position of the airborne laser target 8 deviating from the optical axis direction of the ground solid-state camera 9, as well as the slant distance R between the tethered unmanned aerial vehicle 1 and the ground control station; a ground laser target 10, to be observed by the airborne solid-state camera 7; and a ground reference table 11, used for the initial calibration of the optical axis pointing direction of the ground solid-state camera 9.
The airborne solid-state camera 7 is a single high-definition camera whose sensor can be a CMOS or a CCD. The scheme adopted in this embodiment replaces RTK satellite navigation to realize accurate positioning of the tethered unmanned aerial vehicle 1, so the positioning accuracy of the system should be no worse than that of RTK satellite navigation; that is, the distance resolution Δd of the airborne solid-state camera 7 and the ground solid-state camera 9 at the slant distance R should satisfy Δd|_R ≤ 0.1 m. In addition, to facilitate mutual aiming between the airborne solid-state camera 7 and the ground solid-state camera 9, i.e. so that the opposing laser target can easily enter the camera field of view, the field of view of the solid-state camera at the slant distance R should be large enough. If the camera resolution is H × V, the focal length is f and the single pixel size is N × N, the angular resolution of the solid-state camera is
Δθ = N / f
the horizontal field of view is
θ_H = H · Δθ
and the vertical field of view is
θ_V = V · Δθ
Similarly, the angular resolution of the ground solid-state camera can be calculated from the corresponding parameters of the ground solid-state camera.
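As a quick numerical illustration (the author's sketch, not part of the patent), the camera angular quantities defined above can be computed directly; with the parameters chosen in the embodiment below it reproduces the quoted values.

```python
import math

def camera_angles(h_px, v_px, focal_length_m, pixel_size_m):
    """Angular resolution and horizontal/vertical field of view (rad)."""
    delta_theta = pixel_size_m / focal_length_m   # rad per pixel
    return delta_theta, h_px * delta_theta, v_px * delta_theta

dt, th, tv = camera_angles(1920, 1080, 135e-3, 4e-6)
print(dt * 1e3, math.degrees(th), math.degrees(tv))
# ~0.0296 mrad, ~3.26 deg, ~1.83 deg
```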
The airborne laser target 8 is a device consisting of several light sources; the light sources may be diode lasers, with optical lenses added in front of the lasers to narrow their divergence angle. As shown in fig. 2, the airborne laser target 8 comprises not less than 2 light sources; this example uses the case of 2 light sources. The 2 light sources are respectively installed on the left and right sides of the tethered unmanned aerial vehicle 1, equidistant from the azimuth axis of the unmanned aerial vehicle. Having the line connecting the 2 light sources pass through the geometric center of the tethered unmanned aerial vehicle 1 is the optimal scheme, since it minimizes the influence of attitude changes of the tethered unmanned aerial vehicle 1 on the spatial position of the light sources; if another installation mode is used, calibration compensation should be applied. During installation and calibration, the light source pointing direction should be kept parallel to the optical axis direction of the airborne solid-state camera 7. Because the distance R is calculated by comparing the spacing ρ′ of the images of the airborne laser target 8 on the target surface of the ground solid-state camera 9 with the real spacing ρ of the 2 light sources on the aircraft, the spacing ρ should be large enough to make the airborne laser target 8 easy for the ground solid-state camera 9 to observe, while the installation positions must also take into account the payload, counterweight and other practical conditions of the tethered unmanned aerial vehicle 1. To ensure that the image of a light source on the target surface of the ground solid-state camera 9 is smaller than 1 pixel, the light source size should satisfy d_0 ≤ Δd|_R; to ensure that the solid-state camera can observe the light sources of the opposing laser target within its field of view, the divergence angle of the light source should satisfy ψ ≥ max(θ_H, θ_V). In addition, to ensure that the ground solid-state camera 9 can observe the laser target light sources without being damaged by excessive received light intensity, the laser target light sources should be set to an appropriate power.
The ground solid-state camera 9 is specified with reference to the airborne solid-state camera 7; during installation and calibration, the horizontal axis of the camera target surface should be kept horizontal.
The ground laser target 10 is a device consisting of several light sources and comprises not less than 2 light sources; this example uses the case of 2 light sources. As shown in fig. 2, the geometric center of the ground laser target 10 coincides with the optical axis of the ground solid-state camera 9, and the light source reference line direction should be parallel to the optical axis of the ground solid-state camera 9. To make it easier for the airborne solid-state camera 7 to calculate the geometric center position of the ground laser target 10 by capturing the light sources, the spacing ρ_1 of the 2 light sources should be at least one order of magnitude larger than the light source size d_0; this works well, but it is not strictly required to exceed one order of magnitude.
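The light-source sizing rules in the two paragraphs above lend themselves to a simple check. The sketch below is illustrative; the function name and the fixed 10x spacing margin for the ground target are the author's reading of "at least one order of magnitude", not exact patent requirements.

```python
def check_laser_target(d0_m, psi_rad, ground_spacing_m,
                       slant_range_m, delta_theta, theta_h, theta_v):
    """Check the sizing constraints stated above."""
    dist_res = slant_range_m * delta_theta           # distance resolution at R
    return {
        "source image under one pixel (d0 <= dist_res)": d0_m <= dist_res,
        "divergence covers the FOV (psi >= max(theta_H, theta_V))":
            psi_rad >= max(theta_h, theta_v),
        "ground-target spacing >= 10 * d0": ground_spacing_m >= 10.0 * d0_m,
    }

print(check_laser_target(d0_m=5e-3, psi_rad=0.057, ground_spacing_m=0.15,
                         slant_range_m=282.8, delta_theta=0.0296e-3,
                         theta_h=0.0569, theta_v=0.0320))
```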
The specific processing procedure of the system comprises the following steps:
the method comprises the following steps: a coordinate system with the ground control station 2 as the origin is established.
As shown in fig. 3, a coordinate system is established with the position of the ground laser target 10 as an origin, the origin of the coordinate is O, the xOy plane is a horizontal plane, and the y-axis direction is set as the relative true north of the coordinate system.
Step two: the airborne solid-state camera 7 and the ground solid-state camera 9 are aimed mutually, and the direction and the pitching of the geometric center of the laser target deviating from the optical axis of the opposite camera are calculated by identifying the laser target.
The ground solid-state camera 9 captures the light sources of the airborne laser target 8; the imaging situation on the camera target surface is shown in fig. 4. If the abscissa values of the 2 light sources on the target surface of the ground solid-state camera 9 are m and n respectively, the abscissa of the geometric center position of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is
x_0 = (m + n) / 2
In practical engineering applications, the average of multiple measurements may be used to reduce error; for convenience of description, x_0 = (m + n) / 2 is used in this example. The azimuth of the geometric center position of the airborne laser target 8 deviating from the optical axis of the ground solid-state camera 9 can therefore be calculated as
α_d-p = x_0 · Δθ
As shown in fig. 5, point A is the preset working position of the tethered unmanned aerial vehicle 1, point B is the actual position of the tethered unmanned aerial vehicle 1, and points A′ and B′ are the projections of A and B on the xOy plane, so that ∠AOB = α_d-p and ∠AOA′ = ∠BOB′ = β_0. Let AA′ = BB′ = h and OA = OB = R; the orientation of the geometric center position of the airborne laser target 8 relative to the ground solid-state camera 9 is ∠A′OB′ = α_d-p′, the coordinates of point A are (0, h·tanβ_0, h), and the coordinates of point B are (h·tanβ_0·sinα_d-p′, h·tanβ_0·cosα_d-p′, h). From the vector geometric relationship
cos α_d-p = (OA · OB) / (|OA| · |OB|)
it can be deduced that
α_d-p′ = arccos[(cos α_d-p - cos²β_0) / sin²β_0]
Similarly, if the ordinate values of the 2 light sources of the airborne laser target 8 on the target surface of the ground solid-state camera 9 are k and l respectively, the ordinate of the geometric center of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is
y_0 = (k + l) / 2
so the pitch of the geometric center position of the airborne laser target 8 deviating from the optical axis of the ground solid-state camera 9 can be calculated as
β_d-p = y_0 · Δθ
The airborne solid-state camera 7 obtains the azimuth and pitch of the geometric center position of the ground laser target 10 deviating from the optical axis of the airborne solid-state camera 7 in the same way, measuring the azimuth α_p-d and pitch β_p-d of the geometric center position of the ground laser target 10 deviating from the optical axis of the airborne solid-state camera 7.
Similarly, the orientation α_p-d′ of the geometric center position of the ground laser target 10 relative to the tethered unmanned aerial vehicle 1 is obtained from α_p-d by the same conversion relationship.
Step three: the slant distance R is calculated by comparing the spacing of the images of the airborne laser target 8 on the target surface of the ground solid-state camera 9 with the real spacing of the 2 light sources on the aircraft, using the roll υ of the tethered unmanned aerial vehicle 1 output by the airborne inertial navigation equipment.
The image spacing of the 2 light sources of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is
ρ′ = √[(m - n)² + (k - l)²]
Also, in practical engineering applications, a method of averaging a plurality of measurements may be used in view of reducing errors.
As shown in fig. 6, E is light source 1 and H is light source 2, so EH = ρ; plane ξ is the horizontal plane in which the tethered unmanned aerial vehicle 1 is located; E′ is the projection of light source 1 onto plane ξ; O is the position of the ground solid-state camera and OF is the optical axis direction of the ground solid-state camera 9. Through H, a plane perpendicular to OF is constructed, intersecting OF at P and OE at K, so that HK is the required projection. When ρ, α_p-d and υ are known, the projection length ρ_0 of ρ on a plane parallel to the ground solid-state camera target surface at the position of the tethered unmanned aerial vehicle, expressed by HK, is
ρ_0 = F(ρ, α_p-d, υ)
As shown in fig. 7, from the optical geometry,
ρ′ · Δθ ≈ ρ_0 / R
from which
R = ρ_0 / (ρ′ · Δθ)
can be obtained.
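A short numeric sanity check of the small-angle relation above, using the embodiment's values (ρ_0 ≈ ρ = 2 m, R = 282.8 m, Δθ = 0.0296 mrad); the roughly 239-pixel image spacing is the author's computed value, not a figure quoted in the patent.

```python
rho_0 = 2.0               # projected light-source spacing, m
R_true = 282.8            # slant distance, m
delta_theta = 0.0296e-3   # angular resolution, rad per pixel

rho_prime = rho_0 / (R_true * delta_theta)     # expected image spacing, ~238.9 px
R_solved = rho_0 / (rho_prime * delta_theta)   # inverting recovers R, ~282.8 m
print(rho_prime, R_solved)
```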
Step four: the position, heading and pitch of the tethered unmanned aerial vehicle 1 relative to the ground control station 2 are calculated from the acquired distance, azimuth and pitch information.
From the geometric relationship shown in fig. 3, the x coordinate of the tethered unmanned aerial vehicle 1 is
x_d = R · cos(β_0 + β_d-p) · cos(α_0 + α_d-p′)
the y coordinate of the tethered unmanned aerial vehicle 1 is
y_d = R · cos(β_0 + β_d-p) · sin(α_0 + α_d-p′)
and the z coordinate of the tethered unmanned aerial vehicle 1 is
z_d = R · sin(β_0 + β_d-p)
The heading of the tethered unmanned aerial vehicle 1 relative to the ground control station 2 is
α_d = 360° - α_p-d′
As shown in fig. 8, the pitch of the tethered unmanned aerial vehicle 1 is
β_d = -(-γ_0 + β_p-d)
wherein γ_0 is the pitch of the optical axis of the airborne solid-state camera 7 deviating from the reference line of the tethered unmanned aerial vehicle 1, -90° ≤ β_d, β_p-d, γ_0 ≤ 90°, and β_d, β_p-d, γ_0 are all measured from the horizontal plane, positive upward and negative downward.
The system configuration steps are as follows:
the method comprises the following steps: selecting a mooring unmanned aerial vehicle 1, and presetting relative coordinates of the working position of the mooring unmanned aerial vehicle 1
Selecting proper ones according to task requirementsMan-machine, the preset operating position who moors unmanned aerial vehicle 1. The operating position of the tethered unmanned aerial vehicle 1 is a, the projection point of the point a on the xOy plane is B, in this example, AB is preset to 200m, and for convenience of calculation, OB is set to 200m, then
Figure BDA0003331795590000103
Step two: determining solid state camera and laser target specifications
Two solid-state cameras with resolution H × V = 1920 × 1080, focal length f = 135 mm and single pixel size N × N = 4 × 4 μm are selected. The angular resolution of this type of solid-state camera is
Δθ = N / f = 4 μm / 135 mm ≈ 0.0296 mrad
the horizontal field of view is
θ_H = H · Δθ = 1920 × 0.0296 mrad ≈ 3.256°
the vertical field of view is
θ_V = V · Δθ = 1080 × 0.0296 mrad ≈ 1.832°
and the distance resolution of the solid-state camera at the distance R = 282.8 m is
Δd|_R=282.8m = R · Δθ = 282.8 m × 0.0296 mrad ≈ 8.37 mm
The laser target light source size is selected as d_0 = 5 mm, with a light source divergence angle ψ ≥ max(θ_H, θ_V) = 3.256°. The spacing ρ of the 2 light sources of the airborne laser target 8 is determined by the actual installation conditions; in this example ρ = 2000 mm, and the spacing of the 2 light sources of the ground laser target 10 is taken as ρ_1 = 150 mm.
Step three: calibrate and fix the ground optical alignment device 4.
The ground laser target 10 is installed at the outer edge of the lens of the ground solid-state camera 9, the ground solid-state camera 9 is mounted on the ground reference table 11, the horizontal axis of the ground solid-state camera 9 target surface is calibrated to be horizontal, and the elevation angle β_0 of the optical axis of the ground solid-state camera 9 is adjusted (β_0 = ∠AOB, as shown in fig. 3). When β_0 is adjusted to 45°, the pitch of the optical axis of the ground solid-state camera 9 is locked; the azimuth of the ground reference table 11 is then adjusted so that the optical axis of the ground solid-state camera 9 points toward B with azimuth α_0, and when the azimuth α_0 is adjusted to ζ, the azimuth of the ground reference table 11 is locked. At this point the calibration and fixing of the ground optical alignment device 4 is complete, and the optical axis of the ground solid-state camera 9 points at A.
Step four: calibrate and install the airborne optical alignment device 3.
The 2 light sources of the airborne laser target 8 are installed on the left and right sides of the tethered unmanned aerial vehicle 1, respectively, and the pitch γ_0 of the optical axis of the airborne solid-state camera 7 deviating from the reference line AJ of the tethered unmanned aerial vehicle 1 is determined according to the preset coordinates of the tethered unmanned aerial vehicle 1. As shown in fig. 9, the azimuth of the optical axis of the airborne solid-state camera 7 should be the same as AJ and its pitch should be γ_0; the direction of the optical axis of the airborne optical alignment device 3 after fixed installation is shown as AK in fig. 8, and the light sources of the airborne laser target 8 are calibrated to point parallel to AK.
Step five: the tethered unmanned aerial vehicle 1 is lifted off and aims mutually with the ground control station 2, and the system acquires the relative air-ground azimuth, pitch and distance information.
The tethered unmanned aerial vehicle 1 is powered on, ascends, and flies to the preset working position; the heading of the tethered unmanned aerial vehicle 1 is adjusted so that the 2 light sources of the ground laser target 10 fall within the field of view of the airborne solid-state camera 7, thereby realizing air-ground mutual aiming.
The ground solid-state camera 9 captures the light sources of the airborne laser target 8; the imaging situation on the camera target surface is shown in fig. 4. The abscissa values of the 2 light sources on the target surface of the ground solid-state camera 9 are m_0 and n_0 respectively, so the abscissa of the geometric center of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is
x_0 = (m_0 + n_0) / 2
Therefore, the azimuth of the geometric center position of the airborne laser target 8 deviating from the optical axis of the ground solid-state camera 9 can be calculated as
α_d-p = x_0 · Δθ = φ_0
The orientation of the geometric center position of the airborne laser target 8 relative to the ground solid-state camera 9 is α_d-p′ = φ_0′.
The ordinate values of the 2 light sources of the airborne laser target 8 on the target surface of the ground solid-state camera 9 are l_0 and k_0 respectively, so the ordinate of the geometric center of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is
y_0 = (l_0 + k_0) / 2
Therefore, the pitch of the geometric center position of the airborne laser target 8 deviating from the optical axis of the ground solid-state camera 9 can be calculated as
β_d-p = y_0 · Δθ
The airborne solid-state camera 7 obtains the azimuth and pitch of the geometric center position of the ground laser target 10 deviating from the optical axis of the airborne solid-state camera 7 in the same way, measuring the azimuth α_p-d = φ_1 of the geometric center position of the ground laser target 10 deviating from the optical axis of the airborne solid-state camera 7, together with the corresponding pitch β_p-d.
The orientation of the geometric center position of the ground laser target 10 relative to the airborne solid-state camera 7 is α_p-d′ = φ_1′.
The image spacing ρ′ of the 2 light sources of the airborne laser target 8 on the target surface of the ground solid-state camera 9 is obtained from their image coordinates as described above.
As shown in fig. 6, the airborne inertial navigation equipment outputs the roll υ = υ_0 of the tethered unmanned aerial vehicle 1, so that
ρ_0 = F(ρ, φ_1, υ_0) = ρ_0d
From the geometric relationship in fig. 6,
ρ′ · Δθ ≈ ρ_0d / R
from which
R = ρ_0d / (ρ′ · Δθ)
can be obtained.
Step six: the accurate coordinates, heading and pitch of the tethered unmanned aerial vehicle 1 relative to the ground control station 2 are calculated from the acquired azimuth, pitch and distance information.
From the geometric relationship shown in fig. 3, the x coordinate of the tethered unmanned aerial vehicle 1 is
x_d = R · cos(45° + β_d-p) · cos(ζ + φ_0′)
the y coordinate of the tethered unmanned aerial vehicle 1 is
y_d = R · cos(45° + β_d-p) · sin(ζ + φ_0′)
and the z coordinate of the tethered unmanned aerial vehicle 1 is
z_d = R · sin(45° + β_d-p)
The heading of the tethered unmanned aerial vehicle 1 relative to the ground control station 2 is
α_d = 360° - α_p-d′ = 360° - φ_1′
As shown in fig. 8, the pitch of the tethered unmanned aerial vehicle 1 is
β_d = -(-γ_0 + β_p-d)
Feasibility analysis:
the horizontal coverage range of the camera at the slant distance R of 282.8m is not considered by the influence of terrain shading
Figure BDA0003331795590000135
Vertical coverage
Figure BDA0003331795590000138
It can be seen that the selected solid-state camera has a sufficiently large field of view at R = 282.8 m, so the airborne solid-state camera 7 and the ground solid-state camera 9 can aim at each other conveniently, i.e. the opposing laser target can enter the camera field of view relatively easily.
As shown in fig. 10, point O is the position of the ground solid-state camera 9, point A is the working position of the tethered unmanned aerial vehicle 1, plane ξ is the horizontal plane of the tethered unmanned aerial vehicle 1, the straight line AB is the intersection of the horizontal field of view of the ground solid-state camera 9 with plane ξ, and OA ⊥ AB; in this example ∠BAx = 45°. When the tethered unmanned aerial vehicle 1 moves Δd|_R=282.8m = 8.37 mm in the AB direction, the corresponding offset in the x-axis direction that the ground solid-state camera 9 can recognize is
Δx = Δd|_R / cos45° ≈ 11.84 mm
Similarly, the recognizable offsets of the tethered unmanned aerial vehicle in the y-axis and z-axis directions are Δy ≈ 11.84 mm and Δz ≈ 11.84 mm; that is, an offset of the tethered unmanned aerial vehicle 1 greater than 11.84 mm in any of the x, y and z directions can be recognized and sent to the flight control system of the tethered unmanned aerial vehicle 1, which is sufficient to ensure the stability of the tethered unmanned aerial vehicle 1.
With the parameters given in this example, the ranging accuracy of the system for the tethered unmanned aerial vehicle 1 at R = 282.8 m is
I_R ≈ 1.18 m
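The feasibility figures above can be reproduced with a few lines of arithmetic. This is the author's sketch: in particular, the ranging-sensitivity expression R² · Δθ / ρ is an interpretation of I_R as the range error per one-pixel error in the image spacing, chosen because it reproduces the 1.18 m and 0.148 m values quoted in the text.

```python
import math

R = 282.8                  # slant distance at the working position, m
delta_theta = 0.0296e-3    # angular resolution, rad per pixel
theta_h = 1920 * delta_theta
theta_v = 1080 * delta_theta

coverage_h = R * theta_h                       # horizontal coverage, ~16.1 m
coverage_v = R * theta_v                       # vertical coverage, ~9.0 m
dist_res = R * delta_theta                     # one-pixel resolution, ~8.37 mm
dx = dist_res / math.cos(math.radians(45.0))   # recognizable x offset, ~11.84 mm

rho = 2.0                                      # true light-source spacing, m
i_r = R ** 2 * delta_theta / rho               # ranging sensitivity, ~1.18 m
i_r_improved = i_r / (2 * 4)                   # 2x zoom (f = 270 mm) and 4x sub-pixel, ~0.148 m
print(coverage_h, coverage_v, dx * 1e3, i_r, i_r_improved)
```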
In practical applications, on the premise that the tethered unmanned aerial vehicle 1 hovers stably at the working position and air-ground mutual aiming has been achieved, the ground solid-state camera 9 can use a zoom lens to adjust its focal length, for example to f = 270 mm. The horizontal coverage of the ground solid-state camera 9 at the tethered unmanned aerial vehicle 1 is then approximately 8.0 m and the vertical coverage approximately 4.5 m, so the tethered unmanned aerial vehicle 1 is still guaranteed to remain within the field of view of the ground solid-state camera 9, while the camera distance resolution is improved to 2 times the original value. In addition, when locating the airborne laser target 8, the ground solid-state camera 9 can adopt sub-pixel identification and similar algorithms, which can further improve the camera distance resolution to 4 times the original value. By adopting both methods, the ranging accuracy for the tethered unmanned aerial vehicle 1 at R = 282.8 m can be improved to I_R ≈ 0.148 m, so that the positioning accuracy of the tethered unmanned aerial vehicle 1 provided by the system is comparable to that of the RTK satellite navigation technique.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present invention and are not limited. Although the present invention has been described in detail with reference to the embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. A tethered drone optoelectronic positioning system that does not rely on satellite navigation and ranging equipment, the system comprising: the system comprises an airborne optical alignment device deployed on a tethered unmanned aerial vehicle, a ground optical alignment device deployed on a ground control station and a resolving and positioning module; wherein the airborne optical alignment device comprises an airborne solid-state camera and an airborne laser target, and the ground optical alignment device comprises a ground solid-state camera and a ground laser target;
the airborne optical alignment device is used for imaging the ground laser target through the airborne solid-state camera to obtain the coordinate value of the ground laser target on the target surface of the airborne solid-state camera;
the ground optical alignment device is used for imaging the airborne laser target through the ground solid-state camera to obtain the coordinate value of the airborne laser target on the target surface of the ground solid-state camera;
and the resolving and positioning module is used for resolving the coordinate values to obtain the position information of the unmanned aerial vehicle according to the optical geometric relationship, so that the positioning is realized.
2. The system of claim 1, wherein the onboard solid-state camera is a CMOS or CCD sensor, the ground solid-state camera is a CMOS or CCD sensor, and the distance resolution between the onboard solid-state camera and the ground solid-state camera during normal operation is not less than a preset threshold.
3. The system of claim 1, wherein the airborne laser target comprises at least 2 light sources respectively mounted on the left and right sides of the azimuth axis of the tethered unmanned aerial vehicle and equidistant from the azimuth axis of the unmanned aerial vehicle, the line connecting the 2 light sources passes through the geometric center of the tethered unmanned aerial vehicle, and the pointing directions of the light sources are all parallel to the optical axis direction of the airborne solid-state camera;
the ground laser target comprises at least 2 light sources, a connecting line between the 2 light sources penetrates through an optical axis of the camera, a perpendicular bisector of the connecting line of the 2 light sources penetrates through the optical axis of the camera, the geometric center of the ground laser target coincides with the position of the optical axis of the ground solid-state camera, and the direction of a reference line of the light sources is parallel to the direction of the optical axis of the ground solid-state camera.
4. The system according to claim 3, wherein the specific processing procedure of the positioning resolving module is as follows:
establishing a coordinate system with a ground control station as an origin;
receiving the abscissas m_p and n_p of the 2 light sources of the airborne laser target on the ground solid-state camera target surface;
receiving the ordinates k_p and l_p of the 2 light sources of the airborne laser target on the ground solid-state camera target surface;
receiving the abscissas m_d and n_d of the 2 light sources of the ground laser target on the airborne solid-state camera target surface;
receiving the ordinates k_d and l_d of the 2 light sources of the ground laser target on the airborne solid-state camera target surface;
calculating, from the abscissas m_p and n_p and the ordinates k_p and l_p, the azimuth α_d-p and pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera;
calculating, from the abscissas m_d and n_d and the ordinates k_d and l_d, the azimuth α_p-d and pitch β_p-d of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera;
calculating the slant distance R between the ground solid-state camera and the airborne solid-state camera, in combination with the roll υ output by the airborne inertial navigation equipment of the tethered unmanned aerial vehicle;
and calculating the coordinates, heading and pitch of the tethered unmanned aerial vehicle from the azimuth, pitch and slant distance according to the optical geometric relationships.
5. The system according to claim 4, wherein the azimuth α_d-p and pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera are calculated from the abscissas m_p and n_p and the ordinates k_p and l_p specifically as follows:
according to the abscissas m_p and n_p, the abscissa x_0-p of the geometric center position of the airborne laser target on the ground solid-state camera target surface is calculated as:
x_0-p = (m_p + n_p) / 2
the azimuth α_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera is calculated as:
α_d-p = x_0-p · Δθ_p
wherein Δθ_p is the angular resolution of the ground solid-state camera,
Δθ_p = N_p / f_p
f_p is the focal length of the ground solid-state camera, and the single pixel size of the ground solid-state camera is N_p × N_p;
according to the ordinates k_p and l_p, the ordinate y_0-p of the geometric center position of the airborne laser target on the ground solid-state camera target surface is calculated as:
y_0-p = (k_p + l_p) / 2
and the pitch β_d-p of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera is calculated as:
β_d-p = y_0-p · Δθ_p
6. The system according to claim 5, wherein the azimuth α_p-d and pitch β_p-d of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera are calculated from the abscissas m_d and n_d and the ordinates k_d and l_d specifically as follows:
according to the abscissas m_d and n_d, the abscissa x_0-d of the geometric center position of the ground laser target on the airborne solid-state camera target surface is calculated as:
x_0-d = (m_d + n_d) / 2
the azimuth α_p-d of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera is calculated as:
α_p-d = x_0-d · Δθ_d
wherein Δθ_d is the angular resolution of the airborne solid-state camera,
Δθ_d = N_d / f_d
f_d is the focal length of the airborne solid-state camera, and the single pixel size of the airborne solid-state camera is N_d × N_d;
according to the ordinates k_d and l_d, the ordinate y_0-d of the geometric center position of the ground laser target on the airborne camera target surface is calculated as:
y_0-d = (k_d + l_d) / 2
and the pitch β_p-d of the geometric center position of the ground laser target deviating from the optical axis of the airborne solid-state camera is calculated as:
β_p-d = y_0-d · Δθ_d
7. The system of claim 6, wherein the slant distance R between the ground solid-state camera and the airborne solid-state camera is calculated in combination with the roll υ output by the airborne inertial navigation equipment of the tethered unmanned aerial vehicle, specifically as:
R = F(ρ, α_p-d, υ) / (ρ′ · Δθ_p)
wherein ρ′ is the image spacing of the 2 light sources of the airborne laser target on the ground solid-state camera target surface, and F(ρ, α_p-d, υ) represents, when ρ, α_p-d and υ are known, the projected length of the spacing ρ of the 2 light sources of the airborne laser target on a plane parallel to the ground solid-state camera target surface at the position of the tethered unmanned aerial vehicle.
8. The system according to claim 7, wherein the coordinates, heading and pitch of the tethered unmanned aerial vehicle are calculated from the azimuth, pitch and slant distance according to the optical geometric relationships, specifically as follows:
the coordinates (x_d, y_d, z_d) of the tethered unmanned aerial vehicle are:
x_d = R · cos(β_0 + β_d-p) · cos(α_0 + α_d-p′)
y_d = R · cos(β_0 + β_d-p) · sin(α_0 + α_d-p′)
z_d = R · sin(β_0 + β_d-p)
wherein α_d-p′ is the orientation of the geometric center position of the airborne laser target relative to the ground solid-state camera, α_0 is the azimuth angle of the optical axis of the ground solid-state camera, β_0 is the pitch angle of the optical axis of the ground solid-state camera, β_d-p is the pitch of the geometric center position of the airborne laser target deviating from the optical axis of the ground solid-state camera, and R is the slant distance between the ground solid-state camera and the airborne solid-state camera;
the heading α_d of the tethered unmanned aerial vehicle relative to the ground control station is:
α_d = 360° - α_p-d′
wherein α_p-d′ is the orientation of the geometric center position of the ground laser target relative to the airborne solid-state camera;
the pitch β_d of the tethered unmanned aerial vehicle is:
β_d = -(-γ_0 + β_p-d)
wherein γ_0 is the pitch of the optical axis of the airborne solid-state camera deviating from the reference line of the tethered unmanned aerial vehicle, -90° ≤ β_d, β_p-d, γ_0 ≤ 90°, and β_d, β_p-d, γ_0 are all measured from the horizontal plane, positive upward and negative downward.
9. The system of claim 1, wherein the ground optical alignment device further comprises a ground reference stage for initial calibration of the ground solid state camera optical axis pointing direction.
CN202111282905.9A 2021-11-01 2021-11-01 Mooring unmanned aerial vehicle photoelectric positioning system independent of satellite navigation and distance measuring equipment Pending CN114200397A (en)

Publications (1)

Publication Number Publication Date
CN114200397A true CN114200397A (en) 2022-03-18

Family

ID=80646594

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination