CN110989645B - Target space attitude processing method based on compound eye imaging principle - Google Patents

Target space attitude processing method based on compound eye imaging principle

Info

Publication number
CN110989645B
Authority
CN
China
Prior art keywords
image
flying
target
compound eye
video
Prior art date
Legal status
Active
Application number
CN201911211945.7A
Other languages
Chinese (zh)
Other versions
CN110989645A (en)
Inventor
骆强 (Luo Qiang)
Current Assignee
Xi'an Ouyite Technology Co ltd
Original Assignee
Xi'an Ouyite Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Ouyite Technology Co ltd
Priority to CN201911211945.7A
Publication of CN110989645A
Application granted
Publication of CN110989645B
Legal status: Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Abstract

The invention belongs to the technical field of information processing, and in particular relates to a target spatial attitude processing method based on the compound eye imaging principle. The method comprises the following steps: orthogonally capturing image information of a flying target with a bio-inspired compound eye imaging system; processing the orthogonally captured image information to obtain two groups of mutually orthogonal image sequences; extracting the pixel positions of feature points and feature lines of the flying target at different times from the two orthogonal image sequences; obtaining the coordinates of the feature points and feature lines by combining a reference scale; and calculating the speed, relative spatial position and flight direction of the flying target. The computation load is smaller, the procedure is simpler, and the results are more accurate.

Description

Target space attitude processing method based on compound eye imaging principle
Technical Field
The invention belongs to the technical field of information processing, and in particular relates to a target spatial attitude processing method based on the compound eye imaging principle.
Background
With the development and progress of human society, unmanned aerial vehicles (UAVs) are used ever more widely, in fields such as modern film shooting, disaster relief and emergency response, resource exploration, and even future express delivery and transportation. When a new model of UAV undergoes test flights, its flight trajectory is captured by high-speed cameras and analysed. However, the conventional single-lens, single-position imaging systems used in the past can no longer meet the measurement requirements of guided weapon systems.
Existing monitoring of UAV spatial attitude is all achieved by installing a monitoring device on the UAV itself. For example, Chinese invention patent application No. 2017112032120, "UAV flight attitude measuring system", discloses a system comprising a UAV and a measuring module, a control module, a data processing module and a signal transmission module mounted on the UAV. The measuring module comprises several laser ranging sensors, arranged at opposite corners of the UAV's edges and perpendicular to the ground; the sensors form an array and synchronously measure the height of each part of the UAV relative to the ground. The control module comprises a main control board and controls the flying height and flying direction of the UAV. The data processing module contains a measuring program; it processes and calculates the data obtained by the measuring module, eliminates systematic and random errors through calculation and analysis, and determines through the measuring program whether the UAV is tilted forwards, backwards, left or right and by how much, thereby obtaining the UAV's flight attitude information. The data processing module also calculates the UAV's height from the measured data and compares it with GPS and RTK (real-time kinematic) data to calibrate the height, providing data support for height adjustment. The signal transmission module transmits the flight attitude information produced by the data processing module to a mobile device for visual observation by the operator. However, detection equipment mounted on the UAV in this way is undoubtedly a considerable burden on the UAV, and not every flying unit is suited to carrying a flight attitude monitoring device on its structure. To further and markedly reduce the burden on the flying unit itself, guarantee a real-time and reliable field of view, and overcome the many adverse factors of a complex environment, an independent, ground-based spatial attitude monitoring system for the aircraft is clearly needed; no satisfactory monitoring device or method exists in the prior art.
Disclosure of Invention
The invention discloses a target spatial attitude processing method based on the compound eye imaging principle.
In order to achieve the above purpose, the specific technical solution of the invention is a target spatial attitude processing method based on the compound eye imaging principle, comprising the following steps: orthogonally capturing image information of a flying target with a bio-inspired compound eye imaging system;
processing the orthogonally captured image information to obtain two groups of mutually orthogonal image sequences;
extracting the pixel positions of feature points and feature lines of the flying target at different times from the two orthogonal image sequences;
obtaining the coordinates of the feature-point and feature-line pixels of the flying target by combining a reference scale;
and calculating the speed, relative spatial position and flight direction of the flying target.
Further, processing the captured image information specifically includes video image editing, video image enhancement, image stitching, image/video compression and image grey-level adjustment;
video image editing means clipping the valuable parts of the images and video, so that the amount of data to be processed in the subsequent steps is reduced;
video image enhancement means adjusting the grey-level distribution histogram of the image so that the histogram becomes more uniform and the definition of the image increases;
image stitching means stitching together the images of the individual fields of view within the multiple field-of-view ranges, so as to enlarge the overall field of view of the image;
image/video compression means reducing the redundancy of the images and video, so that their data volume is reduced and subsequent processing is faster;
image grey-level adjustment means adjusting the brightness of the image according to the intensity of the light imaged from the object, overcoming the uneven image brightness caused by changing illumination of the flying target during flight, and thereby improving the accuracy of the spatial flying-target parameter calculation.
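As a concrete illustration of these five pre-processing steps, the following minimal sketch assumes a Python/OpenCV implementation; the specific operators chosen (CLAHE for histogram adjustment, OpenCV's Stitcher for field-of-view stitching, JPEG encoding for compression, gamma correction for grey-level adjustment) are plausible stand-ins and are not prescribed by the patent.

```python
# Minimal sketch of the five pre-processing steps, assuming OpenCV and NumPy.
# The concrete operators are illustrative choices, not mandated by the patent.
import cv2
import numpy as np

def edit_clip(frames, start, end):
    """Keep only the valuable part of the video to reduce later processing load."""
    return frames[start:end]

def enhance(gray):
    """Even out the grey-level histogram to raise image definition."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)

def stitch(views):
    """Stitch the sub-field images of the compound-eye system into one large field of view."""
    stitcher = cv2.Stitcher_create()
    status, panorama = stitcher.stitch(views)
    return panorama if status == cv2.Stitcher_OK else None

def compress(image, quality=80):
    """Reduce redundancy (here via JPEG encoding) to shrink the data volume."""
    ok, buf = cv2.imencode(".jpg", image, [int(cv2.IMWRITE_JPEG_QUALITY), quality])
    return buf if ok else None

def adjust_gray(gray, gamma=0.8):
    """Gamma correction against uneven brightness caused by changing illumination."""
    table = np.array([(i / 255.0) ** gamma * 255 for i in range(256)], dtype=np.uint8)
    return cv2.LUT(gray, table)
```

Each operator could be replaced by any equivalent technique that achieves the stated effect.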
Further, the speed calculation for the flying target is specifically as follows: in a three-dimensional coordinate system Oxyz, the vector OC is the velocity vector of the UAV target, and OB and OD are the projections of OC onto the coordinate planes Oxz and Oyz respectively;
during orthogonal imaging, OB and OD are the velocity vectors seen by the two imaging systems in the two orthogonal directions; the angle ∠xOB is θ1, the angle ∠yOD is θ2, and the spatial angle between the velocity vector and the horizontal plane Oxy is θ; the relation between θ, θ1 and θ2 is
tanθ = (tanθ1 · tanθ2) / √(tan²θ1 + tan²θ2)
The ratio of one image pixel to the actual distance (the scale) is calculated for each of the two orthogonal imaging systems of the bio-inspired compound eye orthogonal imaging system; the actual displacements S_x, S_y and S_z of the flying target in the x, y and z directions are then obtained from this scale and, together with the time interval t between two images formed by the two orthogonal imaging systems, give
v_x = S_x / t
v_y = S_y / t
v_z = S_z / t
the actual flying speed of the flying target is
v = √(v_x² + v_y² + v_z²)
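For illustration, the calculation above can be exercised with the short sketch below (Python); the relation for θ is the one reconstructed from the projection geometry described above, and the numeric inputs are arbitrary example values rather than measured data.

```python
# Minimal sketch of the speed calculation, assuming the reconstructed relation
# tanθ = tanθ1·tanθ2 / sqrt(tan²θ1 + tan²θ2).  All numeric values are made-up examples.
import math

def flight_angle(theta1, theta2):
    """Spatial angle θ between the velocity vector OC and the horizontal plane Oxy (radians)."""
    t1, t2 = math.tan(theta1), math.tan(theta2)
    return math.atan(t1 * t2 / math.hypot(t1, t2))

def flight_speed(s_x, s_y, s_z, t):
    """Actual flight speed from the displacements S_x, S_y, S_z and the frame interval t."""
    v_x, v_y, v_z = s_x / t, s_y / t, s_z / t
    return math.sqrt(v_x ** 2 + v_y ** 2 + v_z ** 2)

theta = flight_angle(math.radians(30.0), math.radians(40.0))  # example angles from the two views
speed = flight_speed(s_x=1.2, s_y=0.8, s_z=0.5, t=0.04)       # displacements in metres, t in seconds
print(f"theta = {math.degrees(theta):.1f} deg, speed = {speed:.1f} m/s")
```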
The beneficial effects are as follows. By using the orthogonal compound-eye-like imaging cameras, orthogonally captured image information with a larger field of view, higher definition and longer range is obtained; from this information the pixel positions of the feature points and feature lines of the flying target at different times are extracted, the coordinates of those feature points and feature lines are obtained via the reference scale, and the speed, relative spatial position and flight direction of the flying target are calculated. The influence on the body of the flying target is small, the monitoring and processing results are more accurate, the procedure is simpler, and the method adapts better to a wide variety of flying targets. Thanks to the design of the image processing method, the volume of data transmitted and processed later is smaller, the images are cleaner, the field of view is larger, and later image processing is faster; the problem of uneven image brightness caused by illumination of the flying target is effectively overcome, so that the calculation and monitoring of the flying target's parameters are more accurate. Because the flight attitude of the flying target is calculated by the orthographic projection method, the computation load is smaller, the procedure is simpler, and the results are more accurate.
Drawings
Fig. 1 is a block diagram of a target spatial pose processing system based on compound eye imaging principles of the present invention.
Fig. 2 is a flow chart of the image acquisition and processing method.
fig. 3 is a flow chart of a target spatial pose processing method based on the compound eye imaging principle of the invention.
Fig. 4 is a three-dimensional graph of a calculation of the flying speed of a flying target.
Reference numerals: 101. image sequence 1; 102. image sequence 2; 103. acquisition of feature points and feature lines of the flying target; 104. reference scale; 105. position information of the target feature points and feature lines; 106. speed; 107. spatial position; 108. flight direction.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1 to fig. 4, the target spatial attitude processing system based on the compound eye imaging principle includes two orthogonally arranged bio-inspired compound eye imaging systems and a data processing system. The two compound eye imaging systems are placed outside the monitored area, orthogonal to each other; each is composed of several high-speed cameras combined according to the compound-eye principle of insects, and therefore has high-speed imaging capability with a large field of view. The image information of the flying target in the monitored spatial region is thus acquired through the bio-inspired compound eye imaging systems.
Each compound eye imaging system is also provided with a control transmitting end and a transmitting antenna. The data processing system comprises a receiving antenna, a remote-control and data receiving end, a data acquisition and processing module, and a data analysis and processing module; the receiving antenna is connected to the remote-control and data receiving end, which is connected to the data analysis and processing module through the data acquisition and processing module. The compound eye imaging system communicates with the receiving antenna of the data processing system through its transmitting antenna. Because the control transmitting end and transmitting antenna are mounted on the compound eye imaging system, it is wirelessly connected to the data processing system; an adjusting platform can also be fixed at the bottom of the compound eye imaging system to adjust the pitch angle and shooting direction of the lenses, the focal length and focus of the high-speed cameras, and the triggering of image acquisition. The communication distance between the compound eye imaging system and the data processing system is required to exceed 10 km.
The orthogonally arranged compound eye imaging systems capture orthogonal image information of the flying target and send the acquired information to the data processing system through the control transmitting end and transmitting antenna. The data processing system receives the orthogonal image information, processes it, calculates the flying speed, spatial position and flight direction of the flying target, and thereby determines the flight attitude of the flying target.
The data acquisition and processing module comprises a memory module, a video image editing module, an image enhancement module, an image stitching module, an image/video compression module and an image grey-level adjustment module. One end of the memory module is connected to the remote-control and data receiving end and stores the received image information acquired by the compound eye imaging systems; the other end is connected to the video image editing module, the image enhancement module, the image stitching module, the image/video compression module and the image grey-level adjustment module. The video image editing module clips the valuable parts of the images and video, reducing the amount of data to be processed subsequently; the image enhancement module adjusts the grey-level distribution histogram of the image so that it becomes more uniform and the image definition increases; the image stitching module stitches together the images of the individual fields of view within the multiple field-of-view ranges to enlarge the overall field of view; the image/video compression module reduces the redundancy of the images and video, so that their data volume shrinks and subsequent processing speeds up; and the image grey-level adjustment module adjusts the brightness of the image according to the intensity of the light imaged from the object, overcoming the uneven image brightness caused by changing illumination of the flying target during flight and improving the accuracy of the spatial flying-target parameter calculation.
Finally, the data acquisition and processing module sends the processed image information to the data analysis and processing module connected to it; through the analysis and calculation of the data analysis and processing module, the flight speed 106, spatial position 107 and flight direction 108 of the flying target are obtained, and the flight attitude of the flying target is determined.
As shown in fig. 3, the target spatial attitude processing method based on the compound eye imaging principle specifically comprises the following steps. Image information of the flying target is first captured orthogonally by the bio-inspired compound eye imaging systems; the acquired image information is then transmitted remotely, via the control transmitting end and transmitting antenna, to the receiving antenna and the remote-control and data receiving end of the data processing system, and stored by the memory module of the data acquisition and processing module. The video image editing module then clips the valuable parts of the images and video, reducing the amount of data to be processed subsequently; the image enhancement module adjusts the grey-level distribution histogram of the image so that it becomes more uniform and the image definition increases; the image stitching module stitches together the images of the individual fields of view within the multiple field-of-view ranges to enlarge the overall field of view; the image/video compression module reduces the redundancy of the images and video, shrinking their data volume and speeding up subsequent processing; and the image grey-level adjustment module adjusts the brightness of the image according to the intensity of the light imaged from the object, overcoming the uneven image brightness caused by illumination during the target's flight and improving the accuracy of the spatial flying-target parameter calculation. In this way the flying-target image information captured by the two orthogonally arranged compound eye imaging systems is processed by the data acquisition and processing module into two groups of mutually orthogonal image sequences, denoted image sequence 1 and image sequence 2, and the processed flying-target image information is finally transmitted to the data analysis and processing module.
The data analysis and processing module analyses the mutually orthogonal image sequence 1 and image sequence 2 to obtain the pixel positions of the feature points and feature lines 103 of the flying target at different times; combined with the preset reference scale 104, the pixel positions of the feature points and feature lines are then converted, according to the proportional relation between the image and the scale, into the coordinates of the feature points and feature lines of the flying target.
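The patent does not name a particular feature detector or calibration procedure; purely as one plausible reading, the sketch below extracts feature-point pixels from a pre-processed frame with a standard OpenCV corner detector and converts them to in-plane coordinates with a linear pixel-to-metre scale derived from the reference scale 104.

```python
# Illustrative feature extraction and pixel-to-coordinate conversion for one frame
# of one orthogonal view.  The detector choice (Shi-Tomasi corners) and the linear
# scale model are assumptions, not steps prescribed by the patent.
import cv2
import numpy as np

def feature_pixels(gray):
    """Pixel positions of feature points in a pre-processed grey frame."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.01, minDistance=10)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2))

def pixels_to_coords(pixels, metres_per_pixel, origin_px=(0.0, 0.0)):
    """Convert pixel positions to in-plane coordinates using the reference-scale ratio."""
    return (np.asarray(pixels, dtype=float) - np.asarray(origin_px)) * metres_per_pixel

# Example: a reference object of 2.0 m spans 400 px in the image -> 0.005 m per pixel.
scale = 2.0 / 400.0
# coords_xz = pixels_to_coords(feature_pixels(frame_xz), scale)   # frame_xz: one frame of sequence 1
```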
Once the coordinates of the feature-point and feature-line pixels of the flying target are known, the speed 106 of the flying target is calculated from them. As shown in fig. 4, in the three-dimensional coordinate system Oxyz, the vector OC is the velocity vector of the UAV target, and OB and OD are the projections of OC onto the coordinate planes Oxz and Oyz respectively;
during orthogonal imaging, OB and OD are the velocity vectors seen by the two imaging systems in the two orthogonal directions; the angle ∠xOB is θ1, the angle ∠yOD is θ2, and the spatial angle between the velocity vector and the horizontal plane Oxy is θ; the relation between θ, θ1 and θ2 is
tanθ = (tanθ1 · tanθ2) / √(tan²θ1 + tan²θ2)
The spatial angle between the velocity vector and the horizontal plane Oxy is thus calculated from this relation and denoted θ. The ratio of one image pixel to the actual distance is then calculated, according to the spatial angle θ, for the flying-target image information acquired by the two orthogonal imaging systems of the bio-inspired compound eye orthogonal imaging system; from this scale the actual displacements S_x, S_y and S_z of the flying target in the x, y and z directions are obtained, which, together with the time interval t between two images formed by the two orthogonal imaging systems, give
v_x = S_x / t
v_y = S_y / t
v_z = S_z / t
the actual flying speed of the flying target is
v = √(v_x² + v_y² + v_z²)
where v_x is the velocity in the x direction, v_y the velocity in the y direction, and v_z the velocity in the z direction.
The spatial position and the flight direction can then be calculated from the relation between the velocity, the time and the spatial angle θ.
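The description leaves this last step at a general level; one straightforward reading, sketched below, takes the flight direction as the unit vector of the velocity and propagates the relative spatial position from a known starting point over the frame interval. Both choices are assumptions, not steps stated in the patent.

```python
# Sketch of deriving flight direction and relative spatial position from the
# per-axis velocities.  Treating direction as the unit velocity vector and the
# position as the previous position plus v*t is an interpretation, not the
# patent's stated procedure.
import numpy as np

def direction_and_position(v_xyz, p_prev, t):
    """v_xyz: (v_x, v_y, v_z) in m/s; p_prev: previous position in m; t: frame interval in s."""
    v = np.asarray(v_xyz, dtype=float)
    speed = np.linalg.norm(v)
    direction = v / speed if speed > 0 else np.zeros(3)   # unit flight-direction vector
    theta = np.arctan2(v[2], np.hypot(v[0], v[1]))        # angle to the horizontal plane Oxy
    p_next = np.asarray(p_prev, dtype=float) + v * t      # relative spatial position
    return direction, theta, p_next

direction, theta, position = direction_and_position((30.0, 20.0, 5.0), (0.0, 0.0, 100.0), 0.04)
print(direction, np.degrees(theta), position)
```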
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (2)

1. A target spatial attitude processing method based on the compound eye imaging principle, characterized by comprising the following steps: orthogonally capturing image information of a flying target with a bio-inspired compound eye imaging system;
processing the orthogonally captured image information to obtain two groups of mutually orthogonal image sequences;
extracting the pixel positions of feature points and feature lines of the flying target at different times from the two orthogonal image sequences;
obtaining the coordinates of the feature-point and feature-line pixels of the flying target by combining a reference scale;
once the coordinates of the feature-point and feature-line pixels of the flying target are known, calculating the speed of the flying target from these coordinates, wherein, in a three-dimensional coordinate system Oxyz, the vector OC is the velocity vector of the UAV target, and OB and OD are the projections of OC onto the coordinate planes Oxz and Oyz respectively;
during orthogonal imaging, OB and OD are the velocity vectors seen by the two imaging systems in the two orthogonal directions; the angle ∠xOB is θ1, the angle ∠yOD is θ2, and the spatial angle between the velocity vector and the horizontal plane Oxy is θ; the relation between θ, θ1 and θ2 is
tanθ = (tanθ1 · tanθ2) / √(tan²θ1 + tan²θ2)
calculating the spatial angle between the velocity vector and the horizontal plane Oxy from this relation and denoting it θ; then calculating, according to the spatial angle θ, the ratio of one image pixel to the actual distance for the flying-target image information acquired by the two orthogonal imaging systems of the bio-inspired compound eye orthogonal imaging system; then obtaining from this scale the actual displacements S_x, S_y and S_z of the flying target in the x, y and z directions, which, together with the time interval t between two images formed by the two orthogonal imaging systems, give
v_x = S_x / t
v_y = S_y / t
v_z = S_z / t
the actual flying speed of the flying target is
v = √(v_x² + v_y² + v_z²)
where v_x is the velocity in the x direction, v_y the velocity in the y direction, and v_z the velocity in the z direction;
and then calculating the spatial position and the flight direction from the relation between the velocity, the time and the spatial angle θ.
2. The target spatial attitude processing method based on the compound eye imaging principle according to claim 1, characterized in that:
the processing of the captured image information comprises video image editing, video image enhancement, image stitching, image/video compression and image grey-level adjustment;
video image editing means clipping the valuable parts of the images and video, so that the amount of data to be processed in the subsequent steps is reduced;
video image enhancement means adjusting the grey-level distribution histogram of the image so that the histogram becomes more uniform and the definition of the image increases;
image stitching means stitching together the images of the individual fields of view within the multiple field-of-view ranges, so as to enlarge the overall field of view of the image;
image/video compression means reducing the redundancy of the images and video, so that their data volume is reduced and subsequent processing is faster;
image grey-level adjustment means adjusting the brightness of the image according to the intensity of the light imaged from the object, overcoming the uneven image brightness caused by changing illumination of the flying target during flight, and thereby improving the accuracy of the spatial flying-target parameter calculation.
CN201911211945.7A 2019-12-02 2019-12-02 Target space attitude processing method based on compound eye imaging principle Active CN110989645B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911211945.7A CN110989645B (en) 2019-12-02 2019-12-02 Target space attitude processing method based on compound eye imaging principle

Publications (2)

Publication Number Publication Date
CN110989645A CN110989645A (en) 2020-04-10
CN110989645B (en) 2023-05-12

Family

ID=70089072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911211945.7A Active CN110989645B (en) 2019-12-02 2019-12-02 Target space attitude processing method based on compound eye imaging principle

Country Status (1)

Country Link
CN (1) CN110989645B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111563437B (en) * 2020-04-28 2023-04-07 德施曼机电(中国)有限公司 Iris detection method of door lock device
CN113188776B (en) * 2021-04-27 2022-11-11 哈尔滨工业大学 Compound eye imaging contact ratio detection system and detection method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011203057A (en) * 2010-03-25 2011-10-13 The Tokyo Electric Power Co Inc Distance measuring instrument for flying object and flying object position measuring instrument
CN105758397A (en) * 2016-02-14 2016-07-13 中国船舶工业系统工程研究院 Flying vehicle image pickup positioning method
CN106408650A (en) * 2016-08-26 2017-02-15 中国人民解放军国防科学技术大学 3D reconstruction and measurement method for spatial object via in-orbit hedgehopping imaging
CN106643664A (en) * 2016-12-28 2017-05-10 湖南省道通科技有限公司 Method and device for positioning unmanned aerial vehicle
CN107422743A (en) * 2015-09-12 2017-12-01 深圳九星智能航空科技有限公司 The unmanned plane alignment system of view-based access control model
CN109521781A (en) * 2018-10-30 2019-03-26 普宙飞行器科技(深圳)有限公司 Unmanned plane positioning system, unmanned plane and unmanned plane localization method

Also Published As

Publication number Publication date
CN110989645A (en) 2020-04-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant