CN110686664A - Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle - Google Patents

Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle Download PDF

Info

Publication number
CN110686664A
CN110686664A
Authority
CN
China
Prior art keywords
module
data
series
drone
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810725096.6A
Other languages
Chinese (zh)
Inventor
田瑜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Autoflight Co Ltd
Original Assignee
Shanghai Autoflight Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Autoflight Co Ltd
Priority to CN201810725096.6A
Publication of CN110686664A
Pending legal-status Current

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Abstract

The invention discloses a visual positioning system, an unmanned aerial vehicle (drone) and a method for self-detecting the position of the drone. The drone includes a body, a lens module, a matching module and a translation module. The body has a lower side. The lens module is disposed on the lower side of the drone. The lens module has a wide field of view and captures a series of images of the area under the drone over time during flight. The matching module compares and contrasts features in each image in the series of images to derive first data. The translation module translates the first data into positioning data. The visual positioning system is used in the drone. The invention may use a visual positioning system in the drone to allow the drone to have additional/alternative positioning information.

Description

Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle
Technical Field
The invention relates to a visual positioning system, an unmanned aerial vehicle and a method for self-detecting the position of the unmanned aerial vehicle.
Background
Unmanned Aerial Vehicles (UAVs) are remotely piloted or autonomous aircraft carrying cameras, sensors, communication devices, or other payloads. With the rapid development of society and industry, drone aerial photography has been applied in more and more fields, such as film and television shooting, fire patrol, and traffic monitoring.
However, there is still a need for new methods to improve the ability of drones to detect their own position. Many conventional drones use Global Positioning System (GPS) location information to determine flight routes and how to maneuver around buildings and other objects in the sky. However, GPS location information may be inaccurate due to factors such as rain, wind, distortion or loss of the GPS signal between high-rise buildings, or flying indoors where GPS reception is limited.
Disclosure of Invention
The technical problem to be solved by this disclosure is to provide a positioning system beyond GPS for a drone, so as to overcome the shortcoming that GPS positioning information may be inaccurate under many different circumstances.
The present disclosure solves the above technical problems by the following technical solutions.
In one or more embodiments according to the present disclosure, a visual positioning system for a drone is provided. The system comprises a lens module, a matching module, a sensor module and a translation module. The lens module captures a series of fisheye view images over time during flight. The matching module derives first data by comparing and contrasting at least one feature from the fisheye view images. The sensor module collects second data and is electrically coupled to the lens module. The translation module is electrically coupled to the sensor module to generate the positioning data by processing the first data in view of the second data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
In one or more embodiments according to the present disclosure, the lens module includes a fisheye lens having a field of view of at least 180 degrees.
In one or more embodiments according to the present disclosure, the second data comprises one or more of altitude data, a GPS map, GPS coordinates, and a position relative to at least one surrounding physical object. In one or more embodiments according to the present disclosure, the sensor module includes one or more of a LiDAR and a GPS receiver.
In one or more embodiments according to the present disclosure, an Unmanned Aerial Vehicle (UAV) is provided. The drone includes a body, a fisheye lens module, a matching module and a translation module. The body has a lower side. The fisheye lens module is disposed on the lower side of the drone. The fisheye lens module has a wide field of view and, over time, takes a series of images of the area under the drone during flight. The matching module compares and contrasts features in each image of the series of images to derive first data. The translation module translates the first data into positioning data.
In one or more embodiments according to the present disclosure, the drone further includes a verification module that compares the positioning data to a GPS map.
In one or more embodiments of the invention, the drone further comprises a gimbal connecting the fisheye lens module to the body, the gimbal having at most 2 axes.
In one or more embodiments according to the present disclosure, the series of images taken over time are taken at a rate of at least 60 images per second.
In one or more embodiments of the invention, the drone further comprises a sensor module for collecting second data, and the translation module is for considering the second data to derive the positioning data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
In one or more embodiments of the invention, a method is provided for self-detecting the position of a drone itself when flying in areas with insufficient GPS signals. The method comprises the following steps: obtaining initial positioning data comprising altitude data; taking a series of fisheye view images during flight at a rate of over 60 images per second; comparing relative changes between the series of fisheye view images; and calculating the position of the unmanned aerial vehicle according to the initial positioning data and the relative change.
In one or more embodiments according to the present disclosure, the method further includes using data from the IMU.
In one or more embodiments according to the present disclosure, the method further comprises comparing the relative change to a GPS map.
According to the present disclosure, a visual positioning system may be used in a drone to allow the drone to have additional/alternative positioning information. Compared with a traditional drone that uses only GPS positioning information, the drone provided by the invention adopts a visual positioning system as a supplement to the existing positioning system, so that more accurate positioning information can be obtained.
Drawings
The various aspects of the invention are best understood from the following detailed description when read with the accompanying drawing figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
Fig. 1 is a side view of a UAV according to some embodiments of the present disclosure.
Fig. 2 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 3 is a side view of a lens module according to some embodiments of the present disclosure.
Fig. 4 illustrates an expanded two-dimensional view of various points captured by a lens module according to some embodiments of the present disclosure.
Fig. 5 illustrates four images taken by a lens module representing four views of the ground over time, according to some embodiments of the present disclosure.
Fig. 6 is a perspective view of a lens module on a 2-axis gimbal according to some embodiments of the present disclosure.
Fig. 7 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 8 is a block diagram of a UAV according to some embodiments of the present disclosure.
Fig. 9 illustrates a method of self-detecting the position of a drone itself, in accordance with some embodiments of the present disclosure.
Detailed Description
Various embodiments may now be better understood by turning to the following description. These embodiments are shown in the illustrated examples.
Many variations and modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the embodiments. Accordingly, it must be understood that the illustrated embodiments have been set forth only for the purposes of example, and that they should not be taken as limiting.
The words used in this specification to describe embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification structure, material or acts beyond the scope of the commonly defined meanings.
The definitions of the words or elements of the following claims, therefore, include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements disclosed or that a single element may be substituted for two or more elements. Although elements may be described herein as acting in certain combinations, it is to be expressly understood that one or more elements from a disclosed combination can in some cases be excised from the combination, and that the combination may be directed to a subcombination or variation of a subcombination.
Further, terms such as "below," "lower," "above," "upper," "left," and "right" may be used herein to facilitate describing one element or feature's relationship to another element or feature(s), as illustrated. Spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present.
The invention discloses a new way to improve the integrity and effectiveness of drone position detection by using a positioning system. The positioning system may be a visual positioning system. According to some embodiments, the visual positioning system may generate a visual position, e.g., a position of the drone relative to a reference point or block of points. The visual position may be different from a Global Positioning System (GPS) defined position. According to some embodiments of the present disclosure, there is also provided a new method of self-detecting the relative position of a drone with respect to a reference point or block of points when flying through an area where GPS signals are insufficient.
In one or more embodiments in accordance with the present disclosure, a visual positioning system may be used in an Unmanned Aerial Vehicle (UAV) to obtain additional/alternative positioning information. When GPS signals are poor, additional/alternative positioning information may be used to locate the UAV. In one or more embodiments according to the present disclosure, a visual positioning system may take images (photos and/or videos) of the ground during flight and process the images (photos and/or videos) in real-time to obtain positioning information. According to some embodiments, the visual positioning system may select at least one reference object in the image, and analyze the image to calculate positioning information for the UAV based on the reference object.
In one or more embodiments of the present disclosure, the visual positioning system may be used as a supplement over existing positioning systems, and thus the drone may acquire more accurate positioning information, as compared to a conventional drone that uses only GPS positioning information. Furthermore, a drone having a visual positioning system according to one or more embodiments of the present disclosure can obtain accurate positioning information even when flying in places where GPS signals are limited or poorly received, such as when flying between high-rise buildings and structures.
Referring to fig. 1 and 2, fig. 1 is a side view of an Unmanned Aerial Vehicle (UAV) 100 according to some embodiments of the present disclosure. Fig. 2 is a block diagram of UAV 100 in accordance with some embodiments of the present disclosure. In one or more embodiments, UAV 100 (or drone) is a multi-rotor aircraft. UAV 100 may be a tri-rotor, quad-rotor, six-rotor, eight-rotor, or other multi-rotor aircraft. In one or more embodiments, UAV 100 includes a body 101, a lens module 102, a matching module 103, and a translation module 104. In one or more embodiments, the lens module 102, in conjunction with the matching module 103 and the translation module 104, may be a visual positioning system. In one or more embodiments, the matching module 103 and the translation module 104 can be disposed inside the body 101. In one or more embodiments, the matching module 103 and the translation module 104 may be disposed outside of the body 101. For example, the matching module 103 and the translation module 104 may be provided together with the lens module 102. Alternatively, the matching module 103 may be provided together with the lens module 102, and the translation module 104 may be provided within the body 101, or vice versa.
In one or more embodiments, the body 101 may have different shapes and structures according to the design or requirements of the user. The body 101 has a lower side 1011. In one or more embodiments, underside 1011 is the bottom of UAV100 that faces the ground during flight of UAV 100.
In one or more embodiments, the lens module 102 is disposed on the underside 1011 of UAV 100 and is mounted to a bottom surface of UAV 100. In one or more embodiments, the lens module 102 may include a lens unit 1021 and an image sensor unit 1022 connected to each other. The image sensor unit 1022 is configured to sense an incident light signal from the lens unit 1021 to generate a series of fisheye view images. In one or more embodiments, the lens unit 1021 includes a wide-angle lens, an ultra-wide-angle lens, or a combination thereof. The wide-angle lens includes at least one lens having a field of view of at most 180 degrees. The ultra-wide-angle lens includes at least one lens having a field of view of at least 180 degrees. In one or more embodiments, the ultra-wide-angle lens may be a fisheye lens. In one or more embodiments, the lens module 102 may be considered a fisheye lens module. In one or more embodiments, the image sensor unit 1022 may include a CCD (charge coupled device) image sensor, a CMOS (complementary metal oxide semiconductor) image sensor, or a combination thereof.
In one or more embodiments, the lens module 102 can be controlled to capture or take a series of images (photographs and/or videos) I of an area (e.g., a ground area) beneath the UAV 100 over time during flight. Each image in the series of images I may comprise a reference point or block of points on the ground area. For example, the reference point block may be a particular building or tree on the ground area. In one or more embodiments, the series of images I may be a series of fisheye view images. In one or more embodiments, the series of images (photos and/or videos) I is taken at a rate of at least 60 images per second. In one or more embodiments, the lens module 102 may begin taking images (photos and/or videos) I when the initial positioning information (e.g., GPS signals) received by the UAV is limited, for example, when the signal strength of the initial positioning information is below a predetermined factor. It is noted that the lens module 102 may or may not take images (photos and/or videos) I from the beginning of the flight. In one or more embodiments, the lens module 102 may take images (photos and/or videos) I during a predetermined period of time in flight. In one or more embodiments, the lens module 102 may take an image (photo and/or video) I when certain predetermined factors are met, such as reaching a predetermined height. Of course, these are merely examples and are not intended to be limiting.
Referring to fig. 1 and 3, fig. 3 is a side view of a lens module 102 according to some embodiments of the present disclosure. In one or more embodiments, the lens module 102 may include an ultra-wide-angle lens, such as a fisheye lens. The fisheye lens may have a field of view of 220 degrees (as covered by points A-E-B along the circumference in fig. 3). For example, a fisheye lens may capture two distant objects O1 and O2 in fig. 3. In other words, the lens module 102 may capture a single image I having a field of view of 220 degrees. Referring to fig. 4, fig. 4 illustrates an expanded two-dimensional view 400 of various points captured by the lens module 102, according to some embodiments of the present disclosure. As shown in fig. 4, because of such a wide field of view, the two-dimensional illustration of an image from the lens module 102 having a field of view of 220 degrees includes points A and B (shown in fig. 3). In other words, the lens module 102 may capture an image I having a wider field of view through an ultra-wide-angle lens such as a fisheye lens.
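For illustration only, such a wide field of view can be modelled with an equidistant fisheye projection (r = f * theta); the disclosure does not specify a particular lens model, and the focal length used below is a made-up value. A short sketch in Python:

```python
import numpy as np

def fisheye_project(direction, f_px=300.0):
    # Project a unit direction vector (camera frame, +Z along the optical
    # axis pointing at the ground) onto pixel offsets from the image center,
    # using the equidistant fisheye model r = f * theta (an assumption).
    x, y, z = direction / np.linalg.norm(direction)
    theta = np.arccos(np.clip(z, -1.0, 1.0))  # angle from the optical axis
    phi = np.arctan2(y, x)                    # azimuth around the axis
    r = f_px * theta                          # equidistant mapping
    return r * np.cos(phi), r * np.sin(phi)

# A ray 110 degrees off-axis (the edge of a 220-degree field of view, near
# points A and B of fig. 3) still lands at a finite radius, so distant
# objects such as O1 and O2 remain visible near the rim of the image.
edge_ray = np.array([np.sin(np.radians(110.0)), 0.0, np.cos(np.radians(110.0))])
print(fisheye_project(edge_ray))
```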
Referring again to fig. 3, optionally, the lens module 102 may also take a single image I having a field of view of 180 degrees (e.g., covering points C through E to D along the circumference in fig. 3). In other words, the lens module 102 may include a wide-angle lens as needed. Of course, these are merely examples and are not intended to be limiting. In one or more embodiments, various fields of view other than 220 degrees and 180 degrees are also contemplated.
On the other hand, referring again to fig. 1, UAV 100 may tilt during flight, causing the lens module 102 to also tilt. For example, when UAV 100 moves forward, the body 101 naturally tilts forward and the lens module 102 tilts with it. Accordingly, in one or more embodiments, the lens module 102 has a wider field of view. When the lens module 102 has a wider field of view, the captured images I may contain sufficient ground information to calculate additional/alternative positioning information for UAV 100 when GPS signals are poor. For example, when the lens module 102 has a wider field of view, the reference point block on the ground area is always included in the series of images I, where the reference point block is used to calculate additional/alternative positioning information for UAV 100.
Referring again to fig. 2, in one or more embodiments, the matching module 103 is electrically coupled with the lens module 102. In one or more embodiments, the matching module 103 may be incorporated into a processor, such as a Central Processing Unit (CPU), Digital Signal Processor (DSP), programmable controller, Application Specific Integrated Circuit (ASIC), Programmable Logic Device (PLD), or other similar device, or a combination of such devices. In one or more embodiments, the matching module 103 compares and contrasts features in each of the series of images (photos and/or videos) I to generate or derive the first data D1. In other words, the matching module 103 may identify features in each image I and compare the series of images I to distinguish differences between the series of images I. For example, the matching module 103 may obtain a flight distance or movement vector of UAV 100 based on the location of a predetermined reference point in the series of images I. In one or more embodiments, the first data D1 may include image data, vector data, raster data or other forms of data, or a combination thereof.
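The disclosure does not prescribe a particular feature-comparison algorithm. As an illustrative assumption, the sketch below derives a simple form of first data (a mean pixel shift between two consecutive images) using OpenCV's ORB features and brute-force matching:

```python
import cv2
import numpy as np

def derive_first_data(img_prev, img_curr, max_matches=100):
    # Detect and describe features in both images, match them, and return
    # the mean pixel displacement between the two frames as "first data".
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return None  # not enough texture to compare the two images
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    shifts = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt)
              for m in matches[:max_matches]]
    return np.mean(shifts, axis=0)  # (dx, dy) in pixels between the frames
```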
Referring again to fig. 2, in one or more embodiments, the translation module 104 is electrically coupled with the matching module 103. In one or more embodiments, the translation module 104 may be incorporated into a processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a programmable controller, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), or other similar device, or a combination of such devices. In one or more embodiments, the translation module 104 may be incorporated in the same processor as the matching module 103. Alternatively, the translation module 104 may be incorporated in a different processor than the matching module 103.
In one or more embodiments, the translation module 104 translates the first data D1 into positioning data. In one or more embodiments, translation module 104 may receive first data D1 from matching module 103, and translation module 104 may calculate a relative change (e.g., direction/altitude/speed) of UAV100 based on first data D1 to generate positioning data. In one or more embodiments, the calculation may take into account many other known factors, such as time of flight, the initial position of the UAV, or other useful factors.
Referring to fig. 2 and 5, fig. 5 illustrates four images 501-504 taken by the lens module 102 representing, for example, four views of the ground at different times t1-t4, according to some embodiments of the present disclosure. In one or more embodiments, the lens module 102 may take at least 60 images per second. Referring to fig. 5, UAV 100 has flown/moved from position P1 to position P4. The lens module 102 (e.g., the image sensor unit 1022) is configured to capture the respective images 501-504 in the order of times t1, t2, t3, and t4, respectively. It can be seen that the position of the reference point P in the images 501-504 changes according to the positions P1-P4. By comparing the relative change in position/location of the same reference point P in the series of images 501-504, UAV 100 can calculate the relative change in direction/altitude/speed of the drone itself. In one or more embodiments, the matching module 103 can identify a reference point P in the series of images, compare the relative position/location of the same reference point P across the images, and determine the relative change in that position/location. In one or more embodiments, the matching module 103 may generate the first data D1 based on the change in the reference point P.
In one or more embodiments, the translation module 104 may receive the first data D1 and calculate a relative change (e.g., direction/altitude/speed) of UAV 100 based on the first data D1 to generate relative positioning data for UAV 100. In one or more embodiments, UAV 100 may also include an antenna module (not shown) to transmit the positioning data back to a remote control (not shown) of UAV 100. In fig. 5, the reference point (or block of points) P is represented by a small dot. Of course, these are merely examples and are not intended to be limiting. In operation, the reference point P can be any stationary object (e.g., a tree or building) shown in images 501-504.
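To give a concrete sense of the translation step, the sketch below converts the pixel shift of a ground reference point into an approximate ground displacement and speed; it assumes a nadir-pointing lens, a locally flat ground area, a small-angle (near-center) approximation and an illustrative focal length, none of which are fixed by the disclosure:

```python
import numpy as np

def translate_to_motion(pixel_shift, altitude_m, dt_s, f_px=300.0):
    # Near the image center, one pixel spans roughly altitude / focal-length
    # metres on the ground; the drone moves opposite to the image shift.
    dx_px, dy_px = pixel_shift
    metres_per_px = altitude_m / f_px
    dx_m = -dx_px * metres_per_px
    dy_m = -dy_px * metres_per_px
    speed_mps = np.hypot(dx_m, dy_m) / dt_s
    heading_deg = np.degrees(np.arctan2(dy_m, dx_m))
    return (dx_m, dy_m), speed_mps, heading_deg

# Example: a 12-pixel shift between frames taken 1/60 s apart at 30 m altitude.
print(translate_to_motion((12.0, 0.0), 30.0, 1.0 / 60.0))
```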
In one or more embodiments according to the present disclosure, UAV100 with a visual positioning system may acquire additional/alternative positioning information in addition to traditional positioning information, e.g., GPS signals. In one or more embodiments in accordance with the present disclosure, lens module 102 may take images (photos and/or videos) I of the ground area during flight, and matching module 103 and translation module 104 may process images (photos and/or videos) I in real-time to obtain additional/alternative positioning information for UAV 100.
In one or more embodiments of the invention, UAV 100, using a visual positioning system as a supplement to existing positioning systems, may have more accurate positioning information than a traditional drone that uses only GPS positioning information. Furthermore, UAV 100 in accordance with one or more embodiments of the present disclosure having a visual positioning system can obtain accurate positioning information even when UAV 100 is flying around locations where reception of GPS signals is limited or poor, such as when flying between high-rise buildings and structures.
Referring to fig. 6, fig. 6 is a perspective view of the lens module 102 on a gimbal 601 according to some embodiments of the present disclosure. In one or more embodiments, UAV 100 may include a gimbal 601. The gimbal 601 connects the lens module 102 to the body 101 of UAV 100. In one or more embodiments, the gimbal 601 has at most 2 axes. The 2-axis gimbal 601 may also be used to stabilize the lens module 102. In one or more embodiments, by using the 2-axis gimbal 601, the pitch and roll angles of UAV 100 may be compensated for, and thus images taken from the lens module 102 are not affected. In one or more embodiments, a 1-axis gimbal may be used. Of course, these are merely examples and are not intended to be limiting.
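One way such compensation could work, sketched under the assumption that the gimbal simply counter-rotates the body's measured pitch and roll (real gimbal controllers would also filter and rate-limit the commands):

```python
def gimbal_setpoints(uav_pitch_deg, uav_roll_deg):
    # Counter-rotate the body attitude so the lens keeps pointing straight
    # down; filtering and rate limiting are omitted in this sketch.
    return -uav_pitch_deg, -uav_roll_deg

# When the drone pitches 10 degrees nose-down while flying forward, the
# gimbal pitches the lens +10 degrees so the captured images stay nadir-looking.
print(gimbal_setpoints(-10.0, 0.0))
```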
Referring to fig. 7, fig. 7 is a block diagram of a UAV 700 according to some embodiments of the present disclosure. In one or more embodiments, UAV 700 includes a body 701 and a lens module 702. In one or more embodiments, the body 701 may include a matching module 703, a translation module 704, and a sensor module 705. According to some embodiments, the lens module 702 in combination with the matching module 703, the translation module 704, and the sensor module 705 may be considered a visual positioning system 709.
In one or more embodiments, the matching module 703, the translation module 704, and the sensor module 705 can be disposed inside the body 701. In one or more embodiments, the matching module 703, the translation module 704, and the sensor module 705 may be arranged with the lens module 702. Alternatively, one of the matching module 703, the translation module 704, and the sensor module 705 may be disposed together with the lens module 702, and the rest may be disposed within the body 701. Alternatively, two of the matching module 703, the translation module 704, and the sensor module 705 may be disposed together with the lens module 702, and the other may be disposed within the body 701.
In contrast to UAV100 in fig. 2, visual positioning system 709 of UAV700 also includes sensor module 705. In one or more embodiments, the second data D2 is collected by a sensor module 705, the sensor module 705 being electrically coupled to the lens module 702 and the translation module 704. In one or more embodiments, the sensor module 705 may be incorporated into a processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a programmable controller, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), or other similar device, or a combination of such devices. In one or more embodiments, the sensor module 705 may be incorporated into the same processor as the matching module 703 and the translation module 704. Alternatively, the sensor module 705 may be incorporated in a different processor than the matching module 703 or the translation module 704.
In one or more embodiments, the sensor module 705 may include a light detection and ranging (LiDAR) sensor, an Inertial Measurement Unit (IMU), a GPS receiver or radar, or a combination thereof. In one or more embodiments, the second data D2 includes altitude data, a GPS map, GPS coordinates, or a location relative to at least one surrounding physical object, or a combination thereof.
In one or more embodiments, the lens module 702 (or the image sensor unit 1022) may begin taking images (photos and/or videos) I when the second data D2 received by the UAV700 matches a predetermined factor. For example, the signal strength of the second data D2 (e.g., GPS data) is below a predetermined factor.
In one or more embodiments, the translation module 704 may receive the first data D1 from the matching module 703 and the second data D2 from the sensor module 705. In one or more embodiments, the translation module 704 calculates relative changes (e.g., direction/altitude/speed) of UAV 700 based on the first data D1, taking into account the second data D2, to generate relative positioning data of UAV 700. In one or more embodiments, the calculation may take into account many other known factors, such as time of flight, the initial position of the UAV given by GPS, the initial altitude of the UAV, the speed of travel of the UAV, or data collected by other sensors such as LiDAR sensors, an IMU, or radar. Of course, these are merely examples and are not intended to be limiting. In one or more embodiments, the sensor module 705 may include an IMU to compensate for or supplement the calculations/positioning of the translation module 704.
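The disclosure states only that the second data is taken into account; one simple way to do so, sketched here as an assumption, is a complementary filter that blends the image-derived velocity (first data) with an IMU-propagated velocity (second data):

```python
def fuse_velocity(visual_vel, imu_vel, alpha=0.8):
    # Blend the velocity derived from the fisheye images with a velocity
    # propagated from IMU measurements; alpha is an illustrative weight.
    vx = alpha * visual_vel[0] + (1.0 - alpha) * imu_vel[0]
    vy = alpha * visual_vel[1] + (1.0 - alpha) * imu_vel[1]
    return vx, vy

# Example: trust the visual estimate more, but let the IMU smooth it.
print(fuse_velocity((1.9, 0.1), (2.2, 0.0)))
```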
It is noted that the functions of the lens module 702, the matching module 703 and the translation module 704 may be similar to the functions of the lens module 102, the matching module 103 and the translation module 104, respectively. Therefore, a detailed description is omitted herein for the sake of brevity.
In one or more embodiments of the invention, UAV 700, using a visual positioning system as a supplement to an existing positioning system, may have more accurate positioning information than a traditional drone that uses only GPS positioning information. Furthermore, UAV 700 in accordance with one or more embodiments of the present invention having a visual positioning system may obtain accurate positioning information even when UAV 700 is flying in a location where GPS signals are limited or poorly received, such as between high-rise buildings and structures.
Referring to fig. 8, fig. 8 is a block diagram of a UAV 800 according to some embodiments of the present disclosure. In one or more embodiments, UAV 800 includes: a body 801, a lens module 802, a matching module 803, a translation module 804, and a verification module 806.
In one or more embodiments, the matching module 803, the translation module 804, and the verification module 806 may be disposed within the body 801. In one or more embodiments, the matching module 803, the translation module 804, and the verification module 806 may be arranged with the lens module 802. Alternatively, one of the matching module 803, the translation module 804, and the verification module 806 may be disposed with the lens module 802, while the rest may be disposed within the body 801. Alternatively, two of the matching module 803, the translation module 804, and the verification module 806 may be disposed with the lens module 802, and the other may be disposed within the body 801.
In contrast to UAV100 or 700, UAV 800 also includes a verification module 806. In one or more embodiments, verification module 806 is electrically coupled to translation module 804. In one or more embodiments, the verification module 806 may be incorporated into a processor, such as a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a programmable controller, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), or other similar device, or a combination of such devices. In one or more embodiments, the verification module 806 may be incorporated in the same processor as the matching module 803 and the translation module 804. Alternatively, the verification module 806 may be incorporated in a different processor than the matching module 803 and the translation module 804.
In one or more embodiments, the verification module 806 can compare the positioning data D3 generated by the translation module 804 to a GPS map (not shown), which can be a copy of a GPS map pre-stored in UAV 800. Thus, by using the visual positioning system as a supplement to an existing positioning system, the verification module 806 may further verify the position of UAV 800, and UAV 800 may have more accurate positioning information. In one or more embodiments, the verification module 806 may also be used in the UAV 700 as described in fig. 7 and receive GPS maps from the sensor module 705. A detailed description is omitted here for the sake of brevity.
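As an illustration of the verification step, the sketch below models the pre-stored GPS map as a list of known landmark coordinates and flags the visual position estimate when it drifts too far from the nearest landmark; the map representation and the tolerance are assumptions, not details given by the disclosure:

```python
import math

def verify_position(estimated_xy, gps_map_points, tolerance_m=15.0):
    # Compare the visually derived positioning data against a pre-stored
    # GPS map, here simplified to a list of known (x, y) landmarks.
    nearest = min(gps_map_points, key=lambda p: math.dist(estimated_xy, p))
    error_m = math.dist(estimated_xy, nearest)
    return error_m <= tolerance_m, error_m

# Example: the estimate is accepted if it lies within 15 m of a map landmark.
print(verify_position((102.0, 48.0), [(100.0, 50.0), (200.0, 50.0)]))
```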
It is noted that the functionality of the lens module 802, the matching module 803, and the translation module 804 may be similar to the functionality of the lens module 102, the matching module 103, and the translation module 104, respectively. Therefore, a detailed description is omitted herein for the sake of brevity.
Referring to fig. 9, fig. 9 is a method 900 of self-detecting the location of a drone itself, in accordance with some embodiments of the present disclosure. In one or more embodiments, method 900 includes steps 901-904. In step 901, the drone acquires initial positioning data containing altitude data. In one or more embodiments, the drone may also use many other known data, such as time of flight, travel speed, data collected by other sensors such as LiDAR sensors, IMUs, or radar.
In step 902, the drone takes a series of fisheye view images in flight at a rate of over 60 images per second. In one or more embodiments, the drone uses a lens module to capture ground images. The lens module may include a wide-angle lens, an ultra-wide-angle lens, or a combination thereof. The wide-angle lens includes at least one lens having a field of view of at most 180 degrees. The ultra-wide-angle lens includes at least one lens having a field of view of at least 180 degrees. In one or more embodiments, the ultra-wide-angle lens may be a fisheye lens. In one or more embodiments, the series of fisheye view images includes at least a first image at a first flight position and a second image at a second flight position. In one or more embodiments, the drone may level the lens module using a 2-axis gimbal to stabilize the lens module.
In step 903, the drone compares the relative changes between the series of fisheye view images. In one or more embodiments, the drone also compares the relative change to a GPS map. In one or more embodiments, the drone further measures the distance between the ground and the drone.
In step 904, the drone calculates the position of the drone from the initial positioning data and the relative changes. In one or more embodiments, the drone converts or translates the image information into the drone's speed and relative flight position/direction information to determine the drone's own position/location. In one or more embodiments, the drone may transmit at least one of a LiDAR signal, a radar signal, an RF signal, a satellite signal, etc., to the controller. In one or more embodiments, the drone may adjust/modify its own location by comparing and contrasting the converted speed and/or relative flight position/direction information of the drone with GPS location information.
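Putting steps 901-904 together, a minimal dead-reckoning loop might look like the sketch below; derive_first_data and translate_to_motion are the illustrative helpers sketched earlier, not functions named by the disclosure, and the fixed altitude is a simplifying assumption:

```python
def self_locate(initial_xy, altitude_m, frames, dt_s=1.0 / 60.0):
    # Steps 901-904: start from the initial positioning data, derive relative
    # changes from successive fisheye images, and accumulate them.
    x, y = initial_xy
    for img_prev, img_curr in zip(frames, frames[1:]):
        shift = derive_first_data(img_prev, img_curr)       # step 903
        if shift is None:
            continue                                        # skip textureless pairs
        (dx, dy), _, _ = translate_to_motion(shift, altitude_m, dt_s)
        x, y = x + dx, y + dy                               # step 904
    return x, y
```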
In one or more embodiments, the operation of UAVs 100, 700, 800 described in fig. 1-8 may be summarized as the steps described in fig. 9.
In one or more embodiments according to the present disclosure, a UAV/drone with a visual positioning system may acquire additional/alternative positioning information in addition to traditional positioning information. In one or more embodiments according to the present disclosure, the UAV/drone may take images (photos and/or videos) of the ground during flight, and the UAV/drone may process the images (photos and/or videos) in real-time to obtain positioning information of the UAV/drone.
In one or more embodiments of the invention, a drone that uses a visual positioning system as a supplement on top of an existing positioning system may have more accurate positioning information than a traditional drone that uses only GPS positioning information. Furthermore, drones according to one or more embodiments of the present disclosure with a visual positioning system can obtain accurate positioning information even when the UAV/drone is flying around places where GPS reception is limited or poor, such as when flying between high-rise buildings and structures.
According to some embodiments, a visual positioning system for a drone is provided. A visual positioning system for an unmanned aerial vehicle comprises a lens module, a matching module, a sensor module and a translation module. The lens module captures a series of fisheye view images over time during flight. The matching module derives first data by comparing and contrasting at least one feature from the fisheye view images. The sensor module collects second data, the sensor module being electrically coupled to the lens module. The translation module is electrically coupled to the sensor module to generate the positioning data by processing the first data in view of the second data. The sensor module includes one or more of a LiDAR sensor, an Inertial Measurement Unit (IMU), a GPS receiver, and a radar.
According to some embodiments, an Unmanned Aerial Vehicle (UAV) is provided. The UAV comprises a body, a fisheye lens module, a matching module, and a translation module. The body has a lower side. The fisheye lens module is disposed on the lower side. The fisheye lens module has a wide field of view and, over time, takes a series of images of the area under the drone during flight. The matching module compares and contrasts features in each of the series of images to derive first data. The translation module translates the first data into positioning data.
According to some embodiments, a method of self-detecting the location of a drone itself when flying through an area with insufficient GPS signals is provided. The method comprises the following steps: (1) acquiring initial positioning data containing altitude data; (2) taking a series of fisheye view images during flight at a rate of over 60 images per second; (3) comparing relative changes between the series of fisheye view images; and (4) calculating the position of the drone according to the initial positioning data and the relative change.
Thus, specific embodiments and applications of the visual positioning system are disclosed. It will be apparent, however, to one skilled in the art that many more modifications besides those already described are possible without departing from the concepts herein disclosed. Insubstantial changes from the disclosure as viewed by a person with ordinary skill in the art are expressly contemplated as being equivalent within the scope of the disclosure. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the present disclosure. The inventive subject matter disclosed is therefore to be understood as embracing each and every matter specifically illustrated and described above, conceptually equivalent, as well as what can be obviously substituted and also essentially combined with the basic idea of the embodiment.
The foregoing has outlined features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions and alterations herein without departing from the spirit and scope of the present disclosure.

Claims (20)

1. A visual positioning system (709) for a drone, characterized in that it comprises:
a lens module (702), the lens module (702) for capturing a series of fisheye view images (I) over time during flight;
a matching module (703), the matching module (703) for deriving first data by comparing and contrasting at least one feature from the fisheye view images (I);
a sensor module (705), the sensor module (705) for collecting second data (D2), and the sensor module (705) electrically coupled to the lens module (702); and
a translation module (704), the translation module (704) being electrically coupled to the sensor module (705) to generate positioning data by processing first data (D1) taking into account second data (D2);
wherein the sensor module (705) includes one or more of a LiDAR sensor, an inertial measurement unit, a GPS receiver, and a radar.
2. The visual positioning system of claim 1, wherein the lens module includes a lens unit (1021) and an image sensor unit (1022), and the image sensor unit is to sense incident optical signals from the lens unit to generate the series of fisheye view images.
3. The visual positioning system of claim 2, wherein the lens unit comprises a fisheye lens having a field of view of at least 180 degrees.
4. The visual positioning system of claim 2, wherein the image sensor unit generates the series of fisheye view images at a series of times during flight.
5. The visual positioning system of claim 2, wherein the image sensor unit generates the series of fisheye view images during flight when the second data matches a predetermined factor.
6. The visual positioning system of claim 1, wherein the second data includes one or more of altitude data, a GPS map, GPS coordinates, and a position relative to at least one surrounding physical object.
7. The visual positioning system of claim 6, wherein the sensor module (705) includes one or more of a LiDAR sensor and a GPS receiver.
8. A drone (100), characterized in that it comprises:
a body (101), the body (101) having a lower side (1011);
a fisheye lens module (102), the fisheye lens module (102) being disposed on the underside (1011);
wherein the fisheye lens module (102) has a wide field of view and captures a series of images (I) of an area under the drone (100) over time during flight;
A matching module (103), the matching module (103) for comparing and comparing features in each image of the series of images (I) to derive first data (D1); and
a translation module (104), the translation module (104) for translating the first data (D1) into positioning data.
9. The drone of claim 8, wherein the fisheye lens module comprises a lens unit (1021) and an image sensor unit (1022) arranged to sense incident light signals from the lens unit to generate a series of fisheye view images.
10. The drone of claim 9, wherein the image sensor unit generates the series of images at a series of times during flight.
11. The drone of claim 8, further comprising a verification module that compares the positioning data to a GPS map.
12. The drone of claim 8, further comprising a gimbal to connect the fisheye lens module to the body, wherein the gimbal has at most 2 axes.
13. The drone of claim 8, wherein the series of images taken over time are taken at a rate of at least 60 images per second.
14. The drone of claim 8, further comprising a sensor module (705), the sensor module (705) to collect second data (D2), and the translation module to consider the second data to derive the positioning data;
wherein the sensor module includes one or more of a LiDAR sensor, an inertial measurement unit, a GPS receiver, and a radar.
15. The drone of claim 14, wherein the fisheye lens module generates the series of images during flight when the second data matches a predetermined factor.
16. A method of self-detecting the position of an unmanned aerial vehicle when flying in an area with insufficient GPS signals, the method comprising:
obtaining initial positioning data comprising altitude data (901);
taking a series of fisheye view images during flight at a rate of over 60 images per second (902);
comparing relative changes between the series of fisheye view images (903); and
calculating a position of the drone from the initial positioning data and the relative change (904).
17. The method of claim 16, further comprising using data from an inertial measurement unit.
18. The method of claim 16, further comprising comparing the relative change to a stored copy of a GPS map.
19. The method of claim 16, wherein taking the series of fisheye view images during flight at a rate of over 60 images per second comprises:
collecting data; and
during flight, the series of fisheye view images is generated when the data matches a predetermined factor.
20. The method of claim 16, wherein taking the series of fisheye view images during flight at a rate of over 60 images per second comprises:
during flight, the series of fisheye view images are generated at a series of times.
CN201810725096.6A 2018-07-04 2018-07-04 Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle Pending CN110686664A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810725096.6A CN110686664A (en) 2018-07-04 2018-07-04 Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810725096.6A CN110686664A (en) 2018-07-04 2018-07-04 Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN110686664A true CN110686664A (en) 2020-01-14

Family

ID=69106588

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810725096.6A Pending CN110686664A (en) 2018-07-04 2018-07-04 Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN110686664A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111272148A (en) * 2020-01-20 2020-06-12 江苏方天电力技术有限公司 Unmanned aerial vehicle autonomous inspection self-adaptive imaging quality optimization method for power transmission line

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788580A (en) * 2012-06-20 2012-11-21 天津工业大学 Flight path synthetic method in unmanned aerial vehicle visual navigation
CN105841687A (en) * 2015-01-14 2016-08-10 上海智乘网络科技有限公司 Indoor location method and indoor location system
US20160253808A1 (en) * 2015-02-26 2016-09-01 Hexagon Technology Center Gmbh Determination of object data by template-based uav control
CN106687878A (en) * 2014-10-31 2017-05-17 深圳市大疆创新科技有限公司 Systems and methods for surveillance with visual marker
CN107543539A (en) * 2016-06-29 2018-01-05 联芯科技有限公司 The location information acquisition method and unmanned plane of a kind of unmanned plane

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788580A (en) * 2012-06-20 2012-11-21 天津工业大学 Flight path synthetic method in unmanned aerial vehicle visual navigation
CN106687878A (en) * 2014-10-31 2017-05-17 深圳市大疆创新科技有限公司 Systems and methods for surveillance with visual marker
CN105841687A (en) * 2015-01-14 2016-08-10 上海智乘网络科技有限公司 Indoor location method and indoor location system
US20160253808A1 (en) * 2015-02-26 2016-09-01 Hexagon Technology Center Gmbh Determination of object data by template-based uav control
CN107543539A (en) * 2016-06-29 2018-01-05 联芯科技有限公司 The location information acquisition method and unmanned plane of a kind of unmanned plane

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111272148A (en) * 2020-01-20 2020-06-12 江苏方天电力技术有限公司 Unmanned aerial vehicle autonomous inspection self-adaptive imaging quality optimization method for power transmission line
CN111272148B (en) * 2020-01-20 2021-08-31 江苏方天电力技术有限公司 Unmanned aerial vehicle autonomous inspection self-adaptive imaging quality optimization method for power transmission line

Similar Documents

Publication Publication Date Title
US11879737B2 (en) Systems and methods for auto-return
US10650235B2 (en) Systems and methods for detecting and tracking movable objects
US11604479B2 (en) Methods and system for vision-based landing
CN106647804B (en) A kind of automatic detecting method and system
US8174562B2 (en) Stereo camera having 360 degree field of view
US11019322B2 (en) Estimation system and automobile
CN108323190B (en) Obstacle avoidance method and device and unmanned aerial vehicle
WO2018086133A1 (en) Methods and systems for selective sensor fusion
CN110537365B (en) Information processing device, information processing method, information processing program, image processing device, and image processing system
JP6138326B1 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, PROGRAM FOR CONTROLLING MOBILE BODY, CONTROL SYSTEM, AND INFORMATION PROCESSING DEVICE
KR101160454B1 (en) Construction method of 3D Spatial Information using position controlling of UAV
KR101600862B1 (en) stereo vision system using a plurality of uav
Leira et al. A ligth-weight thermal camera payload with georeferencing capabilities for small fixed-wing UAVs
WO2020006709A1 (en) Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle
JP2019144804A5 (en)
CN110686664A (en) Visual positioning system, unmanned aerial vehicle and method for self-detecting position of unmanned aerial vehicle
JP6949930B2 (en) Control device, moving body and control method
JP7468523B2 (en) MOBILE BODY, POSITION ESTIMATION METHOD, AND PROGRAM
US11415990B2 (en) Optical object tracking on focal plane with dynamic focal length
JP7437930B2 (en) Mobile objects and imaging systems
JP7317684B2 (en) Mobile object, information processing device, and imaging system
Abousleiman et al. Statistical Algorithm for Attitude Estimation from Real-Time Aerial Video
정호현 Development and Accuracy Evaluation of Customized UAV for Photogrammetry
Droeschel et al. 3d local multiresolution grid for aggregating omnidirectional laser measurements on a micro aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination